Microsoft Revised Azure Certifications

Microsoft has revised its Azure certifications. The idea behind the new certifications is that they should map to job roles and job duties. There are three levels, from Fundamentals to Associate and then to Expert.

Some of them offer a transition exam from the old Microsoft Azure exams, but be aware that these transition exams will be retired in June 2019, so be quick if you plan to take one.

Here is how they map out.

Microsoft Certified Azure Fundamentals

This certification requires you to sit and pass one exam (AZ-900 Microsoft Azure Fundamentals). The exam is listed as optional for all certification paths, and it covers a core understanding of Azure. It is for both technical and non-technical people looking to validate their foundational knowledge of Microsoft Azure services.

Microsoft Certified Azure Administrator Associate

This certification requires you to sit and pass two exams (AZ-100 Microsoft Azure Infrastructure and Deployment and AZ-101 Microsoft Azure Integration and Security). As this is a replacement for the old 70-533 exam, if you have already passed that you can sit the transition exam (AZ-102 Microsoft Azure Administrator Certification Transition) instead, but be quick, as it is only available until June 2019.

Microsoft Certified Azure Developer Associate

This certification was originally two exams, just like AZ-100/101; however, after reviewing feedback from the beta tests, Microsoft removed some of the learning objectives and consolidated them into one exam (AZ-203 Developing Solutions for Microsoft Azure). Whilst this is the replacement for the 70-532 certification, there is no transition exam, so anyone holding that certification and looking to obtain the new Associate certification must take the whole exam.

Microsoft Certified Azure Solutions Architect Expert

This certification requires you to sit and pass two exams (AZ-300 Microsoft Azure Architect Technologies and AZ-301 Microsoft Azure Architect Design). As this is a replacement for the old 70-535 exam, if you have already passed that you can sit the transition exam (AZ-302 Microsoft Azure Solutions Architect Certification Transition). As with AZ-102, this transition exam will also expire in June 2019.

Microsoft Certified Azure DevOps Engineer Expert

To attain this certification you have to sit one additional exam (AZ-400 Microsoft Azure DevOps Solutions) on top of either Associate certification.

This certification is a new domain for Microsoft, and is for those Azure Administrator Associates or Azure Developer Associates who wish to take their role to the “expert” level. Not only does it focus on Microsoft Azure solutions, but also on how certain open-source tools complement them. As this is a new domain, there is no transition exam.

SQL Profiler for Azure SQL Database

I am sure that almost all .NET developers have the same question after moving a database from on-premises SQL Server into Azure SQL: where is SQL Profiler? If you try the SQL Profiler that comes with SSMS, you will find that you cannot run the profiling.

If you google it, you will find people talking about using Azure SQL DMVs to profile queries.
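As a minimal sketch of that approach (assuming your login has the VIEW DATABASE STATE permission), a query like the following pulls the most expensive statements out of sys.dm_exec_query_stats:

    -- Top 10 statements by average CPU time, reconstructed from the plan cache
    SELECT TOP 10
        qs.total_worker_time / qs.execution_count AS avg_cpu_time,
        qs.execution_count,
        SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
            ((CASE qs.statement_end_offset
                  WHEN -1 THEN DATALENGTH(st.text)
                  ELSE qs.statement_end_offset
              END - qs.statement_start_offset) / 2) + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_cpu_time DESC;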

Or you may also find that MSSQLGirl has blogged about using Extended Events.
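That idea boils down to creating a database-scoped event session and reading its target back. A minimal sketch (the session name ProfileQueries is my own placeholder):

    -- Capture completed batches into a ring buffer (database-scoped session)
    CREATE EVENT SESSION [ProfileQueries] ON DATABASE
    ADD EVENT sqlserver.sql_batch_completed(
        ACTION (sqlserver.sql_text, sqlserver.client_app_name))
    ADD TARGET package0.ring_buffer;

    ALTER EVENT SESSION [ProfileQueries] ON DATABASE STATE = START;

    -- Read the captured events back as XML
    SELECT CAST(t.target_data AS XML) AS captured_events
    FROM sys.dm_xe_database_sessions AS s
    JOIN sys.dm_xe_database_session_targets AS t
        ON s.address = t.event_session_address
    WHERE s.name = N'ProfileQueries';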

Is there any Microsoft-built tool?
YES.
Microsoft is building an Azure SQL profiler, and it also collects its information from Extended Events. Although it is still in preview, I have tried it, and it works: it can show me what SQL queries are running on the Azure SQL Database.

Common Azure SQL Profiler use-cases:

  • Stepping through problem queries to find the cause of the problem.
  • Finding and diagnosing slow-running queries.
  • Capturing the series of Transact-SQL statements that lead to a problem.
  • Monitoring the performance of SQL Server to tune workloads.
  • Correlating performance counters to diagnose problems.

You can follow these steps to get and run the tool:

    1. Download and install Azure Data Studio.
    2. After installing Azure Data Studio, launch it and click the “Extensions” tab in the left menu (in version 1.2.4, it is the second-to-last icon).
    3. You will find a lot of different extensions available; install SQL Server Profiler from here (at this moment, its version is v0.3.0).
    4. After installing the extension, make a connection to a server in the Servers tab.
    5. Once you have a connection, press Alt + P to launch Profiler.
    6. You may need to create a Profiler session on the first run.
    7. Select “Standard_Azure” as the Session Template, enter a meaningful name in Session Name, and click the Create button.
    8. To start Profiler, press Alt + S.
    9. Now you can start seeing the Extended Events.
    10. To stop Profiler, press Alt + S again.

“Package is not found…” when updating a NuGet package from Azure DevOps

If you are using Azure DevOps, you may already know that it provides Azure Artifacts (it was called Package Management before VSTS was renamed to Azure DevOps). With it you can create and share NuGet package feeds from public and private sources.

With an Azure DevOps CI pipeline, you can then build your solution, and pack and push the NuGet package into your private Azure Artifacts feed.

If you already work this way, you may have tried to update an existing NuGet package and had it return an error:

“Attempting to gather dependency information for package ‘xxx.newer.version’ with respect to project ‘ThisProject’, targeting ‘.NETFramework,Version=v4.7’

Package ‘xxx.newer.version’ is not found in the following primary source(s): ‘https://[YourDevOps_Name].pkgs.visualstudio.com/_packaging/[Your_AzureArtifacts_Name]/nuget/v3/index.json,https://api.nuget.org/v3/index.json’. Please verify all your online package sources are available (OR) package id, version are specified correctly.”

I searched the internet and found that someone working at Microsoft replied on 10 May 2018 as follows:


“Seems you are trying to download the package or packages that where just freshly pushed to VSTS nuget feed.
Since Visual Studio 2017 is listing it correctly, then the issue should not related to the feed on VSTS server.
If this occurs very recently(download the new refresh package) and your package is very large, this maybe a network delay. Suggest you use a fiddler trace when this issue happens again. This makes “some” sense, what you see is probably an incorrect propagation of pushed packages showing up in the search results but not yet available to download.
And some other also encounter the same issue and error as you.”

Well, I used Fiddler to see what happens when I try to update the package. Because I have also set up upstream sources, NuGet checks all the available packages in the private Azure Artifacts feed until it finds the matching one. (That also answered another question of mine: why it takes so long to gather dependency information for the package being updated.) I could confirm that the newer package was already listed in the feed; the actual package content just takes longer to become available for download.

So next time, if you see a similar message when updating a NuGet package from Azure Artifacts, you can:

  1. Wait and retry later.
  2. If the problem persists, try clearing the NuGet cache from Visual Studio: Tools –> NuGet Package Manager –> Package Manager Settings –> General, or from the command line as shown below.
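If you prefer the command line, the equivalent cache clear looks like this (assuming you have the .NET Core CLI or nuget.exe installed):

    # Clear all local NuGet caches (http-cache, global-packages, temp)
    dotnet nuget locals all --clear

    # Or, with nuget.exe
    nuget locals all -clear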

Document {Blob Path} Has Unsupported Content Type in Azure Search Indexer

If you have done some development with Azure Search and Azure Blob Storage, then you may have had a similar experience. In my case, I have an Azure Search indexer pointing at Azure Blob Storage, and after it runs, the indexer reports “Document ‘{blob path}’ has unsupported content type ‘unsupported'”.

The Microsoft Docs article on Indexing Documents in Azure Blob Storage with Azure Search states that the supported document formats are:

  • PDF
  • Microsoft Office formats: DOCX/DOC, XLSX/XLS, PPTX/PPT, MSG (Outlook emails)
  • HTML
  • XML
  • ZIP
  • EML
  • RTF
  • Plain text files (see also Indexing plain text)
  • JSON (see Indexing JSON blobs)
  • CSV (see Indexing CSV blobs preview feature)

Yet the above error message is raised when the format is txt, msg, or html. I asked around, and someone from Microsoft asked me to test by starting with very simple content and building up until I hit the error. But I found that everything fails except blank content in the file.

After a few months of trying, testing, and back-and-forth comments, my boss gave me a deadline, as we could not wait any longer to launch the application. So in the end, the only way to make it work was to turn off “failOnUnsupportedContentType” using the REST API:

"parameters": { 
    "configuration": { 
       "indexedFileNameExtensions" : ".html,.txt,.pdf,.docx",
       "excludedFileNameExtensions": ".bmp,.dib,.png,.jpeg,.jpg,.jpe,.jfif,.gif,.tif,.tiff,.ico",
      "failOnUnsupportedContentType" : false
     }
 },
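For context, the full REST call looks roughly like the sketch below. The service name, indexer name, data source, index name, and admin key are placeholders you must fill in from your own service, and the api-version may differ (2017-11-11 was current when I wrote this). Note that a PUT replaces the whole indexer definition, so it must include the other properties too:

    PUT https://[service name].search.windows.net/indexers/[indexer name]?api-version=2017-11-11
    Content-Type: application/json
    api-key: [admin key]

    {
        "name": "[indexer name]",
        "dataSourceName": "[data source name]",
        "targetIndexName": "[index name]",
        "parameters": {
            "configuration": {
                "indexedFileNameExtensions": ".html,.txt,.pdf,.docx",
                "excludedFileNameExtensions": ".bmp,.dib,.png,.jpeg,.jpg,.jpe,.jfif,.gif,.tif,.tiff,.ico",
                "failOnUnsupportedContentType": false
            }
        }
    }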

After that, my indexer runs fine, and every newly uploaded blob gets all the data I want into the Azure Search index. I hope this helps you. And if you know why these formats (txt, msg, html) are listed as supported yet keep being rejected as unsupported, please leave me a message. I will also come back to this topic if I find any updates.

Configure CORS in Azure

In my last post, I showed how to enable CORS in ASP.NET Web API. I then found that I had another issue when hosting it in Azure. Azure has great support for CORS; you can watch a video about how great it is here.

First, I would like to show you how to enable CORS in Azure.
1) Go to the Azure portal and click into the App Service of your Web API.
2) Then, under the API section, click CORS.
3) Enter “*” or any specific website origins that you would like to allow for CORS.

DONE! That is easy, isn’t it?
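If you prefer scripting it, the same setting can also be applied with the Azure CLI. A sketch, where MyResourceGroup, MyWebApi, and the origin are placeholders for your own values:

    # Allow a specific origin on the App Service (use "*" to allow all origins)
    az webapp cors add --resource-group MyResourceGroup --name MyWebApi --allowed-origins https://www.example.com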

But then, when you run your mobile app or website and fire any jQuery request at the Web API, you will see this error:

“SEC7128: Multiple Access-Control-Allow-Origin headers are not allowed for CORS response.”

If you check the response in Fiddler, you will see the duplicated headers.

This is because Azure has CORS enabled and your app has also enabled it, so the response has more than one “Access-Control-Allow-Origin” entry, which is not allowed for the preflight request. We can fix this with some changes to the web.config in Azure:

  1. Go into the Azure portal, and under the “Development Tools” section click “Advanced Tools”. In the detail panel, click “Go”.
  2. A new browser tab will pop up showing the Kudu page.
  3. Now click “Debug console” –> “CMD”.
  4. A command prompt appears, with a Windows Explorer-like panel in the upper area that allows us to browse the directories in Azure.
  5. Browse into “site” –> “wwwroot”; you will find your web.config there.
  6. On the left-hand side, click “Edit” (a pencil icon) to edit the Web API’s web.config in the browser.
  7. Comment out both the “Access-Control-Allow-Origin” and “Access-Control-Allow-Methods” headers (see the sketch after these steps), then click the Save button in the upper area to save your changes.
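For reference, the section to comment out will look something like this in web.config. This is just a sketch; the exact header values depend on what you configured in the last post:

    <system.webServer>
      <httpProtocol>
        <customHeaders>
          <!-- Commented out so that only the Azure portal CORS setting adds these headers -->
          <!-- <add name="Access-Control-Allow-Origin" value="*" /> -->
          <!-- <add name="Access-Control-Allow-Methods" value="GET, POST, PUT, DELETE, OPTIONS" /> -->
        </customHeaders>
      </httpProtocol>
    </system.webServer>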

DONE again. Now your website returns only one entry each for “Access-Control-Allow-Origin” and “Access-Control-Allow-Methods”, and your client app can fire any jQuery request at the Web API without any error.

P.S. Azure has also improved the handling of the CORS “OPTIONS” issue that I mentioned in my last post.

You can also check here to learn more about Kudu.