Scheduler in Windows Azure

Windows Azure has been growing significantly and adding more and more services, but something that has always been missing is a Windows Azure Scheduler service. Microsoft has just released one version of it that seems to have potential, but it is still in very early stages.

Recently Aditi Product Services made their first Windows Azure service available in the Windows Azure Store, and this new service is actually a cloud-based CRON scheduler.

This new service basically allows the user to create scheduled tasks as a set of HTTP webhooks that run on CRON schedules, either simple or complex.

You can read more about this service and why it was built in this great post by Wade Wegner.

Looking at the features:

  • Support for CRON statements for recurrence scheduling.
  • Allows simple HTTP GET requests—easily allowing developers to use Scheduler to execute webhooks. With Scheduler, there is no requirement for a Windows Azure machine instance — every endpoint is an HTTP API. Scheduler supports development directly against the API or using our NuGet package.
  • Fully integrated into the Windows Azure store.
  • Run up to 5000 jobs in a month (during our Free Trial).
  • HTTP GET requests to your services and a full Web API for CRUD operations.
  • Simple and complex CRON job expressions.
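Judging by the expression used later in this post, the service appears to accept Quartz-style CRON expressions, which use seven fields: seconds, minutes, hours, day-of-month, month, day-of-week, and year. A few illustrative examples (these specific schedules are my own, not taken from the service's documentation):

```
0 0/5 * * * ? *      every 5 minutes
0 0 12 1/1 * ? *     every day at 12:00 PM
0 0 2 ? * MON-FRI *  at 2:00 AM, Monday through Friday
0 0 0 1 * ? *        at midnight on the first day of every month
```

The "?" means "no specific value" and is used in either the day-of-month or day-of-week field, since Quartz does not allow both to be set at once.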

Future releases will include HTTP POST, authentication, sending jobs into message queues, and more.

For details on the architecture, you can check this other post from Ryan Dunn.

Now, looking at all those features, I can think of several different use cases where this might be helpful. One of them is something a lot of DBAs have been asking for: database backups. So what do we need to do in order to do this with this Scheduler service?

1. Add Scheduler to your subscription by going into the Windows Azure Management Portal and choosing "PURCHASE ADD-ONS"

2. Choose the Scheduler Add-On


3. Now create a Visual Studio project and add the Aditi.Scheduler NuGet package


4. Open your code where you want to add your scheduling logic and just add the following:

    var tenantId = "YOUR TENANT ID";
    var secretKey = "YOUR SECRET KEY";
    var scheduledTasks = new ScheduledTasks(tenantId, secretKey);
    var backupTask = new TaskModel()
    {
        Name = "Import/Export SQL Azure Database",
        JobType = JobType.Webhook,
        CronExpression = "0 0 12 1/1 * ? *", // every day at 12:00 PM
        Start = DateTime.Now,
        Params = new Dictionary<string, object>()
        {
            {"url", "http://myurl/api/BackupDB/TestDB"}
        }
    };
    scheduledTasks.CreateTask(backupTask);

5. You’re done. What you need now is your BackupDB service listening at that URL and executing the requests. For that you can use this project, which will help you do that.
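If you want to see the general shape of such an endpoint, here is a minimal sketch using only the .NET HttpListener class. The route, port, and the BackupDatabase helper are assumptions for illustration; a real implementation would call the SQL Azure Import/Export service:

```csharp
using System;
using System.Net;
using System.Text;

public class BackupDbListener
{
    // Extract the database name from a path like /api/BackupDB/TestDB
    public static string DbNameFromPath(string path)
    {
        return path.Substring(path.LastIndexOf('/') + 1);
    }

    static void BackupDatabase(string dbName)
    {
        // Placeholder: kick off the actual SQL Azure Import/Export backup here.
        Console.WriteLine("Backing up " + dbName);
    }

    public static void Main()
    {
        // Listen on the URL the Scheduler task was configured to call.
        var listener = new HttpListener();
        listener.Prefixes.Add("http://+:80/api/BackupDB/");
        listener.Start();

        while (true)
        {
            var context = listener.GetContext();
            var dbName = DbNameFromPath(context.Request.Url.AbsolutePath);

            BackupDatabase(dbName);

            var body = Encoding.UTF8.GetBytes("Backup started for " + dbName);
            context.Response.StatusCode = 200;
            context.Response.OutputStream.Write(body, 0, body.Length);
            context.Response.Close();
        }
    }
}
```

The Scheduler's HTTP GET request hits this endpoint on the CRON schedule, and the listener does the rest.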

There are a lot more scenarios for this, but backups looked like something a lot of people would be interested in. Other scenarios include:

  • Validating that a service or page is available by pinging a URL
  • Performing backups of DB, Storage and so on
  • Deployments for Windows Azure based on specific conditions
    • An example would be a company that has an on-premises infrastructure and wants to continue to use it, but at certain hours of the day needs to scale out into Windows Azure to handle more load. In this case a CRON task triggered at a specific time of day would deploy the solution to Windows Azure and update the load balancer, and another would delete the deployment when it’s not required anymore. This is important because a lot of customers at this moment look at Windows Azure only as a scale-out platform and need something to help automate the tasks of spinning up new machines.
  • Perform Archiving processes of data
    • An example would be a process executed every day that analyzes all the data in Windows Azure SQL Databases and Table Storage, and archives data older than one month into Table Storage, in order to save costs and to keep only "live/active/most used" data in Windows Azure SQL Databases.

There are a lot of other scenarios, but those seemed like a good place to start.
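As a sketch of the first scenario, the same NuGet API used for the backup task should work for a simple availability check. The URL and the five-minute schedule here are just illustrative assumptions:

```csharp
var scheduledTasks = new ScheduledTasks("YOUR TENANT ID", "YOUR SECRET KEY");
scheduledTasks.CreateTask(new TaskModel()
{
    Name = "Ping home page",
    JobType = JobType.Webhook,
    CronExpression = "0 0/5 * * * ? *", // every 5 minutes
    Start = DateTime.Now,
    Params = new Dictionary<string, object>()
    {
        // The page to ping; a hypothetical health endpoint on your own site.
        {"url", "http://www.mysite.com/health"}
    }
});
```

The endpoint itself decides what "available" means: return 200 when the page is healthy, and have it alert you (mail, log, queue) when it isn't.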
