
Handle Twilio SMS Status Call back in Azure Logic apps

In my previous post, I explained how to send an SMS using the Twilio connector in an Azure logic app. In this post, I will describe how to capture the Twilio SMS status callback events after the SMS is sent and save them to the database using a logic app.

Prerequisites:

  • Azure subscription
  • A sender application to send SMS using Twilio

Steps:

1) Login to Azure portal

2) Select "Create a Resource" -> Search for Logic app -> Create

3) Select subscription, resource group, Region and give logic app name -> Click on Review & Create button -> Create

4) Open the logic app once the deployment is completed

5) Select the Http Request-Response template in the Logic Apps Designer. It will create a new app with Request and Response connectors.

6) I want to capture the SMS status back to SQL Server using the SQL connector. Insert a new step (+ icon) after the Http Trigger connector -> Add an Action -> In the search box, enter "Sql" to filter the connectors -> Select "SQL Server" -> Under the actions list, select "Insert row (V2)"

7) Create the connection by giving the required details 

8) Once you have created the connection, select the Server name, Database name and Table name. Once you select the table name, the form will automatically show the required column names. If you want other columns, you can open the "Add New Parameter" drop down and select the required columns.








9) Here I want to capture the MessageId (unique message Id in our system), MessageSid (Sid from Twilio) and MessageStatus (status from Twilio) from the callback data. As I am sending the MessageId as a query string parameter to the callback URL, I am capturing its value with the expression

 @triggerOutputs()['queries']['Id']

The main thing here is that Twilio posts the data to our callback endpoint in application/x-www-form-urlencoded format, so we can read the form data posted to our logic app with the following expressions

triggerFormDataValue('MessageSid')

triggerFormDataValue('MessageStatus')
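Outside of Logic Apps, the same extraction can be illustrated with ordinary form-urlencoded parsing. A minimal Python sketch (the field names MessageSid and MessageStatus are the ones Twilio posts; the sample values are placeholders):

```python
from urllib.parse import parse_qs

# A sample body such as Twilio would POST to the callback endpoint
# (application/x-www-form-urlencoded, illustrative values).
body = "MessageSid=SMxxxxxxxx&MessageStatus=delivered"

# parse_qs maps each field name to a list of values; take the first.
form = {k: v[0] for k, v in parse_qs(body).items()}

# Equivalent of triggerFormDataValue('MessageSid') / ('MessageStatus')
print(form["MessageSid"])     # SMxxxxxxxx
print(form["MessageStatus"])  # delivered
```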

10) In the Response connector, just send the 200 status code back to Twilio.






11) Save the logic app. Once you save it, a URL is generated in the HTTP trigger connector.

https://prod-13.eastus.logic.azure.com:443/workflows/5689fd744182e381f19a1a7ed4ee58cf/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=ZrrSXg_0ZY5_hXz_yft0BPYiyXdWKK8g-ji7Ldm1Hb8

12) Use this URL as the status callback URL while sending the Twilio SMS, along with the query string value Id. Below is how I used it in the logic app created in my previous post.
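Appending the Id query-string parameter to the generated trigger URL (which already carries query parameters such as sig) is safest with standard URL handling. An illustrative Python sketch (the base URL and Id value are placeholders):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def with_callback_id(callback_url: str, message_id: str) -> str:
    """Append an Id query-string parameter to an existing URL."""
    # Use '&' if the URL already has a query string, '?' otherwise.
    separator = "&" if urlparse(callback_url).query else "?"
    return f"{callback_url}{separator}{urlencode({'Id': message_id})}"

url = with_callback_id("https://example.com/workflows/invoke?sig=abc", "12345")
print(url)  # https://example.com/workflows/invoke?sig=abc&Id=12345
```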




        



                                



That's it. Now your logic app will receive the status callbacks and save the responses in the database.

Happy Coding 😀!

Azure Logic Apps - Sending SMS using Twilio connector

Azure Logic Apps is a cloud service that helps you define your workflows, schedule tasks, etc. in the cloud. It is a serverless, fully managed iPaaS (Integration Platform as a Service) with built-in scalability.

Logic Apps provides a visual designer for creating and configuring your workflows. You can define your workflow with in-built connectors or other enterprise connectors.

Advantages of Logic app

  • Inbuilt Connectors: Plenty of inbuilt connectors, triggers and actions cover many common business scenarios.
  • Templates: Logic Apps comes with many predefined templates for common business cases, which reduces the development work.
  • Integration: You can integrate your on-premise legacy systems with new ones in the cloud.
  • Extensibility: You can create your own custom APIs and Azure Functions and integrate them into the workflow if the inbuilt connectors do not meet your needs.
  • Minimal Development Effort: We can create logic apps with minimal development effort in the browser or in Visual Studio.

Logic app for sending an SMS

Azure Logic Apps offers a pre-built Twilio connector which allows you to send, get or list SMS / MMS from your Twilio account. In this post I will demonstrate how to use a logic app to send an SMS using the Twilio connector.

Prerequisites:

  • Azure subscription
  • Twilio AccountId and Authentication token
  • A Twilio number / Twilio verified number to send SMS
Steps:
1) Login to Azure portal
2) Select "Create a Resource" -> Search for Logic app -> Create
3) Select subscription, resource group, Region and give logic app name -> Click on Review & Create button -> Create
4) Open the logic app once the deployment is completed
5) Select the Http Request-Response template in the Logic Apps Designer. It will create a new app with Request and Response connectors as below


6) I want to send the SMS request in the following format
{
    "sendingNumber": "xxxxxxxxx",
    "destinationNumber": "xxxxxxxxx",
    "messageContent": "message from logic app",
    "statusCallBackURL": "https://hookb.in/yDRKx6el2xFJNNPaR2kk"
}
So, in the Http request trigger, give the following as request body JSON schema
{
    "type": "object",
    "properties": {
        "sendingNumber": {
            "type": "string"
        },
        "destinationNumber": {
            "type": "string"
        },
        "messageContent": {
            "type": "string"
        },
        "statusCallBackURL": {
            "type": "string"
        }
    }
}
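A client can build the request body with the same field names as the schema above. A minimal Python sketch (the values are placeholders, and the actual POST to the trigger URL is omitted):

```python
import json

def build_sms_request(sending, destination, content, callback_url):
    """Build the JSON body expected by the HTTP trigger above.
    Field names match the request body JSON schema."""
    return json.dumps({
        "sendingNumber": sending,
        "destinationNumber": destination,
        "messageContent": content,
        "statusCallBackURL": callback_url,
    })

body = build_sms_request("xxxxxxxxx", "xxxxxxxxx",
                         "message from logic app",
                         "https://example.com/callback")
print(json.loads(body)["messageContent"])  # message from logic app
```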



7) Insert a new step (+ icon) after the Http Trigger connector -> Add an Action -> In the search box, enter "twilio" to filter the connectors -> Under the actions list select the action you want. I selected Send Text Message (SMS) as we are going to send a message.



8) Now provide the necessary details for your connection like Connection name, Twilio Account Id, Twilio Access Token and select Create button










9) Give the necessary details for sending the SMS like From Phone Number, To Phone Number, Text, Status Call back (You can get this parameter by selecting it from "Add New Parameter" dropdown). You can pick these values from logic app Trigger.













10) That completes the message sending. But I want to send the message SID back to the user in the response, so I modified the Response connector as below












11) Save the logic app. Once you save it, a URL is generated in the HTTP trigger connector. Now you can trigger your logic app by posting your request to that URL.

I will explain the logic app for handling the Twilio SMS status call back in another post. 

Happy Coding 😀!

Deploy Azure function using Azure CLI

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. It is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation. You can check how to install and configure Azure CLI on your machine from here.

This article describes how to deploy your function app to Azure using Azure CLI commands. For deploying the Azure function you first need to create a zip file with the contents of your Azure function. The .zip file that you use for push deployment must contain all of the files needed to run your function. When you deploy it, the contents of the zip file are extracted and copied to the wwwroot folder of your function app (any files from an existing deployment that aren't found in the .zip file are deleted from your function app). The folder structure of your zip file should be as below

FunctionApp
 | - host.json
 | - MyFirstFunction
 | | - function.json
 | - MySecondFunction
 | | - function.json
 | - SharedCode
 | - bin


The code for all the functions in a specific function app is located in a root project folder that contains a host configuration file and one or more sub folders. Each sub folder contains the code for a separate function. The bin folder contains packages and other library files that the function app requires. 

So for deploying your Azure function app, first publish your app in Visual Studio using folder publish. Once done, zip the publish folder and run the following command from the command line

az functionapp deployment source config-zip -g <resource-group-name> -n <function-app-name> --src <zip-file-path>

Ex:
az functionapp deployment source config-zip -g "rg-test-group" -n "func-test-function" --src "D:\test-function\bin\Release\netcoreapp2.1\publish\publish.zip"

This command deploys the project files from the .zip file to your function app and restarts it. Here the --src value is the path of your zip file on your computer. If you are using Azure CLI in Azure Cloud Shell, then you must first upload your file to the Azure Files account.
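Producing the .zip with the layout described above can be scripted. A minimal Python sketch (folder names are illustrative; the key point is zipping the contents of the publish folder so host.json lands at the archive root):

```python
import os
import shutil
import tempfile
import zipfile

def zip_publish_folder(publish_dir: str, zip_base: str) -> str:
    """Zip the contents of the publish folder (not the folder itself),
    so host.json ends up at the root of the archive."""
    # shutil.make_archive appends the .zip extension to zip_base.
    return shutil.make_archive(zip_base, "zip", root_dir=publish_dir)

# Illustrative usage with a throwaway publish folder.
publish = tempfile.mkdtemp()
with open(os.path.join(publish, "host.json"), "w") as f:
    f.write("{}")

archive = zip_publish_folder(publish, os.path.join(tempfile.mkdtemp(), "publish"))
with zipfile.ZipFile(archive) as z:
    names = z.namelist()
print(names)
```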

Happy Coding 😊!

Deploying Azure function With ARM Template

ARM Template

In agile development, we need to repeatedly deploy our solution to the cloud, so we need to automate the deployment process and use the practice of Infrastructure as Code: we define the infrastructure that needs to be deployed in code, store it in the repo alongside the application code, and version it.

In Azure we can use Azure Resource Manager (ARM) templates to implement the Infrastructure as Code concept. An ARM template is a JSON file that defines the infrastructure and configuration for your project. In the template, you specify the resources to deploy and the properties for those resources.

Advantages of ARM Template

Following are the main advantages of using ARM Template
  1. Declarative syntax: You declare the infrastructure you want to deploy, such as VMs, network infrastructure and storage systems, rather than scripting how to create it.
  2. Repeatable results: Repeatedly deploy your infrastructure throughout the development lifecycle and have confidence your resources are deployed in a consistent manner
  3. Built in validation: Resource Manager checks the template before starting the deployment to make sure the deployment will succeed.
  4. CI/CD integration: You can integrate templates into your continuous integration and continuous deployment (CI/CD) tools, which can automate your release pipelines for fast and reliable application and infrastructure updates. 
  5. Tracked deployments: In the Azure portal, you can review the deployment history and get information about the template deployment.

ARM Template Format

A template has the following elements
  1. $schema: Location of the JSON schema file that describes the version of the template language.
  2. contentVersion: Version of the template.
  3. apiProfile: An API version that serves as a collection of API versions for resource types
  4. parameters: Values that are provided when deployment is executed to customize resource deployment.
  5. variables: Values that are used as JSON fragments in the template to simplify template language expressions.
  6. functions: User-defined functions that are available within the template.
  7. resources: Resource types that are deployed or updated in a resource group or subscription.
  8. outputs: Values that are returned after deployment.
Here $schema, contentVersion and resources are mandatory elements for any template.
We can write template expressions that extend the capabilities of JSON.
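As an illustration only (this is not the ARM expression engine), template functions such as concat and parameters behave like simple string functions. A hedged Python sketch with made-up parameter values:

```python
def concat(*parts):
    """Rough stand-in for the ARM concat() template function."""
    return "".join(str(p) for p in parts)

params = {"env": "dev"}  # illustrative parameter values

# Mirrors a variable such as "[concat('gopitest-', parameters('env'), '-sp')]"
service_plan_name = concat("gopitest-", params["env"], "-sp")
print(service_plan_name)  # gopitest-dev-sp
```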

ARM Template for deploying Azure Function

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
  "parameters": {
    "environtment": {
      "defaultValue": "dev",
      "type": "String"
    },
    "storageSKU": {
      "defaultValue": "Standard_LRS",
      "allowedValues": [
        "Standard_LRS",
        "Standard_GRS",
        "Standard_RAGRS",
        "Standard_ZRS",
        "Premium_LRS",
        "Premium_ZRS",
        "Standard_GZRS",
        "Standard_RAGZRS"
      ],
      "type": "String"
    },
    "dbConnectionString": {
      "defaultValue": "",
      "type": "String"
    },
    "packageUrl": {
      "defaultValue": "https://gopiportal.blob.core.windows.net/code/AzureFunctions.zip",
      "type": "String"
    }
    
  },
    "variables": {
        "storageAccountName": "[concat('gopitest',parameters('environtment'),'sa')]",
        "servicePlanName": "[concat('gopitest-',parameters('environtment'),'-sp')]",
        "azureFunctionAppName": "[concat('gopitest-',parameters('environtment'),'-af')]",
        "appInsightsName": "[concat('gopitest-',parameters('environtment'),'-ai')]"
    },
    "resources": [
        {
            "type": "Microsoft.Storage/storageAccounts",
            "apiVersion": "2019-04-01",
            "name": "[variables('storageAccountName')]",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "[parameters('storageSKU')]"
            },
            "kind": "StorageV2"
        },
        {
            "type": "Microsoft.Web/serverfarms",
            "apiVersion": "2018-02-01",
            "name": "[variables('servicePlanName')]",
            "location": "[resourceGroup().location]",
            "sku": {
                "name": "Y1",
                "tier": "Dynamic"
            },
            "properties": {
                "name": "[variables('servicePlanName')]",
                "computeMode": "Dynamic"
            }
        },
        {
            "type": "Microsoft.Insights/components",
            "apiVersion": "2015-05-01",
            "name": "[variables('appInsightsName')]",
            "location": "[resourceGroup().location]",
            "tags": {
                "[concat('hidden-link:', resourceGroup().id, '/providers/Microsoft.Web/sites/', variables('azureFunctionAppName'))]": "Resource"
            },
            "properties": {
                "ApplicationId": "[variables('azureFunctionAppName')]",
                "Application_Type": "web"
            }
        },
        {
            "type": "Microsoft.Web/sites",
            "apiVersion": "2019-08-01",
            "name": "[variables('azureFunctionAppName')]",
            "location": "[resourceGroup().location]",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverfarms', variables('servicePlanName'))]",
                "[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]",
    "[resourceId('Microsoft.Insights/components', variables('appInsightsName'))]"
            ],
            "kind": "functionapp",
            "properties": {
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('servicePlanName'))]",
                "siteConfig": {
                  "appSettings": [
                    {
                      "name": "AzureWebJobsDashboard",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
                    },
                    {
                      "name": "AzureWebJobsStorage",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
                    },
                    {
                      "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
                      "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
                    },
                    {
                      "name": "WEBSITE_CONTENTSHARE",
                      "value": "[variables('storageAccountName')]"
                    },
                    {
                      "name": "FUNCTIONS_EXTENSION_VERSION",
                      "value": "~2"
                    },
                    {
                      "name": "FUNCTIONS_WORKER_RUNTIME",
                      "value": "dotnet"
                    },
                    {
                      "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
                      "value": "[reference(resourceId('microsoft.insights/components/', variables('appInsightsName')), '2015-05-01').InstrumentationKey]"
                    },
                    {
                      "name": "dbConnectionString",
                      "value": "[parameters('dbConnectionString')]"
                    }
                  ]
                }
            },
            "resources": [
                {
                    "type": "extensions",
                    "apiVersion": "2015-08-01",
                    "name": "MSDeploy",
                    "location": "[resourceGroup().location]",
                    "dependsOn": [
                        "[resourceId('Microsoft.Web/sites', variables('azureFunctionAppName'))]"
                    ],
                    "tags": {
                        "displayName": "webappdeploy"
                    },
                    "properties": {
                        "packageUri": "[parameters('packageUrl')]",
                        "dbType": "None",
                        "connectionString": ""
                    }
                }
            ]
        }
    ],
    "outputs": {}
}
The above code will create the following resources.
1) Storage account
Every function app needs a storage account, so we create it before the function app. Here storageSKU is a parameter with multiple allowed values; we choose one from the list while deploying the template, based on our function app requirements.
2) Service plan
After the storage account, the function app needs a service plan, so this part creates the service plan.
3) Application Insights
The third resource is the Application Insights component, which is required to store the function app telemetry.
4) Function app
The fourth and final one is the Azure Function app, and it is where all the pieces come together.
Here, the first three resources need to exist before this function app is created, so we tell Azure that this resource depends on them using dependsOn.
"dependsOn": [
 "[resourceId('Microsoft.Web/serverfarms', variables('servicePlanName'))]",
 "[resourceId('Microsoft.Storage/storageAccounts', variables('storageAccountName'))]",
 "[resourceId('Microsoft.Insights/components', variables('appInsightsName'))]"
]


We can specify the function app settings under properties -> siteConfig -> appSettings as name & value pairs as below

"properties": {
 "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('servicePlanName'))]",
 "siteConfig": {
   "appSettings": [
  {
    "name": "AzureWebJobsDashboard",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
  },
  {
    "name": "AzureWebJobsStorage",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
  },
  {
    "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
    "value": "[concat('DefaultEndpointsProtocol=https;AccountName=', variables('storageAccountName'), ';AccountKey=', listKeys(variables('storageAccountName'),'2019-04-01').keys[0].value)]"
  },
  {
    "name": "WEBSITE_CONTENTSHARE",
    "value": "[variables('storageAccountName')]"
  },
  {
    "name": "FUNCTIONS_EXTENSION_VERSION",
    "value": "~2"
  },
  {
    "name": "FUNCTIONS_WORKER_RUNTIME",
    "value": "dotnet"
  },
  {
    "name": "APPINSIGHTS_INSTRUMENTATIONKEY",
    "value": "[reference(resourceId('microsoft.insights/components/', variables('appInsightsName')), '2015-05-01').InstrumentationKey]"
  },
  {
    "name": "dbConnectionString",
    "value": "[parameters('dbConnectionString')]"
  }
   ]
 }
}
The last piece of the process is the sub-resource inside the function app: an MSDeploy extensions resource that defines where the Azure function code comes from. Here I have given the packageUri for the function app.
"resources": [
 {
  "type": "extensions",
  "apiVersion": "2015-08-01",
  "name": "MSDeploy",
  "location": "[resourceGroup().location]",
  "dependsOn": [
   "[resourceId('Microsoft.Web/sites', variables('azureFunctionAppName'))]"
  ],
  "tags": {
   "displayName": "webappdeploy"
  },
  "properties": {
   "packageUri": "[parameters('packageUrl')]",
   "dbType": "None",
   "connectionString": ""
  }
 }
]

Deploying Azure Function

1) Login to the Azure portal
2) Open "Deploy a custom template" (by typing deploy in the search bar)


3) Select "Build your own template in the editor" 
4) Copy the content of the ARM template above into the editor and click on "Save"

5) In the next screen Select the following under Basic settings
Subscription: Azure subscription you want to deploy
Resource Group: Select the existing resource group under which you want to deploy these or Create new 
Location: Select location if you are creating new Resource Group

6) Update the parameters required for the ARM template (under Settings) and accept the terms and conditions by clicking on the checkbox.

7) Click on "Purchase" button. This will deploy the components automatically

Increase request timeout of ASP.NET Core API hosted in Azure App Service

If you are using an ASP.NET Core 2.0 API deployed to an Azure App Service and a request takes more than 2 minutes to process, you will get a 502 Bad Gateway response with the message "The specified CGI application encountered an error and the server terminated the process".

You can fix this by adding a web.config file to your project, uncommenting the <system.webServer> block and adding requestTimeout="00:20:00" as shown below

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <!-- To customize the asp.net core module uncomment and edit the following section. 
  For more info see https://go.microsoft.com/fwlink/?linkid=838655 -->
  
  <system.webServer>
    <handlers>
      <remove name="aspNetCore"/>
      <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified"/>
    </handlers>
    <aspNetCore processPath="%LAUNCHER_PATH%" arguments="%LAUNCHER_ARGS%" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="false" requestTimeout="00:20:00"/>
  </system.webServer>
    
</configuration>

Happy Coding  😊!!

Remove the word "api" from HttpTriggered Azure functions url

When you create an HTTP-triggered Azure function, it automatically adds the keyword api to the URL, giving a URL like

https://<function-app-name>.azurewebsites.net/api/<function-name-or-configured-route>

But when the function is not being used as an API, the "api" keyword in the URL is inappropriate. To remove it, configure the following in host.json (you can do this by opening the file using the "App Service Editor" or the Kudu console)

Version 1.x
{
  "http": {
    "routePrefix": ""
  }
}
Version 2.x
{
  "extensions": {
    "http": {
      "routePrefix": ""
    }
  }
}

With the above setting the Azure function URL changes to
https://<function-app-name>.azurewebsites.net/<function-name-or-configured-route>
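The effect of routePrefix on the final URL can be sketched as a simple string rule (illustrative Python; the app and route names are placeholders):

```python
def function_url(app_name: str, route: str, route_prefix: str = "api") -> str:
    """Build the HTTP-trigger URL for a given routePrefix setting.
    The default prefix is "api"; an empty routePrefix removes it."""
    prefix = f"/{route_prefix}" if route_prefix else ""
    return f"https://{app_name}.azurewebsites.net{prefix}/{route}"

print(function_url("myapp", "orders"))                   # default: .../api/orders
print(function_url("myapp", "orders", route_prefix=""))  # routePrefix "": .../orders
```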

Happy Coding 😊!

Reference: https://docs.microsoft.com/en-us/azure/azure-functions/functions-host-json

Working with Redis in C#.NET

Introduction to Redis

Redis (REmote DIctionary Server) is a very popular open-source key-value data store. It supports data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs and geospatial indexes with radius queries. Redis is well known for its high performance, flexibility, rich set of data structures, and simple, straightforward API.

Working with Redis in C#.NET

For working with Redis in C#.NET we have a good library, StackExchange.Redis. It is a high performance general purpose Redis client, and it can be installed using the NuGet package manager. After installing it, we can use the following code to get or set values:
string connectionString="localhost";
ConnectionMultiplexer connection = ConnectionMultiplexer.Connect(connectionString);

IDatabase redis = connection.GetDatabase();

// Write to Redis
redis.StringSet("key", "value");

// Read from Redis
var val = redis.StringGet("key");
Console.WriteLine(val);

Since the ConnectionMultiplexer is designed to be shared and reused, it is recommended to store it as a static, lazily loaded singleton in your application.
public class RedisHelper
{
    public static string connectionString ="localhost";

    private static Lazy<ConnectionMultiplexer> lazyConnection = new Lazy<ConnectionMultiplexer>(() =>
    {
        ConfigurationOptions redisConfigOptions = ConfigurationOptions.Parse(connectionString);
        redisConfigOptions.ConnectRetry = 10;
        redisConfigOptions.ConnectTimeout = 30000;
        redisConfigOptions.KeepAlive = 60;
        redisConfigOptions.SyncTimeout = 30000;
        return ConnectionMultiplexer.Connect(redisConfigOptions);
    });

    public static ConnectionMultiplexer Connection
    {
        get
        {                      
            return lazyConnection.Value;
        }
    }
}

IDatabase redis = RedisHelper.Connection.GetDatabase();

// Write to Redis
redis.StringSet("key", "value");

// Read from Redis
var val = redis.StringGet("key");
Console.WriteLine(val);
We can also use the StringSetAsync, StringGetAsync functions for asynchronous programming.

Connecting to a particular Database

A Redis instance can have multiple logical databases, identified by an index (0, 1, 2 and so on; 16 by default). We can connect to a particular database as
IDatabase redis = RedisHelper.Connection.GetDatabase(dbNumber);

Working with Hashes

Redis Hashes are maps between string fields and string values, so they are the perfect data type to represent objects. While Hashes are used mainly to represent objects, they are capable of storing many elements, so you can use Hashes for many other tasks as well.  

Example: HMSET user:1000 username antirez password P1pp0 age 34

In the above example
  • user:1000 is the hashmap name.
  • username, password and age are the keys, and antirez, P1pp0 and 34 are the values for the respective keys in the hashmap.
We can create hashmap, using the following methods
// To Add the key,value pair to hashmap
IDatabase cache = RedisConnection.Connection.GetDatabase();
await cache.HashSetAsync(hashMapName, key, value);
Note: HashSetAsync will create the hashmap if it doesn't exist.

We can get the value from the hashmap with key as follows
// To retrieve a value from the hashmap by key
IDatabase cache = RedisConnection.Connection.GetDatabase();
string value= await cache.HashGetAsync(hashMapName, key);
Following are the other useful methods for hashmaps
// HashGetAll -- Gets all the items in the hashMap
var allHash = redis.HashGetAll(hashMapName);
foreach (var item in allHash)
{
Console.WriteLine($"Key: {item.Name}, Value: {item.Value}");
}

// HashValues -- Gets all the values in the hashMap
var values = redis.HashValues(hashMapName);
foreach (var value in values)
{
 Console.WriteLine(value);
}

// HashKeys -- Gets all the keys in the hashMap
var keys = redis.HashKeys(hashMapName);
foreach (var key in keys)
{
 Console.WriteLine(key);
}

// HashLength -- Gets the length of the hashMap
var len = redis.HashLength(hashMapName);  

// HashExists -- Checks whether the key exists in the hashMap or not
if (redis.HashExists(hashMapName, key))
{
 var item = redis.HashGet(hashMapName, key);
}
Happy Coding! 😊

Keep the Azure Web role Always On

Microsoft Azure Web roles generally suffer from a slow first request after a certain period of inactivity. This is because the default configuration of an IIS application pool sets an idle timeout of 20 minutes: after 20 minutes of inactivity IIS automatically shuts down your Web Role's application pool, so the first user after that experiences a long loading time, which is unacceptable by present day standards.

A simple workaround is to include a small script in your package that runs at startup, every time the role starts up.

For the above to work, include the following script in a startup.cmd file that you place in a folder called Startup in your web role project.

REM *** Prevent the IIS app pools from shutting down due to being idle.
%windir%\system32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.processModel.idleTimeout:00:00:00


REM *** Prevent IIS app pool recycles from recycling on the default schedule of 1740 minutes (29 hours).
%windir%\system32\inetsrv\appcmd set config -section:applicationPools -applicationPoolDefaults.recycling.periodicRestart.time:00:00:00

Set the "Copy to output directory" to "Copy always" so it becomes part of the package.

To ensure that the startup.cmd script is called when the web role starts, add the following section in the "ServiceDefinition.csdef" of your Azure Cloud Service project under your Web Role part.

  <Startup>
    <Task commandLine="Startup\startup.cmd" executionContext="elevated" taskType="simple" />
  </Startup>

So your Web role section under "ServiceDefinition.csdef" will be look like below.

  <WebRole name="WebRole1" vmsize="Small">
    <Startup>
      <Task commandLine="Startup\startup.cmd" executionContext="elevated" taskType="simple" />
    </Startup>
    ....
  </WebRole>
It's done. Just build your solution and deploy. The web role will no longer automatically shut down or recycle.

Clear the Dead Letter messages from an Azure Service Bus Queue

Sometimes we may need to read the messages from the Dead Letter Queue to know which messages were not processed, and we may re-queue them if required. In such situations, we can read from the Dead Letter Queue of an Azure Service Bus Queue in the same way we connect to the normal queue, but we need to append "/$DeadLetterQueue" to the queue name.
Following is the code snippet for reading / clearing the messages from Dead Letter Queue.
using Microsoft.ServiceBus.Messaging;

 string serviceBusConnString = "****** Your Service bus Connection String ******";
 string queueName = "Your Queue name" + "/$DeadLetterQueue";  // for dead letter queue.

 QueueClient client = QueueClient.CreateFromConnectionString(serviceBusConnString, queueName, ReceiveMode.PeekLock);
 BrokeredMessage deadMessage;
 while ((deadMessage = client.Receive(TimeSpan.FromSeconds(5))) != null)
 {
     //do something here with the message if required.
     deadMessage.Complete();
 }
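The dead-letter path convention used above (appending "/$DeadLetterQueue" to the queue name) can be captured in a tiny helper. Illustrative Python mirroring the C# snippet; the queue name is a placeholder:

```python
def dead_letter_path(queue_name: str) -> str:
    """Return the dead-letter sub-queue path for a Service Bus queue."""
    return f"{queue_name}/$DeadLetterQueue"

print(dead_letter_path("orders-queue"))  # orders-queue/$DeadLetterQueue
```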

Writing and Reading Messages in Azure Queue

You can send a message to a queue or read all the messages from an Azure Service Bus queue using the following code blocks.

Sending Message to Azure Queue

using Microsoft.ServiceBus.Messaging;
private async Task SendMessageToQueue(string message)
{
     string serviceBusConnString = "****** Your Service bus Connection String ******";

     string queueName = "Your Queue name";

     QueueClient client = QueueClient.CreateFromConnectionString(serviceBusConnString, queueName, ReceiveMode.PeekLock);

     BrokeredMessage brokeredMessage = new BrokeredMessage(message);

     brokeredMessage.TimeToLive = new TimeSpan(3, 0, 0);

      await client.SendAsync(brokeredMessage);

}

Reading all the Message from Azure Queue

using Microsoft.ServiceBus.Messaging;

private async Task ReadMessagesFromQueue()
{

     string serviceBusConnString = "****** Your Service bus Connection String ******";

     string queueName = "Your Queue name";

     QueueClient client = QueueClient.CreateFromConnectionString(serviceBusConnString, queueName, ReceiveMode.PeekLock);
     if (client != null)
       {

           BrokeredMessage brokeredMessage = null;
           while (true)
              {

                 try
                    {

                       //receive messages from Queue

                       brokeredMessage = await client.ReceiveAsync();

                       if (brokeredMessage != null)
                        {

                          Console.WriteLine(string.Format("Message received: Id = {0}, Body = {1}", brokeredMessage.MessageId, brokeredMessage.GetBody<string>()));

                          // Further custom message processing could go here…

                          brokeredMessage.Complete();

                        }

                        else
                        {
                            //no more messages in the queue
                            break;
                        }
                    }
                  catch (MessagingException e)
                    {

                       if (!e.IsTransient)
                        {
                            Console.WriteLine(e.Message);
                            throw;
                        }
                        else
                        {
                            Console.WriteLine(e.Message);
                            Console.WriteLine("Transient error occured. Will retry in 2 seconds");
                            Thread.Sleep(2000);
                        }
                    }
                }

                client.Close();
            }

        }

Error installing Azure SDK in Visual Studio 2015 Community Edition.

Today I tried to install the Azure SDK in Visual Studio 2015 Community Edition through the Web Platform Installer and got the following error:

"Azure Storage Emulator requires LocalDb to be installed. You must enable the Microsoft SQL Server Data Tools (SSDT) feature in Visual Studio 2015 to install Local Db before proceeding."










The reason for this issue is that the SSDT tools were not installed at the time of installing Visual Studio.

The solution is
  • Open Control panel -> Programs and features
  • Select Microsoft Visual Studio
  • Right click on it and Select Change
  • Click on Modify
  • Check "Microsoft SQL Server Data Tool" under "Windows and Web Development"
  • Click Update
That's it :) Happy Coding.