Azure Durable Functions – Internals

Durable Functions is an extension of Azure Functions for developing stateful applications. If you are building stateful microservices, Durable Functions is an apt choice to consider.

In this post, we will walk through Durable Functions using the function chaining pattern. We will develop a durable function that performs three activities in order. I have used this sample as part of my CodePaLOUsa and Boston Code Camp sessions.

If you have developed with BizTalk Server, you may be familiar with the term orchestration; Azure Durable Functions uses a similar concept. An orchestration is a workflow that can perform multiple activities, and an orchestration client is required to invoke it. In this sample we use an HTTP-based orchestration client.
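To make the chaining pattern concrete, an orchestrator of this kind looks roughly like the following (a sketch against the Durable Functions v1 API; the activity names are illustrative and not necessarily those used in the sample repository):

[code language="csharp"]
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class OrderProcessingSequence
{
    [FunctionName("OrderProcessingSequence")]
    public static async Task<string> Run(
        [OrchestrationTrigger] DurableOrchestrationContext context)
    {
        var order = context.GetInput<string>();

        // Function chaining: each activity runs only after the previous
        // one completes, and receives its output as input.
        var validated = await context.CallActivityAsync<string>("ValidateOrder", order);
        var processed = await context.CallActivityAsync<string>("ProcessOrder", validated);
        var receipt = await context.CallActivityAsync<string>("SendEmail", processed);

        return receipt;
    }
}
[/code]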


The complete code sample for this example can be found in the Git repository below:

https://github.com/baskar3078/durableFunctionSample

In this sample, the orchestration client is invoked through HTTP; its code lives in HttpStart.cs. The function takes an HTTP request, defines a durable orchestration client, and invokes the function "OrderProcessingSequence", which is our orchestration.
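Its shape is close to the standard Durable Functions HTTP starter sample (a sketch; the exact signature in HttpStart.cs may differ slightly):

[code language="csharp"]
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;

public static class HttpStart
{
    [FunctionName("HttpStart")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "orchestrators/{functionName}")] HttpRequestMessage req,
        [OrchestrationClient] DurableOrchestrationClient starter,
        string functionName,
        TraceWriter log)
    {
        // Read the request payload and start a new orchestration instance.
        dynamic eventData = await req.Content.ReadAsAsync<object>();
        string instanceId = await starter.StartNewAsync(functionName, eventData);

        log.Info($"Started orchestration with ID = '{instanceId}'.");

        // Responds with 202 Accepted plus the instance management URLs.
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}
[/code]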

The following URL is used to invoke the orchestration:

http://localhost:7071/api/orchestrators/OrderProcessingSequence


To debug the solution locally, provide appropriate values for the two configuration settings "AzureWebJobsStorage" and "AzureWebJobsDashboard" in local.settings.json.
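A minimal local.settings.json looks like the following (the connection strings are placeholders for your own storage account):

[code language="javascript"]
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>",
    "AzureWebJobsDashboard": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
  }
}
[/code]

On debugging the solution with these values in place, we can observe the following changes taking place in Azure for the given storage account.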

Exploring the blobs, we can see new containers being created: one specific to regular Azure WebJobs hosts and another for the Durable Functions hub leases.


Further exploring the Queues under the storage account, we can see several queues created in the background. Our sample has three activities; of the five queues created, my guess is that one is specific to the orchestration, three correspond to the three activities, and, as its name suggests, the remaining queue is used while the Durable Functions runtime is working on a specific item.


Exploring the Table service under the storage account, we can see a new table being created. This table is used to maintain the state of the application and to query its status.


When we invoke the orchestration using Postman, we get back three URLs: "statusQueryGetUri" to query the status of the request, "sendEventPostUri" to post events to the orchestration, and "terminatePostUri" to terminate the orchestration instance. These URLs are used to manage the state of the durable function.
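The orchestration client returns these URLs in the body of its 202 Accepted response, roughly like the following (the instance ID and the exact webhook paths are placeholders and vary by runtime version):

[code language="javascript"]
{
  "id": "<instanceId>",
  "statusQueryGetUri": "http://localhost:7071/...",
  "sendEventPostUri": "http://localhost:7071/.../raiseEvent/{eventName}...",
  "terminatePostUri": "http://localhost:7071/.../terminate?reason={text}..."
}
[/code]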


In the example above, we posted a blank email address. The third activity, which processes and sends the email, throws an exception when no email address is provided in the input request. We can check the status of the request using statusQueryGetUri.
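The guard in that activity might look roughly like this (a sketch; the actual implementation is in the repository linked above, and the names here are illustrative):

[code language="csharp"]
using System;
using Microsoft.Azure.WebJobs;

public static class SendEmail
{
    [FunctionName("SendEmail")]
    public static string Run([ActivityTrigger] string emailAddress)
    {
        // Fail fast when no email address was provided in the input request.
        if (string.IsNullOrWhiteSpace(emailAddress))
            throw new ArgumentException("No email address was provided in the input request.");

        // ... process and send the email ...
        return $"Email sent to {emailAddress}";
    }
}
[/code]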


If the orchestration executes successfully, the runtime status will be Completed, and we can retrieve both the input and output using statusQueryGetUri.
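A successful status query returns JSON along these lines (the values shown are illustrative):

[code language="javascript"]
{
  "runtimeStatus": "Completed",
  "input": "<the orchestration input>",
  "output": "<the orchestration output>",
  "createdTime": "2017-12-01T05:00:00Z",
  "lastUpdatedTime": "2017-12-01T05:00:05Z"
}
[/code]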


In my next blog post, we will explore sending events and developing timer activities using Durable Functions.

References:

Thanks to the Azure Functions team for publishing the durable functions sample below, which I used to develop this example.

https://docs.microsoft.com/en-us/azure/azure-functions/durable-functions-sequence


Using BenchmarkDotNet for Performance Benchmarking

This article is my second post for the C# Advent Calendar. Many of you may already be using the BenchmarkDotNet NuGet package to measure the performance of your .NET application code; it is used by many projects to run performance benchmarks.

I first came across BenchmarkDotNet while working with Robert on one of his GitHub projects (memstate); that was my first pull request on GitHub, in which I implemented a surrogate converter for JSON serialization. We had two different ways to implement the surrogate converter: we knew one was weak from a performance perspective while the other looked better. When I asked Robert which option to proceed with, he suggested taking either one, since they would run benchmarks later. That was my introduction to BenchmarkDotNet, and it made me want to explore the package further.

There is good documentation on how to use the package, along with guidance on best practices.

For my sample test, I created a class library with the code below. Note that the BenchmarkDotNet NuGet package must be added to both the class library and the console application.

using System;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using BenchmarkDotNet.Environments;
using BenchmarkDotNet.Jobs;

namespace BenchMarkTest
{
    [Config(typeof(Config))]
    [MemoryDiagnoser]
    public class TestClass1
    {
        private class Config : ManualConfig
        {
            public Config()
            {
                // CLR job: legacy JIT, x64, dry run mode.
                Add(new Job(EnvMode.LegacyJitX64, EnvMode.Clr, BenchmarkDotNet.Jobs.RunMode.Dry)
                {
                    Env = { Runtime = Runtime.Clr },
                    Run = { LaunchCount = 3, WarmupCount = 5, TargetCount = 10 },
                    Accuracy = { MaxRelativeError = 0.01 }
                });

                // CLR job: legacy JIT, x86, dry run mode.
                Add(new Job(EnvMode.LegacyJitX86, EnvMode.Clr, BenchmarkDotNet.Jobs.RunMode.Dry)
                {
                    Env = { Runtime = Runtime.Clr },
                    Run = { LaunchCount = 3, WarmupCount = 5, TargetCount = 10 },
                    Accuracy = { MaxRelativeError = 0.01 }
                });
            }
        }

        [Benchmark]
        public void TestMethod()
        {
            Console.WriteLine("This is a benchMark test");
        }
    }
}

The Benchmark attribute is applied to the method whose performance we want to measure. I have also defined the details of the benchmark jobs as part of the configuration.

In my sample, I am benchmarking code against the CLR, which in my case is .NET Framework 4.6, and measuring performance under both the x86 and x64 configurations. We can also specify the LaunchCount, WarmupCount, and TargetCount to be used as part of the benchmark.

Using the MemoryDiagnoser attribute, we can also measure the memory allocated by the method.

There are different run modes, or run strategies, you can select, such as cold start, monitoring, or throughput. Additional details on choosing an appropriate strategy can be found in the BenchmarkDotNet documentation.
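As a lighter-weight alternative to a manual config, a run strategy can be chosen with the SimpleJob attribute (a sketch; the attribute parameters may differ slightly between BenchmarkDotNet versions, and the class and method names here are illustrative):

[code language="csharp"]
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Engines;

// ColdStart measures each invocation in a fresh state, which suits
// startup-cost scenarios; Throughput is the default strategy.
[SimpleJob(RunStrategy.ColdStart, launchCount: 3)]
public class ColdStartBenchmarks
{
    [Benchmark]
    public void Startup()
    {
        // the code whose cold-start cost you want to measure
    }
}
[/code]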

To run the benchmark, we need to add a console application. Below is the code for my console application, which runs the benchmark against TestClass1.

using System;
using BenchmarkDotNet.Running;
using BenchMarkTest;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            // Runs every [Benchmark] method defined on TestClass1.
            BenchmarkRunner.Run<TestClass1>();
            Console.ReadLine();
        }
    }
}

BenchmarkRunner.Run initiates the benchmark against the class whose performance we want to measure.

It is recommended to run benchmarks in Release mode; BenchmarkDotNet displays a warning when a benchmark is run in Debug mode.


So I made sure the build configuration was Release and ran a benchmark from Visual Studio by executing my console application.

Under the console application's bin\Release folder, a benchmark artifacts folder is created containing the results and the benchmark summary.


The results are exported in CSV and HTML formats, and a summary is generated.

In these results I could see that an attached debugger was being used, which is not the recommended approach. The best way, per the documentation, is to launch the console application from a command prompt in the Release folder.


I then re-ran the benchmark by launching the console application from the command prompt.

In those results, "us" stands for microseconds.

I am still exploring the package fully. Some of the attributes mentioned in the documentation, such as MinColumn and MaxColumn, did not work for me with the new package; maybe I missed something, and I will explore this further.

Thanks to Matthew Groves for allowing me to sign up for two slots in the C# Advent Calendar. Follow the further posts in the advent calendar here.

Click here to find the documentation on BenchmarkDotNet.


Quick Actions in Visual Studio 2017

Quick Actions in Visual Studio 2017 provide suggestions to refactor code or fix code style.

I wanted to explore Quick Actions further and see how they can be helpful. The Microsoft documentation below provides additional details on the Visual Studio settings that customize Quick Actions.

https://docs.microsoft.com/en-us/visualstudio/ide/code-styles-and-quick-actions

I created a console application using the .NET Core project template. Visual Studio creates the required project files with the following code in Program.cs:

using System;

namespace TestCoreConsoleApp
{
    class Program
    {
        static void Main(string[] args)
        {
            Console.WriteLine("Hello World!");
        }
    }
}

On placing the cursor over the method, the Quick Actions icon is shown as a yellow bulb. Clicking the bulb offers two options to refactor the code:

  1. Change the method signature.
  2. Use an expression body for the method.


Clicking the second option updates the code as below.

static void Main(string[] args) => Console.WriteLine("Hello World!");

This style can be used when the method contains only one line of code.

Placing the cursor over the converted line again brings up a Quick Action tip suggesting we change the method back to use a block body.


If we convert back to a block body at this point, the Quick Action tip again suggests using an expression body; the suggestions form a cycle.

I then tweaked the text editor code-style settings in Visual Studio (under Tools > Options > Text Editor > C# > Code Style) so that expression-bodied methods are preferred.


After making this change, I noticed three dots (…) under the method, indicating a suggestion. The suggestion appears only when the method contains a single line.


The suggestion disappears when we start typing a new line in the Main method.


This way, you can tweak the text editor settings according to your own coding standards. This is my blog post for the C# Advent Calendar; thanks to Matt Groves (@mgroves) for asking me to sign up for it.

You can follow the other posts in the C# Advent Calendar here.


Thought on Source Control for Azure Functions

A question I frequently see in forums is how to use source control for Azure Functions developed in the Azure portal.

Source control is currently supported for Azure Functions developed with Visual Studio 2017, in the same way as for any common .NET application.

When a function is developed in the Azure portal, however, there is no direct way to integrate and deploy changes from source control to the Azure Functions runtime. We can achieve source control for portal-developed functions with a few workarounds.

Every Azure function app running on the Azure platform requires a storage account to store the artifacts corresponding to its functions. Below are the steps I would use to solve the source control issue.

  1. Following the regular steps of function creation, I created a function HttpTriggerCSharp2 in the portal. Note the storage account configured while creating the function app.
  2. Navigate to the storage account created in the previous step and select Files in the overview window.


3. Select the file share name displayed under the File service and navigate to site/wwwroot.

4. This would display all the functions available in the function app.

5. Navigating to the HttpTriggerCSharp2 folder displays the run.csx file, which the portal uses to define the function.


I would use the options below to move changed code from source control to the Azure file share. The key is to note the file reference URLs and use them in your scripts or application.

a. Write a custom PowerShell script that uploads the file to Azure using the Azure File storage API.

b. Write a custom application or tool that reads the file from source control and pushes the change to Azure using the REST API (a minimal sketch follows the reference link below).

The URL below provides a reference for the File Service REST API.

https://docs.microsoft.com/en-us/rest/api/storageservices/file-service-rest-api
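As a starting point for option (b), here is a hedged sketch using the Azure Storage .NET SDK (the Microsoft.WindowsAzure.Storage NuGet package) instead of raw REST calls. The connection string, share name, and local path are placeholders; take the actual share and folder names from the file reference URLs noted above.

[code language="csharp"]
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.File;

public static class FunctionFileUploader
{
    public static void PushRunCsx()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var client = account.CreateCloudFileClient();

        // Walk down to site/wwwroot/<function-name> inside the function app's share.
        var share = client.GetShareReference("<function-app-share>");
        var functionDir = share.GetRootDirectoryReference()
            .GetDirectoryReference("site")
            .GetDirectoryReference("wwwroot")
            .GetDirectoryReference("HttpTriggerCSharp2");

        // Overwrite run.csx with the version pulled from source control.
        var file = functionDir.GetFileReference("run.csx");
        file.UploadFromFile(@"C:\source\HttpTriggerCSharp2\run.csx");
    }
}
[/code]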

With this thought, my next article will focus on one of these approaches (a or b), and I shall share my experiences.

Controlling Azure Functions Invocation by Azure API Management

Azure Functions provides the ability to invoke a function using an HTTP trigger. A function can be invoked directly using its function URL, which includes the secure key as part of the URL.

A typical function URL has the following format:

https://functionappname.azurewebsites.net/api/FunctionName?code=key

It is not recommended to share this secure code for invoking functions in a production environment. This can be controlled by using Azure API Management, where clients are given a different URL instead of the function URL.

Using Azure API Management provides benefits such as the following:

  • Implement throttling to control the number of requests.
  • Control access by subscription IDs and keys.
  • Perform analytics on the consumption of the API.
  • Extract reports to understand consumption behavior.
  • Configure security to control access to the API.

Below are the steps used to invoke an Azure function from Azure API Management.

In this scenario, I created a simple HTTP-triggered function from the default template. Verify the function by triggering it and checking that a proper response is received; the test window in Azure Functions can be used for this.


Once the function is verified, use the steps below to create an Azure API Management instance. Log in to portal.azure.com with your Azure account and click New in the portal.


Click Create and provide all mandatory values to create the API Management instance.


Provide a valid administrator email and select an appropriate pricing tier; I selected the Developer tier. The administrator email is given developer access by default.

It takes around 10 minutes for the instance to activate before the API Management overview can be accessed. Azure API Management by default has two portals:

  1. Publisher Portal
  2. Developer Portal

The publisher portal is used to develop APIs and operations and to define policies; this is where all the development takes place.

The developer portal is used to test the published APIs and to inspect their trace logs.

Open the publisher portal and create an API. An API can be created either by importing a definition or by selecting Add API. Since the Azure function we are using does not expose a Swagger definition, I selected the Add API option.


Define the API by providing an API name and description. The web service URL mentioned here corresponds to the host portion of the function URL:

https://functionappname.azurewebsites.net/

Provide a value for the Web API URL suffix and select HTTPS as the URL scheme. This step also shows the URL that external clients will use to consume the API.


Once the API is created, the next step is to create an operation in it. The operation defines the HTTP verb, the URL templates, and the request and response formats.


The URL template defines the actual REST API operation that clients will use.

The rewrite URL template transforms the incoming request into the actual function URL, including the secure code needed to invoke the function; for example, a URL template of /api/HttpTriggerCSharp1 can be rewritten to /api/HttpTriggerCSharp1?code=<function-key>. These secure codes are never visible to the clients consuming the API.

The next step is to define the request body and the responses returned by the API.

Provide a description and click Add Representation. Set the content type to application/json and paste the sample request format that the API will accept.


I configured the API to return responses with status 200 and 400. For each HTTP status code, add a representation with content type application/json and provide a sample response; the response format should match the format returned by the Azure function.


Save the operation changes.

To publish the API and make it visible externally, the API must be mapped to a product.

By default, Azure provides two products: Starter and Unlimited.


In my case, I added the API to the Starter product by clicking Add API to Products. Developers are usually given access to the Starter product, since it controls how often the function can be executed. We can also create our own products.

After adding the API to the Starter product, click the Starter entry. This navigates to the product screen, which displays the product's settings and the other APIs that are part of it.


To consume the API in the developer portal, the product must be published; on publishing, its status changes to Published. The subscribers tab shows the users who have access to the product's APIs, and we can also control the visibility of the API among users and groups.

Policies can be defined in the publisher portal against both the product and the API; that is not covered in this example, and for now I have not defined any policies.

We can test the API by navigating to the developer portal.


Here we can see the API displayed in the developer portal. Since I am the administrator in this case, I have access to the default products. A user testing the API with an account other than the administrator account would need to request a subscription first; once subscribed, they can test the API.

Click the API you want to test. This navigates to a screen displaying all of the API's metadata, along with the request and response formats and sample code for consuming it from different clients.


Click the Try It button to test the API. After providing valid values in the request, click the Send button; this invokes the API and returns the response from the back-end service, which in this case is the Azure function URL.

Below is the raw format of the request posted from the Try It window. To access an API exposed by Azure API Management, the Ocp-Apim-Subscription-Key header must be passed with the request. Setting the Ocp-Apim-Trace header to true enables the trace tab in the response.

[code language="text"]
POST https://btest.azure-api.net/testazfunction/api/HttpTriggerCSharp1 HTTP/1.1
Host: btest.azure-api.net
Content-Type: application/json
Ocp-Apim-Trace: true
Ocp-Apim-Subscription-Key: ••••••••••••••••••••••••••••••••

{
  "name": "test"
}
[/code]
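For comparison, issuing the same request from C# with HttpClient looks roughly like this (the URL and subscription key are placeholders):

[code language="csharp"]
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ApiClient
{
    static async Task Main()
    {
        using (var client = new HttpClient())
        {
            // API Management authenticates callers via the subscription key header.
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<subscription-key>");

            var body = new StringContent("{ \"name\": \"test\" }", Encoding.UTF8, "application/json");
            var response = await client.PostAsync(
                "https://btest.azure-api.net/testazfunction/api/HttpTriggerCSharp1", body);

            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
[/code]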

The response is displayed in the developer portal; clicking the trace tab provides a detailed trace of the request.


Inspecting the trace tab shows how the incoming API request is forwarded to the actual Azure Functions URL.

The trace below displays the from and to portions along with the other fields configured during operation setup. It also shows the groups that have access to the API and the product the API belongs to.

{ "configuration": { "api": { "from": "/testazfunction", "to": { "scheme": "https", "host": "btestfunction.azurewebsites.net", "port": 443, "path": "/", "queryString": "", "query": {}, "isDefaultPort": true }, "version": null, "revision": "1" }, "operation": { "method": "POST", "uriTemplate": "/api/HttpTriggerCSharp1" }, "user": { "id": "1", "groups": [ "Administrators", "Developers" ] }, "product": { "id": "starter" } } }

The rewrite URL defined in the operation is used to construct the actual URL to which the request is posted.

rewrite-uri (1 ms)
{
  "message": "Updated request URL per specified rewrite template.",
  "request": {
    "url": "https://btestfunction.azurewebsites.net/api/HttpTriggerCSharp1?code=XXYXYXUBIABSIASSASI"
  }
}

The API then uses this rewritten URL to post the incoming request to the Azure Functions endpoint.

{ "message": "Request is being forwarded to the backend service.", "request": { "method": "POST", "url": "<a href="https://btestfunction.azurewebsites.net/api/HttpTriggerCSharp1?code=YYJJBJGUOGGGGUGOOOG">https://btestfunction.azurewebsites.net/api/HttpTriggerCSharp1?code=YYJJBJGUOGGGGUGOOOG</a>", "headers": [ { "name": "Ocp-Apim-Subscription-Key", "value": "deefc6db0b1d405888051fe5ea8a26e3" }, { "name": "Content-Type", "value": "application/json" }, { "name": "X-Forwarded-For", "value": "13.85.23.48" } ] } }

Note that the domain name in the API URL differs from the domain of the actual Azure function URL.

Whenever the secure code of the Azure function changes, it must also be updated in the API's operations.