
In this third article, we’ll work through the steps needed to create an Azure Functions backend that serves up our dataset via an HTTP API.

Introduction

Welcome to this third article in a series demonstrating how to create a Blazor client application that incorporates a Telerik UI For Blazor Chart component.

If you haven’t already read the second article, that should be your starting point.

An objective of this series is to demonstrate how we could combine our client application with simple serverless function backend solutions.

In this article, we’ll be exploring how to use Azure Functions (sometimes shortened to just “Functions”). This is Microsoft’s version of the increasingly popular "serverless computing paradigm."

If you’re new to “serverless computing” and more accustomed to working with traditional websites/services where code is published to a host server, it’s useful to understand that with Azure Functions:

  • We don’t publish code directly to an “Azure Function Host Server.” Instead, we ultimately deploy it to a “storage account,” a highly available but otherwise fairly straightforward place to store files.
  • When a function is triggered, code is loaded into worker instances as demand requires. This is managed transparently.
  • CLI tooling simplifies this publication process, creating the illusion that we are simply publishing to an Azure Function “service.”

You can learn more about the subject here:

Note: Azure Functions supports deployments of both file-based packages and container images. In this article, we’ll be demonstrating a basic file-based deployment.


Series Contents

If you’re planning to follow along with this series, you will need to create a Blazor client application, following the instructions in Part 2. Having completed that step, we’ll then look at creating a backend API—in this article, we’ll be looking at creating that API using Azure Functions.

If you’re interested to learn how the experience compares between the cloud providers, a succinct comparison is in Part 1, and you may further be interested in reading all three of the platform-specific articles in this series.

What Are We Doing in This Article?

In this article, we’ll use Azure Functions to create a simple serverless HTTP API backend that responds to a request with a JSON dataset.

The Azure Functions runtime supports a number of languages including Node and Python. However, in this article, we will continue using .NET with C#, as we have previously with our Blazor client.

Azure Functions are event-driven, meaning that their code is triggered in response to different types of “bound event,” such as a timer-event or file-change in a storage account.
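
To make the idea of a “bound event” more concrete, here is a minimal, purely illustrative sketch of a function triggered by a file change in a storage account, written in the isolated worker style we’ll adopt later in this article. The function name and container name are invented for this example, and it assumes the Microsoft.Azure.Functions.Worker.Extensions.Storage.Blobs package is referenced:
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class FileChangeDemo
{
    // Runs whenever a blob is added or updated in the (hypothetical) "sample-container" container.
    [Function("FileChangeDemo")]
    public void Run(
        [BlobTrigger("sample-container/{name}")] string blobContent,
        string name,
        FunctionContext context)
    {
        var logger = context.GetLogger("FileChangeDemo");
        logger.LogInformation($"Blob '{name}' changed; read {blobContent.Length} characters.");
    }
}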

For our purposes, we’re interested in something generally referred to as an “HTTP-triggered function.”

Note: Azure architects HTTP-triggered functions in a way that is conceptually similar to AWS, meaning that we write our code using an “API Gateway” proxy class—specifically Microsoft.Azure.Functions.Worker.Http.HttpResponseData. In contrast, Google Cloud Functions (GCF) simply uses the conventional ASP.NET Core HttpContext class.
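
For contrast only (this is not code we’ll use in this project), a minimal Google Cloud Functions handler looks something like the sketch below, assuming the Google.Cloud.Functions.Framework package; the equivalent Azure shape, built around HttpRequestData/HttpResponseData, appears later in this article when we write our own function:
using System.Threading.Tasks;
using Google.Cloud.Functions.Framework;
using Microsoft.AspNetCore.Http;

public class DemoFunction : IHttpFunction
{
    // GCF hands us the conventional ASP.NET Core HttpContext directly.
    public async Task HandleAsync(HttpContext context) =>
        await context.Response.WriteAsync("Hello from Google Cloud Functions");
}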

In this article, we’re going to:

  • Create a project using the recently released .NET 6 combined with Azure Functions runtime v4.
  • Create code for an HTTP-triggered function that will respond to an HTTP-GET request:

    » The code will return a payload of JSON data.
    » For continuity with the other articles in this series, that JSON payload will be exactly the same dataset used previously.
    » We’ll retrieve the dataset from a remote source in GitHub.

  • Test our Azure Function locally.
  • Create resources in Azure.
  • Publish our function to Azure.
  • Reconfigure our Blazor+Telerik client application and re-test for a final result.

Note: Strictly speaking, the actual code that we are demonstrating in this article will work just fine with .NET Core 3.1—however, this demonstration uses .NET 6 so that readers can readily adapt the project using the latest language features.


Requirements for This Article

We’re going to expand upon the solution we developed in the second article of this series.

In addition to those previous requirements, we should now include the following:

  • An Azure account (needed to create cloud resources; a free tier is available).
  • The Azure CLI (the az command used later in this article).
  • The Azure Functions Core Tools v4 (the func command used to create, run and publish the function).

Azure Functions In-Process and Isolated-Process Worker

[Image: two construction workers (credit: Pixabay on Pexels)]

In addition to .NET, Azure Functions can also be written using several other languages (including Node, Python, etc).

Azure Functions uses the Azure Functions Runtime, running on the host infrastructure. To support languages other than .NET, a separate worker process is spawned, from which the function code is executed.

However, up until recently, Functions written using .NET could only be executed from within this runtime—the so-called “In-Process” model.

An impact of this architecture was that Functions were tightly coupled to the version of .NET available on the host. Until March 2021, this meant that even though .NET 5 had already been released, Azure Functions only supported .NET Core 3.1.

This is no longer the case:

Succinctly, this means that we can now run whichever .NET version we like (e.g., .NET 5 or .NET 6), or even previews of future versions of .NET.

In this article, we’re going to create an Azure Function using .NET 6. This means that we are presented with the choice of using either “in-process” or “isolated-process.”

To directly quote the announcement article, the reason for choosing one over the other would be:

“Choose in-process if you are upgrading from .NET Core 3.1 Azure Functions or if you are looking for features, such as Durable Functions, that are currently only available to in-process apps. Isolated function apps provide more control of the function worker process and greater compatibility with libraries.”

For the purpose of this article, we’re going to demonstrate the “isolated-process” option. This is a largely arbitrary choice, made mainly to demonstrate the marginally more flexible option.

From a developer’s perspective, using the isolated-process option requires us to initialize the Azure Functions project in a slightly different way. We’ll explore that workflow in the next section.
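
To illustrate the difference from a developer’s perspective, here is a minimal sketch of what an HTTP-triggered function looks like under the in-process model, i.e., the option we are not taking. Note how it uses the host’s ASP.NET Core types (HttpRequest, IActionResult) directly, whereas the isolated-process code we write later uses the worker’s HttpRequestData/HttpResponseData types. The function name here is just an example:
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class InProcessDemoFunction
{
    // In-process model: this code runs inside the Azure Functions host itself.
    [FunctionName("InProcessDemoFunction")]
    public static Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("In-process function invoked.");
        return Task.FromResult<IActionResult>(new OkObjectResult("Hello from an in-process function"));
    }
}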

Use the CLI To Generate a New Azure Function Project

If you haven’t already done so, you need to install the items identified in the Requirements section above before progressing.

Great—now we’re ready, let’s crack on with actually creating some code! Using our command prompt:

  • Navigate to the folder where, in the second article of this series, we previously created our .NET solution. If you were following along, we called it BlazorTelerikCloudDemo.
  • Inside this folder, create a new folder and call it AzureFunctionService and navigate into it.
  • Use the Azure Function CLI to create a new Azure Function project.
  • Use the CLI templating to add an HTTP Trigger function.
md AzureFunctionService
cd AzureFunctionService
func init --dotnet-isolated
func new --template "Http Trigger" --name HttpDemoFunction

The tooling will create a selection of project files for us.

  • (Optional) Next, for consistency, we’ll add the Azure Function project to the solution. We do this so that we have the option to run the entire solution from an IDE in the future:
cd ..
dotnet sln add AzureFunctionService

Let’s just quickly test that our system is working.

It may help to explain the architecture involved: Azure Functions require a runtime environment in order to be run and tested locally during development. This differs from comparable solutions from other cloud providers (for example, a Google Cloud Function project can be run directly using dotnet run).

The Azure Functions Core Tools that we installed are more than just a way to template code—they also provide this runtime. Because of this, we use the func command to run Functions locally.

  • Again in the console, enter this command (change back to the project folder if needed):

    » If prompted, accept the change to the firewall.

func start
  • When the service runs, it will list the functions that are available along with a local URL to be used.

    » By default, we should see the same local URL as below (it will be in green text).
    » Highlight the URL and copy it:

Functions:
        HttpDemoFunction: [GET,POST] http://localhost:7071/api/HttpDemoFunction
  • Open a web browser and paste the address. If everything is working, we should see the default message: “Welcome to Azure Functions!”

Modifying the Azure Function To Return a Dataset

Next, we want our Azure Function to return some data.

  • We’re going to maintain consistency with the second article in this series by returning a static JSON dataset containing some arbitrary data.
  • We’re going to retrieve the JSON dataset from a GitHub page.

    » If we want, we can view that data directly in our browser:
    https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json

  • We’re going to request the dataset from GitHub using an HttpClient … and then pass that dataset back as the response from our API.

Note: The purpose of this is so that we can learn from a really simple system. We can always layer in more complexity later. It’s completely reasonable to assume that we wouldn’t use this approach in a real system. Instead, it’s much more likely that we would use a separate storage account or even query a database.
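
As a taster of what that might look like, the sketch below shows one way to read the same JSON from a blob in our own storage account instead. It is purely illustrative and not part of this article’s project; the connection setting name, container name and blob name are invented, and it assumes the Azure.Storage.Blobs package is referenced:
using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

public class DatasetReader
{
    // Reads a JSON blob from a (hypothetical) "datasets" container and returns its content as a string.
    public async Task<string> GetDatasetJsonAsync()
    {
        var connectionString = Environment.GetEnvironmentVariable("DatasetStorageConnection");
        var blobClient = new BlobClient(connectionString, "datasets", "cloud-searchtrend-data.json");

        var download = await blobClient.DownloadContentAsync();
        return download.Value.Content.ToString();
    }
}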


Modifying the Function Startup Code

Excellent. Now that we have the dataset available, we can edit the code of the Azure Function.

  • In the file Program.cs, completely replace the code with the following:
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;
using System.Net.Http;

namespace AzureFunctionService
{
    public class Program
    {
        public static void Main()
        {
            var host = new HostBuilder()
                .ConfigureFunctionsWorkerDefaults()
                .ConfigureServices(s =>
                {
                    s.AddSingleton<HttpClient>();
                })
                .Build();

            host.Run();
        }
    }
}

The changes we just made in the above startup code were minimal.

In our function, we want to be able to make an HTTP request to a remote service (which happens to be a page on GitHub). To do that, we register an HttpClient with the dependency-injection container so that it can be injected into our function class.
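
Registering a single shared HttpClient as a singleton is perfectly adequate for this demonstration. As an optional alternative (a sketch only, assuming the Microsoft.Extensions.Http package is referenced), the same ConfigureServices block could register the IHttpClientFactory pattern instead, with the function class then injecting IHttpClientFactory and calling CreateClient():
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

namespace AzureFunctionService
{
    public class Program
    {
        public static void Main()
        {
            var host = new HostBuilder()
                .ConfigureFunctionsWorkerDefaults()
                .ConfigureServices(s =>
                {
                    // Alternative to AddSingleton<HttpClient>(): register IHttpClientFactory.
                    s.AddHttpClient();
                })
                .Build();

            host.Run();
        }
    }
}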

Modifying the Function Code to Return the Dataset

Next, we need to update the code of the Function itself:

  • Completely replace the code in the file HttpDemoFunction.cs with the following:
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

namespace AzureFunctionService
{
    public class HttpDemoFunction
    {
        private readonly HttpClient _httpClient;

        public HttpDemoFunction(HttpClient httpClient)
        {
            _httpClient = httpClient;
        }

        [Function("HttpDemoFunction")]
        public async Task<HttpResponseData> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req, FunctionContext context)
        {
            var remoteDataset = "https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json";
            var logger = context.GetLogger("HttpDemoFunction");
            var remoteResponse = await _httpClient.GetAsync(remoteDataset);

            if (remoteResponse.IsSuccessStatusCode)
            {
                var response = req.CreateResponse(HttpStatusCode.OK);
                response.WriteString(await remoteResponse.Content.ReadAsStringAsync());
                return response;
            }
            else
            {
                logger.LogError($"Fault getting remote dataset: {remoteResponse.StatusCode} {remoteResponse.ReasonPhrase}");
                var response = req.CreateResponse(HttpStatusCode.InternalServerError);
                return response;
            }
        }
    }
}

Firstly … yes! … by completely replacing all of our code, that does indeed mean that we didn’t actually need to use the CLI templating earlier. However, by going through that additional step, we:

  • Introduced the Azure Functions Core Tools (func) CLI
  • Tested that a baseline system worked in our development environment

Let’s review the various changes that we just made with the above code.

  • We’ve introduced the private member _httpClient to the class. This forms part of the code we use to make HttpClient available through dependency-injection.
  • We’ve also introduced a class constructor. This also forms part of the code needed to make HttpClient available through dependency-injection.
  • Looking at the Function method signature, we:

    » Made the method an async Task<HttpResponseData>—this is because we’ve introduced asynchronous code into the body of the method.
    » Changed AuthorizationLevel from Function to Anonymous. If we didn’t do this, when we published to the cloud, we would need to supply an authorization key as an additional URL parameter.
    » Removed the argument for "post"—as we are not going to make any HTTP POST requests (only a GET), this was redundant.

  • Looking at the Function method body, we:

    » Defined a hardcoded URL to a demonstration dataset. This is the JSON dataset stored on GitHub.
    » Used the FunctionContext to access the logger (made available to us this way, by default)—this is a way for our app to communicate problems.
    » Used the HttpClient to retrieve the dataset.
    » Created an HTTP response and returned it. (An optional content-type refinement is sketched just below.)
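
One small, optional refinement (the demo works without it): the success branch above returns the JSON body without declaring a content type. If we wanted to be explicit, we could set a header on the response before writing the body. This is a small tweak to the existing success branch, shown as a fragment:
var response = req.CreateResponse(HttpStatusCode.OK);
// Optional: declare the payload as JSON so clients don't have to guess.
response.Headers.Add("Content-Type", "application/json; charset=utf-8");
response.WriteString(await remoteResponse.Content.ReadAsStringAsync());
return response;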

Adding in a CORS Policy Locally

Later in this article, we’ll be slightly updating the Blazor application that we created in the second article.

The change we’re going to make is that instead of retrieving the JSON dataset from a local file, we’ll revise it to request the data from an HTTP endpoint (our new Azure Function).

If we don’t change anything, the problem we’re about to run into is a CORS policy issue.

Explaining Cross-Origin Resource Sharing (CORS) is outside the scope of this article, but very briefly: browsers implement security that, by default, prevents an application from retrieving data from origins other than the one that served it. You can learn more here:

Microsoft: Call a web API from ASP.NET Core Blazor (WASM)

To add a CORS policy to an Azure Function we need to do the following:

  • Using VS Code, edit the file local.settings.json.
  • Introduce a new section of configuration (leave the existing items) that looks like this:
{
    … existing items…,

  "Host": {
    "CORS": "*"
  }
}

In this case, we’ve used a wildcard entry to completely open up access. For security reasons, this is not at all recommended and is absolutely not something we should do in a real system (there, we would list only the specific origins that should be allowed, such as the address our Blazor app is served from). This step is intended as a convenience while learning.

Test: Modify the Blazor Client To Retrieve From the Locally Hosted Azure Function

Let’s test our partial progress:

  • Firstly, re-run the Azure Function using the CLI—just like we did earlier:
func start

Next, we’ll shift attention back to the Blazor application created in the second article:

  • Using VS Code, edit the file /Pages/Index.razor.
  • Within the @code section, update the OnInitializedAsync() method as below:

    » Pay extra attention to check that the URL (and in particular the port number) matches the one reported when you started the Function:

protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
}
  • With the Azure Function already running separately, start the Blazor application.
  • If everything has worked, we should see the Telerik Chart Component showing exactly as it did before (except that this time, behind the scenes, the data is now coming from a different source). If it doesn't, see the optional troubleshooting sketch below.
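
If the chart comes up empty at this point, the usual culprits are a mismatched port number or the CORS setting from the previous step. As a purely optional troubleshooting aid (a sketch, not part of the finished article code), the call could be wrapped to surface the failure in the browser console:
protected override async Task OnInitializedAsync()
{
    try
    {
        seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
    }
    catch (HttpRequestException ex)
    {
        // In Blazor WebAssembly, Console.WriteLine output appears in the browser's developer-tools console.
        Console.WriteLine($"Could not retrieve dataset: {ex.Message}");
    }
}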

Note: As a heads-up, we’ll return here again at the end of the article, to update the URL with the URL of the Function hosted in the cloud.


Create and Configure Azure Resources

[Image: coloured pens (credit: Jessica Lewis Creative on Pexels)]

Next, we’ll start moving our API service into the cloud. To do that, we first need to create some resources.

We can check the tooling versions using:

az --version
func --version

Note: In this article, we’ll be using the Azure CLI, but, if you prefer, using the web-based interface found in the Azure Portal is also a really good option.


Log into Azure CLI

  • From the command prompt, start by logging in like this (this command will open up a browser page where you can enter your Microsoft account credentials):
az login

Optional: Consider Using Other Azure Regions

When we create any Azure resources, we have a large choice of data centers from around the globe. Microsoft uses the term “Region” to describe their major data centers.

For the purpose of this article, we’ll show examples that use a region called “West Europe” (which happens to be in the Netherlands).

Normally, we should choose an Azure Region that is nearest to our customers. If we want to adapt the example in this article, we need to know what choices of “Azure Region” we have.

Azure Functions are not available in all regions, so check availability using the docs or the CLI command, below:

az functionapp list-consumption-locations

Create an Azure Resource Group

First, we create a “Resource Group.” Think of this as a “logical folder”—a way to keep various resources grouped together.

az group create --name BlazorTelerikCloudDemo-rg --location "westeurope"

Create an Azure Storage Account

Next, we create a “Storage Account.” This is what we use to store the deployable artifacts that we’ll build.

This next bit requires us to create our own storage account identifier, so you won’t be able to just cut and paste entire code snippets from the examples below.

The options we have for naming this resource are limited, so be aware that the name must be globally unique, between 3 and 24 characters, and contain only lowercase letters and numbers.

  • Enter the following command, substituting your own unique storage account name into the example below.
  • It may be helpful to make a note of the name you use, as you’ll need it again shortly (e.g., copy and paste it into a notepad, etc.).
  • If the command has been successful, you’ll receive a large JSON response detailing the resource that has been created.
az storage account create --name <YOUR_UNIQUE_STORAGE_NAME> --location "westeurope" --resource-group "BlazorTelerikCloudDemo-rg" --sku "Standard_LRS"

Create an Azure Function App

Lastly, we create a “Function App.” This represents the Azure Function App Service that we’ll use to host our function.

We’ll be using Consumption plan hosting, which means we are billed only when the function actually runs.

As with the “Storage Account” previously, we’ll need to create our own identifier.

  • Use the following command to provision an Azure Function app, substituting your own unique identifiers as needed:
az functionapp create --resource-group BlazorTelerikCloudDemo-rg --consumption-plan-location "westeurope" --runtime dotnet-isolated --functions-version 4 --name <YOUR_UNIQUE_FUNCTION_NAME> --storage-account <YOUR_UNIQUE_STORAGE_NAME>

Deploy the Azure Function App to Azure

  • Use the following command to publish our app to the cloud:
func azure functionapp publish <YOUR_UNIQUE_FUNCTION_NAME>

When the app has finished deploying, the CLI will return some confirmation information. Among the data, we can find the URL that we can use to invoke the function—it will look something like the following (it may be useful to cut and paste this into a notepad for later use):

https://<YOUR_UNIQUE_FUNCTION_NAME>.azurewebsites.net/api/httpdemofunction
  • Because this is a simple HTTP GET API, a straightforward way to test that it’s working is to visit that link in a browser.

Note: Behind the scenes, a .zip file containing our build and its dependencies gets copied to the storage account.

Note: If you’re accustomed to working with AWS, but new to Azure, it’s useful to note that you don’t need to separately provision an API Gateway or similar to have access to a basic endpoint—the provisioning of a scalable HTTP endpoint is automatically done for you. If we have enterprise-level requirements for a more advanced gateway, that still remains a separate option.


Adding in a CORS Policy to Azure

Just as we did earlier with the locally deployed version of our Function, in order for our Blazor client to be able to work with the Function published to Azure, we need to add a CORS policy.

As before, we should reiterate that for security reasons, this is not at all recommended and is absolutely not something we should do in a real system. This step is intended as a convenience while we are learning.

  • Use the following command to update the CORS policy configuration in the cloud:
az functionapp cors add -g BlazorTelerikCloudDemo-rg -n <YOUR_UNIQUE_FUNCTION_NAME> --allowed-origins "*"

Note: Alternatively, it’s also possible to add CORS rules using the Azure Portal.


Test: Modify the Blazor Client To Retrieve From Azure

Finally, let’s test our finished result.

Return back to the Blazor application that we created in the second article of this series:

  • Just like earlier, using VS Code, edit the file /Pages/Index.razor.
  • Within the @code section, update the OnInitializedAsync() method like this (modifying the URL to include your unique name as appropriate):
protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("https://<YOUR_UNIQUE_FUNCTION_NAME>.azurewebsites.net/api/httpdemofunction");
}
  • With the Azure Function now hosted in the cloud (and no longer running locally), start the Blazor application.
  • If everything has worked, we should see the Telerik Chart Component displaying exactly as before, this time with the data served from Azure.

Nothing Further? Remove the Azure Resources!

Once we’ve finished experimenting, it’s best not to leave resources hanging around (especially as we have undermined our CORS policy).

A quick way to clean everything is simply to delete the entire resource group, like this:

az group delete --name BlazorTelerikCloudDemo-rg

Wrapping Up

How Does the Azure Experience Compare?

If you’re curious to learn how my experience with Azure stacks up against AWS and Google Cloud, I wrote more about this in Part 1. You can also learn more specifics in Part 4 on AWS and Part 5 on Google Cloud.

Useful References

You may find it helpful to be aware of the following subjects. If you get stuck, here are some useful articles:

Acknowledgments

Thanks to Mandy Mowers & Layla Porter for the overall document review.

Disclosure

I always disclose my position, association or bias at the time of writing. As an author, I received a fee from Progress Telerik in exchange for the content found in this series of articles. I have no association with Microsoft, Amazon or Google—this series of articles is in no way intended to offer a biased opinion or recommendation as to your eventual choice of cloud provider.


About the Author

Jim McG

A UK-based freelance developer using largely Microsoft technologies, Jim McG has been working with web and backend services since the late ‘90s. Today, he generally gravitates around cloud-based solutions using C#, .NET Core and Azure, with a particular interest in serverless technologies. Most recently, he's been transitioning across to Unity and the amazing tech that is Augmented Reality.
