In this third article, we’ll work through the steps needed to create an Azure Functions backend that serves up our dataset via an HTTP API.
Welcome to this third article in a series demonstrating how to create a Blazor client application that incorporates a Telerik UI For Blazor Chart component.
If you haven’t already read the second article, that should be your starting point.
An objective of this series is to demonstrate how we could combine our client application with simple serverless function backend solutions.
In this article, we’ll be exploring how to use Azure Functions (sometimes shortened to just “Functions”). This is Microsoft’s version of the increasingly popular "serverless computing paradigm."
If you’re new to “serverless computing” and more accustomed to working with traditional websites/services where code is published to a host server, it’s useful to understand that with Azure Functions there is no server for us to manage: we simply publish our code, and the platform provisions, scales and bills the underlying infrastructure on demand.
You can learn more about the subject here:
Note: Azure Functions support deployments of both file-based packages and container images. In this article, we’ll be demonstrating using a basic file-based deployment.
If you’re planning to follow along with this series, you will need to create a Blazor client application, following the instructions in Part 2. Having completed that step, we’ll then look at creating a backend API—in this article, we’ll be looking at creating that API using Azure Functions.
If you’re interested to learn how the experience compares between the cloud providers, a succinct comparison is in Part 1, and you may further be interested in reading all three of the platform-specific articles in this series.
In this article, we’ll use Azure Functions to create a simple serverless HTTP API backend that responds to a request with a JSON dataset.
The Azure Functions runtime supports a number of languages including Node and Python. However, in this article, we will continue using .NET with C#, as we have previously with our Blazor client.
Azure Functions are event-driven, meaning that their code is triggered in response to different types of “bound event,” such as a timer-event or file-change in a storage account.
For our purposes, we’re interested in something generally referred to as an “HTTP-triggered function.”
Note: Azure architects HTTP-triggered functions in a way that is conceptually similar to AWS, meaning that we write our code using an “API Gateway” proxy class, specifically Microsoft.Azure.Functions.Worker.Http.HttpResponseData. In contrast, GCF simply uses the conventional ASP.NET HttpContext class.
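To make that concrete, here is a minimal sketch (our own illustrative example, not code taken from this series) of the shape an isolated-worker HTTP-triggered function takes. The class and function names are hypothetical; the point is simply that the request and response are represented by the HttpRequestData and HttpResponseData proxy types:

using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;

public class HelloFunction
{
    // "HelloFunction" is an arbitrary name used only for this illustration.
    [Function("HelloFunction")]
    public HttpResponseData Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req)
    {
        // The proxy objects stand in for the raw HTTP request and response.
        var response = req.CreateResponse(HttpStatusCode.OK);
        response.WriteString("Hello from an HTTP-triggered function");
        return response;
    }
}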
In this article, we’re going to:
» The code will return a payload of JSON data.
» For continuity with other articles in this series, that JSON payload will be exactly the same dataset.
» We’ll retrieve the dataset from a remote source in GitHub.
Note: Strictly speaking, the actual code that we are demonstrating in this article will work just fine with .NET Core 3.1—however, this demonstration uses .NET 6 so that readers can readily adapt the project using the latest language features.
We’re going to expand upon the solution we developed in the second article of this series.
In addition to those previous requirements, we should now include the following:
» We can sign up for a free introductory account from Microsoft Azure.
» Microsoft: What is the Azure CLI?
» Microsoft: How to install the Azure CLI
» This article was written and tested using v2.30.0 (thirty).
» Microsoft: What is the Azure Functions Core Tools?
» Microsoft: How to install the Azure Functions Core Tools
» This article was written and tested using tools v4.0.3971.
Image credit: Pixabay on Pexels
In addition to .NET, Azure Functions can also be written using several other languages (including Node, Python, etc).
Azure Functions run on the Azure Functions Runtime, which in turn runs on the host infrastructure. To support languages other than .NET, a separate worker process is spawned, from which the function code is executed.
However, up until recently, Functions written using .NET could only be executed from within this runtime—the so-called “In-Process” model.
An impact of this architecture was that Functions were tightly coupled to the version of .NET available on the host. Up until March 2021, this meant that even though .NET 5 had already been released, Azure Functions only supported .NET Core 3.1.
This is no longer the case:
» With .NET 6 representing an LTS version, it was included as an “in-process” runtime option.
» This now provides an option to decouple Functions written with .NET from the host by using a separate worker process, in a similar way to the other, non-.NET, languages mentioned previously.
Succinctly, this means that today we can run whatever .NET version we like (for example .NET 5), or even previews of future versions of .NET.
In this article, we’re going to create an Azure Function using .NET 6. This means that we are presented with the choice of using either “in-process” or “isolated-process.”
To directly quote the announcement article, the reason for choosing one over the other would be:
“Choose in-process if you are upgrading from .NET Core 3.1 Azure Functions or if you are looking for features, such as Durable Functions, that are currently only available to in-process apps. Isolated function apps provide more control of the function worker process and greater compatibility with libraries.”
For the purpose of this article, we’re going to demonstrate the “isolated-process” option. This is an arbitrary choice that has mainly been made so as to demonstrate the marginally more flexible option.
From a developer’s perspective, using the isolated-process option requires us to initialize the Azure Functions project in a slightly different way. We’ll explore that workflow in the next section.
If you haven’t already done so, you need to install the items identified in the Requirements section above before progressing.
Great—now we’re ready, let’s crack on with actually creating some code! Using our command prompt:
» Navigate to our solution folder BlazorTelerikCloudDemo.
» Create a new folder AzureFunctionService and navigate into it:

md AzureFunctionService
cd AzureFunctionService

» Initialize a new isolated-process Functions project, then add a new function from the "Http Trigger" template:

func init --dotnet-isolated
func new --template "Http Trigger" --name HttpDemoFunction
The tooling will create a selection of project files for us.
» Navigate back to the solution folder and add the new project to the solution:

cd ..
dotnet sln add AzureFunctionService
Let’s just quickly test that our system is working.
It may help to explain the architecture involved: Azure Functions require a runtime environment in order to test them locally during development. This differs from comparable solutions from other cloud providers (for example, a Google Cloud Function project can be run directly using dotnet run).
The Azure Functions Core Tools that we installed are more than just a way to template code: they also provide this runtime. Because of this, we use the func command to run Functions locally.
» Start the local Functions runtime:

func start

» If prompted, accept the change to the firewall.
» By default, we should see the same local URL as below (it will be shown in green text).
» Highlight the URL and copy it:
Functions:
HttpDemoFunction: [GET,POST] http://localhost:7071/api/HttpDemoFunction
Next, we want our Azure Function to return some data.
» If we want, we can view that data directly in our browser: https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json
Note: The purpose of this is so that we can learn from a really simple system. We can always layer in more complexity later. It’s completely reasonable to assume that we wouldn’t use this approach in a real system. Instead, it’s much more likely that we would use a separate storage account or even query a database.
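As an illustration of that point, reading the same JSON from an Azure Storage account might look something like the sketch below, using the Azure.Storage.Blobs client library. This is not part of the walkthrough in this article, and the connection string, container name and blob name are all placeholder values:

using System.Threading.Tasks;
using Azure.Storage.Blobs;

public static class DatasetReader
{
    // Hypothetical helper: read the JSON dataset from blob storage instead of GitHub.
    // The container name "datasets" and the blob name are placeholders.
    public static async Task<string> ReadAsync(string connectionString)
    {
        var blobClient = new BlobClient(connectionString, "datasets", "cloud-searchtrend-data.json");
        var download = await blobClient.DownloadContentAsync();
        return download.Value.Content.ToString();   // the blob content, as a JSON string
    }
}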
Excellent. Now that we have the dataset available, we can edit the code of the Azure Function.
» In the file Program.cs, completely replace the code with the following:

using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;
using System.Net.Http;

namespace AzureFunctionService
{
    public class Program
    {
        public static void Main()
        {
            var host = new HostBuilder()
                .ConfigureFunctionsWorkerDefaults()
                .ConfigureServices(s =>
                {
                    s.AddSingleton<HttpClient>();
                })
                .Build();

            host.Run();
        }
    }
}
The changes we just made in the above startup code were minimal.
In our function, we want to be able to make an HTTP request to a remote service (this happens to be a page in GitHub). To do that, we need to have an HttpClient available, so we registered HttpClient as a singleton with the dependency-injection container.
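As an aside, a single shared HttpClient is fine for this demonstration, but if you prefer the IHttpClientFactory pattern, the registration could instead look like the sketch below. This is our own variation rather than part of the article's walkthrough, and it may require adding the Microsoft.Extensions.Http NuGet package if it isn't already available transitively; the function class is unaffected because it still receives an HttpClient through its constructor.

using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.DependencyInjection;

namespace AzureFunctionService
{
    public class Program
    {
        public static void Main()
        {
            var host = new HostBuilder()
                .ConfigureFunctionsWorkerDefaults()
                .ConfigureServices(s =>
                {
                    // Registers IHttpClientFactory, plus a transient HttpClient
                    // resolved through it, instead of a single shared instance.
                    s.AddHttpClient();
                })
                .Build();

            host.Run();
        }
    }
}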
Next, we need to update the code of the Function itself:
» Replace the entire code of HttpDemoFunction.cs with the following:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;

namespace AzureFunctionService
{
    public class HttpDemoFunction
    {
        private readonly HttpClient _httpClient;

        public HttpDemoFunction(HttpClient httpClient)
        {
            _httpClient = httpClient;
        }

        [Function("HttpDemoFunction")]
        public async Task<HttpResponseData> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get")] HttpRequestData req, FunctionContext context)
        {
            // Location of the demonstration JSON dataset, hosted on GitHub.
            var remoteDataset = "https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json";

            var logger = context.GetLogger("HttpDemoFunction");

            // Fetch the dataset from the remote source.
            var remoteResponse = await _httpClient.GetAsync(remoteDataset);

            if (remoteResponse.IsSuccessStatusCode)
            {
                // Relay the JSON payload back to our caller.
                var response = req.CreateResponse(HttpStatusCode.OK);
                response.WriteString(await remoteResponse.Content.ReadAsStringAsync());
                return response;
            }
            else
            {
                logger.LogError($"Fault getting remote dataset: {remoteResponse.StatusCode} {remoteResponse.ReasonPhrase}");
                var response = req.CreateResponse(HttpStatusCode.InternalServerError);
                return response;
            }
        }
    }
}
Firstly, yes: by completely replacing all of our code, that does indeed mean that we didn’t actually need to use the CLI templating earlier. However, by going through that additional step, we had the tooling generate the supporting project files (such as the .csproj and local.settings.json) for us.
Let’s review the various changes that we just made with the above code.
» Added a member variable _httpClient to the class, along with a constructor that sets it. This forms part of the code we use to make HttpClient available through dependency-injection.
» Made the method an async Task<HttpResponseData>, because we’ve introduced async code into the body of the method.
» Changed AuthorizationLevel from Function to Anonymous. If we didn’t do this, when we published to the cloud, we would need to supply an authorization key as an additional URL parameter (typically a code query-string value).
» Removed the argument for "post": as we are not going to make any HTTP POST requests (only a GET), this was redundant.
» Defined a hardcoded URL to a demonstration dataset. This is the JSON dataset stored on GitHub.
» Used the FunctionContext to access the logger (made available to us this way by default); this is a way for our app to communicate problems.
» Used the HttpClient to retrieve the dataset.
» Created an HTTP response containing the JSON payload and returned it (an optional refinement is sketched below).
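On that last point: the response we return never declares a content type. A small helper such as the sketch below (our own suggestion, not part of the original walkthrough, with an arbitrary name) could set an explicit application/json header before writing the body:

// Hypothetical helper for HttpDemoFunction.cs.
private static HttpResponseData CreateJsonResponse(HttpRequestData req, string json)
{
    var response = req.CreateResponse(HttpStatusCode.OK);
    response.Headers.Add("Content-Type", "application/json; charset=utf-8");
    response.WriteString(json);
    return response;
}

Inside the success branch, we would then return CreateJsonResponse(req, await remoteResponse.Content.ReadAsStringAsync()) rather than building the response inline.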
Later in this article, we’ll be slightly updating the Blazor application that we created in the second article.
The change we’re going to make is that instead of retrieving the JSON dataset from a local file, we’ll revise it to request the data from an HTTP endpoint (our new Azure Function).
If we don’t change anything, a problem we’re about to run into is a CORS policy issue.
Explaining Cross-Origin Resource Sharing (CORS) is outside the scope of this article, but very briefly: browsers implement security that, by default, prevents an application from retrieving data from any origin other than the one that served it. You can learn more here:
Microsoft: Call a web API from ASP.NET Core Blazor (WASM)
To add a CORS policy to an Azure Function we need to do the following:
» Edit the file local.settings.json, adding a "Host" section like this:

{
  … existing items…,
  "Host": {
    "CORS": "*"
  }
}
In this case, we’ve used a wildcard entry to completely open up access. For security reasons, this is not at all recommended and is absolutely not something we should do in a real system. This step is intended as a convenience while learning.
Let’s test our partial progress:
func start
Next, we’ll shift attention back to the Blazor application created in the second article:
» Edit the file /Pages/Index.razor.
» In the @code section, update the OnInitializedAsync() method as below.
» Pay extra attention to check that the URL, and in particular the port number, matches the one reported when you started the Function:
protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
}
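As an optional aside, while we’re testing against a locally hosted Function it can be convenient to surface connection failures rather than let the component fail silently. A defensive variation (our own sketch, not part of the series’ code, and assuming System.Net.Http is already in scope) might look like this:

protected override async Task OnInitializedAsync()
{
    try
    {
        seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
    }
    catch (HttpRequestException ex)
    {
        // In Blazor WebAssembly this is written to the browser console.
        Console.WriteLine($"Could not reach the Function endpoint: {ex.Message}");
    }
}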
Note: As a heads-up, we’ll return here again at the end of the article, to update the URL with the URL of the Function hosted in the cloud.
Image credit: Jessica Lewis Creative on Pexels
Next, we’ll start moving our API service into the cloud. To do that, we first need to create some resources.
We can check the tooling versions using:

az --version
func --version
Note: In this article, we’ll be using the Azure CLI, but, if you prefer, using the web-based interface found in the Azure Portal is also a really good option.
First, log in to Azure from the CLI:

az login
When we create any Azure resources, we have a large choice of data centers from around the globe. Microsoft uses the term “Region” to describe their major data centers.
For the purpose of this article, we’ll show examples that use a region called “West Europe” (which happens to be in the Netherlands).
Normally, we should choose an Azure Region that is nearest to our customers. If we want to adapt the example in this article, we need to know which Azure Regions are available to us.
Azure Functions are not available in all regions, so check availability using the docs or the CLI command, below:
az functionapp list-consumption-locations
First, we create a “Resource Group.” Think of this as a “logical folder”—a way to keep various resources grouped together.
az group create --name BlazorTelerikCloudDemo-rg --location "westeurope"
Next, we create a “Storage Account.” This is what we use to store the deployable artifacts that we’ll build.
This next bit requires us to create our own storage account identifier, so you won’t be able to just cut and paste entire code snippets from the examples below.
The options we have for naming this resource are limited, so be aware that it must be globally unique, between 3 and 24 characters long, and contain only lowercase letters and numbers.
az storage account create --name <YOUR_UNIQUE_STORAGE_NAME> --location "westeurope" --resource-group "BlazorTelerikCloudDemo-rg" --sku "Standard_LRS"
Lastly, we create a “Function App.” This represents the Azure Function App Service that we’ll use to host our function.
We’ll be using Consumption-plan hosting, which means we are billed only as the function is actually used.
As with the “Storage Account” previously, we’ll need to create our own identifier.
az functionapp create --resource-group BlazorTelerikCloudDemo-rg --consumption-plan-location "westeurope" --runtime dotnet-isolated --functions-version 4 --name <YOUR_UNIQUE_FUNCTION_NAME> --storage-account <YOUR_UNIQUE_STORAGE_NAME>
With the cloud resources in place, we can publish our Function code (run this from the AzureFunctionService folder):

func azure functionapp publish <YOUR_UNIQUE_FUNCTION_NAME>
When the app has finished deploying, the CLI will return some confirmation information. Among the data, we can find the URL that we can use to invoke the function—it will look something like the following (it may be useful to cut and paste this into a notepad for later use):
https://<YOUR_UNIQUE_FUNCTION_NAME>.azurewebsites.net/api/httpdemofunction
Note: Behind the scenes, a .zip file containing our build and its dependencies gets copied to the storage account.
Note: If you’re accustomed to working with AWS, but new to Azure, it’s useful to note that you don’t need to separately provision an API Gateway or similar to have access to a basic endpoint—the provisioning of a scalable HTTP endpoint is automatically done for you. If we have enterprise-level requirements for a more advanced gateway, that still remains a separate option.
Just as we did earlier with the locally deployed version of our Function, in order for our Blazor client to be able to work with the Function published to Azure, we need to add a CORS policy.
As before, we should reiterate that for security reasons, this is not at all recommended and is absolutely not something we should do in a real system. This step is intended as a convenience while we are learning.
az functionapp cors add -g BlazorTelerikCloudDemo-rg -n <YOUR_UNIQUE_FUNCTION_NAME> --allowed-origins *
Note: Alternatively, it’s also possible to add CORS rules using the Azure Portal.
Finally, let’s test our finished result.
Return back to the Blazor application that we created in the second article of this series:
» Edit the file /Pages/Index.razor.
» Update the OnInitializedAsync() method like this (modifying the URL to include your unique name as appropriate):

protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("https://<YOUR_UNIQUE_FUNCTION_NAME>.azurewebsites.net/api/httpdemofunction");
}
Once we’ve finished experimenting, it’s best not to leave resources hanging around (especially as we have left our CORS policy wide open).
A quick way to clean everything is simply to delete the entire resource group, like this:
az group delete --name BlazorTelerikCloudDemo-rg
If you’re curious to learn how my experience with Azure stacks up against AWS and Google Cloud, I wrote more about this in Part 1. You can also learn more specifics in Part 4 on AWS and Part 5 on Google Cloud.
You may find it helpful to have awareness about the following subjects. If you get stuck, here are some useful articles:
Thanks to Mandy Mowers & Layla Porter for the overall document review.
I always disclose my position, association or bias at the time of writing. As an author, I received a fee from Progress Telerik in exchange for the content found in this series of articles. I have no association with Microsoft, Amazon nor Google—this series of articles is in no way intended to offer biased opinion nor recommendation as to your eventual choice of cloud provider.
A UK-based freelance developer using largely Microsoft technologies, Jim McG has been working with web and backend services since the late ‘90s. Today, he generally gravitates around cloud-based solutions using C#, .NET Core and Azure, with a particular interest in serverless technologies. Most recently, he's been transitioning across to Unity and the amazing tech that is Augmented Reality.