In this fifth article, we’ll work through the steps needed to create a Google Cloud Functions backend that serves up our dataset via an HTTP API.
Welcome to the fifth and final article in a series demonstrating how to create a Blazor client application that incorporates a Telerik UI For Blazor Chart component.
If you haven’t already read the second article, that should be your starting point.
An objective of this series is to demonstrate how we could combine our client application with simple serverless function backend solutions.
In this article, we’ll be exploring how to use Google Cloud Functions (sometimes shortened to “GCF” or “Function”). This is Google’s version of the increasingly popular “serverless computing” paradigm.
If you’re new to “serverless computing” and more accustomed to working with traditional websites/services where code is published to a host server, it’s useful to understand that with Google Cloud Functions we publish only our code: Google provisions the underlying infrastructure on demand, scales it automatically and bills us only for what we use.
You can learn more about the subject here:
Note: Google Cloud Functions support deployments of both file-based packages and container images. In this article, we’ll be demonstrating using a basic file-based deployment.
If you’re planning to follow along with this series, you will need to create a Blazor client application, following the instructions in Part 2. Having completed that step, we’ll then look at creating a backend API—in this article, we’ll be looking at creating that API using Google Cloud Functions.
If you’re interested to learn how the experience compares between the cloud providers, a succinct comparison is in Part 1, and you may further be interested in reading all three of the platform-specific articles in this series.
In this article, we’ll use Google Cloud Functions to create a simple serverless HTTP API backend that responds to a request with a JSON dataset.
Google Cloud Functions supports a number of languages including Node and Python. However, in this article, we will continue using .NET with C#, as we have previously with our Blazor client.
Google Cloud Functions are event-driven, meaning that their code is triggered in response to various types of “bound event,” such as a change to a Firebase database.
For our purposes, we’re interested in something generally referred to as an “HTTP-triggered function.”
An HTTP-triggered Google Cloud Function will be largely familiar-looking to anyone who has worked with a conventional Web API project.
Note: Unlike the equivalent Azure Function or Lambda Function, where we program against an “API Gateway” proxy class, when developing with GCF we simply use the familiar Microsoft.AspNetCore.Http.HttpContext class.
In this article, we’re going to:
» Create an HTTP-triggered function. The code will return a payload of JSON data.
» For continuity with the other articles in this series, that JSON payload will be exactly the same dataset.
» Retrieve the dataset from a remote source in GitHub.
Currently, Google Cloud Functions supports only .NET Core 3.1.
In the future, there is speculation that Google may support .NET 6, which would be the next LTS (long-term support) version of .NET. However, at the time of writing, there is no official confirmation or announcement to that effect.
Note: A GCF project is run locally using the regular dotnet run command (unlike an Azure Function, for example, which is started locally using the func start command).
Note: Google Engineer and C# expert Jon Skeet has suggested that, if using .NET 5 and up is important to us today, an option could be to adopt a solution that uses containers, publish it to Google Cloud Run and then use an Eventarc trigger.
We’re going to expand upon the solution we developed in the second article of this series.
In addition to those previous requirements, we should now include the following:
» We can sign up for a free introductory account from Google Cloud.
» Google: Quickstart: Getting started with Cloud SDK (download links).
If you haven’t already done so, you need to install the items identified in the Requirements section above before progressing.
At this point, we need to grab ourselves some .NET project templates specific to GCF—these have been created by Google.
dotnet new -i Google.Cloud.Functions.Templates
Great—now we’re ready, let’s create some code! Using our command prompt:
» Navigate to the root folder of our existing solution, BlazorTelerikCloudDemo.
» Create a new folder named GCFService and navigate into it:
md GCFService
cd GCFService
» Create a new GCF project using the gcf-http template:
dotnet new gcf-http
» Return to the solution root and add the new project to the solution. Enter the following:
cd ..
dotnet sln add GCFService
Let’s just quickly test that our system is working.
dotnet run
If all is well, the system should respond with some logging information that confirms the URL that our service is running from:
[Microsoft.Hosting.Lifetime] [info] Now listening on: http://127.0.0.1:8080
Browsing to that address should return the template’s default response:
Hello, Functions Framework.
Image credit: Anastasia Shuraeva on Pexels
If we look at the files that make up the new GCF project, it may strike us how very bare the code is—there is only a single Function.cs file!
This may come as quite a surprise to developers accustomed to conventional ASP.NET apps, because normally we would expect a hosted .NET service to at least include a Program.cs with a Main() program entry point.
There is clearly something very unconventional going on here, which, fortunately, is explained in Jon Skeet’s hugely informative article from last year:
I’ll attempt to paraphrase the relevant explanation in Jon’s article:
» The project references the Google.Cloud.Functions.Hosting package, which is responsible for configuring and starting the server.
» You can read more about those packages at GitHub: Detailed package descriptions.
One could suggest that this unconventional approach initially sows some confusion for the beginner—it’s not immediately apparent what the code provided by the template is doing.
However, once we recognize that there is a bit of magic going on behind the scenes, this is actually a pretty ingenious approach.
Functions are supposed to be about embodying the essence of “small units of code”—which is exactly what we get here.
We’re also given a solution that can be run directly on our local systems (without any additional runtime) and be readily debugged and tested. At the same time, when deployed, it transforms into something that runs within Google’s cloud infrastructure.
Next, we want our GCF to return some data.
» If we want, we can view that data directly in our browser:
https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json
Note: The purpose of this is so that we can learn from a really simple system. We can always layer in more complexity later. It’s completely reasonable to assume that we wouldn’t use this approach in a real system. Instead, it’s much more likely that we would use a separate storage account or even query a database.
In the previous section, we established that Google has done a bunch of work to get rid of the application startup code.
However, we now find ourselves in a position of saying, “Hey! We actually need some of that!”
In our code, we’re going to be using System.Net.Http.HttpClient to make an HTTP call to GitHub, reading in the static JSON dataset published there.
To make HttpClient available to the code in our GCF, we need to be able to configure the .NET dependency injection provider.
It turns out that this isn’t a problem—Google provides documentation with an example of one way that we could achieve this:
Go ahead and do the following:
» In the GCFService project, create a new file named Startup.cs.
» Add the following code:
using System.Net.Http;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Google.Cloud.Functions.Hosting;
namespace GoogleCloudFunctionService
{
public class Startup : FunctionsStartup
{
public override void ConfigureServices(WebHostBuilderContext context, IServiceCollection services) =>
services.AddSingleton<HttpClient>();
}
}
This service-configuration code looks very similar to something we would expect to find in a regular .NET app.
What we may find unfamiliar, though, is that this Startup class derives from FunctionsStartup, part of the Google.Cloud.Functions.Hosting package.
Note: As a reminder from earlier, this Hosting package implements the GCF approach to service configuration and hosting.
Next, we need to update the code of the Function itself:
» Replace the contents of Function.cs with the following:
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Google.Cloud.Functions.Framework;
using Google.Cloud.Functions.Hosting;
namespace GoogleCloudFunctionService
{
[FunctionsStartup(typeof(Startup))]
public class Function : IHttpFunction
{
private readonly HttpClient _httpClient;
public Function(HttpClient httpClient)
{
_httpClient = httpClient;
}
public async Task HandleAsync(HttpContext context)
{
var remoteDataset = "https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json";
var remoteResponse = await _httpClient.GetAsync(remoteDataset);
if (remoteResponse.IsSuccessStatusCode)
{
context.Response.StatusCode = (int)HttpStatusCode.OK;
await context.Response.WriteAsync(await remoteResponse.Content.ReadAsStringAsync());
}
else
{
context.Response.StatusCode = (int)HttpStatusCode.InternalServerError;
await context.Response.WriteAsync(string.Empty);
}
}
}
}
Let’s review the various changes that we just made to the above code.
» We added the attribute [FunctionsStartup(typeof(Startup))] to the class. This is used to “wire up” our Function class with the Startup code that we created in the previous step.
» We added a private readonly HttpClient field, _httpClient, to the class. This forms part of the code that we use to make HttpClient available through dependency injection.
» We added a constructor that accepts an HttpClient parameter. This also forms part of the code needed to make HttpClient available through dependency injection.
» In the method HandleAsync, we:
» Defined a hard-coded URL to a demonstration dataset—this is the JSON dataset stored on GitHub.
» Used the HttpClient to retrieve the dataset.
» Returned an HTTP response containing the dataset along with the appropriate response status code.
Later in this article, we’ll be slightly updating the Blazor application that we created in the second article.
The change we’re going to make is that, instead of retrieving the JSON dataset from a local file, we’ll revise the code to request the data from an HTTP endpoint (our new GCF).
If we don’t change anything, we’re about to run into a CORS policy issue.
Explaining Cross-Origin Resource Sharing (CORS) is outside of the scope of this article, but very briefly: browsers implement security that, by default, prevents an application from just retrieving data from resources other than the immediate host. You can learn more here:
According to the above GCF docs, one way to address this issue is to define a CORS policy directly by hard-coding the header values.
Let’s make the necessary change to the code:
» In Function.cs, update the success branch as follows:
...
if (remoteResponse.IsSuccessStatusCode)
{
context.Response.Headers.Append("Access-Control-Allow-Origin", "*");
context.Response.StatusCode = (int)HttpStatusCode.OK;
...
In this case, we’ve used a wildcard entry to completely open up access. For security reasons, this is not at all recommended and is absolutely not something we should do in a real system. This step is intended as a convenience while learning.
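Beyond that single response header, browsers sometimes send a “preflight” OPTIONS request before making the real GET. The article’s function doesn’t handle this, but the decision logic can be sketched as a small framework-free helper. The class and method names here are our own invention, purely for illustration:

```csharp
using System;
using System.Collections.Generic;

public static class CorsPreflight
{
    // Hypothetical sketch: given the incoming HTTP method, decide which
    // status code and CORS headers a preflight-aware handler would return.
    public static (int StatusCode, Dictionary<string, string> Headers) Respond(string method)
    {
        var headers = new Dictionary<string, string>
        {
            // Demo-only wildcard, matching the article's example; never "*" in production.
            ["Access-Control-Allow-Origin"] = "*"
        };

        if (method == "OPTIONS") // CORS preflight request
        {
            headers["Access-Control-Allow-Methods"] = "GET";
            return (204, headers); // "No Content": preflight acknowledged, no body needed
        }

        return (200, headers); // a normal GET proceeds with the data payload
    }
}
```

In HandleAsync, the equivalent would be checking context.Request.Method near the top of the method and returning early for OPTIONS before any of the dataset-fetching logic runs.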
Note: Hard-coding the setting makes it easier to learn what’s happening, but looking ahead, we will probably be interested in supplementing this code with some form of environment-specific configuration.
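Expanding on that idea, here is a minimal sketch of what environment-based configuration might look like. The helper class and the ALLOWED_ORIGIN variable name are assumptions of ours, not part of the GCF templates:

```csharp
using System;

public static class CorsConfig
{
    // Hypothetical helper: resolve the permitted origin from an environment
    // variable instead of hard-coding it, falling back to a restrictive
    // local-dev origin rather than the wildcard "*".
    public static string GetAllowedOrigin() =>
        Environment.GetEnvironmentVariable("ALLOWED_ORIGIN") ?? "http://localhost:5000";
}
```

In HandleAsync we would then write context.Response.Headers.Append("Access-Control-Allow-Origin", CorsConfig.GetAllowedOrigin()); and, at deploy time, the variable can be supplied via the gcloud functions deploy flag --set-env-vars ALLOWED_ORIGIN=https://myapp.example.com.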
Let’s test our partial progress:
» Navigate into the same folder as the GCF project (named GCFService if you have followed along exactly).
» Enter the command:
dotnet run
Next, we’ll shift attention back to the Blazor application created in the second article:
» Open the file /Pages/Index.razor.
» In the @code section, update the OnInitializedAsync() method as below.
» Pay extra attention to check that the URL, and in particular the port number, matches the one reported when you started the GCF:
...
protected override async Task OnInitializedAsync()
{
//seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:8080");
}
...
Note: As a heads-up, we’ll return here again at the end of the article, to update the URL with the URL of the Function hosted in the cloud.
Image credit: Jessica Lewis Creative on Pexels
Note: In this article, we’ll be using the Google Cloud SDK CLI, but if you prefer, using the web-based interface found in the Google Cloud Console is also a really good option.
Next, we’ll start moving our API service into the cloud. To do that, we’ll need to create some resources—but first, we need to log in.
gcloud components update
gcloud auth login
First, we create a “Project.” This is how Google Cloud resources are grouped together.
A Google Cloud project has two aspects that are important to us:
» A human-friendly display name (in our example, “Blazor Telerik Cloud Demo”).
» A globally unique project-id that we choose ourselves. This identifier will be used to form part of the URL used to access our function.
gcloud projects create <YOUR_UNIQUE_PROJECT_ID> --name="Blazor Telerik Cloud Demo"
Note: In the following sections, we’ll need to use your unique project-id a few times, so it may be helpful to cut + paste that value somewhere convenient.
When we create any Google Cloud resources, we have a large choice of data centers from around the globe. Google uses the terms “Region” and “Zone” to describe the location of their data centers.
For the purpose of this article, we’ll show examples that use a region called “europe-west1” (which happens to be in Belgium).
Normally, we should choose a Google Region that is nearest to our customers. If we want to adapt the example in this article, we need to know what choices of “Google Region” we have. That list is available in the documentation, but another convenient way to see it is to use a CLI command, as below.
gcloud functions regions list --project <YOUR_UNIQUE_PROJECT_ID>
Later, when we deploy our Function, we could specify the region we want each and every time. A more convenient option is to set the choice as a default, like this:
gcloud config set functions/region europe-west1
Before we can start using our GCF in the cloud, we first need to enable some Google Cloud APIs.
Specifically for our project, we need the:
» The cloudfunctions API. We need this to connect our GCF to Google’s infrastructure.
» The cloudbuild API. We need this because our code will be (re)built and deployed remotely.
To enable these APIs, enter the following commands:
gcloud services enable cloudfunctions.googleapis.com --project <YOUR_UNIQUE_PROJECT_ID>
gcloud services enable cloudbuild.googleapis.com --project <YOUR_UNIQUE_PROJECT_ID>
Note: If you’re curious and would like to see the entire list of available APIs for a project, use gcloud services list --available --project <YOUR_UNIQUE_PROJECT_ID>.
Now that everything we need is in place, we can go ahead and publish our function to the cloud. You can read the documentation here: Google: gcloud functions deploy.
gcloud functions deploy my-function --project <YOUR_UNIQUE_PROJECT_ID> --entry-point GoogleCloudFunctionService.Function --runtime dotnet3 --trigger-http --allow-unauthenticated
There are some points in the above command that are worth explaining:
» my-function is an arbitrary name used to identify this HTTP-triggered function. It will partly be used to form the URL path to this specific function.
» --project <YOUR_UNIQUE_PROJECT_ID> is the unique project-id that you defined earlier. The main part of the URL is derived from this globally unique identifier.
» --entry-point GoogleCloudFunctionService.Function is used to connect to our C# code, naming both the namespace and the class name of the Function itself.
When the app has finished deploying, the CLI will return some confirmation information.
Among the data, we need to look for the term httpsTrigger:. This reveals the URL that we can use to invoke the function—it will look something like the following:
https://europe-west1-<YOUR_UNIQUE_PROJECT_ID>.cloudfunctions.net/my-function
Note: It may be helpful to cut + paste the URL to this GCF somewhere convenient, as we’ll be using that shortly.
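The URL follows a predictable pattern built from the region, the project-id and the function name. As a minimal illustration (the helper and the sample values below are hypothetical, not part of the article’s code):

```csharp
using System;

public static class GcfUrl
{
    // Composes the invoke URL for an HTTP-triggered function, following the
    // pattern https://<region>-<project-id>.cloudfunctions.net/<function-name>
    public static string Build(string region, string projectId, string functionName) =>
        $"https://{region}-{projectId}.cloudfunctions.net/{functionName}";
}
```

For example, Build("europe-west1", "demo-project", "my-function") yields https://europe-west1-demo-project.cloudfunctions.net/my-function. If the confirmation output has scrolled away, the deployed URL should also be retrievable with gcloud functions describe my-function --format='value(httpsTrigger.url)'.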
Note: If you’re accustomed to working with AWS but new to Google Cloud, it’s useful to note that you don’t need to separately provision an API Gateway or similar—the provisioning of a scalable HTTP endpoint is automatically done for you.
Finally, let’s test our finished result:
Return to the Blazor application that we created in the second article of this series:
» Open the file /Pages/Index.razor.
» Update the OnInitializedAsync() method like this (modifying the URL to include your unique project-id as appropriate):
...
protected override async Task OnInitializedAsync()
{
//seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
//seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:8080");
seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("https://europe-west1-<YOUR_UNIQUE_PROJECT_ID>.cloudfunctions.net/my-function");
}
...
Once we’ve finished experimenting, it’s best not to leave resources hanging around (especially as we have undermined our CORS policy).
A quick way to clean everything is simply to delete the entire project, like this:
gcloud projects delete <YOUR_UNIQUE_PROJECT_ID>
Strictly speaking, this doesn’t actually delete the project, but instead marks it for “soft delete”—and retains the project for 30 days.
This won’t be of concern to us unless we want to immediately re-create the project using the same project-id. If that happens, explore the console and follow the instructions for recovering the deleted project.
If you’re curious to learn how my experience with Google Cloud stacks up against AWS and Azure, I wrote more about this in Part 1. You can also learn more specifics in Part 3 on Azure and Part 4 on AWS.
I hope this series has been insightful for you! I know I learned a lot!
You may find it helpful to have an awareness of the following subjects. If you get stuck, here are some useful articles:
Thanks to Mandy Mowers & Layla Porter for the overall document review.
I always disclose my position, association or bias at the time of writing. As an author, I received a fee from Progress Telerik in exchange for the content found in this series of articles. I have no association with Microsoft, Amazon nor Google—this series of articles is in no way intended to offer biased opinion nor recommendation as to your eventual choice of cloud provider.
A UK-based freelance developer using largely Microsoft technologies, Jim McG has been working with web and backend services since the late ‘90s. Today, he generally gravitates around cloud-based solutions using C#, .NET Core and Azure, with a particular interest in serverless technologies. Most recently, he's been transitioning across to Unity and the amazing tech that is Augmented Reality.