
In this fifth article, we’ll work through the steps needed to create a Google Cloud Functions backend that serves up our dataset via an HTTP API.

Introduction

Welcome to the fifth and final article in a series demonstrating how to create a Blazor client application that incorporates a Telerik UI For Blazor Chart component.

If you haven’t already read the second article, that should be your starting point.

An objective of this series is to demonstrate how we could combine our client application with simple serverless function backend solutions.

In this article, we’ll be exploring how to use Google Cloud Functions (sometimes shortened to “GCF” or “Function”). This is Google’s version of the increasingly popular “serverless computing” paradigm.

If you’re new to “serverless computing” and more accustomed to working with traditional websites/services where code is published to a host server, it’s useful to understand that with Google Cloud Functions:

  • We don’t publish our code directly to a “Google Cloud Function Host Server.” Instead, we ultimately deploy it to a “storage bucket”— a highly available, but otherwise fairly straightforward place to store files.
  • When our GCF is triggered, our code is loaded into worker instances as demand requires. This is managed transparently for us.
  • CLI tooling simplifies this publication process, making it look as if we are simply publishing to a GCF “service.”


Note: Google Cloud Functions support deployments of both file-based packages and container images. In this article, we’ll be demonstrating using a basic file-based deployment.


Series Contents

If you’re planning to follow along with this series, you will need to create a Blazor client application, following the instructions in Part 2. Having completed that step, we’ll then look at creating a backend API—in this article, we’ll be looking at creating that API using Google Cloud Functions.

If you’re interested to learn how the experience compares between the cloud providers, a succinct comparison is in Part 1, and you may further be interested in reading all three of the platform-specific articles in this series.

What Are We Doing in This Article?

In this article, we’ll use Google Cloud Functions to create a simple serverless HTTP API backend that responds to a request with a JSON dataset.

Google Cloud Functions supports a number of languages, including Node.js and Python. However, in this article, we will continue using .NET with C#, as we have previously with our Blazor client.

Google Cloud Functions are event-driven, meaning that their code is triggered in response to various different types of “bound event,” such as a change to a Firebase database.

For our purposes, we’re interested in something generally referred to as an “HTTP-triggered function.”

An HTTP-triggered Google Cloud Function will be largely familiar-looking to anyone who has worked with a conventional Web API project.

Note: Unlike the equivalent Azure Function or Lambda Function, where we program against an “API Gateway” proxy class, when developing with GCF we simply use the familiar Microsoft.AspNetCore.Http.HttpContext class.

In this article, we’re going to:

  • Create a project using .NET Core 3.1.
  • Create code for an HTTP-triggered function that will respond to an HTTP-GET request:

    » The code will return a payload of JSON data.
    » For continuity with the other articles in this series, that JSON payload will be exactly the same dataset used previously.
    » We’ll retrieve the dataset from a remote source in GitHub.

  • Test our Google Cloud Function locally.
  • Create resources and enable APIs in Google Cloud.
  • Publish our function to Google Cloud.
  • Reconfigure our Blazor+Telerik client application and re-test for a final result.

Google Cloud Functions .NET Runtime

Currently, Google Cloud Functions supports only .NET Core 3.1.

There is speculation that Google may eventually support .NET 6, the next LTS (long-term support) version of .NET. However, at the time of writing, there is no official confirmation or announcement to that effect.

  • NuGet library packages, provided by Google, take care of the hosting aspects. This means that despite having what looks like a minimal amount of project code, a GCF otherwise runs like a regular .NET service—e.g., it can be started with a regular dotnet start command.
  • It may be helpful to know that this differs from Azure Functions, which use a discrete runtime that needs to be installed on our development systems. We start the function using that runtime—e.g., with the func start command.

Note: Google Engineer and C# expert Jon Skeet has suggested that, if using .NET 5 and up is important to us today, an option could be to adopt a container-based solution, publish to Google Cloud Run, and then use an Eventarc trigger.


Requirements for This Article

We’re going to expand upon the solution we developed in the second article of this series.

In addition to those previous requirements, we now also need the Google Cloud SDK installed (this provides the gcloud CLI that we’ll use throughout this article), along with a Google account that has access to Google Cloud.

Use the CLI To Generate a New Google Cloud Function Project

If you haven’t already done so, you need to install the items identified in the Requirements section above, before progressing.

At this point, we need to grab ourselves some .NET project templates specific to GCF—these have been created by Google.

  • Enter the following command:
dotnet new -i Google.Cloud.Functions.Templates

Great—now we’re ready, let’s create some code! Using our command prompt:

  • Navigate to the folder where, previously in the second article of this series, we created our .NET solution. If you were following along, we called it BlazorTelerikCloudDemo.
  • Inside this folder, create a new folder and call it GCFService and navigate into it.
  • Because we now have the correct templates installed, we can generate a new project like this:
md GCFService
cd GCFService
dotnet new gcf-http
  • (Optional) Next, for consistency, we’ll add the GCF project to the overall .NET solution. We do this so that we have the option to run the entire solution from an IDE in the future:

    » Enter the following:

cd ..
dotnet sln add GCFService

Let’s just quickly test that our system is working.

  • If we need to, navigate back into the project folder and enter the following command:
dotnet run

If all is well, the system should respond with some logging information that confirms the URL that our service is running from:

[Microsoft.Hosting.Lifetime] [info] Now listening on: http://127.0.0.1:8080
  • Open a web browser and paste the address. If everything is working, we should see the default message:
Hello, Functions Framework.

Inspect the Files … What Dark Magic Is This!?

Image credit: Anastasia Shuraeva on Pexels

If we look at the files that make up the new GCF project, it may strike us how very bare the code is—there is only a single Function.cs file!

This may come as quite a surprise to developers accustomed to conventional ASP.NET apps, because normally we would expect a .NET hosted service to at least include a Program.cs with a Main() program entry point.

There is clearly something very unconventional going on here, which, fortunately, Jon Skeet explains in a hugely informative article from last year.

To paraphrase the relevant explanation in Jon’s article: the Google.Cloud.Functions.Hosting package supplies the program’s entry point and the web-hosting plumbing for us, so our project only needs to contain the function code itself.

One could suggest that this unconventional approach initially seeds some confusion for the beginner—it’s not immediately apparent what the code, provided by the template, is doing.

However, once we recognize that there is a bit of magic going on behind the scenes, this is actually a pretty ingenious approach.

Functions are supposed to be about embodying the essence of “small units of code”—which is exactly what we get here.

We’re also given a solution that can be run directly on our local systems (without any additional runtime) and be readily debugged and tested. In the same breath, when deployed, it transforms into something that runs within Google’s cloud infrastructure.

Modifying the Google Cloud Function To Return a Dataset

Next, we want our GCF to return some data.

  • We’re going to maintain consistency with the second article in this series by returning a static JSON dataset containing some arbitrary data.
  • We’re going to retrieve the JSON dataset from a GitHub page.

    » If we want, we can view that data directly in our browser:

https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json
  • We’re going to request the dataset from GitHub using an HttpClient … and then pass that dataset back as the response from our API.

Note: The purpose of this is so that we can learn from a really simple system. We can always layer in more complexity later. It’s completely reasonable to assume that we wouldn’t use this approach in a real system. Instead, it’s much more likely that we would use a separate storage account or even query a database.


Modifying the Function Startup Code

In the previous section, we established that Google has done a bunch of work to get rid of the application startup code.

However, we now find ourselves in a position of saying, “Hey! We actually need some of that!”

In our code, we’re going to be using System.Net.Http.HttpClient to make an HTTP call to GitHub, reading in the static JSON dataset published there.

To make HttpClient available to the code in our GCF, we need to be able to configure the .NET dependency injection provider.

It turns out that this isn’t a problem—Google provides documentation with an example of one way that we could achieve this:

Go ahead and do the following:

  • Create a new code file Startup.cs.
  • Copy and paste the following code into that new file:
using System.Net.Http;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using Google.Cloud.Functions.Hosting;


namespace GoogleCloudFunctionService
{
    public class Startup : FunctionsStartup
    {
        public override void ConfigureServices(WebHostBuilderContext context, IServiceCollection services) =>
            services.AddSingleton<HttpClient>();
    }
}

This service-configuration code looks very similar to something we would expect to find in a regular .NET app.

What we may find unfamiliar, though, is that this Startup class derives from FunctionsStartup—part of the Google.Cloud.Functions.Hosting package.

Note: As a reminder from earlier, this Hosting package implements the GCF approach to service configuration and hosting.


Modifying the Function Code to Return the Dataset

Next, we need to update the code of the Function itself:

  • For simplicity, completely replace the code in the file Function.cs with the following:
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Google.Cloud.Functions.Framework;
using Google.Cloud.Functions.Hosting;

namespace GoogleCloudFunctionService
{
    [FunctionsStartup(typeof(Startup))]
    public class Function : IHttpFunction
    {
        private readonly HttpClient _httpClient;
  
        public Function(HttpClient httpClient)
        {
            _httpClient = httpClient;
        }

        public async Task HandleAsync(HttpContext context)
        {
            var remoteDataset = "https://raw.githubusercontent.com/SiliconOrchid/BlazorTelerikCloudDemo/master/cloud-searchtrend-data.json";
            var remoteResponse = await _httpClient.GetAsync(remoteDataset);

            if (remoteResponse.IsSuccessStatusCode)
            {
                context.Response.StatusCode = (int)HttpStatusCode.OK;
                await context.Response.WriteAsync(await remoteResponse.Content.ReadAsStringAsync());
            }
            else
            {
                context.Response.StatusCode = (int)HttpStatusCode.InternalServerError;
                await context.Response.WriteAsync(string.Empty);
            }
        }
    }
}

Let’s review the various changes that we just made to the above code.

  • Significantly, we’ve added the class Attribute [FunctionsStartup(typeof(Startup))].

    » This is used to “wire-up” our Function class, with the startup code Startup that we created in the previous step.

  • We’ve introduced the private member _httpClient to the class.

    » This forms part of the code that we use to make HttpClient available through dependency injection.

  • We’ve also introduced a class constructor.

    » This also forms part of the code needed to make HttpClient available through dependency injection.

  • Looking at the main code of the Function, a method named HandleAsync, we:

    » Defined a hard-coded URL to a demonstration dataset—this is the JSON dataset stored on GitHub.
    » Used the HttpClient to retrieve the dataset.
    » Returned an HTTP response containing the dataset along with the appropriate response status code.

Adding in a CORS Policy Locally

Later in this article, we’ll be slightly updating the Blazor application that we created in the second article.

The change we’re going to make is that, instead of retrieving the JSON dataset from a local file, we’ll revise the code to request the data from an HTTP endpoint (our new GCF).

If we don’t change anything, a problem we’re about to run into is a CORS policy issue.

Explaining Cross-Origin Resource Sharing (CORS) is outside the scope of this article, but very briefly: browsers implement security that, by default, prevents an application from retrieving data from resources other than the immediate host.

According to the GCF documentation, one way to address this issue is to define a CORS policy directly by hard-coding the header values.

Let’s make the necessary change to the code:

  • Using VS Code, edit the file Function.cs.
  • Within the code, update with the following fragment:
...
if (remoteResponse.IsSuccessStatusCode)
{
    context.Response.Headers.Append("Access-Control-Allow-Origin", "*");
    context.Response.StatusCode = (int)HttpStatusCode.OK;
...

In this case, we’ve used a wildcard entry to completely open up access. For security reasons, this is not at all recommended and is absolutely not something we should do in a real system. This step is intended as a convenience while learning.

Note: Hard-coding the setting makes it easier to learn what’s happening, but looking ahead, we will probably be interested in supplementing this code with some form of environment-specific configuration.
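
As a quick sanity check that the header is actually being emitted, we can grep the response headers for it. In the sketch below, a sample response is held in a variable so the check is self-contained; against the live function, we would instead pipe the output of curl -si http://localhost:8080:

```shell
# Sample response headers, standing in for: curl -si http://localhost:8080
sample_headers='HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Content-Type: application/json'

# Confirm the wildcard CORS header is present in the response
echo "$sample_headers" | grep -q '^Access-Control-Allow-Origin: \*$' && echo "CORS header present"
```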


Test: Modify the Blazor Client To Retrieve From the Locally Hosted Google Cloud Function

Let’s test our partial progress:

  • Ensure that the GCF is running locally. If it isn’t already running:

    » Navigate into the same folder as the GCF project (named GCFService if you have followed along exactly).
    » Enter the command:

dotnet run

Next, we’ll shift attention back to the Blazor application created in the second article:

  • Using VS Code, edit the file /Pages/Index.razor.
  • Within the @code section, update the OnInitializedAsync() method as below:

    » Pay extra attention to check that the URL, and in particular the port number, matches the one reported when you started the GCF.

...
protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:8080");
}
...
  • With the GCF already running separately, start the Blazor application.
  • If everything has worked, we should see the Telerik Chart component showing exactly as it did before (except that this time, behind the scenes, the data is now coming from a different source).

Note: As a heads-up, we’ll return here again at the end of the article, to update the URL with the URL of the Function hosted in the cloud.


Create and Configure Google Cloud Resources

Image credit: Jessica Lewis Creative on Pexels

  • If you haven’t already, ensure that the Google Cloud SDK is installed before progressing further—check the prerequisites section earlier in this article.

Note: In this article, we’ll be using the Google Cloud SDK CLI, but if you prefer, using the web-based interface found in the Google Cloud Console is also a really good option.


Log into Google Cloud

Next, we’ll start moving our API service into the cloud. To do that, we’ll need to create some resources—but first, we need to log in.

  • Now is also a good opportunity to make sure our SDK components are up to date.
  • From the command prompt, log in, as below (this command will open up a browser page from where you can enter your Google Account credentials):
gcloud components update
gcloud auth login

Create a Google Cloud Project

First, we create a “Project.” This is how Google Cloud resources are grouped together.

A Google Cloud project has two aspects that are important to us:

  • A Project name is a human-readable name for our project.
  • A Project ID is a globally unique identifier for our project.

    » This identifier will be used to form part of the URL used to access our function.

  • Create a new Google Cloud project using the CLI, like this:
gcloud projects create <YOUR_UNIQUE_PROJECT_ID>  --name="Blazor Telerik Cloud Demo" 
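
Project IDs have documented format rules: 6–30 characters, lowercase letters, digits and hyphens, starting with a letter and not ending with a hyphen. If we want to check a candidate ID before running the create command, a small helper like the one below can save a round trip. This is a simplified sketch; the helper name and example IDs are purely illustrative:

```shell
# Simplified check of Google Cloud's documented project-ID format:
# 6-30 characters, lowercase letters, digits and hyphens,
# starting with a letter and not ending with a hyphen.
is_valid_project_id() {
  echo "$1" | grep -Eq '^[a-z][a-z0-9-]{4,28}[a-z0-9]$'
}

is_valid_project_id "blazor-telerik-demo-12345" && echo "looks valid"
is_valid_project_id "Invalid_Project_ID" || echo "rejected"
```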

Note: In the following sections, we’ll need to use your unique project-id a few times, so it may be helpful to copy and paste that value somewhere convenient.


Optional: Consider Using Other Google Cloud Regions

When we create any Google Cloud resources, we have a large choice of data centers from around the globe. Google uses the terms “Region” and “Zone” to describe the location of their data centers.

For the purpose of this article, we’ll show examples that use a region called “europe-west1” (which happens to be in Belgium).

Normally, we should choose a Google region that is nearest to our customers. If we want to adapt the example in this article, we need to know what choice of Google regions we have. That list is available in the documentation, but another convenient way is to use a CLI command, as below.

gcloud functions regions list --project <YOUR_UNIQUE_PROJECT_ID>

Later, when we deploy our Function, we could specify the region we want each and every time. A more convenient option is to set the choice as a default, like this:

gcloud config set functions/region europe-west1

Enable Google APIs for a Project

Before we can start using our GCF in the cloud, we first need to enable some Google Cloud APIs.

Specifically for our project, we need the:

  • cloudfunctions API

    » We need this to connect our GCF to Google’s infrastructure.

  • cloudbuild API

    » We need this because our code will be (re)built and deployed remotely.

To enable these APIs, enter the following commands:

gcloud services enable cloudfunctions.googleapis.com --project <YOUR_UNIQUE_PROJECT_ID>  
gcloud services enable cloudbuild.googleapis.com --project <YOUR_UNIQUE_PROJECT_ID>  
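
Once enabled, both services should appear in the project’s enabled-services list. The snippet below illustrates the check; the sample variable stands in for a real call to gcloud services list --enabled --project <YOUR_UNIQUE_PROJECT_ID>, whose actual output contains more columns:

```shell
# Sample output, standing in for:
#   gcloud services list --enabled --project <YOUR_UNIQUE_PROJECT_ID>
enabled_services='NAME
cloudbuild.googleapis.com
cloudfunctions.googleapis.com'

# Confirm both APIs that this article depends on are enabled
for svc in cloudfunctions.googleapis.com cloudbuild.googleapis.com; do
  echo "$enabled_services" | grep -q "^$svc" && echo "$svc is enabled"
done
```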

Note: If you’re curious and would like to see the entire list of available APIs for a project, use gcloud services list --available --project <YOUR_UNIQUE_PROJECT_ID>.


Deploy the GCF App to Google Cloud

Now that everything we need is in place, we can go ahead and publish our function to the cloud. You can read the documentation here: Google: gcloud functions deploy.

  • Use the following command to publish our app to the cloud:
gcloud functions deploy my-function --project  <YOUR_UNIQUE_PROJECT_ID> --entry-point GoogleCloudFunctionService.Function --runtime dotnet3 --trigger-http --allow-unauthenticated

There are some points in the above command that are worth explaining:

  • my-function is an arbitrary name used to identify this HTTP-triggered function. This will be used to form part of the URL path to this specific function.
  • --project xyz is your unique project-id that you defined earlier. The main part of the URL is derived from this globally unique identifier.
  • --entry-point GoogleCloudFunctionService.Function is used to connect to our C# code—specifically naming both the namespace along with the class name of the Function itself.

When the app has finished deploying, the CLI will return some confirmation information.

Among the data, we need to look for the term httpsTrigger:. This reveals the URL that we can use to invoke the function—it will look something like the following:

https://europe-west1-<YOUR_UNIQUE_PROJECT_ID>.cloudfunctions.net/my-function
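
If we lose track of that confirmation output, there’s no need to redeploy: the same details can be printed again with gcloud functions describe my-function. As a self-contained illustration of where the URL sits in that output (the sample text below only mimics its shape; the real output contains many more fields):

```shell
# Sample fragment, standing in for the output of:
#   gcloud functions describe my-function --project <YOUR_UNIQUE_PROJECT_ID>
describe_output='httpsTrigger:
  url: https://europe-west1-my-project-id.cloudfunctions.net/my-function'

# Pull out just the URL line following httpsTrigger:
function_url=$(echo "$describe_output" | grep -A1 'httpsTrigger:' | grep 'url:' | awk '{print $2}')
echo "$function_url"
```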

Note: It may be helpful to copy and paste the URL of this GCF somewhere convenient, as we’ll be using it shortly.

Note: If you’re accustomed to working with AWS but new to Google Cloud, it’s useful to note that you don’t need to separately provision an API Gateway or similar—the provisioning of a scalable HTTP endpoint is automatically done for you.


Test: Modify the Blazor Client To Retrieve From Google Cloud

Finally, let’s test our finished result:

Return to the Blazor application that we created in the second article of this series:

  • Just like earlier, using VS Code, edit the file /Pages/Index.razor.
  • Within the @code section, update the OnInitializedAsync() method like this (modifying the URL to include your unique project-id as appropriate):
...
protected override async Task OnInitializedAsync()
{
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("sample-data/cloud-searchtrend-data.json");
    //seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("http://localhost:7071/api/HttpDemoFunction");
    seriesData = await Http.GetFromJsonAsync<SeriesDataModel[]>("https://europe-west1-<YOUR_UNIQUE_PROJECT_ID>.cloudfunctions.net/my-function");
}
...

Nothing Further? Remove the Google Cloud Resources!

Once we’ve finished experimenting, it’s best not to leave resources hanging around (especially as we have undermined our CORS policy).

A quick way to clean everything is simply to delete the entire project, like this:

gcloud projects delete <YOUR_UNIQUE_PROJECT_ID>

Strictly speaking, this doesn’t actually delete the project, but instead marks it for “soft delete”—and retains the project for 30 days.

This won’t be of concern to us unless we want to immediately re-create the project using the same project-id. If that happens, we can first restore the soft-deleted project (for example, with gcloud projects undelete <YOUR_UNIQUE_PROJECT_ID>), or simply choose a different project-id.

Wrapping Up

How Does the Google Cloud Experience Compare?

If you’re curious to learn how my experience with Google Cloud stacks up against AWS and Azure, I wrote more about this in Part 1. You can also learn more specifics in Part 3 on Azure and Part 4 on AWS.

I hope this series has been insightful for you! I know I learned a lot!


Acknowledgments

Thanks to Mandy Mowers & Layla Porter for the overall document review.

Disclosure

I always disclose my position, association or bias at the time of writing. As an author, I received a fee from Progress Telerik in exchange for the content found in this series of articles. I have no association with Microsoft, Amazon nor Google—this series of articles is in no way intended to offer biased opinion nor recommendation as to your eventual choice of cloud provider.


About the Author

Jim McG

A UK-based freelance developer using largely Microsoft technologies, Jim McG has been working with web and backend services since the late ‘90s. Today, he generally gravitates around cloud-based solutions using C#, .NET Core and Azure, with a particular interest in serverless technologies. Most recently, he's been transitioning across to Unity and the amazing tech that is Augmented Reality.
