Sunday, 26 May 2024

Getting Started with Azure Functions: A Step-by-Step Guide

 


Introduction

Azure Functions is a serverless computing service provided by Microsoft Azure that allows developers to build and deploy event-driven applications without the need to manage infrastructure. With Azure Functions, developers can focus solely on writing code to handle specific events, leaving the underlying infrastructure and scaling aspects to the Azure platform. This enables faster development, reduced operational costs, and enhanced scalability.

Key Features of Azure Functions

  1. Event-driven Computing: Azure Functions are triggered by various events, such as HTTP requests, timers, messages in queues, file uploads, or changes to data in databases. This event-driven approach allows developers to respond to specific events in real-time without the need to maintain persistent server instances.
  2. Serverless Architecture: Azure Functions follows the serverless computing paradigm, meaning there are no servers to provision or manage. The platform automatically handles infrastructure scaling, ensuring resources are allocated based on the actual demand, making it highly cost-efficient.
  3. Language and Platform Support: Azure Functions supports multiple programming languages, including C#, JavaScript, Java, Python, and TypeScript. This flexibility allows developers to work with their preferred language and leverage existing codebases seamlessly.
  4. Pay-as-You-Go Pricing: Azure Functions offer a pay-as-you-go billing model. You only pay for the compute resources used during the execution of your functions. This cost-effective approach ensures you are charged only for the actual usage without any upfront commitments.
  5. Integration with Azure Services: Azure Functions seamlessly integrate with various Azure services like Azure Storage, Azure Cosmos DB, Azure Service Bus, Azure Event Grid, and more. This integration allows developers to build sophisticated applications by combining the power of different services.

Benefits of Using Azure Functions

  1. Faster Time to Market: With the ability to focus solely on business logic and event handling, developers can quickly build and deploy applications, reducing development time and accelerating time to market.
  2. Cost Savings: As Azure Functions automatically scales based on demand, there’s no need to pay for idle server resources, resulting in cost savings, especially for applications with varying workloads.
  3. Scalability and Elasticity: Azure Functions scales automatically to handle a large number of events concurrently. This elasticity ensures that your applications can handle high traffic and sudden spikes without any performance degradation.
  4. Serverless Management: Microsoft Azure handles all the server management, including updates, security patches, and scaling, freeing developers from operational overhead and allowing them to focus on code development.
  5. Event-Driven Architecture: The event-driven nature of Azure Functions enables the decoupling of application components, making it easier to build scalable and resilient microservices architectures.

Types of Azure Functions

  1. HTTP Trigger: This type of function is triggered by an HTTP request. It can be used to build web APIs and handle HTTP-based interactions.
  2. Timer Trigger: A timer trigger executes a function on a predefined schedule or at regular intervals. It is useful for performing tasks such as data synchronization, periodic processing, or generating reports (a minimal code sketch follows this list).
  3. Blob Trigger: This type of function is triggered when a new or updated blob is added to an Azure Storage container. It enables you to automate processes based on changes in storage blobs.
  4. Queue Trigger: A queue trigger is triggered when a message is added to an Azure Storage queue. It provides a way to process messages in a queue-based architecture, allowing you to build event-driven applications.
  5. Event Grid Trigger: An event grid trigger is invoked when an event is published to an Azure Event Grid topic or domain. It enables reactive processing of events and can be used to build event-driven architectures.
  6. Cosmos DB Trigger: This type of function is triggered when there are changes to a document in an Azure Cosmos DB container. It allows you to build real-time data processing and synchronization scenarios.
  7. Service Bus Trigger: A service bus trigger is invoked when a new message arrives in an Azure Service Bus queue or topic subscription. It is suitable for building decoupled messaging-based systems.
  8. Event Hub Trigger: An event hub trigger is invoked when new events are published to an Azure Event Hub. It enables high-throughput event ingestion and processing scenarios.
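As a concrete illustration of the trigger model (see the note in the Timer Trigger item above), here is a minimal sketch of a timer-triggered Python function. The schedule value in the comment is illustrative; the actual schedule lives in the function's function.json binding.

```python
import datetime
import logging

import azure.functions as func


def main(mytimer: func.TimerRequest) -> None:
    # The CRON-style schedule (e.g. "0 */5 * * * *" for every five minutes)
    # is configured in the function's function.json binding, not in this file.
    utc_now = datetime.datetime.utcnow().isoformat()
    if mytimer.past_due:
        logging.info("The timer is past due!")
    logging.info("Timer trigger function executed at %s", utc_now)
```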

Let’s dive into a real-world example using an HTTP Trigger function in Azure Functions:

Assume we have data in Azure Data Lake Storage Gen2 (ADLS Gen2); we will read that data from an Azure Function and write it back to the ADLS Gen2 storage account.

To accomplish the task of reading data from Azure Data Lake Gen2 (ADLS Gen2) using an Azure Function and then writing it back to the same storage account, we will follow a two-step approach: Firstly, we’ll create an Azure Function in the Azure Cloud, and secondly, we’ll access and deploy the code using Visual Studio Code (VS Code) from your local development environment.

Pre-Requisites:

  1. Azure Account (Free or pay-as-you-go).
  2. Azure Data Lake Gen2 Storage Account.
  3. Azure Function App.
  4. Azure Data Factory.

Step 1: Create an Azure Function in the Azure cloud

  1. Log in to your Azure account, use the search bar at the top, and search for "Function App".

2. Select the Function app and click ‘+ Create’ to create a new Azure function. Provide the necessary details to create an Azure function for your requirements.

Select the Subscription, Resource group, and Function App name, and choose the deployment method (code or container).

For the Runtime stack, multiple options are available: .NET, Node.js, Python, Java, PowerShell Core, and Custom Handler. Then select the version of the runtime stack and the Region.

I am using Python for this example.

Azure Functions only supports the Linux operating system for the Python runtime stack. That doesn’t mean we can’t develop on Windows; the function will simply run on Linux, and Azure takes care of that.

Hosting options and plans: for a better understanding, refer to the Azure Functions hosting plans documentation.


3. Click Next: Storage and select an existing storage account, or let it create a new one for storing the Azure Function’s details.

Networking: controls access to the function app.

Monitoring: as a beginner, you will want this feature to check for errors while deploying the code. But everything has a cost, so don’t forget to delete all resources after use.

Deployment: I am using my GitHub account to maintain this code as a backup.

After providing all these details, click “Review + create”. Azure validates the details we provided and enables the “Create” button if everything is correct. Click “Create”.


4. After the deployment is done, go to the function app and check whether the resource has been created.


Step 2: Create an Azure Function template and deploy the code to the function app.

Pre-Requisites:

  1. Azure Function App.
  2. Visual Studio Code.

1. Open VS Code, click on "Azure" in the left sidebar, and sign in to your Azure account.

It will open a login page for your Azure account in the browser; sign in.


After a successful sign-in, you will be able to see your subscriptions and resources.


2. Install the required extensions: the Azure Functions extension and the Azure Account extension.

Click on Explorer (the first icon from the top) and click Open Folder to create our Azure Function in that folder. Select the location where you want to create the function, then select a folder.

After this, you can see your folder name in the Explorer.


3. In the Azure tab (VS Code), click on the Function App icon and select Create Function. Then select the folder, language, and version.


Select the type of trigger we want to create, the function name, and the authorization level. After this, you can see the progress of creating the Azure Function at the bottom right of VS Code.


After this, it creates a Python file named __init__.py containing some default code.
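The generated template looks roughly like the sketch below (the exact code varies slightly between Azure Functions Core Tools versions):

```python
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Python HTTP trigger function processed a request.")

    # The template reads an optional "name" from the query string or the JSON body.
    name = req.params.get("name")
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get("name")

    if name:
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.")
    return func.HttpResponse(
        "This HTTP triggered function executed successfully. "
        "Pass a name in the query string or in the request body for a personalized response.",
        status_code=200,
    )
```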

For our example, we don’t need all of this code; main is the actual function that receives and responds to the request.


4. Read the file from ADLS Gen2 and write it back to ADLS Gen2.

For this, I am uploading a sample data file into the ADLS Gen2 account; it will be available in my GitHub, and I will provide a link to all the code and functions at the end.

I have created two containers, read and write; the image below shows the containers and what they contain before running the Azure Function.


I have developed the code to read from ADLS Gen2 and write the data back to a different container.
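A minimal sketch of that function body, using the azure-storage-file-datalake SDK, is shown below. The connection-string app setting, the read/write container names, and the file name are placeholders for illustration, not the exact values from my repository.

```python
import logging
import os

import azure.functions as func
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholder names used for illustration only.
CONNECTION_SETTING = "ADLS_CONNECTION_STRING"  # app setting holding the storage connection string
SOURCE_CONTAINER = "read"
TARGET_CONTAINER = "write"
FILE_NAME = "sample.csv"


def main(req: func.HttpRequest) -> func.HttpResponse:
    service_client = DataLakeServiceClient.from_connection_string(
        os.environ[CONNECTION_SETTING]
    )

    # Download the file from the source container.
    source_fs = service_client.get_file_system_client(SOURCE_CONTAINER)
    data = source_fs.get_file_client(FILE_NAME).download_file().readall()

    # Write the same content to the target container.
    target_fs = service_client.get_file_system_client(TARGET_CONTAINER)
    target_fs.get_file_client(FILE_NAME).upload_data(data, overwrite=True)

    logging.info("Copied %s from %s to %s", FILE_NAME, SOURCE_CONTAINER, TARGET_CONTAINER)
    return func.HttpResponse(f"{FILE_NAME} copied to {TARGET_CONTAINER}.", status_code=200)
```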

I have also tested it locally, and it’s working fine. Let’s look at the deployment process.

Deployment:

Note:

Before deployment, you need to make sure that the requirements.txt file has all the modules that we used in the function.

To auto-generate these module names, you can use the command below in the command prompt.

pip freeze > requirements.txt
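For this example, the generated file would need to contain at least the function runtime package and the Data Lake SDK used in the sketch above (versions omitted here; pin them as required):

```
azure-functions
azure-storage-file-datalake
```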

Go to the Azure tab in VS Code, click on the Function App icon, select Deploy to Function App, then select the function app that we created in the Azure account.

Click “Deploy” on the pop-up, and the deployment to the Azure cloud starts.


Testing:

After the deployment is done, it’s time for testing. Go to the Azure account, open the function app, then go to Functions; there you can find the function that we developed in VS Code.

Open the Azure function, click on the function, and click on “Code + Test”.

Figure 2–11: Azure Function Test

Click Test/Run, provide the required information using the GET method, and click Run.

Figure 2–12: Azure Function Test

The output will look like the one below if it executes successfully without errors.
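Alternatively, instead of the portal’s Test/Run blade, you can call the deployed endpoint directly over HTTP. Below is a small sketch using the requests library; the function app name, function name, and key are placeholders that you would copy from the portal’s “Get Function URL” option.

```python
import requests

# Placeholders: take the real values from "Get Function URL" in the portal.
FUNCTION_URL = "https://<function-app-name>.azurewebsites.net/api/<function-name>"
FUNCTION_KEY = "<function-key>"

# The function key is passed as the "code" query parameter.
response = requests.get(FUNCTION_URL, params={"code": FUNCTION_KEY})
print(response.status_code)
print(response.text)
```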


Let’s check the write container in the ADLS Gen2 account to see whether the file has been created.


Now that we have successfully developed and tested the Azure Function locally and in Azure, it’s time to explore how we can automate the deployment and testing process using Azure Data Factory (ADF) pipelines.

Pre-Requisites:

  1. Linked services to the storage account and the Azure Function.

1. Open your Azure Data Factory and go to the Author tab, then create a new pipeline. On the left side, under Activities, search for Azure Function. Drag and drop it onto the pipeline and configure the activity by providing the necessary details.

2. Click on Debug to execute the function.

Review and Practical Examples


Introduction

Container Apps is a new serverless offering from Azure. As of the time when this article is published, it is still in preview.

The reason this offering is interesting is that it fills the gap between serverless and a full-blown Kubernetes setup. Traditionally, for microservice-type workloads, one would use either serverless or Kubernetes.

This was not ideal as serverless is more suitable for event-driven architectures, whereas Kubernetes is complex and requires specialized knowledge to run production-grade workloads.

Microservices architecture moves complexity from inside of a program to surrounding infrastructure.

Another solution was to use Azure Container Instances. This is a great service, but it’s relatively low level and doesn’t work well where multiple container groups are used, especially when they need to communicate with each other.

You can read more about Azure Container Instances in my other blogs: Easily Deploy Containers to Azure directly from your Desktop and Azure explained deep enough: Containers.

In this article, we will explore how Azure Container Apps helps with microservices-based architecture. This should be an interesting read if you are a developer or software architect designing software on Azure.

What are the benefits?

Container Apps is the missing link between serverless and AKS for microservices-based architecture.

This is achieved by utilizing open-source projects to provide standardized capabilities typically seen in microservices such as:

  • auto-scaling
  • secret and configuration management
  • versioning
  • advanced deployment capabilities, for example, blue-green deployment or A/B testing
  • traffic splitting between revisions
  • background, long-running services

Here are the open-source projects that power Container Apps:

Container Apps open source components

Under the hood, container apps run on AKS clusters with opinionated settings. This offering follows one of the best practices when leveraging Kubernetes:

Kubernetes is a platform to build platforms

Dapr provides platform- and language-agnostic building blocks for microservice-based architectures.

KEDA provides seamless event-driven auto-scaling capabilities.

Finally, Envoy takes care of ingress and routing, hiding Kubernetes complexity.

When to use Container Apps

This service is best suited for microservices, ideally ones that are already containerised: a system that is not so complex that it requires direct access to Kubernetes primitives, but whose business logic is also not purely event-driven.

Demo Scenarios

If you want to practice along, I’ve created a repo with a devcontainer set up, covering two separate scenarios. The first scenario, located in the folder 1.Hello-World, deploys a sample "hello world" web app and exposes the endpoint as internal ingress. The second scenario uses Bicep to deploy additional configuration and showcases the usage of secrets in the container app.

Prerequisites

There are a few prerequisites:

  • VS Code
  • Azure subscription
  • Docker host running on your machine
  1. Clone the repository: https://github.com/Piotr1215/azure-container-apps
  2. VS Code should prompt you to reopen the repo in devcontainer

If the prompt does not appear, you can use F1 or Ctrl+Shift+P and select Reopen in Container.

You need to perform az login. By default, the az login command will open up a browser to securely authenticate with an Azure subscription.

Hello World

To start with the example, navigate to the 1.Hello-World directory and run setup.sh.

You will be prompted to provide a few variables for the script. Default values are pre-populated. If you want to use the default values, just hit enter.

At this point the Container Apps service is available only in the northeurope and canadacentral regions.

The script will perform the following actions:

  • install container apps az extension
  • create a resource group
  • create a container app environment
  • create a container app
  • deploy a hello world container to the container app
  • expose URL where you can check the web app live
  • provide instructions to clean up resources

Once the script finishes, a URL with the running web app will be displayed as well as a command to delete the environment afterwards.

Script output

The URL should show a running hello world app:


Container Apps integrate fully with Azure Monitor observability. Navigate to the Azure Portal and find the resource group.

If you accepted the default values, it will be rg-app-container-test.

From there we can execute a simple query to read the stdout logs from the sample app:

ContainerAppConsoleLogs_CL
| where ContainerAppName_s == 'my-container-app'
| project ContainerAppName_s, ContainerImage_s, format_datetime(TimeGenerated, "hh:mm:ss"), RevisionName_s, EnvironmentName_s, Log_s

State Store with Bicep

To start with the example, navigate to the 2.Bicep-Deploy directory and run setup.sh.

Bicep is out of scope for this article, but if you are interested, it’s worth pointing out that, together with the az CLI, it creates a nice combination of imperative and declarative styles of IaC.

The script will deploy the following infrastructure to Azure:

  • create a resource group
  • create a container app environment
  • create a container app
  • create a storage account with a default container named “test-container”
  • deploy a simple Go API container (GitHub / Docker) to interact with the storage account

The script will output a URL for the container app. You can navigate to it using Ctrl + click. After a while, you should see a message that sample blob files were created.


Go to the Azure resource group (rg-test-containerapps by default) and check for blobs in the test-container. You should see at least 2 files. Refreshing the URL will generate additional files.

Blobs created by API

The test API writes logs to stdout using the Go fmt library; you can see these custom logs in the Azure Monitor workspace.

API custom logs
ContainerAppConsoleLogs_CL
| where ContainerAppName_s == 'sample-app'
| project ContainerAppName_s, ContainerImage_s, format_datetime(TimeGenerated, "hh:mm:ss"), RevisionName_s, EnvironmentName_s, Log_s

So how does it work?

If you look closely at the Bicep template, you can see that it defines an envVar array with configuration and secret references. If you are familiar with Kubernetes, this is how secrets are referenced in a pod spec. Remember that Bicep compiles down to ARM JSON, so it exposes all the fields of the Container Apps API.

The secret is exposed to the container at runtime, so as long as you use the same environment variables in your API, you should be able to interact with the storage account in the same way.
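To illustrate the same idea in Python (the demo API itself is written in Go), a container could read those injected environment variables and talk to the storage account as sketched below. The variable names are assumptions and must match whatever the envVar array in the Bicep template defines.

```python
import os

from azure.storage.blob import BlobServiceClient

# Assumed variable names; they must match the envVar entries in the Bicep template.
account_name = os.environ["STORAGE_ACCOUNT_NAME"]
account_key = os.environ["STORAGE_ACCOUNT_KEY"]  # injected from a Container Apps secret

blob_service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
container = blob_service.get_container_client("test-container")
container.upload_blob("hello.txt", b"written with an injected secret", overwrite=True)
```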

The benefit of this approach is that the storage account key is never shared, stored in the source code repository or embedded in an image.