Sunday, 19 May 2024

Azure Data Factory Interview Questions

 

Azure Data Factory: An Overview

Azure Data Factory is a cloud-based Microsoft tool that gathers raw business data and converts it into usable information. Essentially, it is a data integration ETL (extract, transform, and load) service that automates the transformation of the given raw data. Let’s look at some Azure interview questions and answers that will help you prepare for Azure job interviews.

1. Briefly explain different components of Azure Data Factory:

  • Pipeline: A logical container for activities.
  • Dataset: A pointer to the data used by the pipeline activities.
  • Mapping Data Flow: A visually designed data transformation logic.
  • Activity: In a Data Factory pipeline, an Activity is an execution step that can be used for data ingestion and transformation.
  • Trigger: Specifies when a pipeline execution is kicked off.
  • Linked Service: A connection definition (similar to a connection string) for the data sources used in the pipeline activities.
  • Control flow: Regulates the execution flow of the pipeline activities.

2. What is the need for Azure Data Factory?

While going through any Azure tutorial, you will come across this terminology. Since data comes from different sources, it can be in any form. These varied sources transfer or channel the data in different ways and in different formats. Whenever we move this data to the cloud or to a particular storage, we must make sure it is managed efficiently. So, you have to transform the data and remove the unnecessary parts.

As far as data transfer is concerned, it is important to make sure that data is collected from the various sources, brought together in a common place, stored, and, if needed, transformed. A conventional data warehouse can accomplish this too, but it comes with limitations. Sometimes we are forced to use custom applications that handle each of these processes separately, which is time-consuming, and integrating all of them is troublesome. So we need a way to automate this process or design proper workflows. Azure Data Factory helps you orchestrate this entire process more conveniently.

3. Is there any limit on how many integration runtimes can be performed?

No, there is no limit on the number of integration runtime instances you can have in an Azure data factory. However, there is a limit on the number of VM cores that the integration runtime can use per subscription for SSIS package execution. When you pursue a Microsoft Azure Certification, you should be aware of these terms.

4. Explain Data Factory Integration Runtime.

Integration Runtime is the secure compute infrastructure used by Data Factory to provide data integration capabilities across different network environments. Moreover, it makes sure that these activities are executed in the region closest to the data store. If you want to learn Azure step by step, you must be aware of this and other such fundamental Azure terminologies.

5. What is Blob Storage in Azure?

Blob storage in Azure is one of the key topics to learn if you want to get the Azure fundamentals certification. Azure Blob Storage is a service for storing massive amounts of unstructured object data such as binary data or text. You can use Blob Storage to expose data publicly to the world or to store application data privately. Typical uses of Blob Storage include:


  1. Directly serving images or documents to a browser
  2. Storage of files for distributed access
  3. Streaming audio and video
  4. Storing data for backup and restore, disaster recovery, and archiving
  5. Storing data for analysis by an on-premises or Azure-hosted service

6. Mention the steps for creating the ETL process in Azure Data Factory.

Suppose we want to extract data from an Azure SQL Server database, process it, and store the result in Azure Data Lake Store. Here are the steps for creating the ETL pipeline (a minimal Python SDK sketch follows the list):

  • First, create a Linked Service for the source data store, i.e. the SQL Server database
  • Suppose that we are using a car dataset
  • Next, create a Linked Service for the destination data store, i.e. Azure Data Lake Store
  • After that, create a dataset for saving the data
  • Set up the pipeline and add a copy activity
  • Finally, schedule the pipeline by adding a trigger
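The same flow can be scripted with the Azure Data Factory Python SDK. Below is a minimal, illustrative sketch only: the resource names, connection strings, and dataset names are placeholders, and the model classes follow the pattern of the documented azure-mgmt-datafactory quickstart, so exact class names and required properties may vary by SDK version.

```python
# Illustrative sketch only: names and connection strings are placeholders, and the
# model classes follow the azure-mgmt-datafactory quickstart; verify against your SDK.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    LinkedServiceResource, AzureSqlDatabaseLinkedService,
    DatasetReference, CopyActivity, PipelineResource, SqlSource, AzureDataLakeStoreSink,
)

rg, df = "my-rg", "my-data-factory"                      # hypothetical names
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Step 1: linked service for the source data store (Azure SQL Database)
sql_ls = LinkedServiceResource(properties=AzureSqlDatabaseLinkedService(
    connection_string="<sql-connection-string>"))
adf.linked_services.create_or_update(rg, df, "SourceSqlLinkedService", sql_ls)

# Step 2: a linked service for the sink (Azure Data Lake Store) and the source/sink
# datasets are created the same way with the matching model classes (omitted here).

# Step 3: pipeline with a copy activity that moves the car dataset to the lake
copy = CopyActivity(
    name="CopyCarData",
    inputs=[DatasetReference(reference_name="CarSourceDataset")],
    outputs=[DatasetReference(reference_name="CarLakeDataset")],
    source=SqlSource(),
    sink=AzureDataLakeStoreSink(),
)
adf.pipelines.create_or_update(rg, df, "CopyCarsPipeline",
                               PipelineResource(activities=[copy]))

# Step 4: run the pipeline once (a schedule trigger can be attached instead)
run = adf.pipelines.create_run(rg, df, "CopyCarsPipeline", parameters={})
print(run.run_id)
```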

7. Mention three types of triggers that Azure Data Factory supports.

  1. The Schedule trigger is used to execute the ADF pipeline on a wall-clock timetable (a schedule-trigger sketch follows this list).
  2. The Tumbling window trigger is used to execute the ADF pipeline at a periodic interval while retaining the pipeline state.
  3. The Event-based trigger responds to a blob-related event, such as adding or deleting a blob in your Azure storage account.
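As an illustration of the first type, a schedule trigger can be attached to a pipeline through the Python SDK. The sketch below is hedged: it assumes the client, resource group, factory, and pipeline from the earlier example, and the model and method names follow the SDK quickstart, so they may differ slightly between SDK versions.

```python
# Hedged sketch: assumes the DataFactoryManagementClient `adf`, resource group `rg`,
# factory `df`, and pipeline "CopyCarsPipeline" from the previous example.
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import (
    TriggerResource, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, PipelineReference,
)

recurrence = ScheduleTriggerRecurrence(
    frequency="Hour", interval=1,                        # wall-clock schedule: every hour
    start_time=datetime.utcnow() + timedelta(minutes=5),
    time_zone="UTC",
)
trigger = TriggerResource(properties=ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="CopyCarsPipeline"),
        parameters={})],
))
adf.triggers.create_or_update(rg, df, "HourlyTrigger", trigger)
adf.triggers.begin_start(rg, df, "HourlyTrigger").result()   # older SDKs expose .start()
```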

8. How to create Azure Functions?

Azure Functions are a solution for running small pieces of code, or functions, in the cloud. With these functions, we can choose our preferred programming language, and we pay only for the time the code runs, i.e. pay per use. Supported languages include C#, F#, Node.js, Java, Python, and PHP. Azure Functions also support continuous deployment and integration, and make it possible to develop serverless applications. When you enroll for Azure Training In Hyderabad, you can thoroughly learn how to create Azure Functions.
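As a concrete illustration, below is a minimal HTTP-triggered Azure Function written in Python using the v1 programming model; the function's binding configuration (function.json) is omitted, and the greeting logic is purely an example.

```python
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    """HTTP-triggered function: returns a greeting for the supplied name."""
    logging.info("Processing an HTTP request.")
    name = req.params.get("name") or "Azure"
    return func.HttpResponse(f"Hello, {name}!", status_code=200)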

9. What are the steps to access data through the use of the other 80 dataset types in the Data Factory?

Currently, the Mapping Data Flow functionality natively supports Azure SQL Data Warehouse, Azure SQL Database, delimited text files from Azure Blob storage or Azure Data Lake Storage Gen2, and Parquet files from Blob storage or Data Lake Storage Gen2 as source and sink.

For data from any of the other connectors, you first use the Copy activity to stage the data, and then run a Data Flow activity to transform it after it has been staged.

10. What do you need for executing an SSIS package in the Data Factory?

You have to create an SSIS IR and an SSISDB catalog which is hosted in Azure SQL Managed Instance or Azure SQL Database.

11. What are Datasets in ADF?

The dataset is the data that you would use in your pipeline activities in the form of inputs and outputs. Generally, datasets signify the structure of data inside linked data stores like documents, files, folders, etc. For instance, an Azure blob dataset describes the folder and container in blob storage from which a specific pipeline activity must read data as input for processing.

12. What is the use of the ADF Service?

ADF is primarily used to orchestrate the copying of data between various relational and non-relational data sources that are hosted locally in your data centers or in the cloud. Moreover, ADF can be used to transform the ingested data to meet your business requirements. In most Big Data solutions, ADF is used as an ETL or ELT tool for data ingestion. When you enroll for Azure Training In Hyderabad, you can thoroughly understand the usefulness of the ADF service.

13. How do the Mapping data flow and Wrangling data flow transformation activities differ in the Data Factory?

Mapping data flow activity is a data transformation activity that is visually designed. It enables you to effectively design graphical data transformation logic in the absence of an expert developer. Moreover, it is operated as an activity inside the ADF pipeline on a fully managed ADF scaled-out Spark cluster.
On the other hand, the wrangling data flow activity is a code-free data preparation activity. It integrates with Power Query Online to make the Power Query M functions available for data wrangling, executed at scale on Spark.

14. What is Azure Databricks?

Azure Databricks is a fast, easy, and collaborative Apache Spark-based analytics platform optimized for Azure. It was designed in partnership with the founders of Apache Spark. Moreover, Azure Databricks combines the best of Databricks and Azure to let customers accelerate innovation through a quick setup. The streamlined workflows and interactive workspace facilitate collaboration between data engineers, data scientists, and business analysts.

15. What is Azure SQL Data Warehouse?

It is a large store of data collected from a broad range of sources in a company and is used to support management decisions. These warehouses enable you to accumulate data from diverse databases existing as either remote or distributed systems.
An Azure SQL Data Warehouse is created by integrating data from multiple sources and can be used for decision-making, analytical reporting, etc. In other words, it is a cloud-based enterprise data warehouse that uses parallel processing to quickly run complex queries across large volumes of data. It also serves as a solution for Big Data workloads.

16. What is Azure Data Lake?

Azure Data Lake streamlines processing tasks and data storage for analysts, developers, and data scientists. It supports these tasks across multiple platforms and languages.
It removes the barriers associated with data storage and makes it simpler to carry out stream, batch, and interactive analytics. Features in Azure Data Lake address the challenges associated with productivity and scalability and meet growing business requirements.

17. Explain the data source in the Azure data factory:

The data source is the source or destination system that contains the data to be used or processed. The data can be binary, text, CSV files, JSON files, image files, video, audio, or a proper database.

Examples of data sources include Azure Data Lake Storage, Azure Blob storage, or any database such as MySQL, Azure SQL Database, PostgreSQL, etc.

18. Why is it beneficial to use the Auto Resolve Integration Runtime?

AutoResolveIntegrationRuntime automatically tries to run the activities in the same region as, or as close as possible to, the region of the sink data source, which can improve performance.

19. How is lookup activity useful in the Azure data factory?

In an ADF pipeline, the Lookup activity is commonly used for configuration lookups. It reads from a source dataset and returns the retrieved data as the output of the activity. The output of the Lookup activity is typically used later in the pipeline to make decisions or to supply configuration values.
In simple terms, the Lookup activity is used for fetching data in the ADF pipeline. How you use it depends entirely on your pipeline logic; you can retrieve only the first row or all rows, depending on your dataset or query.

20. What are the types of variables in the Azure data factory?

Variables in an ADF pipeline allow values to be held temporarily. Their usage is similar to variables in a programming language. Two types of activities are used to assign and manipulate variable values: Set Variable and Append Variable (a sketch of a Set Variable activity follows the two types listed below).

Two types of variables in Azure data factory are:

i. System variable: These are the fixed variables from the Azure pipeline. Their examples include pipeline ID, pipeline name, trigger name, etc.

ii. User variable: User variables are manually declared depending on the logic of the pipeline.
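The fragment below sketches what a user variable and a Set Variable activity look like inside a pipeline definition, written as a Python dict that mirrors the underlying ADF JSON; the variable name, parameter name, and expression are made up purely for illustration.

```python
# Hedged sketch of a pipeline fragment, written as a Python dict that mirrors the
# underlying ADF JSON; the variable, parameter, and activity names are illustrative.
pipeline_fragment = {
    "variables": {
        # user variable, declared manually as part of the pipeline definition
        "processedPath": {"type": "String", "defaultValue": ""}
    },
    "activities": [
        {
            "name": "SetProcessedPath",
            "type": "SetVariable",
            "typeProperties": {
                "variableName": "processedPath",
                # ADF expression combining a pipeline parameter with the Pipeline
                # system variable (the pipeline's name)
                "value": "@concat(pipeline().parameters.folder, '/', pipeline().Pipeline)"
            },
        }
    ],
}
```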

21. Explain the linked service in the Azure data factory.

In Azure Data Factory, a linked service represents the connection information used to connect to an external source. It functions like a connection string and holds the user authentication information.
Two ways to create the linked service are:
1. ARM template way
2. Using the Azure Portal

22. What does it mean by the breakpoint in the ADF pipeline?

Breakpoint signifies the debug portion of the pipeline. If you wish to check the pipeline with any specific activity, you can accomplish it through the breakpoints.
To understand better, for example, you are using 3 activities in the pipeline and now you want to debug up to the second activity only. This can be done by placing the breakpoint at the second activity. To add a breakpoint, you can click the circle present at the top of the activity.

23. Is Azure Data Factory ETL or ELT tool?

It is a Microsoft tool that provides a cloud-based data integration service for data analytics at scale and supports both the ETL and ELT paradigms.

24. Why is ADF needed?

With an increasing amount of big data, there is a need for a service such as ADF that can orchestrate and operationalize processes to refine the enormous stores of raw business data into actionable business insights.

25. What is the purpose of Linked services in Azure Data Factory?

Linked services are used mainly for two purposes:

  • For a Data Store representation, i.e., any storage system such as Azure Blob storage account, a file share, or an Oracle DB/ SQL Server instance.
  • For a Compute representation, i.e., the underlying VM that will execute the activity defined in the pipeline.

26. What is required to execute an SSIS package in Data Factory?

We have to create an SSIS integration runtime and an SSISDB catalog hosted in Azure SQL Database or Azure SQL Managed Instance before executing an SSIS package.

27. How can we deploy code to higher environments in Data Factory?

We can do this with the below set of steps:

  • Create a feature branch that will store our code base.
  • Create a pull request to merge the code into the Dev branch once we are sure it is ready.
  • Publish the code from the Dev branch to generate the ARM templates.

This can trigger an automated CI/CD DevOps pipeline to promote code to higher environments like Staging or Production.

28. If you want to use the output by executing a query, which activity shall you use?

The Lookup activity can return the result of executing a query or stored procedure. The output can be a singleton value or an array of attributes, which can be consumed in a subsequent Copy Data activity, or in any transformation or control flow activity such as a ForEach activity.

29. Can we pass parameters to a pipeline run?

Yes. Parameters are a first-class, top-level concept in Data Factory. You can define parameters at the pipeline level and pass arguments as you execute the pipeline run on demand or through a trigger.
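As a hedged illustration using the Azure Data Factory Python SDK, a pipeline run can be started on demand with argument values supplied for its parameters. The client object, names, and parameter keys below are assumptions for the sake of the example, and the calls follow the pattern of the documented SDK quickstart.

```python
# Hedged sketch using azure-mgmt-datafactory: assumes an existing client `adf`,
# resource group `rg`, factory `df`, and a pipeline that declares these two parameters.
run = adf.pipelines.create_run(
    rg, df, "CopyCarsPipeline",
    parameters={"windowStart": "2024-05-01T00:00:00Z", "tableName": "cars"},
)
status = adf.pipeline_runs.get(rg, df, run.run_id)   # poll the run for its status
print(status.status)
```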

30. Can you Elaborate more on Data Factory Integration Runtime?

It is the compute infrastructure for Azure Data Factory pipelines. It is nothing but the bridge between activities and linked services. It provides the computing environment where the activity is run directly or dispatched. This allows the activity to be performed in the closest region to the target data stores.

31. What is required to execute an SSIS package in a Data Factory?

You must create an SSIS integration runtime and an SSISDB catalog hosted in Azure SQL Database or Azure SQL Managed Instance before executing an SSIS package.

32. What is the limit on the number of Integration Runtimes, if any?

Within a Data Factory, the default limit on any entities is set to 5000, including pipelines, data sets, triggers, linked services, Private Endpoints, and integration runtimes. If required, one can create an online support ticket to raise the limit to a higher number.

33. If you want to use the output by executing a query, which activity shall you use?

The Lookup activity can return the result of executing a query or stored procedure. The output can be a singleton value or an array of attributes, which can be consumed in a subsequent Copy Data activity, or in any transformation or control flow activity like a ForEach activity.

34. Can a value be calculated for a new column from the existing column from mapping in ADF?

Yes. You can use the derived column transformation in a mapping data flow to generate a new column based on the desired logic. When adding a derived column, you can either create a new column or update an existing one; enter the name of the column you are creating in the Column textbox.

35. How to debug an ADF pipeline?

Debugging is one of the crucial aspects of any coding-related activity, needed to test the code for any issues it might have. ADF provides a Debug option that lets you run and test a pipeline interactively without publishing it or attaching a trigger.

36. What does it mean by the breakpoint in the ADF pipeline?

To understand better, for example, you are using three activities in the pipeline, and now you want to debug up to the second activity only. You can do this by placing the breakpoint at the second activity. To add a breakpoint, click the circle present at the top of the activity.

37. What is the use of the ADF Service?

ADF primarily organizes the data copying between relational and non-relational data sources hosted locally in data centers or the cloud. Moreover, you can use ADF Service to transform the ingested data to fulfill business requirements. In most Big Data solutions, ADF Service is used as an ETL or ELT tool for data ingestion.

38. Explain the data source in the Azure data factory.

The data source is the source or destination system that comprises the data intended to be utilized or executed. The data type can be binary, text, CSV, JSON, image files, video, audio, or a proper database.

39. How to copy multiple tables from one datastore to another datastore?

Maintain a lookup table or file containing the list of tables and their sources that need to be copied. Then use a Lookup activity and a ForEach loop activity to scan through the list. Inside the ForEach loop activity, use a Copy activity or a mapping data flow to copy each table to the destination datastore. A sketch of this pattern is shown below.
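A rough shape of this pattern, expressed as a Python dict that mirrors the pipeline JSON, is sketched below; the lookup dataset, control table, and inner copy activity are placeholders, and a real pipeline would also define the parameterized source and sink datasets.

```python
# Hedged sketch of the Lookup + ForEach copy pattern, written as a Python dict that
# mirrors the ADF pipeline JSON; dataset names and properties are illustrative only.
pipeline_fragment = {
    "activities": [
        {
            "name": "GetTableList",
            "type": "Lookup",
            "typeProperties": {
                "source": {"type": "AzureSqlSource",
                           "sqlReaderQuery": "SELECT table_name FROM control.TablesToCopy"},
                "dataset": {"referenceName": "ControlTableDataset", "type": "DatasetReference"},
                "firstRowOnly": False          # return all rows, not just the first
            },
        },
        {
            "name": "CopyEachTable",
            "type": "ForEach",
            "dependsOn": [{"activity": "GetTableList", "dependencyConditions": ["Succeeded"]}],
            "typeProperties": {
                # iterate over the rows returned by the lookup
                "items": "@activity('GetTableList').output.value",
                "activities": [
                    {
                        "name": "CopyOneTable",
                        "type": "Copy",
                        # the inner copy activity would reference parameterized source
                        # and sink datasets, passing "@item().table_name" as the table name
                    }
                ],
            },
        },
    ],
}
```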

40. Can we integrate Data Factory with Machine learning data?

Yes, we can train and retrain the model on machine learning data from the pipelines and publish it as a web service.

41. What is an Azure SQL database? Can you integrate it with Data Factory?

Part of the Azure SQL family, Azure SQL Database is an always up-to-date, fully managed relational database service built for the cloud for storing data. Using the Azure data factory, we can easily design data pipelines to read and write to SQL DB.

42. Can you host SQL Server instances on Azure?

Yes. Azure SQL Managed Instance is the intelligent, scalable cloud database service that combines the broadest SQL Server instance or SQL Server database engine compatibility with all the benefits of a fully managed and evergreen platform as a service.

43. What is Azure Data Lake Analytics?

It is an on-demand analytics job service that simplifies storing data and processing big data.

44. How would you set up a pipeline that extracts data from a REST API and loads it into an Azure SQL Database while managing authentication, rate limiting, and potential errors or timeouts during the data retrieval?

You can use the REST linked service to set up authentication and rate-limiting settings. To handle errors or timeouts, you can configure a retry policy on the pipeline activities (see the policy sketch below) and use Azure Functions or Azure Logic Apps to address any issues during the process.
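For instance, each activity in a pipeline carries a policy block where retry behaviour and timeout can be configured; the sketch below, written as a Python dict mirroring the pipeline JSON, shows what that might look like for the copy activity that calls the REST source. The values are illustrative and the source/sink details are omitted.

```python
# Hedged sketch of an activity-level policy (shown as a Python dict that mirrors the
# pipeline JSON); the numbers are illustrative and the REST source details are omitted.
copy_from_rest = {
    "name": "CopyFromRestApi",
    "type": "Copy",
    "policy": {
        "timeout": "0.01:00:00",            # fail the activity if it runs longer than 1 hour
        "retry": 3,                         # retry up to 3 times on transient failures
        "retryIntervalInSeconds": 60        # wait 60 seconds between retries
    },
    # typeProperties would define the REST source (with pagination and rate-limit
    # settings from the REST linked service) and the Azure SQL sink.
}
```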

45. How can one combine or merge several rows into one row in ADF? Can you explain the process?

In Azure Data Factory (ADF), you can merge or combine several rows into a single row using the "Aggregate" transformation.

46. Is there any limit on how many integration runtime instances can exist?

There is no limit on the number of integration runtime instances that can exist within a data factory. However, there is a limit on the number of virtual machine cores that the integration runtime can use for the execution of SSIS packages per subscription.

47. How does the Data Factory's integration runtime actually function?

Integration Runtime is the secure compute platform that enables Data Factory to offer data integration capabilities across different network configurations. It also ensures the work is carried out in the region closest to the data store. If you want to learn Azure step by step, you must be familiar with terminologies like this and other key aspects of Azure.

48. What prerequisites does Data Factory SSIS execution require?

Either an Azure SQL Managed Instance or an Azure SQL Database must be used as the hosting location for your SSIS IR and SSISDB catalog.

49. What are "Datasets" in the ADF framework?

Datasets represent the inputs and outputs used by the pipeline activities. A dataset typically describes the structure of the data inside a linked data store, which can be a file, folder, document, or any other kind of store. An Azure blob dataset, for example, specifies the blob storage folder and container from which a particular pipeline activity must read data for processing.

50. What is Azure Databricks?

Azure Databricks is an analytics platform built on Apache Spark and fine-tuned for Azure. It is fast, simple, and collaborative, and it was designed together with the creators of Apache Spark. Azure Databricks combines the most beneficial aspects of Databricks and Azure to enable rapid deployment and help customers accelerate innovation. The streamlined workflows and interactive workspace make collaboration between data engineers, data scientists, and business analysts easier.

Everything you need to know about Azure solutions architect salary

 

Azure Architect: An Overview

The Microsoft Azure Certification is one of the prestigious Microsoft certifications. It imparts many new and useful Microsoft Azure skills to candidates and unlocks plenty of job opportunities. An Azure Architect can earn a handsome salary with the related certifications and skills. This article is centered on the salary of an Azure solutions architect, but before that, let’s first understand what an Azure Architect is:

What is an Azure Architect?

  • An Azure architect is also known as an Azure Cloud Solutions architect.
  • It is Microsoft’s version of the generic cloud architect role.
  • Primarily, an Azure Architect focuses on cloud adoption, the expansion and coordination of the cloud architecture, and the formation of a cloud strategy.
  • An Azure Architect works independently with business and technology stakeholders to identify the hurdles faced by an organization.
  • These professionals are accountable for the development and implementation of the Azure cloud architecture.
  • Moreover, they handle concerns such as migration.

Now let’s jump to the main topic i.e. salary of Azure Solutions Architect:

Azure Solutions Architect Salary:

The salary of an Azure Solutions Architect is $156,000/year on average, or approximately $80/hour. In entry-level positions, an Azure Solutions Architect earns around $124,977/year; professionals with entry-level skills typically have 1-3 years of experience. Azure Solutions Architects with 4-7 years of experience can earn about $160,000/year.

Exceptionally experienced Azure Solutions Architects can make up to $195,000/year. When additional bonuses and compensation are considered, an Azure Solutions Architect can expect earnings of up to $170,000 per year.

Now let’s go through the wonderful job opportunities an Azure Solutions Architect can benefit from:

  • Azure / AWS Cloud Architect
  • AWS Cloud Architect
  • Azure Solutions Architect
  • Lead Solutions Architect (AZURE)
  • Assistant Director / Senior Azure Solution Architect
  • Infrastructure Cloud Architect, Copenhagen
  • Application Security Engineer - Oslo
  • Enterprise Architect
  • Cloud Security Architect - Copenhagen OR Aarhus
  • Senior Azure Data Architect

Even though you may have learned Azure step by step, it is vital to know how to progress in this field. For that, you need to know the proper Azure certification path, so let’s look at the Solutions Architect certification path.


  1. Microsoft Certified: Azure Fundamentals
  2. Microsoft Certified: Azure Solutions Architect Expert

  1. Microsoft Certified: Azure Fundamentals:

This is considered an entry-level certification. An experienced Azure Solutions Architect can skip it if he/she already possesses knowledge of Azure fundamentals. The Microsoft Certified: Azure Fundamentals certification imparts foundational knowledge of cloud services and how these services are provided with Microsoft Azure. Primarily, this certification is intended for candidates who are just beginning to work with cloud-based solutions and services or are new to Azure. The exam you need to pass to attain Microsoft Certified: Azure Fundamentals is AZ-900.

  2. Microsoft Certified: Azure Solutions Architect Expert:

Candidates who become Azure Solutions Architect Expert certified have subject matter expertise in designing and implementing solutions that run on Microsoft Azure. They are familiar with aspects like compute, storage, networking, and security. Candidates must have intermediate-level skills in Azure administration, and they must understand Azure development and DevOps processes.

The corresponding job roles come with responsibilities such as advising stakeholders and translating business requirements into scalable, secure, and reliable cloud solutions.

The exams you need to pass are:

When you pass Exam AZ-303: Microsoft Azure Architect Technologies and Exam AZ-304: Microsoft Azure Architect Design, you earn Microsoft Azure Solutions Architect Expert Certification.

Why attain the Azure Architect certification?

The question may arise as to why an Azure Solutions Architect should earn the Azure Architect certification. Even though you possess relevant skills to secure a good job position as an Azure Solutions Architect, it is vital to get this certification.

To gain more experience and exposure in the Azure Architect field, the professional should pass the AZ-303 and AZ-304 exams. These certification exams test the expert-level skills of an Azure Solutions Architect. Passing these exams ascertains that you are experienced in everything from implementing Azure infrastructure to handling data platforms. Preparing for these exams also equips you for Azure interview questions and answers.

It is vital to know what factors affect the salary of an Azure Solutions Architect, so let’s look at the below section:

Factors influencing the Azure Solutions Architect Salary:

1. Geographical location:

The geographical location of your job affects not only your living conditions and standard of living but also the salary of an Azure Solution Architect or Azure Developer. You can notice differences in the Azure solution architect salary all over the world, largely due to the economic conditions of each country. You can choose to relocate if better job opportunities are available for the Azure Solutions Architect role elsewhere.

2. Skillsets:

Companies hiring Azure Solutions Architects look for specific skill sets, combining technical skills and soft skills. Having them makes it more likely that you will be paid an above-average salary, and these skill sets also help you stay ahead of the competition in terms of professional growth.

It is recommended that you enroll in some online courses before going for certification, because doing so helps you pass the certification without wasting time. Look for relevant Azure certifications from renowned institutes. When you look for Azure Training In Hyderabad, it is best to choose a training institute that imparts all the essential skill sets.

3. Experience:

Entry-level Azure Solution Architects can attain job roles as Junior Azure Solutions Architects. At this level, they hold 0-3 years of experience. Gradually, the salary increases depending on your growth, performance, and experience.

An Azure Solutions Architect with 4 to 7 years of experience can earn approximately $150,000 per year. To attain advanced-level expertise as an Azure Solutions Architect, you can go for further certification or training courses.

An Introduction to Azure App Services

 

Azure App Services: An Overview

Azure App Services provides a platform to build and deploy web, mobile, or integration applications. We can build robust cloud-native apps, with complex architectures and secure connections, that scale as needed for any platform or device, without worrying about the virtual machine that hosts them. App Services also lets previously built applications be migrated and run as one of the App Service types. App Service runs on and is maintained by Azure Service Fabric, which takes care of running the application and keeping it available.

In this Azure Tutorial, we will explore more about Azure App Services which will include Microsoft Azure App Service, Introduction to Azure App Services, Azure App service tutorial, implementing Azure App services, and Azure App service plan. Consider our Azure Certification Course for a better understanding of all Azure concepts.

Types of Azure App Services

  1. Web Apps

  2. API Apps

  3. Logic Apps

  4. Mobile Apps

  5. Function Apps


Web Apps

  • Web Apps enable us to host our web applications without worrying about the infrastructure plumbing that is required.
  • In a traditional hosting setup, we need to make sure the server is up, the OS is updated, and IIS is running.
  • Hosting in an Azure web app removes all this burden, and the Service Fabric layer below it makes sure that the app is up and running.
  • Deploying applications to the web app service in Azure helps developers focus on delivering business value rather than spending time on server updates or OS patches.
  • Web apps support not only .NET applications but also Node.js, Java, PHP, and Python.
  • Azure promises to keep web apps up and running with a 99.95% SLA.
  • It also provides a provision to attach custom domains and SSL certificates with the web app.
  • We can also have multiple deployment slots so that we can test our app in Staging or the pre-prod environment.
  • This also helps in moving new changes to production with no downtime.
  • Moreover, we also get the flexibility to revert a deployment.
  • This is made possible by swapping the virtual IP addresses of the slots.
  • And the staging site goes live seamlessly.
  • Web apps also come with a feature of manual or auto-scaling.
  • We can configure authentication and authorization out of the box with providers like Azure AD.
  • We can also load balance traffic across apps with Traffic Manager.
  • Web apps can also access data that lies outside Azure like an on-premise data source with some hybrid connections.

API Apps

API Apps are an offering of App Service that helps to host Web APIs, enabling us to expose existing or new APIs. This is also part of the platform as a service, so we do not need to worry about infrastructure plumbing to bring our APIs up and running. It also supports identity providers to secure the APIs. API Apps support Java, Python, and Node.js for building and deploying Web APIs, and come with an inbuilt Swagger implementation which helps in API definition and creating client apps.

Logic Apps

Logic apps enable us to create functional workflows by orchestrating software-as-a-service components. They are used to connect different components of a solution, to manage and trigger events, and to perform a desired action on some other service. For example, we can build a logic app that is triggered when a new file is uploaded to blob storage and performs the action of sending a notification to a user. In many complex solutions, logic apps act as the communication channel between various services in a microservice architecture.


Logic apps facilitate workflows by using triggers, connectors, and actions.

  1. Logic apps can be triggered manually, or at any scheduled time. Moreover, logic apps can also be triggered on the basis of some event on any connected component.

  2. To support various kinds of workflows, logic apps also have conditions. A condition is a logical section that evaluates data against some criterion, and we can perform a specific action for each outcome of the condition.

  3. Logic apps internally use connectors to connect to different components. These connectors may connect to Azure SQL DB, Mail Exchange, SharePoint, blob storage, or API Apps.

Mobile Apps

Mobile Apps enable us to build a backend for mobile applications, providing capabilities to mobile client applications. This can be thought of as a web service built to support mobile client scenarios. The client can be a Windows Universal app, an iOS app, an Android app, etc., and it uses the Mobile Apps SDK to connect with the backend. Mobile Apps provide certain unique capabilities:

  1. They are cross-platform, which means apps built for any platform (Android, Windows, iOS) can consume them.

  2. Mobile Apps also support secure client connections, allowing client applications to authenticate with the default identity providers such as Active Directory and Microsoft accounts.

  3. Offline Sync – This feature enables the client applications to work with data when they are offline and sync it when they are online.

  4. Push Notifications – The Mobile apps can be used to send push notifications to the client applications.

Azure Functions

Azure Functions are event-driven components that eliminate the need to manage a server to host a piece of logic. Azure Functions can intercept events occurring in any Azure service, third-party service, or on-premises system. They are an evolution of Azure WebJobs, which is a feature of Azure App Services.

For example, Azure Functions can be triggered on Event Hubs, Service Bus topics or queues, or via a timer.

  1. An Azure Function can run any executable. Azure Functions are also referred to as serverless. This does not mean they do not run on servers; they do, on Azure Service Fabric, but we do not need to manage the server. Azure Functions consume resources only while they run and scale automatically by creating replica instances.

  2. Serverless Azure Functions are not fully featured applications but short-lived tasks within an application that do a specific job. We can also chain different functions together to build more comprehensive solutions.

  3. Azure Functions are supported in multiple languages such as C#, F#, Node.js, Python, PHP, Batch, Bash, and any executable file format. In terms of security, they can be secured with OAuth and other identity providers like Azure AD. A minimal Python example follows this list.
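To make the event-driven model concrete, here is a minimal timer-triggered function in Python (v1 programming model); the actual schedule lives in the function's binding configuration, which is omitted here, and the work done inside is just a placeholder.

```python
import logging
import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    """Timer-triggered function: runs on the schedule defined in its binding."""
    if mytimer.past_due:
        logging.warning("The timer is running late.")
    logging.info("Timer fired; performing the short-lived unit of work here.")
```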

App Service Plan

All the apps that run under app service are governed and observed by a contract with the cloud service provider known as the App Service plan. This acts as a container for the applications and defines the boundary limitations of resources available to consume and scale. An app service plan comes with measured compute resources that keep our app running. These compute resources include fixed computing power which can be consumed by different applications deployed in the same app service plan. The amount of computing power defines how much we need to pay for the plan.

The app service plan is categorized by its pricing tiers as below:

  1. Free

    This App Service plan uses a single VM for multiple app service plans and can host multiple applications with some limited computing power. Also, we cannot scale our apps in this app service plan and applications deployed in this plan cannot be provided with custom domain names.

  2. Shared

    The shared app service tier runs in a similar environment as that of a Free tier. This tier allocates CPU quotas to each app that runs on the shared resources, and the resources cannot scale out. We can add a custom domain to the apps in this tier.

  3. Dedicated

    The Dedicated tiers run apps on dedicated Azure VMs. The apps within the same app service plan can share the resources and power. This comes with 99.95% SLA and scaling options. This tier is further divided into Basic, Standard, and Premium with increasing computing power and features.

    • Basic

      The basic app service tier is generally used in dev and test environments during development and does not support auto-scale. Applications can be scaled out manually up to 3 instances.

    • Standard

      The standard tier features 5 deployment slots and can be configured to auto-scale with increasing traffic and load on the application. A typical production environment fits into the Standard tier plan.

    • Premium

      The premium tier is suitable for large-scale workloads; it comes with 20 deployment slots, and apps can be backed up 50 times a day.

  4. Isolated

    The isolated tier runs dedicated Azure VMs on dedicated Azure Virtual Networks. This means that we run a private instance of all web app infrastructure deployed in an isolated virtual network. This type of environment is also known as running in an App Service Environment (ASE).

App Service Pricing and Tiers

Source: https://azure.microsoft.com

App Service Scaling

Web apps offer two types of scaling based on our needs: Vertical Scaling (scale up, scale down) and Horizontal Scaling (scale out, scale in). Scaling is important for a couple of reasons:

  1. As users accessing our app grow, we want them to have a seamless experience with the app.

  2. We only want to pay for the amount of computing power we use

Vertical Scaling

  1. Scale up

    Increasing the computing power of infrastructure to support heavy workloads by increasing the CPU power and storage efficiency.

  2. Scale Down

    Decreasing the computing power of infrastructure in case the website hit goes down by decreasing the CPU power and storage efficiency.

Horizontal Scaling

  1. Scale-Out

    This is also called horizontal scaling. The number of instances of the app is increased to distribute the traffic load.

  2. Scale In

    The number of instances of the app is decreased to reduce cost in the off-season when traffic on the web app goes low.

Limitations of Azure App Service

Though App Service comes with a lot of benefits and ease of doing cloud-native development and deployments, there are a few limitations that should be understood well.

  1. No Remote Desktop Connection: Since app services are part of the broader platform-as-a-service model, we have very limited access to the infrastructure on which they run. This makes it difficult to troubleshoot performance issues, as we cannot log into the server to see log files or the event viewer.

  2. No support for third-party Software management tools- Since we do not have to manage the server at all, we have no authority to install any monitoring tools like Dynatrace or Splunk.

  3. Performance counters not visible: To keep an application healthy, we often want to see performance indexes for key workloads, such as IIS queues, but with App Service in Azure this is still not available.

Conclusion:

So in this article, we have learned about Azure App Services. I hope you enjoyed learning these concepts while programming with Azure. Feel free to ask any questions from your side. Your valuable feedback or comments about this article are always welcome. Level up your career in Azure with our Azure Fundamental Course.