Wednesday, 5 September 2018

Microsoft Azure - Tables

Storing a table here does not mean a relational database. Azure Storage can store a plain table without any foreign keys or any other kind of relation. These tables are highly scalable and ideal for handling large amounts of data; they can store and serve queries over very large data sets. Relational databases are handled by the separate SQL Database service.
The three main parts of the service are −
  • Tables
  • Entities
  • Properties
For example, if ‘Book’ is an entity, its properties will be Id, Title, Publisher, Author, etc. A table is created for a collection of entities. An entity can have up to 252 custom properties and 3 system properties. An entity always has the system properties PartitionKey, RowKey and Timestamp. Timestamp is system generated, but you have to specify the PartitionKey and RowKey while inserting data into the table. The example below will make it clearer. Table and property names are case sensitive, which should always be considered while creating a table.
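As an illustration only (the values are made up), a ‘Book’ entity stored in a table carries the three system properties alongside its custom properties. The hashtable below is not a storage API call, just a sketch of what one entity holds.
# A sketch of a single 'Book' entity - system properties plus custom properties.
$bookEntity = @{
   PartitionKey = "Programming"   # chosen by you when inserting
   RowKey       = "Book001"       # chosen by you; must be unique within the partition
   Timestamp    = (Get-Date)      # maintained by the service; shown here only for illustration
   Id           = 1
   Title        = ".Net"
   Publisher    = "abc"
   Author       = "abc"
}
$bookEntity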

How to Manage Tables Using PowerShell

Step 1 − Download and install Windows PowerShell as discussed previously in the tutorial.
Step 2 − Right-click on ‘Windows PowerShell’, choose ‘Pin to Taskbar’ to pin it on the taskbar of your computer.
Step 3 − Choose ‘Run ISE as Administrator’.

Creating a Table

Step 1 − Copy the following commands and paste them into the screen. Replace the storage account name and key with your own.
Step 2 − Log in to your account.
$StorageAccountName = "mystorageaccount" 
$StorageAccountKey = "mystoragekey" 
$Ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
   -StorageAccountKey $StorageAccountKey
Step 3 − Create a new table.
$tabName = "Mytablename" 
New-AzureStorageTable -Name $tabName -Context $Ctx 
The following image shows a table being created by the name of ‘book’.
[Image: Create Table]
You can see that it has returned the following endpoint as a result.
https://tutorialspoint.table.core.windows.net/Book
Similarly, you can retrieve, delete and insert data into the table using preset commands in PowerShell.

Retrieve Table

$tabName = "Book" 
Get-AzureStorageTable -Name $tabName -Context $Ctx

Delete Table

$tabName = "Book"
Remove-AzureStorageTable -Name $tabName -Context $Ctx

Insert Rows into Table

function Add-Entity() { 
   [CmdletBinding()] 
 
   param( 
      $table, 
      [String]$partitionKey, 
      [String]$rowKey, 
      [String]$title, 
      [Int]$id, 
      [String]$publisher, 
      [String]$author 
   )  
   
   $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity `
      -ArgumentList $partitionKey, $rowKey 
  
   $entity.Properties.Add("Title", $title) 
   $entity.Properties.Add("ID", $id) 
   $entity.Properties.Add("Publisher", $publisher) 
   $entity.Properties.Add("Author", $author) 
   
   
   $result = $table.CloudTable.Execute(
      [Microsoft.WindowsAzure.Storage.Table.TableOperation]
      ::Insert($entity)) 
}
  
$StorageAccountName = "tutorialspoint" 
$StorageAccountKey = Get-AzureStorageKey -StorageAccountName $StorageAccountName 
$Ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
   -StorageAccountKey $StorageAccountKey.Primary  

$TableName = "Book"
  
$table = Get-AzureStorageTable -Name $TableName -Context $Ctx -ErrorAction Ignore 
 
#Add multiple entities to a table. 
Add-Entity -Table $table -PartitionKey Partition1 -RowKey Row1 -Title ".Net" -Id 1 `
   -Publisher abc -Author abc 
Add-Entity -Table $table -PartitionKey Partition2 -RowKey Row2 -Title "JAVA" -Id 2 `
   -Publisher abc -Author abc 
Add-Entity -Table $table -PartitionKey Partition3 -RowKey Row3 -Title "PHP" -Id 3 `
   -Publisher xyz -Author xyz 
Add-Entity -Table $table -PartitionKey Partition4 -RowKey Row4 -Title "SQL" -Id 4 `
   -Publisher xyz -Author xyz

Retrieve Table Data

$StorageAccountName = "tutorialspoint" 
$StorageAccountKey = Get-AzureStorageKey -StorageAccountName $StorageAccountName 
$Ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
   -StorageAccountKey $StorageAccountKey.Primary 

$TableName = "Book"
  
#Get a reference to a table. 
$table = Get-AzureStorageTable -Name $TableName -Context $Ctx  

#Create a table query. 
$query = New-Object Microsoft.WindowsAzure.Storage.Table.TableQuery

#Define columns to select. 
$list = New-Object System.Collections.Generic.List[string] 
$list.Add("RowKey") 
$list.Add("ID") 
$list.Add("Title") 
$list.Add("Publisher") 
$list.Add("Author")
  
#Set query details. 
$query.FilterString = "ID gt 0" 
$query.SelectColumns = $list 
$query.TakeCount = 20
 
#Execute the query. 
$entities = $table.CloudTable.ExecuteQuery($query)

#Display entity properties with the table format. 

$entities | Format-Table PartitionKey, RowKey, @{ Label = "Title"; 
   Expression = {$_.Properties["Title"].StringValue}}, @{ Label = "ID"; 
   Expression = {$_.Properties["ID"].Int32Value}}, @{ Label = "Publisher"; 
   Expression = {$_.Properties["Publisher"].StringValue}}, @{ Label = "Author"; 
   Expression = {$_.Properties["Author"].StringValue}} -AutoSize 
The output will be as shown in the following image.
[Image: Retrieve Table]

Delete Rows from Table

$StorageAccountName = "tutorialspoint" 
 
$StorageAccountKey = Get-AzureStorageKey -StorageAccountName $StorageAccountName 
$Ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName `
   -StorageAccountKey $StorageAccountKey.Primary  

#Retrieve the table. 
$TableName = "Book" 
$table = Get-AzureStorageTable -Name $TableName -Context $Ctx -ErrorAction Ignore 

#If the table exists, start deleting its entities. 
if ($table -ne $null) { 
   #Together the PartitionKey and RowKey uniquely identify every   
   #entity within a table.
 
   $tableResult = $table.CloudTable.Execute(
      [Microsoft.WindowsAzure.Storage.Table.TableOperation] 
      ::Retrieve("Partition1", "Row1")) 
  
   $entity = $tableResult.Result;
 
   if ($entity -ne $null) {
      $table.CloudTable.Execute(
         [Microsoft.WindowsAzure.Storage.Table.TableOperation] 
         ::Delete($entity)) 
   } 
}
The above script deletes the first row from the table, since we have specified Partition1 and Row1 in the script. After deleting the row, you can check the result by running the retrieval script again; you will see that the first row has been deleted.
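If you prefer to confirm the deletion from within the same script, a small check along these lines (reusing the $table reference from the script above) should also work:
# A minimal verification sketch: try to read back the entity that was deleted.
$tableResult = $table.CloudTable.Execute(
   [Microsoft.WindowsAzure.Storage.Table.TableOperation]::Retrieve("Partition1", "Row1"))

if ($tableResult.Result -eq $null) {
   Write-Host "Entity Partition1/Row1 is no longer in the table."
}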
While running these commands, please ensure that you have replaced the account name and account key with your own.

How to Manage Tables Using Azure Storage Explorer

Step 1 − Log in to your Azure account and go to your storage account.
Step 2 − Click on the link ‘Storage explorer’ as shown in the purple circle in the following image.
[Image: Storage Explorer]
Step 3 − Choose ‘Azure Storage Explorer for Windows’ from the list. It is a free tool that you can download and install on your computer.
Step 4 − Run this program on your computer and click ‘Add Account’ button at the top.
Step 5 − Enter ‘Storage Account Name’ and ‘Storage Account Key’ and click ‘Test Access’. The buttons are encircled in the following image.
[Image: Storage Account Name]
Step 6 − If you already have any tables in storage, you will see them in the left panel under ‘Tables’. You can see the rows by clicking on a table.

Create a Table

Step 1 − Click on ‘New’ and enter the table name as shown in the following image.
[Image: Create New Table]

Insert Row into Table

Step 1 − Click on ‘New’.
Step 2 − Enter Field Name.
Step 3 − Select data type from dropdown and enter field value.
[Image: Select Data Type From Dropdown]
Step 4 − To see the rows created click on the table name in the left panel.
Azure Storage Explorer is a very basic and easy-to-use interface for managing tables. You can easily create and delete tables, and upload and download table data, using this interface. This makes the tasks much easier for developers compared to writing lengthy scripts in Windows PowerShell.

Microsoft Azure - Queues

In the common language used by developers, a queue is a data structure that follows the First In, First Out (FIFO) rule: a data item is inserted at the back of the queue and retrieved from the front. Azure Queues are a very similar concept, used to store messages in a queue. A sender sends the messages, and a client receives and processes them. A message has a few attributes attached to it, for example an expiry time.
A client usually processes and then deletes a message. The Windows Azure service lets a message be stored for 7 days; after that, it gets deleted automatically if it has not been deleted by the client. There can be one sender and one client, one sender and many clients, or many senders and many clients.
Windows Azure offers two services for message queues. This chapter covers the Windows Azure Queue service; the other service is called ‘Service Bus Queue’.
Decoupling the components is one of the advantages of message queue services. It runs in an asynchronous environment where messages can be sent among the different components of an application. Thus, it provides an efficient solution for managing workflows and tasks. For example, a message to complete a task is sent from the frontend of the application and is received by a backend worker, which then completes the task and deletes the message.
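A rough sketch of this frontend/worker pattern, using the storage cmdlets covered later in this chapter; the account name, key and queue name below are placeholders, and the queue is assumed to exist already:
$accountKey = "<your storage account key>"
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey $accountKey
$queue = Get-AzureStorageQueue -Name "taskqueue" -Context $ctx

# Frontend: enqueue a work item.
$msg = New-Object -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
   -ArgumentList "resize-image-42"
$queue.CloudQueue.AddMessage($msg)

# Backend worker: pick up the next work item, process it, then delete it.
$work = $queue.CloudQueue.GetMessage([System.TimeSpan]::FromSeconds(30))
if ($work -ne $null) {
   Write-Host "Processing:" $work.AsString
   $queue.CloudQueue.DeleteMessage($work)
}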

Considerations

The messages in a storage queue are not replicated anywhere, which means there is only one copy of your message. The maximum number of messages that can be processed is 20,000. The maximum size of a message is 64 KB.

Managing Queues using PowerShell

Create a Queue

Step 1 − Right-click on Windows PowerShell in the taskbar. Choose ‘Run ISE as administrator’.
Step 2 − Run the following command to access your account. Please replace the account name and key with your own.
$ctx = New-AzureStorageContext -StorageAccountName tutorialspoint `
   -StorageAccountKey iUZNeeJD+ChFHt9XHL6D5rkKFWjzyW4FhV0iLyvweDi+Xtzfy76juPzJ+mWtDmbqCWjsu/nr+1pqBJjrdOO2+A==
Step 3 − Specify the storage account in which you want to create a queue.
Set-AzureSubscription -SubscriptionName "BizSpark" -CurrentStorageAccount tutorialspoint 
Step 4 − Create a Queue.
$QueueName = "thisisaqueue" 
$Queue = New-AzureStorageQueue -Name $QueueName -Context $ctx 
[Image: Create a Queue]

Retrieve a Queue

$QueueName = "thisisaqueue" 

$Queue = Get-AzureStorageQueue -Name $QueueName -Context $ctx

Delete a Queue

$QueueName = "thisisaqueue" 

Remove-AzureStorageQueue -Name $QueueName -Context $ctx
[Image: Delete a Queue]

Insert a Message into a Queue

Step 1 − Log in to your account.
$ctx = New-AzureStorageContext -StorageAccountName tutorialspoint `
   -StorageAccountKey iUZNeeJD+ChFHt9XHL6D5rkKFWjzyW4FhV0iLyvweDi+Xtzfy76juPzJ+mWtDmbqCWjsu/nr+1pqBJjrdOO2+A==
Step 2 − Specify the storage account you want to use.
Set-AzureSubscription -SubscriptionName "BizSpark" -CurrentStorageAccount tutorialspoint
Step 3 − Retrieve the queue and then insert the message.
$QueueName = "myqueue" 
$Queue = Get-AzureStorageQueue -Name $QueueName -Context $ctx 

if ($Queue -ne $null) {  
   $QueueMessage = New-Object -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
      -ArgumentList "my message is this"  
   $Queue.CloudQueue.AddMessage($QueueMessage) 
}
The ‘if’ condition in the script above checks whether the specified queue exists.
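If the queue might not exist yet, one possible variation is to create it on the fly. And, assuming the five-argument AddMessage method of the Microsoft.WindowsAzure.Storage client library used throughout this chapter, you can also pass an explicit time-to-live shorter than the default seven days; both ideas are sketched below.
if ($Queue -eq $null) {
   # Create the queue if it does not exist yet.
   $Queue = New-AzureStorageQueue -Name $QueueName -Context $ctx
}

# Add a message that expires after two days instead of the default seven.
# Arguments: message, timeToLive, initialVisibilityDelay, options, operationContext.
$QueueMessage = New-Object -TypeName Microsoft.WindowsAzure.Storage.Queue.CloudQueueMessage `
   -ArgumentList "my message is this"
$Queue.CloudQueue.AddMessage($QueueMessage, [System.TimeSpan]::FromDays(2), $null, $null, $null)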

Dequeue Next Message from Queue

Step 1 − First connect to your account and specify the storage account, by running the commands as shown in the above steps.
Step 2 − Retrieve the queue.
$QueueName = "myqueue" 
$Queue = Get-AzureStorageQueue -Name $QueueName -Context $ctx 
$InvisibleTimeout = [System.TimeSpan]::FromSeconds(10)
Step 3 − Dequeue the next message.
$QueueMessage = $Queue.CloudQueue.GetMessage($InvisibleTimeout)
Step 4 − Delete the dequeued message.
$Queue.CloudQueue.DeleteMessage($QueueMessage)

Managing Queues using Azure Storage Explorer

Step 1 − Select the storage account from the dropdown at the top right. Accounts will be displayed if you have added them during your previous use. If not, you can add an account and it will ask for your credentials. After signing in, you will be logged into your account in Azure Storage Explorer.
Step 2 − You can add a new queue by selecting ‘Queues’ from the left panel and clicking ‘New’ as shown in the following image.
[Image: Queue in Azure Storage Explorer]
Step 3 − Enter the name of the queue and it will be created in your storage account.
Step 4 − Add and delete the messages by selecting the queue in the left panel.
[Image: Queue in Azure Storage Explorer]

Microsoft Azure - Blobs

Let us first understand what a Blob is. The word ‘Blob’ expands to Binary Large OBject. Blobs include images, text files, videos and audio files. There are three types of blobs in the service offered by Windows Azure, namely block, append and page blobs.
  • Block blobs are collections of individual blocks, each with a unique block ID. Block blobs allow users to upload large amounts of data.
  • Append blobs are block blobs optimized for append operations, which makes them efficient for workloads such as logging.
  • Page blobs are compilations of pages. They allow random read and write operations. While creating a blob, if the type is not specified, it is set to block type by default; a sketch of choosing the type explicitly from PowerShell follows this list.
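If you want to pick the blob type explicitly when uploading from PowerShell, a sketch along these lines should work. It assumes a container named ‘images’ already exists, a storage context $ctx created with New-AzureStorageContext (as in the later sections), a recent Azure.Storage module that supports the -BlobType parameter, and placeholder file paths.
# Upload a file as a block blob (the default type).
Set-AzureStorageBlobContent -File "E:\MyPictures\MonitorLog.png" -Container "images" `
   -Blob "MonitorLog-block.png" -BlobType Block -Context $ctx

# Upload a VHD as a page blob; page blobs require a size that is a multiple of 512 bytes,
# which is why a VHD is used here.
Set-AzureStorageBlobContent -File "E:\MyVHDs\disk.vhd" -Container "images" `
   -Blob "disk.vhd" -BlobType Page -Context $ctx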
All the blobs must be inside a container in your storage. Here is how to create a container in Azure storage.

Create a Container

Step 1 − Go to the Azure portal and then into your storage account.
Step 2 − Create a container by clicking ‘Create new container’ as shown in the following image.
[Image: Create a Container]
There are three options in the Access dropdown which set who can access the blobs. The ‘Private’ option lets only the account owner access them. ‘Public Container’ allows anonymous access to all the contents of that container. The ‘Public Blob’ option sets open access to the blobs but does not allow listing the contents of the container.
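The same three access levels can also be set from PowerShell when you create the container; a minimal sketch, assuming a storage context $ctx created with New-AzureStorageContext as shown in the next section and an illustrative container name:
# -Permission Off       -> 'Private'
# -Permission Container -> 'Public Container'
# -Permission Blob      -> 'Public Blob'
New-AzureStorageContainer -Name "images" -Permission Off -Context $ctx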

Upload a Blob using PowerShell

Step 1 − Go to ‘Windows PowerShell’ in the taskbar and right-click. Choose ‘Run ISE as Administrator’.
Step 2 − The following command will let you access your account. Replace the account name and key in all the commands with your own.
$ctx = New-AzureStorageContext -StorageAccountName tutorialspoint `
   -StorageAccountKey iUZNeeJD+ChFHt9XHL6D5rkKFWjzyW4FhV0iLyvweDi+Xtzfy76juPzJ+mWtDmbqCWjsu/nr+1pqBJjrdOO2+A== 
Step 3 − Run the following command. It will get you the details of your Azure account and make sure that your subscription is all set.
Get-AzureSubscription 
Step 4 − Run the following command to upload your file.
Set-AzureStorageBlobContent -Blob Montiorlog.png -Container images `
   -File "E:\MyPictures\MonitorLog.png" -Context $ctx -Force
[Image: Upload a Blob using PowerShell]
Step 5 − To check if the file is uploaded, run the following command.
$ContainerName = "images" 
Get-AzureStorageBlob -Container $ContainerName -Context $ctx | Select Name 

Download a Blob

Step 1 − Set the directory where you want to download the file.
$localTargetDirectory = "C:\Users\Sahil\Downloads"
Step 2 − Download it.
$BlobName = "Montiorlog.png" 
Get-AzureStorageBlobContent -Blob $BlobName -Container $ContainerName `
   -Destination $localTargetDirectory -Context $ctx
Remember the following −
  • All command names and file names are case sensitive.
  • Commands should be on one line, or continued on the next line by appending ` to the preceding line (` is the continuation character in PowerShell), as in the short example below.
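A small illustration of the continuation rule, using the ‘images’ container from above:
# The listing command written on one line...
Get-AzureStorageBlob -Container "images" -Context $ctx | Select Name

# ...and the same command continued onto a second line with the backtick.
Get-AzureStorageBlob -Container "images" `
   -Context $ctx | Select Name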

Manage Blobs using Azure Storage Explorer

Managing blobs is pretty simple using the ‘Azure Storage Explorer’ interface, as it is just like the Windows file and folder explorer. You can create a new container, upload blobs, see them in a listed format, and download them. Moreover, you can copy them to a secondary location in a very simple manner with this interface. The following image makes the process clear. As can be seen, once an account is added, we can select it from the dropdown and get going. It makes operating Azure storage very easy.
[Image: Manage Blobs in Azure Storage Explorer]

Microsoft Azure - Storage

The Storage component of Windows Azure represents a durable store in the cloud. Windows Azure allows developers to store tables, blobs, and message queues. The storage can be accessed through HTTP. You can also create your own client, although the Windows Azure SDK provides a client library for accessing Storage.
In this chapter, we will learn how to create a Windows Azure Storage account and use it for storing data.

Creating Azure Storage Account

Step 1 − When you login into your Azure account, you can find ‘Storage’ under ‘Data Services’.
[Image: Storage under Data Services]
Step 2 − Click on ‘Quick Create’ and it will ask for ‘Account Name’.
[Image: Quick Create of a Storage Account]
You can see there are four options in the ‘Replication’ dropdown. Copies of the data are kept so that it is durable and available at high speed, and it is retained even in case of hardware failure. Let’s see what these options mean −
  • Locally redundant storage − Copies of the data are created in the same region where the storage account is created. Three copies of the data are kept, each on a separate fault domain within that region.
  • Zone-redundant storage (available for blobs only) − Copies of the data are created on separate facilities, either in the same region or across two regions. The advantage is that even if there is a failure at one facility, the data is still retained. Three copies of the data are created. One more advantage is that data can be read from a secondary location.
  • Geo-redundant storage − Copies are created in a different region, which means the data is retained even if there is a failure in the complete region. The number of copies of data created is 6 in this case.
  • Read-access geo-redundant storage − This option allows reading the data from a secondary location when the data in the primary location is not available. The number of copies created is 6. The main advantage here is that the availability of data can be maximized.
There are different price plans for each replication option and ‘Locally Redundant’ is the cheapest of them all. So, choosing the replication of data depends on the cost and your individual requirements.
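If you prefer to create the storage account from PowerShell instead of the portal, a sketch using the classic (Service Management) Azure module could look like the following; the -Type value selects the replication option, and the account name and location are placeholders:
# 'Standard_LRS', 'Standard_ZRS', 'Standard_GRS' and 'Standard_RAGRS' map to the
# four replication options described above.
New-AzureStorageAccount -StorageAccountName "mynewstorageaccount" `
   -Location "East US" -Type "Standard_LRS"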

Storage Account Endpoints

Step 1 − Click on the storage account; it will take you to the next screen.
Step 2 − Click on ‘Dashboard’ from top horizontal menu.
[Image: Storage Account Endpoints]
Here you can see four items under services. You can create blobs, tables, queues and files in this storage account.
There will be a unique URL for each object. For example, here the account name is ‘tutorialspoint’, so the default URL for blobs is https://tutorialspoint.blob.core.windows.net. Similarly, replace ‘blob’ with ‘table’, ‘queue’ and ‘file’ in the URL to get the respective URLs. To access an object, its location is appended to the base URL. For example, http://tutorialspoint.blob.core.windows.net/container1/blob1
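A small sketch that builds these default endpoints for any account name:
$account = "tutorialspoint"
foreach ($service in "blob", "table", "queue", "file") {
   "https://$account.$service.core.windows.net"
}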

Generating an Access Key

An access key is used to authenticate access to the storage account. Two access keys are provided so that access to the account is not interrupted when one of the keys has to be regenerated.
To get the Access Keys, click on ‘Manage Access Keys’ in your storage account. The following screen will come up.
[Image: Manage Access Keys]
Regenerating the key at regular intervals is advised for security reasons.
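With the classic Azure PowerShell module, the keys can also be read and regenerated from the command line; a sketch, with the account name as a placeholder:
# Read the current keys.
$keys = Get-AzureStorageKey -StorageAccountName "tutorialspoint"
$keys.Primary
$keys.Secondary

# Regenerate the secondary key while clients keep using the primary one.
New-AzureStorageKey -StorageAccountName "tutorialspoint" -KeyType "Secondary"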

Managing Data in Azure Storage

How can you upload or download data to Azure storage? There are many ways to do it, but it cannot be done within the Azure portal itself. You will have to either create your own application or use an already built tool.
There are many tools available for accessing the data in an explorer; they can be reached by clicking on ‘Storage Explorer’ under ‘Get the Tools’ in your Azure storage account. Alternatively, an application can also be built using the Software Development Kit (SDK) available in the Windows Azure portal. Using PowerShell commands is also an option for uploading data. PowerShell is a command-line application that facilitates administering and managing Azure storage. Preset commands are used for different tasks to manage the storage.
You can install PowerShell by going to ‘Downloads’ on the following screen in your account. You will find it under Command-Line tools.
[Image: Downloads - Command-Line Tools]
There are specific commands for each task. You can manage your storage account, create a new account, and create a container. Additionally, blobs, tables, and queue messages can also be managed using PowerShell.
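A few of these preset commands, run against a context built from your account name and key (the values here are placeholders):
$accountKey = "<your storage account key>"
$ctx = New-AzureStorageContext -StorageAccountName "myaccount" -StorageAccountKey $accountKey

Get-AzureStorageContainer -Context $ctx   # list blob containers
Get-AzureStorageTable -Context $ctx       # list tables
Get-AzureStorageQueue -Context $ctx       # list queues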

Microsoft Azure - Fabric Controller

Fabric Controller is a significant part of the Windows Azure architecture. When thinking of the components or services provided by Windows Azure, we wonder how all this works and what is happening in the cloud. It seems very complex from our end. Let us look into the physical architecture of these services to get a better understanding of the Fabric Controller.
[Image: Fabric Controller]
Inside a datacenter, there are many machines or servers aggregated by a switch. We can say that the fabric controller is the brain of the Azure service: it analyses processes and makes decisions. A fabric is a group of machines in Microsoft’s datacenter that are aggregated by a switch; such a group of machines is called a cluster. Each cluster is managed and owned by a fabric controller, and fabric controllers are replicated along with these machines. A fabric controller manages everything inside those machines, for example load balancers and switches. Each machine has a fabric agent running inside it, and the fabric controller can communicate with each fabric agent.
When selecting a virtual machine offered by Windows Azure services, there are five options to choose from. The configuration is as follows −
Size          Memory    CPU                   Instance Storage
Extra Small   768 MB    Single core 1.0 GHz   20 GB
Small         1.75 GB   Single core 1.6 GHz   225 GB
Medium        3.5 GB    Dual core 1.6 GHz     490 GB
Large         7 GB      Four cores 1.6 GHz    1,000 GB
Extra Large   14 GB     Eight cores 1.6 GHz   2,040 GB
When a user chooses one of these virtual machines, the operating system, patch updates and software updates are handled by the fabric controller. It also decides where a new application should run, which is one of the most important functions of the Fabric Controller, and selects the physical server so as to optimize hardware utilization.
When a new application is published in Azure, an application configuration file written in XML is also attached. The fabric controller reads these files in the Microsoft datacenter and applies the settings accordingly.
In addition to managing the allocation of resources to a specific application, the fabric controller also monitors the health of the compute and storage services and performs failure recovery for the system.
Imagine a situation where four instances of a web role are running and one of them dies. The fabric controller will immediately initiate a new instance to replace the dead one. Similarly, in case any virtual machine fails, a new one is assigned by the fabric controller. It also resets the load balancers after assigning the new machine, so that they point to the new machine instantaneously. Thus, all the intelligent tasks in the Windows Azure architecture are performed by the Fabric Controller.