Create and Manage Amazon DynamoDB Database
Learn to Create and Manage an Amazon DynamoDB Database
Tutorial Objectives:
1. Learn to create a DynamoDB table and store and retrieve data.
2. Learn to use the AWS SDK for Python (Boto3) to interact with DynamoDB APIs.
Prerequisite:
1. Download the DynamoDB zip from here.
Step 1: AWS Cloud9 is currently available only in certain Regions, so we have to switch to one of the supported Regions.
In the AWS Console, open the Region drop-down in the top navigation bar and select US East (N. Virginia) us-east-1.
Step 2: Go to Cloud9 Service. Click on Create Environment.
Provide the following configuration:
Environment Name: DynamoDB Deep Dive
Description: Cloud9 IDE for CloudPlusPlus Tutorial
Go to Next Step. Confirm the following default selections:
Environment type: Create a new EC2 instance for the environment (direct access)
Instance type: t2.micro (1 GiB RAM + 1 vCPU)
Platform: Amazon Linux 2 (recommended)
Proceed to the Next Step. Review and click on Create environment.
The Cloud9 IDE will be ready in a few minutes, and a window like the one below will be visible.
Step 3: Upload the Resources.
To upload the folder, click File -> Upload Local Files -> Select Folder.
Select the DynamoDB folder downloaded in Step 1 and upload it.
After uploading, you can see the DynamoDB folder in the file explorer section, as shown below.
In the bottom part of the screen, a terminal window is visible. If it is not, go to the Window option in the menu bar of the Cloud9 IDE and click New Terminal.
Step 4: To install Boto3, run the following command in the AWS Cloud9 terminal.
pip install boto3
The AWS SDK for Python (Boto3) provides a Python API for AWS infrastructure services. Using the SDK for Python, you can build applications on top of Amazon S3, Amazon EC2, Amazon DynamoDB, and more.
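For example, opening a connection to DynamoDB takes only a few lines. A minimal sketch, assuming the us-east-1 Region selected in Step 1:

import boto3

# Create a high-level DynamoDB resource in the Region selected in Step 1.
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# List the tables in the account to confirm that the connection works.
for table in dynamodb.tables.all():
    print(table.name)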
Step 5: Inserting and retrieving data with DynamoDB.
1. Create the DynamoDB table. The environment folder you downloaded includes a create_table.py script that creates a Books table by using the CreateTable API. You can run this script by entering the following commands in the AWS Cloud9 terminal.
cd DynamoDB
python create_table.py
You can go to the DynamoDB service in the console to confirm that the table has been created.
If you open the create_table.py script with the AWS Cloud9 editor, you should notice:
The script specifies the composite primary key of your table with the KeySchema argument in the CreateTable API call. Your table uses Author as the hash key and Title as the range key.
The script specifies the provisioned throughput for your table by defining both read capacity units and write capacity units. DynamoDB lets you set read and write capacity separately, allowing you to fine-tune your configuration to meet your application’s needs without paying for costly overprovisioning.
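The CreateTable call inside the script plausibly looks like the sketch below. This is not the exact script from the download; the key schema follows the description above, and the capacity values are illustrative.

import boto3

client = boto3.client("dynamodb", region_name="us-east-1")

# Books table with a composite primary key: Author (hash) + Title (range).
client.create_table(
    TableName="Books",
    KeySchema=[
        {"AttributeName": "Author", "KeyType": "HASH"},   # partition (hash) key
        {"AttributeName": "Title", "KeyType": "RANGE"},   # sort (range) key
    ],
    AttributeDefinitions=[
        {"AttributeName": "Author", "AttributeType": "S"},
        {"AttributeName": "Title", "AttributeType": "S"},
    ],
    # Read and write capacity are provisioned separately (values are illustrative).
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)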
2. Load items into the table by running the following command.
python insert_items.py
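The insert script presumably issues a series of PutItem calls; a minimal sketch of one such call (the attribute values are illustrative, modeled on the books referenced later in this tutorial):

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("Books")

# Every item must include the full composite key: Author and Title.
table.put_item(Item={
    "Author": "John Grisham",
    "Title": "The Firm",
    "Category": "Suspense",
    # Formats maps each available format to a (made-up) item identifier.
    "Formats": {"Hardcover": "H-0001", "Paperback": "P-0001"},
})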
3. Retrieve items from the table with the following command.
python get_item.py
Your terminal should print out the full book data retrieved from the table, as shown below.
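Because the table uses a composite primary key, GetItem must be given both key attributes. The retrieval plausibly looks like this (a sketch, not the exact script):

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("Books")

# Both parts of the composite key are required to fetch a single item.
response = table.get_item(Key={"Author": "John Grisham", "Title": "The Rainmaker"})
print(response.get("Item"))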
Step 6: Querying and Global Secondary Indexes.
When your table uses a composite primary key, you can retrieve all items with the same hash key by using the Query API call. For your application, this means you can retrieve all books with the same Author attribute.
In the AWS Cloud9 terminal, run the following command.
python query_items.py
This command runs a script that retrieves all books written by John Grisham.
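At its core, such a script is a Query call conditioned only on the Author hash key. A sketch with Boto3 (not the exact script from the download):

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("Books")

# Query returns every item that shares the given hash key (Author).
response = table.query(KeyConditionExpression=Key("Author").eq("John Grisham"))
for item in response["Items"]:
    print(item["Title"])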
After you run the script, you should see two John Grisham books, The Firm and The Rainmaker.
Retrieving multiple items with a single call in DynamoDB is a common pattern and easy to do with the Query API call.
Step 7: Creating a Secondary index.
DynamoDB has two kinds of secondary indexes: global secondary indexes and local secondary indexes. In this section, you add a global secondary index to your Category attribute that will allow you to retrieve all books in a particular category.
The add_secondary_index.py script adds a global secondary index to the existing table.
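Adding an index to an existing table is done with the UpdateTable API; a sketch of what the call plausibly looks like (the index and attribute names match this step, while the capacity values are illustrative):

import boto3

client = boto3.client("dynamodb", region_name="us-east-1")

# Add CategoryIndex so the table can be queried by Category.
client.update_table(
    TableName="Books",
    AttributeDefinitions=[{"AttributeName": "Category", "AttributeType": "S"}],
    GlobalSecondaryIndexUpdates=[{
        "Create": {
            "IndexName": "CategoryIndex",
            "KeySchema": [{"AttributeName": "Category", "KeyType": "HASH"}],
            "Projection": {"ProjectionType": "ALL"},
            # The index gets its own provisioned throughput, separate from the table's.
            "ProvisionedThroughput": {"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
        }
    }],
)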
Creating a global secondary index has a lot in common with creating a table. You specify a name for the index, the attributes that will be in the index, the key schema of the index, and the provisioned throughput (the maximum capacity an application can consume from a table or index). Provisioned throughput on each index is separate from the provisioned throughput on a table. This allows you to define throughput granularly to meet your application’s needs.
Run the following command in your terminal to add your global secondary index.
python add_secondary_index.py
This script adds a global secondary index called CategoryIndex to your Books table.
Step 8: Querying the Secondary index
Use the query_with_index.py script to query against the new index. Run the script in your terminal with the following command.
python query_with_index.py
This command runs the script to retrieve all books in the store that have the Category of Suspense.
Note that there is a portion of the script that waits until the index is available for querying.
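A sketch of both pieces: poll DescribeTable until the index reports ACTIVE, then pass IndexName to the Query call (illustrative, not the exact script):

import time
import boto3
from boto3.dynamodb.conditions import Key

client = boto3.client("dynamodb", region_name="us-east-1")
dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# Wait until CategoryIndex reaches ACTIVE status before querying it.
while True:
    indexes = client.describe_table(TableName="Books")["Table"].get("GlobalSecondaryIndexes", [])
    if any(i["IndexName"] == "CategoryIndex" and i["IndexStatus"] == "ACTIVE" for i in indexes):
        break
    time.sleep(5)

# Query the index instead of the base table by specifying IndexName.
table = dynamodb.Table("Books")
response = table.query(
    IndexName="CategoryIndex",
    KeyConditionExpression=Key("Category").eq("Suspense"),
)
for item in response["Items"]:
    print(item["Author"], "-", item["Title"])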
You should see the following output in your terminal.
The query returns three books by two different authors.
Updating an Item
python update_item.py
As you can see, The Rainmaker has a new Audiobook format after the update has been applied.
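For reference, the update itself boils down to a single UpdateItem call; a sketch of how the new format might be added (the attribute path follows the data model above, and the format identifier is a made-up placeholder):

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("Books")

# Add an Audiobook entry to the Formats map of one specific book.
table.update_item(
    Key={"Author": "John Grisham", "Title": "The Rainmaker"},
    UpdateExpression="SET #formats.#new = :id",
    ExpressionAttributeNames={"#formats": "Formats", "#new": "Audiobook"},
    ExpressionAttributeValues={":id": "A-0001"},  # placeholder identifier
)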
Amazon DynamoDB NoSQL database
Updated: Jul 23, 2021
Learn to create an Amazon DynamoDB NoSQL database table and populate it using an Amazon EC2 instance.
Objectives:
1. Learn to create an Amazon DynamoDB NoSQL database table.
2. Populate it using an Amazon EC2 instance.
Step 1: Go to the DynamoDB service in the AWS console. Click on Create Table.
Table Name: ProductCatalog
Primary Key: Id (Type: Number)
Keep the rest as default and click on Create.
Step 2: Now we create a role that gives EC2 access to DynamoDB.
Go to IAM service from AWS console. Click on Roles. Click Create role.
Choose the EC2 use case. Proceed to Next: Permissions.
Search and select AmazonDynamoDBFullAccess policy.
Continue through Tags to the Review step and configure as follows:
Role name: EC2RoleForDynamoDB
Role Description: Allows EC2 instances to call AWS DynamoDB service on your behalf.
Click on Create role at the bottom of the screen.
Step 3: Launch an EC2 Linux instance. For step-by-step instructions to launch an EC2 instance, visit our blog here.
After the instance is in the running state, go to Actions -> Security -> Modify IAM Role.
Select the EC2RoleForDynamoDB role created in the previous step and click Save.
Step 4: Connect to the instance using SSH or through the AWS Management Console.
Run the following command to update the Linux instance.
sudo yum update
It will ask for confirmation. Enter y and complete the update.
Next, use the following command to download the sample data.
wget https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/samples/sampledata.zip
Run the ls command to confirm that the file has been downloaded.
Unzip this data file using the next command:
unzip sampledata.zip
Now we load this data into the table using the next command. Change the region value (ap-south-1) to the Region you are operating in:
aws dynamodb batch-write-item --request-items file://ProductCatalog.json --region ap-south-1
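If you would rather stay in Python, the same load can be done with Boto3's batch_write_item, since the sample file is already in DynamoDB's RequestItems format. A sketch, assuming Boto3 is installed on the instance and the file holds no more than 25 put requests per table:

import json
import boto3

# The sample file is already structured as {"ProductCatalog": [{"PutRequest": ...}, ...]}.
with open("ProductCatalog.json") as f:
    request_items = json.load(f)

# Use the same Region as your table (ap-south-1 in the CLI example above).
client = boto3.client("dynamodb", region_name="ap-south-1")
response = client.batch_write_item(RequestItems=request_items)
print(response.get("UnprocessedItems", {}))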
Step 5: Now go back to the DynamoDB console; when you refresh the table, you will see that it has been populated.
Let's try to run a query in DynamoDB. In the Scan drop-down above the table, select Query.
Provide a Number value for Id and run the query.
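The same lookup can also be done programmatically from the EC2 instance; a sketch with Boto3 (the Id value here is just an example and must exist in the loaded sample data):

import boto3

dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")
table = dynamodb.Table("ProductCatalog")

# Look up a single product by its numeric primary key.
response = table.get_item(Key={"Id": 101})
print(response.get("Item"))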