Monday, 5 August 2024

AWS Cloud Job Oriented Program: Step-by-Step Hands-on Labs & Projects

This walkthrough of the step-by-step activity guides in the AWS Job Oriented training program will prepare you thoroughly for AWS certifications and jobs.


Here is the list of labs included in our AWS Job Oriented Program.

Lab 01: Create an AWS Free Trial Account

Embark on your AWS journey by setting up a free trial account. This hands-on lab guides you through the initial steps of creating an AWS account, giving you access to a plethora of cloud services to experiment and build with.

Amazon Web Services (AWS) provides a free tier account for 12 months so that new subscribers can get hands-on experience with the services AWS offers. A number of different services can be used within certain limits, both to practice and build knowledge of AWS Cloud services and for regular business use.

With an AWS Free Tier account, every service offered has a usage limit below which you are not charged. Here, we will look at how to register for an AWS Free Tier account.

To learn how to create a free AWS account, check our step-by-step blog: How To Create AWS Free Tier Account.


Lab 02: CloudWatch – Create Billing Alarm & Service Limits

Dive into CloudWatch, AWS’s monitoring service. This lab focuses on setting up billing alarms to manage costs effectively and keeping an eye on service limits to ensure your applications run smoothly within defined boundaries.

AWS billing notifications can be enabled using Amazon CloudWatch, the Amazon Web Services monitoring service that tracks activity across your AWS account. In addition to billing notifications, CloudWatch provides infrastructure for monitoring applications, collecting logs, metrics, and other service metadata, and tracking activity in your AWS account usage.

AWS CloudWatch offers a number of metrics on which you can set alarms. For example, you might set an alarm to warn you when a running instance's CPU or memory utilization exceeds 90%, or when the estimated bill exceeds $100. An AWS Free Tier account includes 10 alarms and 1,000 email notifications each month.
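For reference, a billing alarm like the one described above can be sketched with the AWS CLI; the SNS topic ARN below is a placeholder, and billing alerts must first be enabled in the account's Billing preferences:

# Billing metrics are published only in us-east-1.
aws cloudwatch put-metric-alarm \
  --region us-east-1 \
  --alarm-name billing-over-100-usd \
  --namespace AWS/Billing \
  --metric-name EstimatedCharges \
  --dimensions Name=Currency,Value=USD \
  --statistic Maximum \
  --period 21600 \
  --evaluation-periods 1 \
  --threshold 100 \
  --comparison-operator GreaterThanThreshold \
  --alarm-actions arn:aws:sns:us-east-1:111122223333:billing-alerts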

Lab 03: Create And Connect To Windows EC2 Machine

Get practical experience with Elastic Compute Cloud (EC2) by launching a Windows instance. Learn the nuances of instance creation and connect seamlessly to Windows-based virtual machines.

Lab 04: Create And Connect To Linux EC2 Machine

Extend your EC2 skills by spinning up a Linux instance. Discover the nuances of the Linux environment within AWS, from creation to connecting and managing instances efficiently.

Lab 05: Working with AWS IAM

Unlock the power of Identity and Access Management (IAM) by navigating through user and group management. Gain insights into the foundation of AWS security, controlling who can access your resources and what actions they can perform.

Lab 06: Enable Multi-Factor Authentication

Elevate your security posture by implementing Multi-Factor Authentication (MFA) in IAM. This lab walks you through the process of adding an extra layer of authentication for enhanced account protection.

Lab 07: IAM Power User

Become an IAM power user by delving into advanced features. Learn to create and manage policies, roles, and permissions, gaining mastery over nuanced access control scenarios.

Lab 08: Create S3 Bucket, Upload and Access Files, and Host a Website

Explore the versatile Amazon Simple Storage Service (S3) by creating buckets, uploading files, and even hosting a static website. This lab provides hands-on experience in leveraging S3 for scalable and secure object storage.

Lab 09: Create and mount Elastic File System (EFS) on EC2 Instances

Dive into Elastic File System (EFS) to create a scalable and shared file storage solution. Learn to mount EFS on EC2 instances, facilitating collaboration and data access across multiple compute resources.

Lab 10: Create and Manage EBS Volumes & Snapshots

Master Elastic Block Store (EBS), a fundamental component for EC2 instances. This lab guides you through the creation and management of EBS volumes and snapshots, ensuring data durability and flexibility.

Lab 11: Host Website On Windows EC2 Instance Using IIS

Take a deeper dive into EC2 by deploying a website on a Windows instance using Internet Information Services (IIS). Gain insights into web hosting on AWS and configuring Windows-based servers.


Lab 12: Configure Apache Webserver on AWS Linux EC2 Machine

Extend your web hosting knowledge by configuring an Apache web server on a Linux-based EC2 instance. Understand the intricacies of hosting applications on a Linux environment within AWS.

Lab 13: Network Load Balancer

Delve into the world of load balancing with Network Load Balancer (NLB). This lab provides hands-on experience in setting up and optimizing network-based load balancing for high availability.

Lab 14: Configure a Load Balancer And Autoscaling on EC2 Instances

Optimize your infrastructure's performance and availability by configuring load balancers and implementing auto-scaling on EC2 instances. This lab equips you with essential skills for managing dynamic workloads.

Lab 15: Create Launch Template For Auto Scaling Group

Streamline the process of launching and scaling EC2 instances with launch templates. This lab guides you through creating templates for efficient and consistent auto-scaling group deployments.

Lab 16: Create a Custom Virtual Private Cloud

Build a tailored Virtual Private Cloud (VPC) to isolate and organize your AWS resources. This lab delves into creating custom VPCs, defining subnets, and configuring routing tables for optimal network architecture.

Lab 17: Work with VPC Peering Connection

Connect multiple VPCs seamlessly with VPC peering connections. This lab provides hands-on experience in establishing and configuring peering connections for secure and efficient communication.

Lab 18: AWS Elastic IP

Explore the benefits of Elastic IP addresses in AWS. This lab guides you through the process of allocating and associating Elastic IPs to EC2 instances, providing static public IP addresses for dynamic cloud environments.

Lab 19: Establish a Client Side VPN

Enhance your network security by establishing a client-side VPN connection to AWS. This lab walks you through the setup, configuration, and optimization of VPN connections for secure data transmission.


Lab 20: Get Started with AWS X-Ray

Step into the world of AWS X-Ray, a powerful tool for analyzing and debugging distributed applications. This lab provides hands-on experience in instrumenting applications and gaining insights into performance bottlenecks.

Lab 21: Configure Amazon CloudWatch to Notify on Changes in EC2 CPU Utilization

Master Amazon CloudWatch for proactive monitoring of your EC2 instances. This lab focuses on setting up notifications based on changes in CPU utilization, ensuring timely responses to performance fluctuations.

Lab 22: Enable CloudTrail and Store Logs In S3

Enhance your AWS security by enabling AWS CloudTrail. This lab guides you through the process of setting up CloudTrail and storing logs in Amazon S3 for comprehensive audit trails and compliance.

Lab 23: Setting Up AWS Config to Assess, Audit & Evaluate AWS Resources

Gain control and visibility into your AWS resource configurations with AWS Config. This lab walks you through the setup and utilization of AWS Config for continuous assessment, auditing, and evaluation of your cloud environment.

Lab 24: Create & Query with Amazon DynamoDB

Dive into Amazon DynamoDB, a fully managed NoSQL database service. This lab provides hands-on experience in creating tables and executing queries, exploring the scalability and flexibility of DynamoDB.

Lab 25: Configure a MySQL DB Instance via Relational Database Service (RDS)

Navigate through Amazon Relational Database Service (RDS) to configure a MySQL database instance. This lab equips you with the skills to set up, manage, and optimize a relational database in the AWS cloud.

Lab 26: Create a Redis Cache and Connect It to an EC2 Instance

Explore Amazon ElastiCache and its Redis caching capabilities. This lab guides you through the creation of a Redis cache cluster and connecting it to an EC2 instance, enhancing data retrieval performance.

Lab 27: Amazon Athena

Unlock the power of serverless querying with Amazon Athena. This lab introduces you to Athena’s capabilities, allowing you to run ad-hoc SQL queries on data stored in Amazon S3.

Lab 28: Introduction to AWS Glue

Delve into AWS Glue for data integration and transformation. This lab provides hands-on experience in discovering, cataloging, and transforming data, laying the foundation for efficient data processing.

Lab 29: Visualize Web Traffic Using Kinesis Data Streams

Enter the realm of real-time data streaming with Amazon Kinesis Data Streams. This lab guides you through the process of visualizing web traffic patterns, showcasing the capabilities of Kinesis for data analytics.

Lab 30: Send An E-mail Through AWS SES

Master Amazon Simple Email Service (SES) to send emails securely and reliably. This lab explores SES’s features, from setting up email sending to configuring templates and handling bounces and complaints for robust email communication.

Lab 31: Event-Driven Architectures Using AWS Lambda, SES, SNS and SQS

Build dynamic and scalable event-driven architectures with AWS Lambda, Simple Email Service (SES), Simple Notification Service (SNS), and Simple Queue Service (SQS). This lab guides you through creating a seamless communication flow within your AWS environment.

Lab 32: Build API Gateway with Lambda Integration

Create a robust API Gateway and integrate it with AWS Lambda for efficient and scalable API management. This lab equips you with the skills to build and manage APIs seamlessly.

Lab 33: Create and Update Stacks using CloudFormation

Enter the realm of Infrastructure as Code (IaC) with AWS CloudFormation. Learn to create and update stacks to provision and manage AWS resources in a repeatable and automated fashion.

Lab 34: Create S3 Bucket Using CloudFormation

Automate S3 bucket creation with AWS CloudFormation, streamlining your infrastructure management and ensuring consistency across deployments.
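To give a flavour of what this lab automates, here is a minimal sketch with placeholder bucket and stack names: a tiny CloudFormation template that declares an S3 bucket, deployed with the AWS CLI.

cat > s3-bucket.yaml <<'EOF'
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  DemoBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-cfn-demo-bucket-12345   # bucket names must be globally unique
EOF

aws cloudformation deploy --template-file s3-bucket.yaml --stack-name s3-demo-stack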

Lab 35: Create & Configure EC2 with Helper-Scripts

Leverage helper scripts to automate the creation and configuration of EC2 instances. This lab provides hands-on experience in scripting for efficient and reproducible infrastructure deployment.

 

Lab 36: Deploy An Application In Beanstalk Using Docker

Explore AWS Elastic Beanstalk for simplified application deployment. This lab focuses on deploying applications using Docker containers, providing scalability and ease of management.

Lab 37: Immutable Deployment on Beanstalk Environment

Implement immutable deployments on Elastic Beanstalk environments for enhanced reliability and consistency. This lab guides you through deploying applications with minimal downtime and risk.

Lab 38: Blue-Green Deployments Using Elastic Beanstalk

Optimize your deployment strategies with blue-green deployments on Elastic Beanstalk. This lab explores techniques for minimizing downtime and risk during application updates.

Lab 39: AWS Serverless Application Model

Enter the world of serverless architecture with the AWS Serverless Application Model (AWS SAM). This lab provides a hands-on introduction to building and deploying serverless applications efficiently.

Lab 40: Create & Use AWS KMS Keys

Explore AWS Key Management Service (KMS) for creating and using cryptographic keys to enhance data security. This lab guides you through the process of securing sensitive information with KMS.

Lab 41: Block Web Traffic with WAF in AWS

Fortify your web applications with AWS Web Application Firewall (WAF). This lab demonstrates how to block malicious web traffic and protect your applications from common security threats.

Lab 42: Amazon Inspector

Enhance your security posture with Amazon Inspector, an automated security assessment service. This lab guides you through setting up and using Inspector to identify vulnerabilities in your AWS resources.

Lab 43: Working with AWS CodeCommit

Collaborate effectively on software development projects with AWS CodeCommit. This lab provides hands-on experience in version control, code collaboration, and repository management.

Lab 44: Build Application with AWS CodeBuild

Automate your build processes with AWS CodeBuild. This lab guides you through setting up build projects, compiling code, and generating artifacts for deployment.

Lab 45: Deploy Sample Application using AWS CodeDeploy

Streamline application deployment with AWS CodeDeploy. This lab demonstrates how to automate the deployment process, ensuring consistent and reliable releases.

Lab 46: Create a Simple Pipeline (CodePipeline)

Build end-to-end continuous integration and continuous deployment (CI/CD) pipelines with AWS CodePipeline. This lab provides hands-on experience in orchestrating and automating your software delivery process.

Lab 47: Build Application with AWS CodeStar

Accelerate your development workflow with AWS CodeStar. This lab introduces you to the benefits of CodeStar for simplifying project setup, development, and deployment.

Lab 48: Create ECR, Install Docker, Create Image, and Push Image To ECR

Dive into containerization with Amazon Elastic Container Registry (ECR). This lab covers the entire container lifecycle, from creating a repository to pushing Docker images to ECR for seamless container management.
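The commands in this lab follow a standard pattern, roughly like the sketch below (account ID, region, and repository name are placeholders):

aws ecr create-repository --repository-name myapp

# Authenticate Docker against the registry (placeholder account ID and region).
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-east-1.amazonaws.com

# Build, tag, and push the image.
docker build -t myapp .
docker tag myapp:latest 111122223333.dkr.ecr.us-east-1.amazonaws.com/myapp:latest
docker push 111122223333.dkr.ecr.us-east-1.amazonaws.com/myapp:latest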

Lab 49: Create Task Definitions, Schedule Tasks, and Configure Services and Clusters Using EC2 Launch Types

Explore Amazon ECS (Elastic Container Service) by creating task definitions, scheduling tasks, and configuring services and clusters. This lab provides hands-on experience in managing containerized applications using different EC2 launch types.

Lab 50: Create Elastic Kubernetes Service (EKS) Cluster on AWS

Delve into Kubernetes on AWS by creating an Elastic Kubernetes Service (EKS) cluster. This lab guides you through provisioning and managing Kubernetes clusters for container orchestration.

Lab 51: Application Migration to AWS

Learn the intricacies of migrating applications to AWS. This lab covers assessment, planning, and execution strategies to ensure a seamless transition to the cloud.

Lab 52: Database Migration To AWS

Explore strategies and tools for migrating databases to AWS. This lab provides practical guidance on minimizing downtime and ensuring data consistency during database migration.

Lab 53: AWS Data Transfer Acceleration

Optimize your data transfer speed with AWS Data Transfer Acceleration. This lab introduces you to acceleration techniques for efficient data movement within the AWS infrastructure.

Lab 54: Migrating a Monolithic Application to AWS with Docker

Experience the process of migrating a monolithic application to AWS using Docker containers. This lab covers containerization strategies and best practices for achieving scalability and flexibility in a cloud environment.


Real-time Projects

Project 1: 3-Tier Architecture Deployment (Web, App, Database)

Create a well-architected 3-tier system. The web layer serves as the user interface, the app layer processes business logic, and the database layer stores and manages data. This project ensures a scalable and modular infrastructure, fostering efficient resource utilization.

Project 2: DevOps: End-to-End CI/CD Pipeline

Establish an end-to-end CI/CD pipeline for your development workflow. This includes automating code builds, running tests, and deploying applications seamlessly. The goal is to enhance collaboration among developers, reduce manual errors, and accelerate the release of high-quality software.


Project 3: Blue/Green Deployment with ECS (Serverless)

Implement a blue/green deployment strategy using AWS Elastic Container Service (ECS), a serverless container orchestration service. This project enables you to deploy new versions of your application without downtime, allowing for instant rollback in case of issues.

Project 4: Migration: On-Premise to Cloud (App, Database, Data)

Execute a comprehensive migration from on-premise infrastructure to the AWS cloud. This involves moving applications, databases, and data to the cloud, optimizing for scalability, cost efficiency, and improved reliability.

Project 5: Migration: Monolithic App to Microservices

Transform a monolithic application into a microservices architecture on AWS. Break down the application into smaller, independent services, each with its own functionality. This project aims to enhance agility, scalability, and ease of maintenance.

Project 6: Security: SSL/TLS & Key/Certificate Management System

Implement a robust security system, including SSL/TLS protocols for encrypted communication and a key management system for secure key storage and access. This project ensures the confidentiality and integrity of data in transit.

Project 7: Host Static Website on AWS using S3 & Route53

Host a static website using Amazon S3 for storage and Amazon Route 53 for domain management. This project includes setting up S3 buckets, configuring static website hosting, and managing domain routing, providing a reliable and scalable solution for static content.
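As a rough sketch of the S3 side of this project (bucket and path names are placeholders), the CLI steps look roughly like this; the Route 53 alias record pointing at the bucket's website endpoint is added afterwards:

# The bucket name usually matches the domain you will serve.
aws s3api create-bucket --bucket example.com --region us-east-1

# Enable static website hosting and upload the site content.
aws s3 website s3://example.com/ --index-document index.html --error-document error.html
aws s3 sync ./site s3://example.com/

# A bucket policy allowing public s3:GetObject is still needed before the site is readable.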

Project 8: Host Dynamic Website on AWS: Apache, MariaDB, PHP

Deploy a dynamic website using a stack comprising an Apache web server, MariaDB database, and PHP scripting. This project involves configuring each component, ensuring seamless communication, and establishing a scalable environment for dynamic web applications.

Project 9: Deploy API Gateway, Application & Database

Build and deploy a comprehensive solution that includes an API Gateway, application layer, and database. This project focuses on designing RESTful APIs, connecting them to backend services, and managing data storage effectively.

Project 10: Deploy React App using Amplify, Lambda & AppConfig

Develop and deploy a React.js application using AWS Amplify for hosting, AWS Lambda for serverless functions, and AppConfig for configuration management. This project showcases modern front-end development practices, serverless architecture, and efficient application configuration in the AWS cloud.


Browse through almost any website and one thing you will find in common is the search option. Whether you are a music lover browsing your favorite tracks, a food blogger checking out new food trends in the city you are visiting, a fan looking for your favorite artist's next gig, or just window shopping on your favorite shopping app, search is an essential part of any significant data-driven website. Without search, or without a suitable search mechanism, your data is virtually out of your users' reach.

To put it simply, your search implementation is one of the essential parts of your web application: it drives users to the content they are looking for in the fastest and most efficient way. That raises the question of selecting the right search tool for your application. There are multiple search-as-a-service tools to choose from.

  • Google Cloud Search
  • Azure Cognitive Search
  • SharePoint Online / Office 365 Search
  • Amazon CloudSearch
  • Amazon ElasticSearch

Amazon

Elasticsearch and CloudSearch are Amazon's cloud-based solutions for search. Elasticsearch is built on an open-source engine, whereas CloudSearch is a fully managed search service that is quite simple to set up, easy to use, and cost-effective. Amazon CloudSearch is a managed AWS service that helps users create fast, scalable, cost-effective, and easy-to-set-up search solutions for their applications. It uses Apache Solr as the underlying text search engine, which supports full-text search, faceted search, real-time indexing, dynamic clustering, database integration, NoSQL features, and rich document handling.

You will be able to interact with Amazon CloudSearch through three service layers mentioned below:

  • Document service
  • Configuration Service
  • Search service

Your Identity and Access Management (IAM) policies control who can access, manage, and configure your Amazon CloudSearch services. The Configuration service and the Document service are for developers to set up and maintain the CloudSearch domain and its data. The Search service, by contrast, is client facing: developers and QA can use it to verify the configuration and indexes and to validate the data, and it is also exposed to real-world users, serving every search request coming from the web application or mobile app in which the search domain is configured. The Search service is responsible for delivering fast, accurate, real-time results for all user queries.


Document service

The Document service is used to manage and configure the searchable data of a domain. Each domain has its own document endpoint. To upload your data, you format it as XML or JSON; each item that you want to be returnable in a search response is called a document. Every object in a search response carries its unique document id plus the fields you requested in the search request. The Document service also lets you add new data whenever you want; once the information is uploaded, it is reindexed automatically and becomes searchable within minutes.
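For illustration, a small document batch in the JSON format the Document service accepts might look like this (the field names here are hypothetical):

cat > batch.json <<'EOF'
[
  { "type": "add",
    "id": "app-001",
    "fields": { "app_name": "Example Notes", "category": "PRODUCTIVITY", "rating": 4.3 } },
  { "type": "delete", "id": "app-002" }
]
EOF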

Configuration Service

The Configuration service lets you create and manage search domains: defining indexes, setting scaling options, and deploying across multiple Availability Zones. A helpful step-by-step wizard guides you through all the configuration steps for your search domain. You start with a unique name for the domain, then configure the search index, and finally set the scaling options, Availability Zone count, and instance size.
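The same configuration steps can also be scripted; a minimal sketch with the AWS CLI, assuming a hypothetical domain named playstore-apps and a few example fields:

aws cloudsearch create-domain --domain-name playstore-apps

# Define a few index fields (the types must match the data).
aws cloudsearch define-index-field --domain-name playstore-apps --name app_name --type text
aws cloudsearch define-index-field --domain-name playstore-apps --name category --type literal
aws cloudsearch define-index-field --domain-name playstore-apps --name rating   --type double

# Re-index so the new field definitions take effect.
aws cloudsearch index-documents --domain-name playstore-apps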

Search service

Once all this is set up, you can test your domain through a unique HTTP endpoint and query parameters handled by the Search service. The Search service handles search and suggestion requests for the domain and exposes a single HTTP endpoint for querying, with results returned in either XML or JSON format. CloudSearch supports a rich query language that lets you build free-text searches, range searches, faceted filtering, and other options for composing compound queries.
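A search request is then just an HTTP GET against the domain's search endpoint, for example (the endpoint hostname and field names below are placeholders carried over from the earlier sketch; your real endpoint is shown on the domain dashboard):

curl -G "https://search-playstore-apps-abc123.us-east-1.cloudsearch.amazonaws.com/2013-01-01/search" \
  --data-urlencode "q=notes" \
  --data-urlencode "return=app_name,rating" \
  --data-urlencode "facet.category={sort:'count',size:10}" \
  --data-urlencode "size=5"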

The following are the main benefits of Amazon CloudSearch that compelled me to choose it over the other services:

Simple

If your tech stack is already hosted on AWS, then setting up CloudSearch is quite simple. You can set it up via the AWS Management Console, the AWS SDKs, or the AWS CLI. Adding data to your search is as simple as uploading a file (JSON, XML, or CSV) from the AWS Management Console: browse to your data and upload it. CloudSearch automatically does the grunt work, identifying the file type, analyzing the data, setting up indexes, and creating search, sort, and facet options; you only have to review the suggestions and save the changes.
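The same upload can also be done from the CLI, for example with the batch.json file sketched earlier and a placeholder document endpoint:

aws cloudsearchdomain upload-documents \
  --endpoint-url https://doc-playstore-apps-abc123.us-east-1.cloudsearch.amazonaws.com \
  --content-type application/json \
  --documents batch.json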

Fully Managed

Amazon CloudSearch is a fully managed custom search service: you pick the instance type, select Availability Zones, and handle provisioning, scaling, and partitioning using either the AWS CLI or the Management Console. Data uploading, reindexing, and configuring suggesters and facet options are all done from an intuitive, user-friendly console that helps you set up your domain and be ready to start testing it in minutes with simple wizard-based instructions.

Scalable

Scalability is one of the essential aspects of growth. A CloudSearch domain auto-scales as your data or query volume increases: AWS CloudSearch automatically scales the domain up or down based on its usage, so if the load increases it scales up to meet demand, and it scales back down when there is no longer a significant number of queries.

Reliable

Reliability matters most when you are working with the data that drives your application's search service. AWS CloudSearch has a Multi-AZ option, so your search stays available through hardware failures and latency is kept low. Search traffic is automatically distributed across the available zones and auto-scales with the load; if one Availability Zone fails, requests are routed to the next available zone and the data is fetched from there.

High Performance

Performance is one of the most important criteria when choosing a search engine. Faster data delivery is one of the main reasons many engineers migrate from a self-built search mechanism that queries the database directly to an externally hosted search service like AWS CloudSearch. Automatic indexing, horizontal and vertical scaling, and distributed data all give you an edge in delivering results with low latency and high performance.

Prepare your test data:

Now we'll show you how to create your domain on Amazon CloudSearch and set it up with your own data. As discussed earlier, you can upload the data for your search domain as a JSON, XML, or CSV file. If you have your own data you can upload that, or you can download one of the many pre-formatted datasets from Kaggle. For this post I have chosen the Google Play Store Apps dataset. It has around 10.8K records; for this example I truncated it down to about 5K lines, which you can do with a command like the following.

head -n 5000 oldfile > newfile

The truncated sample file is uploaded to https://github.com/akashs-cuelogic/CloudSearch. Feel free to use that.

Prerequisite:

An AWS account.

  1. Collect data: From the AWS home page, navigate to CloudSearch under the Analytics section. This gives you step-by-step instructions on how to create your own CloudSearch domain, upload data, set indexes, and start searching. Start by clicking Create a new search domain.
  2. Create a new search domain and set up its size: The name you choose for your CloudSearch domain matters because it also becomes part of your search API's URL. Below the name you'll see options for the instance size and the replication count; both should grow with the size of your data and the volume of requests. If you are working with an extensive dataset, it is advisable to use a larger instance type, and if you expect a large number of concurrent requests, increase the replication count accordingly.
  3. Upload index definitions: Indexing your data dramatically improves search performance, and CloudSearch does it automatically when you upload your data. To define the indexes you don't need the whole dataset; a few sample records are enough for CloudSearch to identify the data attributes and decide how to index them. There are multiple ways to provide sample data: upload it from your local machine or from an S3 bucket, point to DynamoDB data, or define the fields manually. If you are just trying things out, there are also sample datasets to choose from.
  4. Configure indexes: Index configuration is the essential part of setting up your domain. In most cases Amazon CloudSearch indexes your data automatically and the results reflect it within a few minutes. Any change to the domain configuration requires reindexing the data, which you can also run manually from a client or from the dashboard. Indexing options control how your data is mapped to index fields and what information you can search and retrieve from the index. The data you upload must contain the same fields configured in your domain's indexing options, and the field values must be compatible with the configured field types.
  5. Set up access policies: Amazon CloudSearch provides various ways to allow or restrict access to the service APIs (Search and Suggester) and the domain (Document) service API. There are several options for controlling access to your services.
    1. Search and Suggester service: Allow all. Document Service: Account owner only.
      This makes the search and suggester services accessible to everyone without restrictions, while the document service remains restricted to whoever maintains the domain and its data, who can upload new data, index it, and do the scaling and optimization required for CloudSearch to work efficiently.
    2. Allow open access to all services (not recommended because anyone can upload documents)
      This opens access to search as well as to data maintenance and other operations. It is not recommended, since it exposes all the data and is not a secure option for an application that handles sensitive data.
    3. Allow access to all services from specific IP(s)
      This is the same as above, except that search, suggestion, and domain requests are accepted only from particular IPs. It is a good option when your search domain serves an internal application that is not open to other users: you can whitelist those IPs and restrict everyone else.
    4. Deny access to all services (No one can access your endpoint)
      Search and document requests must either be submitted through the console or be authenticated with your account credentials. The document and search endpoints do not allow anonymous access or accept requests from other AWS users.
  6. Confirm domain info: The last step is to verify the domain information. The review page lists all the indexed fields, scaling options, and access policies; you can click the edit option next to each and make changes before confirming. If you are satisfied with the configuration, click Confirm. This takes you to the dashboard page, where you can see the status of the domain along with other information.
  7. Domain dashboard: The dashboard shows the following information:
    Searchable documents: the number of records available to search.
    Index Fields: the fields that are indexed and are searchable, returnable, or sortable.
    Search Endpoint: the API endpoint you query to search your data.
    Document Endpoint: the API endpoint you use to upload and manage the domain's documents.
    Domain ARN: the Amazon Resource Name (ARN) that uniquely identifies the domain; every resource in AWS has an ARN that identifies it unambiguously across all of AWS.
    Engine Type: the type of search engine (CloudSearch) with the API version (2013). A search engine makes it possible to search extensive collections of mostly textual items (called documents) and find the best-matching results quickly. Note: the search endpoint and document endpoint are generated automatically from the search domain name given in the first step.
  8. Upload search documents: As you can see above, the searchable documents count is still 0. In the initial steps we only configured the indexes and added options to make fields searchable, sortable, and so on; that is just the schema of the search documents. We now need a dataset with the same attributes so it can be made available to search requests. CloudSearch lets you upload the data in multiple ways, using the same options we saw in step 3 when uploading the sample document for defining indexes.
  9. Review the data: Before the records are uploaded to the service, you are first asked to verify the data fields. Once you confirm them and click Upload Documents, CloudSearch uploads all the records from the selected data source, indexes them, and makes them searchable.
    After all the data is uploaded, you can start testing it out.
  10. Test your data: Testing your data is quite easy in Amazon CloudSearch. Enter your query text in the Test Search input field and click GO; this searches across all the searchable fields you selected while indexing and returns the matching results. You can also query your Amazon CloudSearch domain endpoint directly with a simple q expression, which runs a default search across all searchable fields, as in the sketch after this list.
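For example, a quick test query from the command line could look something like this (the endpoint and field names are the placeholders from the earlier sketch; --return and --size simply trim the output):

aws cloudsearchdomain search \
  --endpoint-url https://search-playstore-apps-abc123.us-east-1.cloudsearch.amazonaws.com \
  --search-query "facebook" \
  --return app_name,rating \
  --size 5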

So these are the steps to set up your CloudSearch domain with your own dataset and start testing and tweaking indexes and data configurations. There are more advanced options, such as nested queries and query suggestions that you can build on top of your search filters, which we will try to cover in the next part. Amazon CloudSearch is a complete search solution that lets you scale, upload new data in near real time, and make it searchable in no time. With Amazon CloudSearch you can create a search domain, set search attributes, upload the data, and start testing it all very quickly, and its intuitive step-by-step wizard makes the setup easy.