CC Manual

This manual provides a comprehensive guide to various AWS services, including AWS CloudShell, the AWS Cloud9 IDE, Amazon S3, AWS Step Functions, DynamoDB, API Gateway, Lambda, Docker, ElastiCache, and CloudFront. Each experiment outlines objectives, step-by-step instructions, and examples for accessing and using these services effectively. The experiments are designed to help students understand cloud computing concepts and their practical applications in a structured manner.

Experiment 1

Exploring AWS CloudShell and the AWS Cloud9 IDE

Objective:
Understand and explore AWS CloudShell and the AWS Cloud9 IDE: how to access and use each, and how the two differ.
Part 1: AWS CloudShell
What is AWS CloudShell?
AWS CloudShell is a browser-based shell that provides a command-line interface to manage AWS
resources securely using the AWS CLI, without installing anything locally.
Step-by-Step: Accessing and Using AWS CloudShell
1.​ Log in to AWS Console:
○​ Visit: https://console.aws.amazon.com/
○​ Log in with your credentials.
2.​ Open CloudShell:
○ In the top navigation bar, click the CloudShell terminal icon (>_) next to the Region selector.
○​ A new terminal window will open at the bottom of the console.
3.​ Choose Region (if needed):
○ CloudShell opens in the console's currently selected Region. To work in a different Region, switch Regions in the console first.
4.​ Use AWS CLI Commands:
Try basic commands like:​
aws s3 ls
aws ec2 describe-instances
aws configure list
○ You can run any supported AWS CLI command here.
5.​ Upload/Download Files:
○​ Use the file menu in CloudShell to upload or download files/scripts.
6.​ Use Pre-installed Tools:
○​ Tools like Python, Git, Node.js, and AWS CLI are pre-installed.
Example:​
python3 --version
git --version
Part 2: AWS Cloud9 IDE
What is AWS Cloud9?
AWS Cloud9 is a browser-based Integrated Development Environment (IDE) that lets you write, run, and
debug code with just a browser. It supports various programming languages like Python, JavaScript, PHP,
and more.

Step-by-Step: Creating and Using an AWS Cloud9 Environment


1. Open the AWS Cloud9 console:
○ In the AWS Console, search for Cloud9 and open the service.
2.​ Create New Environment:
○​ Click "Create environment".
○​ Enter a Name (e.g., MyDevEnvironment) and an optional description.
3.​ Configure Environment Settings:
○​ Choose environment type:
■​ Create a new EC2 instance for environment (default).
○​ Select instance type (e.g., t2.micro for free tier).
○​ Set timeout and network options.
4.​ Review and Create:
○​ Click "Next step", then "Create environment".
○​ Wait for the environment to launch (takes a few minutes).
5.​ Start Coding:
○​ You will see a code editor, terminal, and file browser.
○​ Start writing code in your desired language.
6.​ Use Built-in Terminal:
○​ You can use the terminal just like CloudShell to run AWS CLI commands.
7.​ Save & Manage Files:
○​ Files are stored in the associated EC2 instance.
○​ Use the file tree on the left for navigation and file operations.
8.​ Collaborate:
○​ Share your Cloud9 environment with team members using IAM permissions.
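
To verify that the environment's AWS credentials work, you can run a short boto3 script from the Cloud9 terminal. A minimal sketch (boto3 comes pre-installed on Cloud9's default image; if it is missing, install it with pip install boto3):

# list_buckets.py - verify AWS access from inside Cloud9
import boto3

s3 = boto3.client('s3')

# Print the name of every S3 bucket these credentials can see
for bucket in s3.list_buckets()['Buckets']:
    print(bucket['Name'])

Run it with python3 list_buckets.py; an empty result simply means the account has no buckets yet.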
Experiment 2
Working with Amazon S3 and Orchestrating Serverless Functions with AWS Step Functions

1. Working with Amazon S3

Step 1: Log in to AWS Console


●​ Visit: https://aws.amazon.com/console/
●​ Sign in with your credentials.
Step 2: Open Amazon S3
●​ In the Search bar, type S3 and select S3 from the results.
Step 3: Create a Bucket
1.​ Click Create bucket.
2.​ Enter a unique bucket name (e.g., student-cloud-bucket).
3.​ Choose an AWS Region.
4.​ Leave default settings or enable versioning if needed.
5.​ Click Create bucket.
Step 4: Upload a File
1.​ Click on your bucket name.
2.​ Click Upload.
3.​ Click Add files, select a file from your computer.
4.​ Click Upload.
Step 5: Download a File
1.​ Inside your bucket, find the uploaded file.
2.​ Click the file name → Click Download.
Step 6: Make File Public (Optional)
1. Select the file → Click Actions → Make public using ACL (this requires the bucket's Block Public Access and ACL settings to allow it).
2. Copy the public Object URL to access the file in a browser.
Step 7: Delete Files or Bucket
● Delete files from the Actions menu as needed; to delete a bucket, empty it first, then delete the bucket itself.
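
The same operations can be scripted with boto3. A minimal sketch, assuming configured credentials, the bucket name from Step 3, and a hypothetical local file notes.txt:

import boto3

s3 = boto3.client('s3')
bucket = 'student-cloud-bucket'  # bucket names must be globally unique

# Create the bucket (in Regions other than us-east-1, also pass
# CreateBucketConfiguration={'LocationConstraint': '<region>'})
s3.create_bucket(Bucket=bucket)

# Upload a local file, then download it under a new name
s3.upload_file('notes.txt', bucket, 'notes.txt')
s3.download_file(bucket, 'notes.txt', 'notes-copy.txt')

# Clean up: delete the object first, because only empty buckets can be deleted
s3.delete_object(Bucket=bucket, Key='notes.txt')
s3.delete_bucket(Bucket=bucket)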

2. Orchestrating Serverless Functions with AWS Step Functions


Step 1: Open AWS Step Functions
●​ In the AWS Console, search for Step Functions and open it.
Step 2: Create a State Machine
1.​ Click Create state machine.
2.​ Choose Author with workflow studio (visual editor).
Step 3: Add Lambda Functions
1.​ Click Add state → Choose Lambda Invoke.
2.​ Configure the Lambda by selecting an existing function (or create one in advance).
3.​ Repeat to add multiple Lambda states if needed.
Step 4: Connect States
●​ Drag arrows to define execution order (like: Function A → Function B → Function C).
Step 5: Define Success/Failure Paths
●​ Add Choice or Fail states to handle conditions and errors.
Step 6: Review and Create
1.​ Click Next, name your state machine.
2.​ Choose an IAM role or create a new one.
3.​ Click Create state machine.
Step 7: Execute the Workflow
1.​ Click Start Execution.
2.​ Provide input JSON if needed.
3.​ View the visual execution path and logs in real-time.
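
Executions can also be started programmatically. A minimal boto3 sketch, assuming a state machine ARN copied from the console (the ARN and input below are placeholders):

import boto3
import json

sfn = boto3.client('stepfunctions')

response = sfn.start_execution(
    stateMachineArn='arn:aws:states:us-east-1:123456789012:stateMachine:MyStateMachine',  # placeholder
    input=json.dumps({'order_id': '42'})  # hypothetical input payload
)
print(response['executionArn'])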
Experiment 3
Working with Amazon DynamoDB
Step 1: Log in to AWS Console
● Go to https://console.aws.amazon.com/ and sign in.
Step 2: Open DynamoDB
●​ Search and select DynamoDB from the services.
Step 3: Create a Table
1.​ Click Create Table.
2.​ Enter:
○​ Table name: Students
○ Partition key: student_id (String)
3.​ Leave other options as default.
4.​ Click Create Table.
Step 4: Add Data (Items)
1.​ Open the created table.
2.​ Go to Items > Click Create Item.
Use the JSON editor or form to enter data:
{
  "student_id": "101",
  "name": "Arjun",
  "marks": 87
}

3.​ Click Create Item.

Step 5: Query Data


● In the Items tab, click Query.
● Query by the partition key student_id, or use Scan to retrieve all records.
Step 6: Optional – Use AWS CLI
aws dynamodb scan --table-name Students
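
The table can also be used from Python. A minimal boto3 sketch against the Students table created above:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('Students')

# Insert an item
table.put_item(Item={'student_id': '101', 'name': 'Arjun', 'marks': 87})

# Read it back by partition key
print(table.get_item(Key={'student_id': '101'})['Item'])

# Scan the whole table (fine for small tables; expensive on large ones)
print(table.scan()['Items'])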
Experiment 4
Developing REST APIs with Amazon API Gateway

Step 1: Log in to AWS Console


●​ Visit: https://console.aws.amazon.com/
●​ Sign in using your AWS account.​

Step 2: Create a Lambda Function (Backend)


We’ll create a simple function that returns a JSON message.
1.​ Go to AWS Lambda > Create function
2.​ Choose:
○​ Author from scratch
○​ Function name: HelloAPI
○​ Runtime: Python 3.10 (or Node.js)
3.​ Click Create function
In the code editor, paste this Python code:
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'headers': {
            'Content-Type': 'application/json'
        },
        'body': '{"message": "Hello from API Gateway!"}'
    }
4. Click Deploy

Step 3: Create a REST API in API Gateway


1.​ Go to API Gateway > APIs
2.​ Choose Create API
3.​ Select REST API (Build) (not HTTP API)
4.​ Set:​

○​ API name: MyHelloAPI


○​ Endpoint Type: Regional
○​ Click Create API

Step 4: Create a Resource and Method


1.​ In the left panel, click Actions > Create Resource
2.​ Resource Name: hello
3.​ Resource Path: /hello
4.​ Click Create Resource
5.​ Now select /hello → Actions > Create Method
6.​ Choose GET and click the checkmark
7.​ Integration type: Lambda Function
8.​ Check “Use Lambda Proxy Integration”
9.​ Enter the Lambda function name: HelloAPI
10.​ Click Save → Allow permissions​

Step 5: Deploy the API


1.​ Click Actions > Deploy API
2.​ Deployment stage: [New Stage]
3.​ Stage name: prod
4.​ Click Deploy​

A URL will be generated like:​


https://abc123.execute-api.us-east-1.amazonaws.com/prod/hello
Step 6: Test the API
Open a browser or use Postman/cURL to test:​
GET https://abc123.execute-api.us-east-1.amazonaws.com/prod/hello
You should see:​

{
  "message": "Hello from API Gateway!"
}
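
The endpoint can also be tested from Python using only the standard library. A minimal sketch; replace the URL with the invoke URL generated in Step 5:

import urllib.request

url = 'https://abc123.execute-api.us-east-1.amazonaws.com/prod/hello'  # replace with your URL

with urllib.request.urlopen(url) as resp:
    print(resp.status)           # expect 200
    print(resp.read().decode())  # expect {"message": "Hello from API Gateway!"}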
Experiment 5
Creating Lambda Functions Using the AWS SDK for Python
Pre-requisites
Before starting, make sure:
● You have Python 3.x installed.
● AWS CLI is installed and configured with credentials (aws configure).
● Boto3 is installed: pip install boto3

Step 1: Create a Simple Lambda Function Code


Save the following code as lambda_function.py:
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello from Lambda via Boto3!'
    }

Then, zip this file:


zip function.zip lambda_function.py

Step 2: Import Boto3 and Set Up IAM Role


Create a Python script create_lambda.py:
import boto3
lambda_client = boto3.client('lambda')
iam_client = boto3.client('iam')

Step 3: Get or Create an IAM Role for Lambda


Create or use an existing Lambda execution role with trust policy:
Example Trust Policy (trust-policy.json):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "lambda.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
Then create role via AWS CLI (one-time setup):
aws iam create-role --role-name lambda-execute-role \
--assume-role-policy-document file://trust-policy.json

Attach basic Lambda execution policy:


aws iam attach-role-policy --role-name lambda-execute-role \
--policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole

Step 4: Create the Lambda Function Using Boto3


Now complete your Python script:
import boto3

lambda_client = boto3.client('lambda')

role_arn = 'arn:aws:iam::<your-account-id>:role/lambda-execute-role'  # Replace with your ARN

with open('function.zip', 'rb') as f:
    zipped_code = f.read()

response = lambda_client.create_function(
    FunctionName='MyBoto3Lambda',
    Runtime='python3.10',
    Role=role_arn,
    Handler='lambda_function.lambda_handler',
    Code=dict(ZipFile=zipped_code),
    Timeout=15,
    Publish=True,
)

print("Lambda function created:")
print(response)
Run it:
python create_lambda.py

Step 5: Test the Lambda Function (Optional)


You can invoke the function using:
response = lambda_client.invoke(
    FunctionName='MyBoto3Lambda',
    InvocationType='RequestResponse',
    Payload=b'{}',
)

print(response['Payload'].read().decode())
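
When you are finished with the experiment, the function can be removed with the same client (a small cleanup sketch):

# Delete the function to keep the account tidy
lambda_client.delete_function(FunctionName='MyBoto3Lambda')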
Experiment 6
Migrating a Web Application to Docker Containers

Step 1: Create Your Web App Files


Create a folder named my-web-app and inside it, create a file:
index.php
<?php
echo "Hello from Docker!";
?>

Step 2: Create a Dockerfile


Dockerfile
# Use official PHP image
FROM php:8.1-apache
# Copy our code into the web directory
COPY . /var/www/html/
The Dockerfile tells Docker to:
●​ Use the official PHP image
●​ Copy your code into the Apache web directory​

Step 3: Create .dockerignore File (optional)


.dockerignore
*.log
*.tmp

This avoids copying unnecessary files into the Docker image.

Step 4: Build the Docker Image


Open terminal and navigate to your project folder:
cd my-web-app
docker build -t my-php-app .
●​ -t gives a name to your image (my-php-app)
●​ . means current directory
Step 5: Run the Container
docker run -d -p 8080:80 my-php-app
●​ -d = run in background
●​ -p 8080:80 = map local port 8080 to container port 80
Now open your browser:​
http://localhost:8080​
You’ll see: Hello from Docker!
Step 6: View Running Containers
docker ps

To stop the container:


docker stop <container_id>

Optional: Use docker-compose (Advanced)


Create a docker-compose.yml to manage multiple services.
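
A minimal docker-compose.yml sketch for this app; the db service is a hypothetical addition included only to illustrate managing more than one container:

services:
  web:
    build: .           # build the PHP image from the Dockerfile above
    ports:
      - "8080:80"      # same port mapping as the docker run command
  db:                  # hypothetical second service, for illustration only
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: example

Start both services with docker compose up -d and stop them with docker compose down.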
Experiment 7
Caching Application Data with ElastiCache, Caching with Amazon CloudFront, and Caching Strategies

PART 1: Caching Application Data with ElastiCache

Step 1: Go to AWS Console → ElastiCache


1.​ Choose Redis as the cache engine
2.​ Click Create
3.​ Enter:
○​ Name: my-redis-cache
○​ Node type: Choose cache.t2.micro (for free tier or testing)
○​ Number of replicas: 0 or 1
4.​ Choose default VPC and subnet
5.​ Click Create
Step 2: Connect ElastiCache with your App
In Python (using redis library):
pip install redis
In your app:
import redis
cache = redis.StrictRedis(host='your-cache-endpoint', port=6379, decode_responses=True)
# Set cache
cache.set('user_101', 'John Doe')
# Get from cache
print(cache.get('user_101'))
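
The most common caching strategy is cache-aside (lazy loading): check the cache first and, on a miss, load from the database and populate the cache with a TTL. A minimal sketch that reuses the cache client above; get_user_from_db is a hypothetical database lookup:

def get_user(user_id):
    key = f'user_{user_id}'

    # 1. Try the cache first
    value = cache.get(key)
    if value is not None:
        return value  # cache hit

    # 2. Cache miss: fall back to the database (hypothetical helper)
    value = get_user_from_db(user_id)

    # 3. Store it with a 5-minute TTL so stale entries expire on their own
    cache.set(key, value, ex=300)
    return value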

PART 2: Caching with Amazon CloudFront


Used for:
Caching static content (images, CSS, JS, videos) close to users via a global CDN.
Step-by-Step: Set up CloudFront
Step 1: Go to AWS Console → CloudFront
1.​ Click Create Distribution
2.​ Set Origin domain to your S3 bucket or web server
3.​ Keep default settings for caching behavior
4.​ Click Create Distribution
Step 2: Access Through CloudFront
Your content is now available via a CloudFront URL, like:
https://d123.cloudfront.net/image.jpg
Experiment 8
Implementing CloudFront for Caching and Application Security

Step 1: Prepare the Origin (Your Web Server or S3 Bucket)


If using S3:
1.​ Upload your website files to an S3 bucket
2.​ Enable static website hosting
3.​ Note down the S3 website endpoint URL​

Step 2: Create a CloudFront Distribution


1.​ Open AWS Console → CloudFront
2.​ Click Create Distribution
3.​ Under Origin domain, choose your:
○​ S3 website endpoint​
Example: mybucket.s3-website-us-east-1.amazonaws.com
4.​ In Cache Policy:
○​ Use CachingOptimized (or create custom)
5.​ Set Viewer Protocol Policy:
○​ Redirect HTTP to HTTPS
6.​ Enable Compress Objects Automatically: Yes
7.​ Click Create Distribution​

Step 3: Test CloudFront Caching


Once the distribution finishes deploying (typically 10–15 minutes), access:
https://<your-cloudfront-id>.cloudfront.net/index.html
●​ First request → fetched from origin
●​ Next requests → served from nearest CloudFront edge location
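
You can observe this from Python by reading the X-Cache response header that CloudFront adds ("Miss from cloudfront" on the first request, "Hit from cloudfront" once the object is cached). A minimal sketch; replace the URL with your distribution's:

import urllib.request

url = 'https://d123.cloudfront.net/index.html'  # replace with your CloudFront URL

for attempt in range(2):
    with urllib.request.urlopen(url) as resp:
        # Expect a Miss on the first request and a Hit on the second
        print(attempt + 1, resp.headers.get('X-Cache'))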

PART 2: CloudFront Security Features


A. Enforce HTTPS
In distribution settings → set:
●​ Viewer Protocol Policy → Redirect HTTP to HTTPS​

Ensures secure communication between users and CloudFront.

B. Add Signed URLs or Cookies


Use signed URLs to restrict access to content.
1. Create a CloudFront key pair (or, in newer setups, a public key in a CloudFront key group)
2. Generate a signed URL using the private key
3. Only users with a valid signed URL can access your file
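
A minimal Python sketch using botocore's CloudFrontSigner. It assumes the cryptography package is installed, that private_key.pem holds the private key matching your CloudFront public key, and that KEY_ID below is your key-pair or public-key ID (both are placeholders):

from datetime import datetime, timedelta
from botocore.signers import CloudFrontSigner
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

KEY_ID = 'KXXXXXXXXXXXXX'  # placeholder key ID

with open('private_key.pem', 'rb') as f:
    private_key = serialization.load_pem_private_key(f.read(), password=None)

def rsa_signer(message):
    # CloudFront signed URLs use SHA-1 RSA signatures
    return private_key.sign(message, padding.PKCS1v15(), hashes.SHA1())

signer = CloudFrontSigner(KEY_ID, rsa_signer)

url = signer.generate_presigned_url(
    'https://d123.cloudfront.net/private/report.pdf',       # hypothetical protected object
    date_less_than=datetime.utcnow() + timedelta(hours=1),  # link expires in 1 hour
)
print(url)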
C. Geo-Restriction (Location-based Blocking)
Block or allow users from specific countries:
1.​ Go to your distribution
2.​ Choose Restrictions → Edit Geo Restriction
3. Choose Block list or Allow list (older consoles call these Blacklist and Whitelist)
4.​ Add countries (e.g., block CN, RU)​

D. Integrate with AWS WAF (Web Application Firewall)


1.​ Create WAF in AWS WAF Console
2.​ Add rules to block attacks (like SQL injection, XSS)
3.​ Attach WAF to your CloudFront distribution​
Experiment 9
Orchestrating Serverless Functions with AWS Step Functions

PART 1: Create the Lambda Functions


Step 1: Create 3 Lambda Functions
Go to AWS Lambda and create the following:

Function: ValidateUser
Code (Python):
def lambda_handler(event, context):
    if "email" in event:
        return {"status": "valid", "data": event}
    else:
        raise Exception("Invalid input")

Function: SaveToDB
Code:
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('Users')

def lambda_handler(event, context):
    table.put_item(Item=event["data"])
    return {"status": "saved"}

Function: SendEmail
Code:
def lambda_handler(event, context):
    print("Email sent to:", event["data"]["email"])
    return {"status": "email_sent"}

PART 2: Create Step Function Workflow


Step 2: Open AWS Step Functions
1.​ Go to AWS Console → Step Functions
2.​ Click Create State Machine
3.​ Choose Author with code snippets
4.​ Paste the following Amazon States Language (ASL) definition:​

Sample Definition (JSON)


{
  "Comment": "User Signup Workflow",
  "StartAt": "ValidateUser",
  "States": {
    "ValidateUser": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:ValidateUser",
      "Next": "SaveToDB"
    },
    "SaveToDB": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:SaveToDB",
      "Next": "SendEmail"
    },
    "SendEmail": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:REGION:ACCOUNT_ID:function:SendEmail",
      "End": true
    }
  }
}

Replace REGION and ACCOUNT_ID with your actual AWS region/account.

Step 3: Permissions
Step Functions must have permission to call your Lambda functions. Make sure:
● Each Lambda function's execution role grants what the function needs (e.g., SaveToDB must be able to write to the Users DynamoDB table)
● The state machine's IAM role is allowed to invoke all three Lambda functions (lambda:InvokeFunction)
Step 4: Run the Workflow
1.​ Click Start Execution
2.​ Enter input:
{
  "email": "student@example.com",
  "name": "Student"
}

3.​ Click Start

You’ll see a visual workflow graph and live status updates!


Optional: Add Retry & Error Handling
Inside each state, you can add:
"Retry": [
{
"ErrorEquals": ["States.ALL"],
"IntervalSeconds": 2,
"MaxAttempts": 2
}
]
Experiment 10
Automating Application Deployment Using a CI/CD Pipeline

PART 1: Set Up Your Environment


Step 1: Create a Sample App
Create a small Node.js, Python, or HTML/CSS app.
<!-- index.html -->
<html><body><h1>Hello from CI/CD!</h1></body></html>

Push it to CodeCommit or GitHub.

Step 2: Prepare a buildspec.yml file


This file tells AWS CodeBuild how to build your app.
version: 0.2
phases:
  install:
    commands:
      - echo Installing dependencies
  build:
    commands:
      - echo Build successful
artifacts:
  files:
    - index.html

Place this in the root folder of your project.

PART 2: Set Up the CI/CD Pipeline


Step 3: Create CodeCommit Repository (Optional)
●​ Go to CodeCommit
●​ Click Create Repository
●​ Push your code using Git
OR use GitHub as your source provider in the next step.
Step 4: Create a Pipeline in AWS CodePipeline
1.​ Go to CodePipeline → Click Create pipeline
2.​ Enter pipeline name: MyAppPipeline
3.​ Choose:
○​ Source: CodeCommit or GitHub
○​ Build: AWS CodeBuild
■​ Create a new CodeBuild project
■​ Point to buildspec.yml
○​ Deploy: Amazon S3, EC2, or Elastic Beanstalk​

Step 5: Choose a Deployment Method


Option 1: Deploy to S3 (for static websites)
●​ Select "Deploy to Amazon S3"
●​ Choose your S3 bucket with static hosting enabled
Option 2: Deploy to EC2 (using CodeDeploy)
●​ Set up CodeDeploy agent on EC2 instance
●​ Define appspec.yml and scripts to copy files and restart server​

Example: EC2 Deployment (Simple HTML Site)


appspec.yml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/html

install.sh
#!/bin/bash
echo "Installing application"
Zip your code + scripts and upload to CodeDeploy.

What Happens Now?


1.​ Developer pushes code to Git
2.​ Pipeline:
○​ Pulls the code
○​ Builds using CodeBuild
○​ Deploys to S3 or EC2 using CodeDeploy
All steps happen automatically!
