Cloud Computing Lab Manual
LABORATORY MANUAL
6th Semester
(Academic Year: 2024-25)
STUDENT DETAILS
NAME
USN
BATCH
SECTION
STAFF DETAILS
NAME OF THE FACULTY Prathima/Jyothi
NAME OF THE INSTRUCTOR
EVALUATION
MAXIMUM MARKS MARKS OBTAINED
WEEKLY
EVALUATION*
LAB CIE
TOTAL
*OBE Sheet Evaluation
Signature of the Faculty
EXPERIMENT No. 01: Exploring AWS CloudShell and the AWS
Cloud9 IDE
1.1 Objective 1.6 Observation
1.2 Apparatus Required 1.7 Results
1.3 Pre-Requisite 1.8 Discussions
1.4 Introduction 1.9 Pre-Requisite Question
1.5 Procedure 1.10 Post-Requisite
1.1 Objectives:
Exploring AWS CloudShell and the AWS Cloud9 IDE
1.2 Apparatus Required:
AWS Account
1.3 Pre-Requisite:
Basic Linux knowledge
1.4 Introduction:
AWS CloudShell - AWS CloudShell is a browser-based shell that makes it
easier to securely manage, explore, and interact with your AWS resources.
AWS Cloud9 IDE - AWS Cloud9 is a cloud-based integrated
development environment (IDE) that lets you write, run, and debug your
code with just a browser. It includes a code editor, debugger, and terminal.
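Once CloudShell opens, the AWS CLI is already authenticated as the signed-in user. A few typical first commands to explore it (a sketch; any AWS CLI command will work here):

```shell
# Run inside AWS CloudShell (credentials are pre-configured for the signed-in user).
aws sts get-caller-identity   # confirm which account/role the shell is using
aws s3 ls                     # list the S3 buckets visible to this account
echo "hello from CloudShell" > hello.txt
cat hello.txt                 # files persist in the CloudShell home directory
```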
1.5 Procedure:
SELECT CLOUDSHELL
CREATE USING AN INSTANCE
UPLOAD A FILE
SPLIT INTO MULTIPLE ENVIRONMENTS
DOWNLOAD A FILE
REMOVE A FILE AND UPLOAD A FILE WITH CONTENTS AND READ IT
CREATE A VIRTUAL PRIVATE CLOUD USING CLOUDSHELL (IT DOESN’T SUPPORT
UPLOAD/DOWNLOAD)
GO TO ACTIONS -- RIGHT TOP CORNER
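The VPC created above through the console can equally be created from CloudShell with the AWS CLI; a sketch (the CIDR block and Name tag are example values):

```shell
# Sketch: create a VPC from CloudShell; CIDR block and tag are example values.
aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
    --tag-specifications 'ResourceType=vpc,Tags=[{Key=Name,Value=lab-vpc}]'
# Verify the VPC was created:
aws ec2 describe-vpcs --query 'Vpcs[].{Id:VpcId,Cidr:CidrBlock}'
```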
DELETE A CLOUDSHELL
EC2 instead of Cloud9
LAUNCH INSTANCE
CONNECT TO INSTANCE
#JUST TO SEE PRICING
CLICK ON THE ID TO SEE THE DETAILS OF THE INSTANCE
PUBLIC IP: 54.147.247.42
#create a simple html file to verify web browser is working
Exploring CloudShell
1.6 Observations:
We created a CloudShell console and experimented with it. We also created an EC2 instance and
a VPC.
EXPERIMENT No. 02: Working with Amazon S3; Orchestrating
Serverless Functions with AWS Step Functions
2.1 Objective 2.6 Observation
2.2 Apparatus Required 2.7 Results
2.3 Pre-Requisite 2.8 Discussions
2.4 Introduction 2.9 Pre-Requisite Question
2.5 Procedure 2.10 Post-Requisite
2.1 Objectives:
Working with Amazon S3; Orchestrating Serverless Functions with AWS
Step Functions
2.2 Apparatus Required:
AWS Account, S3, Step Functions
2.3 Pre-Requisite:
Basics of AWS and web development
2.4 Introduction:
Amazon S3 - Amazon Simple Storage Service (Amazon S3) is an
object storage service that offers industry-leading scalability, data
availability, security, and performance. You can use Amazon S3 to store and
retrieve any amount of data at any time, from anywhere.
AWS Step Functions - AWS Step Functions lets you coordinate
individual tasks into a visual workflow, so you can build and update apps
quickly. The workflows you build with Step Functions are called state
machines, and each step of your workflow is called a state.
2.5 Procedure:
Host a static website on AWS using S3
Home page of AWS
Before hosting a website, get the HTML code ready and have a browser available.
Go to the AWS Management Console.
Select S3.
Create bucket
Give the bucket a name and click Create
Bucket is created
Upload the required files by selecting the created bucket
Click on upload-add files
Click on upload
Enable static website hosting
Go to properties
Scroll down—edit
Set the index document to 1.html
Enable
Save
Then give permission through permission menu
Block public access
Edit uncheck the button
Then save
Confirm
Make all objects publicly accessible:
Select the objects by enabling their checkboxes
Then go to Actions and check: the Make public option is disabled
Go to Permissions - Object Ownership - Edit
Enable ACLs (access control lists) and save
Now check:
Go to Objects, select all objects, then check Actions
Click on make public using ACL
Click on make public
Refresh the website.
We hosted a static website on AWS using S3.
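The console steps above can also be sketched with the AWS CLI (the bucket name is a placeholder and must be globally unique; Block Public Access must already be disabled for the --acl flag to take effect):

```shell
# Sketch: host a static site from the CLI; bucket name is a placeholder.
aws s3 mb s3://my-lab-bucket-12345
aws s3 website s3://my-lab-bucket-12345 --index-document 1.html
aws s3 cp 1.html s3://my-lab-bucket-12345/ --acl public-read
# The site is then served at:
# http://my-lab-bucket-12345.s3-website-<region>.amazonaws.com
```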
EXPERIMENT No. 03: Working with Amazon
DynamoDB
3.1 Objective 3.6 Observation
3.2 Apparatus Required 3.7 Results
3.3 Pre-Requisite 3.8 Discussions
3.4 Introduction 3.9 Pre-Requisite Question
3.5 Procedure
3.1 Objectives: Working with Amazon DynamoDB
3.2 Apparatus Required:
AWS Account, AWS Dynamo DB
3.3 Pre-Requisite:
AWS basics, DBMS basics
3.4 Introduction:
AWS DynamoDB is a fast and flexible NoSQL database service for all
applications that need consistent, single-digit-millisecond latency at any scale. Its flexible
data model and reliable performance make DynamoDB a great fit for mobile, web,
gaming, advertising technology, Internet of Things, and other applications.
3.5 Procedure:
CREATE TABLE
GIVE THE TABLE NAME AND PARTITION KEY (THE PRIMARY KEY)
SELECT CUSTOMIZED SETTINGS, DynamoDB Standard-IA
TABLE CREATED -> SELECT YOUR TABLE -> EXPLORE TABLE ITEMS ->
CREATE ITEMS
USING JSON TO CREATE AN ITEM
{
"USN": {
"S": "03"
},
"NAME": {
"S": "XYZ"
},
"DEPT": {
"S": "MECH"
},
"AGE": {
"N": "13"
}
}
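Note that DynamoDB JSON wraps every attribute in a type descriptor ("S" for string, "N" for number). A small local sketch to unwrap such an item into a plain dictionary (illustrative only; the real service does this server-side):

```python
import json

def unwrap(item):
    """Convert a DynamoDB-typed item ({"S": ...}/{"N": ...}) to a plain dict."""
    plain = {}
    for key, typed in item.items():
        (dtype, value), = typed.items()  # each attribute has exactly one type tag
        plain[key] = float(value) if dtype == "N" else value
    return plain

# The item created above:
item = json.loads('''{
  "USN":  {"S": "03"},
  "NAME": {"S": "XYZ"},
  "DEPT": {"S": "MECH"},
  "AGE":  {"N": "13"}
}''')
print(unwrap(item))
```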
UPDATE COMMAND
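As a sketch, the same kind of update can be issued from the CLI; the table name "students" is an assumption, and the key matches the item created above:

```shell
# Sketch: update the AGE attribute of one item (table name is an assumption).
aws dynamodb update-item \
    --table-name students \
    --key '{"USN": {"S": "03"}}' \
    --update-expression "SET AGE = :a" \
    --expression-attribute-values '{":a": {"N": "14"}}'
```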
3.6 Observations:
Tables and items are created, and we can run commands to manipulate the data in the tables.
EXPERIMENT No. 04: Developing REST APIs
with Amazon API Gateway
4.1 Objective 4.6 Observation
4.2 Apparatus Required 4.7 Results
4.3 Pre-Requisite 4.8 Discussions
4.4 Introduction 4.9 Pre-Requisite Question
4.5 Procedure
4.1 Objectives:
Developing REST APIs with Amazon API Gateway.
4.2 Apparatus Required:
AWS Account
4.3 Pre-Requisite:
Creation of a Lambda function (as in Experiment 5)
4.4 Introduction:
API Gateway enables you to connect and access data, business logic, and functionality
from backend services such as workloads running on Amazon Elastic Compute Cloud
(Amazon EC2), code running on AWS Lambda, any web application, or real-time
communication applications.
4.5 Procedure:
CONFIGURATION --- FUNCTION URL --- CHECK THAT THE URL RUNS
LAMBDA CODE SAMPLES
import json
print(json.dumps({"name": "John", "age": 30}))
print(json.dumps(["apple", "bananas"]))
print(json.dumps(("apple", "bananas")))
print(json.dumps("hello"))
print(json.dumps(42))
print(json.dumps(31.76))
print(json.dumps(True))
print(json.dumps(False))
print(json.dumps(None))
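For use behind API Gateway, a handler conventionally returns a statusCode/body pair rather than printing. A minimal sketch (the function and field names follow the Lambda proxy-integration convention):

```python
import json

def lambda_handler(event, context):
    # Minimal handler shape for an API Gateway (Lambda proxy) integration.
    body = {"name": "John", "age": 30}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }

# Local invocation for a quick check (event and context are unused here):
print(lambda_handler({}, None))
```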
START FOR P4 : GO TO API GATEWAY IN CONSOLE AND SELECT REST API
BUILD REST API
CREATE AN API WITH ABOVE OPTIONS
CREATE METHOD AND SELECT LAMBDA INTEGRATION
SELECT GET METHOD AND LAMBDA CREATED (PREVIOUSLY)
TO TEST API
DEPLOY API
COPY THE INVOKE URL AND OPEN IT IN A NEW TAB
EXPERIMENT No. 05: Creating Lambda
Functions Using the AWS SDK for Python
5.1 Objective 5.6 Observation
5.2 Apparatus Required 5.7 Results
5.3 Pre-Requisite 5.8 Discussions
5.4 Introduction 5.9 Pre-Requisite Question
5.5 Procedure
5.1 Objectives: Creating Lambda Functions Using the AWS SDK for Python
5.2 Apparatus Required:
AWS Account
5.3 Pre-Requisite:
AWS basic knowledge, basics of lambda functions and Python
5.4 Introduction:
An AWS Lambda function is a serverless function that runs code in
response to events, such as user actions or system updates. You can use Lambda
functions to create custom backend services, extend other AWS services, or
trigger actions in other services.
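Since this experiment uses the AWS SDK for Python (boto3), the function can also be created programmatically. A sketch under assumptions: the function name, role ARN, and zip file below are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import boto3

lam = boto3.client("lambda")  # uses credentials from the environment

# Assumes function.zip contains lambda_function.py with a lambda_handler.
with open("function.zip", "rb") as f:
    lam.create_function(
        FunctionName="lab-hello",                               # placeholder name
        Runtime="python3.12",
        Role="arn:aws:iam::123456789012:role/lab-lambda-role",  # placeholder ARN
        Handler="lambda_function.lambda_handler",
        Code={"ZipFile": f.read()},
    )
```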
5.5 Procedure:
CREATE FUNCTION
CONFIGURATION --- FUNCTION URL --- CHECK THAT THE URL RUNS
EXPERIMENT No. 06: Migrating a Web Application to Docker
Containers
6.1 Objective 6.6 Observation
6.2 Apparatus Required 6.7 Results
6.3 Pre-Requisite 6.8 Discussions
6.4 Introduction 6.9 Pre-Requisite Question
6.5 Procedure
6.1 Objectives: Migrating a Web Application to Docker Containers
6.2 Apparatus Required:
AWS Account
6.3 Pre-Requisite:
AWS basics, Docker basics
6.4 Introduction:
Docker is a set of platform as a service (PaaS) products that use OS-level
virtualization to deliver software in packages called containers.
Docker is a software platform that packages software into containers. Docker
images are read-only templates that contain instructions for creating a Container.
A Docker image is a snapshot or blueprint of the libraries and dependencies
required inside a container for an application to run.
A Docker container image is a lightweight, standalone, executable package of
software that includes everything needed to run an application: code, runtime,
system tools, system libraries, and settings.
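As a sketch of what migrating a web application into a container looks like, a minimal Dockerfile for serving a static page with nginx (index.html is an assumed file sitting next to the Dockerfile):

```dockerfile
# Sketch: serve a static page with nginx (index.html is assumed to exist
# next to this Dockerfile).
FROM nginx:alpine
COPY index.html /usr/share/nginx/html/index.html
EXPOSE 80
```

It would be built and run with `docker build -t mysite .` followed by `docker run -d -p 80:80 mysite`, mirroring the nginx commands in the procedure below.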
6.5 Procedure:
CREATE EC2 INSTANCE
• LINUX MACHINE
• ENABLE HTTPS AND HTTP
Remaining all default and launch instance
GO TO YOUR INSTANCE
Press Connect
IT WILL TAKE YOU TO THE LINUX MACHINE
EXECUTE THE COMMANDS BELOW IN LINUX TO INSTALL THE DOCKER ENGINE
sudo yum update -y
sudo yum install docker -y
sudo service docker start
sudo service docker status
(ctrl Z)
sudo su
docker version
docker pull nginx
docker images
docker run -d -p 80:80 nginx
docker ps
After this, go to the EC2 instance - Connect - copy the public IP.
Paste it into a browser.
EXPERIMENT No. 07: Caching Application Data with
ElastiCache, Caching with Amazon CloudFront, Caching
Strategies
7.1 Objective 7.6 Observation
7.2 Apparatus Required 7.7 Results
7.3 Pre-Requisite 7.8 Discussions
7.4 Introduction 7.9 Pre-Requisite Question
7.5 Procedure
7.1 Objectives: Caching Application Data with ElastiCache, Caching with
Amazon CloudFront, Caching Strategies
7.2 Apparatus Required:
AWS Account
7.3 Pre-Requisite:
AWS basics, caching basics and AWS CloudFront
7.4 Introduction:
AWS CloudFront - CloudFront offers several options for streaming your media to
global viewers—both pre-recorded files and live events. For video on demand
(VOD) streaming, you can use CloudFront to stream in common formats such as
MPEG DASH, Apple HLS, Microsoft Smooth Streaming, and CMAF, to any
device.
The two main components of AWS CloudFront are content delivery and
dynamic content caching.
7.5 Procedure:
ON THE LEFT SIDE, UNDER RESOURCES, SELECT REDIS OSS CACHES --
CONTINUE TO REDIS OSS
####### DON'T ENABLE MULTI-AZ
SELECT ENGINE VERSION (7.0), PORT NUMBER (6379), NODE
TYPE (t3.micro), SHARDS 2
GIVE THE REDIS SUBNET NAME AND MANAGE THE
RESOURCE
NEXT
Select AUTH DEFAULT ACCESS
AUTHtoken123456789
STEP 2: GO TO EC2 IN A NEW TAB AND CREATE A
SECURITY GROUP
INBOUND RULES - ADD RULE - CUSTOM TCP - PORT
6379 (AS OUR ELASTICACHE IS RUNNING THERE) -
SOURCE (ANYWHERE IPV4)
CREATE SECURITY GROUP
--------- STEP 3: COME BACK TO ELASTICACHE AND
SELECT THE SECURITY GROUP ---------
GO TO MANAGE, REFRESH, AND SELECT THE SECURITY
GROUP
NEXT
WITH DEFAULT SETTINGS ON THE NEXT PAGE, CLICK ON
CREATE
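Once the cluster is available, connectivity can be checked from an EC2 instance in the same VPC. A sketch: the endpoint is a placeholder, the --tls flag assumes in-transit encryption is enabled (required when an AUTH token is set), and the package name varies by Amazon Linux version.

```shell
# Sketch: verify the Redis cluster from an EC2 instance in the same VPC.
sudo yum install -y redis6   # package name varies by Amazon Linux version

# <primary-endpoint> is the cluster's primary endpoint from the console;
# the AUTH token matches the one configured above.
redis-cli -h <primary-endpoint> -p 6379 -a AUTHtoken123456789 --tls ping
```

A successful connection replies with PONG.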
EXPERIMENT No. 08: Implementing CloudFront for Caching and
Application Security
8.1 Objective 8.6 Observation
8.2 Apparatus Required 8.7 Results
8.3 Pre-Requisite 8.8 Discussions
8.4 Introduction 8.9 Pre-Requisite Question
8.5 Procedure
8.1 Objectives: Implementing CloudFront for Caching and Application Security
8.2 Apparatus Required:
AWS Account
8.3 Pre-Requisite:
AWS Basics, CloudFront knowledge
8.4 Introduction:
Amazon CloudFront is a content delivery network (CDN) operated by Amazon
Web Services. The content delivery network was created to provide a globally
distributed network of proxy servers to cache content, such as web videos or other
bulky media, more locally to consumers, to improve access speed for downloading
the content.
8.5 Procedure:
GO TO S3 & CREATE A BUCKET
GIVE BUCKET NAME AND KEEP ALL DEFAULTS
GO TO BUCKET CREATED
GO TO PROPERTIES AND HOST STATIC WEBSITE
SAVE CHANGES
GO TO OBJECT AND UPLOAD A FILE (HTML)
ACCESS THE URL FROM YOUR BUCKET - PROPERTIES -> STATIC WEBSITE
HOSTING SITE ADDRESS
IF YOU TRY TO ACCESS THE URL
STEP 2: CLOUDFRONT
CREATE A DISTRIBUTION
CREATE A NEW OAI
KEEP ALL SETTINGS DEFAULT
#### A POLICY HAS BEEN CREATED IN THE S3 BUCKET
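The bucket policy that CloudFront writes for an OAI typically looks like the following sketch (the OAI ID and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "AllowCloudFrontOAIRead",
    "Effect": "Allow",
    "Principal": {
      "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity EXAMPLEID"
    },
    "Action": "s3:GetObject",
    "Resource": "arn:aws:s3:::YOUR-BUCKET-NAME/*"
  }]
}
```

This grants read access on the objects to CloudFront only, which is why the distribution URL works while direct S3 access stays blocked.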
GO TO THE LINK (DISTRIBUTION DOMAIN NAME)
S3 RESULT
CLOUD FRONT RESULT
EXPERIMENT No.09: Orchestrating Serverless Functions with
AWS Step Functions
9.1 Objective 9.6 Observation
9.2 Apparatus Required 9.7 Results
9.3 Pre-Requisite 9.8 Discussions
9.4 Introduction 9.9 Pre-Requisite Question
9.5 Procedure
9.1 Objectives: Orchestrating Serverless Functions with AWS Step Functions
9.2 Apparatus Required:
AWS Account
9.3 Pre-Requisite:
AWS basics, AWS step functions
9.4 Introduction:
Serverless functions let developers code in any language or framework with
which they're comfortable, and they simplify backend code: serverless removes
a lot of coding complexity, allowing developers to create simple, self-contained
functions that each independently perform one purpose.
GO TO STEP FUNCTION CONSOLE
CREATE YOUR OWN
######################
Lab 1. Step Function to Select/Reject a student based on Maths/Physics cut-off mark
######################
Task 1: Create SNS topic
======================
1. Create an SNS topic. Subscribe your mail id to the topic.
Task 2: Definition of Step Function
==============
1. Go to the Step Functions console.
Create a State Machine. Use the below code.
Name: course_selection_state_machine
{
"Comment": "An example of the Amazon States Language for scheduling a
task.",
"StartAt": "StartHere",
"States": {
"StartHere": {
"Type": "Pass",
"Next": "SubjectChoice"
},
"SubjectChoice": {
"Type": "Choice",
"Choices": [
{
"Variable": "$.Subject",
"StringEquals": "Physics",
"Next": "Physics"
},
{
"Variable": "$.Subject",
"StringEquals": "Maths",
"Next": "Maths"
}
],
"Default": "EndState"
},
"Physics": {
"Type": "Pass",
"Next": "CheckMarks"
},
"Maths": {
"Type": "Pass",
"Next": "CheckMarks"
},
"CheckMarks": {
"Type": "Choice",
"Choices": [
{
"Variable": "$.Marks",
"NumericGreaterThan": 70,
"Next": "EndState"
}
],
"Default": "EndState"
},
"EndState": {
"Type": "Pass",
"End": true
}
}
}
2. Execute the state machine. Provide the below values as the
execution input:
{
"Subject": "Maths",
"Marks": 76
}
{
"Subject": "Physics",
"Marks": 91
}
3. Observe the graph view. Observe each step turning green.
4. Cleanup - delete the step function
#####
End
#####
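The branching logic of the state machine above can be sketched locally in plain Python to see which states each input visits (illustrative only; the real Choice-state semantics live in the Amazon States Language):

```python
def run_state_machine(event):
    """Trace the states visited for one input, mirroring the definition above."""
    path = ["StartHere", "SubjectChoice"]
    subject = event.get("Subject")
    if subject in ("Physics", "Maths"):
        path.append(subject)       # SubjectChoice routes on $.Subject
        path.append("CheckMarks")
        # CheckMarks: both the Marks > 70 branch and the default lead to EndState
        path.append("EndState")
    else:
        path.append("EndState")    # SubjectChoice default
    return path

print(run_state_machine({"Subject": "Maths", "Marks": 76}))
# → ['StartHere', 'SubjectChoice', 'Maths', 'CheckMarks', 'EndState']
```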
EXPERIMENT No. 10: Automating Application Deployment Using
CI/CD Pipeline
10.1 Objective 10.6 Observation
10.2 Apparatus Required 10.7 Results
10.3 Pre-Requisite 10.8 Discussions
10.4 Introduction 10.9 Pre-Requisite Question
10.5 Procedure
10.1 Objectives: Automating Application Deployment Using CI/CD Pipeline
10.2 Apparatus Required:
AWS Account
10.3 Pre-Requisite:
AWS knowledge, basic CI/CD knowledge, IAM Roles, AWS CodeDeploy
10.4 Introduction:
A continuous integration and continuous deployment (CI/CD) pipeline is a
series of steps that must be performed in order to deliver a new version of
software. CI/CD pipelines are a practice focused on improving software delivery
throughout the software development life cycle via automation.
AWS IAM - AWS Identity and Access Management (IAM) is a web service for
securely controlling access to AWS services. With IAM, you can centrally manage
users, security credentials such as access keys, and permissions that control which
AWS resources users and applications can access.
AWS CodeDeploy is a service that automates code deployments to any instance,
including Amazon EC2 instances and instances running on-premises. AWS
CodeDeploy makes it easier for you to rapidly release new features, helps you
avoid downtime during deployment, and handles the complexity of updating your
applications.
Step 1: Create IAM Role for EC2 and AWS CodeDeploy
• Navigate to IAM service.
• Then go to roles and create a new role.
• Select trusted entity type as AWS Service and use case as EC2
Step 2: Add permissions To IAM Role
• Select AmazonS3ReadOnlyAccess permission. It will allow
our EC2 instance to access stored artifacts from the Amazon S3
bucket.
Step 3: Creating The Role For AWS CodeDeploy
• Provide an appropriate role name, review, and click on Create
role.
Step 4: Creating New Service Role For CodeDeploy
• Create a new service role for CodeDeploy and attach the
AWSCodeDeployRole policy, which provides the permissions
for the service role to read tags of our EC2 instances, publish
information to Amazon SNS topics, and more.
• Repeat the Above 3 steps again with trusted entity type AWS
Service, use case CodeDeploy.
• Add the AWSCodeDeployRole permission to this new role.
• Provide the Name, review and create the role.
Step 5: Launch A Linux EC2 Instance
• Select the instance with AMI such as "Amazon Linux" and
connect to CLI Console.
• Switch to the root user from ec2-user to gain admin access by
running the following command in Linux:
sudo su
Step 6: Update The Packages
• The command "sudo yum update" is used in Amazon Linux,
CentOS, and Red Hat Linux distributions to update installed
packages on your system to their latest available versions.
sudo yum update
Step 7: Install The Ruby And Wget Software
• The command 'sudo yum install ruby' is used to install the Ruby
programming software using the YUM package manager.
sudo yum install ruby
• The command sudo yum install wget is used to install the "wget"
package on a system running Amazon Linux, CentOS, or other
Red Hat-based Linux distributions that use the YUM package
manager.
sudo yum install wget
Step 8: Download CodeDeploy Agent Script
• Downloading the AWS CodeDeploy agent installation script
from the AWS S3 bucket is an essential step in setting up AWS
CodeDeploy for your infrastructure.
• The CodeDeploy agent is a lightweight, scalable software
component that enables AWS CodeDeploy to deploy and
manage applications on your EC2 instances or on-premises
servers.
wget https://aws-codedeploy-us-east-1.s3.amazonaws.com/latest/install
Step 9: Run Installation Script
• The command chmod +x ./install is used to make a file
executable in a Unix-like operating system, including Linux.
chmod +x ./install
• The command 'sudo ./install auto' runs the installation script
with superuser (administrator) privileges and passes the "auto"
argument to the script.
sudo ./install auto
Step 10: Check CodeDeploy Agent Status
• The command sudo service codedeploy-agent status is used to
check the status of the AWS CodeDeploy agent running on your
system.
sudo service codedeploy-agent status
Step 11: Modifying IAM Role
• After running the following commands, select the instance and
click on "Actions", then click on "Security" and click on
"Modify IAM Role". Then choose the above created IAM Role
and click on "Update IAM Role".
• After this step, your EC2 instance gets attached with your above
created IAM Role.
• Modify the IAM role by clicking on the button Update IAM
role as shown in the figure.
Step 12: Finalizing The Configuration
After this process, go to the console where your instance is
connected and run the command "exit" to leave the root user
and return to ec2-user. In the ec2-user home directory, make a
directory named "server"; this is the directory where the source
code will be deployed.
• Then after doing the above process, come back to the running
instances list.
• Select your currently created running instance and go to the
"Security" section present at the end of the page.
• Click on the link present under the "Security Groups". After
redirecting to the required page, click on "Edit Inbound rules"
under the section of "Inbound rules" present at the end of the
page.
• Then add a rule, select a port range of your choice and select the
source as "Anywhere-IPv4" from the dropdown menu and then
click on "Save rules".
• In brief, when you add an inbound rule to a security group
for an instance with a port range (in my case, 4000) and set
the source to "Anywhere-IPv4," you are allowing any computer
or device on the internet to connect to your instance through
port 4000.
• This is like opening a door (port 4000) on your server and letting
anyone from anywhere access the service or application running
on that port.
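The same inbound rule can be sketched with the AWS CLI (the security group ID is a placeholder; port 4000 matches the example above):

```shell
# Sketch: open TCP port 4000 to the internet on a security group.
aws ec2 authorize-security-group-ingress \
    --group-id sg-0123456789abcdef0 \
    --protocol tcp \
    --port 4000 \
    --cidr 0.0.0.0/0
```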
Step 13: Create A New Pipeline
• Create a CodePipeline using Github, CodeBuild and
CodeDeploy
• Firstly Create CodePipeline navigate to CodePipeline via AWS
Management Console and click on Create pipeline.
Step 14: Choose Github In Code Source
• After selecting GitHub as the source provider, click on the
Connect to GitHub button. You'll then be prompted to enter your
GitHub login credentials.
• Once you grant AWS CodePipeline access to your GitHub
repository, you can select a repository and branch so that
CodePipeline detects new commits to that repository and
triggers your pipeline.
Step 15: Configure CodeBuild (Optional)
• If you haven’t created a project prior to creating your pipeline,
then you can create a project directly from here by clicking
Create project button.
• Note: A buildspec file is a collection of build commands and
related settings, in YAML format, that CodeBuild uses to run a
build. For my project, I created a buildspec.yaml file and added it
in the root of my project directory.
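A minimal buildspec.yaml might look like the following sketch (the Node.js runtime and npm commands are assumptions; the phases depend entirely on your project):

```yaml
# Sketch of a minimal buildspec.yaml; commands depend on your project.
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 18        # assumed runtime
  build:
    commands:
      - npm install
      - npm run build
artifacts:
  files:
    - '**/*'            # package everything for the deploy stage
```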
Step 16: Add Deploy Stage
Note: Before configuring the Add Deploy Stage, open a
duplicate of the current tab.
• Go to CodeDeploy in the navigation, select Applications, then
create a deployment group.
• Create a deployment group by clicking on the "Create
deployment group" button, as the following screenshots
illustrate.
• In the deployment group's environment configuration, select
Amazon EC2 instances and provide the tag key and value.
• Uncheck Load Balancer Option
• Finally, come back to the Add Deploy Stage and select the
created application name and deployment group.
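For CodeDeploy to place the files, the repository also needs an appspec.yml at its root. A minimal sketch (the "server" destination matches the directory created in Step 12; the start script is hypothetical):

```yaml
# Sketch of a minimal appspec.yml for CodeDeploy on EC2.
version: 0.0
os: linux
files:
  - source: /
    destination: /home/ec2-user/server   # directory created in Step 12
hooks:
  ApplicationStart:
    - location: scripts/start.sh         # hypothetical start script
      timeout: 300
      runas: ec2-user
```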
Step 17: Review And Create
• As a final step, review and create it. With this, we have
successfully created a CI/CD pipeline in AWS.
PIPELINE CREATED. ALTERNATE METHOD:
NEXT
CONNECT TO GITHUB
CONNECT TO GITHUB
CONNECT
IN THE NEW CONSOLE, CREATE AN APPLICATION AND
DEPLOYMENT GROUP