Internship Report
on
DevOps and Cloud Computing
Submitted in partial fulfillment of the requirements for the degree of
BACHELOR OF TECHNOLOGY
in
CSE(AIML)
Submitted to:                              Submitted by:
Mr. Chandershekhar Singh                   Mr. Hemant Vaishnav
Head of Technology                         JIET/AIML/23/014
Dept. of Technology                        23EJIAI041
JIET Group of Institutions, Jodhpur
Department of Technology
Jodhpur Institute of Engineering and Technology
JIET Group of Institutions, Jodhpur
2025-26
Candidate’s Declaration
I hereby declare that the project report titled "DevOps and Cloud Computing", submitted
in partial fulfillment of the requirements for the degree of Bachelor of Technology in Computer
Science and Engineering with a specialization in Artificial Intelligence and Machine Learning,
is a genuine record of my own work. This internship was carried out under the guidance of
Mr. Rajendra Jangid, Internship Coordinator at Upflairs Private Limited.
I affirm that the content of this report has not been submitted elsewhere for the award of
any other degree or diploma.
1
Certificate
CERTIFICATE OF INTERNSHIP
2
Acknowledgement
3
Abstract
This report details the work and learning undertaken during an industrial training program
focused on DevOps and Cloud Computing at Upflairs Private Limited. The training
provided in-depth practical experience in deploying and managing scalable cloud-based
applications.
The core project during this internship was implementing Infrastructure as Code (IaC) using
Terraform on AWS, aimed at strengthening skills in automated cloud provisioning. Key AWS
services like EC2, S3, and IAM were configured and managed through Terraform scripts to
ensure consistent and scalable deployments.
Firebase was integrated for user authentication and Firestore for real-time database
operations. The training also covered CI/CD pipelines using GitHub Actions, along with
basic monitoring and logging techniques. This hands-on experience provided a solid
foundation in modern DevOps practices and cloud-native development.
4
Contents
Candidate’s Declaration 1
Certificate 2
Acknowledgement 3
Abstract 4
1 Introduction 9
1.1 Background and Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2 Importance of DevOps and Cloud Computing . . . . . . . . . . . . . . . . . . . 9
1.3 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.4 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.5 Aims and Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.6 Report Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3 DevOps Projects 16
3.1 Introduction to Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2 Tools and Technologies Used . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3 Small Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3.1 Hosting Website with Nginx and Jenkins . . . . . . . . . . . . . . . . . . 16
3.3.2 Hosting Website on AWS EC2 with Apache . . . . . . . . . . . . . . . . 17
3.3.3 Running Git Repo using Docker . . . . . . . . . . . . . . . . . . . . . . . 18
3.3.4 Creating an Instance with Terraform . . . . . . . . . . . . . . . . . . . . 18
3.4 Major Project: Automated Website Deployment Using Terraform, Docker, and
AWS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
5
4.3 Text Editors: nano, vim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.4 Permissions and sudo Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.5 System Directories: /etc, /var, /home . . . . . . . . . . . . . . . . . . . . . . . . 22
4.6 Apache2 Web Server: Installation & Configuration . . . . . . . . . . . . . . . . . 22
4.6.1 Installing Apache2 using apt . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.6.2 Starting and Enabling Apache2 Service . . . . . . . . . . . . . . . . . . . 23
4.6.3 Default Web Root: /var/www/html . . . . . . . . . . . . . . . . . . . . . 23
4.6.4 Config Files in /etc/apache2 . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.6.5 Restarting and Checking Status . . . . . . . . . . . . . . . . . . . . . . . 23
4.7 AWS CLI Basics: aws configure, s3 ls, ec2 describe . . . . . . . . . . . . . . . . 23
8 Conclusion 32
8.1 Summary of Achievements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.2 Challenges Faced . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.3 Learning Outcomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.4 Future Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
References 34
6
List of Figures
7
List of Tables
8
Chapter 1
Introduction
1.3 Motivation
The motivation behind this report stems from the increasing reliance on digital solutions and
the need for efficient software delivery methods. Traditional development models often fall
short in meeting modern demands for speed and flexibility. The combination of DevOps and
cloud computing presents an opportunity to address these issues by enhancing collaboration,
automating processes, and enabling rapid deployments. This study aims to explore how these
technologies can be successfully integrated to improve productivity and reduce operational
risks. The researcher’s interest in emerging IT trends and the real-world impact of technology
adoption further motivates a deeper investigation into these transformative practices.
9
1.4 Problem Statement
Despite the growing adoption of DevOps and cloud computing, many organizations face
challenges in achieving effective integration. These issues often prevent them from realizing
the full benefits of speed, efficiency, and scalability:
• Cultural Resistance: Teams often resist change, limiting collaboration between development
and operations.
• Skills Gap: Many teams lack hands-on expertise with DevOps tools and cloud platforms,
which slows adoption.
• Cloud Complexity: Managing security, compliance, and costs in the cloud adds operational
difficulty.
10
Chapter 2
In this chapter, we’ll go through the setup of all the essential tools and software required for
working in DevOps and Cloud Computing. These tools help us write code, manage cloud
infrastructure, work with containers, and automate tasks.
11
Configure Git:
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
Listing 2.3: Steps to Configure Git
GitHub Setup:
GitHub is a cloud-based platform that enables developers to host, share, and collaborate on
code repositories. After installing Git on a local system, users can create a GitHub account
by visiting https://github.com. To securely interact with GitHub from the terminal, it is
recommended to generate an SSH key pair using ssh-keygen and add the public key to the
GitHub account under “SSH and GPG keys.” This allows for password-less authentication
during Git operations like clone, push, and pull. Once configured, users can create
repositories on GitHub and connect them to local Git repositories using git remote add origin.
This setup enables seamless version control and collaboration, making GitHub an essential
tool in modern DevOps workflows.
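As an illustrative sketch of this setup (the email address, key type, and repository URL below are placeholders, not the actual values used during the internship), the terminal steps are roughly:

ssh-keygen -t ed25519 -C "you@example.com"      # generate an SSH key pair (accept the default path)
cat ~/.ssh/id_ed25519.pub                       # copy this public key into GitHub under "SSH and GPG keys"
ssh -T git@github.com                           # verify password-less authentication works
git remote add origin git@github.com:username/my-repo.git   # link a local repository to GitHub
git push -u origin main                         # push the local branch to the remote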
aws configure
Listing 2.5: Configure AWS CLI
• Installing extensions for Terraform, Docker, AWS, and Git tailors VS Code to your
DevOps stack.
Why It Matters:
• Productivity is greatly improved when your editor understands your tools and code
syntax.
12
• VS Code becomes a central hub where you write, edit, test, and push code — all from
one place.
Install VS Code (UBUNTU)
wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > packages.microsoft.gpg
sudo install -o root -g root -m 644 packages.microsoft.gpg /etc/apt/trusted.gpg.d/
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main" > /etc/apt/sources.list.d/vscode.list'
sudo apt update
sudo apt install -y code
Listing 2.6: Install VS Code - Ubuntu
Install Extensions:
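As an assumed illustration, the extensions mentioned above can be installed from the terminal with commands of this form; the identifiers below are the commonly published marketplace IDs and should be verified before use:

code --install-extension hashicorp.terraform                   # Terraform language support
code --install-extension ms-azuretools.vscode-docker           # Docker integration
code --install-extension amazonwebservices.aws-toolkit-vscode  # AWS Toolkit
code --install-extension eamodio.gitlens                       # Git history and blame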
13
curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list > /dev/null
sudo apt update
sudo apt install -y terraform
terraform -version
Listing 2.8: Commands
14
Test Docker:
docker run hello-world
Listing 2.11: Test Docker
15
Chapter 3
DevOps Projects
This chapter provides an overview of various projects undertaken to apply DevOps principles
and technologies. These projects demonstrate a practical understanding of tools like Docker,
Terraform, Apache, and GitHub Actions, and how they can be used to automate the software
delivery lifecycle.
Tool/Technology   Purpose
Docker            Containerizing and running applications
Terraform         Provisioning infrastructure as code (IaC)
AWS EC2           Hosting applications on virtual machines
Apache            Web server for serving websites
GitHub            Version control and remote repositories
Jenkins           Automating deployment pipelines
16
Figure 3.1: Nginx and Jenkins Website Hosting
pipeline {
    agent any
    stages {
        stage('Clone Repo') {
            steps {
                git 'https://github.com/your-repo.git'
            }
        }
        stage('Deploy with Nginx') {
            steps {
                sh 'sudo systemctl restart nginx'
            }
        }
    }
}
Listing 3.1: Jenkinsfile Script
17
git clone https://github.com/your-repo.git
cp -r your-repo/* /var/www/html/
Listing 3.2: Apache Install Script on EC2
18
  }
}
Listing 3.4: Terraform Configuration
Project Overview: This project showcases a complete DevOps pipeline where a static
website is deployed in an automated and scalable manner using Terraform, AWS EC2,
Docker, Apache HTTP Server, and GitHub.
Steps Involved:
19
resource "aws_instance" "docker_web" {
  ami           = "ami-xxxxxxxxxxxxx"
  instance_type = "t2.micro"

  user_data = <<-EOF
    #!/bin/bash
    apt update -y
    apt install docker.io -y
    git clone https://github.com/your-repo.git
    cd your-repo
    docker build -t site .
    docker run -d -p 80:80 site
  EOF
}
Listing 3.6: Terraform with Docker Setup
3.5 Conclusion
These projects enabled practical learning of DevOps concepts by integrating automation,
containerization, and cloud infrastructure. Using tools like Terraform, Docker, Jenkins, and
AWS, I was able to deploy web applications efficiently and repeatedly.
20
Chapter 4
Command   Purpose
ls        Lists files and directories in the current path
cd        Changes directory
pwd       Prints the current working directory
mkdir     Creates a new directory
21
• vim: A powerful, advanced editor. It has a steeper learning curve but offers high efficiency
for experienced users. To open a file: vim filename.txt. To edit, you must enter "insert"
mode by pressing i. To save and exit, press Esc and type :wq.
• Permissions: The ls -l command shows file permissions for the owner, group, and
others (e.g., -rw-r--r--).
• chmod: Changes permissions (e.g., chmod 755 script.sh makes a script executable for
all users).
• sudo: Prefacing a command with sudo (e.g., sudo apt update) temporarily grants
administrative privileges, allowing you to install software or modify system files, as illustrated
in the short example below.
• /etc: Contains all system-wide configuration files (e.g., Apache2 configs, network settings).
• /var: Stores variable data, such as log files (/var/log), web server files (/var/www), and
mail queues.
• /home: The directory where personal user files are stored. Each user has a subdirectory
here (e.g., /home/rahul).
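As referenced in the sudo point above, a short illustrative session ties these commands together (the file name deploy.sh is an assumed example):

ls -l deploy.sh        # shows e.g. -rw-r--r-- : owner can read/write, group and others can only read
chmod 755 deploy.sh    # owner: rwx, group and others: r-x (the script becomes executable)
ls -l deploy.sh        # now shows -rwxr-xr-x
sudo apt update        # temporarily elevate privileges for an administrative task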
22
4.6.2 Starting and Enabling Apache2 Service
The systemctl command is used to manage system services.
sudo systemctl start apache2    # Starts the service
sudo systemctl enable apache2   # Enables it to start on boot
sudo systemctl status apache2   # Checks its status
Listing 4.2: Managing Apache2 Service
• sites-enabled/: Symbolic links to the files in sites-available for currently enabled websites.
• aws configure: This command sets up your AWS credentials. It prompts for your access
key, secret key, default region, and output format.
• aws ec2 describe-instances: Provides detailed information about all EC2 instances in
your account.
• aws sts get-caller-identity: Verifies your configured AWS user and account details.
These commands allow for powerful automation and management of your cloud infrastructure
without needing to use the web console.
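Taken together, a basic CLI session covering the commands above (including the s3 ls command named in the section heading) looks like this:

aws configure                  # prompts for access key, secret key, default region, and output format
aws s3 ls                      # lists the S3 buckets in the account
aws ec2 describe-instances     # returns detailed JSON for all EC2 instances
aws sts get-caller-identity    # confirms which IAM identity the CLI is using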
23
Chapter 5
24
The docker build -t my-app . command uses a Dockerfile to create an image named my-app.
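A minimal sketch of such a Dockerfile, assuming a static website served by the official Apache httpd image (the paths are illustrative, not the project's actual files):

# Hypothetical Dockerfile for the my-app image (illustrative sketch)
FROM httpd:2.4
COPY ./site/ /usr/local/apache2/htdocs/
EXPOSE 80

Building and running it follows the pattern already described: docker build -t my-app . followed by docker run -d -p 80:80 my-app.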
• Networking: Docker creates a virtual network for containers, allowing them to communicate
with each other using service names as hostnames. In a docker-compose.yml file, the web
service can communicate with the db service simply by using the hostname db (a sketch of such
a file follows this list).
• Volumes: Containers are ephemeral by nature. To ensure data persistence (e.g., for a
database), Docker volumes are used. A volume is a designated area on the host machine
that can be mounted into a container. This ensures that even if the container is deleted,
the data remains.
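A small docker-compose.yml sketch of the web/db arrangement and named volume described above (the image names, credentials, and volume name are illustrative assumptions):

# Hypothetical docker-compose.yml (illustrative sketch)
services:
  web:
    image: my-app            # the application image built earlier
    ports:
      - "80:80"
    depends_on:
      - db
  db:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: example
    volumes:
      - db_data:/var/lib/mysql   # named volume so data survives container removal
volumes:
  db_data: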
25
Chapter 6
26
6.3 Writing Basic Terraform Scripts
Explanation: Terraform uses its own language, HashiCorp Configuration Language (HCL),
to define resources. A basic script typically includes a provider block, which specifies the
cloud service provider (e.g., AWS), and resource blocks, which define the infrastructure
components (e.g., EC2 instances, S3 buckets).
resource "aws_instance" "example" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"
  tags = {
    Name = "My-Terraform-Instance"
  }
}
Listing 6.2: Creating a Simple EC2 Instance
This script defines a single EC2 instance. The ami and instance_type are essential
arguments for creating the instance.
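For completeness, the provider block mentioned at the start of this section typically looks like the following (the region value is an example):

provider "aws" {
  region = "us-east-1"   # example region; replace with the region actually used
}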
• terraform plan: Generates an execution plan showing what changes will be made.
• terraform destroy: Tears down all the resources defined in the configuration.
This workflow ensures that you always know what changes are about to happen, preventing
accidental deletions or misconfigurations.
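Put together, the standard command sequence is:

terraform init      # download the provider plugins and configure the backend
terraform plan      # preview the changes Terraform intends to make
terraform apply     # create or update the resources after confirmation
terraform destroy   # remove everything defined in the configuration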
• Variables: Terraform uses variables to make your code reusable. You can define variables
in a variables.tf file and provide values via a terraform.tfvars file or on the command
line. This allows you to use the same script for different environments (e.g., development,
production); a short sketch follows this list.
• State Management: Terraform maintains a state file (.tfstate) that maps the real-world
resources to your configuration. This file is critical, and for team collaboration, it should
be stored in a remote, shared location, such as an S3 bucket with versioning enabled. A
remote backend ensures that all team members are working with the same, up-to-date
state.
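A minimal sketch of these two ideas (the variable name, bucket, and region below are illustrative assumptions, not the project's actual values):

# variables.tf (illustrative)
variable "instance_type" {
  type    = string
  default = "t2.micro"
}

# terraform.tfvars (illustrative): override per environment
# instance_type = "t3.small"

# backend.tf (illustrative): remote state in a versioned S3 bucket
terraform {
  backend "s3" {
    bucket = "example-terraform-state"
    key    = "dev/terraform.tfstate"
    region = "us-east-1"
  }
}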
27
6.6 Terraform Cloud and Workspaces
Explanation: Terraform Cloud is a managed service that provides a consistent environment
for Terraform workflows. It includes features like remote state management, team
collaboration, and a centralized plan and apply workflow. Workspaces in Terraform allow you
to manage multiple distinct sets of infrastructure with the same configuration. This is ideal
for provisioning separate environments like staging and production, each with its own state.
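On the CLI side, workspaces are managed with a few commands, for example:

terraform workspace new staging        # create and switch to a "staging" workspace
terraform workspace new production     # create (and switch to) a "production" workspace
terraform workspace select staging     # switch back to staging
terraform workspace list               # show all workspaces, marking the active one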
28
Chapter 7
29
7.3 Building CI/CD for a Sample App
Explanation: A typical CI/CD pipeline for a web application might include the following
stages:
4. If tests pass, a Docker image is built and pushed to a container registry (e.g., Docker
Hub).
4. The old container is stopped, and a new one is started with the latest image.
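As an illustration of the build-and-push stages described above, a minimal GitHub Actions workflow might look like the following sketch (the file path, secret names, and image name are assumptions, not the project's actual pipeline):

# Hypothetical .github/workflows/ci.yml (illustrative sketch)
name: build-and-push
on:
  push:
    branches: [ main ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests
        run: echo "run the project's test suite here"
      - name: Build Docker image
        run: docker build -t my-app .
      - name: Push image to Docker Hub
        run: |
          echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin
          docker tag my-app ${{ secrets.DOCKERHUB_USERNAME }}/my-app:latest
          docker push ${{ secrets.DOCKERHUB_USERNAME }}/my-app:latest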
• Static Websites (S3): For a front-end-only application (like the Resume Builder), the
build artifacts (HTML, CSS, JS) can be directly uploaded to an S3 bucket configured for
static website hosting. This is a simple, cost-effective, and highly scalable deployment
method (see the example after this list).
30
• Dynamic Applications (EC2): For applications requiring a backend server, an EC2
instance is used. A CI/CD pipeline can automate the process of building the application,
containerizing it with Docker, and deploying the container onto the EC2 instance. The
EC2 instance acts as the host for the running container.
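For the static-website option, the deployment step can be as simple as syncing the build output to the bucket (the bucket and folder names below are assumptions):

aws s3 sync ./build s3://example-static-site --delete              # upload built files, removing stale ones
aws s3 website s3://example-static-site --index-document index.html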
This chapter summarizes the final stage of the DevOps pipeline, where code transitions from
a developer’s machine to a live, production environment.
31
Chapter 8
Conclusion
This industrial training at Upflairs Private Limited has been an immensely valuable
experience, providing deep practical insights into modern DevOps practices and cloud-native
development. The development of the SaaS Resume Builder from scratch successfully translated
theoretical knowledge into a tangible, real-world application.
• State Management: Keeping the UI in sync with the data without a formal state
management library required careful planning and organization of the JavaScript code.
32
8.4 Future Scope
While the current prototype is fully functional, there are several avenues for future
enhancement:
• Writing unit and integration tests to ensure code quality and reliability.
33
Bibliography
34