
Industrial Training Report

On

DevOps and Cloud Computing

Submitted for the Partial Fulfillment of the Requirements for the


Degree
of

BACHELOR OF TECHNOLOGY
in
CSE(AIML)

Submitted to:
Mr. Chandershekhar Singh
Head of Technology
Dept. of Technology
JIET Group of Institutions, Jodhpur

Submitted by:
Mr. Hemant Vaishnav
JIET/AIML/23/014
23EJIAI041

Department of Technology
Jodhpur Institute of Engineering and Technology
JIET Group of Institutions, Jodhpur
2025-26
Candidate’s Declaration

I hereby declare that the project report titled "DevOps and Cloud Computing", submitted
in partial fulfillment of the requirements for the degree of Bachelor of Technology in Computer
Science and Engineering with a specialization in Artificial Intelligence and Machine Learning,
is a genuine record of my own work. This internship was carried out under the guidance of
Mr. Rajendra Jangid, Internship Coordinator at Upflairs Private Limited Company.

I affirm that the content of this report has not been submitted elsewhere for the award of
any other degree or diploma.

Mr. Hemant Vaishnav


B.Tech (V Semester)
CSE (AIML)
JIET/AIML/23/014

Counter signed by:


Mr. Chandershekhar Singh
Head (Technology)
Department of Technology
JIET Group of Institutions
Jodhpur

Certificate

CERTIFICATE OF INTERNSHIP

This is to certify that Mr. Hemant Vaishnav successfully completed his internship at Upflairs
Private Limited, where he demonstrated a strong understanding of DevOps practices
and cloud computing technologies. His primary project involved deploying and managing a
cloud-based SaaS Resume Builder application, where he effectively utilized AWS services,
Docker, Jenkins, and GitHub Actions for continuous integration and deployment. He
also worked with Terraform for infrastructure as code.

Internship Duration: 01-05-2025 to 15-06-2025

Mr. Rajendra Jangid
Internship Coordinator
Upflairs Private Limited Company
Acknowledgement

I would like to express my sincere gratitude to everyone who supported me throughout my


industrial training. The completion of this report on "DevOps and Cloud Computing"
would not have been possible without their guidance and assistance.
I extend my deepest thanks to my internship coordinator, Mr. Rajendra Jangid at
Upflairs Private Limited Company, for his invaluable mentorship, constant supervision,
and encouragement. His profound knowledge and practical guidance were instrumental in
helping me navigate the challenges of the project and successfully complete my training.
I am also grateful to Mr. Chandershekhar Singh, Head (Technology), for providing me
with this opportunity and for his support in the preparation of this report.
Finally, I would like to thank my parents, friends, and the entire team at Upflairs for their
unwavering support and for creating a conducive learning environment.

Mr. Hemant Vaishnav


B.Tech. (CSE(AIML))
Roll No.: 23EJIAI041

Date: [Date of Submission]
Place: Jodhpur

Abstract

This report details the work and learning undertaken during an industrial training program
focused on DevOps and Cloud Computing at Upflairs Private Limited Company. The training
provided in-depth practical experience in deploying and managing scalable cloud-based
applications.
The core project during this internship was implementing Infrastructure as Code (IaC) using
Terraform on AWS, aimed at strengthening skills in automated cloud provisioning. Key AWS
services like EC2, S3, and IAM were configured and managed through Terraform scripts to
ensure consistent and scalable deployments.
Firebase was integrated for user authentication and Firestore for real-time database
operations. The training also covered CI/CD pipelines using GitHub Actions, along with
basic monitoring and logging techniques. This hands-on experience provided a solid
foundation in modern DevOps practices and cloud-native development.

Contents

Candidate’s Declaration 1

Certificate 2

Acknowledgement 3

Abstract 4

1 Introduction 9
1.1 Background and Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2 Importance of DevOps and Cloud Computing . . . . . . . . . . . . . . . . . . . 9
1.3 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.4 Problem Statement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.5 Aims and Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.6 Report Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

2 Tools and Environment Setup 11


2.1 Operating System and Software Requirements . . . . . . . . . . . . . . . . . . . 11
2.2 Installing Git and Github . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.3 Introduction to AWS CLI and IAM . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.1 To Install AWS CLI: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3.2 Configure AWS CLI: . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.4 Setting up VS Code and Extensions . . . . . . . . . . . . . . . . . . . . . . . . 12
2.5 Installing and Configuring Terraform . . . . . . . . . . . . . . . . . . . . . . . . 13
2.6 Installing Docker and Docker Compose . . . . . . . . . . . . . . . . . . . . . . . 14

3 DevOps Projects 16
3.1 Introduction to Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2 Tools and Technologies Used . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3 Small Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.3.1 Hosting Website with Nginx and Jenkins . . . . . . . . . . . . . . . . . . 16
3.3.2 Hosting Website on AWS EC2 with Apache . . . . . . . . . . . . . . . . 17
3.3.3 Running Git Repo using Docker . . . . . . . . . . . . . . . . . . . . . . . 18
3.3.4 Creating an Instance with Terraform . . . . . . . . . . . . . . . . . . . . 18
3.4 Major Project: Automated Website Deployment Using Terraform, Docker, and
AWS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.5 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

4 Basic Ubuntu & AWS CLI Commands 21


4.1 Basic Linux Commands: ls, cd, pwd, mkdir . . . . . . . . . . . . . . . . . . . . . 21
4.2 File Operations: cp, mv, rm, touch . . . . . . . . . . . . . . . . . . . . . . . . . 21

4.3 Text Editors: nano, vim . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
4.4 Permissions and sudo Usage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.5 System Directories: /etc, /var, /home . . . . . . . . . . . . . . . . . . . . . . . . 22
4.6 Apache2 Web Server: Installation & Configuration . . . . . . . . . . . . . . . . . 22
4.6.1 Installing Apache2 using apt . . . . . . . . . . . . . . . . . . . . . . . . . 22
4.6.2 Starting and Enabling Apache2 Service . . . . . . . . . . . . . . . . . . . 23
4.6.3 Default Web Root: /var/www/html . . . . . . . . . . . . . . . . . . . . . 23
4.6.4 Config Files in /etc/apache2 . . . . . . . . . . . . . . . . . . . . . . . . . 23
4.6.5 Restarting and Checking Status . . . . . . . . . . . . . . . . . . . . . . . 23
4.7 AWS CLI Basics: aws configure, s3 ls, ec2 describe . . . . . . . . . . . . . . . . 23

5 Containerization with Docker 24


5.1 Introduction to Docker . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.2 Docker Images and Containers . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.3 Writing Dockerfiles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.4 Docker Compose Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.5 Container Networking and Volumes . . . . . . . . . . . . . . . . . . . . . . . . . 25

6 Infrastructure as Code with Terraform 26


6.1 Introduction to IaC and Terraform . . . . . . . . . . . . . . . . . . . . . . . . . 26
6.2 Terraform Installation and Setup . . . . . . . . . . . . . . . . . . . . . . . . . . 26
6.3 Writing Basic Terraform Scripts . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
6.4 Provisioning AWS Services with Terraform . . . . . . . . . . . . . . . . . . . . . 27
6.5 Variables and State Management . . . . . . . . . . . . . . . . . . . . . . . . . . 27
6.6 Terraform Cloud and Workspaces . . . . . . . . . . . . . . . . . . . . . . . . . . 28

7 CI/CD and Deployment 29


7.1 Introduction to CI/CD Pipelines . . . . . . . . . . . . . . . . . . . . . . . . . . 29
7.2 GitHub Actions for Automation . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
7.3 Building CI/CD for a Sample App . . . . . . . . . . . . . . . . . . . . . . . . . 30
7.4 Docker Deployment via CI/CD . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
7.5 Deployment to AWS (EC2 / S3) . . . . . . . . . . . . . . . . . . . . . . . . . . . 30

8 Conclusion 32
8.1 Summary of Achievements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.2 Challenges Faced . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.3 Learning Outcomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
8.4 Future Scope . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33


List of Figures

3.1 Nginx and Jenkins Website Hosting . . . . . . . . . . . . . . . . . . . . . . . . . 17


3.2 Apache Website on EC2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.3 Dockerized Git Project . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.4 Terraform EC2 Instance Creation . . . . . . . . . . . . . . . . . . . . . . . . . . 18
3.5 Automated Deployment Pipeline . . . . . . . . . . . . . . . . . . . . . . . . . . 19

List of Tables

3.1 Tools and Technologies Used in Projects . . . . . . . . . . . . . . . . . . . . . . 16

4.1 Fundamental Linux File System Commands . . . . . . . . . . . . . . . . . . . . 21

Chapter 1

Introduction

1.1 Background and Context


The traditional separation of software development and IT operations often led to
inefficiencies, miscommunication, and slow product delivery. To overcome these challenges,
DevOps emerged as a methodology that integrates both disciplines, promoting collaboration,
automation, and continuous improvement. At the same time, cloud computing revolutionized
IT infrastructure by offering scalable, flexible, and cost-effective services. These developments
transformed how businesses build, deploy, and manage applications. Together, DevOps and
cloud computing create a dynamic environment that supports agile development and rapid
delivery. This section outlines the evolution of these technologies and their significance in
reshaping modern IT practices and business strategies.

1.2 Importance of DevOps and Cloud Computing


DevOps and cloud computing are critical in accelerating software development and
deployment. DevOps promotes automation, collaboration, and continuous integration,
reducing delays and improving software quality. Cloud computing provides scalable
infrastructure and flexibility, enabling teams to deploy applications quickly and
cost-effectively. When combined, they offer powerful capabilities that support innovation,
agility, and resilience. Their adoption is essential for organizations aiming to remain
competitive in today’s fast-paced, technology-driven environment.

1.3 Motivation
The motivation behind this report stems from the increasing reliance on digital solutions and
the need for efficient software delivery methods. Traditional development models often fall
short in meeting modern demands for speed and flexibility. The combination of DevOps and
cloud computing presents an opportunity to address these issues by enhancing collaboration,
automating processes, and enabling rapid deployments. This study aims to explore how these
technologies can be successfully integrated to improve productivity and reduce operational
risks. The researcher’s interest in emerging IT trends and the real-world impact of technology
adoption further motivates a deeper investigation into these transformative practices.

1.4 Problem Statement
Despite the growing adoption of DevOps and cloud computing, many organizations face
challenges in achieving effective integration. These issues often prevent them from realizing
the full benefits of speed, efficiency, and scalability:

• Lack of Clear Strategy: Without a structured implementation plan, efforts become
fragmented and inconsistent.

• Cultural Resistance: Teams often resist change, limiting collaboration between
development and operations.

• Skills Gap: A shortage of cloud and DevOps expertise hinders successful adoption.

• Cloud Complexity: Managing security, compliance, and costs in the cloud adds
operational difficulty.

1.5 Aims and Objectives


The aim of this report is to explore how DevOps and cloud computing can be effectively
integrated to improve software development and operations. It focuses on identifying key
benefits, challenges, and best practices.
Objectives:

• Understand the core principles of DevOps and cloud computing

• Analyze real-world implementation cases

• Identify common challenges in integration

• Recommend effective strategies and tools for successful adoption

1.6 Report Outline


This section provides a structured overview of the report's contents, guiding the reader
through each chapter. After this introduction, Chapter 2 covers the setup of the tools and
environment used during the training, including Git, the AWS CLI, VS Code, Terraform, and
Docker. Chapter 3 presents the small and major DevOps projects undertaken. Chapter 4
reviews basic Ubuntu and AWS CLI commands, and Chapter 5 introduces containerization
with Docker. Chapter 6 explains Infrastructure as Code with Terraform, and Chapter 7
describes CI/CD pipelines and deployment to AWS. Finally, Chapter 8 summarizes the
achievements, challenges, learning outcomes, and future scope of the training. This outline
acts as a roadmap, enhancing the report's readability and allowing readers to anticipate the
logical progression of ideas.

Chapter 2

Tools and Environment Setup

In this chapter, we’ll go through the setup of all the essential tools and software required for
working in DevOps and Cloud Computing. These tools help us write code, manage cloud
infrastructure, work with containers, and automate tasks.

2.1 Operating System and Software Requirements


To work with DevOps tools smoothly, a Linux-based operating system is recommended. For
this internship, Ubuntu was used because:

• It’s free and open-source


• It supports most DevOps tools like Docker, Git, AWS CLI, and Terraform
• It gives hands-on experience with command-line tools

Minimum Software Requirements:

• Ubuntu 20.04 or later (or any Linux distro)


• Internet connection
• Minimum 4 GB RAM (8 GB recommended)
• 20 GB disk space

2.2 Installing Git and Github


Git is a version control system. It allows us to track changes in code and collaborate with
others.
GitHub is an online platform to host repositories and collaborate.
sudo apt update
sudo apt install git

Listing 2.1: Steps to Install Git
To verify installation:
git --version

Listing 2.2: Steps to Verify Installation
Configure Git:
git config --global user.name "Your Name"
git config --global user.email "you@example.com"

Listing 2.3: Steps to Configure Git
GitHub Setup:
GitHub is a cloud-based platform that enables developers to host, share, and collaborate on
code repositories. After installing Git on a local system, users can create a GitHub account
by visiting https://github.com. To securely interact with GitHub from the terminal, it is
recommended to generate an SSH key pair using ssh-keygen and add the public key to the
GitHub account under “SSH and GPG keys.” This allows for password-less authentication
during Git operations like clone, push, and pull. Once configured, users can create
repositories on GitHub and connect them to local Git repositories using git remote add origin.
This setup enables seamless version control and collaboration, making GitHub an essential
tool in modern DevOps workflows.
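As a quick, illustrative sketch of that flow (the email address, username, and repository name
below are placeholders, not values from this training):

ssh-keygen -t ed25519 -C "you@example.com"        # generate an SSH key pair
cat ~/.ssh/id_ed25519.pub                          # paste this into GitHub under "SSH and GPG keys"
git remote add origin git@github.com:your-username/your-repo.git
git push -u origin main                            # push the local branch to GitHub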

2.3 Introduction to AWS CLI and IAM


AWS CLI (Command Line Interface) helps to manage AWS services from the terminal.
IAM (Identity and Access Management) is used to securely control access to AWS
services.

2.3.1 To Install AWS CLI:

sudo apt install awscli

Listing 2.4: To Install AWS CLI

2.3.2 Configure AWS CLI:

aws configure
Listing 2.5: Configure AWS CLI
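Once configured, it is worth verifying that the CLI is using the intended IAM identity. A
minimal check (the second command assumes the user has the iam:ListUsers permission):

aws sts get-caller-identity   # shows the account ID and user ARN in use
aws iam list-users            # lists IAM users in the account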

2.4 Setting up VS Code and Extensions


• VS Code is a free, cross-platform code editor with a rich ecosystem of extensions.

• It supports syntax highlighting, terminal access, debugging, and extension integration.

• Installing extensions for Terraform, Docker, AWS, and Git tailors VS Code to your
DevOps stack.

Why It Matters:

• Productivity is greatly improved when your editor understands your tools and code
syntax.

• Extensions provide real-time linting, code completion, and CLI integration.

• VS Code becomes a central hub where you write, edit, test, and push code — all from
one place.
Install VS Code (Ubuntu):

wget -qO- https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > packages.microsoft.gpg
sudo install -o root -g root -m 644 packages.microsoft.gpg /etc/apt/trusted.gpg.d/
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/vscode stable main" > /etc/apt/sources.list.d/vscode.list'
sudo apt update
sudo apt install -y code

Listing 2.6: Install VS Code - Ubuntu
Install Extensions:
code --install-extension ms-azuretools.vscode-docker
code --install-extension hashicorp.terraform
code --install-extension amazonwebservices.aws-toolkit-vscode
code --install-extension eamodio.gitlens

Listing 2.7: Install Extensions
These extensions help with:
• Docker management
• Terraform code editing
• AWS integration
• Git tracking and logs

2.5 Installing and Configuring Terraform


Terraform is an open-source tool that lets you define and provision cloud infrastructure
using code (Infrastructure as Code - IaC).
Commands:

curl -fsSL https://apt.releases.hashicorp.com/gpg | sudo gpg --dearmor -o /usr/share/keyrings/hashicorp-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/hashicorp-archive-keyring.gpg] https://apt.releases.hashicorp.com $(lsb_release -cs) main" | sudo tee /etc/apt/sources.list.d/hashicorp.list > /dev/null
sudo apt update
sudo apt install -y terraform
terraform -version

Listing 2.8: Installing Terraform

2.6 Installing Docker and Docker Compose


Docker lets you run applications in isolated containers. Docker Compose helps manage
multi-container setups using a single file. Both are vital for building and deploying modern
applications.
Install Docker:
sudo apt install -y ca-certificates curl gnupg
sudo install -m 0755 -d /etc/apt/keyrings
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
sudo apt update
sudo apt install -y docker-ce docker-ce-cli containerd.io docker-buildx-plugin docker-compose-plugin
Listing 2.9: Install Docker
Add user to Docker group:
sudo usermod -aG docker $USER

Listing 2.10: Add User to Docker Group
Test Docker:
docker run hello-world
Listing 2.11: Test Docker
Chapter 3

DevOps Projects

This chapter provides an overview of various projects undertaken to apply DevOps principles
and technologies. These projects demonstrate a practical understanding of tools like Docker,
Terraform, Apache, and GitHub Actions, and how they can be used to automate the software
delivery lifecycle.

3.1 Introduction to Projects


This report contains small and major DevOps projects involving tools like Docker, Terraform,
Jenkins, Apache, GitHub, and AWS EC2. These projects are intended to automate
infrastructure and web deployment tasks. The following sections detail the goals, technologies
used, and key code snippets for each project.

3.2 Tools and Technologies Used


The following table provides a quick reference for the tools and technologies utilized
throughout the projects.

Tool/Technology | Purpose
Docker | Containerizing and running applications
Terraform | Provisioning infrastructure as code (IaC)
AWS EC2 | Hosting applications on virtual machines
Apache | Web server for serving websites
GitHub | Version control and remote repositories
Jenkins | Automating deployment pipelines

Table 3.1: Tools and Technologies Used in Projects

3.3 Small Projects

3.3.1 Hosting Website with Nginx and Jenkins


This project demonstrates a basic CI/CD pipeline using Jenkins to automatically deploy a
static website with Nginx. GitHub Link:
https://github.com/WebAddicted9/Nginx_Jenkins.git
Code Snippet (Jenkins Pipeline):

Figure 3.1: Nginx and Jenkins Website Hosting

pipeline {
    agent any
    stages {
        stage('Clone Repo') {
            steps {
                git 'https://github.com/your-repo.git'
            }
        }
        stage('Deploy with Nginx') {
            steps {
                sh 'sudo systemctl restart nginx'
            }
        }
    }
}
Listing 3.1: Jenkinsfile Script

3.3.2 Hosting Website on AWS EC2 with Apache


This project focuses on manual deployment on an AWS EC2 instance. The process involves
installing Apache and copying the website files to the web root directory using a user data
script. GitHub Link:
https://github.com/WebAddicted9/Animated-portfolio-website.git

Figure 3.2: Apache Website on EC2

Code Snippet (User Data Script):


#!/bin/bash
sudo apt update -y
sudo apt install apache2 -y
git clone https://github.com/your-repo.git
cp -r your-repo/* /var/www/html/
Listing 3.2: Apache Install Script on EC2

3.3.3 Running Git Repo using Docker


This project shows how to containerize a web application using Docker. A Dockerfile is used
to create an Nginx image that serves the website content. GitHub Link:
https://github.com/WebAddicted9/Docker_Project.git

Figure 3.3: Dockerized Git Project

Code Snippet (Dockerfile):


FROM nginx
COPY ./website /usr/share/nginx/html
Listing 3.3: Dockerfile for Website

3.3.4 Creating an Instance with Terraform


This project uses Infrastructure as Code (IaC) to automatically provision an AWS EC2
instance. Terraform scripts define the desired state of the infrastructure. GitHub Link:
https://github.com/WebAddicted9/Instance_using_Terra.git

Figure 3.4: Terraform EC2 Instance Creation

Code Snippet (Terraform Code):


provider "aws" {
  region = "ap-south-1"
}

resource "aws_instance" "web" {
  ami           = "ami-xxxxxxxxxxxxx"
  instance_type = "t2.micro"
  tags = {
    Name = "Terraform-Instance"
  }
}
Listing 3.4: Terraform Configuration

3.4 Major Project: Automated Website Deployment Using Terraform, Docker, and AWS

This project showcases a complete DevOps pipeline where a static website is deployed in an
automated and scalable manner using Terraform, AWS EC2, Docker, Apache HTTP Server,
and GitHub.

Figure 3.5: Automated Deployment Pipeline

Steps Involved:

• Provision infrastructure using Terraform (IaC)

• Create a Docker image for the website

• Host the containerized website on EC2

• Use Apache to serve the content

• Pull code from GitHub and deploy automatically

Important Code Snippets: Docker + Apache Setup:


sudo apt update
sudo apt install apache2 docker.io -y
git clone https://github.com/your-repo.git
cd your-repo
docker build -t mywebsite .
docker run -d -p 80:80 mywebsite
Listing 3.5: Install Apache and Run Docker
Terraform with Docker User Data Script:

resource "aws_instance" "docker_web" {
  ami           = "ami-xxxxxxxxxxxxx"
  instance_type = "t2.micro"

  user_data = <<-EOF
    #!/bin/bash
    apt update -y
    apt install docker.io -y
    git clone https://github.com/your-repo.git
    cd your-repo
    docker build -t site .
    docker run -d -p 80:80 site
  EOF
}
Listing 3.6: Terraform with Docker Setup

3.5 Conclusion
These projects enabled practical learning of DevOps concepts by integrating automation,
containerization, and cloud infrastructure. Using tools like Terraform, Docker, Jenkins, and
AWS, I was able to deploy web applications efficiently and repeatably.

Chapter 4

Basic Ubuntu & AWS CLI Commands

4.1 Basic Linux Commands: ls, cd, pwd, mkdir


Explanation: These are the most fundamental file system commands in Linux, crucial for
navigating and organizing files. They form the backbone of any command-line interaction.

Command | Purpose
ls | Lists files and directories in the current path
cd | Changes directory
pwd | Prints the current working directory
mkdir | Creates a new directory

Table 4.1: Fundamental Linux File System Commands

4.2 File Operations: cp, mv, rm, touch


Explanation: These commands allow for managing files and directories by copying, moving,
renaming, and deleting them. They are essential for day-to-day administration and scripting.

• cp (Copy): cp file1.txt file2.txt creates a copy of file1.txt as file2.txt.

• mv (Move): mv old_name.txt new_name.txt renames a file. mv file.txt /path/to/directory
moves a file to a new location.

• rm (Remove): rm file.txt deletes a file. rm -r directory recursively deletes a
directory and its contents.

• touch (Create): touch new_file.txt creates an empty new file.

4.3 Text Editors: nano, vim


Explanation: When working on a server without a graphical interface, you need a
command-line text editor to modify configuration files or scripts.

• nano: A simple, beginner-friendly editor. It displays command shortcuts at the bottom
of the screen, making it easy to learn. To open a file: nano filename.txt.

• vim: A powerful, advanced editor. It has a steeper learning curve but offers high efficiency
for experienced users. To open a file: vim filename.txt. To edit, you must enter "insert"
mode by pressing i. To save and exit, press Esc and type :wq.

4.4 Permissions and sudo Usage


Explanation: Linux is built on a strong security model that uses permissions to control
access to files and directories. sudo (superuser do) is a command that allows a permitted user
to execute a command as the superuser (root) or another user.

• Permissions: The ls -l command shows file permissions for the owner, group, and
others (e.g., -rw-r--r--).

• chmod: Changes permissions (e.g., chmod 755 script.sh makes a script executable for
all users).

• chown: Changes the owner and group of a file.

• sudo: Prefacing a command with sudo (e.g., sudo apt update) temporarily grants
administrative privileges, allowing you to install software or modify system files, as shown
in the sketch below.
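A short combined sketch of these commands (the file name and ownership values are
illustrative):

ls -l script.sh                      # shows e.g. -rw-r--r-- for owner/group/others
chmod 755 script.sh                  # owner: rwx; group and others: r-x
sudo chown root:www-data script.sh   # change owner and group (requires sudo)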

4.5 System Directories: /etc, /var, /home


Explanation: The Linux filesystem has a standard structure where specific directories are
used for specific purposes.

• /etc: Contains all system-wide configuration files (e.g., Apache2 configs, network
settings).

• /var: Stores variable data, such as log files (/var/log), web server files (/var/www), and
mail queues.

• /home: The directory where personal user files are stored. Each user has a subdirectory
here (e.g., /home/rahul).

4.6 Apache2 Web Server: Installation & Configuration

4.6.1 Installing Apache2 using apt


Apache2 is a widely used open-source web server. It’s often the first service installed on a
Linux server to host a website.
sudo apt update
sudo apt install apache2
Listing 4.1: Installing Apache2
After installation, the Apache2 service automatically starts. You can verify it by navigating
to your server’s public IP address in a web browser.

4.6.2 Starting and Enabling Apache2 Service
The systemctl command is used to manage system services.
sudo systemctl start apache2    # Starts the service
sudo systemctl enable apache2   # Enables it to start on boot
sudo systemctl status apache2   # Checks its status
Listing 4.2: Managing Apache2 Service

4.6.3 Default Web Root: /var/www/html


By default, Apache2 serves content from the /var/www/html directory. Any HTML files
placed in this directory will be accessible via the web server.
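For example, a minimal test page can be dropped in place with a single command (the page
content is illustrative):

echo "<h1>It works!</h1>" | sudo tee /var/www/html/index.html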

4.6.4 Config Files in /etc/apache2


The main configuration files for Apache2 are located in /etc/apache2.

• apache2.conf: The main configuration file.

• sites-available/: Contains configuration files for different websites (virtual hosts).

• sites-enabled/: Symbolic links to the files in sites-available for currently enabled
websites.

4.6.5 Restarting and Checking Status


Whenever you change a configuration file, you must restart the service for the changes to take
effect.
sudo systemctl restart apache2
Listing 4.3: Restarting Apache2

4.7 AWS CLI Basics: aws configure, s3 ls, ec2 describe


Explanation: The AWS CLI is an essential tool for managing AWS resources directly from
the command line.

• aws configure: This command sets up your AWS credentials. It prompts for your access
key, secret key, default region, and output format.

• aws s3 ls: Lists all S3 buckets in your account.

• aws ec2 describe-instances: Provides detailed information about all EC2 instances in
your account.

• aws sts get-caller-identity: Verifies your configured AWS user and account details.

These commands allow for powerful automation and management of your cloud infrastructure
without needing to use the web console.
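As a small extension of the commands above, output can be narrowed with the --query and
--filters options; a hedged sketch:

# Show only instance IDs and states instead of the full JSON dump
aws ec2 describe-instances --query "Reservations[].Instances[].[InstanceId,State.Name]" --output table

# Ask the API for running instances only
aws ec2 describe-instances --filters "Name=instance-state-name,Values=running"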

Chapter 5

Containerization with Docker

5.1 Introduction to Docker


Containerization has revolutionized software development by packaging an application and all
its dependencies into a single, isolated unit called a container. This ensures the application
runs consistently across different environments, from a developer’s laptop to a production
server. Docker is the most popular platform for building, shipping, and running these
containers.

5.2 Docker Images and Containers


Explanation: The relationship between an image and a container is similar to that of a
class and an object in object-oriented programming.
• Docker Image: A lightweight, standalone, executable package that includes everything
needed to run an application—the code, a runtime, system tools, system libraries, and
settings. Images are read-only templates.
• Docker Container: A runnable instance of a Docker image. You can start, stop, move,
or delete a container. Each container is isolated from the host and other containers,
providing a secure and consistent environment.
A command like docker run hello-world pulls the hello-world image and creates a
container from it.
docker run -d -p 80:80 nginx
Listing 5.1: Running a Docker Container
This command runs the Nginx web server image, mapping port 80 of the host machine to
port 80 of the container. The -d flag runs the container in detached mode.
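A few everyday lifecycle commands complete the picture (replace <container> with a
container ID or name):

docker ps                 # list running containers
docker ps -a              # include stopped containers
docker logs <container>   # view a container's output
docker stop <container>   # stop a running container
docker rm <container>     # remove a stopped container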

5.3 Writing Dockerfiles


Explanation: A Dockerfile is a text file that contains a series of instructions for building a
Docker image. It’s the blueprint for your containerized application.
# Use a base image
FROM nginx
COPY ./website /usr/share/nginx/html
Listing 5.2: Example Dockerfile

The docker build -t my-app . command uses this Dockerfile to create an image named
my-app.

5.4 Docker Compose Basics


Explanation: While a single container is great for simple applications, most real-world
applications consist of multiple services (e.g., a web server, a database, a cache). Docker
Compose is a tool for defining and running multi-container Docker applications. It uses a
single YAML file to configure all application services.
version: '3.8'
services:
  web:
    build: .
    ports:
      - "80:80"
  db:
    image: postgres:14
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
Listing 5.3: Example docker-compose.yml
With this file, you can start all services with a single command: docker compose up.

5.5 Container Networking and Volumes


Explanation: Docker provides tools for managing communication between containers and
for persistent storage.

• Networking: Docker creates a virtual network for containers, allowing them to
communicate with each other using service names as hostnames. In the docker-compose.yml
example above, the web service can communicate with the db service simply by using the
hostname db.

• Volumes: Containers are ephemeral by nature. To ensure data persistence (e.g., for a
database), Docker volumes are used. A volume is a designated area on the host machine
that can be mounted into a container. This ensures that even if the container is deleted,
the data remains; a short command sketch follows this list.
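A hedged sketch of both ideas using plain docker commands (the network, volume, and
password values are illustrative):

docker network create appnet   # user-defined network; containers resolve each other by name
docker volume create dbdata    # named volume that outlives any single container
docker run -d --name db --network appnet -e POSTGRES_PASSWORD=password -v dbdata:/var/lib/postgresql/data postgres:14
docker run -d --name web --network appnet -p 80:80 nginx   # can reach the database at hostname "db"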

Chapter 6

Infrastructure as Code with Terraform

6.1 Introduction to IaC and Terraform


Infrastructure as Code (IaC) is the practice of managing and provisioning infrastructure
through machine-readable definition files, rather than through manual configuration. IaC
allows you to treat your infrastructure like software, using version control and automation to
ensure consistency and repeatability. Terraform is a popular open-source IaC tool created by
HashiCorp. It enables you to define and manage cloud resources from a single configuration
file.

6.2 Terraform Installation and Setup


Explanation: Terraform can be installed on various operating systems. The commands to
install Terraform on Ubuntu were covered in Chapter 2. Once installed, the setup involves
writing a configuration file that defines the desired state of your infrastructure.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 4.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

# Optional: Configure a remote backend for state management
terraform {
  backend "s3" {
    bucket = "my-terraform-state-bucket"
    key    = "path/to/my/key.tfstate"
    region = "us-east-1"
  }
}
Listing 6.1: Provider and Backend Configuration
The terraform init command initializes the working directory, downloading the necessary
provider plugins.

6.3 Writing Basic Terraform Scripts
Explanation: Terraform uses its own language, HashiCorp Configuration Language (HCL),
to define resources. A basic script typically includes a provider block, which specifies the
cloud service provider (e.g., AWS), and resource blocks, which define the infrastructure
components (e.g., EC2 instances, S3 buckets).
resource "aws_instance" "example" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t2.micro"
  tags = {
    Name = "My-Terraform-Instance"
  }
}
Listing 6.2: Creating a Simple EC2 Instance
This script defines a single EC2 instance. The ami and instance_type are essential
arguments for creating the instance.

6.4 Provisioning AWS Services with Terraform


Explanation: Terraform can manage a wide range of AWS services. The typical workflow
is:

• terraform init: Initializes the directory.

• terraform plan: Generates an execution plan showing what changes will be made.

• terraform apply: Applies the changes, provisioning the resources.

• terraform destroy: Tears down all the resources defined in the configuration.

This workflow ensures that you always know what changes are about to happen, preventing
accidental deletions or misconfigurations.
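In practice the loop looks like this (the plan file name is arbitrary):

terraform init                 # download providers and set up the backend
terraform plan -out=tfplan     # preview changes and save the plan
terraform apply tfplan         # apply exactly the saved plan
terraform destroy              # tear down the managed resources when finished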

6.5 Variables and State Management


Explanation:

• Variables: Terraform uses variables to make your code reusable. You can define variables
in a variables.tf file and provide values via a terraform.tfvars file or on the command
line. This allows you to use the same script for different environments (e.g., development,
production); see the sketch after this list.

• State Management: Terraform maintains a state file (.tfstate) that maps the real-world
resources to your configuration. This file is critical, and for team collaboration, it should
be stored in a remote, shared location, such as an S3 bucket with versioning enabled. A
remote backend ensures that all team members are working with the same, up-to-date
state.
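A minimal sketch of a variable and an override, written here as shell heredocs for brevity
(the variable name and values are illustrative):

cat > variables.tf <<'EOF'
variable "instance_type" {
  type    = string
  default = "t2.micro"   # used when no other value is supplied
}
EOF

# Reference it as var.instance_type in a resource, then override per environment:
terraform plan -var="instance_type=t3.large"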

6.6 Terraform Cloud and Workspaces
Explanation: Terraform Cloud is a managed service that provides a consistent environment
for Terraform workflows. It includes features like remote state management, team
collaboration, and a centralized plan and apply workflow. Workspaces in Terraform allow you
to manage multiple distinct sets of infrastructure with the same configuration. This is ideal
for provisioning separate environments like staging and production, each with its own state.
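For example, per-environment state can be handled with the workspace subcommands:

terraform workspace new staging      # create and switch to a "staging" workspace
terraform workspace new production
terraform workspace select staging   # switch back before planning or applying
terraform workspace list             # the current workspace is marked with *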

Chapter 7

CI/CD and Deployment

7.1 Introduction to CI/CD Pipelines


CI/CD (Continuous Integration/Continuous Deployment) is a core DevOps practice that
automates the process of building, testing, and deploying software. A CI/CD pipeline
automates the steps from code commit to production, reducing human error and accelerating
the software delivery lifecycle.

7.2 GitHub Actions for Automation


Explanation: GitHub Actions is a CI/CD platform that allows you to automate software
workflows directly from your GitHub repository. It uses YAML files to define workflows that
trigger on various events, such as a code push or a pull request.
name: CI/CD Pipeline

on:
  push:
    branches:
      - main

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'
      - name: Install dependencies
        run: npm install
      - name: Run tests
        run: npm test
Listing 7.1: Basic GitHub Actions Workflow
This workflow automatically runs tests whenever new code is pushed to the main branch.

7.3 Building CI/CD for a Sample App
Explanation: A typical CI/CD pipeline for a web application might include the following
stages:

• Continuous Integration (CI):

1. Code is pushed to a Git repository.

2. A GitHub Actions workflow is triggered.

3. The workflow installs dependencies and runs tests.

4. If tests pass, a Docker image is built and pushed to a container registry (e.g., Docker
Hub), as sketched in the commands after this list.

• Continuous Deployment (CD):

1. The successful image build triggers the deployment stage.

2. The workflow connects to the AWS EC2 instance.

3. It pulls the latest Docker image from the registry.

4. The old container is stopped, and a new one is started with the latest image.
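The build-and-push step of the CI half typically reduces to commands like these (the registry
user, image name, and token variable are placeholders):

docker build -t your-dockerhub-user/my-app:latest .
echo "$DOCKERHUB_TOKEN" | docker login -u your-dockerhub-user --password-stdin
docker push your-dockerhub-user/my-app:latest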

7.4 Docker Deployment via CI/CD


Explanation: Docker images are a perfect artifact for CI/CD. After the build stage of the
pipeline, the image is created. Instead of deploying code, you deploy the entire image, which
guarantees that the application and its environment are exactly the same in every stage.
# SSH into the EC2 instance
ssh -i "key.pem" ubuntu@<ec2-public-ip>

# Pull the new image and run the container
docker pull my-app:latest
docker stop my-old-container || true
docker rm my-old-container || true
docker run --name my-app -p 80:3000 -d my-app:latest
Listing 7.2: Deploying a Docker Image to a Server

7.5 Deployment to AWS (EC2 / S3)


Explanation: Depending on the application, deployment to AWS can take different forms:

• Static Websites (S3): For a front-end-only application (like the Resume Builder), the
build artifacts (HTML, CSS, JS) can be directly uploaded to an S3 bucket configured for
static website hosting. This is a simple, cost-effective, and highly scalable deployment
method; a one-line example follows this list.
• Dynamic Applications (EC2): For applications requiring a backend server, an EC2
instance is used. A CI/CD pipeline can automate the process of building the application,
containerizing it with Docker, and deploying the container onto the EC2 instance. The
EC2 instance acts as the host for the running container.
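For the static case, deployment can be a single sync of the build output to the bucket (the
bucket name is illustrative):

aws s3 sync ./build s3://my-resume-builder-site --delete   # upload changes, remove stale files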

This chapter summarizes the final stage of the DevOps pipeline, where code transitions from
a developer’s machine to a live, production environment.

Chapter 8

Conclusion

This industrial training at Upflairs Private Limited has been an immensely valuable
experience, providing deep practical insights into modern web development and deployment. The development
of the SaaS Resume Builder from scratch successfully translated theoretical knowledge into a
tangible, real-world application.

8.1 Summary of Achievements


The project successfully met all its objectives. A fully functional application was built using
HTML, CSS, and JavaScript. Backend services for authentication and database management
were seamlessly integrated using the Firebase API. Furthermore, essential concepts of web
optimization were learned and applied, enhancing the overall quality of the application.

8.2 Challenges Faced


Several challenges were encountered and overcome during the development process:

• Managing Asynchronous Code: Understanding and correctly implementing Promises
and async/await for API calls was initially challenging but crucial for the application's
functionality.

• State Management: Keeping the UI in sync with the data without a formal state
management library required careful planning and organization of the JavaScript code.

• Cross-browser Compatibility: Ensuring the application looked and behaved consistently
across different web browsers required testing and occasional CSS adjustments.

8.3 Learning Outcomes


The key takeaway from this training is the importance of hands-on practice and the
understanding of how different technologies come together to create a cohesive product. The
experience of working with APIs, managing user data securely, and focusing on user
experience has provided a robust foundation for a future career in software development.

8.4 Future Scope
While the current prototype is fully functional, there are several avenues for future
enhancement:

• Implementing a "Download as PDF" feature using a library like jsPDF.

• Adding multiple resume templates for users to choose from.

• Integrating an AI-powered feature to provide suggestions for improving resume content.

• Enhancing the UI with more advanced animations and transitions.

• Writing unit and integration tests to ensure code quality and reliability.

