This project automates the provisioning and deployment of a basic Apache web application using Python-based DevOps tools. It showcases the integration of AWS (via Boto3), SSH-based automation (via Fabric), and infrastructure scripting to streamline web server deployment.
Goal:
Provision an EC2 instance, upload and install Apache2 using a shell script, and manage infrastructure using Python automation.
- `run_ec2_instances.py`: Automates the creation of an EC2 instance using Boto3 with custom tags, a security group, a subnet, and monitoring options.
- `create_s3_bucket.py`: Creates a private S3 bucket using Boto3 for optional static file storage or artifact management.
- `ssh_into_instance.py`: Uses the Fabric library to:
  - SSH into the EC2 instance via a PEM key
  - Upload the Apache setup script
  - Make the script executable and run it with `sudo`
- `apache.sh`: A Bash script executed remotely to:
  - Install Apache2
  - Enable and start the service
  - Display the server’s public IP address
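A plausible reconstruction of `apache.sh` from the bullets above; the exact script in the repo may differ. It assumes the Ubuntu AMI named in the stack below and that the script is invoked with `sudo`, so the commands run as root:

```shell
#!/bin/bash
# Sketch of apache.sh (invoked with sudo); package and service names assume Ubuntu.
set -e

apt-get update -y
apt-get install -y apache2     # install Apache2
systemctl enable apache2       # enable the service at boot
systemctl start apache2        # start the service now
# Display the public IP via the EC2 instance metadata endpoint
# (only reachable from inside the instance).
echo "Apache running at: http://$(curl -s http://169.254.169.254/latest/meta-data/public-ipv4)"
```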
- Python 3
- Boto3 – AWS automation SDK
- Fabric – SSH and remote task automation
- AWS EC2 – Cloud compute provisioning
- AWS S3 – Cloud object storage
- Apache2 – Web server installed via Bash
- Ubuntu – EC2 AMI base OS
- Provision an EC2 instance with `run_ec2_instances.py`
- Create an S3 bucket using `create_s3_bucket.py` (optional step)
- SSH in and deploy Apache using `ssh_into_instance.py` and `apache.sh`
- Apache is now running and serving web content on the EC2 public IP
- Automate web app zip upload to S3 and extraction on EC2
- Add teardown scripts to delete EC2 and S3 resources
- Integrate with CI/CD (e.g., GitHub Actions or Jenkins)
- Add error handling and logging
✅ Completed — ready for demonstration or extension.

