Webserver Distribution with AWS-EFS & S3 using Terraform!

Introduction: This blog will guide you in creating an automation script that builds out the complete infrastructure and works as infrastructure as code (IaC). First, Terraform is explained in depth; then the automation script is explained, which automates the creation of a key pair, security group, EC2 instance, and EFS volume, the attachment and mounting of that EFS volume, downloading code from GitHub to host on the webserver running on the AWS instance, the creation of an S3 bucket with private access, uploading data into that S3 bucket, the creation of an OAI, the creation of an IAM policy and its attachment to the S3 bucket, and finally the creation of a CloudFront distribution for that S3 bucket!

Ever wondered about launching the whole infrastructure with just one command? It may seem impossible, but it is not. In this blog, I will guide you through building the complete Pipeline described in the Introduction section.

The future is really about automation. This can be seen even in daily-life examples: each and every one of us wants things to be automated. No one wants to wait even a second; everyone wants work completed at lightning-fast speed, which is not possible if the work is done manually. Therefore, to achieve that speed, the work has to be done automatically.

Without automation, today’s world can’t survive; in the present situation, almost every company uses it. Consider banks, for example: they send a message for every transaction, and that process is automated, not done manually. There are tons of examples like this, and I am sure most of you experience them in your daily life.

Prerequisite for this Pipeline to Implement

  • Some knowledge of AWS & GitHub.
  • It is a plus if you know JSON, because this Pipeline uses HCL (HashiCorp Configuration Language), the native language of Terraform, which is very similar to JSON.

Explanation of Terraform

It is a tool developed by HashiCorp; its documentation is available on the official Terraform website.

It is a tool that can handle a multi-cloud setup. It is standardized, i.e. we write the code in one language only, and Terraform uses it to interact with multiple clouds (technically known as Terraform Providers). The available providers are listed in its documentation.

Terraform uses a declarative language, i.e. we just tell it what has to be done; it automatically looks at the current situation and does that thing for us.

Terraform is intelligent because it has a plugin for each of its providers; using these as APIs, it can interact with any of them.

Now you have more than enough knowledge for this Pipeline, so let’s begin the explanation of the complete Terraform automation Pipeline.

List of steps in the Pipeline


  1. Setting up the Terraform Provider.
  2. Creation of Key-Pair for AWS Instance.
  3. Creation of Security-Group for AWS Instance.
  4. Creating an EC2 instance.
  5. Entering into EC2 Instance to install some software.
  6. Creating an EFS volume.
  7. Creating a Mount Point for the EFS volume.
  8. Mounting the EFS volume to the EC2 instance.
  9. Creating an S3 bucket with private access.
  10. Uploading data into the S3 bucket.
  11. Creating an OAI (Origin Access Identity) for the CloudFront distribution!
  12. Creating an AWS CloudFront distribution for that data uploaded into the S3 bucket.
  13. Creating an IAM Bucket Policy for CloudFront Distribution.
  14. Applying the bucket policy to the S3 Bucket so that CloudFront Distribution can access the private contents of the S3 bucket.

Code of the End-to-End Pipeline

The explanation for each part of this pipeline is provided in the code shown below, with the help of comments for every part.

Step 1: Setting up the Terraform Provider.

Creating a Provider for the Terraform
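A minimal sketch of the provider block might look like the following; the region and CLI profile name here are assumptions, so substitute your own:

```hcl
# Configure the AWS provider. The region and profile are
# placeholders -- replace them with your own values.
provider "aws" {
  region  = "ap-south-1"
  profile = "default"
}
```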

Step 2: Creation of Key-Pair for AWS Instance.

Creating a public key!
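A sketch of this step: generate an RSA key locally with the `tls` provider and register its public half with AWS. The resource and key names are assumptions:

```hcl
# Generate an RSA key pair locally.
resource "tls_private_key" "webserver_key" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# Register the public key with AWS as a key pair.
resource "aws_key_pair" "webserver_key_pair" {
  key_name   = "webserver-key"
  public_key = tls_private_key.webserver_key.public_key_openssh
}
```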
Key Pair created in AWS!

Step 3: Creation of Security-Group for AWS Instance.

Creating a security group!
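A sketch of the security group, assuming we need SSH (22), HTTP (80), and NFS (2049, required later by EFS) open to the world for this demo setup:

```hcl
# Security group allowing SSH, HTTP and NFS inbound,
# and all traffic outbound.
resource "aws_security_group" "webserver_sg" {
  name        = "webserver-sg"
  description = "Allow SSH, HTTP and NFS"

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 2049
    to_port     = 2049
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```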
Security Group created in AWS!

Step 4 & 5: “Creating an EC2 instance” & “Entering into EC2 Instance to install some software”!

An instance is created & Software[git] is installed into it!
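A sketch of these two steps, assuming the key pair and security group resources from the earlier steps are named `webserver_key_pair` and `webserver_sg`; the AMI ID is a placeholder, so look up a current Amazon Linux 2 AMI for your region:

```hcl
# Launch an EC2 instance and install software over SSH.
resource "aws_instance" "webserver" {
  ami             = "ami-0447a12f28fddb066"  # placeholder AMI ID
  instance_type   = "t2.micro"
  key_name        = aws_key_pair.webserver_key_pair.key_name
  security_groups = [aws_security_group.webserver_sg.name]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.webserver_key.private_key_pem
    host        = self.public_ip
  }

  # Step 5: enter the instance and install git, the web server,
  # and the EFS mount helper.
  provisioner "remote-exec" {
    inline = [
      "sudo yum install -y httpd git amazon-efs-utils",
      "sudo systemctl enable --now httpd",
    ]
  }

  tags = {
    Name = "terraform-webserver"
  }
}
```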
EC2 Instance is created & software is installed in it; this can be verified by accessing the instance in whichever way is available, for example over SSH.

Step 6: Creating an EFS volume!

Terraform code to create an EFS Volume!
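A sketch of the EFS file system resource; the creation token and tag name are assumptions:

```hcl
# Create the EFS file system that will back /var/www/html.
resource "aws_efs_file_system" "webserver_efs" {
  creation_token = "webserver-efs"

  tags = {
    Name = "webserver-efs"
  }
}
```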
EFS Storage created by Terraform! [Image by Author]

Step 7: Creating a Mount Point for the EFS volume.

Terraform Code to create an EFS Mount Point!
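A sketch of the mount target, assuming the instance and security group resources defined in the earlier steps; it is placed in the same subnet as the instance so the instance can reach the file system over NFS:

```hcl
# Create a mount target for the EFS volume in the
# instance's subnet.
resource "aws_efs_mount_target" "webserver_efs_mt" {
  file_system_id  = aws_efs_file_system.webserver_efs.id
  subnet_id       = aws_instance.webserver.subnet_id
  security_groups = [aws_security_group.webserver_sg.id]
}
```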
EFS Mount point created by Terraform! [Image by Author]

Step 8: Mounting the EFS volume to the EC2 instance.

External EFS volume mounted to the EC2 instance in the folder “/var/www/html”!
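A sketch of this step using a `null_resource` with a remote-exec provisioner: it mounts the volume on /var/www/html and then clones the website code from GitHub. The repository URL is a placeholder; the other names assume the resources from the earlier steps:

```hcl
# Mount the EFS volume on /var/www/html and pull the
# website code from GitHub (repository URL is a placeholder).
resource "null_resource" "mount_and_clone" {
  depends_on = [aws_efs_mount_target.webserver_efs_mt]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.webserver_key.private_key_pem
    host        = aws_instance.webserver.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mount -t efs ${aws_efs_file_system.webserver_efs.id}:/ /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/<your-user>/<your-repo>.git /var/www/html/",
    ]
  }
}
```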
EFS volume mount confirmed by SSHing into the web server and running the “df -h” command to list all disks and mount points! [Image by Author]

Step 9: Creating an S3 bucket with private access.

Code to create an S3 bucket!
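A sketch of the bucket resource using the older (v3-style) AWS provider syntax, where `acl` is set directly on the bucket; the bucket name is an assumption and must be globally unique:

```hcl
# Create a private S3 bucket. Bucket names are global,
# so pick your own unique name.
resource "aws_s3_bucket" "webserver_bucket" {
  bucket = "my-unique-webserver-bucket"
  acl    = "private"
}
```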
S3 Bucket Created by Terraform! [Image by Author]

Step 10: Uploading data into the S3 bucket.

Code to upload data to S3!
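A sketch of the upload, assuming a local image file next to the Terraform code; the key and source path are placeholders:

```hcl
# Upload a local file into the private bucket.
resource "aws_s3_bucket_object" "webserver_image" {
  bucket       = aws_s3_bucket.webserver_bucket.id
  key          = "image.png"
  source       = "image.png"   # local file path (placeholder)
  content_type = "image/png"
}
```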
File Uploaded by Terraform! [Image by Author]

Step 11: Creating an OAI

Code to create the OAI!
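A sketch of the OAI resource, which only needs a comment; the resource name is an assumption:

```hcl
# Origin Access Identity that CloudFront will use to read
# the private bucket.
resource "aws_cloudfront_origin_access_identity" "oai" {
  comment = "OAI for the webserver bucket"
}
```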
OAI created by Terraform!

Step 12: Creating an AWS CloudFront distribution for that data uploaded into the S3 bucket.

CloudFront Distribution Code!
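A sketch of the distribution, assuming the bucket, object, and OAI resources from the earlier steps; the origin ID and default root object are assumptions:

```hcl
resource "aws_cloudfront_distribution" "webserver_cdn" {
  enabled             = true
  default_root_object = "image.png"

  # Serve the private S3 bucket through the OAI.
  origin {
    domain_name = aws_s3_bucket.webserver_bucket.bucket_regional_domain_name
    origin_id   = "s3-webserver-origin"

    s3_origin_config {
      origin_access_identity = aws_cloudfront_origin_access_identity.oai.cloudfront_access_identity_path
    }
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-webserver-origin"
    viewer_protocol_policy = "redirect-to-https"

    forwarded_values {
      query_string = false

      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```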

In the above code, I have explained all the important settings for creating a CloudFront distribution through comments; the settings I didn’t explain are not essential for now and can be copied exactly as shown.

CloudFront Distribution Created by Terraform!

Step 13: Creating an IAM bucket policy for CloudFront distribution!

Terraform Code to create an IAM bucket Policy!
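A sketch of the policy using a `aws_iam_policy_document` data source: it grants the OAI read access (`s3:GetObject`) to every object in the bucket. The names assume the resources from the earlier steps:

```hcl
# Policy document allowing the CloudFront OAI to read
# objects from the private bucket.
data "aws_iam_policy_document" "s3_read_policy" {
  statement {
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.webserver_bucket.arn}/*"]

    principals {
      type        = "AWS"
      identifiers = [aws_cloudfront_origin_access_identity.oai.iam_arn]
    }
  }
}
```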

Step 14: Applying IAM bucket policy to the S3 bucket!

Terraform code to attach the IAM bucket Policy to the S3 bucket!
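A sketch of attaching that policy to the bucket, assuming the policy document and bucket resources defined above:

```hcl
# Attach the generated policy to the bucket so CloudFront
# can serve its private contents.
resource "aws_s3_bucket_policy" "webserver_bucket_policy" {
  bucket = aws_s3_bucket.webserver_bucket.id
  policy = data.aws_iam_policy_document.s3_read_policy.json
}
```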
IAM bucket policy attached to the S3 bucket by Terraform! [Image by Author]

If all the code shown so far is combined into one file and executed, it forms the complete infrastructure as code (IaC).

This concludes our Pipeline!

Important Commands & Facts to run this code!

  • You should have Terraform installed on your system!
  • After copying this code in a file, save that file with the “.tf” extension.
  • Run “terraform init” command.
  • Then run “terraform apply” to create your complete infrastructure!
  • Finally, when your work is completed, destroy your environment with the command “terraform destroy”.

I hope my article explains everything related to the complete end-to-end Terraform-DevOps-GitHub-Cloud Pipeline, along with the explanation, configuration, & execution of the code. Thank you so much for investing your time in reading my blog & boosting your knowledge!

Written by

Big Data Enthusiast, have a demonstrated history of delivering large and complex projects. Interested in working in the field of AI and Data Science.