Complete end-to-end Terraform Pipeline!
Introduction: This blog will guide you through creating an automation script that builds the complete infrastructure, working as infrastructure as code (IaC). First, Terraform is explained in depth; then the automation script is explained, which automates the creation of a key-pair, security group, EC2 instance, and EBS volume, the attachment and mounting of that EBS volume, downloading code from GitHub to host on the webserver running on the AWS instance, the creation of an S3 bucket, adding data to that S3 bucket, and finally creating a CloudFront distribution for that S3 bucket!
Ever wondered about launching your whole infrastructure with just one command? It may seem impossible, but it isn't. In this blog, I will guide you through constructing the complete Pipeline described in the Introduction section.
The future is really about automation; you can see it in everyday life, where each and every one of us wants things to be automated. No one wants to wait even a second; everyone wants work completed at lightning-fast speed, which is not possible when the work is done manually. Therefore, to achieve that speed, the work has to be done automatically.
Today's world can't survive without automation; almost every company now uses it. Consider banks, for example: they send a message for each transaction, and that process is automated, not done manually. There are tons of examples like this, and I am sure most of you experience them in your daily life.
Prerequisites to Implement this Pipeline
- Some knowledge of AWS & GitHub.
- It would be a plus if you know some JSON, because this Pipeline uses HCL (HashiCorp Configuration Language), the native language of Terraform, which is very similar to JSON.
Explanation of Terraform
Terraform is a tool developed by HashiCorp. The link for it is given below:
Terraform by HashiCorp
It is a tool capable of handling multi-cloud setups. It is standardized, i.e. we write the code in one language only, and Terraform uses it to interact with multiple clouds (technically known as Terraform Providers). Its providers are listed in its documentation, linked above.
Terraform uses a declarative language, i.e. we just tell it what has to be done, and it automatically looks at the current situation and does that thing for us.
Terraform is intelligent because it has a plugin for each of its providers; using these plugins as APIs, it can interact with any of them.
Now you have more than enough knowledge for this Pipeline, so let's begin the explanation of the complete Terraform automation Pipeline.
List of Steps in the Pipeline
- Setting up the Terraform Provider.
- Creation of Key-Pair for AWS Instance.
- Creation of Security-Group for AWS Instance.
- Creating an EC2 instance.
- Entering into EC2 Instance to install some software.
- Creating another EBS volume & attaching it to the EC2 instance.
- Mounting the EBS volume to the EC2 instance.
- Creating an S3 bucket.
- Uploading data into the S3 bucket.
- Creating an AWS CloudFront distribution for that data uploaded into the S3 bucket.
Code of the End-to-End Pipeline
The explanation of each part of this pipeline is given in the code shown below, with comments for every part.
Step 1: Setting up the Terraform Provider.
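The embedded code for this step is not reproduced above, so here is a minimal sketch of what the provider block looks like (the region and profile names are placeholder assumptions; substitute your own AWS CLI profile and preferred region):

```hcl
# Tell Terraform which cloud to talk to. "default" is the local
# AWS CLI profile holding your credentials (placeholder), and the
# region is where all resources below will be created (placeholder).
provider "aws" {
  region  = "ap-south-1"
  profile = "default"
}
```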
Step 2: Creation of Key-Pair for AWS Instance.
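A sketch of this step, assuming the common pattern of generating an RSA key locally with the `tls` provider and registering its public half with AWS (the resource and key names are illustrative):

```hcl
# Generate an RSA key pair locally.
resource "tls_private_key" "my_key" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# Register the public key with AWS so EC2 can use it for SSH login.
resource "aws_key_pair" "deployer" {
  key_name   = "terraform-key"                           # illustrative name
  public_key = tls_private_key.my_key.public_key_openssh
}
```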
Step 3: Creation of Security-Group for AWS Instance.
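A sketch of the security group, assuming we need SSH (port 22) for the provisioners and HTTP (port 80) for the webserver; the group name is illustrative:

```hcl
resource "aws_security_group" "allow_web" {
  name        = "allow_ssh_http"   # illustrative name
  description = "Allow SSH and HTTP inbound traffic"

  # Allow SSH from anywhere so Terraform's provisioners can connect.
  ingress {
    description = "SSH"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow HTTP so visitors can reach the webserver.
  ingress {
    description = "HTTP"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound traffic.
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```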
Step 4 & 5: “Creating an EC2 instance” & “Entering into EC2 Instance to install some software”!
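A combined sketch of these two steps, assuming an Amazon Linux 2 instance (the AMI ID is a placeholder) and a `remote-exec` provisioner that SSHes in with the key from Step 2 to install the webserver and git:

```hcl
resource "aws_instance" "web" {
  ami             = "ami-0447a12f28fddb066"              # placeholder AMI ID
  instance_type   = "t2.micro"
  key_name        = aws_key_pair.deployer.key_name
  security_groups = [aws_security_group.allow_web.name]

  # How Terraform connects to the instance to run commands.
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.my_key.private_key_pem
    host        = self.public_ip
  }

  # Install the Apache webserver and git, then start Apache.
  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd git -y",
      "sudo systemctl start httpd",
      "sudo systemctl enable httpd",
    ]
  }
}
```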
Step 6: Creating another EBS volume & attaching it to the EC2 instance.
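A sketch of this step: the volume must be created in the same availability zone as the instance, and then attached to it (the 1 GiB size and `/dev/sdh` device name are illustrative choices):

```hcl
# Create a small EBS volume in the instance's availability zone.
resource "aws_ebs_volume" "web_vol" {
  availability_zone = aws_instance.web.availability_zone
  size              = 1   # size in GiB (illustrative)
}

# Attach the volume to the running instance.
resource "aws_volume_attachment" "web_attach" {
  device_name  = "/dev/sdh"
  volume_id    = aws_ebs_volume.web_vol.id
  instance_id  = aws_instance.web.id
  force_detach = true     # lets terraform destroy detach it cleanly
}
```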
Step 7: Mounting the EBS volume to the EC2 instance.
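A sketch of this step using a `null_resource` (from the null provider) so the format-and-mount commands run only after the attachment exists; the GitHub URL is a placeholder for your own repository:

```hcl
resource "null_resource" "mount_volume" {
  # Don't run until the volume is actually attached.
  depends_on = [aws_volume_attachment.web_attach]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.my_key.private_key_pem
    host        = aws_instance.web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mkfs.ext4 /dev/xvdh",                # format the new volume
      "sudo mount /dev/xvdh /var/www/html",      # mount it at the web root
      "sudo rm -rf /var/www/html/*",             # clear any stale files
      # Placeholder repository URL; substitute your own.
      "sudo git clone https://github.com/<your-user>/<your-repo>.git /var/www/html/",
    ]
  }
}
```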
Step 8: Creating an S3 bucket.
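A minimal sketch of the bucket (note that bucket names must be globally unique, so the name below is a placeholder; the `acl` argument shown here belongs to older AWS provider versions, with newer ones using a separate `aws_s3_bucket_acl` resource):

```hcl
resource "aws_s3_bucket" "web_bucket" {
  bucket = "my-terraform-web-bucket-12345"   # placeholder; must be globally unique
  acl    = "public-read"                     # older-provider syntax
}
```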
Step 9: Uploading data into the S3 bucket.
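A sketch of the upload, assuming an image file sitting next to the Terraform file (the file name is a placeholder; in AWS provider v4+ this resource is named `aws_s3_object`):

```hcl
resource "aws_s3_bucket_object" "image" {
  bucket = aws_s3_bucket.web_bucket.bucket
  key    = "myimage.png"      # name of the object in the bucket (placeholder)
  source = "myimage.png"      # local file path (placeholder)
  acl    = "public-read"
}
```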
Step 10: Creating an AWS CloudFront distribution for that data uploaded into the S3 bucket.
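A sketch of the distribution pointing at the bucket from Step 8; the origin ID is an arbitrary label of our choosing, and the remaining blocks are the minimum CloudFront requires:

```hcl
resource "aws_cloudfront_distribution" "cdn" {
  # Serve content from the S3 bucket created above.
  origin {
    domain_name = aws_s3_bucket.web_bucket.bucket_regional_domain_name
    origin_id   = "s3-web-origin"   # arbitrary label
  }

  enabled = true

  default_cache_behavior {
    allowed_methods  = ["GET", "HEAD"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = "s3-web-origin"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
  }

  # No geographic restrictions.
  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  # Use the default *.cloudfront.net certificate.
  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```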
In the above code, I have explained all the important settings for creating a CloudFront distribution through comments; the settings I didn't explain are not necessary for now, so copy them exactly as shown!
If the code shown up to this point is combined into one single file and executed, it becomes complete infrastructure as code (IaC).
This concludes our Pipeline!
Important Commands & Facts to run this code
- You should have Terraform installed on your system!
- After copying this code into a file, save that file with the ".tf" extension.
- Run the "terraform init" command to download the required provider plugins.
- Then run "terraform apply" to create your complete infrastructure!
- Finally, when your work is completed, destroy your environment with the command "terraform destroy".
I hope my article explains everything related to the complete end-to-end Terraform-DevOps-GitHub-Cloud Pipeline, along with the explanation, configuration, & execution of the code. Thank you so much for investing your time in reading my blog & boosting your knowledge!