Using AWS Services with Terraform

“Core Services used for this automated environment”

What is Terraform?

Use Infrastructure as Code to provision and manage any cloud, infrastructure, or service. Terraform users define infrastructure in a simple, human-readable configuration language called HCL (HashiCorp Configuration Language). Users can write unique HCL configuration files or borrow existing templates from the public module registry.

In this post, I will describe an integrated environment for a webserver on the AWS platform, built using Terraform.

Highlights of the AWS services used in this small project :

  1. EC2 service — For creating an instance on which the webserver will be deployed.
  2. EFS (Elastic File System) — For providing persistent (permanent) storage for our instance’s “/var/www/html” directory.
  3. S3 Storage — For storing an image in a bucket, which will be displayed on the webpage.
  4. CloudFront Distributions — For providing a CDN (Content Delivery Network) so our webpage loads faster.

DESCRIPTION OF THIS PROJECT -

Using a configuration file written for HashiCorp’s Terraform tool, we will create a key pair, a security group for the instance, and an EFS volume to be mounted for storage. Using the S3 service, we will create a bucket in which an image will be stored, and finally we will create a CloudFront distribution for the CDN and integrate it with S3.

PROCESS AND EXPLANATION, STEP BY STEP :—

  1. Install Terraform from the official site “terraform.io”, unzip it, and save it in a folder of your choice. Then add the path of this folder to the Path variable. For this :-

Go to the search bar → search “env” → click on Environment Variables → edit the Path variable → New → paste the path there → Save

  • Then, in the CLI, use the command “terraform version” to check whether Terraform is installed and working.

Then create a folder for Terraform with any name, and inside it create a file with any name but with the “.tf” extension. We will write the code for our project in this file.

2. First we need to configure the AWS CLI. Open cmd and execute the following command.

# aws configure

“Then enter the access key ID, secret access key and the default region name.”

To get the access key ID, secret access key, and default region name, use the process below :-

You only need an AWS account for this deployment. Open the AWS Management Console.

(Screenshot: AWS Management Console)

Then go to Services, where you will find the IAM subservice under Security, Identity and Compliance.

(Screenshot: Services menu)

Then create a new IAM user and give this user Administrator Access.

(Screenshot: IAM page)

Then click on Users under IAM Resources, create a new user with any name, and for the access type, give programmatic access so the user can be used in the CLI, SDKs, and many more platforms.

(Screenshot: creating the user by clicking Add User)

  • Give administrator access to this user.
  • Create the user and copy the access key ID and secret access key, as they will be shown only once.
(Screenshot: access key ID and secret access key)
(Screenshot: aws configure)

Let’s start with the Terraform code :

  1. Initially, we have to declare the “provider” at the start of the code.
(Screenshot: AWS provider block)
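A minimal sketch of a provider block like the one in the screenshot (the region and profile values here are placeholders — use the region and profile you set up with `aws configure`):

```hcl
# Tells Terraform to use the AWS provider with the
# credentials stored by `aws configure`.
provider "aws" {
  region  = "ap-south-1"   # placeholder region
  profile = "default"      # placeholder profile name
}
```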

2. In our task, we have to create a key pair for our instance. Below is the code for key pair creation; “key1.pem” is the file name for our private key.

(Screenshot: key pair creation)
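A sketch of the key-pair code, assuming the common pattern of generating the key with the `tls` provider and saving it locally with the `local` provider (resource names such as `webkey` are my own placeholders):

```hcl
# Generate an RSA key pair in Terraform itself.
resource "tls_private_key" "webkey" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# Register the public half with AWS under the name "key1".
resource "aws_key_pair" "key1" {
  key_name   = "key1"
  public_key = tls_private_key.webkey.public_key_openssh
}

# Save the private half locally as key1.pem for manual SSH access.
resource "local_file" "key_file" {
  content         = tls_private_key.webkey.private_key_pem
  filename        = "key1.pem"
  file_permission = "0400"
}
```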

3. In our case, we are using the default VPC and subnet, so below is the code to look these up :

(Screenshot: default VPC and subnet)
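Because the default VPC and subnets already exist, they are referenced with `data` sources rather than created as resources — a sketch:

```hcl
# Look up the default VPC of the configured region.
data "aws_vpc" "default" {
  default = true
}

# Look up the subnet IDs belonging to that VPC.
data "aws_subnet_ids" "default" {
  vpc_id = data.aws_vpc.default.id
}
```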

4. Then comes the addition of security rules for our instance, by creating a security group :
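A sketch of such a security group, assuming we need SSH (22) for the provisioner, HTTP (80) for the webserver, and NFS (2049) for the EFS mount later on; the group name `websg` is a placeholder:

```hcl
resource "aws_security_group" "websg" {
  name   = "websg"
  vpc_id = data.aws_vpc.default.id

  # SSH for Terraform's remote-exec provisioner.
  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # HTTP for the webserver.
  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # NFS for mounting the EFS volume.
  ingress {
    from_port   = 2049
    to_port     = 2049
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound traffic.
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```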

5. The next step is to create the instance. We will then connect to it using SSH, and using a provisioner, we will run the commands written in the inline block.
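A sketch of the instance with its `remote-exec` provisioner (the AMI ID is a placeholder — substitute a current Amazon Linux 2 AMI for your region; the installed packages are an assumption based on the PHP webserver described in this post):

```hcl
resource "aws_instance" "web" {
  ami             = "ami-0447a12f28fddb066"  # placeholder Amazon Linux 2 AMI
  instance_type   = "t2.micro"
  key_name        = aws_key_pair.key1.key_name
  security_groups = [aws_security_group.websg.name]

  # SSH connection used by the provisioner below.
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.webkey.private_key_pem
    host        = self.public_ip
  }

  # Install and start the webserver automatically.
  provisioner "remote-exec" {
    inline = [
      "sudo yum install -y httpd php git amazon-efs-utils",
      "sudo systemctl start httpd",
      "sudo systemctl enable httpd",
    ]
  }
}
```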

6. Now, we will give EFS network storage to our instance by creating a file system and a mount target.

Then, opening a connection to the instance again, we will execute the mount commands on the instance; this is done automatically by the Terraform code.
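A sketch of the EFS part, assuming a `null_resource` is used for the second round of remote commands (a common pattern; resource names are placeholders):

```hcl
# The EFS file system that will back /var/www/html.
resource "aws_efs_file_system" "webefs" {
  creation_token = "webefs"
}

# Expose the file system in the instance's subnet over NFS.
resource "aws_efs_mount_target" "webefs_target" {
  file_system_id  = aws_efs_file_system.webefs.id
  subnet_id       = aws_instance.web.subnet_id
  security_groups = [aws_security_group.websg.id]
}

# Re-connect to the instance and mount the volume on /var/www/html.
resource "null_resource" "mount_efs" {
  depends_on = [aws_efs_mount_target.webefs_target]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.webkey.private_key_pem
    host        = aws_instance.web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      # Uses the amazon-efs-utils mount helper installed earlier.
      "sudo mount -t efs ${aws_efs_file_system.webefs.id}:/ /var/www/html",
    ]
  }
}
```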

7. Now, we will make an S3 bucket to store our image, which will then be used on our webpage.
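A sketch of the bucket and the image upload (the bucket name is a placeholder and must be globally unique; the image file name is also a placeholder):

```hcl
# Bucket that will hold the webpage's image.
resource "aws_s3_bucket" "imagebucket" {
  bucket = "my-terraform-web-image-bucket"  # placeholder, must be globally unique
  acl    = "public-read"
}

# Upload the image from the local working directory.
resource "aws_s3_bucket_object" "image" {
  bucket = aws_s3_bucket.imagebucket.id
  key    = "webimage.png"
  source = "webimage.png"
  acl    = "public-read"
}
```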

8. Finally, we will create a CloudFront distribution for faster loading of the S3 image object.
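A minimal sketch of a CloudFront distribution with the S3 bucket as its origin (the `origin_id` is an arbitrary label of my own); its `domain_name` attribute is what the webpage embeds instead of the raw S3 URL:

```hcl
resource "aws_cloudfront_distribution" "cdn" {
  enabled = true

  # Serve objects from the S3 bucket created above.
  origin {
    domain_name = aws_s3_bucket.imagebucket.bucket_regional_domain_name
    origin_id   = "s3-image-origin"
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-image-origin"
    viewer_protocol_policy = "allow-all"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  # No geographic restrictions.
  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  # Use the default *.cloudfront.net certificate.
  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```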

Now, with all the code written, we deploy it with the following commands :

# terraform init

— Use this command only once, in the folder where the file is located

# terraform apply

— Use this command whenever you want to deploy the code

The values shown in the screenshots are exported variables, which can be used to assign values elsewhere in our code.

— Enter “yes” to approve

(Screenshot: deployment in progress)

Above, you can see that the deployment has begun.

After creation is complete, we can see in the AWS web UI that all the above-mentioned resources are deployed and running.

(Screenshot: EC2 Dashboard)
(Screenshot: EC2 instance)
(Screenshot: webserver file “index.php”)

I have changed the URL to the CloudFront domain.

(Screenshot: S3 bucket)
(Screenshot: object in the S3 bucket)
(Screenshot: EFS volume)

Now, all the resources are created and we have added the CloudFront URL to our webpage, so we can see the output :

  1. OUTPUT FROM CLOUDFRONT URL
(Screenshot: the CloudFront URL displaying the object)

2. OUTPUT FROM WEBSERVER

(Screenshot: the webserver running)

→ GitHub link for downloading this Terraform code (“task2.tf” is the file name) :-

“I practiced and gained knowledge of this project under the mentorship of Mr. VIMAL DAGA Sir during the Hybrid Multi Cloud Training by Linux World India.”

I hope this article is informative and explanatory. Hope you like it!

For any suggestions, or if any reader finds a flaw in this article, please email me at “akhileshjain9221@gmail.com”.

Thank You, Readers!
