Setting up a DevOps Pipeline in AWS

Jul 15, 2020

In this article, we will guide you through setting up a DevOps pipeline in AWS. But first, let's define what DevOps actually is.

In the past, application development was usually split between two teams – development and operations. The development team would write the code, test it, and then deliver it to the operations team who would deploy it to a server and make sure it runs and scales without interruption.

With the recent shift towards cloud computing, developers are now expected to have more knowledge of the infrastructure running their applications. This means the development team and the operations team merge into one, working together in a DevOps manner. This enables developers to write code that scales and can be managed more easily in production environments.

Another benefit of the DevOps way of working is that issues in production can be identified more quickly when the team responsible for operations is the same one that originally developed the application.

So, how do pipelines tie into DevOps? If we look at the past way of working in teams, split between development and operations, a release would typically follow a variation of the process described below.

  • The development team merges the code changes that should be included in the release to the code repository.
  • The development team (or a dedicated tester) carries out testing of the release.
  • The development team creates a production build that is ready to be released.
  • The operations team receives the production build and deploys it to production manually, typically by placing the release package on the server and running scripts.

With DevOps, and a merged development and operations team, we can instead release small features at a faster pace, and operations-related tasks can be carried out in parallel with the usual development activities. To go even faster, we can automate release and testing tasks using a DevOps pipeline: placing the release package on the server and running the deployment scripts then happens automatically, either at the push of a button or simply by pushing code changes to a particular branch.

A pipeline like this can be set up with a number of different tools. However, if you are already running your workloads on AWS, it offers several services that let you build one quite efficiently without leaving the AWS ecosystem.

Let’s get started creating our own pipeline to automate some deployment tasks.

Getting Started #

We will automate the deployment of a simple static site using the following AWS services:

  • CodePipeline – An orchestration tool that helps us trigger a deployment by pushing to a source code repository, or manually at the push of a button
  • CodeBuild – A build container that can run scripts needed for deployment tasks
  • S3 – An object storage service that can also serve static websites; it will host our static site

The static site that we will deploy consists of a simple HTML file accessed from the S3 website URL. To deploy it manually, we would upload the HTML file to the bucket from the AWS console. While this is not terribly difficult to do, we can save ourselves a couple of minutes by automating the task.

The goal of the pipeline is to combine the aforementioned AWS services so that a push to the source repository automatically deploys the latest version of the site to S3.

Static Site Hosting on S3 #

1. Create a bucket #

To get our static site running, we start off by creating an S3 bucket. This is done by going to AWS Console → Services → S3 → Create Bucket. Make sure to allow public access to the bucket (turn off "Block all public access") so that our site can be reached over the internet. Leave the other options at their defaults.
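If you prefer code over console clicks, here is a minimal boto3 sketch of the same step. The bucket name and region are placeholders; replace them with your own (and note that in us-east-1 the CreateBucketConfiguration argument must be omitted):

import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Create the bucket (bucket names must be globally unique).
s3.create_bucket(
    Bucket="my-devops-pipeline-site",  # hypothetical bucket name
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)

# Allow public access so the site can later be reached over the internet.
s3.put_public_access_block(
    Bucket="my-devops-pipeline-site",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)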

2. Enable static site hosting #

Now it is time to make our HTML files in the bucket available as a static site. To do this, go to your S3 bucket → Properties → Static website hosting → Use this bucket to host a static website. Make sure to input index.html as your index document and press Save. Now your site should be up and running if you go to the endpoint URL that is displayed in the Static website hosting dialog.
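The same step, sketched with boto3 (bucket name again hypothetical):

import boto3

s3 = boto3.client("s3")

# Enable static website hosting with index.html as the index document.
s3.put_bucket_website(
    Bucket="my-devops-pipeline-site",  # hypothetical bucket name
    WebsiteConfiguration={"IndexDocument": {"Suffix": "index.html"}},
)

# The site is then served from the bucket's website endpoint, e.g.
# http://my-devops-pipeline-site.s3-website-<region>.amazonaws.com
# (the exact URL is shown in the Static website hosting dialog).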

Great! Now we have a static site. To update it, you need to upload a new version of the index.html file to the bucket. Let's automate that!
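For reference, the manual upload we are about to automate looks like this in boto3 (file and bucket names are placeholders):

import boto3

s3 = boto3.client("s3")

# Upload index.html and make it publicly readable so the website endpoint can serve it.
with open("index.html", "rb") as f:
    s3.put_object(
        Bucket="my-devops-pipeline-site",  # hypothetical bucket name
        Key="index.html",
        Body=f,
        ContentType="text/html",
        ACL="public-read",
    )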

Creating the Pipeline #

1. Create a CodeCommit repository #

To host the code, we need a repository for our files. This can be GitHub or whichever other repository service you prefer. For simplicity's sake, we will use the AWS repository service CodeCommit.

Create a repository by going to AWS Console → CodeCommit → Create repository. Enter a name and create the repository. Finally, push an index.html file by connecting to the repository over SSH or HTTPS (a boto3 alternative to the console step is sketched after the snippet). If you don't have any inspiration for the file's content, you can use the one below:

<p>Hello from Linuxize.com!</p>
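If you would rather create the repository programmatically, here is a minimal boto3 sketch (the repository name is a hypothetical placeholder):

import boto3

codecommit = boto3.client("codecommit")

# Create the repository and print the HTTPS clone URL to push to.
response = codecommit.create_repository(repositoryName="static-site")  # hypothetical name
print(response["repositoryMetadata"]["cloneUrlHttp"])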

2. Create a CodePipeline pipeline #

Now it is time to create the pipeline that will orchestrate the deployment process of our static site. To start creating the pipeline, go to AWS Console → CodePipeline → Create new pipeline.

Step 1 #

  • Enter the name of the pipeline.
  • Choose “New service role”.
  • Leave the rest to the defaults.

Step 2 #

  • Choose AWS CodeCommit as the Source provider.
  • Choose your newly created repository as the source.
  • Choose the branch that you want to build from as the Branch name.
  • Leave the rest to the defaults.

Step 3 #

  • Press Skip build stage – we don’t need to build our files in this pipeline since it is simply static HTML.

Step 4 #

  • Choose Amazon S3 as the deploy provider in the Deploy stage.
  • Choose the bucket you created before as the Bucket.
  • Leave S3 object key empty.
  • Tick Extract file before deploy.
  • Expand the Additional configuration pane and choose public-read as the Canned ACL.
  • Hit Next.

Step 5 #

Review your settings and press Create pipeline.

Tada! Now your pipeline should run and deploy the HTML file from your CodeCommit repository to S3. Push a change to the file, and the pipeline should trigger again automatically.
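If you would rather define the pipeline in code than click through the console, the same structure can be expressed with boto3. This is a rough sketch, not the exact output of the wizard: the role ARN, artifact bucket, repository, branch, and bucket names are all placeholders you would need to replace with your own.

import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.create_pipeline(
    pipeline={
        "name": "static-site-pipeline",  # hypothetical pipeline name
        "roleArn": "arn:aws:iam::123456789012:role/PipelineServiceRole",  # placeholder service role
        "artifactStore": {"type": "S3", "location": "my-pipeline-artifacts"},  # placeholder artifact bucket
        "stages": [
            {
                "name": "Source",
                "actions": [
                    {
                        "name": "Source",
                        "actionTypeId": {
                            "category": "Source",
                            "owner": "AWS",
                            "provider": "CodeCommit",
                            "version": "1",
                        },
                        "configuration": {
                            "RepositoryName": "static-site",  # the repository created earlier
                            "BranchName": "main",
                        },
                        "outputArtifacts": [{"name": "SourceOutput"}],
                    }
                ],
            },
            {
                "name": "Deploy",
                "actions": [
                    {
                        "name": "DeployToS3",
                        "actionTypeId": {
                            "category": "Deploy",
                            "owner": "AWS",
                            "provider": "S3",
                            "version": "1",
                        },
                        "configuration": {
                            "BucketName": "my-devops-pipeline-site",  # the website bucket
                            "Extract": "true",
                            "CannedACL": "public-read",
                        },
                        "inputArtifacts": [{"name": "SourceOutput"}],
                    }
                ],
            },
        ],
    }
)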

Conclusion #

While this is one of the simplest setups you can have, the fundamentals are the same even for very complex back-end applications. They might require more steps in the pipeline, but the basic flow is the same. Setting up a deployment pipeline once and automating the workflow saves you a lot of time in the long run, and avoiding manual tasks always means more safety and fewer human errors.

Good luck with using your new DevOps skills!

If you have any questions or feedback, feel free to comment below.

About the author

Karl Eriksson, founder of the mock API tool Mocki.
