The Cloud Resume Challenge, created by Forrest Brazeal, is a project designed to help people gain practical experience with cloud technologies and stand out as job candidates. It's not a traditional tutorial, but rather a set of goals you need to achieve by figuring things out yourself.

Why Is It Different?

  • Focus on Learning - Unlike tutorials, the challenge pushes you to learn by doing and troubleshooting problems on your own. This 'trial by fire' approach reinforces knowledge better than memorisation.
  • Open-ended Project - The challenge provides a roadmap but lets you figure out the details. This simulates real-world cloud engineering, where you need to learn new things quickly and adapt.
  • Valuable for Everyone - The challenge benefits both beginners and experienced professionals. Beginners gain practical skills, while experienced professionals can learn new technologies and showcase their problem-solving abilities.

What Does the Challenge Entail?

The challenge involves building a functional resume website hosted on AWS, covering the following:
  • Certification - Earn the AWS Cloud Practitioner certification
  • Build the website - Create a website using HTML and CSS
  • Deploy to AWS - Host the website as a static site on Amazon S3
  • Secure the website - Use HTTPS with CloudFront for secure access
  • Connect a custom domain - Register a domain name and point it to your website
  • Add a visitor counter - Implement a visitor counter using JavaScript
  • Store visitor data - Use a database (DynamoDB) to store visitor count data
  • Create an API - Build an API (API Gateway & Lambda) to interact with the database
  • Use Python - Write the back-end logic in Python for the API
  • Write Tests - Verify the code's functionality with unit tests (a minimal pytest sketch follows this list)
  • Use Infrastructure as Code (IaC) - Define your cloud infrastructure using AWS Serverless Application Model (SAM)
  • Version Control - Manage your code using a GitHub repo
  • Automate Deployment (CI/CD) - Set up GitHub Actions to automate code testing and deployment
  • Share Your Experience - Write a blog post detailing your learning experience during the challenge
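To give a flavour of the Python and unit-testing steps above, here is a minimal pytest sketch for a counter-increment helper. The function name, table schema (id / visit_count), and mocked DynamoDB table are illustrative assumptions, not code prescribed by the challenge.

```python
# test_counter.py - a hypothetical sketch; run with `pytest`.
# The table schema (id / visit_count) is an assumption for illustration.
from unittest.mock import MagicMock


def increment_visits(table):
    """Atomically add 1 to the counter item and return the new total."""
    resp = table.update_item(
        Key={"id": "visits"},
        UpdateExpression="ADD visit_count :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(resp["Attributes"]["visit_count"])


def test_increment_visits_returns_new_total():
    # Stand-in for a boto3 DynamoDB Table resource
    table = MagicMock()
    table.update_item.return_value = {"Attributes": {"visit_count": 42}}

    assert increment_visits(table) == 42
    table.update_item.assert_called_once()
```

Mocking the table keeps the test fast and free of AWS calls; the real table only gets exercised in the deployed environment.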

My Implementation

There are particular areas I did differently. The website build, for example, was made with Next.js and Material UI, alongside CSS and Tailwind (tapping into the skills learnt on my software developer course).
For the cloud, I added AWS CodeBuild and SSM Parameter Store (for storing environment variables) to automate creating the static build folder (npm run build). The CodeBuild stage was particularly challenging as I had never used the service before, especially getting my head around the buildspec.yml file (a set of build instructions).
Most of these deviations from the guide came down to the fact that I was using Next.js rather than vanilla HTML, CSS, and JavaScript.
After the hosting stage, I decided to build the CI/CD pipeline first and the counter API second. The counter would require editing the code base and pushing the changes, so I thought this would be a good way of putting my freshly implemented pipeline to the test.

Diagrams & Designs

The images below are the cloud architecture diagrams, made with draw.io, along with my low- and high-fidelity designs created using Ok! So... and Figma.
As you can see, the general structure of the site is very similar to the final product, although I also made a number of changes during the build stage.
Draw.io AWS architecture diagram of the services used to host the site
  • 1. Users attempt to access the website via their browser
  • 2. The site's domain and routing (A, NS, SOA records) are set up via Route 53 and point to my CloudFront (CF) distribution (AWS's Content Delivery Network).
  • 2.1. To gain the S (Secure) in HTTPS, AWS Certificate Manager must validate both the www. and non-www. versions of the domain; it provides the SSL/TLS certificates for CF.
  • 3. The routing is then connected to my CF distribution. CF handles requests and forwards them to the origin: the S3 (Simple Storage Service) bucket.
  • 4. The Origin Access Control (OAC) method allows CF to access the S3 bucket containing the site's static files, with one critical difference compared to using CF without OAC: the bucket stays private, with access granted only to the CF distribution via a bucket policy (a sketch of such a policy follows this list).
  • 5. The S3 bucket containing the static web files then allows CF access.
  • 6. CF caches the site's static files on demand in its global network of edge locations, serving users from the closest edge location for faster access worldwide.
  • 7. Users can now access the site.
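As an illustration of step 4, the sketch below attaches an OAC-style bucket policy with boto3: read access is granted to CloudFront's service principal only, scoped to a single distribution. The bucket name and distribution ARN are placeholders, not values from my setup.

```python
# attach_oac_policy.py - a hedged sketch; BUCKET and DISTRIBUTION_ARN are placeholders
import json

import boto3

BUCKET = "example-resume-site-bucket"
DISTRIBUTION_ARN = "arn:aws:cloudfront::123456789012:distribution/EXAMPLEID"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowCloudFrontServicePrincipalReadOnly",
            "Effect": "Allow",
            "Principal": {"Service": "cloudfront.amazonaws.com"},
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
            # Only requests coming via this specific distribution are allowed
            "Condition": {"StringEquals": {"AWS:SourceArn": DISTRIBUTION_ARN}},
        }
    ],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```

With a policy like this in place, the bucket can block all public access while CloudFront still reads the static files.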
Draw.io AWS architecture diagram of the CI/CD pipeline used to deploy the site
  • 1. Changes are made to the site's code base (in this case, using the VS Code IDE).
  • 2. These changes are pushed to the GitHub repository (repo) and merged into the main branch when ready.
  • 3. Once merged into main, this triggers AWS CodePipeline to begin its run.
  • 4. CodePipeline is configured to monitor the repo for changes. A merge triggers CodePipeline, which initiates AWS CodeBuild (third-party tools such as Jenkins can also be used); CodeBuild then begins executing its buildspec.yml file.
  • 5. The buildspec file is a list of instructions written in YAML: the blueprint that not only builds a clone of the site but also refreshes the CF (CloudFront) caches. It is broken down into phases (install, pre_build, build, post_build) plus an artifacts section, and must state every command needed to generate the build artifact (the static site files). For this site's build I needed the following (a condensed sketch follows this list):
  • 5.1 Environment variables, accessed via SSM Parameter Store (free to use, whereas the alternative, Secrets Manager, incurs a cost)
  • 5.2 A Docker image as the build environment (provided by AWS or custom)
  • 5.3 The Node.js runtime environment
  • 5.4 The AWS CLI, to execute the later AWS-specific commands
  • 5.5 The npm package manager
  • 5.6 Next.js commands to build the static files
  • 5.7 Using the AWS CLI, removing the old static site files and replacing them with the new ones
  • 5.8 Using the AWS CLI, invalidating the CF caches so they reflect the new site changes
  • 6. CodePipeline takes over again after CodeBuild has finished.
  • 7. It deploys the static files to the S3 bucket.
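To make the phases above concrete, here is a condensed sketch of what such a buildspec.yml can look like. The SSM parameter name, bucket, and distribution ID are placeholders, and the exact commands are assumptions rather than my actual file.

```yaml
# buildspec.yml - an illustrative sketch, not my exact file
version: 0.2

env:
  parameter-store:
    NEXT_PUBLIC_API_URL: /resume-site/api-url    # hypothetical SSM parameter (5.1)

phases:
  install:
    runtime-versions:
      nodejs: 18                                 # Node.js runtime (5.3)
    commands:
      - npm ci                                   # install dependencies via npm (5.5)
  build:
    commands:
      - npm run build                            # Next.js static build (5.6)
  post_build:
    commands:
      # Replace the old static files in S3 (5.7); bucket name is a placeholder
      - aws s3 sync out/ s3://EXAMPLE-BUCKET --delete
      # Invalidate the CF cache so the new build is served (5.8)
      - aws cloudfront create-invalidation --distribution-id EXAMPLE_ID --paths "/*"

artifacts:
  base-directory: out
  files:
    - '**/*'
```

The Docker build image (5.2) and AWS CLI (5.4) come from the CodeBuild project configuration itself; AWS's standard images ship with the CLI pre-installed.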
Draw.io AWS architecture diagram of the visitor counter API
  • 1. Users visit the site and enter via a button (in this case, the door).
  • 1.1 This button triggers a visitor-count script in the code base, which issues GET and POST HTTP requests.
  • 2. The HTTP requests are routed through AWS API Gateway, which acts as the entry point and routes them to the appropriate endpoints.
  • 3. The API Gateway GET and POST routes are both connected to a single Lambda (AWS's managed 'serverless' compute service) function. This function (which can be written in a variety of languages) fetches the count from, and adds to, an AWS DynamoDB (a non-relational database) table (a minimal Python sketch follows this list).
  • 4. On a POST request, Lambda increments the visitor count in the DynamoDB table.
  • 5. On a GET request, Lambda fetches the visitor count from the DynamoDB table.
  • 6. API Gateway then returns the data from Lambda to the site's script.
  • 7. The script completes and the count is displayed to the user.
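Below is a minimal sketch of what a combined GET/POST counter function can look like in Python. The table name, key schema, and event shape (an API Gateway HTTP API v2 payload is assumed) are illustrative, not my exact code.

```python
# lambda_function.py - a hedged sketch; table and attribute names are assumptions
import json
import os

import boto3

TABLE = boto3.resource("dynamodb").Table(os.environ.get("TABLE_NAME", "visitor-count"))


def handler(event, context):
    # HTTP API (payload v2) puts the method under requestContext.http
    method = event.get("requestContext", {}).get("http", {}).get("method", "GET")

    if method == "POST":
        # ADD is atomic, so concurrent visitors cannot lose an increment
        resp = TABLE.update_item(
            Key={"id": "visits"},
            UpdateExpression="ADD visit_count :one",
            ExpressionAttributeValues={":one": 1},
            ReturnValues="UPDATED_NEW",
        )
        count = int(resp["Attributes"]["visit_count"])
    else:
        resp = TABLE.get_item(Key={"id": "visits"})
        count = int(resp.get("Item", {}).get("visit_count", 0))

    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # CORS for the site's script
        "body": json.dumps({"count": count}),
    }
```

The front-end script then simply fetches this endpoint and renders the returned count.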
Low-fidelity home page design on desktop
Low-fidelity home page design on mobile
These are my low-fidelity designs: 'pen to paper' ideas and a general layout of where I intended to take the site. The key was not to get caught up in detail.
High-fidelity projects page design on desktop
High-fidelity projects page design on mobile
My high-fidelity designs, where I expanded on the previous low-fidelity mock-ups. These gave me a good reference point to work from when I began the build, although I did change my mind on a number of design aspects over time. For instance, I initially thought a dynamic three-colour background would be a good idea; in practice, it just looked a bit 'naff', to say the least.

Project details pending...⏳
