Cloud Resume Challenge (Azure) [Part 3 of 3]


Introduction

If you've made it this far, thanks for tagging along with me! If you have no idea what I'm talking about, you should probably go check out Part 1 and Part 2 of my Cloud Resume Challenge journey.

A quick recap before we get started: in Part 1 of this series, we looked at what a static website consists of and the bare necessities of setting up the Azure resources required to host it (by the rules of the challenge, at least - there are other ways to host static websites in Azure). In Part 2, we got a little more complex and used JavaScript to implement a dynamic view counter; since that counter needs a permanent server-side component, we made it fancy by connecting it to a Cosmos DB instance through an Azure Function.

As we're wrapping everything up, we need a nice bow to put on top, and we're going to make that bow with the final components of the challenge - IaC, Source Control, and CI/CD (and I'll be doing the official blog post as requested in step 16 - stay tuned).

Step 12: Infrastructure as Code (IaC)

So, I approached this a little differently than the challenge recommends - it says to use an ARM template for the CosmosDB instance and Azure Function. I initially built the DB in the Portal and used VS Code for the Function App, because I wasn't really comfortable using the CLI to do either since I'd never set them up before. This step then kind of rolled into the next three (Steps 14 and 15, more so than 13). I'll go over how I implemented IaC in those steps, but - Spoiler - I'll go ahead and post the Terraform files here, starting with the front-end .tf:

terraform {
  required_version = ">= 1.5.7"
  # This 'backend' section is required to store the .tfstate in a
  # separate storage account, so it persists
  backend "azurerm" {
    resource_group_name  = "sy4cloud"
    storage_account_name = "sy4tfstate"
    container_name       = "sy4resume-tfstate"
    key                  = "sy4resume-frontend.tfstate"
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "sy4rg" {
  name     = "sy4rgresume"
  location = "North Central US"
}

resource "azurerm_storage_account" "sy4sa" {
  name                     = "sy4saresume"
  resource_group_name      = azurerm_resource_group.sy4rg.name
  location                 = azurerm_resource_group.sy4rg.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  account_kind             = "StorageV2"

  static_website {
    index_document     = "index.html"
    error_404_document = "404.html"
  }
}

resource "azurerm_cdn_profile" "sy4cdn-profile" {
  name                = "sy4cdnresume"
  location            = azurerm_resource_group.sy4rg.location
  resource_group_name = azurerm_resource_group.sy4rg.name
  sku                 = "Standard_Microsoft"
}

resource "azurerm_cdn_endpoint" "sy4cdn-endpoint" {
  name                = "sy4endresume"
  profile_name        = azurerm_cdn_profile.sy4cdn-profile.name
  location            = azurerm_resource_group.sy4rg.location
  resource_group_name = azurerm_resource_group.sy4rg.name
  is_http_allowed     = false
  is_https_allowed    = true
  origin_host_header  = azurerm_storage_account.sy4sa.primary_web_host
  content_types_to_compress = [
    "text/html",
    "text/css",
    "application/javascript",
  ]
  querystring_caching_behaviour = "IgnoreQueryString"

  origin {
    name      = "sy4resume"
    host_name = azurerm_storage_account.sy4sa.primary_web_host
  }
}

resource "azurerm_cdn_endpoint_custom_domain" "sy4cdn-custom-domain" {
  name            = "sy4resume"
  cdn_endpoint_id = azurerm_cdn_endpoint.sy4cdn-endpoint.id
  host_name       = "resume.seanyoung.me"

  cdn_managed_https {
    certificate_type = "Dedicated"
    protocol_type    = "ServerNameIndication"
    tls_version      = "TLS12"
  }
}
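One addition I'd suggest on top of this file (these outputs aren't in my original config - the names are illustrative, and the attributes are what I believe the azurerm provider exposes, so treat this as a sketch): a couple of output blocks so terraform apply prints the URLs you need to verify the site.

```hcl
# Illustrative outputs (not part of my original file) - handy for
# grabbing the storage and CDN endpoints right after an apply
output "static_site_url" {
  value = azurerm_storage_account.sy4sa.primary_web_endpoint
}

output "cdn_endpoint_fqdn" {
  value = azurerm_cdn_endpoint.sy4cdn-endpoint.fqdn
}
```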

And, the back-end .tf:

terraform {
  required_version = ">= 1.5.7"
  backend "azurerm" {
    resource_group_name  = "sy4cloud"
    storage_account_name = "sy4tfstate"
    container_name       = "sy4resume-tfstate"
    key                  = "sy4resume-backend.tfstate"
  }
}

provider "azurerm" {
  features {}
}

resource "azurerm_resource_group" "sy4rg-backend" {
  name     = "sy4rgbackend"
  location = "North Central US"
}

resource "azurerm_storage_account" "sy4sa-backend" {
  name                     = "sy4sabackend"
  resource_group_name      = azurerm_resource_group.sy4rg-backend.name
  location                 = azurerm_resource_group.sy4rg-backend.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}

resource "azurerm_cosmosdb_account" "sy4cosmos" {
  name                = "sy4dbresume"
  location            = azurerm_resource_group.sy4rg-backend.location
  resource_group_name = azurerm_resource_group.sy4rg-backend.name
  offer_type          = "Standard"
  kind                = "GlobalDocumentDB"

  enable_automatic_failover = true
  enable_free_tier          = true

  consistency_policy {
    consistency_level = "Session"
  }

  cors_rule {
    allowed_headers = ["*"]
    allowed_methods = ["GET", "POST"]
    allowed_origins = ["https://resume.seanyoung.me"]
    exposed_headers = ["*"]
  }

  geo_location {
    location          = azurerm_resource_group.sy4rg-backend.location
    failover_priority = 0
  }
}

resource "azurerm_cosmosdb_sql_database" "sy4db" {
  name                = "AzureResume"
  resource_group_name = azurerm_resource_group.sy4rg-backend.name
  account_name        = azurerm_cosmosdb_account.sy4cosmos.name
  throughput          = 400
}

resource "azurerm_cosmosdb_sql_container" "sy4container" {
  name                = "VisitorCount"
  resource_group_name = azurerm_resource_group.sy4rg-backend.name
  account_name        = azurerm_cosmosdb_account.sy4cosmos.name
  database_name       = azurerm_cosmosdb_sql_database.sy4db.name
  partition_key_path  = "/id"
  throughput          = 400
}

resource "azurerm_service_plan" "sy4sp" {
  name                = "sy4aspbackend"
  location            = azurerm_resource_group.sy4rg-backend.location
  resource_group_name = azurerm_resource_group.sy4rg-backend.name
  os_type             = "Linux"
  sku_name            = "Y1"
}

resource "azurerm_application_insights" "sy4ai" {
  name                = "sy4aibackend"
  location            = azurerm_resource_group.sy4rg-backend.location
  resource_group_name = azurerm_resource_group.sy4rg-backend.name
  application_type    = "web"
}

resource "azurerm_linux_function_app" "sy4app" {
  name                                           = "sy4faresume"
  location                                       = azurerm_resource_group.sy4rg-backend.location
  resource_group_name                            = azurerm_resource_group.sy4rg-backend.name
  service_plan_id                                = azurerm_service_plan.sy4sp.id
  storage_account_name                           = azurerm_storage_account.sy4sa-backend.name
  storage_account_access_key                     = azurerm_storage_account.sy4sa-backend.primary_access_key
  builtin_logging_enabled                        = false
  webdeploy_publish_basic_authentication_enabled = false
  https_only                                     = true

  app_settings = {
    "APPINSIGHTS_INSTRUMENTATIONKEY" = azurerm_application_insights.sy4ai.instrumentation_key
    "FUNCTIONS_WORKER_RUNTIME"       = "python"
    "FUNCTIONS_EXTENSION_VERSION"    = "~4"
    "COSMOS_DB_KEY"                  = azurerm_cosmosdb_account.sy4cosmos.primary_key
    "COSMOS_DB_URL"                  = azurerm_cosmosdb_account.sy4cosmos.endpoint
    "SCM_DO_BUILD_DURING_DEPLOYMENT" = "true"
  }

  site_config {
    cors {
      allowed_origins = ["https://resume.seanyoung.me"]
    }
    application_stack {
      python_version = "3.11"
    }
  }
}
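As with the front-end file, a couple of outputs make smoke-testing easier after an apply. Again, this is a sketch rather than part of my actual config - I believe these are the attribute names the azurerm provider exposes:

```hcl
# Illustrative outputs (not in my original file)
output "function_hostname" {
  value = azurerm_linux_function_app.sy4app.default_hostname
}

output "cosmos_endpoint" {
  value = azurerm_cosmosdb_account.sy4cosmos.endpoint
}
```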

Step 13: Source Control

I guess I kind of cheated on this step. I've been using source control since the inception of this project - you may remember, if you paid attention to the commands in the first post, that the local directory on my computer is labeled 'source_control' and I keep all the directories inside that synced with GitHub repos. So this step was easy.

Step 14: CI/CD (Back-End)

So, I actually did this step and the next step in reverse order. Skip down a little bit, read that one, and then come back. I'll wait. 🙂

Since I did the front-end code first and was already using source control and GitHub Actions for that, I did the same thing here... mostly. Source control was a given, but I struggled with the GitHub Actions for deploying this to my Azure Function - when I deployed via GitHub Actions, it would fail, but when I deployed via VS Code, it worked as intended. That was maddening, but such is the learning process sometimes. Then I stumbled upon a post from last year (2023) stating that GitHub Actions could not contact the Microsoft SCM servers because they'd been moved to a private subnet... So - I switched over to Azure Pipelines to do the Function deployment! Nothing special to report here, because Pipelines made it for me (I chose the "Python Function App on Linux" template) - though I did have to wait a few days for my free "hosted parallelism" grant in order to run jobs, and I had to update the Python version it wanted to use. After that, it just worked.
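For reference, the pipeline Azure generated for me looked roughly like the sketch below. This is from memory rather than a copy of my actual file - the service connection name is a placeholder, and your paths may differ:

```yaml
# Rough shape of the "Python Function App on Linux" template
# ('<my-service-connection>' is a placeholder, not my real value)
trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.11'   # the version bump I mentioned above

  - script: |
      pip install --target=".python_packages/lib/site-packages" -r requirements.txt
    displayName: 'Install dependencies'

  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
      includeRootFolder: false
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: '<my-service-connection>'
      appType: 'functionAppLinux'
      appName: 'sy4faresume'
      package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
```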

I also tussled with a Cosmos DB issue where the Python API couldn't update or retrieve the viewer count. It took me a few hours to figure out it was just a typo in my Terraform config. There were so many tear-downs and rebuilds.

Step 15: CI/CD (Front-End)

I jumped the gun here, too, honestly. After I finished the steps in Part 1, I started looking into what was required to hook that up via GitHub Actions. Luckily I found this in the Microsoft documentation, and I just made a single change - before the upload-batch step that copies the files up (which I initially ran with --overwrite true at the end), I added a step to clear the $web container so I'd start fresh each time.

And then... I went hard. What I had was cool - I hadn't set up GitHub Actions before - but it wasn't quite enough for me. I could deploy changes to my site with that, but IaC is powerful... very powerful. So, I did what any sane person would do and ripped it all down, then built it back up again entirely with Terraform, which naturally broke everything (I could have imported the existing Azure resources into Terraform, but I wanted the full experience). My main source of reference for this part was this blog post by Thomas Thornton. After some trial and error, and some waiting - it's the cloud, after all - I was back in business. I initially ran the Terraform from my computer to create the infrastructure, which worked out because I also had to adjust the permissions on the service principal accounts and whatnot. I then added the Terraform steps to my GitHub Action to make sure everything stays nice and tidy afterwards, though that's not strictly necessary.

name: CRC Front-End CI

on:
  push:
    branches: [ main ]

jobs:
  terraform:
    name: 'Terraform'
    env:
      ARM_CLIENT_ID: ${{ secrets.AZURE_AD_CLIENT_ID }}
      ARM_CLIENT_SECRET: ${{ secrets.AZURE_AD_CLIENT_SECRET }}
      ARM_SUBSCRIPTION_ID: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
      ARM_TENANT_ID: ${{ secrets.AZURE_AD_TENANT_ID }}
    runs-on: ubuntu-latest

# ... the Terraform job from the mentioned blog is here ... #

  build:
    needs: terraform
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v3
    - uses: azure/login@v1
      with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

    - name: Delete current contents of $web
      uses: azure/CLI@v1
      with:
        inlineScript: |
            az storage blob delete-batch --account-name sy4saresume --auth-mode key -s '$web'
    - name: Upload to blob storage
      uses: azure/CLI@v1
      with:
        inlineScript: |
            az storage blob upload-batch --account-name sy4saresume --auth-mode key -d '$web' -s website --overwrite true
    - name: Purge CDN endpoint
      uses: azure/CLI@v1
      with:
        inlineScript: |
            az cdn endpoint purge --content-paths "/*" --profile-name "sy4cdnresume" --name "sy4endresume" --resource-group "sy4rgresume"

    # Azure logout
    - name: logout
      run: |
        az logout
      if: always()

Conclusion

Man, what a project. The difficulty curve (for me, at least - remember, I have very little programming/coding experience) was quite a bit steeper than I expected, but I feel so good for completing it. I learned a lot and bolstered the existing Azure knowledge I did have. I know my current setup isn't completely automated, because of service principal accounts and GitHub Secrets - I'm not sure if I can "fix" that in the future, or if it's good enough as it is ("Perfect is the enemy of good," or however the saying goes). Stay tuned for my follow-up (Part 4 of 3, I guess) where I do the official blog post.
