I recently ran into some trouble trying to update my blog engine (Ghost) to the latest version. I was more than a year behind, which represented about 29 updates 😳. As you can imagine, catching up didn't go so well. In fact, nothing at all was working after my first attempt. I even corrupted my database at some point. Fortunately, I had backups. In the end, I was able to correct all the issues, but I had to build some tooling to help me.

Then I remembered one of the first rules of continuous delivery:

If it hurts, do it more often

At that point, it was pretty clear that I needed a much better process to keep my blog engine up to date.

Context

To make it a bit easier to follow, I'll give you some context about my environment.
I run this blog on Azure using an App Service, which hosts Ghost on Node.js with SQLite as the underlying database. It's important to note that Ghost doesn't run out of the box on Azure. However, Yannick Reekmans has an excellent fork that takes care of all the details to host it in an Azure App Service.

Objectives

  • Kick off automatically on new versions of Ghost
  • Back up important data
  • Use App Service slots to safely deploy and test
  • Build and run the migration off the production servers to minimize downtime

Packaging

The first step really is to build a deployable package in a repeatable way. Let's see how we can do it using Azure DevOps.

Requirements:

Setup

  1. Create a new Build pipeline in Azure DevOps
    Make sure to use the classic editor

  2. Select your git repository

  3. Choose "Empty Job"

  4. Make sure to add the following variables

  • backupWebJobName
  • scmUsername
  • scmPassword
  • scmUrl


Tasks

Backup DB

This manually starts the WebJob that creates a backup of our SQLite DB and names it backup.db. I won't go over the details, but if you try to download the running DB directly, you'll get a file with 0 bytes because it's locked by another process. Fortunately, Tom Chantler made a simple WebJob that does the trick.
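
For the curious, the trick is SQLite's online backup API, which can copy a live database page by page even while it's locked. Tom's WebJob is written in C#, but here's a conceptual Node sketch of the same idea, assuming the better-sqlite3 package (purely illustrative; this pipeline uses the WebJob):

// backup-db.js - conceptual sketch only; the real backup is done by the C# WebJob.
// SQLite's online backup API copies a live, locked database safely,
// which a plain file copy cannot do.
const Database = require('better-sqlite3');

const db = new Database('content/data/ghost.db', { readonly: true });

db.backup('content/data/backup.db')
  .then(() => console.log('backup.db written'))
  .catch((err) => {
    console.error('Backup failed:', err);
    process.exitCode = 1;
  })
  .finally(() => db.close());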

Task type: Rest call

variables:
  backupWebJobName: 'BackupDb-Manual'

steps:
- task: Cdiscount.RestCall@0 # the "Rest call" task from Cdiscount's marketplace extension; verify the exact task reference in your organization
  displayName: 'Start DB backup web job'
  inputs:
    webserviceEndpoint: '$(Parameters.webserviceEndpoint)'
    relativeUrl: 'api/triggeredwebjobs/$(backupWebJobName)/run'
    httpVerb: POST

WebService endpoint
Create a generic service connection that points to your App Service Kudu dashboard. The format is: https://[YOUR_APP_SERVICE_NAME_HERE].scm.azurewebsites.net/

Screenshot: BackupDB-2
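
If you want to test that Kudu call outside the pipeline first, here's a minimal sketch, assuming Node 18+ (for the built-in fetch) and placeholder names for the app and credentials:

// trigger-backup.mjs - run with: node trigger-backup.mjs
// YOUR_APP_SERVICE_NAME is a placeholder; credentials come from environment variables.
const scmUrl = 'https://YOUR_APP_SERVICE_NAME.scm.azurewebsites.net';
const auth = Buffer.from(
  `${process.env.SCM_USERNAME}:${process.env.SCM_PASSWORD}`
).toString('base64');

// Kudu queues the run and answers right away; the job itself is asynchronous.
const res = await fetch(`${scmUrl}/api/triggeredwebjobs/BackupDb-Manual/run`, {
  method: 'POST',
  headers: { Authorization: `Basic ${auth}` },
});
console.log(res.status, res.statusText);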


Download images

This downloads the content/images folder. Since we use slot rotation, we need to copy all the data from the production slot to our dev slot.

Task type: Download file

steps:
- task: Fizcko.azure-devops-simple-download-file.DownloadFile@1 # Fizcko's "Download file" marketplace task; verify the exact task reference in your organization
  displayName: 'Download images'
  inputs:
    strUrl: '$(scmUrl)/api/zip/site/wwwroot/content/images/'
    strTargetDir: '$(Build.ArtifactStagingDirectory)'
    strTargetFilename: images.zip
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'

Screenshot: Download-images


Extract images

Now let's extract the images.zip we just downloaded into the file structure we pulled from git, merging everything together.

Task type: Extract files

steps:
- task: ExtractFiles@1
  displayName: 'Extract images.zip'
  inputs:
    archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/images.zip'
    destinationFolder: content/images

Screenshot: Extract-images


Download DB backup

Time to download our newly created backup DB (backup.db) and copy it into the file structure we pulled from git. Note that we save it as ghost.db, overwriting the copy that came from git. You might wonder why we don't do this right after the first step, where we kick off the WebJob that performs the backup. Well, the WebJob is fairly fast (~2 s), but it's also asynchronous, which means the backup file might not be complete when you try to download it. That's why we pushed this task a bit later in our pipeline: by the time it runs, the backup job has finished, so the backup.db file is safe to download.

Task type: Download file

steps:
- task: Fizcko.azure-devops-simple-download-file.DownloadFile@1 # Fizcko's "Download file" marketplace task; verify the exact task reference in your organization
  displayName: 'Download backup.db'
  inputs:
    strUrl: '$(scmUrl)/api/vfs/site/wwwroot/content/data/backup.db'
    strTargetDir: content/data/
    strTargetFilename: ghost.db
    authType: basic
    basicAuthUsername: '$(scmUsername)'
    basicAuthPassword: '$(scmPassword)'

Screenshot: Download-backup-db-1
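
By the way, if you'd rather not rely on task ordering alone, Kudu also exposes the status of a triggered WebJob, so you could poll until the run finishes before downloading. A sketch under the same assumptions as the earlier one (Node 18+, placeholder names):

// wait-for-backup.mjs - poll the WebJob until its latest run is no longer running.
// Note: assumes the run has already been queued by the earlier REST call.
const scmUrl = 'https://YOUR_APP_SERVICE_NAME.scm.azurewebsites.net';
const auth = Buffer.from(
  `${process.env.SCM_USERNAME}:${process.env.SCM_PASSWORD}`
).toString('base64');

let status = 'Running';
while (status === 'Running' || status === 'Initializing') {
  await new Promise((resolve) => setTimeout(resolve, 2000)); // wait 2 s between polls
  const res = await fetch(`${scmUrl}/api/triggeredwebjobs/BackupDb-Manual`, {
    headers: { Authorization: `Basic ${auth}` },
  });
  status = (await res.json()).latest_run?.status ?? 'Unknown';
}
console.log(`Backup job finished with status: ${status}`);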


Npm install

Nothing much to say here. Just download the trillion packages we need to run the database migration.

Task type: npm install

steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false

DB Migration

Let's run the DB migration scripts. This will migrate our SQLite DB to the latest schema. Ghost already provides these scripts for each version; we only need to launch its JavaScript DB migrator.

The Ghost server will run this database migration automatically on startup if needed. However, to minimize downtime while deploying a new version, it's better to run it as part of our build pipeline. This also allows us to inspect the DB before deploying it, you know... in case the migration went wrong. Cough... Cough... Murphy's law.

Task type: PowerShell

steps:
- powershell: 'node .\db.js'
  failOnStderr: true
  displayName: 'DB migration'
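
In case you're curious, db.js is essentially a tiny launcher around Ghost's migrator. A minimal sketch, assuming Ghost's knex-migrator package (the one Ghost itself uses for schema migrations) was pulled in by the npm install step:

// db.js - minimal sketch of a migration launcher, assuming knex-migrator.
const KnexMigrator = require('knex-migrator');

// By default, knex-migrator reads MigratorConfig.js from the current
// directory, which is the Ghost root in this pipeline.
const migrator = new KnexMigrator();

migrator
  .migrate()
  .then(() => console.log('DB migration completed'))
  .catch((err) => {
    console.error('DB migration failed:', err);
    process.exit(1); // non-zero exit code fails the pipeline step
  });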

Remove node_modules

node_modules is a really big folder and might cause issues if we add it to our deployment package. Let's remove it!

Task type: PowerShell

steps:
- powershell: 'Remove-Item .\node_modules -Recurse'
  failOnStderr: true
  displayName: 'Remove node_modules'

Archive files

It's now time to zip everything into one deployable package. The archive ends up in $(Build.ArtifactStagingDirectory) as $(Build.BuildId).zip, which is how the next task finds it by build number.

Task type: Archive file

steps:
- task: ArchiveFiles@1
  displayName: 'Archive files'
  inputs:
    rootFolder: '$(System.DefaultWorkingDirectory)'
    includeRootFolder: false

Screenshot: Archive


Copy Files

Copy the package, already named with the build number, from $(Build.ArtifactStagingDirectory) to the folder we'll publish. Naming by build number could be useful if we need to debug multiple deployments.

Task type: Copy files

steps:
- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)'
    Contents: '$(Build.BuildId).zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'

Screenshot: CopyFiles


Publish artifact

Finally, publish that package to Azure Pipelines so it will be available for our release pipeline later on.

Task type: Publish build artifacts

steps:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)\ArtifactsToBePublished'

Screenshot: Publish


Summary

You should now have a pipeline that looks like this:
Screenshot: Pipeline

Let's not forget to set up a trigger for new commits. To do so, go to the Triggers tab and check Enable continuous integration.

You are now all set, congratulations! Click Save & Queue and give it a try. In my next post, I'll show you how to deploy that package with zero downtime, all of that using Azure App Service slots.

Make sure to look at Part 2 to learn how to build the Release Pipeline for this package.