Automatic Drupal updates using composer and GitHub Actions
We’ve been testing different services to update Drupal automatically, such as Renovate and Dependabot. They all have more or less the same features and even though they work great on paper, we always ended up having the same issue: database updates.
For example:
- Commerce module receives an update that adds a new field and a dependency on the Token module.
- Dependabot/Renovate updates Commerce to the latest version.
- The next deployment installs the updated Commerce, runs config import and database updates, and installs the missing field and Token dependency.
- At this point, your active configuration (in database) has the required changes, but the configuration stored in Git does not.
- The next deployment will run the configuration import again and override all changes done by the previous deployment, meaning the code depending on that new field or the Token module will suddenly break, as the sketch below illustrates.
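For context, a typical deployment script looks roughly like this sketch (an assumption about the deployment pipeline, not something from the example repository); the config:import step is what reverts the changes:

# Hypothetical deployment script, run on every deployment.
composer install --no-dev
./vendor/bin/drush updb -y           # run pending database update hooks
./vendor/bin/drush config:import -y  # import the configuration stored in Git
./vendor/bin/drush cache:rebuild
# On the deployment after the Commerce update, config:import re-imports the stale
# configuration from Git and removes the field created by the update hook, because
# that change was never exported back to the repository.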
While you can work around this by triggering a secondary GitHub Actions workflow against Dependabot/Renovate pull requests that runs the required tasks and exports the changed configuration, it just felt overly complicated. This made us question whether we actually needed sophisticated update management at all. The answer was no, not really.
What we actually needed was just a way to run composer update drupal/*.
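As a small practical note, quoting the pattern keeps the shell from expanding it (composer resolves the wildcard itself), and the --with-dependencies / --with-all-dependencies flags let composer bump transitive dependencies too; for example:

# Update every Drupal extension in one go.
composer update "drupal/*" --with-all-dependencies

# Or limit the update to a single package.
composer update drupal/commerce --with-dependencies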
Automating the update process
Create a .github/workflows/update.yml file:
name: Update Drupal
on:
  # Workflow_dispatch will allow you to run this manually from Actions tab.
  # See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows for
  # available triggers.
  workflow_dispatch:
jobs:
  update:
    runs-on: 'ubuntu-latest'
    services:
      db:
        image: mariadb
        env:
          MYSQL_USER: drupal
          MYSQL_PASSWORD: drupal
          MYSQL_DATABASE: drupal
          MYSQL_ROOT_PASSWORD: drupal
        ports:
          - 3306:3306
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Setup PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: '8.3'
          extensions: gd, pdo_mysql
      - name: Install Drupal using existing configuration
        run: |
          composer install
          ./vendor/bin/drush site-install --existing-config
          # Flush caches and re-run config:import to make sure everything
          # is up-to-date.
          ./vendor/bin/drush cr && ./vendor/bin/drush config:import -y
      - name: Update packages
        run: |
          # You can update whatever you want here. We only update commerce
          # to keep things simple.
          composer update drupal/commerce
          # Run update hooks.
          ./vendor/bin/drush updb
          # Export configuration changes done by update hooks.
          ./vendor/bin/drush config:export
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v6
        with:
          commit-message: Update configuration
          title: Automatic update
          body: |
            - Updated active configuration.
          branch: update-configuration
The example above uses GitHub Actions to install Drupal using an existing configuration, update the necessary packages and then export changed files to the Git repository.
In the simplest terms, all it does is:
- Install Drupal: drush site-install --existing-config
- Flush caches and re-import configuration to make sure everything is up-to-date: drush cache:rebuild, drush config:import
- Update required dependencies. For example, if you wish to update all composer dependencies, you can just run composer update.
- Run update hooks and export configuration: drush updb, drush config:export
- Commit changed files to the Git repository. The example uses the peter-evans/create-pull-request action to create a pull request, so we can run tests/code checks against the changes.
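If you want to try the same sequence locally before wiring it into a workflow, it boils down to a handful of commands; a rough sketch, assuming a local environment where Drush can already reach a database:

composer install
./vendor/bin/drush site-install --existing-config
./vendor/bin/drush cr && ./vendor/bin/drush config:import -y
composer update drupal/commerce
./vendor/bin/drush updb
./vendor/bin/drush config:export
# Commit composer.lock and the exported configuration files, then open a pull request.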
The action can be run manually by going to Actions -> Update Drupal -> Run workflow on GitHub.
See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows for available workflow events.
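If you would rather not start the updates by hand, the same workflow can also be triggered on a schedule; for example (the cron expression here is just an illustration):

on:
  workflow_dispatch:
  # Check for updates automatically every Monday at 06:00 UTC.
  schedule:
    - cron: '0 6 * * 1'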
You can see this example in action here: tuutti/drupal-automatic-updates-example/pull/1. The Commerce module is updated from 2.18 to the latest version, and the changed configuration is exported to Git.
Speeding things up
Installing Drupal from scratch with existing configuration can take a really long time, especially for more complex sites.
One way to speed things up is to create a “reference” database dump and install Drupal using that dump.
Create a .github/workflows/database-artifact.yml file:
on:
  # Allow workflow to be run manually.
  workflow_dispatch:
  # This will re-create the database dump once a week (every sunday at 00:00).
  schedule:
    - cron: '0 0 * * 0'
name: Create a database dump
jobs:
  build:
    runs-on: ubuntu-latest
    services:
      db:
        image: mariadb
        env:
          MYSQL_USER: drupal
          MYSQL_PASSWORD: drupal
          MYSQL_DATABASE: drupal
          MYSQL_ROOT_PASSWORD: drupal
        ports:
          - 3306:3306
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 1
      - name: Setup PHP
        uses: shivammathur/setup-php@v2
        with:
          php-version: '8.3'
          extensions: gd, pdo_mysql
      - name: Setup Drupal
        run: |
          composer install
          ./vendor/bin/drush site-install --existing-config
      - name: Update config
        # Flush caches and re-run config:import to make sure
        # everything is up-to-date.
        run: ./vendor/bin/drush cr && ./vendor/bin/drush config:import -y
      - name: Export database dump
        run: ./vendor/bin/drush sql-dump --result-file=${GITHUB_WORKSPACE}/latest.sql
      - name: Upload latest database dump
        uses: actions/upload-artifact@v4
        with:
          name: latest.sql
          path: latest.sql
          retention-days: 10
The example above uses the actions/upload-artifact action to store the database dump as a “workflow artifact”. See Storing workflow data as artifacts. The workflow is run once a week, and the dump is stored for 10 days to leave some leeway in case the workflow fails for some reason.
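Because the workflow also has a workflow_dispatch trigger, a fresh dump can be generated on demand if a scheduled run fails, for example with the GitHub CLI (assuming gh is authenticated against the repository):

# Trigger the dump workflow manually instead of waiting for the next scheduled run.
gh workflow run database-artifact.yml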
We can then use the GitHub CLI (gh) to download the stored database dump and use it in our Update Drupal action:
- name: Download latest database dump
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  run: |
    # Use github-cli to download the database dump.
    # Alternatively, you can use https://github.com/actions/download-artifact
    gh run download -n latest.sql
- name: Install Drupal from database dump
  run: |
    composer install
    # Import existing database instead of installing Drupal manually.
    $(./vendor/bin/drush sql:connect) < latest.sql
    # Flush caches and re-run config:import to make sure everything is up-to-date.
    ./vendor/bin/drush cr && ./vendor/bin/drush config:import -y
This has the added benefit that the same dump can also be used to speed up your integration tests (Behat, Robot Framework, and so on).
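For example, a test job could reuse the artifact instead of reinstalling Drupal from scratch; a sketch along the same lines (the Behat step is hypothetical and depends entirely on your project setup):

- name: Download latest database dump
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  run: gh run download -n latest.sql
- name: Prepare Drupal for tests
  run: |
    composer install
    $(./vendor/bin/drush sql:connect) < latest.sql
    ./vendor/bin/drush cr && ./vendor/bin/drush config:import -y
- name: Run Behat tests
  run: ./vendor/bin/behat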