Apologies in advance for the long questions.
Background:
We are building a 4-step AWS pipeline to deploy our site to an ECS container.
The stages are as follows:
- Source
- Build
- Deploy
- Post-deploy
At the moment we have ECS commands running on the Deploy stage to execute the Drush commands:
- drush cim -y
- drush updb -y
- drush cr
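Roughly speaking, the Deploy stage runs these against the running task with something like the following (the cluster, task, and container names are placeholders, and the exact invocation is simplified here):

aws ecs execute-command \
  --cluster our-cluster \
  --task our-task-id \
  --container drupal \
  --interactive \
  --command "/bin/sh -c 'drush cim -y && drush updb -y && drush cr'"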
For some reason the ECS command execution is not consistent, and we need to run the pipeline twice in order to apply the configuration changes to Drupal. This is not ideal, so we came up with the idea of running the Drush commands during the Build stage, inside our Dockerfile:
FROM amd64/drupal:8-apache
ARG DRUPAL_ENV
ENV DRUPAL_ENV=$DRUPAL_ENV
ARG DB_HOST
ENV DB_HOST=$DB_HOST
ARG DB_PASSWORD
ENV DB_PASSWORD=$DB_PASSWORD
RUN apt-get update && apt-get install -y git
RUN rm -rf /var/www/html/*
COPY drupal.conf /etc/apache2/sites-enabled/000-default.conf
COPY memcache_client/amazon-elasticache-cluster-client.so /usr/local/lib/php/extensions/no-debug-non-zts-20190902
RUN echo "memory_limit = 512M" > /usr/local/etc/php/conf.d/memory_limit.ini
RUN echo "extension=amazon-elasticache-cluster-client.so" > /usr/local/etc/php/conf.d/memcache.ini
RUN echo "upload_max_filesize = 256M" > /usr/local/etc/php/conf.d/file_upload_limit.ini
RUN echo "post_max_size = 256M" > /usr/local/etc/php/conf.d/post_size_limit.ini
WORKDIR /opt/drupal
COPY . /opt/drupal/
RUN chown -R www-data:www-data docroot
RUN ln -sf /opt/drupal/docroot /var/www/html
RUN chmod -R 750 /var/www/html/docroot
# Run the Drush commands at build time (aliases don't persist across RUN steps)
RUN ./vendor/bin/drush updb -y
RUN ./vendor/bin/drush cim -y
RUN ./vendor/bin/drush cr
The above Dockerfile doesn't give us any errors, and the pipeline completes. We initially had a problem where Drush at this stage could not find the database. After some digging, we realised that during the Build stage it looks for the DB credentials defined in settings.local.php, whereas it is meant to be reading the credentials from settings.stg.php or settings.uat.php. So, just to get past this issue, we introduced a .env file so that we can pass the DB credentials for the particular environment (a sketch of how we feed it in follows the snippet below). We also updated our settings.local.php so that it can read the values from the .env:
$databases['default']['default'] = [
'database' => 'drupal',
'username' => 'admin',
'password' => $_ENV['DB_PASSWORD'],
'prefix' => '',
'host' => $_ENV['DB_HOST'],
'port' => '3306',
'namespace' => 'Drupal\\Core\\Database\\Driver\\mysql',
'driver' => 'mysql',
];
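For reference, the .env is just key/value pairs, and we feed the values into the build roughly like this; the --build-arg names match the ARGs declared in the Dockerfile above, while the values and the image tag here are placeholders:

# .env (placeholder values)
DB_HOST=our-rds-endpoint.example.amazonaws.com
DB_PASSWORD=not-the-real-password

# Build stage, after exporting the .env values into the shell
docker build \
  --build-arg DRUPAL_ENV=stg \
  --build-arg DB_HOST="$DB_HOST" \
  --build-arg DB_PASSWORD="$DB_PASSWORD" \
  -t drupal-site .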
I know this doesn't sound right at all, but we just wanted to see what happens. After the pipeline is executed, we log in to Drupal and can see the config changes for Local applied to Staging/UAT, which is incorrect. We assume this is because we run the Drush commands during the Build stage, where only the local config is accessible.
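A quick way to see what Drush bootstraps with at that point is to run it inside the freshly built image (the image tag is a placeholder, and the fields assume Drush 9+):

docker run --rm drupal-site ./vendor/bin/drush core:status --fields=db-hostname,db-name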
We have the following logic in our settings.php to fetch the settings file per environment:
// Set environment to local dev by default.
$env = 'local';
$is_aws_env = isset($_ENV['DRUPAL_ENVIRONMENT']);
// Override with the AWS environment when available.
if ($is_aws_env) {
  $env = $_ENV['DRUPAL_ENVIRONMENT'];
}
// Set site and environment specific settings.
$settings_file = DRUPAL_ROOT . "/sites/default/settings/settings.{$env}.php";
if (file_exists($settings_file)) {
  require $settings_file;
}
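For completeness, this is how we inspect which environment variables the built image actually carries (the image tag is a placeholder):

docker run --rm drupal-site php -r 'var_dump(getenv("DRUPAL_ENVIRONMENT"), getenv("DRUPAL_ENV"));'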
Questions:
- Why exactly is the Dockerfile (or the pipeline) looking for settings.local.php when it should be looking at the settings file for the current environment?
- If we are to run the Drush commands inside a Dockerfile, what is the best stage or way to do that without hacking settings.local.php the way we are doing now?
Thanks for taking the time to read this.