Bitbucket Pipelines & Variables
I worked on setting up a continuous delivery pipeline for our team, so when code is merged to a test branch, it’s automatically deployed to a test environment. I wanted to reuse the deployment pipelines already written because they do a good job handling the nuances between each microservice we deploy.
My solution at a high level was a lot like this:
- Code is merged to a `test-branch` branch.
- A Bitbucket pipeline is kicked off that triggers the deploy.
- The deploy script deploys a microservice to an environment, `test`.
This seemed really straightforward on the surface, except it's a little harder than I initially thought.
Let’s imagine our Bitbucket pipelines look like this:
```yaml
# deploys service-a
deploy-service-a: &deploy-service-a
  step:
    image: node:12
    name: service-a
    script:
      - cd service-a
      - serverless deploy --stage ${STAGE}

# sets the STAGE variable that's defined when the pipeline is manually triggered
set-variables: &set-variables
  variables:
    - name: STAGE
      default: "${STAGE}"

pipelines:
  # our custom pipeline, what shows up in the Bitbucket web app
  custom:
    deploy:
      - <<: *set-variables
      - <<: *deploy-service-a
```
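For context, the `--stage` flag feeds the Serverless Framework's stage option. A minimal `serverless.yml` for `service-a` might consume it like this (a sketch; the service, runtime, and function names are illustrative, not from the original setup):

```yaml
# hypothetical serverless.yml for service-a
service: service-a

provider:
  name: aws
  runtime: nodejs12.x
  # picks up the --stage flag passed by the pipeline, defaulting to "dev"
  stage: ${opt:stage, 'dev'}

functions:
  hello:
    handler: handler.hello
```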
How this works:
- A developer manually triggers the `deploy` pipeline in the Bitbucket web app (or via the API, as sketched after this list), specifying the `STAGE` variable.
- The value entered for `STAGE` is assigned to the `STAGE` variable via `set-variables`.
- `deploy-service-a` is kicked off, executes, and deploys new code for `service-a` in the appropriate stage.
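As an aside, the manual trigger doesn't have to go through the web UI: Bitbucket's 2.0 REST API can start a custom pipeline and pass variables to it. A sketch, assuming a workspace `my-workspace` and repository `my-repo` (the workspace, repo, and credentials are all placeholders):

```bash
# Trigger the "deploy" custom pipeline on master with STAGE=test.
curl -X POST \
  -u "username:app_password" \
  -H "Content-Type: application/json" \
  "https://api.bitbucket.org/2.0/repositories/my-workspace/my-repo/pipelines/" \
  -d '{
    "target": {
      "type": "pipeline_ref_target",
      "ref_type": "branch",
      "ref_name": "master",
      "selector": { "type": "custom", "pattern": "deploy" }
    },
    "variables": [
      { "key": "STAGE", "value": "test" }
    ]
  }'
```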
Struggle
This works well for manual deployments. However, I needed this to be triggered automatically, so I updated the `pipelines` object and added:
```yaml
pipelines:
  branches:
    test-branch:
      - <<: *deploy-service-a
```
Theoretically, this triggers the deploy whenever there is a commit to `test-branch`; however, it fails spectacularly because the `STAGE` variable is never defined: unlike a manually triggered `custom` pipeline, a branch pipeline never prompts for variables.
I tried the following:
```yaml
pipelines:
  branches:
    variables:
      - name: STAGE
        default: test
    test-branch:
      - <<: *deploy-service-a
```
I soon found out that you can only add a `variables` object to `custom` pipelines. Honestly, this is where I started to lose my mind, having spent hours on something that should be simple and straightforward.
Solution
Fortunately for me, I was able to string together the correct sequence of characters and words, execute the perfect Google search, and find a solution on Stack Overflow.
First, there were a few limitations of Bitbucket I hadn’t realized until I went on this journey:
- Only `custom` pipelines can have a `variables` object.
- You cannot pass variables between `steps` (because each step is its own deploy, with its own scope, Docker image, variables, etc.).
However, I learned I can get around this by using `artifacts`. We'd normally use artifacts to save packaged binaries or test results produced during a Bitbucket pipeline run, but they can be used to pass data between steps as well.
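To make the mechanism concrete, here's a minimal sketch of passing a file between steps via artifacts (the step names and file are illustrative):

```yaml
pipelines:
  default:
    - step:
        name: Produce
        script:
          - echo "hello" > shared.txt
        # files matching these paths are saved and restored in later steps
        artifacts:
          - shared.txt
    - step:
        name: Consume
        script:
          # the artifact from the previous step is available here
          - cat shared.txt
```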
Armed with my newfound knowledge, my Bitbucket pipelines were refactored into:
```yaml
deploy-service-a: &deploy-service-a
  step:
    image: node:12
    name: service-a
    script:
      # if an upstream step exported variables via the set_env.sh artifact, load them
      - if [ -e set_env.sh ]; then
      -   cat set_env.sh
      -   source set_env.sh
      - fi
      - cd service-a
      - serverless deploy --stage ${STAGE}

set-variables: &set-variables
  variables:
    - name: STAGE
      default: "${STAGE}"

pipelines:
  custom:
    deploy:
      - <<: *set-variables
      - <<: *deploy-service-a
  branches:
    test-branch:
      - step:
          name: Save stage
          script:
            - STAGE="test"
            - echo $STAGE
            - echo "export STAGE=$STAGE" >> set_env.sh
          artifacts:
            - set_env.sh
      - <<: *deploy-service-a
```
This works perfectly. If you look closely at the output when running this Bitbucket pipeline, you'll see something in `deploy-service-a`'s `Build setup` phase that resembles:
Artifact "set_env.sh": Downloading
Artifact "set_env.sh": Downloaded 140 B in 0 seconds
Artifact "set_env.sh": Extracting
Artifact "set_env.sh": Extracted in 0 seconds
Artifacts from previous steps are passed to subsequent steps!
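The same trick scales to more than one variable; a hypothetical extension of the `Save stage` step (the `REGION` variable here is illustrative):

```yaml
- step:
    name: Save environment
    script:
      # append one export per variable you want downstream steps to see
      - echo "export STAGE=test" >> set_env.sh
      - echo "export REGION=us-east-1" >> set_env.sh
    artifacts:
      - set_env.sh
```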
🧇