Testing & deploying Google Cloud Functions in BitBucket Pipelines
This guide will run you through how to set up BitBucket Pipelines to test and deploy your Google Cloud Function. Compared to AWS, we have found the documentation for Google Cloud Functions and the best practices for deploying from CI lacking, so we wanted to document the process to help the next soul. We will cover:
- Exporting a service account key file, which is used to authorise the deployment of the function
- Creating a custom role with only the permissions required to deploy the function
- Configuring BitBucket Pipelines
Firstly, we need to create a service account, which is necessary for deploying the function (read more here on what it actually is – https://cloud.google.com/iam/docs/creating-managing-service-account-keys). In the Google Cloud Console, ensure you have selected the project you wish to work within in the top navigation bar.
In the navigation menu, open the service accounts page – IAM & admin > Service accounts. Click + CREATE SERVICE ACCOUNT at the top.
Give your service account a meaningful name, such as BitBucket CI Deployment.
Click next to reach the Service Account Permissions step, and add the role Service Account User.
Click next and you will come to the final screen.
Click create key and a JSON key file will be downloaded – keep this safe. This JSON key file is your service account key file, and it is what we will use to authenticate in order to deploy the cloud function.
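If you prefer the command line, the console steps above can also be sketched with gcloud. The account name, project ID, and resulting email below are placeholders for illustration:

```shell
# Create the service account (the name is our own choice).
gcloud iam service-accounts create bitbucket-ci-deployment \
  --display-name "BitBucket CI Deployment"

# Grant it the Service Account User role on the project.
gcloud projects add-iam-policy-binding my-project-id \
  --member "serviceAccount:bitbucket-ci-deployment@my-project-id.iam.gserviceaccount.com" \
  --role roles/iam.serviceAccountUser

# Export a JSON key file – the equivalent of the create key step.
gcloud iam service-accounts keys create service-account.json \
  --iam-account bitbucket-ci-deployment@my-project-id.iam.gserviceaccount.com
```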
We will now create the custom role. Go to IAM & admin > Roles, click + Create Role at the top, and give this role the name BitBucket CI Deployment Role.
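The same role can be created from the CLI. The permission list below is our assumption of a minimal set for deploying an HTTP-triggered function – verify it against your own deploys and trim or extend as needed:

```shell
# Create a custom role with only the assumed deploy permissions.
gcloud iam roles create BitBucketCIDeploymentRole \
  --project my-project-id \
  --title "BitBucket CI Deployment Role" \
  --permissions cloudfunctions.functions.create,cloudfunctions.functions.get,cloudfunctions.functions.sourceCodeSet,cloudfunctions.functions.update,cloudfunctions.operations.get

# Grant the new role to the service account (email is a placeholder).
gcloud projects add-iam-policy-binding my-project-id \
  --member "serviceAccount:bitbucket-ci-deployment@my-project-id.iam.gserviceaccount.com" \
  --role "projects/my-project-id/roles/BitBucketCIDeploymentRole"
```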
We will keep a template file, service-account.dist.json, in the repository; it mirrors the structure of the JSON key file that was downloaded previously when creating the service account, with each secret value replaced by a placeholder. We will be using the BitBucket Repository Variables to store the actual credentials from your service account JSON file.
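A sketch of what service-account.dist.json might look like – the fields follow the standard service account key format, while the placeholder variable names are our own convention:

```json
{
  "type": "service_account",
  "project_id": "${GCLOUD_PROJECT_ID}",
  "private_key_id": "${GCLOUD_PRIVATE_KEY_ID}",
  "private_key": "${GCLOUD_PRIVATE_KEY}",
  "client_email": "${GCLOUD_CLIENT_EMAIL}",
  "client_id": "${GCLOUD_CLIENT_ID}",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "${GCLOUD_CLIENT_X509_CERT_URL}"
}
```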
Head over to your BitBucket Repo and go to Settings > Repository Variables. Add the following environment variables.
The value for each of these comes from the service account JSON key file that you downloaded earlier. So, for example, you will take the value of project_id and input it into GCLOUD_PROJECT_ID.
Now create the file bitbucket-pipelines.yml with the following contents:
```yaml
image: node:8 # Node image for the test steps; matches the nodejs8 runtime below

definitions:
  steps:
    - step: &tests
        name: Run Tests
        script:
          - npm install
          - npm test

pipelines:
  default:
    - step: *tests
  branches:
    master:
      - step: *tests
      - step:
          name: Deploy to Production
          deployment: production # can be test, staging or production
          trigger: manual
          image: google/cloud-sdk:latest
          script:
            - apt-get -y install gettext-base
            - envsubst < service-account.dist.json > service-account.json
            - gcloud auth activate-service-account --key-file=service-account.json
            - gcloud functions deploy MyFunction --runtime nodejs8 --trigger-http --project $GCLOUD_PROJECT_ID
```
First we create a test step definition. These are useful as they allow us to define a step once without having to repeat it for each particular branch. This step uses a Node image to install the dependencies and then run the tests.
In the pipelines node, we create a default node which simply states that for every branch we run the test step. In the master node we invoke the test step again and then create a new step that deploys our Google Cloud Function. We set this as a deployment step (read more here – https://confluence.atlassian.com/bitbucket/bitbucket-deployments-940695276.html) and specify that it has to be triggered manually.
We then use the google/cloud-sdk Docker image, which provides all the tools required to run the Cloud SDK commands.
We install gettext-base, which gives us envsubst; this interpolates the environment variables set up in the BitBucket Repository Variables into service-account.dist.json and then writes the output to service-account.json.
The script then authenticates with Google Cloud using the service-account.json file, which now contains the values of the service account created above, and finally deploys the function.
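Once the deploy succeeds, you can look up the function's trigger URL and smoke-test it. The region and project in the example URL are placeholders:

```shell
# Print the HTTPS trigger URL of the deployed function.
gcloud functions describe MyFunction --format "value(httpsTrigger.url)"

# Call the function over HTTP; URL shape shown is an example for us-central1.
curl "https://us-central1-my-project-id.cloudfunctions.net/MyFunction"
```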