- Follow the instructions in Setting Up OAuth 2.0
- For the "Application Type", choose "Web application"
- Add a URI to "Authorized JavaScript origins":
http://localhost
- Add a URI to "Authorized redirect URIs":
http://localhost:3000/api/auth/callback/google
- Save the Client ID and the Client secret somewhere safe; we will use these during the blog application setup
- Clone the repository using:
git clone [email protected]:CPSC319-2022/Continuus.git
- Go to: https://nodejs.org/en/
- Download the LTS version
- Follow the setup instructions
- Run:
npm --version
- If you don’t get a version number (the error message is something like “npm: command not found”), try restarting your terminal, checking the PATH environment variable, and, failing that, restarting your system
- You can install dependencies using (run in the project root folder):
npm install
- Make a copy of the .env.example file and rename the copy to .env
- Set the variables in the .env to appropriate values
- DATABASE_URL: Use the default value, if needed you can update this depending on your Postgres setup (next section)
- NEXTAUTH_URL: If you are developing locally, leave this as default
- GOOGLE_CLIENT_ID & GOOGLE_CLIENT_SECRET: Set these to the client id and client secret you have saved from the Google OAuth Setup section
- NEXTAUTH_SECRET: Follow the instructions in the .env.example file to generate a secret for it
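Taken together, a filled-in `.env` might look like the sketch below. Every value is a placeholder; the `DATABASE_URL` shape assumes the local Postgres setup described in the next section, and `openssl rand -base64 32` is just one common way to generate a secret:

```
# Database (local Postgres with user `postgres`, password `password`)
DATABASE_URL="postgresql://postgres:password@localhost:5432/continuus"

# NextAuth
NEXTAUTH_URL="http://localhost:3000"
NEXTAUTH_SECRET="paste-the-generated-secret-here"   # e.g. from `openssl rand -base64 32`

# Google OAuth (from the Google OAuth Setup section)
GOOGLE_CLIENT_ID="xxxxxxxx.apps.googleusercontent.com"
GOOGLE_CLIENT_SECRET="xxxxxxxx"
```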
- Install a PostgreSQL database locally
- IMPORTANT: Set your username to postgres and password to password for the defaults in .env.example to work. Follow How to Run and Setup a Local PostgreSQL Database | Prisma
- Open the psql terminal
- Run the command
CREATE DATABASE continuus;
- Run:
npx prisma migrate dev
- This will update the database using the schema and also generate the Prisma Client
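For context, `prisma migrate dev` reads its connection from the datasource block in the Prisma schema, which in a setup like this presumably looks something like the sketch below (the actual schema ships with the repository; this is only illustrative):

```
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}

generator client {
  provider = "prisma-client-js"
}
```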
- Use `npm run dev` to start a dev server on your local machine
- You have 2 options; choose only one of them: forking the repo, or creating a new repo
- Forking the repo is easier
- If you already have changes in your local copy, creating a new repo is the easier route
- Go to https://github.com/CPSC319-2022/Continuus
- Click on the "Fork" button
- Enter the repository name as you wish
- Deselect the "Copy the dev branch only" option
- Press the "Create fork" button
- The repo you created will have the 'dev', 'qa' and 'prod' branches by default.
- If you have not cloned the repository previously
- Clone the repository using:
git clone [email protected]:CPSC319-2022/Continuus.git
- Create a new repository in GitHub
- Deselect 'Add a README file' option
- Do NOT add .gitignore
- Do NOT add a licence
- Set the remote origin url using:
git remote set-url origin [github ssh address]
- Then rename the default branch to dev and push:
git branch -M dev
git push -u origin dev
- The repo will NOT have the 'qa' and 'prod' branches by default, so feel free to create them now or later.
We designed the CI/CD pipeline around a development workflow that uses branch rules to prevent regressions. This setup is optional, but strongly recommended.
- Follow the GitHub guideline to create branch rules, with branch name patterns:
- `**/**`
- Applies to all feature branches
- Allow force pushes for everyone
- Allow deletions
- `dev`, `qa` and `prod`
- Create a new rule for each of these branches
- Require a pull request before merging & Require approvals of at least 1 person (depending on your team size)
- Require status checks to pass before merging & Require branches to be up to date before merging
- Add the Google Cloud Build status checks as required, if you don't have them listed here, come back to this step later when you have the CI/CD setup finished
- Require conversation resolution before merging
- Do not allow bypassing the above settings
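For reference, the dev/qa/prod rules above correspond roughly to this branch-protection payload in the GitHub REST API (a sketch only; the `contexts` list is left empty because the required Cloud Build status-check names depend on your own setup):

```json
{
  "required_status_checks": {
    "strict": true,
    "contexts": []
  },
  "enforce_admins": true,
  "required_pull_request_reviews": {
    "required_approving_review_count": 1
  },
  "required_conversation_resolution": true,
  "restrictions": null,
  "allow_force_pushes": false,
  "allow_deletions": false
}
```

Here `strict: true` maps to "Require branches to be up to date before merging" and `enforce_admins: true` maps to "Do not allow bypassing the above settings".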
- Follow instructions here to create a Postgres instance on Google Cloud
- Store the password for the postgres user somewhere safe
- You don't need to "configure a password policy for the instance"
- Database version should be PostgreSQL 14
- Make sure to note down the region you selected, you will need it during the CI-CD Setup
- You do not need to customize your instance (choose the free/default option)
- Follow instructions here to create a database
- Create 3 databases: `dev`, `qa` and `prod`
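If you prefer the CLI, the same three databases can be created with gcloud. This is a sketch: `postgres` below is assumed to be the instance name you chose while creating the Cloud SQL instance, so substitute your own:

```
for db in dev qa prod; do
  gcloud sql databases create "$db" --instance=postgres
done
```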
- The repository comes with configuration files for GitHub Actions and Google Cloud Platform. GitHub Actions coordinates build, test and deployment triggers for Google Cloud Build, while Google Cloud Build builds the containers and deploys the product.
- The GitHub Actions configuration files are found in .github/workflows/. The status of the stages defined in these yml files is shown in the GitHub Actions pipeline view
- The `dev-qa-prod.yml` coordinates building, testing and deployment with Google Cloud Build. It will run the pipeline for all commits to these branches (`dev`, `qa` and `prod`)
- The `feature-workflow.yml` coordinates building and testing with Google Cloud Build. It will run the pipeline for opened pull requests to `dev`, `qa` and `prod`
- The Google Cloud Build configuration files are found in cloudbuild/.
- For the Google Cloud triggers to work correctly, Google Cloud Build must be set up for the Github repository.
- Ensure that the Github repository is connected to Google Cloud Build. Follow the instructions in [Connect to a Github repository](https://cloud.google.com/build/docs/automating-builds/github/connect-repo-github?generation=1st-gen) to connect it.
- Add a trigger for each yaml in the cloudbuild folder. Set up each trigger to run on the "Manual invocation" event on branch name "*", configured with a Cloud Build configuration file pointing to one of the yamls in the cloudbuild folder
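The triggers can also be created from the CLI. This is a sketch under the assumption that `gcloud builds triggers create manual` is available in your gcloud version and that one of the yamls is named `cloudbuild/build.yaml` (both the flags and the config path are illustrative; verify with `gcloud builds triggers create manual --help` before running):

```
gcloud builds triggers create manual \
  --name=build \
  --repo=https://github.com/CPSC319-2022/Continuus \
  --repo-type=GITHUB \
  --branch=dev \
  --build-config=cloudbuild/build.yaml
```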
- Follow the instructions in Setting Up OAuth 2.0
- We will create 3 credentials, one for each environment
- Repeat steps 3-6 for each environment ('dev', 'qa', 'prod')
- For the "Application Type", choose "Web application"
- Add a URI to "Authorized JavaScript origins" (you can take the deployed URI from the CI-CD pipeline's deploy stage):
${deployed URI}
- e.g. https://dev-hmcu4gyu5a-pd.a.run.app, https://qa-hmcu4gyu5a-pd.a.run.app or https://prod-hmcu4gyu5a-pd.a.run.app
- Add a URI to "Authorized redirect URIs":
${deployed URI}/api/auth/callback/google
- e.g. https://dev-hmcu4gyu5a-pd.a.run.app/api/auth/callback/google, https://qa-hmcu4gyu5a-pd.a.run.app/api/auth/callback/google or https://prod-hmcu4gyu5a-pd.a.run.app/api/auth/callback/google
- Save the Client ID and the Client secret somewhere safe for each environment; we will use these during the secret manager setup
- Go to the Secret Manager service (https://cloud.google.com/secret-manager)
- Create secrets for each of these (names of secrets should be exact):
- `CLOUD_SQL_INSTANCE_NAME`: Take the instance name from the Cloud SQL instance
- e.g.: automatic-bot-376307:us-central1:postgres
- `DB_USERNAME`: The DB username (default name is `postgres`)
- `DB_PASSWORD`: The DB password (password of the db user; by default, you set this while setting up the instance)
- Per-environment secrets
- The next set of secrets are created for each of the environments
- `dev-GOOGLE_CLIENT_ID`: This is the Google OAuth Client ID for the dev environment
- `dev-GOOGLE_CLIENT_SECRET`: This is the Google OAuth Client Secret for the dev environment
- `dev-NEXTAUTH_SECRET`: Generate this (don't write something that is easily crackable)
- `dev-NEXTAUTH_URL`: The deployment link for the dev environment
- e.g.: https://dev-hmcu4gyu5a-pd.a.run.app
- And now we create the same set for the other 2 environments; follow the same instructions but use the associated environment's values instead of the `dev` environment's values
- `prod-GOOGLE_CLIENT_ID`
- `prod-GOOGLE_CLIENT_SECRET`
- `prod-NEXTAUTH_SECRET`
- `prod-NEXTAUTH_URL`
- `qa-GOOGLE_CLIENT_ID`
- `qa-GOOGLE_CLIENT_SECRET`
- `qa-NEXTAUTH_SECRET`
- `qa-NEXTAUTH_URL`
- Give the `Secret Manager Secret Accessor` role to the Cloud Build & Cloud Run service accounts
- This guide can be helpful: https://cloud.google.com/build/docs/securing-builds/use-secrets
- The application server will check if these secrets are set, and you can look at the logs at Cloud Run (or Cloud Build) to see if there are any errors.
- *NOTE: If you don't have values for these secrets at the moment, you can continue with the setup and create the secrets on the go.*
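As a sketch, each secret can also be created from the CLI with `gcloud secrets create` (all values below are placeholders; repeat the per-environment example for the `qa-` and `prod-` prefixes):

```
printf '%s' 'automatic-bot-376307:us-central1:postgres' | \
  gcloud secrets create CLOUD_SQL_INSTANCE_NAME --data-file=-

# per-environment example
openssl rand -base64 32 | gcloud secrets create dev-NEXTAUTH_SECRET --data-file=-
```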
A service account will be needed to perform specific actions in GCP services. For the services we are using (Cloud Build, Cloud Run, Artifact Registry, Cloud SQL, and Cloud Logging), a service account will need to be created with the following roles:
- Artifact Registry Administrator
- Artifact Registry Service Agent
- Artifact Registry Test
- Artifact Registry Writer
- Cloud Build Editor
- Cloud Run Admin
- Cloud Run Service Agent
- Cloud Run Service Role
- Cloud SQL Client
- Editor
- Logs View Accessor
- Logs Writer
- Secret Manager Secret Accessor
- Storage Object Viewer
Cloud Build is a GCP service that lets us perform specific build actions for different stages in our pipeline. See CI-CD Setup for the workflows we are using. We have connected Cloud Build and our GitHub repository using the Google Cloud Build GitHub Marketplace App.
A Cloud Build Trigger lets us define a sequence of instructions that can be run to help us perform all the actions required to build, test, and deploy our application to Cloud Run. A trigger will need to be created for each step in the pipeline (see here for how to create a trigger). We created 4 triggers:
- `build`
- Creates 2 Docker containers: one for deployment and one for testing
- Testing container
- Runs `npm install` to install all dependencies (required for running unit tests)
- Pushes container to Artifact Registry
- Deployment container
- Builds our application from the source code (runs `npm run build`)
- Pushes the build container to Artifact Registry
- Note: this trigger will fail if an error occurs in `npm run build` (e.g. linting errors)
- `feature-build`
- Same as the `build` trigger except it does not create/push a container to Artifact Registry for deployment, because builds in this stage will not be deployed to an environment
- `test`
- Pulls the Test container created in the `build` stage of the pipeline
- Runs unit tests with `npm run test`
- Note: this trigger will fail if any unit tests are not passing
- `deploy`
- Pulls the Deployment container created in the `build` stage of the pipeline
- Deploys the application to Cloud Run
Cloud Run is the runtime environment we are using for our application deployments. There is very little configuration needed for this part of the pipeline, as Cloud Build is capable of creating and deploying the applications directly from a build trigger. We are currently deploying applications to 3 environments: `dev`, `qa`, and `prod`. Our GitHub branch names are used to generate the URLs to which the applications are deployed. Here is the list of active deployments we have in Cloud Run:
- https://dev-hmcu4gyu5a-pd.a.run.app/
- https://qa-hmcu4gyu5a-pd.a.run.app/
- https://prod-hmcu4gyu5a-pd.a.run.app/
GitHub Actions is what we are using as our web UI to display information about the current status of builds. Actions lets us declare a sequence of jobs in a `.yml` file that kick off the triggers in our Cloud Build pipeline.
- Google Cloud Platform account
- GCP Cloud Build triggers set up for each build step
- `wait-for-build` script present in the `/scripts` directory
In order to get GitHub Actions working, a few repository secrets need to be specified. An account with proper authorization in the Continuus repository is required to complete this step.
Navigate to the Actions secrets and variables page in the repository and enter the following repository secrets:
- `GCP_BUILD_BUILD_ID`
- Description: Google Cloud Build trigger ID for the `build` stage of the pipeline
- Where to find: GCP Console > Cloud Build > Triggers > `build` trigger
- Parameter: uuid in the URL of the trigger
- `GCP_BUILD_DEPLOY_ID`
- Same as above for the `deploy` trigger
- `GCP_FEATURE_BUILD_ID`
- Same as above for the `feature-build` trigger
- `GCP_BUILD_TEST_ID`
- Same as above for the `test` trigger
- `GCP_CREDENTIALS`
- Description: GCP credentials file in JSON format for a service account with sufficient permissions (see IAM)
- Where to find: https://developers.google.com/workspace/guides/create-credentials#create_credentials_for_a_service_account
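If you use the GitHub CLI, the same repository secrets can be set from a terminal with `gh secret set`. This is a sketch: the trigger IDs are placeholders, and `service-account-key.json` is an assumed filename for the downloaded credentials file:

```
gh secret set GCP_BUILD_BUILD_ID --body "00000000-0000-0000-0000-000000000000"
gh secret set GCP_BUILD_DEPLOY_ID --body "00000000-0000-0000-0000-000000000000"
gh secret set GCP_FEATURE_BUILD_ID --body "00000000-0000-0000-0000-000000000000"
gh secret set GCP_BUILD_TEST_ID --body "00000000-0000-0000-0000-000000000000"
gh secret set GCP_CREDENTIALS < service-account-key.json
```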
A workflow .yml file lets us define the jobs for our GitHub Actions. Additional workflows can be added to the `.github/workflows` folder in the repository and triggered on any GitHub event (e.g. on pull request, push, etc.)
The Continuus repository currently has 2 workflows:
- `dev-qa-prod.yml`
- Initiates the `build`, `test`, and `deploy` triggers in Cloud Build (in that order)
- `feature-workflow.yml`
- Initiates the `feature-build` and `test` triggers in Cloud Build
The `wait-for-build` script monitors the status of a Cloud Build trigger and blocks the GitHub Actions pipeline from progressing until the current trigger has completed. The script polls for the status of the GCP Cloud Build trigger every 10 seconds. If the trigger is in progress, it will continue to poll.
If the build failed, an `exit 1` is sent to the GitHub Actions runner and the pipeline fails. If the build times out (takes more than 15 minutes), the build fails with an `exit 1` as well.
This script also prints the logs of a failed build and provides a link to the GCP log for the Cloud Build trigger. An example of the output of the `wait-for-build` script can be found here.
Each job in the workflow must call this script if it uses a trigger from Cloud Build. The script takes in 2 parameters: the Cloud Build trigger ID and the name of the current branch. An example of a call to `wait-for-build` in a GitHub Actions workflow file can be found here.
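The polling behaviour described above can be sketched as the loop below. This is illustrative only: `poll_status` is a stub standing in for the real Cloud Build status lookup that the actual script in `/scripts` performs, while the status names, the 10-second sleep and the 15-minute timeout mirror the description:

```shell
#!/bin/sh
# Illustrative sketch of the wait-for-build polling loop.
poll_status() {
  echo "SUCCESS"   # stub; a real lookup would return QUEUED/WORKING/SUCCESS/FAILURE/...
}

deadline=$(( $(date +%s) + 15 * 60 ))   # 15-minute timeout, as in the script
result=""
while [ -z "$result" ]; do
  status=$(poll_status)
  case "$status" in
    SUCCESS) result="passed" ;;
    FAILURE|TIMEOUT|CANCELLED) result="failed" ;;
    *)  # still queued or in progress: give up after the deadline, else poll again
        if [ "$(date +%s)" -ge "$deadline" ]; then result="failed"; else sleep 10; fi ;;
  esac
done
echo "build $result"
[ "$result" = "passed" ] || exit 1    # exit 1 makes the GitHub Actions job fail
```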
- Google Cloud project is set up.
- Billing is enabled.
- Enable the Cloud Functions and Pub/Sub APIs
- You can enable them using this link.
- Setup Incoming Webhook for Slack
- Follow the steps here. The webhook url will be useful in later steps.
- Create a Cloud Storage Bucket for staging the cloud function.
- Use the cloud shell terminal in the Google Cloud console. Ensure that the terminal session is for the correct GCP project (you should see the project ID in the terminal).
- A tip for choosing a bucket name is to use [PROJECT-ID]_cloudbuild
- Run:
gsutil mb gs://[BUCKET_NAME]
- In the cloud shell terminal:
- Create a directory called slackbot and run `cd slackbot`
- Create two files, app.js and package.json, with the command `touch app.js package.json`
- Using vim or an equivalent text editor, copy and paste the respective contents of the files from this gist, then follow these steps:
- Run `npm install` to install the dependencies
- On line 4 of app.js, change the `SLACK_WEBHOOK_URL` to the URL of your webhook created in step 2
- On line 61, change the `repoName` to the name of your repository
- Finally, on line 105, modify the URL to match that of your repository
- Deploy the Cloud Function
- Run the command below from the `slackbot/` directory, passing in the name of the bucket created in step 3 for [BUCKET_NAME]:
gcloud functions deploy subscribe --stage-bucket [BUCKET_NAME] --runtime=nodejs18 --trigger-topic cloud-builds