DOP-4127: Monorepo builds without Makefiles (#938)
* experiments, added nextGenParse call to jobHandler

* stage ecs - no makefiles

* alter snooty version

* next gen html

* command key

* parse

* catch

* logger

* remove error

* more logging

* cwd

* repoDir correct

* quotations

* successful parse? odd address

* oas page build

* xlarge

* persistence module

* quote

* no parse

* try catch parse

* type error

* no parse, use persistence and build

* log errors

* localApp working

* remove /dist

* add build deps & nextgenhtml

* add logging to build deps

* log html

* remove build deps for envs

* env vars

* build deps before executeBuild

* staging monorepo jobs

* comment

* localApp type cleanup

* stage

* deploy

* clean fs

* status

* no event body

* throw new error

* commented out all unnecessary steps

* explicitly call build steps

* remove steps that should not be used

* include job inqueue for error

* log status

* Empty-Commit

* comment

* used process.env to find URL and BUCKET

* save localApp

* uncomment prepNextGenBuild

* remove throw error

* not build on preprd

* buildCommands array

* log publish

* use normal deploy

* deploy

* add cp commands

* cd snooty

* log outputs

* ref repoDir rather than cwd

* correct reposDir vs repoDir

* add slash

* add repos to prodFileName

* log build deps

* change prodFIleName to cwd/snooty

* run parse

* remove deploy

* override build in stagingJobHandler

* override build, set up debugger

* vscode launch

* remove dist

* remove dist

* use project

* readd build deps

* add cp and cd commands to nextGenHtml

* cp correct paths

* remove first cp and cd

* remove cp

* ref snooty filepath correctly

* dockerfile snooty branch mm-log

* remove first cp

* cd ..

* commands

* stringify, logs

* use chdir

* cp second

* log in clicommand

* pass logger to mut publish

* dotcomstg -> stg

* checkout and pull branch

* add pull repo and change clone

* commented out incorrect code

* local run works for both cloud-docs and monorepo/cloud-docs

* log event and boyd

* log out trigger build

* log repo name and feat flag

* feature flag

* ssmprefix

* env

* dist

* remove feature flag for feature branch build

* log why no paths

* change slash of path

* clean

* organize code

* clean

* redoc

* fixes from merge

* comment out builddeps, use redoc rc

* duplicate clone

* clean, get bucket and url

* log env vars

* pass logger to getEnvVar

* takeover preprd

* add to v1?

* log which build

* force directory

* add directory to debug command and local build

* remove from preprd

* clean up

* keep conditionals for buildCommands in normal build

* further cleaning

* remove logs

* number of logs

* curl into repoDir/targetDir

* try new flow of logging

* remove comments

* allow output and error text to be returned from nextGenStage

* clean logs

* revert targetDir for downloadBuildDependencies

* clean, wrapWithBenchmark

* PR feedback

* [DOP-4127]: Update dockerfile.local to have redoc installed properly

* [DOP-4127]: Use new SQS queue URL

* [DOP-4127]: Install redoc bundle

* [DOP-4127]: Revert how redoc is installed

* PR feedback, second round

* replace useWithBenchmarks with isNextGen

* source patchId from getBuildAndGetDependencies

* [DOP-4204]: Update README for local Autobuilder (#954)

* [DOP-4204]: Add help command

* [DOP-4204]: Typo

* [DOP-4204]: Formatting

* [DOP-4024]: Update README.md

* [DOP-4204]: Troubleshooting

* [DOP-4204]: Fix typos

* [DOP-4204]: Rename images and move into shared folder

* [DOP-4204]: Update troubleshooting

* added logging and error throwing in wrapWithBenchmark

* conditionally write patchId and commitHash env vars

* remove logger as parameter of wrapWithBenchmarks

* [DOP-4127]: Update error logging and other misc changes

* [DOP-4127]: Fix tests

* [DOP-4127]: Fix up localbuild (hopefully)

* small clean-up changes

* clean

* dockerfile misspelling

* [DOP-4127]: Fix up localbuild oas-page-builder

* [DOP-4127]: Fix up localbuild

* [DOP-4127]: Fix readme error

* preprd

* Update README.md

Co-authored-by: rayangler <[email protected]>

* remove preprd

---------

Co-authored-by: branberry <[email protected]>
Co-authored-by: rayangler <[email protected]>
3 people authored Jan 8, 2024
1 parent e47c263 commit d5194e9
Showing 46 changed files with 2,879 additions and 389 deletions.
14 changes: 8 additions & 6 deletions .vscode/launch.json
Original file line number Diff line number Diff line change
@@ -5,13 +5,15 @@
"version": "0.2.0",
"configurations": [
{
"name": "Docker: Attach to Autobuilder",
"type": "node",
"request": "launch",
"name": "Launch Program",
"skipFiles": ["<node_internals>/**"],
"program": "${workspaceFolder}/modules/oas-page-builder/tests/unit/services/pageBuilder.test.ts",
"preLaunchTask": "tsc: build - tsconfig.json",
"outFiles": ["${workspaceFolder}/build/**/*.js"]
"request": "attach",
"port": 9229,
"address": "localhost",
"localRoot": "${workspaceFolder}",
"remoteRoot": "/home/docsworker-xlarge",
"autoAttachChildProcesses": false,
"protocol": "inspector"
}
]
}
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,3 @@
{
"debug.javascript.autoAttachFilter": "disabled"
}
78 changes: 49 additions & 29 deletions Dockerfile.local
@@ -1,4 +1,4 @@
FROM arm64v8/ubuntu:20.04
FROM arm64v8/ubuntu:20.04 as initial
ARG NPM_BASE_64_AUTH
ARG NPM_EMAIL
ARG SNOOTY_PARSER_VERSION=0.15.0
@@ -7,10 +7,13 @@ ARG MUT_VERSION=0.10.7
ARG REDOC_CLI_VERSION=1.2.3
ARG NPM_BASE_64_AUTH
ARG NPM_EMAIL
ARG WORK_DIRECTORY=/home/docsworker-xlarge

ENV DEBIAN_FRONTEND=noninteractive
WORKDIR ${WORK_DIRECTORY}

# helper libraries for docs builds
RUN apt-get update && apt-get install -y vim git unzip zip
RUN apt-get update && apt-get install -y vim git unzip zip chromium-browser rsync

# get node 18
# https://gist.github.com/RinatMullayanov/89687a102e696b1d4cab
@@ -20,7 +23,6 @@ RUN apt-get install --yes nodejs
RUN apt-get install --yes build-essential
RUN apt-get install --yes python3-pip libxml2-dev libxslt-dev python-dev pkg-config

WORKDIR /app

RUN python3 -m pip install poetry

@@ -39,53 +41,66 @@ RUN git clone -b v${MUT_VERSION} --depth 1 https://github.com/mongodb/mut.git \
&& make package \
&& mv dist/mut /opt/

RUN curl -L -o redoc.zip https://github.com/mongodb-forks/redoc/archive/refs/tags/v${REDOC_CLI_VERSION}.zip \
&& unzip redoc.zip \
&& mv redoc-${REDOC_CLI_VERSION} redoc/

ENV PATH="${PATH}:/opt/snooty:/opt/mut:/app/.local/bin"
ENV PATH="${PATH}:/opt/snooty:/opt/mut:/${WORK_DIRECTORY}/.local/bin"

# setup user and root directory
RUN useradd -ms /bin/bash docsworker
RUN chmod 755 -R /app
RUN chown -Rv docsworker /app
USER docsworker
RUN useradd -ms /bin/bash docsworker-xlarge
RUN chmod 755 -R ${WORK_DIRECTORY}
RUN chown -Rv docsworker-xlarge ${WORK_DIRECTORY}
USER docsworker-xlarge

# install snooty frontend and docs-tools
RUN git clone -b v${SNOOTY_FRONTEND_VERSION} --depth 1 https://github.com/mongodb/snooty.git \
&& cd snooty \
&& npm ci --legacy-peer-deps --omit=dev

RUN curl https://raw.githubusercontent.com/mongodb/docs-worker-pool/meta/makefiles/shared.mk -o shared.mk


RUN git clone -b @dop/redoc-cli@${REDOC_CLI_VERSION} --depth 1 https://github.com/mongodb-forks/redoc.git redoc \
# Install dependencies for Redoc CLI
&& cd redoc/ \
&& npm ci --prefix cli/ --omit=dev

FROM initial as persistence

RUN mkdir -p modules/persistence && chmod 755 modules/persistence
COPY modules/persistence/package*.json ./modules/persistence/
RUN cd ./modules/persistence \
&& npm ci --legacy-peer-deps
# Build persistence module

COPY --chown=docsworker-xlarge modules/persistence/tsconfig*.json ./modules/persistence
COPY --chown=docsworker-xlarge modules/persistence/src ./modules/persistence/src/
COPY --chown=docsworker-xlarge modules/persistence/index.ts ./modules/persistence

RUN cd ./modules/persistence \
&& npm run build:esbuild

FROM initial as oas

RUN mkdir -p modules/oas-page-builder && chmod 755 modules/oas-page-builder
COPY modules/oas-page-builder/package*.json ./modules/oas-page-builder/
RUN cd ./modules/oas-page-builder \
&& npm ci --legacy-peer-deps
# Build modules
# OAS Page Builder
COPY --chown=docsworker-xlarge modules/oas-page-builder/tsconfig*.json ./modules/oas-page-builder
COPY --chown=docsworker-xlarge modules/oas-page-builder/src ./modules/oas-page-builder/src/
COPY --chown=docsworker-xlarge modules/oas-page-builder/index.ts ./modules/oas-page-builder

# Root project build
COPY package*.json ./
RUN npm ci --legacy-peer-deps
# Build persistence module
RUN cd ./modules/oas-page-builder \
&& npm run build:esbuild

COPY --chown=docsworker modules/persistence/tsconfig*.json ./modules/persistence
COPY --chown=docsworker modules/persistence/src ./modules/persistence/src/
COPY --chown=docsworker modules/persistence/index.ts ./modules/persistence
FROM initial as root

RUN cd ./modules/persistence \
&& npm run build
COPY --from=persistence --chown=docsworker-xlarge ${WORK_DIRECTORY}/modules/persistence/dist/ ./modules/persistence
COPY --from=oas --chown=docsworker-xlarge ${WORK_DIRECTORY}/modules/oas-page-builder/dist/ ./modules/oas-page-builder

# Build modules
# OAS Page Builder
COPY --chown=docsworker modules/oas-page-builder/tsconfig*.json ./modules/oas-page-builder
COPY --chown=docsworker modules/oas-page-builder/src ./modules/oas-page-builder/src/
COPY --chown=docsworker modules/oas-page-builder/index.ts ./modules/oas-page-builder
# Root project build
COPY package*.json ./
RUN npm ci --legacy-peer-deps

RUN cd ./modules/oas-page-builder \
&& npm run build

COPY tsconfig*.json ./
COPY config config/
@@ -94,8 +109,13 @@ COPY src src/

RUN npm run build:esbuild

ENV PERSISTENCE_MODULE_PATH=${WORK_DIRECTORY}/modules/persistence/index.js
ENV OAS_MODULE_PATH=${WORK_DIRECTORY}/modules/oas-page-builder/index.js
ENV REDOC_PATH=${WORK_DIRECTORY}/redoc/cli/index.js

RUN mkdir -p modules/persistence && chmod 755 modules/persistence
RUN mkdir repos && chmod 755 repos

EXPOSE 3000

CMD ["node", "--enable-source-maps", "dist/entrypoints/localApp.js"]
CMD ["node", "--inspect-brk=0.0.0.0", "--enable-source-maps", "dist/entrypoints/onDemandApp.js"]
104 changes: 68 additions & 36 deletions README.md
@@ -21,49 +21,81 @@ To add a new property:
- Go to `infrastructure/ecs-main/ecs-service.yml` `TaskDefinition` section
- Add the new property to the `ContainerDefinitions`/`Environment` section

## Build and Run Docker Image for local testing
## Debug the Autobuilder for Local Testing

The npm build args are required for the portion of the dockerfile that installs the [snooty-frontend]. `NPM_CONFIG__AUTH`
and `NPM_CONFIG_EMAIL` are environment variables available in our working directory. `NPM_CONFIG_{OPTION}` environment
variables can actually be used instead of the `~/.npmrc` file. The reason we need the build args to be `NPM_BASE_64_AUTH`
and `NPM_EMAIL` is because that's what's expected in the `.npmrc` within [snooty-frontend].
### Setup

```shell
docker build --tag=workerpool --build-arg NPM_BASE_64_AUTH=${NPM_CONFIG__AUTH} --build-arg NPM_EMAIL=${NPM_CONFIG_EMAIL} .
```
To debug the Autobuilder for local testing, you first need to ensure the following has been done:

```shell
docker run \
--env MONGO_ATLAS_USERNAME \
--env MONGO_ATLAS_PASSWORD \
--env AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY \
--env GITHUB_BOT_USERNAME \
--env GITHUB_BOT_PASSWORD \
--env DB_NAME \
--env XLARGE \
--env SNOOTY_ENV \
--env FASTLY_TOKEN \
--env FASTLY_DOCHUB_MAP \
--env FASTLY_SERVICE_ID \
workerpool
```
1. Docker is running
2. The `~/.aws/credentials` file contains unexpired credentials for the `default` profile

- `MONGO_ATLAS_USERNAME` and `MONGO_ATLAS_PASSWORD` is username/password of atlas database
- `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` are needed for uploading to S3 via [mut](https://github.com/mongodb/mut)
- `GITHUB_BOT_USERNAME` and `GITHUB_BOT_PASSWORD` are needed so builder can access private repos
- `DB_NAME` allows the indication of a pool database (pool, pool_test)
- `XLARGE` true or false indicates whether this instance will run on an XLARGE server or not
- `SNOOTY_ENV` indicates whether the build environment is stage, prod, or dev
- `FASTLY_TOKEN` is needed for connecting to the Fastly edge dictionary
- `FASTLY_DOCHUB_MAP` is the id of the redirect map that we publish dochub links to
- `FASTLY_SERVICE_ID` is the id of the service used for dochub
To retrieve credentials, head to AWS and, under `Docs Platform`, click on `Command line or programmatic access`.
![AWS console](images/aws-console-admin.png)

If you are running a local version of the docker image for testing, we have a separate staging environment setup. Testing in this environment is automated through the "stage" branch. Add the following env variables to the `docker run` command:
Copy the value in option 2, `Manually add a profile to your AWS credentials file (Short-term credentials)`.

![AWS credentials file](images/aws-credentials.png)

From there, paste this value in `~/.aws/credentials`, and replace the randomly generated profile name (which looks something like `[123456789_AdministratorAccess]`) with `[default]`.
You should now have the correct credentials to run the debugger.
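For reference, the resulting `~/.aws/credentials` entry should look roughly like the sketch below (the key values are placeholders, not real credentials):

```
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = wJalrEXAMPLESECRETKEY
aws_session_token = FwoGEXAMPLESESSIONTOKEN...
```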

_**NOTE: Credentials expire quickly. It's unclear exactly how long they last, but in my experience they expire after approximately 30 minutes.**_

You should now be all set to run the debugger command:

`npm run debug`

To view all of the options for the command, you can run:

`npm run debug -- --help`

Here is an example of running the local debugger for `cloud-docs`:

`npm run debug -- -o 10gen -n cloud-docs`

Here is an example of running the local debugger for `docs-monorepo/docs-landing` on branch `groot`:

`npm run debug -- -o 10gen -n docs-monorepo -d cloud-docs -b groot`

By default, the environment that is used for the local Autobuilder is `stg`.

### Debugger Behavior

When the command is run, there are several steps that occur before the Autobuilder begins:

1. Environment variables and other information are pulled from Parameter Store
2. The GitHub repository is queried for data to create the job
3. The container is built
- NOTE: If you have not run the debug command before, the build will take a substantial amount of time (approximately 10-15 minutes).
Subsequent builds are much faster, especially when only code has changed: after the initial build, a code-only change should take just a few seconds to build and run. Changes such as updating the version of the Snooty Parser or the Redoc CLI cause much longer builds, but those happen far less frequently. The majority of builds should take on the order of a few seconds.
4. The data from step 2 is then added as a record in the `pool_test.queue`.
5. The container is then run, and waits for the user to connect to it via the VSCode debugger.

Once the container starts successfully, you should see something like the following message:

`Debugger listening on ws://0.0.0.0:9229/....`

To connect, click on the debug tab on the left side of your VSCode editor. Make sure the dropdown to the right of the green play button is set to the `Docker: Attach to Autobuilder` configuration. Press the green play button, and you will attach to the container.

### Troubleshooting

The most frequent causes of build failures are expired AWS credentials or Docker not running. Also, if you haven't run `npm ci` in a while, you will need to do so, as a new dependency was added to run the command.

Occasionally, errors may occur inexplicably, and the error messages may seem unrelated to any change made. Oftentimes, running the following commands can resolve these sporadic issues:

```sh
docker image prune
docker container prune
```

Another potential error could be due to `Dockerfile.local` not being updated. If you are not seeing changes that appear in the Autobuilder in other environments, this may be why; for example, `Dockerfile.local` could be using an older version of the Snooty Parser.

![VSCode debugger](images/vsode-debugger.png)

By default, the container breaks at the first line of code, in a file called `bind.js`. Press the fast-forward (continue) button to resume execution. You can also add other breakpoints to stop the application. Once the application completes, press `CTRL + C` in the terminal to exit the connection to the container.

If you receive `CredentialsProviderError: Could not load credentials from any providers`, make sure the `AWS_PROFILE` environment variable is not set to a different profile anywhere (such as in a global `~/.zshrc` file). Otherwise, ensure that `AWS_PROFILE` matches the profile defined in `~/.aws/credentials`.
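A quick way to check for a stray profile override is sketched below; the `aws sts` call is an assumption that you have the AWS CLI installed, so adjust it to your setup:

```shell
# Print the active profile override, if any; "unset" means the
# default profile from ~/.aws/credentials will be used.
echo "AWS_PROFILE is: ${AWS_PROFILE:-unset}"

# Confirm the default profile's credentials are valid and unexpired
# (uncomment if the AWS CLI is available):
# aws sts get-caller-identity --profile default
```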

## Run Tests

24 changes: 23 additions & 1 deletion api/controllers/v1/github.ts
@@ -6,6 +6,7 @@ import { RepoBranchesRepository } from '../../../src/repositories/repoBranchesRe
import { markBuildArtifactsForDeletion, validateJsonWebhook } from '../../handlers/github';
import { DocsetsRepository } from '../../../src/repositories/docsetsRepository';
import { ReposBranchesDocsetsDocument } from '../../../modules/persistence/src/services/metadata/repos_branches';
import { PushEvent } from '@octokit/webhooks-types';

async function prepGithubPushPayload(
githubEvent: any,

Check warning on line 12 in api/controllers/v1/github.ts (GitHub Actions / test): Unexpected any. Specify a different type
@@ -77,7 +78,28 @@ export const TriggerBuild = async (event: any = {}, context: any = {}): Promise<
body: errMsg,
};
}
const body = JSON.parse(event.body);
if (!event.body) {
const err = 'Trigger build does not have a body in event payload';
consoleLogger.error('TriggerBuildError', err);
return {
statusCode: 400,
headers: { 'Content-Type': 'text/plain' },
body: err,
};
}

let body: PushEvent;
try {
body = JSON.parse(event.body) as PushEvent;
} catch (e) {
consoleLogger.error('[TriggerBuild]', `ERROR! Could not parse event.body ${e}`);
console.log(`event: ${event} and event body: ${event.body}`);
return {
statusCode: 502,
headers: { 'Content-Type': 'text/plain' },
body: ' ERROR! Could not parse event.body',
};
}

if (body.deleted) {
return {
2 changes: 1 addition & 1 deletion api/controllers/v1/jobs.ts
@@ -90,7 +90,7 @@ export const HandleJobs = async (event: any = {}): Promise<any> => {
await SubmitArchiveJob(jobId);
break;
default:
consoleLogger.error(jobId, 'Invalid status');
consoleLogger.error(jobId, `Invalid status: ${jobStatus}`);
break;
}
} catch (err) {
12 changes: 7 additions & 5 deletions api/controllers/v2/github.ts
@@ -18,7 +18,8 @@ async function prepGithubPushPayload(
githubEvent: PushEvent,
repoBranchesRepository: RepoBranchesRepository,
prefix: string,
repoInfo: ReposBranchesDocsetsDocument
repoInfo: ReposBranchesDocsetsDocument,
directory?: string
): Promise<Omit<EnhancedJob, '_id'>> {
const branch_name = githubEvent.ref.split('/')[2];
const branch_info = await repoBranchesRepository.getRepoBranchAliases(
@@ -53,6 +54,7 @@
urlSlug: urlSlug,
prefix: prefix,
project: project,
directory: directory,
},
logs: [],
};
@@ -110,7 +112,7 @@ export const TriggerBuild = async (event: APIGatewayEvent): Promise<APIGatewayPr
async function createAndInsertJob(path?: string) {
const repoInfo = await docsetsRepository.getRepo(body.repository.name, path);
const jobPrefix = repoInfo?.prefix ? repoInfo['prefix'][env] : '';
const job = await prepGithubPushPayload(body, repoBranchesRepository, jobPrefix, repoInfo);
const job = await prepGithubPushPayload(body, repoBranchesRepository, jobPrefix, repoInfo, path);

consoleLogger.info(job.title, 'Creating Job');
const jobId = await jobRepository.insertJob(job, c.get('jobsQueueUrl'));
@@ -128,10 +130,10 @@ export const TriggerBuild = async (event: APIGatewayEvent): Promise<APIGatewayPr
ownerName: body.repository.owner.name,
updatedFilePaths: getUpdatedFilePaths(body.head_commit),
});
consoleLogger.info('monoRepoPaths', `Monorepo Paths with new changes: ${monorepoPaths}`);
consoleLogger.info(body.repository.full_name, `Monorepo Paths with new changes: ${monorepoPaths}`);
}
} catch (error) {
console.warn('Warning, attempting to get repo paths caused an error', error);
consoleLogger.warn('Warning, attempting to get monorepo paths caused an error', error);
}

/* Create and insert Job for each monorepo project that has changes */
@@ -141,7 +143,7 @@ export const TriggerBuild = async (event: APIGatewayEvent): Promise<APIGatewayPr
if (path.split('/').length > 1) continue;

try {
await createAndInsertJob(`/${path}`);
await createAndInsertJob(path);
} catch (err) {
return {
statusCode: 500,
4 changes: 2 additions & 2 deletions api/controllers/v2/jobs.ts
@@ -72,7 +72,7 @@ export const HandleJobs = async (event: SQSEvent): Promise<void> => {
await notifyBuildSummary(jobId);
break;
default:
consoleLogger.error(jobId, 'Invalid status');
consoleLogger.error(jobId, `Invalid status: ${jobStatus}`);
break;
}
} catch (err) {
@@ -150,7 +150,7 @@ async function NotifyBuildProgress(jobId: string): Promise<void> {
return;
}

const jobTitle = fullDocument.title;
const jobTitle = `${fullDocument.title}${fullDocument.payload.directory ? `/${fullDocument.payload.directory}` : ''}`;
const username = fullDocument.user;
const repoEntitlementRepository = new RepoEntitlementsRepository(db, c, consoleLogger);
const entitlement = await repoEntitlementRepository.getSlackUserIdByGithubUsername(username);
