Migrate from Apollo codegen to graphql-code-generator #999

Merged: 7 commits, Aug 12, 2022
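For orientation: the PR swaps Apollo's codegen CLI for graphql-code-generator, but the new codegen config itself is not part of this excerpt. As a rough, non-authoritative sketch, a graphql-code-generator setup for a repo like this might look as follows; the file name, paths, and preset choice are assumptions for illustration, not taken from the PR.

```ts
// codegen.ts: hypothetical sketch only, not the config from this PR.
import type { CodegenConfig } from "@graphql-codegen/cli"

const config: CodegenConfig = {
  // Assumed schema source; could equally be a local .graphql file.
  schema: "http://localhost:4000/graphql",
  // Assumed location of the frontend's queries and mutations.
  documents: ["frontend/**/*.tsx", "frontend/graphql/**/*.ts"],
  generates: {
    // Types and typed documents emitted under /graphql, matching the
    // import-order group added to .prettierrc.js below.
    "frontend/graphql/generated/": {
      preset: "client",
    },
  },
}

export default config
```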
1 change: 1 addition & 0 deletions .eslintrc.js
```diff
@@ -48,6 +48,7 @@ module.exports = {
     "react-hooks/rules-of-hooks": "error",
     "@typescript-eslint/prefer-nullish-coalescing": "warn",
     "@typescript-eslint/prefer-optional-chain": "warn",
+    // complexity: "warn",
     // Place to specify ESLint rules. Can be used to overwrite rules specified from the extended configs
     // e.g. "@typescript-eslint/explicit-function-return-type": "off",
   },
```
4 changes: 3 additions & 1 deletion .prettierrc.js
```diff
@@ -11,7 +11,9 @@ module.exports = {
     "",
     "^[@]",
     "",
-    "^[./]",
+    "^[./](?!graphql)",
+    "",
+    "^/graphql",
   ],
  plugins: [require.resolve("@ianvs/prettier-plugin-sort-imports")],
  importOrderBuiltinModulesToTop: true,
```
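The effect of this change: imports of the generated `/graphql` module are pulled out of the generic relative-import group (`^[./](?!graphql)` now excludes them) and sorted into their own final group (`^/graphql`). Sorted imports should then come out roughly as below; the module names are invented for illustration.

```ts
import { useEffect } from "react"

import { something } from "@scoped/package"

import { helper } from "./utils"

import { CourseDocument } from "/graphql/generated"
```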
53 changes: 27 additions & 26 deletions README.md
````diff
@@ -1,57 +1,58 @@
-## Development environment
+# Development environment
 
-Note: Before starting development, please run the following command on the repo root:
+## Requirements
 
-```bash
-npm ci
-```
+Create `.env` files for backend and frontend. See examples and ask your local boffin for details.
 
-frontend:
+Install `docker-compose`, if not already installed.
 
-```bash
-cd frontend
-npm ci
-npm run dev
-```
+## Development workflow
 
-backend:
+Run `npm ci` in each of the root, backend and frontend directories to install dependencies.
+
+Create separate shells for the database container, backend and frontend:
 
 ```bash
 cd backend
 docker-compose up
 ```
 
+```bash
+cd frontend
+npm run dev
+```
+
 ```bash
 cd backend
 npm ci
 npm run migrate
 npm run dev
 ```
 
-## Generating GraphQL types for frontend
+If the database doesn't seem to do anything, i.e. no messages after the initial ones after running `docker-compose up` and the database queries are not getting through, run `docker-compose down` and try again. You can always run the database container in detached mode (`-d`) but then you won't see the logs live.
 
-If you make changes to the GraphQL schema or the queries in the frontend, you probably need to regenerate the TypeScript types. Run `npm run generate-graphql-types` in the frontend folder.
+Run `npm run prettier` in the root directory before committing. The commit runs hooks to check this as well as some linters, type checks etc.
 
-If you're getting errors about mismatching GraphQL versions, then try `npm i --legacy-peer-deps --save-dev apollo` to force it. This is because `apollo-tooling` keeps on packaging some old versions of dependencies instead of using the newer ones available.
-
-## Using installed `librdkafka` to speed up backend development
+<details>
+<summary>Using pre-built <code>librdkafka</code> to speed up backend development</summary>
 
-By default, `node-rdkafka` builds `librdkafka` from the source. This can take minutes on a bad day and can slow development down quite considerably. However, there's an option to use the version installed locally.
+By default, `node-rdkafka` builds `librdkafka` from the source. This can take minutes on a bad day and can slow development down quite considerably, especially when you're working with different branches with different dependencies and need to run `npm ci` often. However, there's an option to use the version installed locally.
 
 Do this in some other directory than the project one:
 
 ```bash
-wget https://github.com/edenhill/librdkafka/archive/v1.4.0.tar.gz -O - | tar -xz
-cd librdkafka-1.4.0
+wget https://github.com/edenhill/librdkafka/archive/v1.8.2.tar.gz -O - | tar -xz
+cd librdkafka-1.8.2
 ./configure --prefix=/usr
 make && make install
 ```
 
-(Ubuntu 20.04 and later seem to require v1.5.0, so change accordingly.)
-
-You may have to do some of that as root. Alternatively, you can install a prebuilt package - see [here](https://github.com/edenhill/librdkafka) for more information. Just be sure to install version >1.4.0.
+You may have to do some of that as root. Alternatively, you can install a prebuilt package - see [here](https://github.com/edenhill/librdkafka) for more information.
 
 Set the env `BUILD_LIBRDKAFKA=0` when doing `npm ci` or similar on the backend to skip the build.
 
-## Documentation
+</details>
 
-[Kafka](backend/docs/kafka.md)
+## More documentation
+
+- [Kafka](docs/kafka.md)
+- [GraphQL](docs/graphql.md)
````
3 changes: 2 additions & 1 deletion backend/.env.example
```diff
@@ -20,4 +20,5 @@ GOOGLE_CLOUD_STORAGE_BUCKET=x
 GOOGLE_CLOUD_STORAGE_PROJECT=x
 GOOGLE_CLOUD_STORAGE_KEYFILE=x
 
-UPDATE_USER_SECRET=secret
+UPDATE_USER_SECRET=secret
+BACKEND_URL=http://localhost:4000
```
1 change: 1 addition & 0 deletions backend/api/completions.ts
```diff
@@ -152,6 +152,7 @@ export class CompletionController {
     const { user } = getUserResult.value
     const { slug } = req.params
 
+    // TODO: typing
     let tierData: any = []
 
     const course = (
```
3 changes: 2 additions & 1 deletion backend/bin/kafkaConsumer/common/createKafkaConsumer.ts
```diff
@@ -16,7 +16,8 @@ import { KafkaError } from "../../lib/errors"
 import checkConnectionInInterval from "./connectedChecker"
 
 const logCommit =
-  (logger: winston.Logger) => (err: any, topicPartitions: any) => {
+  (logger: winston.Logger) =>
+  (err: any, topicPartitions: Kafka.TopicPartition[]) => {
     if (err) {
       logger.error(new KafkaError("Error in commit", err))
     } else {
       logger.info("Commit done", { topicPartitions })
```
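For context, a sketch of where a commit logger like this plugs in: node-rdkafka accepts an `offset_commit_cb` in the consumer config, which is called with the committed topic/partition/offset list. The wiring below is an assumption for illustration, not code from this PR; broker address and group id are invented.

```ts
import * as Kafka from "node-rdkafka"
import * as winston from "winston"

const logger = winston.createLogger()

// Assumed wiring (not from this PR): logCommit builds the callback that
// node-rdkafka invokes after each offset commit.
const logCommit =
  (logger: winston.Logger) =>
  (err: any, topicPartitions: Kafka.TopicPartition[]) => {
    if (err) {
      logger.error("Error in commit", err)
    } else {
      logger.info("Commit done", { topicPartitions })
    }
  }

const consumer = new Kafka.KafkaConsumer(
  {
    "group.id": "example-group",
    "metadata.broker.list": "localhost:9092",
    offset_commit_cb: logCommit(logger),
  },
  {},
)
```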
backend/bin/kafkaConsumer/common/userCourseProgress/interfaces.ts
```diff
@@ -3,7 +3,7 @@ export interface Message {
   user_id: number
   course_id: string
   service_id: string
-  progress: [PointsByGroup]
+  progress: PointsByGroup[]
   message_format_version: Number
 }
 
```
12 changes: 7 additions & 5 deletions backend/bin/kafkaConsumer/common/userCourseProgress/validate.ts
```diff
@@ -1,6 +1,8 @@
 import { Message as KafkaMessage } from "node-rdkafka"
 import * as yup from "yup"
 
+import { Message } from "./interfaces"
+
 const CURRENT_MESSAGE_FORMAT_VERSION = 1
 
 const PointsByGroupYupSchema = yup.object().shape({
@@ -32,14 +34,14 @@ export const MessageYupSchema = yup.object().shape({
     .required(),
 })
 
-const handleNullProgressImpl = (value: any) => ({
+const handleNullProgressImpl = (value: Message) => ({
   ...value,
-  progress: value?.progress?.map((progress: any) => ({
-    ...progress,
+  progress: value?.progress?.map((pointsByGroup) => ({
+    ...pointsByGroup,
     progress:
-      progress.progress === null || isNaN(progress.progress)
+      pointsByGroup.progress === null || isNaN(pointsByGroup.progress)
         ? 0
-        : progress.progress,
+        : pointsByGroup.progress,
   })),
 })
```
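To make the normalization concrete: besides replacing `any` with the typed `Message`, the helper coerces any `progress` entry whose own `progress` value is `null` or `NaN` to 0 and passes everything else through unchanged. An invented example follows; the exact `PointsByGroup` fields aren't shown in this diff.

```ts
// Invented message fragment, for illustration only.
const value = {
  user_id: 1,
  course_id: "course-1",
  service_id: "service-1",
  message_format_version: 1,
  progress: [
    { group: "week1", progress: 0.5 },
    { group: "week2", progress: null },
  ],
}

// handleNullProgressImpl(value).progress
// => [{ group: "week1", progress: 0.5 }, { group: "week2", progress: 0 }]
```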
69 changes: 37 additions & 32 deletions backend/bin/kafkaConsumer/common/userFunctions.ts
```diff
@@ -21,15 +21,20 @@ import {
   ServiceProgressType,
 } from "./userCourseProgress/interfaces"
 
+interface WithKafkaContext {
+  context: KafkaContext
+}
+
+interface GetCombinedUserCourseProgressArgs extends WithKafkaContext {
+  user: User
+  course: Course
+}
+
 export const getCombinedUserCourseProgress = async ({
   user,
   course,
   context: { prisma },
-}: {
-  user: User
-  course: Course
-  context: KafkaContext
-}): Promise<CombinedUserCourseProgress> => {
+}: GetCombinedUserCourseProgressArgs): Promise<CombinedUserCourseProgress> => {
   const userCourseServiceProgresses = await prisma.user
     .findUnique({ where: { id: user.id } })
     .user_course_service_progresses({
@@ -61,15 +66,16 @@
   return combined
 }
 
+interface CheckRequiredExerciseCompletionsArgs extends WithKafkaContext {
+  user: User
+  course: Course
+}
+
 export const checkRequiredExerciseCompletions = async ({
   user,
   course,
   context: { knex },
-}: {
-  user: User
-  course: Course
-  context: KafkaContext
-}): Promise<boolean> => {
+}: CheckRequiredExerciseCompletionsArgs): Promise<boolean> => {
   if (course.exercise_completions_needed) {
     const exercise_completions = await knex("exercise_completion")
       .countDistinct("exercise_completion.exercise_id")
@@ -85,15 +91,16 @@
   return true
 }
 
+interface GetExerciseCompletionsForCoursesArgs extends WithKafkaContext {
+  user: User
+  courseIds: string[]
+}
+
 export const getExerciseCompletionsForCourses = async ({
   user,
   courseIds,
   context: { knex },
-}: {
-  user: User
-  courseIds: string[]
-  context: KafkaContext
-}) => {
+}: GetExerciseCompletionsForCoursesArgs) => {
   // picks only one exercise completion per exercise/user:
   // the one with the latest timestamp and latest updated_at
   const exercise_completions: ExerciseCompletionPart[] = await knex(
@@ -118,15 +125,16 @@
   return exercise_completions // ?.rows ?? []
 }
 
+interface PruneDuplicateExerciseCompletionsArgs extends WithKafkaContext {
+  user_id: string
+  course_id: string
+}
+
 export const pruneDuplicateExerciseCompletions = async ({
   user_id,
   course_id,
   context: { knex },
-}: {
-  user_id: string
-  course_id: string
-  context: KafkaContext
-}) => {
+}: PruneDuplicateExerciseCompletionsArgs) => {
   // variation: only prune those with the latest timestamp but older updated_at
   /*const deleted: Array<Pick<ExerciseCompletion, "id">> = await knex(
     "exercise_completion",
@@ -198,9 +206,7 @@
 
 export const pruneOrphanedExerciseCompletionRequiredActions = async ({
   context: { knex },
-}: {
-  context: KafkaContext
-}) => {
+}: WithKafkaContext) => {
   const deleted: Array<Pick<ExerciseCompletionRequiredAction, "id">> =
     await knex("exercise_completion_required_actions")
       .whereNull("exercise_completion_id")
@@ -210,15 +216,16 @@
   return deleted
 }
 
+interface GetUserCourseSettingsArgs extends WithKafkaContext {
+  user_id: string
+  course_id: string
+}
+
 export const getUserCourseSettings = async ({
   user_id,
   course_id,
   context: { prisma },
-}: {
-  user_id: string
-  course_id: string
-  context: KafkaContext
-}): Promise<UserCourseSetting | null> => {
+}: GetUserCourseSettingsArgs): Promise<UserCourseSetting | null> => {
   // - if the course inherits user course settings from some course, get settings from that one
   // - if not, get from the course itself or null if none exists
   const result = await prisma.course.findUnique({
@@ -256,12 +263,11 @@
   )
 }
 
-interface CheckCompletionArgs {
+interface CheckCompletionArgs extends WithKafkaContext {
   user: User
   course: Course
   handler?: Course | null
   combinedProgress?: CombinedUserCourseProgress
-  context: KafkaContext
 }
 
 export const checkCompletion = async ({
@@ -301,12 +307,11 @@
   }
 }
 
-interface CreateCompletionArgs {
+interface CreateCompletionArgs extends WithKafkaContext {
   user: User
   course: Course
   handler?: Course | null
   tier?: number
-  context: KafkaContext
 }
 
 export const createCompletion = async ({
```
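The refactor above follows one pattern throughout: the `context: KafkaContext` field that every consumer helper needs is hoisted into a shared `WithKafkaContext` base, and each helper's argument interface extends it instead of re-declaring the field inline. A minimal, self-contained sketch of the idea, with invented names:

```ts
// Invented names; sketch of the base-interface pattern used above.
interface Context {
  log: (msg: string) => void
}

interface WithContext {
  context: Context
}

interface GreetArgs extends WithContext {
  name: string
}

// Callers pass one object; the function destructures what it needs,
// including dependencies from context, right in the signature.
const greet = ({ name, context: { log } }: GreetArgs) => {
  log(`hello, ${name}`)
}

greet({ name: "world", context: { log: console.log } })
```

Compared with repeating an inline `{ user; course; context: KafkaContext }` literal per function, the named interfaces keep the dependency field in one place and make the function signatures self-describing.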
2 changes: 1 addition & 1 deletion backend/bin/kafkaConsumer/exerciseConsumer/interfaces.ts
```diff
@@ -2,7 +2,7 @@ export interface Message {
   timestamp: string
   course_id: string
   service_id: string
-  data: ExerciseData[] //[ExerciseData]
+  data: ExerciseData[]
   message_format_version: number
 }
 
```