Added e2e tests to ensure kafka, postgres, mysql samples run with the latest dozer #116

Draft · wants to merge 5 commits into base: main
2 changes: 1 addition & 1 deletion connectors/kafka/README.md
@@ -54,7 +54,7 @@ The producer will start generating sample data and publish it to Redpanda.
 2. Run the following command to start Dozer:

 ```bash
-dozer -c dozer-config.yaml
+dozer run -c dozer-config.yaml
 ```

 Dozer will ingest the data from Redpanda and perform the specified operations based on the configuration.
1 change: 0 additions & 1 deletion connectors/kafka/docker-compose.yml
@@ -1,5 +1,4 @@
-version: "3.7"
 name: redpanda-quickstart
 networks:
   redpanda_network:
     driver: bridge
2 changes: 1 addition & 1 deletion connectors/kafka/producer.py
@@ -13,7 +13,7 @@
 producer = KafkaProducer(bootstrap_servers=kafka_bootstrap_servers)

 # Generate mock transaction data and send it to the Kafka topic
-for index in range(10000000):
+for index in range(10):
     # Generate mock transaction data using the Faker library
     transaction_data = {
         'id': index,
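The shape of each record the producer emits can be sketched with the standard library alone. This is an illustrative stand-in, not the repository's actual generator: `random` replaces Faker, and the sample `location`/`provider` values are invented; only the field names and types follow the transaction value schema registered in `initKafka.sh`.

```python
import random

# Hypothetical stdlib stand-in for the Faker-based generator in producer.py;
# field names and types mirror the transactions-value Avro schema.
def make_transaction(index: int) -> dict:
    return {
        'id': index,
        'customer_id': random.randint(1, 100),
        'amount': round(random.uniform(1.0, 1000.0), 2),
        'location': random.choice(['Berlin', 'London', 'Mumbai', 'Austin']),
        'provider': random.choice(['Visa', 'Mastercard', 'Amex']),
    }
```

Capping the loop at `range(10)` keeps the e2e run fast while still exercising every field of the schema.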
4 changes: 2 additions & 2 deletions connectors/postgres/README.md
@@ -16,13 +16,13 @@ docker-compose up

 Running Dozer
 ```
-dozer
+dozer run
 ```

 That's all to it. You have APIs instantly available over REST and gRPC.

 ```
-dozer
+dozer run

 ____ ___ __________ ____
 | _ \ / _ \__ / ____| _ \
2 changes: 1 addition & 1 deletion connectors/postgres/dozer-config.yaml
@@ -7,7 +7,7 @@ connections:
     password: postgres
     host: localhost
     port: 5433
-    database: film
+    database: pagila

 sources:
   - name: actors
4 changes: 2 additions & 2 deletions package.json
@@ -20,7 +20,7 @@
   },
   "devDependencies": {
     "@types/google-protobuf": "^3.15.7",
-    "@types/mocha": "^10.0.1",
+    "@types/mocha": "^10.0.2",
     "@types/node": "^20.6.5",
     "@typescript-eslint/eslint-plugin": "^6.7.2",
     "@typescript-eslint/parser": "^6.7.2",
@@ -50,4 +50,4 @@
   ],
   "recursive": "test/**/*.js"
 }
-}
+}
8 changes: 4 additions & 4 deletions pnpm-lock.yaml
Some generated files are not rendered by default.
20 changes: 20 additions & 0 deletions test/connectors/initKafka.sh
@@ -0,0 +1,20 @@
+#!/bin/sh
+BASEDIR=$(dirname "$0")
+cd "${BASEDIR}/../../connectors/kafka"
+
+# Start Redpanda
+docker-compose up -d
+
+# Register the value schema
+curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"transaction\",\"namespace\":\"dozer.samples\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"}, {\"name\":\"customer_id\",\"type\":\"int\"},{\"name\":\"amount\",\"type\":\"float\"},{\"name\":\"location\",\"type\":\"string\"},{\"name\":\"provider\",\"type\":\"string\"}]}"}' http://localhost:18081/subjects/transactions-value/versions
+
+# Register the key schema
+curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"transactions\",\"namespace\":\"dozer.samples\",\"fields\":[{\"name\":\"id\",\"type\":\"int\"}]}"}' http://localhost:18081/subjects/transactions-key/versions
+
+# Run the producer script
+pip install kafka-python
+pip install Faker
+python producer.py
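The two `curl` calls above wrap an Avro schema as an escaped JSON string inside a `{"schema": "..."}` envelope, which is easy to get wrong by hand. A small hedged helper (hypothetical, not part of this PR) showing how such a payload can be built programmatically:

```python
import json

def schema_payload(name: str, fields: list) -> str:
    """Build the request body the schema registry expects: a JSON object
    whose 'schema' field holds the Avro record schema as a string."""
    schema = {
        'type': 'record',
        'name': name,
        'namespace': 'dozer.samples',
        'fields': [{'name': n, 'type': t} for n, t in fields],
    }
    # Double json.dumps: the inner call serialises the Avro schema,
    # the outer call produces the registry envelope with escaping applied.
    return json.dumps({'schema': json.dumps(schema)})
```

For example, `schema_payload('transactions', [('id', 'int')])` reproduces the body of the key-schema request above.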
5 changes: 5 additions & 0 deletions test/connectors/initMySQL.sh
@@ -0,0 +1,5 @@
+#!/bin/sh
+BASEDIR=$(dirname "$0")
+cd "${BASEDIR}/../.."
+
+docker-compose -f ./connectors/mysql/docker-compose.yml up -d
7 changes: 7 additions & 0 deletions test/connectors/initPostgres.sh
@@ -0,0 +1,7 @@
+#!/bin/sh
+BASEDIR=$(dirname "$0")
+cd "${BASEDIR}/../../connectors/postgres"
+
+sh download.sh
+
+docker-compose up -d
30 changes: 30 additions & 0 deletions test/connectors/mysql.ts
@@ -0,0 +1,30 @@
+import { execSync } from 'child_process';
+import path from 'path';
+import {
+  assertEndpointsWithRetry, initDozer,
+} from '../helper';
+
+const TEST_PATH = './connectors/mysql';
+
+describe('Connectors: MySQL', () => {
+  beforeEach(async () => {
+    process.chdir('../../'); // go to repo root
+    console.log(`Starting directory: ${process.cwd()}`);
+
+    // Start the MySQL container
+    execSync(`${__dirname}/initMySQL.sh`, { stdio: 'inherit' });
+
+    // Navigate to the sample directory
+    const baseDir = path.join(__dirname, '../../');
+    const fullPath = path.join(baseDir, TEST_PATH);
+    process.chdir(fullPath);
+    execSync('rm -rf .dozer && rm -f dozer.lock', { stdio: 'inherit' });
+  });
+
+  it('should run and return API endpoints', async () => {
+    const dozer = await initDozer();
+    await assertEndpointsWithRetry();
+    dozer.kill(9);
+    console.log('Killed dozer mysql');
+  });
+});
26 changes: 26 additions & 0 deletions test/connectors/postgres.ts
@@ -0,0 +1,26 @@
+import { execSync } from 'child_process';
+import {
+  assertEndpointsWithRetry, initDozer,
+} from '../helper';
+
+const TEST_PATH = './connectors/postgres';
+
+describe('Connectors: Postgres', () => {
+  beforeEach(async () => {
+    process.chdir('../../'); // go to repo root
+    console.log(`Starting directory: ${process.cwd()}`);
+
+    // Download the sample data and start the Postgres container
+    execSync(`${__dirname}/initPostgres.sh`, { stdio: 'inherit' });
+
+    process.chdir(TEST_PATH);
+    execSync('rm -rf .dozer && rm -f dozer.lock', { stdio: 'inherit' });
+  });
+
+  it('should run and return API endpoints', async () => {
+    const dozer = await initDozer();
+    await assertEndpointsWithRetry();
+    dozer.kill(9);
+    console.log('Killed dozer postgres');
+  });
+});
27 changes: 27 additions & 0 deletions test/connectors/runkafka.ts
@@ -0,0 +1,27 @@
+import { execSync } from 'child_process';
+import {
+  assertEndpointsWithRetry, initDozer,
+} from '../helper';
+
+const TEST_PATH = './connectors/kafka';
+
+describe('Connectors: Kafka', () => {
+  beforeEach(async () => {
+    process.chdir('../../'); // go to repo root
+    console.log(`Starting directory: ${process.cwd()}`);
+
+    // Start Redpanda, register the schemas and run the producer
+    execSync(`${__dirname}/initKafka.sh`, { stdio: 'inherit' });
+
+    process.chdir(TEST_PATH);
+    execSync('rm -rf .dozer && rm -f dozer.lock', { stdio: 'inherit' });
+  });
+
+  it('should run and return API endpoints', async () => {
+    const dozer = await initDozer();
+    console.log('Dozer started');
+    await assertEndpointsWithRetry();
+    dozer.kill(9);
+    console.log('Killed dozer Kafka');
+  });
+});
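All three tests import `initDozer` and `assertEndpointsWithRetry` from `../helper`, which is not part of this diff. The retry behaviour such a helper presumably needs, since Dozer takes a moment to expose its endpoints after start, can be sketched as a generic polling loop (a hypothetical illustration, not the actual helper; the name `wait_until` and its defaults are invented):

```python
import time

def wait_until(probe, retries: int = 20, delay: float = 1.0) -> bool:
    """Call `probe` until it returns True; give up after `retries` attempts.
    An assertEndpointsWithRetry-style helper would pass a probe that hits
    the REST endpoint and then assert on the final response."""
    for _ in range(retries):
        if probe():
            return True
        time.sleep(delay)
    return False
```

Polling with a bounded retry budget keeps a broken sample from hanging CI: the test fails with a clear assertion once the budget is exhausted instead of waiting forever.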