Merge pull request #22 from DanRoscigno/allin1-tests
Test using the allin1 container
DanRoscigno authored Feb 11, 2024
2 parents c29962a + b398197 commit c493035
Showing 10 changed files with 155 additions and 14 deletions.
66 changes: 66 additions & 0 deletions .github/workflows/test_with_allin1.yml
@@ -0,0 +1,66 @@
name: Test with allin1

on:
  schedule:
    - cron: "5 9 * * 1"
  push:
    branches: [ main ]
    paths:
      - 'ci/**/quickstart/basic/*'
      - '.github/workflows/test_with_allin1.yml'
      - 'allin1-docker-compose.yml'
      - 'quickstart_basic_test.go'
      - 'helper.go'
      - 'ginkgo.Dockerfile'
  pull_request:
    branches: [ main ]
    paths:
      - 'ci/**/quickstart/basic/*'
      - '.github/workflows/test_with_allin1.yml'
      - 'allin1-docker-compose.yml'
      - 'quickstart_basic_test.go'
      - 'helper.go'
      - 'ginkgo.Dockerfile'

jobs:
  build:

    name: Build and test
    runs-on: ubuntu-latest

    steps:
      # Checkout the repo as this CI needs:
      # - the compose file for StarRocks and Ginkgo/Gomega
      - uses: actions/checkout@v3

      - name: Set up Golang
        uses: actions/setup-go@v5
        with:
          go-version-file: 'ci/go.mod'

      - name: Install ginkgo
        run: |
          version=$(cat go.mod | grep "ginkgo/v2" | awk '{print $2}')
          go install -v github.com/onsi/ginkgo/v2/ginkgo@$version
        working-directory: ./ci

      - name: Start StarRocks
        run: docker compose -f allin1-docker-compose.yml up --detach --wait --wait-timeout 60

      # Any tests that run against the StarRocks environment are launched
      # in steps like this one. Make sure to reset the StarRocks environment
      # after each run (remove any tables and databases created, and reset
      # any settings to their defaults).
      #
      # The ginkgo command uses `--focus-file` to run only the one test
      # file.
      - name: Test; Basic Quick Start
        if: always()
        env:
          AWS_S3_ACCESS_KEY: ${{ secrets.AWS_S3_ACCESS_KEY }}
          AWS_S3_SECRET_KEY: ${{ secrets.AWS_S3_SECRET_KEY }}
        run: ginkgo -v --focus-file=./quickstart_basic_test.go
        working-directory: ./ci

      # Add more tests here if there are other things
      # that should run against allin1
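
The workflow above can be approximated on a local checkout. The following is a hedged sketch, not part of this commit: it assumes Docker, a Go toolchain, and the ginkgo CLI are already installed on the host, and that placeholder values are exported for the two AWS secrets the test step passes through.

```bash
# Local approximation of the workflow: start StarRocks, then run the one spec.
# Assumes the ginkgo CLI is already on PATH (the workflow installs it from
# the version pinned in ci/go.mod).
docker compose -f allin1-docker-compose.yml up --detach --wait --wait-timeout 60

# The test reads these from the environment, as the workflow's env block does.
export AWS_S3_ACCESS_KEY=redacted
export AWS_S3_SECRET_KEY=redacted

# Mirror `working-directory: ./ci` and the `--focus-file` selection.
cd ci
ginkgo -v --focus-file=./quickstart_basic_test.go
```
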
40 changes: 40 additions & 0 deletions allin1-docker-compose.yml
@@ -0,0 +1,40 @@
version: "3.9"
name: tests_using_allin1_container

# Goal: a compose equivalent of:
# "docker run -p 9030:9030 -p 8030:8030 -p 8040:8040 -itd --name quickstart starrocks/allin1-ubuntu"

networks:
  allin1:

services:
  allin1:
    image: starrocks/allin1-ubuntu:3.2-latest
    hostname: fe
    container_name: quickstart
    ports:
      - 9030:9030
      - 8030:8030
      - 8040:8040
    user: root

    healthcheck:
      test: 'mysql -u root -h fe -P 9030 -e "show backends\G" |grep "Alive: true"'
      interval: 10s
      timeout: 5s
      retries: 6
    networks:
      - allin1

# This section is commented out because the ports of the allin1 container
# need to be published to the host and the tests need to run from the host.
# Leaving it in the file as a reference, since it will be needed for other
# situations.
# test-harness:
#   extends:
#     file: ./test-harness-docker-compose.yml
#     service: test-harness
#   command: ash
#   tty: true
#   networks:
#     - allin1
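
The healthcheck above runs inside the container, where a mysql client is available and `fe` resolves. A rough host-side equivalent is sketched below as an assumption (it is not in this commit and requires a mysql client on the host); it probes the published port 9030 instead of the container hostname.

```bash
# Confirm the compose service reports healthy once the healthcheck passes.
docker compose -f allin1-docker-compose.yml ps

# Roughly the same probe the healthcheck runs, issued from the host against
# the published SQL port (requires a mysql client installed on the host).
mysql -u root -h 127.0.0.1 -P 9030 -e "show backends\G" | grep "Alive: true"
```
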
2 changes: 1 addition & 1 deletion ci/SHELL/quickstart/basic/NYPD_download
@@ -1,4 +1,4 @@
#!/bin/ash
#!/bin/bash
cd /tmp/
curl --silent --no-buffer \
-o /tmp/NYPD_Crash_Data.csv \
4 changes: 2 additions & 2 deletions ci/SHELL/quickstart/basic/NYPD_stream_load
@@ -1,4 +1,4 @@
#!/bin/ash
#!/bin/bash
cd /tmp/
curl --silent --no-buffer \
--location-trusted -u root:"" \
@@ -9,4 +9,4 @@ curl --silent --no-buffer \
-H "enclose:\"" \
-H "max_filter_ratio:1" \
-H "columns:tmp_CRASH_DATE, tmp_CRASH_TIME, CRASH_DATE=str_to_date(concat_ws(' ', tmp_CRASH_DATE, tmp_CRASH_TIME), '%m/%d/%Y %H:%i'),BOROUGH,ZIP_CODE,LATITUDE,LONGITUDE,LOCATION,ON_STREET_NAME,CROSS_STREET_NAME,OFF_STREET_NAME,NUMBER_OF_PERSONS_INJURED,NUMBER_OF_PERSONS_KILLED,NUMBER_OF_PEDESTRIANS_INJURED,NUMBER_OF_PEDESTRIANS_KILLED,NUMBER_OF_CYCLIST_INJURED,NUMBER_OF_CYCLIST_KILLED,NUMBER_OF_MOTORIST_INJURED,NUMBER_OF_MOTORIST_KILLED,CONTRIBUTING_FACTOR_VEHICLE_1,CONTRIBUTING_FACTOR_VEHICLE_2,CONTRIBUTING_FACTOR_VEHICLE_3,CONTRIBUTING_FACTOR_VEHICLE_4,CONTRIBUTING_FACTOR_VEHICLE_5,COLLISION_ID,VEHICLE_TYPE_CODE_1,VEHICLE_TYPE_CODE_2,VEHICLE_TYPE_CODE_3,VEHICLE_TYPE_CODE_4,VEHICLE_TYPE_CODE_5" \
-XPUT http://fe:8030/api/quickstart/crashdata/_stream_load
-XPUT http://localhost:8030/api/quickstart/crashdata/_stream_load
2 changes: 1 addition & 1 deletion ci/SHELL/quickstart/basic/Weather_download
@@ -1,4 +1,4 @@
#!/bin/ash
#!/bin/bash
cd /tmp/
curl --silent --no-buffer \
-o 72505394728.csv \
4 changes: 2 additions & 2 deletions ci/SHELL/quickstart/basic/Weather_stream_load
@@ -1,4 +1,4 @@
#!/bin/ash
#!/bin/bash
cd /tmp/
curl --silent --no-buffer \
--location-trusted -u root:"" \
@@ -9,5 +9,5 @@ curl --silent --no-buffer \
-H "enclose:\"" \
-H "max_filter_ratio:1" \
-H "columns: STATION, DATE, LATITUDE, LONGITUDE, ELEVATION, NAME, REPORT_TYPE, SOURCE, HourlyAltimeterSetting, HourlyDewPointTemperature, HourlyDryBulbTemperature, HourlyPrecipitation, HourlyPresentWeatherType, HourlyPressureChange, HourlyPressureTendency, HourlyRelativeHumidity, HourlySkyConditions, HourlySeaLevelPressure, HourlyStationPressure, HourlyVisibility, HourlyWetBulbTemperature, HourlyWindDirection, HourlyWindGustSpeed, HourlyWindSpeed, Sunrise, Sunset, DailyAverageDewPointTemperature, DailyAverageDryBulbTemperature, DailyAverageRelativeHumidity, DailyAverageSeaLevelPressure, DailyAverageStationPressure, DailyAverageWetBulbTemperature, DailyAverageWindSpeed, DailyCoolingDegreeDays, DailyDepartureFromNormalAverageTemperature, DailyHeatingDegreeDays, DailyMaximumDryBulbTemperature, DailyMinimumDryBulbTemperature, DailyPeakWindDirection, DailyPeakWindSpeed, DailyPrecipitation, DailySnowDepth, DailySnowfall, DailySustainedWindDirection, DailySustainedWindSpeed, DailyWeather, MonthlyAverageRH, MonthlyDaysWithGT001Precip, MonthlyDaysWithGT010Precip, MonthlyDaysWithGT32Temp, MonthlyDaysWithGT90Temp, MonthlyDaysWithLT0Temp, MonthlyDaysWithLT32Temp, MonthlyDepartureFromNormalAverageTemperature, MonthlyDepartureFromNormalCoolingDegreeDays, MonthlyDepartureFromNormalHeatingDegreeDays, MonthlyDepartureFromNormalMaximumTemperature, MonthlyDepartureFromNormalMinimumTemperature, MonthlyDepartureFromNormalPrecipitation, MonthlyDewpointTemperature, MonthlyGreatestPrecip, MonthlyGreatestPrecipDate, MonthlyGreatestSnowDepth, MonthlyGreatestSnowDepthDate, MonthlyGreatestSnowfall, MonthlyGreatestSnowfallDate, MonthlyMaxSeaLevelPressureValue, MonthlyMaxSeaLevelPressureValueDate, MonthlyMaxSeaLevelPressureValueTime, MonthlyMaximumTemperature, MonthlyMeanTemperature, MonthlyMinSeaLevelPressureValue, MonthlyMinSeaLevelPressureValueDate, MonthlyMinSeaLevelPressureValueTime, MonthlyMinimumTemperature, MonthlySeaLevelPressure, MonthlyStationPressure, MonthlyTotalLiquidPrecipitation, MonthlyTotalSnowfall, MonthlyWetBulb, AWND, CDSD, CLDD, DSNW, HDSD, HTDD, NormalsCoolingDegreeDay, NormalsHeatingDegreeDay, ShortDurationEndDate005, ShortDurationEndDate010, ShortDurationEndDate015, ShortDurationEndDate020, ShortDurationEndDate030, ShortDurationEndDate045, ShortDurationEndDate060, ShortDurationEndDate080, ShortDurationEndDate100, ShortDurationEndDate120, ShortDurationEndDate150, ShortDurationEndDate180, ShortDurationPrecipitationValue005, ShortDurationPrecipitationValue010, ShortDurationPrecipitationValue015, ShortDurationPrecipitationValue020, ShortDurationPrecipitationValue030, ShortDurationPrecipitationValue045, ShortDurationPrecipitationValue060, ShortDurationPrecipitationValue080, ShortDurationPrecipitationValue100, ShortDurationPrecipitationValue120, ShortDurationPrecipitationValue150, ShortDurationPrecipitationValue180, REM, BackupDirection, BackupDistance, BackupDistanceUnit, BackupElements, BackupElevation, BackupEquipment, BackupLatitude, BackupLongitude, BackupName, WindEquipmentChangeDate" \
-XPUT http://fe:8030/api/quickstart/weatherdata/_stream_load
-XPUT http://localhost:8030/api/quickstart/weatherdata/_stream_load

1 change: 1 addition & 0 deletions ci/ginkgo.Dockerfile
@@ -14,3 +14,4 @@ RUN go install github.com/onsi/ginkgo/v2/[email protected]

RUN go mod download

CMD ["ash"]
2 changes: 1 addition & 1 deletion ci/helper.go
@@ -16,7 +16,7 @@ func GetDSNConnection() (*sql.DB, error) {
User: "root",
Passwd: "",
Net: "tcp",
Addr: "fe:9030",
Addr: "localhost:9030",
AllowNativePasswords: true,
}
return sql.Open("mysql", cfg.FormatDSN())
41 changes: 41 additions & 0 deletions partiallyworking.md
@@ -0,0 +1,41 @@
## To Do

1. Figure out why the curl command fails (see the failing command at the end of this file). It fails because the Stream Load request is forwarded to a BE address that cannot be reached from the test container: `Failed to connect to 127.0.0.1 port 8040 after 0 ms: Couldn't connect to server` (I removed `--silent` to see the error). Running the Ginkgo commands from the host works, but that required editing all of the shell scripts to use bash and localhost instead of ash and fe. These changes should also work from the GitHub workflow. A sketch of inspecting the forwarded address follows this list.

   So: run Ginkgo from a container when testing against separate FE and BE, and from the workflow when testing against allin1.
1. Write the workflow file.
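
The following is a hypothetical diagnostic, not something from this commit. It assumes the failure is the usual Stream Load flow in which the FE HTTP port answers with a redirect to a BE address, which curl then follows because of `--location-trusted`. Printing the response headers without following the redirect should show the address being handed back; from inside a test container that address (`127.0.0.1:8040`) is unreachable, while from the host it resolves to the published BE port.

```bash
# Hypothetical check: send the Stream Load without following the redirect and
# look at the Location header the FE returns. Run from the host against the
# published FE HTTP port; inside the test container the URL would use fe:8030.
curl --include --silent \
  -u root:"" \
  -T /tmp/NYPD_Crash_Data.csv \
  http://localhost:8030/api/quickstart/crashdata/_stream_load \
  | grep -i "^location"
```
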

### Remember to export

```bash
export AWS_S3_SECRET_KEY=redacted
export AWS_S3_ACCESS_KEY=redacted
```

### Works

```bash
docker compose -f allin1-docker-compose.yml build
```

### Works

This starts the allin1 container and the Ginkgo container. The Ginkgo container just runs ash (a shell) in the background, so `docker compose exec` can be used to run ginkgo commands.

```bash
docker compose -f allin1-docker-compose.yml up --detach --wait --wait-timeout 60
```

### This one works partially

The connection to the database succeeds and the DDL commands work, but the curl command that populates the crash data fails.

```bash
docker compose -f allin1-docker-compose.yml exec test-harness ginkgo -v --focus-file=./quickstart_basic_test.go
```

### This command works

```bash
docker compose -f allin1-docker-compose.yml exec test-harness ginkgo -v --focus-file=./docs_test.go
```
7 changes: 0 additions & 7 deletions test-harness-docker-compose.yml
@@ -12,10 +12,3 @@ services:
build:
context: ci
dockerfile: ginkgo.Dockerfile
networks:
mynet: {}

networks:
mynet:
external: true
name: ${network}
