feat: Add support for prepending conda channels via the CLI. #67

Merged
bdice merged 2 commits into rapidsai:main on Feb 22, 2024

Conversation

@bdice (Contributor) commented on Feb 22, 2024

This adds an option --prepend-channels "my_channel;my_other_channel" to the rapids-dependency-file-generator CLI. This will allow us to use rapids-dependency-file-generator with local channels containing PR artifacts fetched in CI workflows. If we combine this feature with some small changes in dependencies.yaml, we will be able to generate the entire CI environment in one pass rather than generating an environment with test dependencies and then installing the packages (e.g. libcuml and libcuml-tests) in a separate step.

This option is modeled after conda config --prepend channels new_channel.
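
For illustration, a minimal sketch of the new flag (channel names and matrix values here are placeholders; the other flags match the CI example later in this thread):

```shell
# Minimal sketch: prepend two local channels to the generated conda environment.
# "my_channel"/"my_other_channel" and the matrix values are placeholders.
rapids-dependency-file-generator \
  --output conda \
  --file_key test_python \
  --matrix "cuda=12.2;arch=x86_64;py=3.10" \
  --prepend-channels "my_channel;my_other_channel" | tee env.yaml
# The prepended channels should appear ahead of the channels already listed in
# dependencies.yaml in the generated file's `channels:` section.
```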

See original proposal in rapidsai/cuml#5781 (comment) (this PR is slightly improved from that proposal).

This is a partial solution for rapidsai/build-planning#22.

@bdice (Contributor, Author) commented on Feb 22, 2024

To solve rapidsai/build-planning#22, I'm thinking we'll do this kind of thing in each RAPIDS repo CI test script:

rapids-logger "Downloading artifacts from previous jobs"
CPP_CHANNEL=$(rapids-download-conda-from-s3 cpp)
PYTHON_CHANNEL=$(rapids-download-conda-from-s3 python)

rapids-dependency-file-generator \
  --output conda \
  --file_key test_python \
  --matrix "cuda=${RAPIDS_CUDA_VERSION%.*};arch=$(arch);py=${RAPIDS_PY_VERSION}" \
  --prepend-channels "${CPP_CHANNEL};${PYTHON_CHANNEL}" | tee env.yaml

rapids-mamba-retry env create --force -f env.yaml -n test

We would modify the test_cpp file key to include a dependency list containing libcuml and libcuml-tests. We would also modify the test_python file key to include a dependency list containing libcuml and cuml. Then this environment would be equivalent to the conda-merge output I was using in rapidsai/cuml#5781.
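
For reference, a hedged sketch of what the generated environment file might contain after those dependencies.yaml changes (channel paths, extra channels, and the package list are illustrative, not taken from the actual repos):

```shell
# Inspect the generated environment file; the expected shape is shown as comments.
cat env.yaml
# channels:
#   - /tmp/tmp.cpp_channel      # CPP_CHANNEL (prepended local channel)
#   - /tmp/tmp.python_channel   # PYTHON_CHANNEL (prepended local channel)
#   - rapidsai                  # channels from dependencies.yaml (illustrative)
#   - conda-forge
# dependencies:
#   - cuml                      # packages added to the test_python file key
#   - libcuml
#   - pytest                    # plus the existing test dependencies
```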

@ajschmidt8 (Member) left a comment


LGTM.

@bdice merged commit 720f2cd into rapidsai:main on Feb 22, 2024
3 checks passed
GPUtester pushed a commit that referenced this pull request Feb 22, 2024
# [1.9.0](v1.8.0...v1.9.0) (2024-02-22)

### Features

* Add support for prepending conda channels via the CLI. ([#67](#67)) ([720f2cd](720f2cd))
@GPUtester

🎉 This PR is included in version 1.9.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

rapids-bot pushed a commit to rapidsai/cuml that referenced this pull request on Feb 26, 2024
This PR creates a conda environment containing both test dependencies and cuml packages. This is a workaround for some issues seen with conda being unable to downgrade from Arrow 15 (in the initial environment with test dependencies) to Arrow 14 (currently pinned by cudf, which is a dependency of cuml).

This is a partial solution for rapidsai/build-planning#22. (More work is needed for other RAPIDS repos.)

Depends on rapidsai/dependency-file-generator#67.

Authors:
  - Bradley Dice (https://github.com/bdice)

Approvers:
  - Charles Blackmon-Luca (https://github.com/charlesbluca)
  - GALI PREM SAGAR (https://github.com/galipremsagar)
  - Vyas Ramasubramani (https://github.com/vyasr)
  - Jake Awe (https://github.com/AyodeAwe)
  - https://github.com/jakirkham
  - Dante Gama Dessavre (https://github.com/dantegd)

URL: rapidsai/cuml#5781