Changing packaging structure #5
Conversation
check for failure points without removing it all
checking for failure points
I've opened a couple of issues for the points that could be addressed in a different PR, so this one stays manageable to review. I'd appreciate the quick improvements and fixes needed to address this PR, while the bigger items can become new issues to address later.
I can confirm this in a clean environment, but I am getting a warning with a fresh code checkout:
Which brings me to these lines that I find confusing. It does seem to work, though.
Happy to test that out once the publishing pipeline runs, which happens on merge, I take it?
Sorry, I haven't touched GitHub secrets and/or conda tokens. I am unsure whether I even have permissions; we can chat elsewhere about this if we need to.
I pushed a super small tweak to the reqs detailed in the
@Ciheim I think you can set the secrets for the repository? https://github.com/orgs/NOAA-GFDL/teams/catalogbuilder-admins
Right, if the conda package is installed and the user activates the environment, you won't see this error. Since users have the option to run the catalog builder without the conda package, we expect them to add the necessary system paths so we can find the intakebuilder. We do need a new CI test that exercises the conda package upon publishing (also a GitHub issue, though unsure when it will be taken up).
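For the no-conda-package workflow mentioned above, a minimal sketch of "adding the necessary system paths" might look like this (the checkout path is hypothetical; adjust it to where the source actually lives):

```python
import os
import sys

# Sketch: when running from a source checkout without the conda package
# installed, put the repo root on sys.path so `intakebuilder` can be found.
# Here we assume the current directory is the checkout root.
repo_root = os.path.abspath(os.getcwd())
if repo_root not in sys.path:
    sys.path.insert(0, repo_root)
```

The equivalent outside Python would be something like `export PYTHONPATH=/path/to/checkout:$PYTHONPATH` before invoking the scripts.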
I've added the secret. The conda publish CI pipeline is now working as expected per the logs. Though this PR isn't approved yet, I did push the package to the noaa-gfdl channel through this CI (issues are open; we need a test channel or something too, later). So the conda install can be tested now.
So now I look back at those import lines we WANT to work, in the
This won't work because the package is no longer called that. If we adjust the import calls, we can avoid
...or I can even use a relative import:
...and changing the same import lines in applicable files within
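Since the comment above sketches two ways to fix those imports, here is a minimal, self-contained illustration of both styles. The sub-module name `builderconfig` and the `scripts` folder are hypothetical stand-ins; only the `catalogbuilder`/`intakebuilder` names come from this PR:

```python
import os
import sys
import tempfile
import textwrap

# Build a throwaway package tree mirroring the restructured layout:
#   catalogbuilder/intakebuilder/builderconfig.py
#   catalogbuilder/scripts/gen.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, "catalogbuilder")
for d in (pkg, os.path.join(pkg, "intakebuilder"), os.path.join(pkg, "scripts")):
    os.makedirs(d)
    open(os.path.join(d, "__init__.py"), "w").close()

with open(os.path.join(pkg, "intakebuilder", "builderconfig.py"), "w") as f:
    f.write("VALUE = 42\n")

# A module inside the package can use either the absolute or the relative form:
with open(os.path.join(pkg, "scripts", "gen.py"), "w") as f:
    f.write(textwrap.dedent("""\
        from catalogbuilder.intakebuilder import builderconfig   # absolute
        from ..intakebuilder import builderconfig as bc          # relative
        RESULT = builderconfig.VALUE + bc.VALUE
    """))

sys.path.insert(0, root)
from catalogbuilder.scripts import gen

print(gen.RESULT)  # both names resolve to the same module, so 42 + 42 = 84
```

Note the relative form only works for modules imported as part of the package, not for files run directly as scripts.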
Not sure.
`catalogbuilder.intakebuilder` not found in the pipeline, because the package isn't installed into the created environment. Adding `pip install .` to check.
…like one always expects? ...Wouldn't totally surprise me.
Well, I certainly see the challenge in keeping both a user's interactive Python shell happy and GitHub's CI/CD pipeline happy at the same time. ...Python packaging, sigh...
We 1) ask GitHub CI/CD to set up a Python 3.10, then we 2) ask conda to create an env with its own Python, but the package is NOT installed. I tried `pip install .`, but that didn't work, because it used the first Python, I imagine. Now I will try calling the explicit pip of the CONDA Python to install the package before calling the test.
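One way to make sure pip targets the conda env's Python rather than the runner's default is to run it through `conda run`. A sketch of such a CI step; the env name `catalogbuilder` and the step layout are assumptions, not the repo's actual workflow:

```yaml
# Hypothetical GitHub Actions step; env name is an assumption.
- name: Install package into the conda env
  run: |
    conda run -n catalogbuilder python -m pip install .
    conda run -n catalogbuilder python -c "import catalogbuilder.intakebuilder"
```

The second command fails the step early if the import still doesn't resolve inside the env.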
Sometimes the environment is built with environment.yml, which has a slightly different array of dependencies in the YAML file when compared to the conda-build target, meta.yaml.
… from environment.yaml
I reverted back to the state it was in when I introduced the tiny tweak. The pipeline passes. I will look at this in other issues/branches/PRs as you suggested; it is nontrivial for some reason.
Honestly, I think we're fighting the pipeline + its env much more than we are the packaging. Those two sets of problems LOOK very similar, though.
Yes! Thanks for this, @ilaflott
We need a sub-package structure, so that one can import just the catalogs, or the wrapper scripts, or the API itself, or get it all at once. But also note that, going forward, the catalogs will perhaps be pushed to a schema repo.
Given the landscape change, we are revisiting the way packaging was done, to see if the technical issues can be resolved and get this done, collectively.
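A sub-package layout along those lines might look like the following. The `catalogs` and `scripts` directory names are taken from the comment above; everything else is illustrative:

```
catalogbuilder/
├── __init__.py       # re-export the API here for "get it all at once"
├── intakebuilder/    # the API itself: catalogbuilder.intakebuilder
├── catalogs/         # just the catalogs: catalogbuilder.catalogs
└── scripts/          # the wrapper scripts: catalogbuilder.scripts
```

With that shape, `from catalogbuilder import catalogs` and `import catalogbuilder` both work, and the `catalogs` sub-package could later be split out to a schema repo without touching the API imports.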