Migrate selector tests to pytest #10281

Closed

Conversation

@QMalcolm QMalcolm commented Jun 7, 2024

resolves #9868

Problem

In #9868 we wanted to ensure that we write, and can write, some selector tests. After creating the ticket, we actually found some existing selector tests while reorganizing our tests. Thus #9868 became about converting those tests to use pytest.

Solution

Converted selector tests to use pytest, showing that we can write selector tests.

Checklist

  • I have read the contributing guide and understand what's expected of me
  • I have run this code in development and it appears to resolve the stated issue
  • This PR includes tests, or tests are not required/relevant for this PR
  • This PR has no interface changes (e.g. macros, cli, logs, json artifacts, config files, adapter interface, etc) or this PR has already received feedback and approval from Product or DX
  • This PR includes type annotations for new and modified functions

QMalcolm added 7 commits June 6, 2024 16:29
We have a globally available `manifest` fixture in our unit tests. In the
coming commits we're going to add tests to this file which use the globally
available `manifest` fixture. Prior to this commit, the locally defined
`manifest` fixture was taking precedence. To get around this, the easiest
solution was to rename the locally defined fixture.

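For readers unfamiliar with pytest fixture resolution, here is a minimal, self-contained sketch of the shadowing behavior described above; the fixture bodies and return values are hypothetical stand-ins, not dbt-core's actual fixtures:

```python
# conftest.py -- a fixture available to every test module in the directory
import pytest

@pytest.fixture
def manifest():
    return "global manifest"
```

```python
# test_selector.py -- before the rename, a same-named local fixture shadowed
# the conftest.py one for every test in this module
import pytest

@pytest.fixture
def local_manifest():  # previously named `manifest`, which took precedence
    return "local manifest"

def test_uses_global_fixture(manifest):
    # After the rename, `manifest` resolves to the conftest.py fixture again
    assert manifest == "global manifest"

def test_uses_local_fixture(local_manifest):
    assert local_manifest == "local manifest"
```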
I had tried to isolate the locally defined fixture by moving it, and the relevant
tests, to a test class like `TestNodeSelector`. However, because of _how_ the relevant
tests were parameterized, this proved difficult. Basically, for readability we define
a variable which holds a list of all the parameterization values. By moving to a
test class, the variables would have had to be defined directly in the
parameterization decorator call. Although possible, it made the readability slightly
worse. It might be worth doing anyway in the long run, but instead I used the less
heavy-handed alternative already described. A rough illustration of the tradeoff follows.
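A rough sketch of the readability tradeoff mentioned above; the selector specs and expected sets here are invented for the example:

```python
import pytest

# At module level, the argvalues can live in a named variable, which keeps
# the decorator call short and readable:
selector_test_cases = [
    ("tag:nightly", {"model_a"}),
    ("fqn:model_b", {"model_b"}),
]

@pytest.mark.parametrize("spec,expected", selector_test_cases)
def test_selector_module_level(spec, expected):
    assert isinstance(expected, set)

# Inside a test class, the list would have been inlined into the decorator
# call itself, which is what hurt readability:
class TestNodeSelector:
    @pytest.mark.parametrize(
        "spec,expected",
        [
            ("tag:nightly", {"model_a"}),
            ("fqn:model_b", {"model_b"}),
        ],
    )
    def test_selector_in_class(self, spec, expected):
        assert isinstance(expected, set)
```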
…in unit tests

The `Compiler.compile` method accesses `self.config.args.which`. The `config`
is the `RuntimeConfig` the `Compiler` was instantiated with. Our `runtime_config`
fixture was being instantiated with an empty dict for the `args` property. Thus
a `which` property of the args wasn't being made available, and a runtime error
would occur if `compile` was run. To solve this, we've begun instantiating the args
from the global flags via `get_flags()`. This works because we ensure the
`set_test_flags` fixture, which calls `set_from_args`, is run first.
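A minimal, self-contained sketch of the failure mode and the fix; `Compiler`, `set_from_args`, and `get_flags` here are simplified stand-ins for the dbt-core objects, so only the fixture-ordering pattern carries over:

```python
from argparse import Namespace
import pytest

_FLAGS = Namespace()

def set_from_args(args: Namespace) -> None:
    # Stand-in for dbt's set_from_args: stores the global flags
    global _FLAGS
    _FLAGS = args

def get_flags() -> Namespace:
    # Stand-in for dbt's get_flags: returns the stored global flags
    return _FLAGS

class Compiler:
    # Stand-in for dbt's Compiler; only the attribute access matters here
    def __init__(self, config: Namespace) -> None:
        self.config = config

    def compile(self) -> str:
        # This is the access that raised when args was built from an empty dict
        return self.config.args.which

@pytest.fixture
def set_test_flags() -> None:
    set_from_args(Namespace(which="run"))

@pytest.fixture
def runtime_config(set_test_flags) -> Namespace:
    # Depending on set_test_flags guarantees set_from_args has already run,
    # so get_flags() returns args that actually carry `which`
    return Namespace(args=get_flags())

def test_compile_reads_which(runtime_config):
    assert Compiler(runtime_config).compile() == "run"
```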
We had some tests in `test_selector.py::GraphTest` that didn't add
anything on top of what was already being tested elsewhere in the file,
except the parsing of models. However, the tests in `test_parser.py::ModelParserTest`
cover everything being tested here (and then some). Thus these tests
in `test_selector.py::GraphTest` are unnecessary and can be deleted.
There was a test `test__partial_parse` in `test_selector.py` which tested
the functionality of `is_partial_parsable` of the `ManifestLoader`. It
doesn't really make sense for this to live in `test_selector.py`, where we
are testing selectors. We test the `ManifestLoader` class in `test_manifest.py`,
which seemed like a more appropriate place for the test. Additionally, we
renamed the test to `test_is_partial_parsable_by_version` to more accurately
describe what is being tested.
@QMalcolm QMalcolm requested a review from a team as a code owner June 7, 2024 23:38
@cla-bot cla-bot bot added the cla:yes label Jun 7, 2024
@QMalcolm QMalcolm added the Skip Changelog Skips GHA to check for changelog file label Jun 7, 2024

codecov bot commented Jun 7, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 88.67%. Comparing base (b680c7a) to head (9237332).
Report is 4 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main   #10281      +/-   ##
==========================================
- Coverage   88.73%   88.67%   -0.06%     
==========================================
  Files         180      180              
  Lines       22474    22495      +21     
==========================================
+ Hits        19942    19948       +6     
- Misses       2532     2547      +15     
Flag          Coverage Δ
integration   85.93% <ø> (-0.16%) ⬇️
unit          61.73% <ø> (-1.52%) ⬇️

Flags with carried forward coverage won't be shown.

QMalcolm commented Jun 8, 2024

Closing this pull request as I don't want anyone to interact with a comment made by user simulified. It overtakes the page and displays a flashing image which contains an invite code to a Discord server.

@QMalcolm QMalcolm closed this Jun 8, 2024
Labels
cla:yes Skip Changelog Skips GHA to check for changelog file
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[SPIKE+] Write a unittest for a simple selection
1 participant