Quick skip validation test if models or clouds mismatch #1562

Merged
kwmonroe merged 2 commits into main from akd/pytest-skip-on-models on Aug 19, 2024

Conversation

@addyess (Member) commented on Aug 19, 2024

Overview

  • Skip tests more quickly when they carry the on_model test marker
  • Skip tests more quickly when they carry the clouds test marker

Details

Using the pytest hook pytest_runtest_setup, we can read the required model name from the test configuration and quickly skip a test when it doesn't match that model name (a sketch of the hook and its helpers follows this list).

  • This works well with models because the name doesn't have to be derived from a module-level fixture -- it comes straight from the test configuration.
  • Removed the "arch" level skipping because it wasn't testing the architecture of the units under test -- but rather the architecture of the machine running the tests.
  • By making the Tools object synchronous rather than async-loaded, it can be constructed and loaded in pytest_configure
    • this lets the tests determine the controller's "cloud" before connecting to the Juju model
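As a rough illustration of the approach, the hook and its two helpers could look like the sketch below. Only pytest_runtest_setup calling skip_by_model and skip_by_cloud matches the actual diff; the helper bodies, the --model option name, and the marker argument shapes are assumptions for illustration.

import pytest

def skip_by_model(item):
    """Hypothetical helper: skip when the test's on_model marker doesn't match the configured model."""
    marker = item.get_closest_marker("on_model")
    if marker:
        wanted = marker.args[0]
        current = item.config.getoption("--model")  # assumed option name
        if current != wanted:
            pytest.skip(f"test requires model {wanted!r}, running against {current!r}")

def skip_by_cloud(item):
    """Hypothetical helper: skip when the controller's cloud isn't one the test supports."""
    marker = item.get_closest_marker("clouds")
    if marker:
        supported = marker.args[0]
        cloud = item.config.test_tools.cloud  # Tools instance loaded in pytest_configure
        if cloud not in supported:
            pytest.skip(f"test requires one of {supported}, controller cloud is {cloud!r}")

def pytest_runtest_setup(item):
    """Called to perform the setup phase for a test item."""
    skip_by_model(item)
    skip_by_cloud(item)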

@addyess force-pushed the akd/pytest-skip-on-models branch from 1cef38a to ecc5bac on August 19, 2024 at 18:57
@addyess changed the title from "Quick skip validation test if models mismatch" to "Quick skip validation test if models or clouds mismatch" on Aug 19, 2024
Comment on lines -55 to +57
- parser.addoption(
-     "--cloud", action="store", default="aws/us-east-2", help="Juju cloud to use"
- )
+ parser.addoption("--cloud", action="store", help="Juju cloud to use")
addyess (author):

Stop defaulting the cloud type to `aws/us-east-2`. This isn't necessary.

Comment on lines -146 to +147
- def __init__(self, request):
-     self._request = request
+ def __init__(self, config):
+     self._config = config
addyess (author):

construct with the pytest config rather than a request object

self.requests = requests
self.requests_get = asyncify(requests.get)
self.run = asyncify(self.exec)
addyess (author):

the existing tools.run is now an async wrapper around the new tools.exec, which does away with the async subprocess handling...
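The asyncify helper isn't shown in this excerpt; a common shape for it, assuming it simply pushes the blocking call onto the default thread executor, would be:

import asyncio
import functools

def asyncify(fn):
    """Assumed implementation: wrap a blocking callable so callers can await it."""
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        loop = asyncio.get_running_loop()
        # run the synchronous function in the default thread pool executor
        return await loop.run_in_executor(None, functools.partial(fn, *args, **kwargs))
    return wrapper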

Comment on lines +179 to +188
@cached_property
def cloud(self):
if _it := self._config.getoption("--cloud"):
return _it
controller_data, _ = self.exec(
"juju", "show-controller", self.controller_name, "--format", "yaml"
)
controller_infos = yaml.safe_load(controller_data)
controller_info, *_ = controller_infos.values()
return controller_info["details"]["cloud"]
addyess (author):

the cloud field can be loaded from the existing required controller name argument.
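For context, `juju show-controller --format yaml` emits a mapping keyed by controller name, which is why the code takes the first value and reads details.cloud. The controller name and values below are purely illustrative:

validate-ck:
  details:
    uuid: 01234567-89ab-cdef-0123-456789abcdef
    cloud: aws
    region: us-east-2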

Comment on lines -194 to +202
- async def run(self, cmd: str, *args: str, stdin=None, _tee=False):
+ def exec(self, cmd: str, *args: str, stdin=None, _tee=False):
addyess (author):

async subprocess handling is actually really hard -- as a plain synchronous method this gets much simpler
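The new body isn't part of this excerpt; a plausible synchronous shape using subprocess.run, assuming exec returns an (stdout, stderr) pair as the cloud property's tuple unpacking suggests:

import subprocess

def exec(self, cmd: str, *args: str, stdin=None, _tee=False):
    """Sketch only: run the command synchronously and return (stdout, stderr)."""
    proc = subprocess.run(
        [cmd, *args],
        input=stdin,
        capture_output=True,
        text=True,
        check=True,
    )
    if _tee:
        # assumption: _tee echoes the captured output as it would appear on the console
        print(proc.stdout, end="")
    return proc.stdout, proc.stderr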

Comment on lines -307 to +292
- tools = Tools(request)
- await tools._load()
- return tools
+ yield request.config.test_tools
addyess (author):

the old fixture now just has to yield the existing tools from request.config. See pytest_configure below.
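Put together, the replacement fixture reduces to something like this (the fixture name and scope are assumptions):

import pytest

@pytest.fixture(scope="module")
def tools(request):
    # Tools is built once in pytest_configure; the fixture just hands back the shared instance.
    yield request.config.test_tools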

Comment on lines -433 to -445
@pytest.fixture
def system_arch():
return arch


@pytest.fixture(autouse=True)
def skip_by_arch(request, system_arch):
"""Skip tests on specified arches"""
if request.node.get_closest_marker("skip_arch"):
if system_arch in request.node.get_closest_marker("skip_arch").args[0]:
pytest.skip("skipped on this arch: {}".format(system_arch))


addyess (author):

This never would have really worked the way it was intended: it checked the architecture of the machine running the tests, not the architecture of the units under test.

Comment on lines +695 to +698
def pytest_runtest_setup(item):
"""Called to perform the setup phase for a test item."""
skip_by_model(item) # skip tests if model marking on test mismatches
skip_by_cloud(item) # skip tests if cloud marking on test mismatches
addyess (author):

handles quick skipping of the tagged test by model or cloud
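For illustration, tests would opt in with the corresponding markers; the exact argument shapes (a single model name for on_model, a list of clouds for clouds), the test names, and the model fixture are assumptions:

import pytest

@pytest.mark.on_model("validate-nvidia")
async def test_nvidia_smi(model):
    ...

@pytest.mark.clouds(["ec2", "vsphere"])
async def test_cloud_specific_behaviour(model):
    ...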

Comment on lines +701 to +703
def pytest_configure(config):
config.test_tools = Tools(config)
config.test_tools._load()
addyess (author):

Load the tools for the validation tests into the config's global space

@kwmonroe (Contributor) left a comment:

Yup, this LGTM and is a welcomed fix to better include/skip bits on non-pytest-running arches and for non-gpu capable models (for stuff like nvidia tests).

@kwmonroe kwmonroe merged commit d5b6ff1 into main Aug 19, 2024
6 checks passed
@kwmonroe kwmonroe deleted the akd/pytest-skip-on-models branch August 19, 2024 22:01