Rethink testing strategy #1019
/cc @nickmccurdy @timdeschryver @MatanBobi
I was thinking of getting some Testing Library projects running on test frameworks other than Jest. Vitest would probably be a good one to focus on, as it's fast, easy to configure, mostly compatible with Jest, and supports enabling/disabling global injection.
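For reference, toggling global injection and the DOM environment in Vitest is a config-level switch. A minimal sketch, assuming a `vitest.config.ts` at the project root — the values here are illustrative, not this project's actual configuration:

```typescript
// vitest.config.ts — hypothetical sketch, not the project's actual config
import { defineConfig } from 'vitest/config'

export default defineConfig({
  test: {
    environment: 'jsdom', // or 'happy-dom' to compare DOM implementations
    globals: false,       // require explicit imports instead of injected globals
  },
})
```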
Can you leverage Vitest to run tests in a headless browser?
@ph-fritsche great initiative, I highly agree with all of the points.
I don't know if we should run (some?) tests in happy-dom too, but I think we need to run our tests in a headless browser. Some things are hard to test in a browser and also don't need to be tested twice, like the tests on correctly wiring the APIs through […]. There might be exceptions due to limited implementation in the environment, but in general a test using […]
A little update here: I'm trying to make this work without rewriting too much of the tests. The current approach is Karma + Jasmine with Jest's […]. Providing the transpiled files is already resolved. Letting Karma manage the files and adding a preprocessor was slow and inflexible. I wrote a little tool using […]
That sounds like an interesting approach. Is your goal to have a more minimal test environment, or specifically to ensure Karma and Jasmine are supported? If the former, I think it could be easier to use Vitest, optionally with globals disabled.
The goal is to run tests in at least one major browser (preferably Chrome) and in Jsdom. Inspecting any test breakpoint in the browser console or walking through it with the debugger would be nice, but isn't strictly necessary. Neither Karma nor Jasmine is a requirement. The more code I read, the more I start thinking that by the time I'll have understood the necessary configurations and plugin hooks, I could also have written the runner myself.
I've had success with using Karma for that sort of thing in the past, but maybe a more modern alternative like Cypress or Playwright would be easier to set up now.
Karma looks great, as it is built on a plugin system. But there are no types, and the dependency injection makes it really hard to identify which interfaces are being used, and which way of changing some detail would be the "correct" one without causing undocumented side effects. If I understood their documentation correctly, Vitest runs in node with Jsdom or Happy-dom.
I'm aware, but maybe we could write an abstraction layer that could run tests in something like Vitest and something like Cypress.
Little update here: […]
@ph-fritsche Is there something still open here? :) Need a hand?
@MatanBobi Yes, your help is much appreciated. We need to complete #1091 by fixing, or at least explaining, the different results when running our tests in Chrome. If we can fix them, we can let the CI step fail on errors and make sure that other PRs don't cause regressions in compatibility with in-browser tests.
@ph-fritsche It looks like the […]
@MatanBobi (After fixing the linting errors triggered by updated deps,) here's a new report.
I am late to the party, but is there something I can do @ph-fritsche?
Hey! Would love to help with this as well. I set up a very rudimentary example where we run tests against the happy-dom environment (artursvonda#1), but would love some help on how to set this up properly. Initially we'll probably need to set it up in a way that allows these tests to fail on happy-dom, as there's work to be done on both sides of user-event/happy-dom until we get to green. I already opened a ticket on happy-dom for XPath: capricorn86/happy-dom#1125. Let me know if this should be opened as a separate issue.
Not entirely sure if this helps, but I just stumbled upon this: https://github.com/material-components/material-web/blob/main/testing/harness.ts It seems Google has invested a good bit of effort to simulate clicks and the like. I'd imagine the one for Angular Material is probably even more extensive.
Problem description
We run linting and tests on the multiple NodeJS versions that we support according to user-event/package.json (lines 13 to 14 in ea50231).
This has no merit:
Our source code has no dependency on any NodeJS API, so the result for the source itself will always be the same.
The tested code is not the distributed code, so if different NodeJS versions yield different results, that only flags problems with our testing environment; it doesn't indicate anything about the build we would distribute from that commit.
We don't test our build. Misconfigurations or bugs in our build tools result in broken builds being published and any potential fix is verified only by manually testing the build.
We only test on Jsdom. We try to be platform agnostic, but our automation doesn't help us develop software that works as intended in different environments, and the differences between Jsdom and e.g. Chrome are too numerous and significant to rely on manual testing alone.
Suggested solution
Split our tests into two categories:
a) those that rely on mocks or test internals, and
b) those that only interact with the DOM and can be observed using only the main export.
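As an illustration of category (b), here is a hedged sketch of a test that touches only the DOM and the package's main export, so the same file could run under jsdom or a real browser runner. The `userEvent.type` call follows the library's documented API; the runner imports assume Vitest and are illustrative:

```typescript
// Sketch of a category (b) test: DOM + main export only, no mocks or internals.
import userEvent from '@testing-library/user-event'
import { expect, test } from 'vitest'

test('typing reaches the input value', async () => {
  const input = document.createElement('input')
  document.body.append(input)

  await userEvent.type(input, 'hello')

  expect(input.value).toBe('hello')
})
```

Because such a test makes no assumptions about internals, the same assertions could be ported to a browser-based runner without changes.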
Additional context
No response