@gabrieldemarmiesse Thanks for fixing the previous issues so quickly!

It would be a great feature if we had a way to mark code snippets in the documentation to indicate that they should be run as tests. During `python autogen.py`, we could extract those snippets into tests that can be run with pytest. This way we could ensure our examples stay up to date.

See the discussion on keras-team/keras-tuner#136 (comment); we would probably have to roll our own solution for this.
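As a rough sketch of what that extraction step might look like (hypothetical, not an existing keras-autodoc feature; the directory names `docs/sources` and `tests/snippets` and the helper `extract_snippet_tests` are made up for illustration), something like this could turn fenced Python blocks in the generated Markdown into pytest test modules:

````python
# Hypothetical sketch: pull ```python blocks out of the generated Markdown
# and write each one as a pytest-collectable test module.
import re
from pathlib import Path

FENCE_RE = re.compile(r"```python\n(.*?)```", re.DOTALL)

def extract_snippet_tests(docs_dir="docs/sources", out_dir="tests/snippets"):
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for md_file in Path(docs_dir).rglob("*.md"):
        for i, snippet in enumerate(FENCE_RE.findall(md_file.read_text())):
            # Indent the snippet into a test function so pytest collects it;
            # any exception raised by the snippet fails the test.
            body = "\n".join("    " + line for line in snippet.splitlines())
            src = f"def test_snippet_{i}():\n{body or '    pass'}\n"
            (out / f"test_{md_file.stem}_{i}.py").write_text(src)
````

Running `pytest tests/snippets` afterwards would then fail whenever an example stops working.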
Well, if possible, I'd prefer to keep something like this outside the code-generation repository. The more independent the pieces are, the easier they are to maintain.
After reading the keras-tuner thread, I was reminded of a package I came across a while ago for testing notebooks and using them in MkDocs: https://pypi.org/project/mknotebooks/.
Even if we need a function that takes a bunch of docstrings and runs the Python snippets, there is no need for it to live in keras-autodoc. We can still use the PAGES variable from autogen.py in the respective repos.
Overall I'm not sure what form the solution will take, but it would be nice if someone could look at what already exists in the wild and propose a tool (whether we write it ourselves or not), ideally with the minimum amount of future maintenance.
@omalleyt12 could you give https://pypi.org/project/pytest_markdown/ a try after the documentation is built and report whether it works well? It extracts code snippets from the documentation and runs them.
On TensorFlow we've been running doctest, but it expects the >>> format.
For TensorFlow we went with the default >>> because that way we could easily distinguish between working and non-working examples. But I think it wouldn't be hard to make a doctest runner that picks up ``` blocks using the existing classes in the doctest module, like DocTestFinder.
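For illustration, here is a rough sketch (not TensorFlow's actual runner) of how the standard-library doctest classes could be pointed at >>>-style examples that sit inside ``` fences of a Markdown file. It uses DocTestParser and DocTestRunner rather than DocTestFinder, since the examples come from text instead of live objects; the function name `run_markdown_doctests` and the fence regex are assumptions.

````python
# Rough sketch: run ">>>"-style examples found inside ``` fences of a
# Markdown file with the standard-library doctest machinery.
import doctest
import re
from pathlib import Path

FENCE_RE = re.compile(r"```(?:python)?\n(.*?)```", re.DOTALL)

def run_markdown_doctests(md_path):
    parser = doctest.DocTestParser()
    runner = doctest.DocTestRunner(verbose=False)
    failed = attempted = 0
    for i, block in enumerate(FENCE_RE.findall(Path(md_path).read_text())):
        # Only fenced blocks that actually contain ">>>" prompts are run.
        if ">>>" not in block:
            continue
        test = parser.get_doctest(block, {}, f"{md_path}[{i}]", md_path, 0)
        result = runner.run(test)
        failed += result.failed
        attempted += result.attempted
    return failed, attempted
````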