
Project-wide Pytest integration #85

Open
2 of 11 tasks
smellycloud opened this issue Oct 2, 2024 · 0 comments
Labels
improvement Refactoring, improving existing codebase

@smellycloud (Contributor)

smellycloud commented Oct 2, 2024

Title

Include a Test Directory with Pytest test scripts for all relevant existing scripts

Description

This feature request proposes adding a dedicated test directory containing Pytest test scripts for all relevant existing scripts in the project. The goal is to improve the project's reliability and maintainability by ensuring comprehensive test coverage.

Motivation

Currently, the project lacks a structured approach to testing, which can lead to undetected bugs and reduced code quality. By integrating a test directory with Pytest scripts, we can:

  • Ensure that all existing scripts are thoroughly tested.
  • Detect and fix bugs early in the development process.
  • Improve code quality and maintainability.
  • Facilitate easier onboarding for new contributors by providing clear testing guidelines.

Proposed Solution

  1. Directory Structure: Create a tests directory under the relevant module. For example,

views_pipeline/
├── meta_tools/
│   ├── tool1.py
│   ├── tool2.py
│   ├── tool3.py
│   ├── tests/
│   │   ├── test_tool1.py
│   │   ├── test_tool2.py
│   │   ├── test_tool3.py

  2. Test Scripts: Develop Pytest test scripts for each relevant existing util/model script. Each test script should:
  • Import the necessary modules and functions from the corresponding script.
  • Include unit tests for individual functions and methods.
  • Include integration tests where applicable.
  • Use fixtures to set up any required test data or configurations.
  • Create scripts for relevant model-specific functions.
  • Focus on Data Type and Shape Checking: Ensure that tests include checks for data types and shapes, and handle exceptions appropriately. This helps in catching errors related to incorrect data formats early in the process.
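The data-type and shape checks described above could look something like the following sketch. Note that `ingest_rows` and the file name are hypothetical stand-ins, not actual project code:

```python
# tests/test_ingestion_types.py -- illustrative sketch only;
# ingest_rows is a hypothetical stand-in for a real project function.
import pytest


def ingest_rows(rows):
    """Toy ingestion function: converts a list of numbers into records."""
    if not isinstance(rows, list):
        raise TypeError("rows must be a list")
    return [{"id": i, "value": float(v)} for i, v in enumerate(rows)]


@pytest.fixture
def sample_rows():
    # Fixture providing minimal test data.
    return [1, 2, 3]


def test_output_shape_and_types(sample_rows):
    result = ingest_rows(sample_rows)
    # Shape check: one output record per input row.
    assert len(result) == len(sample_rows)
    # Type check: every "value" field must be a float.
    assert all(isinstance(r["value"], float) for r in result)


def test_rejects_wrong_container():
    # Exception handling: non-list input should fail loudly, not silently.
    with pytest.raises(TypeError):
        ingest_rows("not a list")
```

The same pattern extends naturally to array-returning functions, where the shape assertion would compare against an expected `(rows, columns)` tuple.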
  3. Example Test Script:
# tests/test_ingestion.py
import pytest
from src.ingestion import ingest_data

@pytest.fixture
def sample_data():
    return {"key": "value"}

def test_ingest_data(sample_data):
    result = ingest_data(sample_data)
    assert result is not None
    assert result["key"] == "value"
  4. CI Integration: Update the CI/CD pipeline to run Pytest on every commit and pull request. This ensures that all tests are executed automatically, and any issues are detected early.
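A minimal sketch of such a CI step, assuming GitHub Actions is used (the workflow file path, Python version, and `requirements.txt` are illustrative assumptions, not decisions made in this issue):

```yaml
# .github/workflows/tests.yml -- illustrative sketch only
name: tests
on: [push, pull_request]

jobs:
  pytest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Install project dependencies plus the test runner.
      - run: pip install -r requirements.txt pytest
      # Run the full suite with concise output; a non-zero exit fails the build.
      - run: pytest -q
```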

Alternatives Considered

  • Manual Testing: This approach is less reliable and more time-consuming compared to automated testing.
  • Other Testing Frameworks: While other frameworks like unittest were considered, Pytest was chosen for its simplicity, flexibility, and widespread adoption in the Python community.

Additional Context

  • External Resources: Refer to the Pytest documentation for more information on writing and running tests.

Actionable Tasks

  • Create the tests directory.
  • Develop Pytest scripts for each relevant existing script.
  • Update the CI/CD pipeline to include Pytest. (Future implementation)

Relevant directories

  • Tests for common_utils #86
  • meta_tools (High priority)
  • models/model_name/src/forecasting
  • models/model_name/src/offline_evaluation
  • models/model_name/src/online_evaluation
  • models/model_name/src/utils
  • models/model_name/src/management
  • models/model_name/src/training

Branch Guidelines for Adding Tests

Branch Naming Convention

  • Use the following naming convention for branches related to adding tests: add-tests-for-<module-name> or add-tests-for-<model_name>.
    • Example: add-tests-for-meta_tools or add-tests-for-purple_alien

README Generation

Ensure that instructions to run the Pytest tests are included in the relevant module README file. For example,
To run tests: pytest --pyargs common_utils

ADR and Documentation Generation

After gathering and weighing feedback, establish concrete principles for integrating continuous testing into the current workflow and the future CI/CD pipeline, and record them as ADRs.

@smellycloud smellycloud added the improvement Refactoring, improving existing codebase label Oct 2, 2024