
Minor test suite refactoring and adding benchmarks #168

Open
hulk8 wants to merge 7 commits into cayleypy:main from hulk8:refactor/benchmark
Open

Minor test suite refactoring and adding benchmarks#168
hulk8 wants to merge 7 commits into
cayleypy:mainfrom
hulk8:refactor/benchmark

Conversation

hulk8 commented Oct 31, 2025

Summary

This pull request refactors the test suite and the benchmarking process, and adds benchmarks to the CI pipeline.

Key Changes:

  • Test suite refactoring with pytest.mark:

    • Tests are now categorized using pytest markers:
      • @pytest.mark.unit: For fast, isolated unit tests that should run on every commit.
      • @pytest.mark.slow: For long-running tests that can be skipped during regular development.
      • @pytest.mark.benchmark: For performance benchmark tests.
    • This replaces the previous environment-variable-based approach (`BENCHMARK_RUN`, `RUN_SLOW_TESTS`), making test execution more explicit and configurable.
  • Configuration:

    • The pytest configuration has been added to pyproject.toml.
  • Added benchmarking workflow:

    • The main CI workflow is now configured to run benchmarks and check for regressions.
    • Benchmark results are published, with plots, to the project's GitHub Pages. An example of how this looks is in my fork.
    • Benchmark results are stored as JSON files in the cache to preserve history, because `benchmark-action/github-action-benchmark@v1` does not store the source pytest-benchmark JSON files in the gh-pages branch.
  • CI improvements:

    • The CI jobs have been updated to leverage the new pytest markers, allowing for more granular control over which tests are run.
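The marker scheme described above can be sketched as follows. The test names and bodies here are hypothetical placeholders, not the repository's actual tests; the markers themselves (`unit`, `slow`, `benchmark`) are registered under `[tool.pytest.ini_options]` in `pyproject.toml` as noted above.

```python
import pytest

# Hypothetical tests illustrating the marker categories.
# Markers must be registered in pyproject.toml, e.g.:
#   [tool.pytest.ini_options]
#   markers = [
#       "unit: fast, isolated unit tests",
#       "slow: long-running tests",
#       "benchmark: performance benchmarks",
#   ]

@pytest.mark.unit
def test_fast_path():
    # Runs on every commit.
    assert 1 + 1 == 2

@pytest.mark.slow
def test_large_input():
    # Can be skipped during regular development.
    assert sum(range(10_000)) == 49_995_000

# Subsets are then selected from the command line instead of env vars:
#   pytest -m unit          # fast tests only
#   pytest -m "not slow"    # everything except long-running tests
#   pytest -m benchmark     # benchmarks only
```

This is what makes the env-var approach unnecessary: the category travels with the test itself, and `-m` expressions compose (`-m "unit and not benchmark"`).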

Comments

  • The regression alert threshold is set to 1.25; if a benchmark regresses beyond it, CI fails.
  • If a regression is detected, a comment will appear in the PR.
    [screenshot: example regression alert comment]
  • We may also want to alert more users on regression detection (option `alert-comment-cc-users`).
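The alerting behavior above can be sketched as a workflow step for `benchmark-action/github-action-benchmark@v1`. This is an illustrative fragment, not the PR's exact workflow: the step name, the `output-file-path`, and the CC'd user are assumptions.

```yaml
# Hypothetical workflow step; file path and user handle are placeholders.
- name: Compare benchmark results
  uses: benchmark-action/github-action-benchmark@v1
  with:
    tool: 'pytest'                     # parse pytest-benchmark JSON output
    output-file-path: benchmark.json   # assumed output file name
    github-token: ${{ secrets.GITHUB_TOKEN }}
    auto-push: true                    # publish plots to the gh-pages branch
    alert-threshold: '125%'            # the 1.25 regression threshold
    fail-on-alert: true                # fail CI on regression
    comment-on-alert: true             # post the alert comment in the PR
    alert-comment-cc-users: '@hulk8'   # assumed; extend to alert more users
```

With `fail-on-alert` and `comment-on-alert` both enabled, a regression past the threshold both fails the job and leaves the PR comment shown in the screenshot.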


hulk8 commented Oct 31, 2025

To upload benchmark results to GitHub Pages we need to create a gh-pages branch in which they will be stored (this is why the pipeline currently fails at the benchmark-upload step).
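Creating the missing branch can be done with an orphan branch so gh-pages shares no history with main. The commands below are a sketch run in a throwaway repository; in the real clone you would run the same `checkout`/`commit` steps and finish with `git push origin gh-pages`.

```shell
# Sketch: create an empty gh-pages branch (demonstrated in a temp repo;
# in practice, run the checkout/commit steps in your clone and push).
repo=$(mktemp -d)
cd "$repo"
git init --quiet
git config user.email "ci@example.com"   # placeholder identity for the demo
git config user.name "CI"
git checkout --orphan gh-pages           # new branch with no parent history
git commit --allow-empty --quiet -m "Initialize gh-pages for benchmark results"
# In the real repository: git push origin gh-pages
```

Once the branch exists on the remote, the upload step's `auto-push` can commit the generated plots and data to it.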
