[CI] Refactor Wan Model Tests #13082

Open
DN6 wants to merge 1 commit into main from wan-test-refactor

Conversation

Collaborator

@DN6 DN6 commented Feb 4, 2026

What does this PR do?

Update Wan tests with new format

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@DN6 DN6 requested review from dg845, sayakpaul and yiyixuxu February 4, 2026 12:58
Collaborator

@yiyixuxu yiyixuxu left a comment


Really nice, thanks!
Should we start to add a guide for contributors somewhere? Maybe https://huggingface.co/docs/diffusers/main/en/conceptual/contribution

Collaborator

@dg845 dg845 left a comment


Thanks! I see there are two Wan model related failures from the CI:

  • tests/models/transformers/test_models_transformer_wan_animate.py::TestWanAnimateTransformer3DAttention::test_fuse_unfuse_qkv_projections
  • tests/models/transformers/test_models_transformer_wan_vace.py::TestWanVACETransformer3DAttention::test_fuse_unfuse_qkv_projections
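
For context, these fuse/unfuse tests assert that fusing the separate Q/K/V projections into a single projection preserves the attention inputs. A minimal NumPy sketch of that invariant (hypothetical shapes, not diffusers code):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hypothetical hidden size
x = rng.standard_normal((2, d))  # a small batch of hidden states

# Separate Q/K/V projection weights, as in an unfused attention block
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
q, k, v = x @ w_q, x @ w_k, x @ w_v

# "Fusing" concatenates the three weight matrices into one projection,
# so a single matmul produces all three outputs at once
w_qkv = np.concatenate([w_q, w_k, w_v], axis=1)  # shape (d, 3d)
q2, k2, v2 = np.split(x @ w_qkv, 3, axis=1)

# The fused path must reproduce the unfused projections exactly
assert np.allclose(q, q2) and np.allclose(k, k2) and np.allclose(v, v2)
print("fused and unfused projections match")
```

The tests also check the round trip, i.e. that unfusing restores the original per-projection behavior.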

If I try to run the new Wan tests locally, for example with

pytest tests/models/transformers/test_models_transformer_wan.py

I get some more test failures:

  • tests/models/transformers/test_models_transformer_wan.py::TestWanTransformer3D
    • test_keep_in_fp32_modules
    • test_from_save_pretrained_dtype_inference[fp16,bf16]
  • tests/models/transformers/test_models_transformer_wan.py::TestWanTransformer3DGGUF
    • test_gguf_quantization_inference
    • test_gguf_keep_modules_in_fp32
    • test_gguf_quantization_dtype_assignment
    • test_gguf_quantization_lora_inference
    • test_gguf_dequantize
    • test_gguf_quantized_layers
  • tests/models/transformers/test_models_transformer_wan.py::TestWanTransformer3DGGUFCompile
    • test_gguf_torch_compile
    • test_gguf_torch_compile_with_group_offload

Are these test failures expected?
