fix: missing AutoencoderKL lora adapter #9807
Conversation
sayakpaul
left a comment
Thank you! Just a single comment.
Force-pushed from 09cd44d to 9fb4880
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Can you run `make style && make quality`?
The failing tests are related; can we look into them? I think we may need to add a `@require_peft_backend`, similar to the other LoRA tests.
sayakpaul
left a comment
Sorry, I had actually forgotten to submit my reviews.
```python
from ..test_modeling_common import ModelTesterMixin, UNetTesterMixin

from peft import LoraConfig
```
This needs to be guarded like this:
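The suggested snippet did not survive the page extraction; a minimal sketch of the usual diffusers pattern being asked for here, assuming `is_peft_available` is exposed from `diffusers.utils`:

```python
from diffusers.utils import is_peft_available

# Import peft only when the backend is actually installed,
# so the test module can still be collected without it.
if is_peft_available():
    from peft import LoraConfig
```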
```python
        self.assertTrue(torch_all_close(output_slice, expected_output_slice, rtol=1e-2))

    def test_lora_adapter(self):
```
This needs to be decorated with:
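The decorator is presumably the `require_peft_backend` helper mentioned earlier in the thread; a minimal sketch with an illustrative test class name:

```python
import unittest

from diffusers.utils.testing_utils import require_peft_backend


class AutoencoderKLLoraTests(unittest.TestCase):  # illustrative class name
    @require_peft_backend  # skip this test when peft is not installed
    def test_lora_adapter(self):
        ...  # test body elided
```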
@sayakpaul ah, apologies, I may have forgotten to push to the repo. Done. Thanks for your vigilance.
All done, @yiyixuxu, I believe; let me know if anything else remains.
@beniz thanks! Possible to fix the quality issues by running `make style && make quality`?
I've fixed a missing dependency. FYI, running `make style && make quality` fails early on other unrelated files, so I've been looking at the underlying ruff calls to apply them to the relevant files only.
@beniz I pushed the quality fixes directly to your branch, which I hope is okay. If not, please let me know and I will revert immediately.
Much appreciated, thank you.
* fix: missing AutoencoderKL lora adapter
* fix

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
…h bnb components (#9840)

* allow device placement when using bnb quantization.
* warning.
* tests
* fixes
* docs.
* require accelerate version.
* remove print.
* revert to()
* tests
* fixes
* fix: missing AutoencoderKL lora adapter (#9807)
* fixes
* fix condition test
* updates
* updates
* remove is_offloaded.
* fixes
* better
* empty

Co-authored-by: Emmanuel Benazera <emmanuel.benazera@jolibrain.com>
Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
What does this PR do?
This PR fixes the missing LoRA adapter support for the VAE (`AutoencoderKL` class).
Discussion is here: #9771
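For context, a minimal sketch of the usage this fix enables; the checkpoint ID and `target_modules` below are illustrative, not taken from the PR:

```python
from diffusers import AutoencoderKL
from peft import LoraConfig

# Load a standalone VAE (illustrative checkpoint).
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse")

# LoRA config targeting the VAE's attention projections (illustrative choice).
lora_config = LoraConfig(
    r=4,
    lora_alpha=4,
    init_lora_weights="gaussian",
    target_modules=["to_k", "to_q", "to_v", "to_out.0"],
)

# Previously this raised an error because the PEFT adapter API
# was not wired up for AutoencoderKL.
vae.add_adapter(lora_config)
```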
Related reports:
GaParmar/img2img-turbo#64
radames/Real-Time-Latent-Consistency-Model#38
Who can review?
cc @sayakpaul