Fix multi-prompt inference #7208
nikitabalabin wants to merge 0 commits into huggingface:main from nikitabalabin:main
Conversation
Can you show an example of where the current code (without the changes in this PR) fails?

images[1] and images[2] will be generated with the wrong prompt_attention_mask.
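For concreteness, a minimal reproduction sketch, assuming a PixArt-style pipeline that builds a prompt_attention_mask; the pipeline class and checkpoint here are illustrative, not necessarily the exact one this PR touches:

```python
import torch
from diffusers import PixArtAlphaPipeline  # illustrative: any pipeline that uses prompt_attention_mask

# Checkpoint name is an assumption for the example.
pipe = PixArtAlphaPipeline.from_pretrained(
    "PixArt-alpha/PixArt-XL-2-1024-MS", torch_dtype=torch.float16
).to("cuda")

# Two prompts, two images per prompt -> four images, grouped per prompt:
# images[0], images[1] belong to prompts[0]; images[2], images[3] to prompts[1].
prompts = ["a red cube on a wooden table", "a blue sphere on green grass"]
images = pipe(prompt=prompts, num_images_per_prompt=2).images

# Before the fix, the duplicated prompt_attention_mask is ordered differently
# from the duplicated prompt_embeds, so images[1] and images[2] end up paired
# with the mask of the other prompt.
```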
Thanks! Could we add a fast test for this as well?
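Something along these lines could serve as a fast, CPU-only check. It exercises only the duplication pattern with stand-in tensors rather than the repository's pipeline test mixins, so the names and shapes are assumptions:

```python
import torch


def test_prompt_attention_mask_matches_prompt_embeds_order():
    # Stand-in tensors: 2 prompts, short sequence, small hidden size.
    batch_size, seq_len, dim, num_images_per_prompt = 2, 4, 8, 3

    # Encode the prompt index into the tensors so any mis-pairing is detectable.
    prompt_ids = torch.arange(batch_size).float()
    prompt_embeds = prompt_ids.view(batch_size, 1, 1).expand(batch_size, seq_len, dim)
    prompt_attention_mask = prompt_ids.view(batch_size, 1).expand(batch_size, seq_len)

    # Duplication pattern used for prompt_embeds: [p0, p0, p0, p1, p1, p1].
    dup_embeds = prompt_embeds.repeat(1, num_images_per_prompt, 1).view(
        batch_size * num_images_per_prompt, seq_len, dim
    )
    # The mask must follow the same per-prompt grouping.
    dup_mask = prompt_attention_mask.repeat_interleave(num_images_per_prompt, dim=0)

    # Every duplicated mask row must carry the same prompt id as its embedding row.
    assert torch.equal(dup_mask, dup_embeds[:, :, 0])
```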
hi @nikitabalabin
can we run

@hlky was the close expected?
Rebase to fix conflict went wrong 😅 |
Fix generation of multiple images with multiple prompts, e.g. len(prompts) > 1 and num_images_per_prompt > 1.
What does this PR do?
Fixes # (issue)
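For reference, a sketch of the kind of mismatch being fixed. The variable names and the repeat_interleave-based version are illustrative; the actual pipeline code may use an equivalent but different pattern:

```python
import torch

# Illustrative shapes: 2 prompts, 3 images per prompt.
bs, seq_len, dim, num_images_per_prompt = 2, 16, 32, 3
prompt_embeds = torch.randn(bs, seq_len, dim)
prompt_attention_mask = torch.ones(bs, seq_len, dtype=torch.long)

# prompt_embeds are duplicated per prompt: [p0, p0, p0, p1, p1, p1].
prompt_embeds = prompt_embeds.repeat(1, num_images_per_prompt, 1).view(
    bs * num_images_per_prompt, seq_len, -1
)

# Buggy duplication: plain repeat along dim 0 orders the mask as
# [p0, p1, p0, p1, p0, p1], which no longer lines up with prompt_embeds
# once there is more than one prompt.
# prompt_attention_mask = prompt_attention_mask.repeat(num_images_per_prompt, 1)

# Correct duplication: keep the same per-prompt grouping as the embeddings.
prompt_attention_mask = prompt_attention_mask.repeat_interleave(num_images_per_prompt, dim=0)

assert prompt_attention_mask.shape[0] == prompt_embeds.shape[0]
```

With a single prompt or num_images_per_prompt == 1 the two patterns coincide, which is why the bug only shows up when both are greater than one.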
Before submitting
Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.