[Safety Checker] Add Safety Checker Module#36
Conversation
rromb
left a comment
Looks good, we could consider making this optional
Happy to adapt the way you prefer.
Any instructions on how to adjust this? Emad on Twitter mentioned there was a way to specifically block clowns, for example, which is a fun insight into how this works. Or even a simple "comment this out to completely turn safety off" note would be good to have.
Updating ipynb with colab-convert
Yes, it would be nice to be able to remove the safety checker. Some of us are adults!
You can already remove the safety checker; it's just that the removal procedure isn't explained in the documentation.
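For anyone looking for that undocumented removal step, one common approach is to replace the pipeline's checker with a pass-through. This is a sketch only: it assumes the pipeline stores the checker on a `safety_checker` attribute and calls it with images plus a CLIP input, returning the images and a list of NSFW flags; the names here are illustrative, not taken from this PR.

```python
# Sketch: disable the safety checker by swapping in a pass-through.
# Assumes the pipeline expects a callable returning (images, has_nsfw_flags);
# the function and attribute names are assumptions, not the actual API.

def null_safety_checker(images, clip_input=None, **kwargs):
    """Pass-through replacement: return images unchanged, flag nothing."""
    return images, [False] * len(images)

# Usage (after loading a pipeline, e.g. with diffusers):
#   pipe.safety_checker = null_safety_checker
```

Commenting out the checker call in the pipeline source works too, but monkey-patching like this keeps the installed package untouched.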
I saw the model comes from https://huggingface.co/CompVis/stable-diffusion-safety-checker/blob/main/pytorch_model.bin and I'm wondering where this model comes from. There are some NSFW detectors on GitHub, but this one looks different (larger).
So far I've had no luck finding the right script to remove the NSFW feature. It's blocking images that are already safe. Very frustrating.
Maybe try this notebook. |
With @anton-l and @patil-suraj, we've made sure that the safety_checker works correctly; see huggingface/diffusers#219.
The only thing we're not 100% sure about is whether the threshold defined on this line is correct: https://github.com/huggingface/diffusers/blob/89e9521048067acacfdcbc2b985af8f6b155cfb6/src/diffusers/pipelines/stable_diffusion/safety_checker.py#L65
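For context on what that threshold line gates: the checker compares the generated image's CLIP embedding against precomputed embeddings of blocked concepts by cosine similarity, and flags the image when the similarity minus a per-concept threshold (plus a tunable adjustment) is positive. The following is a rough sketch of that logic under those assumptions; all names and the `adjustment` parameter are illustrative, not the actual implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def is_flagged(image_embedding, concept_embeddings, thresholds, adjustment=0.0):
    """Flag the image if any concept similarity exceeds its threshold.

    `adjustment` loosely mirrors the tunable offset the comment above asks
    about: raising it makes the check stricter, lowering it more permissive.
    """
    return any(
        cosine_similarity(image_embedding, concept) - threshold + adjustment > 0
        for concept, threshold in zip(concept_embeddings, thresholds)
    )
```

On this picture, "blocking clowns specifically" would amount to adding a clown concept embedding with its own threshold, and false positives on safe images correspond to thresholds (or the adjustment) being set too aggressively.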