Haha, busted!


To be fair, the reason the filter is there is that if you ask for a picture of a woman, Stable Diffusion is pretty likely to generate a naked one!

If you tweak the prompt to explicitly mention clothing, you should be OK though.
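A minimal sketch of that workaround using Hugging Face's diffusers library (the "runwayml/stable-diffusion-v1-5" checkpoint name is just an assumption; any SD 1.x checkpoint works the same way, and the built-in safety checker is what the filter in question appears to be):

  from diffusers import StableDiffusionPipeline
  import torch

  # Load a Stable Diffusion checkpoint; the safety checker is enabled by default
  # and blacks out images it flags as NSFW.
  pipe = StableDiffusionPipeline.from_pretrained(
      "runwayml/stable-diffusion-v1-5",
      torch_dtype=torch.float16,
  ).to("cuda")

  # A bare prompt like "a woman" is the kind that tends to trip the filter;
  # spelling out clothing and setting usually keeps the output in bounds.
  prompt = "portrait of a woman wearing a blue winter coat, standing outdoors"
  image = pipe(prompt).images[0]
  image.save("portrait.png")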


Wow, is that true? I’ve never heard a more textbook ethical problem with a model.


Safari blocking searches for "asian" probably had more impact: https://9to5mac.com/2021/03/30/ios-14-5-no-longer-blocks-web...


It's an ethical problem with our society, not the model.


If you consider that the training set includes most Western art from the last few centuries, it's not too surprising. There are an awful lot of nudes in that set, and most of them are female.



