Image moderation models are used to detect inappropriate content in images. They classify an image against the labels below; a usage sketch follows the table.
| Label | Description |
|---|---|
| toxicity | Harmful content such as violence, offensive memes, hate, etc. |
| nudity | Exposed male or female genitalia, female nipples, sexual acts. |
| suggestive | Partial nudity, kissing. |
| gore | Blood, wounds, death. |
| violence | Graphic violence, causing harm, weapons, self-harm. |
| weapon | Weapons either in use or displayed. |
| drugs | Drugs such as pills. |
| hate | Symbols associated with Nazism, terrorist groups, white supremacy, and other hate groups. |
| smoking | Smoking or smoking-related content. |
| alcohol | Alcohol or alcohol-related content. |
| text | Text inside the picture. |
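As a rough illustration of how these labels are typically consumed, the sketch below sends an image to a hypothetical moderation endpoint and flags any label whose confidence exceeds a threshold. The endpoint URL, the response shape (a `labels` list of `{name, confidence}` objects), and the 0.5 cutoff are all assumptions for illustration, not part of any specific API.

```python
# Minimal sketch: post an image to a (hypothetical) moderation endpoint and
# collect labels above a confidence threshold. URL and response shape are assumed.
import requests

MODERATION_URL = "https://example.com/v1/image-moderation"  # placeholder endpoint
THRESHOLD = 0.5  # assumed cutoff; in practice, tune per label

def moderate_image(path: str) -> list[str]:
    """Return the names of moderation labels detected above THRESHOLD."""
    with open(path, "rb") as f:
        response = requests.post(MODERATION_URL, files={"image": f}, timeout=30)
    response.raise_for_status()
    # Assumed response shape: {"labels": [{"name": "nudity", "confidence": 0.87}, ...]}
    results = response.json().get("labels", [])
    return [r["name"] for r in results if r["confidence"] >= THRESHOLD]

if __name__ == "__main__":
    flagged = moderate_image("photo.jpg")
    if flagged:
        print("Image flagged for:", ", ".join(flagged))
    else:
        print("No moderation labels above threshold.")
```

In practice, per-label thresholds are often used, since acceptable confidence levels for labels like `suggestive` or `text` usually differ from those for `gore` or `nudity`.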