Moderating different content types
Learn about the different content types you can analyze with Moderation API.
Text
Text moderation is the most common type of content moderation. It’s used to moderate text-based content such as:
- chat messages
- forum posts
- comments
- reviews
- product descriptions
- profile bios
If you are analyzing chat messages or other thread-based content, you might benefit from enabling context awareness in your project settings.
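For illustration, a text moderation request could be assembled as in the sketch below. The endpoint path, field names (`value`, `contextId`), and payload shape are assumptions made for this example, not the documented request format:

```python
import json

# Hypothetical endpoint URL, for illustration only.
API_URL = "https://moderationapi.example/api/v1/moderate/text"

def build_text_request(text, context_id=None):
    """Build the JSON payload for a text moderation request (assumed shape).

    context_id is a hypothetical field that groups messages from the same
    thread, so that context awareness (if enabled in your project settings)
    can take earlier messages in the conversation into account.
    """
    payload = {"value": text}
    if context_id is not None:
        payload["contextId"] = context_id
    return payload

# Example: two chat messages from the same conversation share a context id.
first = build_text_request("Hey, how are you?", context_id="thread-42")
second = build_text_request("See you at the usual spot", context_id="thread-42")
print(json.dumps(second))
```

Standalone content such as a review or product description can simply omit the context id.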
Image
Image moderation involves analyzing visual content to detect inappropriate or harmful images. This can include identifying nudity, violence, or other objectionable content. The Moderation API uses advanced image recognition techniques to provide accurate and reliable results.
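As a sketch, an image moderation request might reference the image by URL. The `url` field name and the URL-only input are assumptions for this example, not the documented API:

```python
def build_image_request(image_url):
    """Build the payload for an image moderation request (assumed shape).

    This sketch assumes the image is passed as a publicly reachable
    http(s) URL rather than uploaded inline.
    """
    if not image_url.startswith(("http://", "https://")):
        raise ValueError("image must be passed as an http(s) URL in this sketch")
    return {"url": image_url}

req = build_image_request("https://example.com/uploads/avatar.png")
```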
Audio
Audio moderation is available for enterprise users and involves analyzing audio content to detect inappropriate language or sounds. This can be useful for moderating podcasts, voice messages, or any other audio content.
Video
Video moderation, also available for enterprise users, involves analyzing video content to detect inappropriate scenes or actions. This can include identifying violence, nudity, or other objectionable content within video files.
Object
Object moderation provides comprehensive content analysis across different media types, simplifying the analysis of an entire entity, such as a user profile or product listing, rather than each field separately.
You can select the type of object you want to moderate; choosing the right type can improve moderation accuracy. We currently support the following types:
- Profile
- Product
- Event
- Object (general)
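A sketch of bundling an entity's fields into one object moderation payload is shown below. The `type`/`fields` structure, the lowercased type names, and the per-field `{"type", "value"}` shape are assumptions for this example, not the documented format:

```python
# Lowercased versions of the supported object types listed above (sketch only).
SUPPORTED_TYPES = {"profile", "product", "event", "object"}

def build_object_request(object_type, fields):
    """Build a payload that moderates an entire entity in one request (assumed shape).

    object_type: one of the supported object types.
    fields: mapping of field name -> {"type": "text" | "image", "value": ...},
            so text and image content are analyzed together as one entity.
    """
    if object_type not in SUPPORTED_TYPES:
        raise ValueError(f"unsupported object type: {object_type}")
    return {"type": object_type, "fields": fields}

# Example: a user profile combining text fields and an avatar image.
profile = build_object_request("profile", {
    "displayName": {"type": "text", "value": "cool_user_99"},
    "bio": {"type": "text", "value": "Just here to chat!"},
    "avatar": {"type": "image", "value": "https://example.com/avatar.png"},
})
```

Grouping the fields this way is what lets the analysis consider the entity as a whole instead of each field in isolation.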