- Block persistent violators identified through trust levels or fraud detection
- Remove spam accounts posting promotional content repeatedly
- Suspend users to send clear signals about acceptable behavior
Built-in actions
The system provides three essential moderation actions for comprehensive account management:
Block (Permanent Ban)
Permanently prevents a user from submitting content for moderation.
When to use:
- Spam accounts with no legitimate content
- Users repeatedly violating major community guidelines
- Accounts identified as malicious through fraud detection
- Coordinated attacks or bot networks
Effects:
- User status set to `"blocked"`
- All future content submissions are rejected
- `block.until` field set to `null` (permanent)
- `block.reason` stores the reason provided
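For illustration, a blocked user record might look like the sketch below. Beyond `status`, `block.until`, and `block.reason`, the field names are assumptions, and the real response includes more fields.

```typescript
// Hypothetical author record after a permanent block (shape simplified).
const blockedUser = {
  id: "user_123", // assumed identifier field
  status: "blocked", // set by the Block action
  block: {
    until: null, // null = permanent
    reason: "Spam", // the reason provided when blocking
  },
};
```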
Suspend (Temporary Ban)
Temporarily prevents a user from submitting content for a specified period.
When to use:
- First-time serious violations that warrant a cooling-off period
- Users who might reform with temporary consequences
- Escalating enforcement before permanent blocks
- Accounts needing investigation time
Effects:
- User status set to `"suspended"`
- Content submissions rejected until suspension ends
- `block.until` field set to the suspension end time (timestamp)
- `block.reason` stores the reason provided
- Automatic reinstatement when the period expires
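Because `block.until` holds the suspension end time, your application can mirror the reinstatement check client-side. A minimal sketch, assuming `until` is an ISO 8601 string:

```typescript
// Client-side convenience check that mirrors the automatic reinstatement:
// a suspension is active only until the `block.until` timestamp passes.
// Assumes `until` is an ISO 8601 string; null indicates a permanent block.
function isSuspensionActive(until: string | null): boolean {
  if (until === null) return false; // permanent blocks are handled separately
  return new Date(until).getTime() > Date.now();
}
```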
Enable (Remove Block/Suspension)
Removes existing blocks or suspensions, restoring normal access.
When to use:
- Appeals that have been approved
- Mistaken blocks that need correction
- Policy changes requiring user reinstatement
- Early release from suspensions for good behavior
Effects:
- User status reset to `"enabled"`
- Content submission access restored
- Block fields cleared from user record
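For contrast with the blocked record above, a re-enabled record might look like this simplified sketch (field names are again assumptions):

```typescript
// Hypothetical author record after the Enable action: block fields cleared.
const reinstatedUser = {
  id: "user_123", // assumed identifier field
  status: "enabled", // reset by the Enable action
  block: null, // block fields cleared from the record
};
```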
Action customization
You can customize how actions appear and behave; a configuration sketch follows the list below.
Value customization:
- Predefined values: Add custom dropdown options like “Spam”, “Harassment”, “Fraud”, “Policy Violation”
- Free text option: Enable “Other” option that shows a text input field for custom reasons
- Combined approach: Mix predefined categories with free text for flexibility
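As an illustration only, a reason setup combining predefined categories with a free-text option might be modeled like this. The property names are assumptions, not the product's actual configuration schema, which you manage in the dashboard.

```typescript
// Hypothetical action configuration: predefined reasons plus free text.
const blockActionConfig = {
  reasons: ["Spam", "Harassment", "Fraud", "Policy Violation"],
  allowOtherFreeText: true, // shows a text input when "Other" is selected
};
```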
Adding custom actions
Beyond the built-in actions, you can create custom author-level actions for specialized workflows; they appear alongside the standard actions (see the sketch after the list below).
Use cases for custom actions:
- Warning systems: Formal warnings that don’t restrict access
- Verification requirements: Require additional account verification
- Feature restrictions: Limit specific platform features
- Escalation triggers: Automatically escalate to senior moderators
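A custom warning action might be modeled as below. The shape is illustrative only; the actual options are configured in the dashboard.

```typescript
// Hypothetical custom author-level action: a formal warning that
// does not restrict the user's access.
const warningAction = {
  key: "formal-warning", // assumed identifier
  label: "Issue warning", // shown alongside Block/Suspend/Enable
  restrictsSubmissions: false, // warnings don't block content
  reasons: ["First offense", "Repeated minor violations"],
};
```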
Executing actions
Dashboard
1. Navigate to the user detail page in the user dashboard
2. Select the appropriate action (Block, Suspend, or Enable)
3. Choose from predefined reason options or select “Other” for free text
4. For suspensions: select duration from the dropdown or set a custom end time
5. Confirm the action
Reasons:
- Required for documentation and potential appeals
- Visible to moderators in user timeline
- Returned in API responses for application logic
- Helps identify patterns in violation types

Action tracking:
- Action history: Complete chronological record of all actions taken
- Moderator attribution: Which team member executed each action
- Reason logging: Full context for every moderation decision
- Audit trail: Comprehensive record for compliance and appeals
Technical implementation
Moderation actions modify several fields returned in API responses, allowing seamless integration with your application logic.
Status fields
Monitor user status through API responses:
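As a simplified model of the fields described in this guide (not the full response schema):

```typescript
// Simplified model of the moderation status fields described above --
// not the full API response schema.
interface AuthorModerationStatus {
  status: "enabled" | "suspended" | "blocked";
  block: {
    until: string | null; // suspension end time; null for permanent blocks
    reason: string; // the reason provided with the action
  } | null; // cleared once the user is re-enabled
}
```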
Content submission integration
The system automatically handles blocked/suspended users when content is submitted to moderation endpoints, but you can also check status proactively in your application by querying the user in the Author API.
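A proactive check might look like the sketch below. Here `fetchAuthor` is a placeholder for whatever Author API client call your integration uses; it is not a real SDK function.

```typescript
type AuthorStatus = {
  status: "enabled" | "suspended" | "blocked";
  block: { until: string | null; reason: string } | null;
};

// Placeholder for your Author API lookup (not a real SDK function).
declare function fetchAuthor(id: string): Promise<AuthorStatus>;

// Reject submissions from blocked or suspended authors before they
// reach your moderation endpoint.
async function assertCanSubmit(authorId: string): Promise<void> {
  const author = await fetchAuthor(authorId);
  if (author.status === "blocked") {
    throw new Error("Account is permanently blocked.");
  }
  if (author.status === "suspended") {
    const until = author.block?.until ?? "further notice";
    throw new Error(`Account suspended until ${until}.`);
  }
}
```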
Webhooks
Receive real-time notifications when moderation actions are executed through webhook integration; a handler sketch follows the list.
Common webhook use cases:
- Sync with external systems like customer support platforms
- Trigger email notifications to affected users
- Update internal user databases with moderation status
- Log actions for compliance and audit requirements
- Alert security teams about fraud-related blocks
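As a sketch only, a webhook receiver might look like the following. The payload shape (`action`, `userId`, `reason`) is an assumption; consult the webhook reference for the real event schema, and verify request signatures in production.

```typescript
import express from "express";

const app = express();

// Hypothetical endpoint receiving moderation-action events.
// The payload shape here is an assumption -- check the webhook
// reference for the actual event schema.
app.post("/webhooks/moderation", express.json(), (req, res) => {
  const event = req.body as {
    action: "block" | "suspend" | "enable"; // assumed field names
    userId: string;
    reason?: string;
  };

  if (event.action === "block") {
    // e.g. alert the security team, update the internal user database
    console.log(`User ${event.userId} blocked: ${event.reason ?? "no reason"}`);
  }

  res.sendStatus(200); // acknowledge promptly; do heavy work asynchronously
});

app.listen(3000);
```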
FAQ
Can blocked users still view content?
Moderation actions only prevent content submission for moderation. They don’t affect content viewing unless you implement additional restrictions in your application using the status fields.
What happens to existing content when a user is blocked?
Existing content remains in your system and review queues. Moderation actions only prevent new submissions. You may want to review existing content from blocked users separately.
Can suspended users see their suspension status?
The system provides the suspension information through API responses, but displaying this to users depends on your application implementation. Use the `user.block.until` and `user.block.reason` fields to show appropriate messages.
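For example, a minimal notice built from those two fields (the timestamp format is assumed to be ISO 8601):

```typescript
// Build a user-facing notice from the suspension fields.
// Assumes `until` is an ISO 8601 timestamp string.
function suspensionNotice(until: string, reason: string): string {
  const endDate = new Date(until).toLocaleString();
  return `Your account is suspended until ${endDate}. Reason: ${reason}.`;
}
```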
How do I handle ban appeals?
Use the Enable action to reinstate users after successful appeals. The action
history timeline maintains a record of the original block and subsequent
reinstatement for transparency.
Can I modify suspension periods after they're set?
You can use the Enable action to end suspensions early, or apply a new Suspend
action to change the period. Each action creates a new timeline entry for full
audit tracking.
Do moderation actions work with trust levels and fraud detection?
Yes, moderation actions complement these systems perfectly. High fraud risk scores or low trust levels often indicate when moderation actions are needed. The systems work together to provide comprehensive user management.