Intelligent Media Management: Content moderation

Last Updated: Nov 14, 2024

Content moderation plays a significant role in media content management and can be divided into more specific types, such as text moderation, image moderation, and video moderation. Text moderation detects and marks non-compliant text, whereas image and video moderation monitor visual media for inappropriate content. Content moderation helps prevent the dissemination of inappropriate content and mitigate content security risks.

Scenarios

Image moderation

Image moderation is a common image processing technology for moderating profile photos and photos that are posted on websites and social apps. For more information about supported image moderation capabilities, see Image moderation.

Image auditing for website communities

Image moderation can be used to scan and moderate images posted on blogging communities and forums for pornographic content, forbidden content, unwanted ads, and other inappropriate content. This helps platform operators improve content quality and reduce compliance violation risks.

Image auditing for social applications

Without effective content moderation, social applications can become a major source of advertisements and pornographic images and fail to meet content compliance requirements. Image moderation can be used to audit real-time messages for sensitive images and prevent business compliance violations.

Profile review

Image moderation can be used to review profile photos for improper content such as pornographic content, forbidden content, and advertising contact information. This helps maintain a healthy platform environment and improve user experience.

Text moderation

Social media content moderation

Text moderation can be used to automatically detect and filter cyberbullying terms, hate speech, offensive content, and other content that goes against platform guidelines. This helps protect social media users from disturbing content and contributes to a safer, healthier, and compliant online environment, which ultimately improves user experience and satisfaction.

Video moderation

Video moderation is commonly used in scenarios such as intelligent pornography detection, terrorist content detection, undesirable scene detection, logo detection, and ad violation detection. For more information, see Video moderation.

Limits

Image moderation

  • Image formats: Image moderation supports the PNG, JPG, and JPEG formats.

  • Image size: The following limits apply:

    • The image to detect cannot exceed 20 MB in size.

    • The height or width of the image to detect cannot exceed 30,000 pixels.

    • The total number of pixels of the image to detect cannot exceed 250,000,000.
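The limits above can be checked client-side before an image is submitted for moderation. The following sketch is illustrative only: the function name and structure are not part of the IMM API, and it assumes you have already read the image's format, file size, and dimensions.

```python
# Pre-flight check against the documented image moderation limits.
# can_moderate_image is a hypothetical helper, not an IMM API operation.

SUPPORTED_FORMATS = {"png", "jpg", "jpeg"}
MAX_SIZE_BYTES = 20 * 1024 * 1024   # 20 MB
MAX_DIMENSION = 30_000              # pixels, height or width
MAX_TOTAL_PIXELS = 250_000_000      # width * height

def can_moderate_image(fmt: str, size_bytes: int, width: int, height: int) -> bool:
    """Return True if the image satisfies all documented limits."""
    return (
        fmt.lower() in SUPPORTED_FORMATS
        and size_bytes <= MAX_SIZE_BYTES
        and width <= MAX_DIMENSION
        and height <= MAX_DIMENSION
        and width * height <= MAX_TOTAL_PIXELS
    )

# A 4000 x 3000 JPEG of 5 MB is within every limit.
print(can_moderate_image("jpeg", 5 * 1024 * 1024, 4000, 3000))    # True
# A 20000 x 15000 PNG exceeds the 250,000,000-pixel cap.
print(can_moderate_image("png", 10 * 1024 * 1024, 20000, 15000))  # False
```

Rejecting oversized or unsupported images locally avoids a round trip that the service would refuse anyway.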

Video moderation

  • Video formats: Video moderation supports the following formats: AVI, MPEG, MPG, DAT, DIVX, XVID, RM, RMVB, MOV, QT, ASF, WMV, VOB, 3GP, MP4, FLV, AVS, MKV, TS, OGM, NSV, and SWF.
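A quick way to screen files against the format list above is to check the file extension. This is only a sketch: the helper name is hypothetical, and an extension is just a hint at the actual container format, so treat it as a pre-flight check rather than a guarantee.

```python
import os

# Formats supported by video moderation, as listed above.
SUPPORTED_VIDEO_FORMATS = {
    "avi", "mpeg", "mpg", "dat", "divx", "xvid", "rm", "rmvb", "mov",
    "qt", "asf", "wmv", "vob", "3gp", "mp4", "flv", "avs", "mkv",
    "ts", "ogm", "nsv", "swf",
}

def is_supported_video(filename: str) -> bool:
    """Check a file's extension (case-insensitively) against the list."""
    ext = os.path.splitext(filename)[1].lstrip(".").lower()
    return ext in SUPPORTED_VIDEO_FORMATS

print(is_supported_video("clip.MP4"))   # True
print(is_supported_video("clip.webm"))  # False
```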

Prerequisites

  • An AccessKey pair is created and obtained. For more information, see Create an AccessKey pair.

  • OSS is activated, a bucket is created, and objects are uploaded to the bucket. For more information, see Upload objects.

  • IMM is activated. For more information, see Activate IMM.

  • A project is created in the IMM console. For more information, see Create a project.

    Note
    • You can call the CreateProject operation to create a project. For more information, see CreateProject.

    • You can call the ListProjects operation to query the existing projects in a specific region. For more information, see ListProjects.

Usage notes

Important

This feature is in invitational preview. If you are interested in this feature, contact your account manager or submit a ticket. We are ready to provide professional support.