User Reporting - Introduction
See something wrong? Use WeChat's in-app reporting function to report potential violations of these Community Guidelines.
To find out how to make a report, refer to the following sections of the current version of the Community Guidelines: making a report against content in a chat or group chat; making a report against a WeChat user; and making a report against a Moments post.
We detect violations of these Community Guidelines through user reports as well as proactive identification of violating content or behavior. We enforce these Community Guidelines with the help of our content moderation team and technology. Our content moderation team undergoes regular and ad-hoc training to ensure it is equipped with the knowledge and tools needed to moderate content in accordance with the relevant policies. To ensure a high level of consistency and accuracy in our content moderation, we have internal escalation processes through which our content moderation team can escalate more complex issues to their team leaders or to other relevant experts, such as our trust and safety or legal teams, where appropriate.
We also use automated tools, including machine learning models and logic-based rules, to identify and moderate violating content such as nudity, fraud and gambling. These tools take various factors into account to determine whether to take action against content or a user that has violated our content moderation policies.
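For illustration only, the sketch below shows one way logic-based rules and a machine learning model score could be combined to decide a moderation outcome. The signal names, thresholds, and actions are hypothetical assumptions made for the example and are not a description of WeChat's actual systems or policies.

```python
# Hypothetical sketch: combining logic-based rules with a model score to
# decide a moderation action. All names and thresholds are invented for
# illustration and do not reflect WeChat's actual moderation systems.
from dataclasses import dataclass


@dataclass
class ContentSignals:
    model_violation_score: float  # e.g. classifier probability of nudity/fraud/gambling
    matched_rule: bool            # e.g. content matched a known violating pattern
    prior_violations: int         # prior violations by the posting account


def decide_action(signals: ContentSignals) -> str:
    """Return a moderation action based on combined rule and model signals."""
    # Hard rule: known violating patterns are actioned regardless of model score.
    if signals.matched_rule:
        return "remove"
    # High-confidence model detections are actioned automatically.
    if signals.model_violation_score >= 0.95:
        return "remove"
    # Borderline cases, or accounts with prior violations, go to human review.
    if signals.model_violation_score >= 0.6 or signals.prior_violations > 0:
        return "human_review"
    return "no_action"


# Example: a borderline detection with no rule match is routed to human review.
print(decide_action(ContentSignals(model_violation_score=0.7,
                                   matched_rule=False,
                                   prior_violations=0)))  # -> "human_review"
```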