Microsoft has released its fifth Transparency Report for Xbox, revealing that the company prevented as many as 19 million pieces of content that violated the Xbox Community Standards from reaching players on its platform between January and June 2024. Among its key takeaways, the report notes that the Xbox moderation team now leverages two new AI tools: Xbox AutoMod, a new solution that has already handled “1.2M cases and enabled the team to remove content affecting players 88% faster,” and a second tool that “proactively works to prevent unwanted communications.” AI tools are helping the Call of Duty team as well, with the launch of Call of Duty: Black Ops 6 said to feature significantly less voice toxicity thanks to expanded voice moderation and more. Per the report:
…the [Call of Duty] team deploys advanced tech, including AI, to empower moderation teams and combat toxic behavior. These tools are purpose-built to help foster a more inclusive community where players are treated with respect and are competing with integrity. Since November 2023, over 45 million text-based messages have been blocked across 20 languages, and exposure to voice toxicity has dropped by 43%. With the launch of Call of Duty: Black Ops 6, the team rolled out support for voice moderation in French and German, in addition to existing support for English, Spanish, and Portuguese. As part of this ongoing work, the team also conducts research on prosocial behavior in gaming.