Xbox's latest AI transparency report, covering the first half of 2024, highlights significant advancements in moderation techniques that combine artificial intelligence with human oversight. During this period, Xbox blocked over 19 million pieces of content that violated its Community Standards. The moderation system employs a dual AI approach to rapidly identify and classify inappropriate messages, and Xbox is also implementing advanced moderation tools in games such as Minecraft and Call of Duty to reduce toxic behavior.

A new 'strike system' has also been introduced, in which players accumulate strikes for rule violations and face escalating consequences for repeated offenses; severe violations can lead to immediate permanent suspensions or device bans. Throughout, the report emphasizes the positive role of AI in enhancing player safety and reducing the burden on human moderators.
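The report does not describe how the dual AI approach is wired together internally, but a common pattern for this kind of system is a cheap first-pass classifier that screens every message and a heavier second model that re-scores only the flagged subset, with ambiguous cases routed to human reviewers. The sketch below illustrates that pattern; the function names, thresholds, and keyword stub are assumptions for illustration, not Xbox's implementation.

```python
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    ALLOW = "allow"
    BLOCK = "block"
    HUMAN_REVIEW = "human_review"


@dataclass
class Message:
    author_id: str
    text: str


def fast_screen(message: Message) -> float:
    """First-stage model: a cheap classifier returning a toxicity score in [0, 1].

    Stubbed here with a keyword heuristic; a real system would call a trained
    model. (Hypothetical: the report does not describe the models used.)
    """
    flagged_terms = {"slur", "threat"}
    hits = sum(term in message.text.lower() for term in flagged_terms)
    return min(1.0, hits * 0.6)


def deep_classify(message: Message) -> float:
    """Second-stage model: a slower, more precise classifier that re-scores
    only the messages the first stage flagged. Also a stub."""
    return fast_screen(message)  # placeholder for a heavier model


def moderate(message: Message) -> Verdict:
    """Two-stage pipeline: fast screen on everything, deeper model on the
    flagged subset, humans for the uncertain middle band."""
    score = fast_screen(message)
    if score < 0.3:                      # confidently clean
        return Verdict.ALLOW
    score = deep_classify(message)
    if score >= 0.8:                     # confidently violating
        return Verdict.BLOCK
    return Verdict.HUMAN_REVIEW          # ambiguous cases go to people


if __name__ == "__main__":
    print(moderate(Message("player1", "good game, well played")))
    print(moderate(Message("player2", "that was a threat and a slur")))
```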
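To make the strike escalation concrete, here is a minimal sketch of a strike ledger with escalating penalties and an immediate-permanent path for severe violations. The thresholds and suspension lengths are illustrative assumptions, not the values Xbox publishes.

```python
from dataclasses import dataclass
from datetime import timedelta


# Illustrative escalation ladder (thresholds and durations are assumptions
# for this example, not Xbox's published values).
ESCALATION = [
    (8, timedelta(days=365)),   # longest suspension at the top of the ladder
    (4, timedelta(days=7)),
    (2, timedelta(days=1)),
]


@dataclass
class PlayerRecord:
    player_id: str
    strikes: int = 0


def apply_violation(record: PlayerRecord, severe: bool) -> str:
    """Record one violation and return the enforcement action.

    Severe violations skip the strike ladder entirely and result in an
    immediate permanent suspension, as the report describes.
    """
    if severe:
        return "permanent suspension / device ban"

    record.strikes += 1
    for threshold, duration in ESCALATION:
        if record.strikes >= threshold:
            return f"suspension for {duration.days} day(s)"
    return "warning only"  # below the first enforcement threshold


if __name__ == "__main__":
    record = PlayerRecord("player42")
    for _ in range(5):
        action = apply_violation(record, severe=False)
        print(f"strike {record.strikes}: {action}")
    print("severe violation:", apply_violation(record, severe=True))
```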