Activision Partners with Modulate to Combat Toxic Chat in Call of Duty

The gaming industry has long grappled with toxic behavior and harassment in voice chat channels. To address this persistent problem, Activision has joined forces with Modulate, a company specializing in AI-powered voice chat moderation. This collaboration marks a significant step toward creating a more welcoming and fair environment for all players in Call of Duty.

Modulate’s ToxMod technology stands at the forefront of AI-powered voice chat moderation. By analyzing speech patterns in real time, it can identify toxic speech, including hate speech, discriminatory language, and harassment. Once a violation is detected, the system flags it for enforcement under the Call of Duty Code of Conduct. This integration of Modulate’s machine learning technology into Call of Duty’s voice chat promises to improve the overall gaming experience by curbing disruptive behavior.

Activision’s collaboration with Modulate serves as a powerful addition to its existing efforts to combat toxicity within Call of Duty. The game already employs text-based filtering across 14 languages and a robust in-game reporting system. With the inclusion of Modulate’s technology, the voice chat moderation system will work in tandem with these existing features to provide a comprehensive approach to tackling disruptive behavior.

Michael Vance, CTO at Activision, emphasizes the importance of creating a welcoming and fair environment for all players. He firmly asserts that there is no place for disruptive behavior or harassment in games. Voice chat has posed an extraordinary challenge in combating toxicity, and this collaboration with Modulate allows Activision to implement cutting-edge machine learning technology on a global scale. By enforcing strict moderation, Activision aims to foster a community marked by respect and inclusivity.

To ensure the efficacy of the voice chat moderation system, an initial beta rollout is planned for North America on August 30. This phase will encompass the existing games, Call of Duty: Modern Warfare II and Call of Duty: Warzone. Following the beta phase, a full worldwide release, excluding Asia, will coincide with the launch of Call of Duty: Modern Warfare III on November 10, 2023. Initially, the system will be available in English, but additional language support will be added at a later date to accommodate a diverse player base.

Activision’s collaboration with Modulate marks a notable advancement in trust and safety measures within the gaming industry. By partnering with Modulate, Activision demonstrates its dedication to leading the charge in creating a safe and enjoyable gaming experience for its players. Mike Pappas, CEO at Modulate, expressed his excitement at contributing to these cutting-edge efforts to ensure a positive player community at the scale and prominence of Call of Duty.

Call of Duty’s existing anti-toxicity moderation systems have already taken action against over a million accounts found in violation of the Call of Duty Code of Conduct since the release of Call of Duty: Modern Warfare II. The implementation of updated text and username filtering technology has significantly improved the real-time rejection of harmful language. Additionally, Activision’s Ricochet anti-cheat technology has proven instrumental in combating cheating, further solidifying the company’s commitment to maintaining a fair gaming environment.

The incorporation of voice chat moderation in Call of Duty is a proactive measure to safeguard players from toxic behavior. While player reporting remains an essential tool for flagging disruptive behavior, voice chat moderation bolsters the system’s ability to identify and act on unreported instances of misconduct. By proactively detecting harmful behavior, the voice chat moderation system enables the community to focus on enjoying the game without unnecessary distractions.

Voice Chat Moderation in Call of Duty is powered by the AI model ToxMod from Modulate and is operated and managed by Activision. The system monitors and records voice chat conversations for the specific purpose of moderation. Rather than relying on specific keywords, the system focuses on detecting harmful language and behavior within voice chat. Violations of the Call of Duty Code of Conduct are subject to account enforcement, ensuring that those who engage in bullying, harassment, or other disruptive activities face appropriate penalties.
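To make that distinction concrete, here is a minimal, purely illustrative Python sketch of the difference between naive keyword matching and behavior-level detection gated by a confidence threshold. The category names, the `Detection` structure, and the threshold value are assumptions for illustration only, not Modulate’s actual API.

```python
from dataclasses import dataclass

# Hypothetical Code of Conduct categories; names are illustrative, not Modulate's.
CATEGORIES = ("hate_speech", "harassment", "discrimination")

@dataclass
class Detection:
    utterance: str     # transcribed snippet (recorded for moderation only, per the article)
    category: str      # category the model assigned to the utterance
    confidence: float  # model confidence, 0.0 to 1.0

def keyword_filter(utterance: str, banned: set) -> bool:
    """Naive keyword matching: the approach the article says the system does NOT rely on."""
    return any(word in banned for word in utterance.lower().split())

def exceeds_threshold(detection: Detection, threshold: float = 0.9) -> bool:
    """Behavior-level detection: only confident matches to a harmful category
    are escalated, so low-confidence or uncategorized banter is left alone."""
    return detection.category in CATEGORIES and detection.confidence >= threshold
```

A keyword list flags any utterance containing a banned word regardless of context, while the threshold check escalates only confident, categorized detections, which is closer to the behavior-focused approach described above.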

During the beta rollout, the Voice Chat Moderation System will analyze voice chat in English. However, following the global launch, voice chat moderation will expand to incorporate additional languages, ensuring inclusivity for players worldwide. It is important to note that players who do not wish to have their voice moderated can disable in-game voice chat in the settings menu.

To maintain transparency with players, Call of Duty’s in-game Notification system provides updates on the status of reported incidents. Players can access their report status, including details of the report and any associated penalties, through the Notifications menu on the Home screen. While the system detects violations in real-time, the enforcement process may require additional reviews to understand the context fully. Continuous improvements and adjustments will be made to ensure efficient response times as the system evolves.

It is essential to differentiate between “trash-talk” and harmful behavior. Activision’s Voice Chat Moderation system aligns with the existing Call of Duty Code of Conduct, allowing for friendly banter and competitive interactions. However, hate speech, discrimination, sexism, and other forms of harmful language outlined in the Code of Conduct will not be tolerated. The system’s aim is to create a balanced environment that encourages healthy communication and discourages toxic behavior.

Activision’s Voice Chat Moderation system makes use of AI to detect and categorize toxic behavior. However, it is ultimately Activision that determines the enforcement measures based on these reports. The system provides categorized reports on toxic behavior, but the decision regarding enforcement is a human-led process. This allows for a more nuanced evaluation of context, ensuring fair and appropriate action is taken.
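As a rough sketch of this two-stage flow, in which AI categorization feeds a human-led enforcement decision, the following hypothetical Python example queues categorized reports and applies a reviewer’s judgment to each one. Every class, field, and penalty name here is an assumption for illustration; none of it reflects Activision’s or Modulate’s actual implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional, Tuple

@dataclass
class ToxicityReport:
    """A categorized report from the AI stage; all field names are illustrative."""
    player_id: str
    category: str
    confidence: float
    context_clip: str  # surrounding transcript, so reviewers can judge context

@dataclass
class ReviewQueue:
    """Human-led enforcement: AI output is queued, and a moderator decides penalties."""
    pending: List[ToxicityReport] = field(default_factory=list)

    def submit(self, report: ToxicityReport) -> None:
        # No automatic penalty at this stage: the AI only files the categorized report.
        self.pending.append(report)

    def review(
        self, decide: Callable[[ToxicityReport], Optional[str]]
    ) -> List[Tuple[str, Optional[str]]]:
        """Apply the reviewer's decision to each pending report.

        `decide` stands in for human judgment and returns a penalty name or None.
        """
        outcomes = [(r.player_id, decide(r)) for r in self.pending]
        self.pending.clear()
        return outcomes
```

Separating detection from enforcement this way is what allows context, such as friendly trash-talk versus genuine harassment, to be weighed before any penalty is applied.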

Activision’s collaboration with Modulate marks a significant milestone in addressing toxic chat within Call of Duty. By integrating Modulate’s advanced AI-based voice chat moderation system, Activision aims to build a community rooted in respect and inclusivity. As efforts to combat toxicity continue to evolve, Call of Duty players can look forward to an enhanced gaming experience that prioritizes their well-being and enjoyment.

