40% Of Online Gaming Platforms Don’t Have A Grievance Redressal Officer: Report

Over 40% of online gaming platforms have failed to comply with the Information Technology (IT) Rules by appointing a Grievance Redressal Officer, revealed an industry report from the CCAOI. The report found that only 59% of online gaming platforms have established structured grievance redressal mechanisms by appointing designated grievance officers and publicly sharing their contact information.

Rule 3(2) of the IT Rules requires intermediaries to appoint a grievance officer responsible for addressing user complaints and to publish that officer’s contact information. Additionally, the officer must acknowledge user grievances within 24 hours and resolve them within 15 days. Rule 4A(8) also requires online gaming platforms to establish dedicated mechanisms that address game-related concerns such as fairness, user harm, and financial disputes. Platforms that did not have a dedicated grievance redressal mechanism included Ubisoft, Rockstar Games, GetMega, Adda52, Spartan, Games24x7, PlaySimple, MoonFrog, and JetSynthesys.

Age and KYC verification

The report also investigated age verification and KYC practices across platforms, revealing significant discrepancies. None of the platforms carried out age verification during account creation, relying instead on user self-declaration. While some platforms requested phone numbers for OTP verification, this process did not actually verify the user’s age, as anyone could complete it.

Among platforms designed specifically for children, only 31% implemented proper age verification mechanisms. Among the 69% of platforms intended exclusively for adults, 93% conducted age verification only after the initial sign-up process. These platforms typically delayed verification until users attempted to withdraw cash, verify their accounts, or report incidents. Several major platforms, including Dream11, MPL, and WinZo, required mandatory Know Your Customer (KYC) verification only at these later stages. Others, like First Games, allowed users to play with cash limits up to Rs 25,000 without any identity verification. This approach left substantial room for minors to access and engage with adult gaming content before any meaningful age checks occurred.

Rule 4A(3) requires online gaming intermediaries to implement robust mechanisms to prevent minors from accessing games made available only for adults. This includes features such as age-gating, obtaining parental consent, and enabling parental controls for games accessible by children. Additionally, Rule 3(1)(b)(ii) obligates platforms to prevent the hosting or transmission of content that is harmful to children.

Cybersecurity and Fraud Incident Reporting

Only 22% of online gaming platforms provide a mechanism to report cybersecurity and fraud-related incidents. These include real money gaming (RMG) platforms such as MPL, RummyCulture, Adda52, and WinZo, as well as video game platforms like Ubisoft. Importantly, Rule 3(1)(l), read with Rule 3(2), requires cybersecurity and fraud-related incidents to be reported to the Grievance Redressal Officer.
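In practice, the Rule 3(2) obligations cited above, a 24-hour acknowledgement window and a 15-day resolution deadline, reduce to a simple service-level check once a grievance or incident ticket is logged. The Python sketch below is purely illustrative: the ticket fields and function names are assumptions of ours, not anything prescribed by the Rules or the CCAOI report.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Deadlines described in Rule 3(2): acknowledge within 24 hours,
# resolve within 15 days of receiving the grievance.
ACK_DEADLINE = timedelta(hours=24)
RESOLUTION_DEADLINE = timedelta(days=15)

@dataclass
class Grievance:
    """A user complaint or cybersecurity/fraud incident routed to the grievance officer."""
    ticket_id: str
    received_at: datetime
    category: str                                # e.g. "fairness", "financial dispute", "fraud"
    acknowledged_at: Optional[datetime] = None
    resolved_at: Optional[datetime] = None

    def ack_overdue(self, now: datetime) -> bool:
        # Overdue if more than 24 hours have passed with no acknowledgement.
        return self.acknowledged_at is None and now - self.received_at > ACK_DEADLINE

    def resolution_overdue(self, now: datetime) -> bool:
        # Overdue if more than 15 days have passed with no resolution.
        return self.resolved_at is None and now - self.received_at > RESOLUTION_DEADLINE

def overdue_tickets(tickets: list[Grievance], now: datetime) -> list[str]:
    """Return the IDs of tickets that have breached either deadline."""
    return [t.ticket_id for t in tickets if t.ack_overdue(now) or t.resolution_overdue(now)]
```

A compliance dashboard could run a check like this periodically and escalate any breached tickets to the grievance officer, which is the kind of structured mechanism the report found missing on most platforms.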
The study also examined responsible gaming tools across platforms and found inconsistent implementation of protective measures. While common tools included self-exclusion options, session time limits, age restriction notices, and responsible gaming guides, platforms varied significantly in their adoption of these features. The research revealed that 27% of platforms, including Gamezop, Ubisoft, PlaySimple, and MoonFrog, provided no responsible gaming disclaimers or notices whatsoever. The remaining 73% of platforms, such as Dream11, MPL, WinZo, and Adda52, included responsible gaming mandates with disclaimers during the login process. However, many platforms only offered disclaimers without real enforcement mechanisms.

Recommendations

Based on its assessment of online gaming platforms, the research proposed four key recommendations to address the identified weaknesses.

Enforce Strict Age Verification Mechanisms

The study recommended that online gaming platforms enforce stricter verification protocols, particularly for high-risk and real-money gaming platforms. Government ID verification at the login stage should be made mandatory for all new users to ensure that minors do not gain access to age-restricted content. The research suggested that AI-based age detection could be introduced during sign-ups to enhance accuracy and reduce reliance on self-declaration. It also recommended regulatory penalties for platforms that fail to implement proper age verification, to ensure compliance.

Enhance Cybersecurity Measures

The assessment found that cybersecurity measures needed significant enhancement across gaming platforms to protect users from fraud, hacking, and data breaches. The research recommended a direct and separate reporting mechanism for cybersecurity threats and fraud within gaming platforms, so that users do not have to rely solely on grievance redressal officers for escalation. Two-factor authentication should be made mandatory for all gaming accounts, including OTP verification along with KYC at the sign-in stage, and password or biometric authentication for subsequent logins. The study noted that security audits were not standard practice across the industry, with several platforms lacking dedicated cybersecurity teams or clear data protection policies. The recommendations included periodic security audits, published security compliance reports, cybersecurity resource links, and standardised reporting mechanisms for fraud and security threats as per applicable laws.

Improve Grievance Redressal and Incident Reporting Systems

The research found that grievance redressal and incident reporting systems needed substantial improvement to ensure users have clear, accessible channels for filing complaints and reporting security breaches. Platforms with no grievance mechanisms or unclear redressal policies left users without effective means to seek recourse. The study recommended that gaming platforms appoint designated grievance officers in line with the IT Act and make their contact details easily accessible to users. It also recommended well-defined and transparent escalation mechanisms, clearly explained reporting processes, and defined response times.

Promote Responsible Gaming

The assessment identified promoting responsible gaming as a key focus for online gaming platforms to curb gaming addiction and ensure safe environments. The research recommended that platforms implement and enforce session time limits, cash limits, self-exclusion tools, guides and tips for responsible gameplay, and parental control options. The study suggested platforms could engage with third parties to provide free counselling and discussion forums for users at risk of addiction.
The recommendations included awareness campaigns to educate users about problem gaming risks and the importance of setting limits, with responsible gaming measures made mandatory rather than optional.
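To make the report’s distinction between a disclaimer and an enforced control concrete, here is a minimal sketch of how session limits, cash limits, and self-exclusion might be checked in code. The structure, names, and default values are hypothetical illustrations of ours, not designs or figures from the report.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ResponsibleGamingProfile:
    """Per-user limits of the kind the report recommends enforcing.
    The defaults below are placeholders, not figures from the report."""
    session_limit: timedelta = timedelta(hours=2)        # maximum continuous play time
    daily_deposit_limit_inr: int = 10_000                # cash/deposit ceiling per day
    self_excluded_until: Optional[datetime] = None       # end of a self-exclusion window, if opted in

def may_start_session(profile: ResponsibleGamingProfile, now: datetime) -> bool:
    # A self-excluded user stays blocked until the exclusion window lapses.
    return profile.self_excluded_until is None or now >= profile.self_excluded_until

def may_deposit(profile: ResponsibleGamingProfile, deposited_today_inr: int, amount_inr: int) -> bool:
    # Enforce the daily cash limit rather than merely displaying a disclaimer.
    return deposited_today_inr + amount_inr <= profile.daily_deposit_limit_inr

def session_expired(started_at: datetime, profile: ResponsibleGamingProfile, now: datetime) -> bool:
    # End the session, or prompt a break, once the time limit is reached.
    return now - started_at >= profile.session_limit
```

Parental controls, guides, and counselling referrals would sit on top of checks like these; the thrust of the recommendation is that such checks are actually enforced rather than presented only as login-screen notices.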