AI can advance user engagement and safety in gaming – SoftServe
By SoftServe
AI solutions can proactively identify issues in gaming by analysing user data to catch early signs of problematic behaviour and then tailoring interventions to individual player profiles, say Gen AI and technical experts from SoftServe.
AI can proactively identify issues in gaming by analysing user data to catch early signs of problematic behaviour and then tailoring interventions to individual player profiles. Image: Canva.
In October 2023, the National Council on Problem Gambling in the US created an AI-based self-test screening tool for gamblers seeking responsible gaming resources on their website.
The self-test screening tool helps users understand their risk level of having a gambling problem.
Instead of asking users questions about their gambling habits, the tool draws on gamification and AI to “analyse how users play the game, and then provides insights based on neuroimaging principles,” PlayUSA, a gaming news site, previously reported.
This is only one of the many uses of AI for the gaming industry.
Speaking to GovInsider, Solutions Architect Anatolii Okhotnikov and Enterprise Solution Lead Andrew Tan from SoftServe share how AI could enhance the government’s ability to improve oversight and protection for consumers, and key considerations for governments in leveraging AI for the gaming sector.
AI is dynamic and proactive, not static
“With AI-based personalisation, the biggest advantage is that AI doesn’t operate in a static manner; it dynamically assesses risks in real time, adjusting protective measures as player behaviour evolves,” says Okhotnikov.
AI has two-fold benefits for user engagement and responsible gaming.
For user engagement, AI tools like personalised avatars and marketing campaigns can help enhance user engagement and experience for gamers, says Tan.
As for ensuring responsible gaming practices, AI tools like chatbots, personalised recommendations, rule engines for deposit limits and game workflows, as well as self-exclusion parameters, can help guide players.
These tools provide players with clear information and self-management tools around their gaming habits, Okhotnikov explains.
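The rule engines for deposit limits and self-exclusion parameters mentioned above can be sketched as follows. This is a minimal illustration, not SoftServe's implementation; the threshold values, the `SELF_EXCLUDED` list, and the action strings are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical thresholds for illustration only; real limits would come
# from the operator's responsible-gaming policy and local regulation.
DAILY_DEPOSIT_LIMIT = 500.0
SELF_EXCLUDED = {"player_042"}  # players who opted out via self-exclusion


@dataclass
class Deposit:
    player_id: str
    amount: float


def check_deposit(deposit: Deposit, deposited_today: float) -> str:
    """Return an action for the game workflow: allow, warn, or block."""
    if deposit.player_id in SELF_EXCLUDED:
        return "block: player is self-excluded"
    projected = deposited_today + deposit.amount
    if projected > DAILY_DEPOSIT_LIMIT:
        return "block: daily deposit limit exceeded"
    if projected > 0.8 * DAILY_DEPOSIT_LIMIT:
        return "warn: approaching daily deposit limit"
    return "allow"
```

In a production system the limits would typically be per-player and adjustable, with the AI layer recommending tighter limits as risk signals accumulate.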
For example, SoftServe has developed AI algorithms for a gaming operator that address “loss chasing” behaviour, or the tendency for users to continue or even accelerate their gaming despite losses, says Tan.
“This real-time algorithm provided alerts that would lead to player protection interventions. The algorithms were also able to provide explanations for why the behaviour was considered ‘loss chasing’,” he explains.
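SoftServe's production algorithm is not public. As an illustration only, a minimal loss-chasing heuristic that flags escalating stakes after losses, and returns a human-readable explanation of the kind described above, might look like this (the escalation factor and two-loss window are assumptions for the sketch):

```python
from statistics import mean


def flag_loss_chasing(stakes, outcomes, factor=1.5):
    """Flag a session when a player sharply raises stakes after a run of losses.

    stakes: bet amounts in chronological order.
    outcomes: net win/loss per bet (negative means a loss).
    Returns (flagged, explanation) so any intervention can be justified.
    """
    for i in range(2, len(stakes)):
        recent_losses = all(o < 0 for o in outcomes[i - 2:i])
        baseline = mean(stakes[:i])
        if recent_losses and stakes[i] > factor * baseline:
            return True, (
                f"bet {i}: stake {stakes[i]} exceeds {factor}x the session "
                f"average ({baseline:.2f}) after two consecutive losses"
            )
    return False, "no loss-chasing pattern detected"
```

A real system would replace this single rule with a trained model over many behavioural features, but the pairing of a flag with an explanation is the key design point.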
Aside from proactively monitoring and limiting excessive gaming behaviours for users, algorithms can also be used to moderate harmful language use during gameplay interactions, he adds.
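The moderation of harmful language can be illustrated with a simplified stand-in: real systems score messages with trained toxicity classifiers rather than a word list, but the escalation logic around the score often looks much like this sketch (the placeholder terms and three-strike policy are assumptions):

```python
# Placeholder terms standing in for a trained toxicity model's output.
BLOCKLIST = {"insult_a", "slur_b"}


def moderate(message: str, strikes: int) -> tuple[str, int]:
    """Return (action, updated strike count) for one chat message."""
    hits = [w for w in message.lower().split() if w in BLOCKLIST]
    if not hits:
        return "allow", strikes
    strikes += 1
    if strikes >= 3:
        return "mute player", strikes
    return "hide message and warn", strikes
```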
Using AI to maintain regulatory compliance
Gaming operators face several challenges: complying with local laws and regulations; the know-your-customer (KYC) process, where operators perform due diligence by verifying the user’s identity; and anti-money laundering (AML) and fraud detection, says Okhotnikov.
“These pain points in user safety and responsible gaming boil down to questions of security, technology concerns, payments, language and culture, and customer support,” he explains.
AI has a wide range of applications to address these pain points. For example, AI-powered identity verification can address KYC, AML, security, and payment concerns.
Advanced AI and machine learning technologies can help operators perform automatic analyses, enhance due diligence, keep up with regulatory change and compliance, automate document management and improve client onboarding, he elaborates.
Digital payment options can also tighten risk management by enabling real-time data, enhanced transparency and automated regulatory compliance processes, according to an article on SoftServe’s blog.
The need for clear guidelines and regulations
For the government, whose main goal is to enhance oversight and protection for consumers in the gaming sector, it is essential to ensure that actors follow governing body regulations, says Okhotnikov.
He underlined the key considerations that public sector stakeholders should keep in mind when leveraging AI technologies for gaming platforms, which include:
a) Guidelines on acceptable behaviour and the means to enforce them,
b) Cyber security measures for KYC, AML, fraud detection and protecting digital assets,
c) Data protection and privacy while performing content moderation,
d) User and actor reporting and compliance, and
e) Protecting vulnerable gamers like children and gamblers.
AI and machine learning algorithms depend both on existing information, such as financial transactions, as well as guidelines provided by government bodies, such as privacy regulations, to be able to enforce responsible gaming practices.
For instance, operators can use AI to automate the regulatory compliance process by identifying infringements and enforcing corrective actions, he explains.
“Some governments could even monitor user activity for acceptable behaviour and multi-dimensional content moderation depending on the privacy regulations,” he adds.
Okhotnikov says that AI’s use in the gaming sector is still in its “trending” and “emerging” phases.
“Some will be quite useful, and the others will follow the general hype with low usability,” he explains. Looking ahead, he predicts that AI use in the gaming world would become more prevalent in the areas of compliance, security, customer engagement and protection.