Snap may be forced to stop offering ‘My AI’ chat feature in UK if it fails to adequately address ICO concerns over child data protection
The UK data protection regulator has warned Snapchat parent Snap over its “My AI” feature, which it said may pose a risk to UK users, and especially children aged 13 to 17.
The Information Commissioner’s Office (ICO) said that a preliminary investigation found Snap had failed to adequately identify the risks of the feature, resulting in a preliminary enforcement notice.
If a final enforcement notice were issued, Snap could be forced to stop offering My AI to UK users until it carries out an “adequate risk assessment”, the ICO said.
The office also has the power to impose fines of several million pounds for serious breaches of data protection law — up to £17.5 million or 4 percent of a company’s annual worldwide turnover in the preceding financial year, whichever is higher.
Snap, which will have an opportunity to respond to the preliminary notice, said it was “closely reviewing” the ICO’s notice.
“Like the ICO, we are committed to protecting the privacy of our users,” the company said in a statement.
“In line with our standard approach to product development, My AI went through a robust legal and privacy review process before being made publicly available.
“We will continue to work constructively with the ICO to ensure they’re comfortable with our risk assessment procedures.”
Snap began offering My AI to paying Snapchat+ users in the UK in February, and released it to all UK users in April.
Snapchat has 21 million active monthly users in the UK and research firm Insider Intelligence estimates about 48 percent of its users overall are aged 24 and under. About 18 percent of Snapchat’s UK users are aged 12 to 17.
Generative AI tools have attracted controversy around their use of data, including the information used to train them – which is largely scraped from a variety of sources across the internet – and the data acquired from chats with users.