
(2nd LD) Developer of AI chatbot service fined for massive personal data breach

All News 22:01 April 28, 2021

(ATTN: ADDS Scatter Lab's response in paras 7-10)

SEOUL, April 28 (Yonhap) -- South Korea's data protection watchdog on Wednesday imposed a hefty monetary penalty on a startup for leaking a massive amount of personal information in the process of developing and commercializing a controversial female chatbot.

The Personal Information Protection Commission (PIPC) said Scatter Lab, a Seoul-based startup, was ordered to pay 103.3 million won (US$92,900) in penalties -- a penalty surcharge of 55.5 million won and an administrative fine of 47.8 million won -- for illegally using personal information of its clients in the development and operation of its artificial intelligence-driven chatbot service called "Lee Luda."

This image, captured from South Korean startup Scatter Lab's website, shows its AI chatbot, Lee Luda, a 20-year-old female college student persona. (PHOTO NOT FOR SALE) (Yonhap)

It is the first time in South Korea that the government has sanctioned the indiscriminate use of personal information by companies using AI technology.

Scatter Lab is accused of using 9.4 billion KakaoTalk conversations, collected from about 600,000 users of its emotion analysis apps Science of Love and Text At, in developing and operating the Lee Luda chatbot service without obtaining their prior consent.

The company is also criticized for failing to delete or anonymize app users' names, mobile phone numbers and addresses before using the data to train its AI chatbot's learning algorithms.

In addition, the Lee Luda chatbot was programmed to select and speak one of about 100 million KakaoTalk conversation sentences from women in their 20s, the PIPC said.

Scatter Lab said it takes full responsibility and is taking steps to prevent recurrence.

"We feel a heavy sense of social responsibility as an AI tech company over the necessity to engage in proper personal information processing in the course of developing related technologies and services," the company said in a press release late Wednesday.

"Upon the PIPC's decision, we will not only actively implement the corrective actions put forth by the PIPC but also work to comply with the law and industry guidelines related to personal information processing," the company said.

To prevent a recurrence, the company said it is taking various measures under more stringent standards, including restricting its services for minors under the age of 14 and other upgrades to strengthen the protection of personal data.

The Lee Luda chatbot service attracted more than 750,000 users in just three weeks after its launch on Dec. 23, but Scatter Lab suspended the Facebook-based service the following month amid complaints over its discriminatory and offensive language against sexual minorities.

The PIPC said Scatter Lab used personal information collected through its Science of Love and Text At apps beyond the purpose for which it was collected.

The company is also accused of collecting personal information of about 200,000 children under the age of 14 without obtaining the consent of their parents or guardians in the development and operation process for its services.

Scatter Lab did not set any age limit in recruiting subscribers for its app services and collected 48,000 children's personal information through Text At, 120,000 children's information from Science of Love and 39,000 children's information from Lee Luda, the commission said.

"This case is meaningful in that companies are not allowed to use personal information collected for specific services indiscriminately for other services without obtaining explicit consent from the concerned people," PIPC Chairman Yoon Jong-in said.
