Lawsuit Claims LinkedIn Used Private Messages for AI Model Training

LinkedIn is facing a class-action lawsuit alleging that it used private messages to train its artificial intelligence (AI) models.

The **lawsuit**, filed in the U.S. District Court for the Northern District of California, accuses the Microsoft-owned platform of “illegally revealing its Premium customers’ private messages to outside parties” and of “hiding” these actions by “furtively modifying its privacy policies and statements.” A central claim is that LinkedIn shared private InMail messages with third parties to improve its AI technologies.

A LinkedIn spokesperson denied the allegations, stating, “We are not employing member messages to train models as claimed in the complaint.”

How companies obtain training data for AI models has become a contentious issue. LinkedIn is not alone in facing such accusations; other tech companies, including **Google**, **Microsoft**, and **OpenAI**, have also been sued for allegedly using personal information without users’ knowledge or consent.

The lawsuit was brought on behalf of LinkedIn Premium users, who pay for additional features, including enhanced privacy safeguards. At issue is a privacy setting introduced in August 2024 that allowed users to opt out of data sharing for AI training. However, the setting was enabled by default, meaning users were opted in unless they changed it.

In September 2024, LinkedIn updated its privacy policy to state that user data could be used to train AI models and might be disclosed to third parties. The lawsuit contends that this retroactive policy change violated data privacy laws and breached LinkedIn’s agreement with its users. It also accuses the company of using private data, such as InMail messages, without proper consent and of attempting to “erase its tracks” by modifying its privacy policies.

The lawsuit seeks damages of $1,000 for each member of the proposed class of LinkedIn Premium users.