Slack is using chat data to enhance its machine-learning models

Slack Trains Machine-Learning Models on User Data Without Explicit Permission

Slack, the popular workplace communication platform, has come under fire for training machine-learning models on user messages, files, and other content without explicit permission. The training is opt-out, meaning users' private data is used by default. To make matters worse, users cannot opt out themselves; their organization's Slack admin must email the company to request that the training stop.

Corey Quinn, an executive at the Duckbill Group, brought attention to this policy after spotting it in Slack's Privacy Principles. The section in question states, "To develop AI/ML models, our systems analyze Customer Data (e.g. messages, content, and files) submitted to Slack as well as Other Information (including usage information) as defined in our Privacy Policy and in your customer agreement."

In response to concerns raised by Quinn and others, Slack published a blog post clarifying how customer data is used. The company said customer data is not used to train Slack's generative AI products; instead, it is used for features like channel and emoji recommendations and search results, and is de-identified and aggregated for those purposes.

Even so, opting out requires proactive effort: an organization's Slack admin must contact Slack's Customer Experience team to request that the workspace's data be excluded from training the machine-learning models.

The confusion surrounding Slack's privacy policies and practices has drawn criticism from users and experts alike. Some have pointed to inconsistencies in the company's statements, such as the gap between its assurances that customer data is not used to train its AI products and the Privacy Principles' language about analyzing customer data to develop AI/ML models.

As the debate continues, it remains to be seen how Slack will address the concerns raised by its users. In the meantime, the company's handling of user data serves as a cautionary tale amid the current gold rush for AI training data.
