File-sharing giant WeTransfer sparked global backlash after a recent terms-of-service update led users to believe the company would use their shared files to train AI models, citing language about improving "machine learning model performance." The update is set to take effect on Aug. 8.
The language immediately raised red flags on social media, with customers accusing WeTransfer of claiming the right to sell or share their files with AI companies for model training, a highly contentious topic in today's data-aware digital environment.
WeTransfer Says AI Has No Place in Customer Content
On Tuesday, the Dutch-based platform responded directly to the controversy. In a public statement, WeTransfer said it "does not sell user content to third parties" and "does not use AI in connection with customer content."
Assuring its user base, the company continued:
"Your content is always your content. We don't use machine learning or any form of AI to process content shared via WeTransfer."
The firm said the controversial clause was meant to cover a hypothetical future feature: AI-powered content moderation tools, a system the company does not yet offer on its platform.
Machine Learning Clause Removed Following User Feedback
According to Digital Trends, WeTransfer acknowledged that its wording caused unintended confusion and concern, and as a result, removed all references to machine learning from its terms of service.
The revised section states that users grant the company a royalty-free license to use their content for the purpose of developing the Service, in accordance with its Privacy and Cookie Policy.
The change is intended to align the firm's terms with its actual business practices and to dispel concerns that uploaded user content could be harvested for artificial intelligence training without permission.
User Privacy and AI Transparency in the Limelight
In the age of AI, skepticism is understandable. While technology companies experiment with machine learning to improve their platforms, consumers are increasingly vigilant about how their data is used.
For many users, especially content creators such as writers, musicians, and artists, the unauthorized use of their work to train AI models is a nonstarter.
WeTransfer's experience underscores the importance of clear, transparent communication when revising policies that touch on data use. AI may be getting smarter, but that doesn't mean it should always lead the way for companies. Even hypothetical features or vague legal jargon can set off controversy if users feel blindsided or uninformed.
In the end, trust is currency, and firms should handle user data with the utmost care. The Dutch-based WeTransfer has finally listened, but the internet won't easily forget what happened.
ⓒ 2025 TECHTIMES.com All rights reserved. Do not reproduce without permission.