Earlier this year, Anker, the Chinese electronics manufacturer best known for its Eufy line of smart home security cameras, introduced an unconventional yet revealing initiative designed to bolster the development of its artificial intelligence technologies. The company invited its customers to submit security footage showing incidents of package theft and attempted car break-ins in exchange for monetary compensation. Specifically, Eufy announced it would pay individuals two dollars for each qualifying video, integrating this user-generated material into the dataset intended to enhance the capability of its AI models to accurately recognize and respond to theft-related activities.
On its official website, Eufy explained that in order to amass a sufficiently comprehensive training corpus for its algorithms, the company was seeking both authentic security incidents and carefully staged re-enactments. This dual approach was meant to ensure that the AI system could robustly identify a wide variety of behaviors associated with theft. Participants were even encouraged to perform simulated scenarios, for example posing as a thief attempting to remove parcels from a doorstep or testing car door handles, and to record those events using their own Eufy cameras. The company also noted that if multiple cameras captured the same staged event simultaneously, each recording counted separately, allowing participants to multiply their earnings to as much as eighty dollars for a full set of staged theft scenarios.
Eufy explicitly assured participants that the collected videos, including those resulting from staged performances, would be utilized solely for internal AI training and development purposes. The company emphasized that these recordings would not be exploited for unrelated commercial interests or shared externally. This strategy illustrates a growing industry trend in which technology companies directly compensate customers for data contributions that can improve machine learning systems. While this type of user-centered data exchange allows everyday consumers to derive tangible value from their digital activity, it simultaneously raises serious questions regarding information security, privacy protection, and user consent.
A related example underscoring these risks emerged recently when TechCrunch revealed a vulnerability in Neon, a fast-growing calling app that similarly enticed users with payments in return for sharing recorded conversations and accompanying transcripts. The investigation found that the platform's inadequate security left users' private data broadly exposed, granting unauthorized parties access to it. Following disclosure of the issue, Neon promptly took its service offline, reinforcing how fragile such trust-for-data arrangements can become when proper safeguards are not in place.
Eufy’s own compensation program, which offered two dollars per submitted theft-related video, was active from December 18, 2024, through February 25, 2025. According to user comments posted on the official campaign page, at least 120 participants publicly confirmed their involvement, contributing to an initiative that the company hoped would yield an ambitious collection of about twenty thousand clips depicting package theft and another twenty thousand featuring individuals checking or pulling car doors. To participate, users were required to complete a Google Form, upload their video clips, and provide PayPal details to facilitate payment distribution.
Despite repeated inquiries from TechCrunch, Eufy declined to comment and did not disclose essential metrics about the campaign, such as the total number of participants, the overall amount spent on rewards, the total volume of footage collected, or whether retained data had been permanently deleted once AI training was complete. Nonetheless, after that campaign ended, Eufy launched similar projects that continued to incentivize users to share recordings, effectively expanding its crowdsourced approach to algorithmic improvement.
Currently, Eufy operates another in-app initiative known as the “Video Donation Program,” which offers non-monetary incentives designed to encourage continued user engagement. Participants may earn virtual distinctions such as an “Apprentice Medal,” which functions primarily as a status badge visible within the app’s interface, or tangible rewards like complimentary cameras and retailer gift cards. This new campaign is limited exclusively to videos that include human subjects, reflecting a more targeted dataset collection effort. Within the Eufy application, an “Honor Wall” displays a leaderboard showcasing top contributors; notably, one highly active user is credited with an astonishing 201,531 donated video clips. The company reiterates that these voluntarily donated recordings are employed strictly to refine AI algorithms and are not transferred to third-party organizations for external use.
Beyond security footage, Eufy also solicits contributions captured via its baby monitor devices. The company offers detailed instructions for uploading these videos through its support portal, though this particular program does not appear to provide any financial compensation. When questioned about the nature and purpose of the baby monitor video collection effort, Eufy once again did not issue any official response.
Skepticism regarding Eufy’s assurances on user privacy, however, is both persistent and warranted. In 2023, the technology news outlet The Verge uncovered troubling evidence that Eufy’s earlier claims of end-to-end encryption for its security camera streams were misleading. The investigation demonstrated that video feeds accessible through the web interface were, in fact, unencrypted. Following public scrutiny and extensive correspondence with journalists, Anker ultimately conceded that its prior statements had been inaccurate and pledged to address the problem. This episode underscored the fragility of consumer trust in smart home ecosystems, where even well-intentioned efforts to advance AI technology can be overshadowed by a company’s lapses in transparency and technical diligence.
Eufy's evolving campaigns thus reflect both the promise and the peril of contemporary data practices. They exemplify how corporations increasingly turn to their own customers as active collaborators in training artificial intelligence systems, blurring the line between product use and data labor. Yet they also expose the delicate balance that must be maintained between innovation and ethical responsibility, a balance that continues to define the relationship between technology developers and the individuals whose data make such innovations possible.
Source: https://techcrunch.com/2025/10/04/anker-offered-to-pay-eufy-camera-owners-to-share-videos-for-training-its-ai/