Photo Storage App Built Facial Recognition Tool With User Photos

SAN FRANCISCO — Almost every service you use requires you to review and accept a privacy policy and terms of service. It’s probably fair to say most people immediately scroll to the bottom of these documents and click “accept,” without actually reading them.

This story is a great example of why you should always read the privacy policy.

Photo storage app Ever is facing backlash after it was revealed that it is using billions of photos uploaded by its users to train its facial recognition system.

It isn’t out of the ordinary for photo storage apps to use facial recognition. That’s how they are able to sort your photos into categories like “baby” or “dog.”

However, using photos to train a facial recognition tool that is then being sold to private companies, law enforcement and the military is highly questionable.

According to NBC News, Ever pivoted from a photo storage app to a new business called Ever AI without telling its millions of users.

Ever CEO Doug Aley told NBC News that Ever AI does not share the photos or any identifying information about users with its facial recognition customers.

He says the billions of images are used to teach an algorithm how to identify faces. Whenever an Ever user enables the app’s facial recognition feature to group together images of the same people, the technology trains itself. That technology is then used in the company’s commercial facial recognition products.

“This looks like an egregious violation of people’s privacy,” says Jacob Snow, a technology and civil liberties attorney at the American Civil Liberties Union of Northern California. “They are taking images of people’s families, photos from a private photo app, and using it to build surveillance technology. That’s hugely concerning.”

Ever AI has even stated in press releases that it possesses an “ever-expanding private global dataset of 13 billion photos and videos.” However, Ever members never gave the company explicit permission to use their photos this way.

NBC reached out to several members of the photo storage service, with one woman stating, “I was not aware of any facial recognition in the Ever app. Which is kind of creepy since I have pictures of both my children on there as well as friends that have never consented to this type of thing.” That woman has since deleted the app.

Ever updated its privacy policy after NBC reached out in April about its facial recognition practices. From NBC:

Previously, the privacy policy explained that facial recognition technology was used to help “organize your files and enable you to share them with the right people.” The app has an opt-in face-tagging feature much like Facebook that allows users to search for specific friends or family members who use the app.

In the previous privacy policy, the only indication that the photos would be used for another purpose was a single line: “Your files may be used to help improve and train our products and these technologies.”

On April 15, one week after NBC News first contacted Ever, the company added a sentence to explain what it meant by “our products.”

“Some of these technologies may be used in our separate products and services for enterprise customers, including our enterprise face recognition offerings, but your files and personal information will not be,” the policy now states.

In an email, Aley explained why the change was made.

“While our old policy we feel covered us and our consumers well, several recent stories (this is not a new story), and not NBC’s contact, caused us to think further clarification would be helpful,” he wrote. “We will continue to make appropriate changes as this arena evolves and as we receive feedback, just as we have always done.”

Jason Schultz, a law professor at New York University, says Ever AI should do more to inform the Ever app’s users about how their photos are being used, instead of burying the disclosure in a 2,500-word privacy policy that most users do not read.

“They are commercially exploiting the likeness of people in the photos to train a product that is sold to the military and law enforcement,” he says. “The idea that users have given real consent of any kind is laughable.”

About the Author

Steven A. Karantzoulidis is the Web Editor for Security Sales & Integration. He graduated from the University of Massachusetts Amherst with a degree in Communication and has a background in Film, A/V and Social Media.
