Facebook is asking to use Meta AI on photos you haven’t yet shared

by pier25 on 6/28/25, 12:08 AM with 354 comments
by notsydonia on 6/28/25, 8:27 AM

It's also a huge danger, as the system FB uses to tag and categorize photos is clearly flawed. Example: Meta took a business page I ran, with over 150K followers, offline because of a photo that violated their 'strict anti-pornography' (etc., etc.) policies. The picture was of a planet, Saturn, and it took weeks of the most god-awful to-and-fro with (mostly) bots to get them to revoke the ban. Their argument was that the planet was 'flesh-toned' and that their AI could not tell it was not actually skin. The image was from NASA via a stock library and labelled as such.

by coef2 on 6/28/25, 1:56 AM

I miss the old days when Facebook was simply a fun way to reconnect with friends and family who lived far away. Unfortunately, those days are gone. It feels like an over-engineered, attention-hogging system that collects a large amount of data and risks people's mental health along the way.

by pyman on 6/28/25, 2:36 PM

The joy of deleting Facebook in 2021 is something I'll never be able to put into words.

A company that's right up there with gambling and tobacco: designed to keep you hooked, no matter the cost.

by ants_everywhere on 6/28/25, 1:33 AM

This is why I asked family not to post pictures of my children on Facebook.

They will get to decide what to do with their likenesses when they're older. It seemed cruel to let Facebook train a model on them from the time they were babies until they first start using social media in earnest.

by bgwalter on 6/28/25, 3:03 PM

They are still pushing the "AI dominance over China" argument to clueless politicians.

The anti-regulation clause sneaked into the "Big Beautiful Bill" ($5 trillion in new debt) facilitates consumer exploitation and has no impact at all on military applications.

If China dominates consumer exploitation, let them and shut off their Internet companies.

Strangely enough, why not invest $500 billion in a working fusion reactor if these people are so worried about U.S. dominance?

by goku12 on 6/28/25, 1:10 AM

This is truly egregious. Facebook and Instagram are installed by default on many Android phones and cannot be fully uninstalled. And even when asked for consent, many people may choose the harmful option by mistake or due to lack of awareness. It's alarming that these companies cannot be held to even the bare minimum standards of ethics.

As an aside, there was a discussion a few days back where someone argued that being locked into popular and abusive social/messaging platforms like these is an acceptable compromise if it means retaining online contact with everyone you know. Well, this is precisely the sort of apathy that gives these platforms the power to abuse their market share so blatantly. However, it doesn't affect only the people who choose to be irresponsible about privacy. It also drags the ignorant and the unwilling under the influence of this spyware.

by ch_fr on 6/28/25, 12:13 PM

Opt-in or not, I don't like that Meta is comfortable enough to even suggest it. Even putting AI out of the equation, this is one more of Meta's repeated attempts at breaking out of mobile app encapsulation (see the Onavo VPN or localhost tracking).

by windex on 6/28/25, 8:06 AM

Zuck needs to fade into irrelevance. The guy hasn't done anything interesting in years. Every few years he raids private data and thinks he can do something with it.

by robin_reala on 6/28/25, 11:26 AM

Remember that you can delete your Meta accounts and have nothing to do with them. It’s not hard to do.

by ATechGuy on 6/28/25, 2:41 PM

Some of the best decisions I ever made: 1. Deleted FB in 2012 :) 2. Never created an Insta or WhatsApp account 3. Never applied to Meta jobs

by jbombadil on 6/28/25, 1:32 AM

https://archive.is/3lllh

by puttycat on 6/28/25, 3:14 PM

Some time ago I asked on HN "will you go work for Meta?" [1].

I'm so glad I didn't pass their (ridiculous, redundant) set of interviews.

[1] https://news.ycombinator.com/item?id=40935199

by demarq on 6/28/25, 3:31 PM

Question for the Meta engineers on here: do you ever speak out about this internally?

by Jackson__ on 6/28/25, 3:59 AM

Curious, is this really necessary? I'd assume the total number of public images posted on Meta services is in the trillions.

by windex on 6/28/25, 4:02 PM

I mentioned this on another thread. I tried my best to avoid FB, but then they acquire products like WhatsApp and hoover up personal data all over again. This shouldn't be allowed. PII and personal data should be bound to the original terms under which the product launched.

Zuck should find a quiet part of the internet or the metaverse to curl up and fade away. The guy just doesn't have any redeeming qualities.

by toofy on 6/28/25, 1:32 AM

How long until we find out that the brand-new government/Palantir deal is using these photos against citizens as well?

I give it a year or less.

by geor9e on 6/28/25, 2:59 PM

Hacker News users keep getting worked up about opt-in services they don't want to opt in to.

by AJ007 on 6/28/25, 1:17 AM

Very helpful for ad targeting. As Apple kills tracking and ramps up its own ad business, Meta will need to collect as many signals as possible.

by phendrenad2 on 6/28/25, 3:40 PM

Meta products are such a bad deal for users.

I wish there were an alternative to Facebook and Instagram, even if it had no users. We, as users, can solve the "no users" problem for you. Facebook and Instagram became popular, contrary to popular belief, not because they had "critical mass" or some Hoffmanite bullshit like that, but because the technical community used them and brought their friends and family along.

Someone just needs to build it.

by aetherspawn on 6/28/25, 2:18 AM

iOS -> Settings -> Privacy and Security -> Photos -> Facebook -> Set limited access
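For anyone curious what that setting does from an app's side, here's a minimal Swift sketch using Apple's PhotoKit (iOS 14+). The function name and view-controller wiring are hypothetical, but the PHPhotoLibrary calls are the documented ones: under Limited Access, library queries only return the photos the user has explicitly selected, so a bulk camera-roll upload can't see the rest.

    import UIKit
    import Photos
    import PhotosUI

    // Illustrative sketch: how an iOS app observes the "Limited Access" photo setting.
    // Under .limited, library fetches return only the user-selected subset of photos.
    func inspectPhotoAccess(from viewController: UIViewController) {
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
            switch status {
            case .limited:
                // Only the photos the user picked are visible; the rest of the
                // camera roll is hidden from the app. Let the user adjust the selection:
                DispatchQueue.main.async {
                    PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
                }
            case .authorized:
                // Full camera-roll access: the scope the "cloud processing" prompt relies on.
                break
            default:
                // .denied, .restricted, or .notDetermined: no library access at all.
                break
            }
        }
    }

Note that this only limits on-device library access; anything you do upload to Meta's servers is still covered by whatever terms you accepted.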

by kevingadd on 6/28/25, 1:50 AM

This seems like a liability nightmare. If they're just scanning all the image files on people's devices and using them for training, they're inevitably going to scoop up nudes without permission, not to mention the occasional CSAM or gore photo, right? Why risk having stuff like that sneak into your training set when you already have access to everyone's public photos?

by cpersona on 6/28/25, 4:10 PM

There's a future where people (or AI) will take pictures, AI will edit and post the ones that will be liked, and then AI will like pictures based on previous like history.

by barbazoo on 6/28/25, 3:32 PM

Those of us on here are privileged. I feel bad for the billions of people who aren't aware of, or are unable to see, how truly terrible that organization is for societies and the planet.

by IncreasePosts on 6/28/25, 1:50 AM

I wonder how many pieces of code at Facebook there are with guards like

    if (userId == 1) {
      // don't add mark's data to training set
    }

by nottorp on 6/28/25, 10:35 AM

Hmm I don't post photos privately on FB, and I maybe post one public photo every 2-3 years.

What can I use to "poison" their training? I'll just send them privately to the friends that would consider that fun.

by gorjusborg on 6/28/25, 2:07 PM

As commercial AI kills its food source (the internet) we'll see corporations doing desperate things to keep feeding it.

by bee_rider on 6/28/25, 3:47 PM

> Unfortunately for end users, in tech companies’ rush to stay ahead, it’s not always clear what they’re agreeing to when features like this appear.

At this point, is there really a lack of clarity? I think we all know Facebook is going to interpret any permission to look at anything, as full permission to do whatever the hell they want with it.

There are people who care about this, and people who don't. Telling ourselves there's confusion is not, I think, going to produce an accurate model of reality.

I think these social media companies are evil. I just don’t see the point in deluding myself into thinking that they are outsmarting everybody. It is a difference of priorities, not smarts.

by b0a04gl on 6/28/25, 3:44 PM

This shifts Meta AI from reaction to anticipation. Before: the algorithm sees what you post and reacts. Now: it sees what you might post and decides how to shape it. Your intent used to live in the gap between photo taken and photo shared. They're moving compute into that gap.

by JimDabell on 6/28/25, 5:57 AM

This article seems false.

> On Friday, TechCrunch reported that Facebook users trying to post something on the Story feature have encountered pop-up messages asking if they’d like to opt into “cloud processing”, which would allow Facebook to “select media from your camera roll and upload it to our cloud on a regular basis”, to generate “ideas like collages, recaps, AI restyling or themes like birthdays or graduations.”

> By allowing this feature, the message continues, users are agreeing to Meta AI terms, which allows their AI to analyze “media and facial features” of those unpublished photos, as well as the date said photos were taken, and the presence of other people or objects in them. You further grant Meta the right to “retain and use” that personal information.

The straightforward explanation is this: they have a feature where it is helpful to group people together, for instance suggesting a photo of you and a friend to post on their birthday. In order to make this work, they need to perform facial recognition, so they ask for permission using their standard terms.

Can they train their AI with it? Yes, you are giving them permission to do so. Does the information available tell us that is what they are doing? No, it does not. In fact, a Meta spokesperson said this:

> “These suggestions are opt-in only and only shown to you – unless you decide to share them – and can be turned off at any time,” she continued. “Camera roll media may be used to improve these suggestions, but are not used to improve AI models in this test.”

— https://techcrunch.com/2025/06/27/facebook-is-asking-to-use-...

Could they be lying about this? Sure, I guess. But don’t publish an article saying that they are doing it, when you have no evidence to show that they are doing this and they say they aren’t doing this.

Might they do it in the future? Sure, I guess. But don’t publish an article saying that they are doing it, if the best you have is speculation about what they might do in the future.

Does it make sense for them to do this? Not really. They’ve already got plenty of training data. Will your private photos really move the needle for them? Almost certainly not. Will it be worth the PR fallout? Definitely not.

Should you grant them permission if you don’t want them to train on your private photos? No.

This could have been a decent article if they were clearer about what is fact and what is speculation. But they overreached and said that Facebook is doing something when that is not evident at all. That crosses the line into dishonesty for me.

by charcircuit on 6/28/25, 3:21 PM

Who would want AI to be applied only after you share the photos? Most people would want to check what the photos actually look like before publishing them. The appeal of this feature is being able to see the suggestions immediately. The feature is opt-in, and you don't have to grant permission to your camera roll if you don't want to.

by deadbabe on 6/28/25, 3:51 AM

Would it be any better if Facebook hired photographers to walk around cities and major events and just photograph random people doing stuff? AI will get hungrier.

by exabrial on 6/28/25, 3:54 PM

Silicon Valley has a problem with one word: consent.

What stinks is that the original concept, keeping up with disparate friends, is pretty awesome. I enjoy seeing my friends' kids grow up even though I don't really know them.

by akomtu on 6/28/25, 4:05 PM

Corps are going to be as abusive as the situation allows. Today Facebook is asking; tomorrow, consent to AI will be required to continue using the service.

by api on 6/28/25, 3:35 PM

Not your computer, data not encrypted with keys you control, not your data.

by Finnucane on 6/28/25, 3:07 PM

I continue to be retroactively vindicated for never using fb from my phone. Now, if they figure out how to get access to my Hasselblad 503 I'm screwed.

by 29athrowaway on 6/28/25, 2:52 PM

The AR glasses are about the same thing: just get a lot of pictures so they can feed their AI.

by squigz on 6/28/25, 7:29 AM

Hasn't Facebook (and pretty much all major social media platforms) had a clause in their TOS giving them a license to whatever you upload to their services, since forever?

by xyst on 6/28/25, 3:53 AM

Data and people are the commodities in this AI gold rush. The primary beneficiaries are Big Tech.

by dzhiurgis on 6/28/25, 3:50 AM

Wonder if you could DDoS it by taking selfies with AI-generated faces in the background.

by jurschreuder on 6/28/25, 7:02 AM

They're also developing VR glasses.

The company that is destroying children's mental health with phone addiction is developing VR glasses.

I guess nobody cares.

by imartin2k on 6/28/25, 5:40 PM

The world would be a better place without this shitty company.

by msgodel on 6/28/25, 2:19 AM

Neat-O.

Maybe this will finally convince people to throw out their smartphones.