‘Take It Down’: a tool for teenagers and anyone else to remove explicit online images

A new online tool lets you take down explicit images and videos of yourself from the internet. It’s from the National Center for Missing and Exploited Children and funded in part by Meta Platforms, owner of Instagram and Facebook.

“Once you send that photo, you can’t take it back,” goes the warning to teenagers, often ignoring the reality that many teens send explicit images of themselves under duress or without understanding the consequences.

A new online tool aims to give some control back to teenagers and anyone else, allowing them to take down explicit images and videos of themselves from the internet.

Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children and funded in part by Meta Platforms, the owner of Instagram and Facebook.

The site lets anyone anonymously — and without uploading any actual images — create what is essentially a digital fingerprint of the image. This fingerprint — a unique set of numbers called a “hash” — then goes into a database, and the tech companies that have agreed to participate in the project remove the images from their services.

The participating platforms so far include Instagram, Facebook, Yubo, OnlyFans and Pornhub, owned by Mindgeek.

If the image is on another site, though, or if it is sent in an encrypted platform such as WhatsApp, it won’t be taken down.

Also, if someone alters the original image — say by cropping it, adding an emoji or turning it into a meme — it becomes a new image and would need a new hash.
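To illustrate the fingerprinting idea described above, here is a minimal Python sketch using an ordinary cryptographic hash (SHA-256). The service does not disclose its exact hashing scheme in this article, so this is only an analogy, but it shows the two properties the article relies on: the image itself never has to leave the device, and even a tiny alteration produces an entirely different fingerprint.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # Hash the raw bytes locally; only this hex digest would be shared,
    # never the image itself.
    return hashlib.sha256(image_bytes).hexdigest()

# Fake image data, purely for illustration.
original = b"\x89PNG fake image data"
altered = original + b"\x00"  # e.g. the file after cropping or re-encoding

print(fingerprint(original) == fingerprint(original))  # True: same file, same hash
print(fingerprint(original) == fingerprint(altered))   # False: any edit yields a new hash
```

This is why, as noted above, a cropped or meme-ified copy of an image needs its own hash before participating platforms can match and remove it.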

“Take It Down is made specifically for people who have an image that they have reason to believe is already out on the Web somewhere or that it could be,” said Gavin Portnoy, a spokesman for the National Center for Missing and Exploited Children.

Portnoy said teenagers might feel more comfortable using the site than involving law enforcement, which wouldn’t be anonymous.

Meta, then still called Facebook, tried to create a similar tool, but for adults, in 2017. It didn’t go over well because the site essentially asked people to send their nude images, though encrypted, to Facebook. The company briefly tested the service in Australia but didn’t expand it to other countries.

In 2021, it helped launch a tool for adults called StopNCII — for nonconsensual intimate images, aka revenge porn. That site is run by a British nonprofit, the UK Revenge Porn Helpline, but anyone anywhere can use it.

Many tech companies already use this hash system to share, take down and report to law enforcement images of child sexual abuse.

Portnoy said the goal is to have more companies sign up. “We never had anyone say no,” he said.

Antigone Davis, Meta’s global head of safety, said the site works with real as well as artificial intelligence-generated images and deepfakes.
