Some Android users are seeing new privacy protections in Google Messages: images flagged as containing nudity are now blurred before they open. It’s part of Google’s Sensitive Content Warnings system, a feature designed to protect users from receiving unwanted or offensive photos.
If enabled, as described in Google’s Help Center, the phone automatically scans images for nudity, blurs anything that looks explicit, and displays a warning before you view, send or forward them. Detection happens entirely on your device, so flagged content is never uploaded to Google’s servers. In addition to blurring, the system also offers guidance on what to do with sensitive images.
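The flow described there is simple in shape: classify locally, blur on a positive result, and never send the image anywhere. Google publishes no public API for this, so the Kotlin sketch below is purely illustrative; the NudityClassifier interface, the Presentation types and the 0.8 threshold are all invented to show what an on-device gate looks like, not how SafetyCore actually works.

```kotlin
// Illustrative sketch only: Google exposes no public API for Sensitive
// Content Warnings. This imagines the on-device flow the article
// describes, using a hypothetical classifier interface.

import android.graphics.Bitmap

// Hypothetical on-device classifier; in reality the model runs inside
// Android System SafetyCore and is not directly callable by apps.
interface NudityClassifier {
    fun score(image: Bitmap): Float // 0.0 = clearly safe, 1.0 = clearly explicit
}

sealed class Presentation {
    data class ShowNormally(val image: Bitmap) : Presentation()
    data class BlurWithWarning(val image: Bitmap, val guidance: String) : Presentation()
}

class SensitiveImageGate(
    private val classifier: NudityClassifier,
    private val threshold: Float = 0.8f // assumed cutoff, not Google's value
) {
    // Everything happens locally: the bitmap never leaves the device,
    // mirroring the article's point that nothing is uploaded to Google.
    fun present(image: Bitmap): Presentation {
        val score = classifier.score(image)
        return if (score >= threshold) {
            Presentation.BlurWithWarning(
                image = image,
                guidance = "This image may contain nudity. View anyway?"
            )
        } else {
            Presentation.ShowNormally(image)
        }
    }
}
```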
The setting is off by default for adults, but it is locked on for teens with supervised accounts unless a parent changes it in the Google Family Link app. And while the tool is meant to help, Google acknowledges that it can mistakenly flag harmless images.
How to enable or disable the feature
Adults who want to receive warnings about nude photos, or who want to turn the feature off, will find the toggle under Google Messages settings / Protection & Safety / Manage sensitive content warnings / Warnings in Google Messages.
The nudity warning feature is part of Android System SafetyCore on devices running Android 9 and later. SafetyCore also powers protections Google has built against scams and dangerous SMS links, and tools to verify contacts.
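There is no documented API for asking whether Sensitive Content Warnings are available, but a rough device-side eligibility check would look at the Android version and the presence of the SafetyCore package. In the sketch below, the package name com.google.android.safetycore is an assumption based on public app listings, and supportsSensitiveContentWarnings is a hypothetical helper.

```kotlin
// Minimal sketch: does this device meet the stated requirements,
// i.e. Android 9 (API 28) or later with SafetyCore installed?

import android.content.Context
import android.content.pm.PackageManager
import android.os.Build

// Assumed package name, taken from public app listings.
private const val SAFETYCORE_PACKAGE = "com.google.android.safetycore"

fun supportsSensitiveContentWarnings(context: Context): Boolean {
    // Android 9 corresponds to API level 28 (Build.VERSION_CODES.P).
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.P) return false
    return try {
        context.packageManager.getPackageInfo(SAFETYCORE_PACKAGE, 0)
        true
    } catch (e: PackageManager.NameNotFoundException) {
        false
    }
}
```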
How effective is the feature?
Filters that screen for offensive images have become more sophisticated as artificial intelligence has gotten better at understanding context.
“Compared to older systems, today’s filters are much better at detecting explicit or unwanted content, such as nudity, with fewer errors,” said Patrick Moynihan, co-founder and president of Tracking Labs. “But they are not infallible. Edge cases, such as artistic nudity, images with cultural nuances or even memes, can still trip them up.”
Moynihan says his company combines artificial intelligence with Trust ID tools to flag content without compromising privacy.
“Combining artificial intelligence with human supervision and continuous feedback loops is important to minimize blind spots and ensure user safety,” he says.
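One way to picture why those edge cases persist, and where human oversight fits, is a classifier whose confidence scores fall into three bands: clearly benign, clearly explicit, and an ambiguous middle where artistic nudity and memes tend to land. The toy Kotlin rule below invents its thresholds and verdicts for illustration; it does not describe Google’s or Tracking Labs’ actual systems.

```kotlin
// Toy decision rule, invented for illustration. The thresholds and the
// three-band structure are assumptions, not any vendor's real values.
enum class Verdict { ALLOW, BLUR, AMBIGUOUS }

fun classify(confidence: Float): Verdict = when {
    confidence < 0.4f -> Verdict.ALLOW // clearly benign
    confidence > 0.9f -> Verdict.BLUR  // clearly explicit
    // The middle band is where artistic nudity, cultural nuance and
    // memes tend to land; feedback loops aim to shrink this band.
    else -> Verdict.AMBIGUOUS
}

fun main() {
    // Example scores: a landscape photo, a borderline artwork, an
    // unambiguous explicit image (values are made up).
    listOf(0.1f, 0.6f, 0.95f).forEach { s ->
        println("score=$s -> ${classify(s)}")
    }
}
```

In a privacy-preserving, fully on-device setting like Google’s, that middle band can only be handled conservatively, for example by blurring anyway, since there is no human reviewer to escalate to.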
Compared with Apple’s iOS, Android offers more flexibility. But that openness to third-party app stores, sideloading and customization creates more potential entry points for the kind of content Google wants to shield users from.
“The decentralized nature of Android can make enforcement more difficult, especially for younger users who may encounter unfiltered content outside of controlled environments,” Moynihan said.
“Kids can turn it off immediately”
While Apple sticks to its Communication Safety features that parents can enable, Android’s openness to third-party monitoring tools “makes this type of protection easier to implement at scale and more family-friendly,” says Titania Jordan, author and chief parent officer at Bark Technologies, which makes digital tools to protect children.
According to Jordan, mobile operating systems have historically not given parents a way to proactively protect their kids from content such as nudity.
“Parents shouldn’t have to navigate system settings to protect their kids,” she says. She also points out that Google’s new feature only blurs images temporarily; recipients can still choose to view them.
“Kids can turn it off immediately,” she says. “So this needs to be paired with ongoing conversations around pressure, consent and permanence, as well as monitoring tools that go beyond a single app or operating system.”
Moynihan says opt-in for adults and default-on for minors is a practical way to provide a first line of protection. But he adds: “The trick is to keep things transparent. Minors and their guardians need clear, reliable information about what gets flagged, how it works and how their data is protected.”