Do you want more control over the types of texts, images, and videos you get on Google Messages? If so, we have some good news. This week, we learned that Google is rolling out an optional new feature to prevent NSFW and other unwanted videos from appearing on your phone.
In 2024, Google discussed several upcoming initiatives to make Google Messages safer. Those changes included preventing spam, blocking numbers, and rolling out a powerful new sensitive content warning system. In April, those sensitive image warnings started appearing, then rolled out to everyone over the summer.
We reported that Google was looking to expand this mode to video back in June, and that’s finally happening. Here’s what you need to know.
Sensitive Video Content Warnings in Google Messages
First spotted by Android Authority, Google Messages is rolling out an update to its sensitive content warning system that goes beyond images. The new system gives users and parents more control over videos.
With this feature, Google Messages can identify incoming videos that might contain nudity or other sensitive content and instantly blur the footage. So, when you open a message, you won’t immediately see something you’d rather not. In the latest October 2025 Play Services update (v25.39), there’s a row dedicated to this expansion. The changelog notes, “With this update, Sensitive Content Warnings feature in Google Messages can now detect explicit nude media in shared videos.”
The new feature should be rolling out now via the latest Play Services and Google Messages app updates. Once it arrives, head to Settings > Protection & Safety to enable it. After you turn it on, instead of instantly seeing an inappropriate video you might not want to look at, especially if you’re out in public, you’ll see a blurred box with a warning label. You’ll then need to tap again to confirm that you want to see it.
It’s important to note that these new “Sensitive Content Warnings” are an opt-in feature, turned off by default on adult accounts. For users under 18, the warnings are enabled automatically to prevent unwanted content from getting through. That’s how the system works for photos, and we’re assuming it’ll function the same way for videos.
Additionally, parents can manage and control this through the Family Link app. It’s also worth noting that Google is being proactive here: the feature isn’t only for incoming images or videos. When you, or a child, go to send this type of content, a pop-up appears suggesting you “make responsible decisions online,” along with a few links explaining why you shouldn’t share certain content.
Last but not least, Google says that all of this happens on-device. Your photos and videos aren’t sent to Google for analysis, and the content system “doesn’t send identifiable data or any of the classified content or results to Google servers.” For now, this is only available for photos and videos, but there’s a good chance it’ll eventually expand to GIFs. Those interested in cleaning up their messages can look for it in the Google Messages settings menu.
Source: Android Authority