Add Google Content Safety API (industry service for CSAM detection)#23
Conversation
I think the API docs are permission-gated, so we should remove the link from the entry to this list. I'm super curious to hear from others on whether this should be included, since it's not actually open source. We lean towards 'free as in freedom' vs 'free as in free drinks' :)
Are you sure, @verdverm and @julietshen? I was able to open this in incognito mode.
I think http://safety.google.com/content-safety works, but https://cloud.google.com/safesearch/docs/content-safety is permission-gated :) I want to keep this open for comments nonetheless, since it's not an open source tool... @verdverm, what do you think? (Picking on you since you already commented.)
|
Hello, sorry to helicopter in here, but I've been lurking on the PR since it was opened. :) I lean towards not including the Google Content Safety API here—or if we do (and if we think we'll add similar useful-but-not-open tools), we should explicitly categorize it as useful but not open e.g. under a new section. |
Signed-off-by: Cassidy James Blaede <cassidyjames@roost.tools>
This is the link Google uses in public places like blog posts. I also added a note that the service requires registration.
Signed-off-by: Cassidy James Blaede <cassidyjames@roost.tools>
Necroing this PR... since we've added the Google Content Safety API into Coop itself, I think it is worth mentioning here now! There's now an open source way to use the API, even if the API itself is not technically open source.
Since Coop is released as open source and includes Content Safety API integration, this makes more sense to add now, so long as we mention that rationale.
Signed-off-by: Cassidy James Blaede <cassidyjames@roost.tools>


Summary
This PR adds the Google Content Safety API under the "Classification" section.
While not open source, it is a widely adopted industry service that leverages machine learning to detect CSAM, nudity, and sexually explicit material in images and videos. It’s actively used by NGOs, platforms, and law enforcement to support online child safety efforts.
Reference: https://safety.google/content-safety/