Google will use hashes to find and remove nonconsensual intimate imagery from Search


On Wednesday, Google announced a partnership with StopNCII.org to combat the spread of non-consensual intimate imagery (NCII). Over the next few months, Google will start using StopNCII’s hashes to proactively identify nonconsensual images in Search results and remove them. Hashes are algorithmically generated unique identifiers that let services identify and block imagery flagged as abuse without sharing or storing the actual source. StopNCII says it uses PDQ for images and MD5 for videos.
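To illustrate how hash-based matching works in principle, here is a minimal Python sketch using MD5, the exact-match hash StopNCII reports using for videos. The file path and the example hash set are hypothetical, not actual StopNCII data; a cryptographic hash like MD5 only matches byte-identical files, which is why a perceptual hash such as PDQ is used for images, since it tolerates re-encoding, resizing, and other minor alterations.

```python
import hashlib

def md5_hash_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 digest of a file, reading in chunks to bound memory use."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical hash list shared by a service like StopNCII: only these
# fingerprints circulate; the underlying media is never shared or stored.
flagged_hashes = {
    "5d41402abc4b2a76b9719d911017c592",  # placeholder digest, not a real entry
}

# Hypothetical upload path, for illustration only.
if md5_hash_file("upload.mp4") in flagged_hashes:
    print("Match found: flag content for review and removal")
```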

As Bloomberg points out, Google has been called out for being slower than others in the industry to adopt this approach, and its blog post seems to acknowledge that. “We have also heard from survivors and advocates that, given the scale of the open web, there’s more to be done to reduce the burden on those who are affected by it,” the post reads. Facebook, Instagram, TikTok, and Bumble each signed on with StopNCII as early as 2022, and Microsoft integrated it into Bing in September of last year.

The company has rolled out tools to request the removal of such content, along with personal contact information, but like its previous efforts to combat revenge porn, they put the onus on the victim to identify and flag the content. Flagging and removing content, particularly AI-generated content, without victims having to create and submit hashes from their own devices would be difficult, but it’s a challenge some advocates would like to see Google tackle.
