The FBI may have scored a big win with Operation Playpen, which helped dismantle a ring of Tor-based pedophiles and prosecute its members (thanks, Rule 41), but that was just one battle in the ongoing war against the sexual exploitation of children.

That fight is now a bit easier for European law enforcement, which has debuted a new machine-learning system that hunts for child porn on P2P networks. The system, known as iCOP (Identifying and Catching Originators in P2P Networks), works similarly to Microsoft's PhotoDNA, wherein images of child porn are tagged with a digital signature after being collected in the course of an investigation. These signatures are then shared as a global database for law enforcement. If the same images or videos resurface during other investigations, they're automatically flagged. This spares law enforcement the stomach-turning drudgery of manually checking images against the database, saving time and manpower and accelerating investigations.

The iCOP system is designed for use on Gnutella and has been trained on tens of thousands of images, ranging from adult porn and benign images of kids to the full-on sexual abuse of minors. What's more, it automatically identifies new material (anything that doesn't get flagged), which provides fresh leads on more recent crimes. And given that, according to the UN, 16 percent of people who possess this sort of material have themselves abused children, reducing the amount of time between discovery and arrest can help save children from further exploitation.
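The signature-matching flow described above can be sketched in a few lines. This is a simplified illustration, not iCOP's or PhotoDNA's actual method: real systems use proprietary perceptual hashes that survive resizing and re-encoding, whereas the cryptographic hash below only matches byte-identical files. The function names and sample data are hypothetical.

```python
import hashlib

def signature(data: bytes) -> str:
    """Compute a digital signature for a file's contents.

    Stand-in for a perceptual hash (e.g. PhotoDNA); SHA-256 is used
    here purely to illustrate the database-lookup flow.
    """
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes, database: set[str]) -> bool:
    """Flag a file if its signature is already in the shared database."""
    return signature(data) in database

# Hypothetical database of signatures collected in prior investigations.
database = {signature(b"previously-seen-file")}

print(is_known(b"previously-seen-file", database))  # flagged: True
print(is_known(b"new-material", database))          # unflagged: new lead
```

Anything that comes back unflagged is exactly the "new material" the article mentions, which is where the machine-learning classifier, rather than signature lookup, takes over.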