
A new system helps crack down on images of child sexual abuse


Every day, a team of analysts in the UK faces a seemingly endless mountain of material. The team of 21, which works at the Internet Watch Foundation’s office in Aldazabal, spends hours trawling through images and videos of child sexual abuse. Each time they find a photo or piece of footage, it needs to be assessed and labeled. Last year alone the group identified 153,383 websites linked to images of child sexual abuse. This creates a vast database that can then be shared internationally in an attempt to stem the flow of abuse. The problem? Different countries have different ways of classifying images and videos.

To date, analysts at the UK child protection charity have sorted the material they find into three categories: A, B, or C. These groupings are based on the UK’s laws and sentencing guidelines for child sexual abuse, and broadly describe the types of abuse taking place. Category A images, for example, the most serious classification, involve the worst crimes against children. These classifications are then used to work out how long someone convicted of a crime should be sentenced for. But other countries use different classifications.

Now the IWF believes it has made a data breakthrough that could eliminate some of these differences. The group has rebuilt its hashing software, dubbed Intelligrade, to automatically match images and videos to the rules and laws of Australia, Canada, New Zealand, the US, and the UK, the so-called Five Eyes countries. The change should mean less duplication of analytical work and make it easier for tech companies to prioritize the most serious images and videos of abuse.

“We believe that we will be better able to share data so that it can be used in meaningful ways by more people, rather than all of us just working in our own little silos,” says Chris Hughes, director of the IWF’s hotline. “Currently, when we share data, it’s very difficult to make any meaningful comparisons against the data because they simply don’t mesh correctly.”

Countries place different weights on images depending on what happens in them and the age of the children involved. Some countries classify images based on whether the children are prepubescent or pubescent and on the crime that is taking place. The UK’s most serious category, A, includes penetrative sexual activity as well as bestiality and sadism. It doesn’t necessarily include acts of masturbation, Hughes says. In the US, by contrast, that content falls into a higher category. “At the moment, the US requesting IWF category A images would be missing out on that level of content,” Hughes says.

Every photo and video the IWF looks at is given a hash, essentially a unique code, that is shared with tech companies and law enforcement agencies around the world. These hashes are used to detect and block known abuse content from being uploaded to the web again. The hashing system has had a significant impact on curbing the spread of child sexual abuse material online, and the IWF’s latest tool now adds new information to each hash.
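Conceptually, hash matching is a membership check against a shared list of known codes. The sketch below uses a plain SHA-256 cryptographic hash as a simplified stand-in; the function names and sample data are hypothetical, and real systems such as the IWF’s rely on perceptual hashes (for example, PhotoDNA) that can also match resized or re-encoded copies of an image, which a cryptographic hash cannot.

```python
# Simplified sketch of hash-list matching. SHA-256 is used here as a
# stand-in; it only matches byte-identical files, unlike the perceptual
# hashes used in production systems. All names and data are hypothetical.
import hashlib


def file_hash(data: bytes) -> str:
    """Return a hex digest identifying this exact byte sequence."""
    return hashlib.sha256(data).hexdigest()


def is_known_abuse_image(data: bytes, hash_list: set) -> bool:
    """Check an uploaded file against a shared list of known hashes."""
    return file_hash(data) in hash_list


# Hypothetical hash list shared between organisations.
known_hashes = {file_hash(b"example-known-content")}

print(is_known_abuse_image(b"example-known-content", known_hashes))  # True
print(is_known_abuse_image(b"new-unseen-content", known_hashes))     # False
```

Because the hash, not the image itself, is shared, companies can block known material without ever holding or viewing the underlying content.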

The IWF’s secret weapon is metadata. This is data about data: the what, who, how, and when of what is contained in the images. Metadata is a powerful tool for investigators, as it allows them to spot patterns and analyze trends in people’s behavior. Among the biggest proponents of metadata are spies, who say it can be more revealing than the content of people’s messages.

Hughes says the IWF has increased the amount of metadata it creates for every image and video it adds to its hash list. Every image or video it looks at is now being assessed in more detail than ever before. As well as working out whether sexual abuse content falls into one of the UK’s three groups, its analysts are now adding up to 20 different pieces of information to their reports. These fields match what is needed to determine an image’s classification in the other Five Eyes countries; the charity’s policy staff compared each country’s laws and worked out what metadata is needed. “We decided to provide a high level of granularity in describing age, a high level of granularity in terms of depicting what’s happening in the image, and also confirming gender,” says Hughes.
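The idea of grading one richly annotated record against several national schemes at once can be sketched as follows. Every field name, category label, and rule here is hypothetical; Intelligrade’s actual metadata fields and per-country logic are not public. The sketch only mirrors the article’s example that material outside the UK’s category A (such as masturbation) can rank in the most serious category in the US.

```python
# Hypothetical sketch: one metadata record, multiple national gradings.
# Field names, categories, and rules are illustrative, not Intelligrade's.
from dataclasses import dataclass


@dataclass
class ImageMetadata:
    penetrative: bool
    sadism: bool
    masturbation: bool
    age_band: str  # e.g. "prepubescent" or "pubescent"


def classify_uk(m: ImageMetadata) -> str:
    """Rough UK-style A/B/C grading (illustrative only)."""
    if m.penetrative or m.sadism:
        return "A"
    if m.masturbation:
        return "B"
    return "C"


def classify_us(m: ImageMetadata) -> str:
    """Illustrative US-style grading where masturbation ranks higher."""
    if m.penetrative or m.sadism or m.masturbation:
        return "most-serious"
    return "less-serious"


meta = ImageMetadata(penetrative=False, sadism=False,
                     masturbation=True, age_band="pubescent")
print(classify_uk(meta), classify_us(meta))  # B most-serious
```

Recording the underlying facts once, rather than a single country’s verdict, is what lets each jurisdiction derive its own classification from the same hash record.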
