Twitter Closes Accounts With Images Of Child Abuse

December 28, 2012

Los Angeles — In a rigorous attempt to curb online child abuse, several private Twitter accounts have been disabled after it was discovered that they contained indecent images of children.

The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity in the UK, said it was first notified late Sunday. Groups quickly warned Twitter members not to retweet such accounts or point their followers towards them if they appeared in their feeds, but to report them to Twitter and law enforcement authorities instead.

“People have expressed concern about a twitter feed with disturbing images of children. We have alerted CEOP [Child Exploitation and Online Protection Center]. They are looking at it urgently,” NSPCC tweeted.

“Nearly 500 reports to the Hotline this weekend. 200+ for Twitter content which is now down,” said Emma Lowther, the organization’s communications director.

The Internet Watch Foundation, a non-profit body funded by internet firms that seeks to remove child abuse material from the web as quickly as possible, said it had received more than 200 complaints about the accounts over the weekend. The NSPCC, for its part, focuses on child protection in England, Northern Ireland, Wales and the Channel Islands.

The alarming images came to public attention after groups of hackers said they had broken into several private accounts to uncover their shocking content, the NSPCC said.

Members of the public have reported the accounts to Greater Manchester Police and North Yorkshire Police, while CEOP says it is "aware" of the situation.

According to CEOP, it received around 25 to 30 reports overnight on at least four accounts, which were later disabled and will be investigated in the U.S. (where Twitter is based). Twitter is then obligated by law to forward any information they find to the National Center for Missing and Exploited Children (NCMEC), which is the U.S. equivalent of CEOP.

"NCMEC will then forward information for investigation to law enforcement agencies in the relevant country where the user is believed to be based, or where children are believed to be at risk," a CEOP spokesman explained.

It is unclear whether the users who uploaded the disturbing images were located in the UK or abroad, and the nationalities of the children involved are not yet known. The NSPCC urged people to be vigilant and to report such activity whenever they become aware of it.

"If you see something, or are aware of something, you should report it."

"To be honest, it is not a massive surprise," the spokesman said. "In our experience, sex offenders will use whichever means they can to connect with each other. They are usually quite devious."

As for those hoarding such content, Professor Alan Woodward, of the University of Surrey's department of computing, said offenders were increasingly storing it on social media rather than on their own computers.

“If they use the web to keep any pictures then they will be able to claim it was not them. The weight of evidence is not the same.”

Twitter has yet to comment on the closed accounts, but its policy explicitly states that it does not tolerate child sexual exploitation.

“When we are made aware of links to images of or content promoting child sexual exploitation they will be removed from the site without further notice and reported to the National Center for Missing & Exploited Children (NCMEC); we permanently suspend accounts promoting or containing updates with links to child sexual exploitation.”