You might not expect to hear this, but Facebook – the largest photo-sharing site on the Internet – has been a haven for pedophiles, who upload “several thousand registered illegal images” to the site each day, according to the New York Times. As you probably know, Facebook has been stringent about removing offensive images from its site, so much so that it has removed more than one photo of a breastfeeding mother from its servers. Now it has teamed up with Microsoft, which “has refined a technology it created called PhotoDNA” that can identify images used by pedophiles “and cull through large amounts of data quickly and accurately enough to police the world’s largest online services.”
This isn’t the first move Facebook has made to stop the sexual exploitation of children. In January, the site joined the National Center for Missing & Exploited Children’s Amber Alert network. Ernie Allen, chief executive of the center, says, “Our hope and belief is that Facebook will be just the first of many companies to use what has proven to be highly effective technology. Online services are going to become a hostile place for child pornographers and pedophiles.”
The New York Times reports that “by focusing on images of children under 12, the initiative is battling the worst of the worst images, which are often shared over and over again. Child pornography is growing increasingly violent and depicting increasingly young children, including infants and toddlers.” Hany Farid, a Dartmouth computer science professor who worked with Microsoft on PhotoDNA, estimates that “at least 50,000 child pornography images are being transmitted online every day.”
The National Center for Missing & Exploited Children “has amassed 48 million images and videos depicting child exploitation since 2002, including 13 million in 2010 alone.” That’s chilling. PhotoDNA computes a unique digital signature for each known image, which online services can then use to find and remove copies; the initiative will initially target 10,000 of these images. The Times notes, “Tests on Microsoft properties showed it accurately identifies images 99.7 percent of the time and sets off a false alarm only once in every 2 billion images.”
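To give a rough sense of how signature-based matching works: PhotoDNA itself is proprietary and far more sophisticated, but the general idea of robust image fingerprints can be sketched with a toy “average hash.” Everything below (the pixel grids, function names, and the hashing scheme) is illustrative, not PhotoDNA’s actual method.

```python
# Illustrative sketch only -- PhotoDNA's real algorithm is proprietary.
# A toy "average hash": reduce an image to a compact bit fingerprint,
# then compare fingerprints by Hamming distance so that slightly
# altered copies of a known image still match.

def average_hash(pixels):
    """Compute a bit-string fingerprint from a grayscale pixel grid."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the average.
    return [1 if p > avg else 0 for p in flat]

def hamming(a, b):
    """Count differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# A "known" image and a slightly altered copy (one pixel tweaked).
original = [[10, 200, 30], [220, 40, 250], [15, 180, 60]]
altered  = [[10, 198, 30], [220, 40, 250], [15, 180, 60]]

h1, h2 = average_hash(original), average_hash(altered)
# A small Hamming distance suggests the same underlying image.
print(hamming(h1, h2))  # → 0
```

The key property, which PhotoDNA shares, is robustness: minor edits such as resizing or recompression do not change the fingerprint much, so a service can match uploads against a database of known-bad signatures without storing the images themselves.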
For more on PhotoDNA, visit The New York Times.