Sunday, August 10, 2014

Email companies match images (by digital hash) to known child pornography entries in databases; could this be done with Cloud storage?

Recently, several media outlets have reported that Google scans images in Gmail attachments for child pornography.  It probably does the same with other online products, like Picasa albums, its social media platform, and blogs.  The way this is done is to compare a hash computed from the image against hash codes in a database of known images maintained by the National Center for Missing and Exploited Children (NCMEC) in Alexandria, VA.  The UK Telegraph has a story by Matthew Sparkes here
The Washington Post has a story on August 9 in print on the Switch Blog by Hayley Tsukayama, p. G4, not yet online.  Extreme Tech has an even more detailed story by Sebastian Anthony here.   Extreme Tech uses the terminology "digital fingerprint".  Microsoft also has a similar technology called PhotoDNA, which it donates to other service providers; it seems to be based on recognizing the same kinds of watermarks or digital fingerprints.  In all cases, fingerprints (computed from the images) are compared to known databases, which would include NCMEC ("missing kids") and possibly other industry databases that seem to be in development. 
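To make the "digital fingerprint" idea concrete, here is a minimal sketch of a perceptual "difference hash" (dHash), a simplified stand-in for fingerprinting schemes like PhotoDNA, whose actual algorithm is proprietary and not shown here. A real system would first decode and shrink the image; in this toy example the "image" is just an 8x9 grid of grayscale values.

```python
def dhash(pixels):
    """Compute a 64-bit fingerprint from an 8x9 grayscale grid.

    Each bit records whether a pixel is brighter than its right
    neighbour, so the hash survives small shifts in brightness or
    compression -- unlike a cryptographic hash, which changes
    completely if even one byte of the file changes.
    """
    bits = 0
    for row in pixels:                         # 8 rows
        for left, right in zip(row, row[1:]):  # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Number of differing bits; a small distance means near-duplicate images."""
    return bin(a ^ b).count("1")

# A toy "image" and a uniformly brightened copy of it.
img = [[(r * c * 29) % 200 for c in range(9)] for r in range(8)]
brighter = [[min(p + 3, 255) for p in row] for row in img]

# A uniform brightness shift preserves every left/right comparison,
# so the fingerprints match exactly.
assert hamming_distance(dhash(img), dhash(brighter)) == 0
```

The point of the comparison step is that a provider never has to "look at" the image: it only checks whether the computed fingerprint is close enough to one already in the known database.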

Watermarks are associated with content labeling, which a British group, no longer active, called the ICRA (later part of the Family Online Safety Institute) wanted to sell to content providers as a way of labeling content for age suitability.  This idea was pertinent during the days when COPA was being litigated.

There is no technology that can automatically identify "new" child pornography without matching against an image already classified as such by law enforcement or perhaps through consumer complaints. Possibly, a consumer could publish or send an image that he or she thinks is legal, have it reported, classified, and fingerprinted anyway, and then have the image discovered later.  One problem for consumers is that the legal definition of child pornography varies among countries, even in the West.  Generally, in Western countries, legitimate media companies follow a standard that actors in erotic or explicit ("NC-17") images must be 18 or older, and federal law requires producers to verify actors' ages (minimum 18) for films made in the US (and Canada).  There could be more questions about standards for materials from Russia or some parts of Asia.

Security companies (like Sophos and Webroot) say they sometimes find images with matching fingerprints with their technology and will notify authorities, but they do not proactively scan networks or client computers. Computer repair services seem to have a similar policy, and there have been a few cases where people were reported; here there could be a problem of judgment as to what kind of image is illegal when it is detected by a human viewer. 

It would seem that cloud storage could be scanned for hash-matched images, but I haven't heard of this being done. That possibility could include Carbonite, the new cloud services from Microsoft with Windows 8, and Apple's iCloud.  The NSA would be capable of detecting images like this as part of terrorism detection, but would it notify police?
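A hypothetical sketch of how a storage provider could sweep files at rest against a list of known hashes. The file names and the hash set here are invented for illustration, and real services would presumably use robust fingerprints (PhotoDNA-style) rather than the plain SHA-256 shown, since a cryptographic hash catches only byte-identical copies.

```python
import hashlib
import os
import tempfile

def sha256_of_file(path, chunk_size=65536):
    """Stream the file in chunks so arbitrarily large uploads fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_directory(root, known_hashes):
    """Return paths whose content hash appears in the known-bad set."""
    matches = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if sha256_of_file(path) in known_hashes:
                matches.append(path)
    return matches

# Demo with throwaway files standing in for cloud-stored objects.
with tempfile.TemporaryDirectory() as root:
    flagged = os.path.join(root, "holiday.jpg")
    with open(flagged, "wb") as f:
        f.write(b"fake image bytes")
    with open(os.path.join(root, "notes.txt"), "wb") as f:
        f.write(b"harmless text")

    # Pretend this hash came from a known-image database.
    known = {sha256_of_file(flagged)}
    print(scan_directory(root, known))  # lists only the flagged path
```

Nothing in this sketch requires decrypting or displaying content to a person, which is part of why the privacy questions below are about policy rather than technical feasibility.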

Extreme Tech points out that this whole process can raise troubling questions.  Could service providers scan for other illegal content, like copyright infringement? This process, after all, deals with content that may have been intended to be totally "private" and not posted on the web at all.

Email providers would also be able to scan for spam, but consumers who mark a suspect email as spam and don't open it would not have any legal liability.  

Update: August 13

Tech Republic has a similar story on the matter by Scott Matteson, generally supportive of how Google handled this.  See the comments, also often supportive.  People have been arrested after photo developers called police about film submitted for developing, and sometimes the standards are subjective and capricious.  Is a nude baby picture taken by parents really child pornography?  I've seen a very few shots in smaller films (with parents present), in commercial distribution (such as a circumcision scene), that might fit someone's definition, although they obviously weren't taken with the intention of eroticism or abuse of children.

Update: August 16

The Washington Post supports Google on this, saying it is complying with federal law, in an editorial, "Google's Careful Watch", here

Saturday, August 02, 2014

Teen who sexted ordered to stay off social media by judge, but given a chance for dismissal after one year

The 17-year-old boy in Manassas, VA, whom prosecutors pursued aggressively in a sexting case was placed on probation for one year by a Prince William County judge, with the possibility of dismissal of the case after one year. This process is called "deferred adjudication".  It is a legal procedure, at least in Virginia, that was used for a teacher in a matter ancillary to an incident that I was involved in as a substitute teacher in 2005 (see main blog, July 27, 2007).  

Taking a picture of one's own body parts does not pose the same hazard to others or to the public that the usual production of child pornography would.

However, the teen must do community service and stay off social media completely for one year.  This could have a serious effect in other areas, like college applications in the future. 
The Washington Post has a story by Tom Jackman here.