Saturday, May 19, 2018

Is it possible for visitors inadvertently loading or watching an illegal video to "possess" c.p. in the eyes of the law?

Recently I’ve indulged in watching some gay videos on YouTube.  I do notice that the captions for some of them mention “boys” or “teens”, and I hope (generously) that the wording means ages 18 or 19. These videos usually have higher view counts.  But I spotted at least one video in the suggested list with “pre-pubescent” in the title, which would imply child pornography.  That particular video listed a very low view count and might have just been posted.  YouTube might soon remove it through its own monitoring procedures.

There are two issues: the legality of viewing material according to the age of the viewer (or terms-of-service rules), and the legality of the content itself, including whether it might be criminal for anyone to view or “possess” it.
Google explains its age verification policy here

Apparently you have to be 18 to view an age-restricted video (even if embedded), and at least 13 to have an account.
However, a video showing nudity or explicit intimacy with an actor under 18 at the time of filming might fit the legal definition of child pornography.  When a video is made with an actor whose appearance suggests a likelihood of being under 18, the US and most western countries require some sort of legal record of the actor’s age.  It seems conceivable that a video made in a non-western country would be more likely to have used actors who are underage by western standards, posing a legal risk to viewers (below).
The main legislation in the US would be the Child Protection and Obscenity Enforcement Act of 1988.  There is more information here about adult film industry regulations.
The concern would be that a visitor who views a video that he or she could reasonably suspect was illegal (under the laws requiring actors to be 18 or older) might later be found to have possessed c.p.  It is unclear how easily government agencies (or NCMEC) could detect such viewing, or whether the government would really want to prosecute.  Some states (like Arizona) seem more aggressive than others.

Viewing behind TOR or similar platforms might evade detection during viewing (but not necessarily after the fact).  It's unclear whether https encryption alone would do so.

Since YouTube videos are often UGC (user-generated content), there is no guarantee, from the viewer's perspective, that the actors were age-verified.  But Section 230 would not protect the platform from downstream liability in c.p. cases (and trafficking has now been folded into the liability risk with FOSTA), so users might be entitled to believe that the platforms take some reasonable care to remove what they reasonably suspect is c.p. or otherwise illegal, even without age verification of the actors in all cases.
But obviously videos whose titles brag about illegality should be removed by services like YouTube and Vimeo, after detection by automated monitoring if possible.

Monday, May 07, 2018

Cloud Act could be used to screen unposted, private data for illegal behavior, even given the 4th Amendment

This piece by David Ruiz of the Electronic Frontier Foundation on the Cloud Act and the Fourth Amendment does bear attention.

The bill would allow foreign law enforcement partners to share data they detect in cloud backups with the US.

It’s pretty clear that cloud backups will be scanned more often in the future for child pornography and now sex trafficking content, which might be a tool to reduce downstream liability exposure under FOSTA (although this would go further, into private backups the user never intended to post).
It would also seem possible to scan backups for suspicious behavior or interests of various kinds.  Although a way down the pike, it sounds feasible to scan for low-level copyright infringement, such as illegally saving personal copies of material that was supposed to be purchased.