Wednesday, June 20, 2018

Prostasia Foundation weighs in on Twitter about aggressive screening for c.p. in cloud and Internet posts



I’ll mention, on this site, the “Prostasia Foundation”, which currently has a crowdfunding campaign running on YouCaring.

Today the group answered a tweet concerning the practice of screening images in the cloud (in addition to social media posts) for possible matches to known hashed or watermarked images in the National Center for Missing and Exploited Children’s database, a practice that has sometimes resulted in arrests.
  
Link for their tweet is here.  One observation worth noting: this mechanism probably would not catch secondary copies (re-photographed, cropped, or re-encoded versions) of the original watermarked images.
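
To make that limitation concrete, here is a minimal sketch in Python (the hash value is a made-up stand-in; NCMEC's real hash lists are not public, and production systems use perceptual hashing such as PhotoDNA rather than plain file hashes) of why exact-hash screening misses re-saved or cropped copies:

# Minimal sketch: exact-hash screening against a known-image list.
# The hash below is a made-up stand-in, not a real list entry.
import hashlib

KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path):
    """Return the SHA-256 digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path):
    """Flag a file only if its bytes match a known hash exactly."""
    return sha256_of_file(path) in KNOWN_HASHES

# A screenshot, crop, or re-encode of the same image produces a completely
# different digest, so a check like this would not catch secondary copies;
# that is why real screening relies on perceptual hashes instead.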

Sunday, June 10, 2018

FOSTA consequences seem quiet right now but could erupt with a slippery slope for platforms; the "harmful to minors" concept seems expandable




Two months after Trump signed FOSTA into law, news coverage of it seems to have dwindled, but a few contradictory pieces stand out.

A piece by Elizabeth Nolan Brown, dated March 23, 2018, in the Daily Beast still warns that the law could end the web as we know it, primarily because of the “slippery slope” problem.

In the third paragraph from the end, Brown draws the comparison to liability for promoting terrorism or weapons use, and many platforms have become proactive on that point.

In fact, the whole “harmful to minors” concept makes more sense morally if it does include the issue of promoting violence or weapons use.

We’re facing a world in which some young adults go off the rails and social media may be a major influence.  This may be as critical as the actual availability of weapons (which is arguably too easy, but that’s beyond this particular blog).

Back in February, Insider Source had reported that FOSTA really is narrow enough for startups to deal with (better than SESTA, which it admits is too vague), but it’s not clear whether the last-minute changes to FOSTA added too much ambiguity, as EFF claims.

An article in The Hill (conservative) in late March, upping the ante on the importance of stopping trafficking as a policy priority, claimed that cloud monitoring tools were readily available on the web, even for startups.  This sounds optimistic.

So far we haven’t heard of major instances of mass censorship outside the area of personal hookups (the actions of Craigslist, Reddit, and a few other companies).  But the creep of downstream liability could indeed eventually determine who can easily speak on the web of the future.

An older article on Medium from January had tried to compare what FOSTA and SESTA purported to do legally.  This may be out of date.

Saturday, May 19, 2018

Is it possible for visitors inadvertently loading or watching an illegal video to "possess" c.p. in the eyes of the law?



Recently I’ve indulged in watching some gay videos on YouTube.  I do notice that the captions for some of them mention “boys” or “teens”, and I hope (generously) that the wording means ages 18 or 19. These videos usually have higher view counts.  But I spotted at least one video in the suggested list with “pre-pubescent” in its title, which would imply child pornography.  That particular video showed a very low view count and might have just been posted.  YouTube might soon remove it through its own monitoring procedures.

There are two issues:  the legality of viewing material given the age of the viewer (or terms-of-service rules), and the legality of the content itself, and whether it might be criminal for anyone to view or “possess” it.
  
Google explains its age verification policy here.

Apparently you have to be 18 to view an age-restricted video (even if embedded), and at least 13 to have an account at all.
  
However, a video showing nudity or explicit intimacy with an actor under 18 at the time of filming might fit the legal definition of child pornography.  When a video is made with an actor whose appearance suggests a likelihood of being under 18, the US and most western countries require some sort of legal record of the actor’s age.  It is conceivable that a video made in a non-western country would be more likely to have used actors who are underage by western law, posing a legal risk to viewers (below).
   
The main legislation in the US would be the Child Protection and Obscenity Enforcement Act of 1988.  There is more information here about adult film industry regulations.
  
The concern would be that a visitor who views a video that he/she could reasonably suspect was illegal (according to the laws requiring actors to be 18 or older) might later be found to have possessed c.p.  It is unclear how easily government agencies (or NCMEC) could detect such viewing or whether government would really want to prosecute.  Some states (like Arizona) seem more aggressive than others.  

Viewing behind Tor or similar platforms might evade detection during viewing (but not necessarily after the fact).  It's unclear whether https encryption alone would do so.

Since YouTube videos are often UGC (user-generated content), there is no guarantee, from the viewer's perspective, that the actors were age-verified.  But Section 230 would not protect the platform from downstream liability in c.p. cases (and trafficking has now been folded into the liability risk with FOSTA), so users might be entitled to believe that the platforms take some reasonable care to remove what they reasonably suspect is c.p. or otherwise illegal, even without age verification of the actors presented in all cases.
   
But obviously videos whose titles brag about illegality should be removed by services like YouTube and Vimeo, after detection by automated monitoring if possible.
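
As a rough illustration only (the keyword list is hypothetical, and any real monitoring pipeline would be far more sophisticated), a first pass at that kind of title screening could be as simple as flagging uploads for human review:

# Rough sketch: flag video titles containing terms that warrant human review.
# The term list is purely illustrative, not any platform's actual policy.
import re

SUSPECT_TERMS = ["pre-pubescent", "prepubescent", "underage"]

def needs_review(title):
    """Return True if the title contains a suspect term as a whole word."""
    lowered = title.lower()
    return any(
        re.search(r"\b" + re.escape(term) + r"\b", lowered)
        for term in SUSPECT_TERMS
    )

uploads = ["Beach volleyball highlights", "underage drinking PSA (example)"]
review_queue = [t for t in uploads if needs_review(t)]
# The second title gets queued for human review; keyword checks alone
# produce false positives, so a human look is still needed.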

Monday, May 07, 2018

Cloud Act could be used to screen unposted, private data for illegal behavior, even given the 4th Amendment



This piece by David Ruiz of the Electronic Frontier Foundation on the Cloud Act and the Fourth Amendment does bear attention.

The bill would allow foreign law enforcement partners to share with the US data they detect in cloud backups.


It’s pretty clear that cloud backups will be scanned more often in the future for child pornography and now sex trafficking content, which might be a tool to reduce downstream liability exposures due to FOSTA (although this would go further, into private backups that the user didn’t intend to post).
  
It would seem possible to scan them for suspicious behavior or interests of various kinds.  Although a way down the pike, it sounds feasible to scan even for low-level copyright infringement, merely the illegal saving of personal copies of material that was supposed to be purchased.
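
A back-of-the-envelope sketch of what such a blanket backup scan might look like in Python (the backup path and the hash set are hypothetical stand-ins, not any provider's actual method; real c.p. screening uses perceptual hashing rather than exact file hashes):

# Sketch only: walk a backed-up directory tree and report files whose
# contents match a set of known hashes.  The hash set is an empty stand-in;
# a real deployment would load provider- or NCMEC-supplied lists.
import hashlib
import os

KNOWN_ITEM_HASHES = set()

def digest(path):
    """SHA-256 of a file's raw bytes, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_backup(root):
    """Yield paths under the backup root whose contents match a known hash."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                if digest(path) in KNOWN_ITEM_HASHES:
                    yield path
            except OSError:
                continue  # unreadable file; skip it

# Example (hypothetical mount point for a user's cloud backup):
# for hit in scan_backup("/backups/user123"):
#     print("match:", hit)

The same mechanism would find exact copies of copyrighted files, which is what makes the speculation above technically plausible.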

Wednesday, April 11, 2018

Activist groups file complaint that YouTube has violated COPPA



Some activist groups have claimed that YouTube is collecting data from users under 13, putatively in violation of the 1998 Children’s Online Privacy Protection Act, or COPPA, as in this CNN story. The CNN story reports that YouTube may be picking up both minors’ and parents’ data when children sign on.
  
Over 20 groups have filed a complaint with the FTC.  The story maintains that the groups want YouTube to be able to distinguish kids’ data from parents’.

  
In December, CNN had reported that YouTube would hire 10,000 people to “clean up YouTube”.

Sunday, March 18, 2018

Some states want to put mandatory filters on all Internet devices sold in their states with a registry of those who unblock; Hollywood seems to up threats on all user content



Wired has an important story on states considering their own variations of the Human Trafficking and Child Exploitation Prevention Act (HTCEPA), which would require every device sold in their states to have porn filters!  Right now, the dishonor roll includes Rhode Island, South Carolina, and Texas. Similar bills are being considered in the UK.  (I couldn't find this act in Wikipedia.)
  
Users could pay a fee to remove the filters, but then the states would have a registry of users who had done so.
  
Wired's story, by Louise Matsakis, traces these laws back to the original Communications Decency Act of 1996, which was largely overturned, and to the Child Online Protection Act (COPA), which has been the main subject of this particular blog.  The filter issue came up in the COPA litigation, particularly at the trial I covered here in 2006.

Ironically, it is Section 230 of the original 1996 law that survived and that is now threatened by FOSTA and SESTA, as covered here before.  The article mentions these, and notes the lack of distinction between consensual adult sex and trafficking (which often involves minors).  The article’s comments on these are a little overbroad (the reach is more than just “social media sites”, though).

  
Recently, Congress has gotten some letters actually supportive of SESTA/FOSTA from Hollywood (Fox) and from parts of tech (Oracle).  I’ll get into these again, but there is a disturbing undertone to these letters: that there is no reason users should be able to post to the whole planet at will, without gatekeepers, unless they give something back (like helping fight trafficking, or volunteering in some intersectional way).  That really isn’t what Senator Portman thinks SESTA says; he still maintains it is narrow.

Saturday, March 10, 2018

Legally dangerous tweet from Uganda circulating (don't forward it)



A few weeks ago WJLA7 warned viewers that a child pornography video was circulating on Facebook and that it could be a crime to share it.  The video has surely been removed.

But today I saw in my Twitter feed a post whose title seemed to hint at c.p. filmed in Uganda.  The image in the video showed minimal dress but no nudity.  I simply ignored it as it passed out of sight, but I realized I could have (with a little more presence of mind) reported it and unfollowed the sender. 

Presumably it could be a crime to retweet such a post.  Just a warning or a tip. 

I don’t recall that this has happened in my own feed before.  Let’s hope someone reports it and that Twitter gets rid of it quickly.