Wednesday, June 20, 2018

Prostasia Foundation weighs in on Twitter on aggressive screening for c.p. in cloud and Internet posts



I’ll mention the “Prostasia Foundation”, which currently has a crowdfunding campaign (hosted on YouCaring), on this site.

Today the group answered a tweet concerning the practice of screening images in the cloud (in addition to social media posts) for possible matches to known watermarked images in the National Center for Missing and Exploited Children’s database, a practice which has sometimes resulted in arrests.
  
Link for their tweet is here.  One observation worth noting: this mechanism probably would not catch secondary copies (re-saves, crops, or screenshots) of the original watermarked images.
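To illustrate that observation, here is a minimal sketch of exact hash matching in Python. The digest set is a placeholder, and the real NCMEC-linked systems reportedly use PhotoDNA-style perceptual hashing rather than the plain cryptographic digest shown here; the plain digest illustrates why exact matching alone misses altered copies.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical set of digests from a reference database (placeholder).
known_hashes = {"<digest-of-a-known-image>"}

def is_known_copy(path: str) -> bool:
    # An exact digest matches only byte-identical files; re-encoding,
    # resizing, or screenshotting the same picture changes every byte
    # of the digest, so "secondary" copies slip past this kind of check.
    return sha256_of_file(path) in known_hashes
```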

Sunday, June 10, 2018

FOSTA consequences seem quiet right now but could erupt with a slippery slope for platforms; the "harmful to minors" concept seems expandable




Two months after Trump signed FOSTA into law, news about it seems to have dwindled, but a few contradictory pieces stand out.

A piece by Elizabeth Nolan Brown on 3-23-2018 in Daily Beast still warns that the law could end the web as we know it, primarily because of the “slippery slope” problem.

In the third paragraph from the end Brown makes the comparison to liability for promoting terrorism or weapons use, and many platforms have become pro-active on that point.

In fact, the whole “harmful to minors” concept makes more sense morally if it does include the issue of promoting violence or weapons use.

We’re facing a world in which some young adults go off the rails and social media may be a major influence.  This may be as critical as the actual availability of weapons (which is arguably too easy, but that’s beyond this particular blog).

Back in February, Insider Source had reported that FOSTA really is narrow enough for startups to deal with (better than SESTA, which it admits is too vague), but it’s not clear whether the last-minute changes to FOSTA added too much ambiguity, as EFF claims.

An article in The Hill (conservative) in late March, upping the ante on the importance of stopping trafficking as a policy priority, claimed that cloud monitoring tools are readily available on the web even for startups.  This sounds optimistic.

So far we haven’t heard of major instances of mass censorship outside the area of personal hookups (the actions of Craigslist, Reddit, and a few other companies).  But downstream-liability creep could indeed eventually determine who can easily speak on the web of the future.

An older article in Medium in January had tried to compare what FOSTA and SESTA purported to do legally.  This may be out of date. 

Saturday, May 19, 2018

Is it possible for visitors inadvertently loading or watching an illegal video to "possess" c.p. in the eyes of the law?



Recently I’ve indulged in watching some gay videos on YouTube.  I do notice that the captions for some of them mention “boys” or “teens”, and I hope (generously) that wording means ages 18 or 19. These videos usually have higher viewing counts.  But I spotted at least one video tagline in the suggested list that had “pre-pubescent” in a title, and that would imply child pornography.  That particular video listed a very low view count and might have just been posted.  YouTube might soon remove it with its own monitoring procedures.

There are two issues:  the legality of viewing material given the age of the viewer (or terms-of-service rules), and the legality of the content itself, including whether it would be criminal for anyone to view or “possess” it.
  
Google explains its age verification policy here.

Apparently you have to be 18 to view an age-restricted video (even when embedded), and at least 13 to have an account.
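Stated as logic, the policy as I read that page would be something like the following minimal sketch (the constant names and functions are mine, not Google’s):

```python
MIN_ACCOUNT_AGE = 13      # minimum age to hold an account
MIN_RESTRICTED_AGE = 18   # minimum age to view an age-restricted video

def may_view_restricted(viewer_age: int) -> bool:
    # The restriction applies even when the video is embedded elsewhere.
    return viewer_age >= MIN_RESTRICTED_AGE

def may_hold_account(age: int) -> bool:
    return age >= MIN_ACCOUNT_AGE
```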
  
However, a video showing nudity or explicit intimacy with an actor under 18 at the time of filming might fit the legal definition of child pornography.  When a video is made with an actor whose appearance suggests the likelihood of being under 18, in the US and most western countries, some sort of legal record of the actor’s age is required.  It seems conceivable that a video made in a non-western country would be more likely to have used actors who are underage by western law, posing a legal risk to viewers (below). 
   
The main legislation in the US would be the Child Protection and Obscenity Enforcement Act of 1988.  There is more information here about adult film industry regulations. 
  
The concern would be that a visitor who views a video that he/she could reasonably suspect was illegal (according to the laws requiring actors to be 18 or older) might later be found to have possessed c.p.  It is unclear how easily government agencies (or NCMEC) could detect such viewing or whether the government would really want to prosecute.  Some states (like Arizona) seem more aggressive than others.  

Viewing behind TOR or similar platforms might evade detection during viewing (but not necessarily after the fact).  It's unclear whether https encryption alone would do so; https hides page content and specific URLs, but the domain being visited is still visible to the ISP.

Since YouTube videos are often UGC (user-generated content), there is no guarantee, from the viewer's standpoint, that the actors were age-verified.  But Section 230 does not protect platforms from downstream liability in c.p. cases (and trafficking has now been folded into the liability risk with FOSTA), so users might be entitled to believe that the platforms take some reasonable care to remove what they reasonably suspect is c.p. or otherwise illegal, even without age verification of the actors in all cases. 
   
But obviously videos whose titles brag about illegality should be removed by services like YouTube and Vimeo, after detection by automated monitoring if possible.

Monday, May 07, 2018

Cloud Act could be used to screen unposted, private data for illegal behavior, even given the 4th Amendment



This piece by David Ruiz of the Electronic Frontier Foundation on the Cloud Act and the Fourth Amendment does bear attention. 

The bill would allow foreign law enforcement partners to share data they detect in cloud backups with the US. 


It’s pretty clear that cloud backups will be scanned more often in the future for child pornography and now sex trafficking content, which might be a tool to reduce downstream liability exposures due to FOSTA (although this would go further, into private backups that the user didn’t intend to post).
  
It would seem possible to scan them for suspicious behavior or interests of various kinds.  Although a way down the pike, it sounds feasible to scan even for low-level copyright infringement, such as illegally saved personal copies of material that was supposed to be purchased.   
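Purely as a thought experiment, such a scan could be as simple as walking a backup tree and comparing file digests against a reference list supplied by rights holders. Everything in this sketch (names, workflow, the list itself) is hypothetical; no provider has announced doing this.

```python
import hashlib
import os

# Hypothetical digests of commercial files that were supposed to be purchased.
licensed_hashes: set[str] = set()

def scan_backup(root: str):
    """Walk a backup directory, yielding paths whose digests match the list."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # unreadable file; skip it
            if digest in licensed_hashes:
                yield path
```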

Wednesday, April 11, 2018

Activist groups file complaint that YouTube has violated COPPA



Some activist groups have claimed that YouTube is collecting data from users under 13, putatively in violation of the 1998 Children’s Online Privacy Protection Act, or COPPA, as in this CNN story. The CNN story reports that YouTube may be picking up both minors’ and parents’ data when children sign on. 
  
Over 20 groups have filed a complaint with the FTC.  The story maintains that the groups want YouTube to be able to distinguish kids’ data from parents’.

  
In December, CNN had reported that YouTube would hire 10,000 people to “clean up YouTube”.

Sunday, March 18, 2018

Some states want to put mandatory filters on all Internet devices sold in their states with a registry of those who unblock; Hollywood seems to up threats on all user content



Wired has an important story on states considering their own variations of the Human Trafficking and Child Exploitation Prevention Act (HTCEPA), which would require every device sold in their states to have porn filters!  Right now, the dishonor roll includes Rhode Island, South Carolina, and Texas. Similar bills are under consideration in the UK.  (I couldn't find this act in Wikipedia.) 
  
Users could pay a fee to remove the filters but then the states would have a registry of users who had.
  
In the Wired story, Louise Matsakis traces these laws back to the original Communications Decency Act of 1996, largely overturned, and to the Child Online Protection Act (COPA), which has been the main subject of this particular blog.  The filter issue came up in the COPA litigation, particularly at the trial I covered here in 2006.

Ironically it is Section 230 of the original 1996 law that survived and that is now threatened by FOSTA and SESTA, as covered here before.  The article mentions these, and notes the lack of distinction between consensual adult sex and trafficking (which often involves minors).  The article’s comments on these are a little overbroad (the impact goes beyond just “social media sites”).

  
Recently, Congress has gotten some letters actually supportive of SESTA/FOSTA from Hollywood (Fox) and some parts of tech (Oracle).  I’ll get into these again, but there is a disturbing undertone to these letters:  that there is no reason users should be able to post to the whole planet at will, without gatekeepers, unless they give something back (like helping fight trafficking, or volunteering in some intersectional way).  That really isn’t what Senator Portman thinks SESTA says; he still says it is narrow.

Saturday, March 10, 2018

Legally dangerous tweet from Uganda circulating (don't forward it)



A few weeks ago WJLA7 warned viewers about a child pornography video circulating on Facebook, and that it could be a crime to share it.  The video has surely been removed.

But today I saw in my Twitter feed a post whose title seemed to hint at c.p. filmed in Uganda.  The image in the video showed minimal dress but no nudity.  I simply ignored it as it passed out of sight, but I realized I could have (with a little more presence of mind) reported it and unfollowed the sender. 

Presumably it could be a crime to retweet such a post.  Just a warning or a tip. 

I don’t recall that this has happened in my own feed before.  Let’s hope someone reports it and that Twitter gets rid of it quickly. 

Thursday, March 08, 2018

Geek Squad appears to be working undercover with FBI in a cozy relationship at a repair center to nab child pornography possessors




The Electronic Frontier Foundation, in a disturbing article by Aaron Mackey, reports that there was more collusion between a Geek Squad repair center in Kentucky and the FBI looking for child pornography than had been thought.

Some employees seem to have gone out of their way to look for images in unallocated space that the customer thought had been deleted.


There are tools that can detect digital watermarks from known images identified by NCMEC. But it is hard to imagine how one could find a “needle in a haystack” otherwise.
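On the unallocated-space point: recovering “deleted” pictures is mostly file carving, i.e. scanning raw disk bytes for known file signatures. A simplified sketch follows (real tools such as PhotoRec handle fragmentation and many more formats):

```python
JPEG_START = b"\xff\xd8\xff"  # JPEG start-of-image marker
JPEG_END = b"\xff\xd9"        # JPEG end-of-image marker

def carve_jpegs(raw: bytes):
    """Yield candidate JPEG blobs found in a raw disk image."""
    pos = 0
    while True:
        start = raw.find(JPEG_START, pos)
        if start == -1:
            return
        end = raw.find(JPEG_END, start)
        if end == -1:
            return
        yield raw[start:end + 2]  # include the two-byte end marker
        pos = end + 2
```

Each carved blob could then be hash-checked against a reference list, along the lines of the sketch in the June 20 post above.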

It sounds plausible that this could also be done with sex trafficking content in the future (related to the FOSTA-SESTA debate).
  
In August 2014, I had a large Toshiba laptop crash with a burned out motherboard from overheating when trying to upgrade from Windows 8.0 to 8.1, as repeatedly prompted by Microsoft.  The computer was sent to the repair center and was there for six weeks before we gave up on it (the store said “Tennessee”).  I had to replace it and apply the service plan warranty to that replacement.

Update: March 11

A further report on admissions of payments to Geek Squad members.  

Tuesday, March 06, 2018

The question of porn, pirated and placed out of context -- reminds me of the COPA trial in 2007



Stoya has an op-ed in the New York Times on Monday, March 5, 2018, p. A27, “Can there be good porn?”
  
She discusses how adult content needs to be put in context, a discussion I remember well from the COPA litigation of more than a decade ago. But then, she says, it gets pirated, posted “for free” on YouTube, and discovered by kids out of context.
  
But even when found in its original location, many people will not bother to read the context.

Saturday, March 03, 2018

FOSTA passage and the "should have known" standard



Note well the Wall Street Journal editorial “Political Sex-Trafficking Exploitation”, with the byline “Fast moving legislation could open the web to a lawsuit bonanza”; link.
  
  
WSJ admits that Backpage should have been prosecuted under existing law, which already excludes from downstream liability protection criminal activity that websites or service providers know about. A judge in Boston will let another case go forward.
  
But the rub is the “should have known” standard, already flagged by EFF.



Update: March 22

The Senate has passed FOSTA unchanged and the president has signed it.  See my BillBoushka blog today. 

Wednesday, February 28, 2018

House passes sex trafficking bill reducing downstream liability protections under Section 230 (Backpage case)



The House late Tuesday passed the “FOSTA” bill, HR 1865, based on the problems with sex-trafficking on the web.

The Wall Street Journal has the best account right now, by John D. McKinnon.  I have links to other accounts on my main blog, as well as on Wordpress where there are many more details. 

The bill seems to have been amended at the last minute to narrow the Section 230 exemptions for services (websites, social media sites, and possibly hosting companies; the last is not clear) when their users engage in promoting large-scale prostitution or any sex trafficking.  What is unclear is the legal standard for how a service would know this is going on, since it cannot prescreen all content.  Congress obviously believes it is targeting “classified ads” sites known for selling sex ads.  There is some question as to what the “reckless disregard” language means. 

Tuesday, February 20, 2018

Reposting school threats sent to a student could be a crime (WJLA-Sinclair warning)



WJLA7, the Sinclair-owned station in Washington DC, is warning parents that students who repost school threats they receive could face criminal charges or at least expulsion from school.  Reposting such messages could be treated very much like reposting child pornography (for which there was a scare a week ago).
  
Some attention to this possibility comes from a story about a Snapchat post at a different Florida school, reported here.  This is even more ironic because Snapchat posts are supposed to disappear. 

Sunday, February 18, 2018

Utah wants to pass a civil liability version of COPA



Matthew La Plante has a story in the Sunday Washington Post, p. A11, maintaining that politicians in Utah and several other states want to frame porn addiction as a public health crisis.  Utah, in particular, is considering a law allowing “victims” to sue publishers for adult content viewed accidentally by children. 
  
Is this COPA all over again?  Remember the COPA trial a decade ago, and the debate on filters and adult-id? 



Update:  Feb. 22

Florida's legislature passed a resolution declaring pornography, but not military-style weapons in the hands of a deranged person, a public health risk. Holly Yan of CNN reports, with a video interview. 

Tuesday, February 13, 2018

Hosting providers use security companies to scan for illegal content which could be placed by foreign hackers



In conjunction with a recent Cato Institute briefing on the unintended overreach of some state sex offender registry laws, it’s worth noting that hosting providers now use software to scan customer sites for malware, and this could include illegal content like child pornography (identifiable with digital watermarks), which could be planted particularly by foreign enemies as a kind of subtle terrorism or attack on American legal and democratic values.  SiteLock is one of the companies used to scan sites.  Older sites, those not updated or accessed often, or those with unused capacity (email accounts, for example), could start to develop a risk. 
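A crude version of that kind of scan might simply flag files that change on a site the owner rarely touches; a sketch under my own assumptions (SiteLock’s actual methods are proprietary):

```python
import os
import time

def recently_changed_files(webroot: str, within_days: int = 7):
    """On a site assumed dormant, any recently modified file is suspect."""
    cutoff = time.time() - within_days * 86400
    for dirpath, _dirs, filenames in os.walk(webroot):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) > cutoff:
                    yield path
            except OSError:
                continue  # file vanished or unreadable; skip
```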

The topic came up at the end of the session in an audience question. There was a comment from the panel that “mens rea” is more likely to be useful as a defense today, when a person’s property is hacked (by an enemy), than it would have been fifteen years ago, when law enforcement and the courts did not understand the Internet as well.
   
This problem could be related to issues reported before, when computers are infected with illegal content discovered by repair technicians.

Saturday, February 03, 2018

Arrest of a substitute teacher in Maryland shows the practical risks of Internet abuse; sudden warning about a Facebook video


A substitute teacher in Charles County, Maryland, about 30 miles SE of Washington DC, was charged (state charges) with showing sexually explicit materials to a minor and with child pornography possession, after having worked only 34 days.  A student reported his texting of another student; WTOP story here.
  
But the incident shows that, despite fingerprint background checks, it is very difficult for school systems to vet substitute teachers well before hire.  They can look at social media, but this may run into First Amendment concerns with public employees.  Still, principals at individual schools can ban substitutes on any suspicion whatsoever, and most school systems have “three strike” rules.
  
Just as I was typing this story, I learned of a c.p. video circulating on Facebook from station WJLA7 in Washington, link.  It may have originated in Alabama, but was reported to Sinclair news in Cincinnati first. Resharing it is illegal and could lead to federal prosecution. But Facebook is likely to have removed the video before it gets very far. The WJLA story notes that resending such a video for "journalistic" purposes would not prevent criminal prosecution. That statement could, by analogy, raise serious questions about citizen journalism in other areas, like terrorism.  But possession of a c.p. image is itself a crime; possession of bomb-making instructions is not, although possession of the actual materials might be.

Tuesday, January 09, 2018

Tech companies soften opposition to SESTA as liability language is narrowed


Lost in all the recent attention to net neutrality is the progress of SESTA and related House bills to weaken Section 230 protections for services that host sex trafficking ads.


On November 7, 2017, the Washington Post had reported (Tom Jackman) that major tech companies were removing objections to SESTA since the Senate seems to be willing to narrow the language that could lead to downstream liability.
  
The language about “knowingly” allowing trafficking content was narrowed to “participation in a venture”, but there is also a provision about “knowingly assisting, supporting or facilitating a violation of sex trafficking laws”, and the phrase “by any means” was removed.