Sunday, February 18, 2018

Utah wants to pass a civil liability version of COPA



Matthew La Plante has a story in the Sunday Washington Post (p. A11) maintaining that politicians in Utah and several other states want to frame porn addiction as a public health crisis.  Utah, in particular, is considering a law that would allow “victims” to sue publishers for adult content viewed accidentally by children. 
  
Is this COPA all over again?  Remember the COPA trial a decade ago, and the debate over filters and adult ID verification? 

Tuesday, February 13, 2018

Hosting providers use security companies to scan for illegal content which could be placed by foreign hackers



In conjunction with a recent Cato Institute briefing on the unintended overreach of some state sex offender registry laws, it’s worth noting that hosting providers now use software to scan customer sites for malware, and this scanning can extend to illegal content such as child pornography (identifiable with digital watermarks), which could be planted, particularly by foreign enemies, as a kind of subtle terrorism or attack on American legal and democratic values.  SiteLock is one of the companies used to scan sites.  Older sites, those not updated or accessed as often, or those with unused and unneeded capacity (email accounts, for example), could start to pose a risk. 
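To make the mechanism concrete, here is a minimal Python sketch of how a scanner could flag hosted files whose cryptographic hash appears on a blocklist of known-bad content. This is only a conceptual illustration under my own assumptions: the digest list and document-root path are hypothetical, and real services such as SiteLock use their own proprietary signatures and matching methods.

```python
# Conceptual sketch only -- not how SiteLock or any real scanner actually works.
# The blocklist digest and document root below are hypothetical.
import hashlib
from pathlib import Path

# Hypothetical blocklist of SHA-256 digests of known-bad files
# (malware payloads, images already flagged by authorities).
KNOWN_BAD_HASHES = {
    "0" * 64,  # placeholder digest
}

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_site(root: str) -> list:
    """Walk a hosted site's files and flag any whose digest is on the blocklist."""
    flagged = []
    for p in Path(root).rglob("*"):
        if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES:
            flagged.append(p)
    return flagged

if __name__ == "__main__":
    for hit in scan_site("/var/www/example-site"):  # hypothetical document root
        print("flagged:", hit)
```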

The topic came up in an audience question at the end of the session.  A panelist commented that a “mens rea” defense is more likely to be useful today, when a person’s property is hacked (by an enemy), than it would have been fifteen years ago, when law enforcement and the courts did not understand the Internet as well.
   
This problem could be related to issues reported here before, in which computers infected with illegal content were discovered by repair technicians.

Saturday, February 03, 2018

Arrest of a substitute teacher in Maryland shows the practical risks of Internet abuse; sudden warning about a Facebook video


A substitute teacher in Charles County, Maryland, about 30 miles SE of Washington DC, was charged with showing sexually explicit materials to a minor and with possession of child pornography, under state charges, after having worked only 34 days.  A student reported his texting of another student; WTOP has the story here. 
  
But the incident shows that, despite fingerprint background checks, it is very difficult for school systems to vet substitute teachers well before hire.  They can look at social media, but this may raise First Amendment concerns with public employees.  Principals at individual schools, however, can ban substitutes on any suspicion whatsoever, and most school systems have “three strike” rules.
  
Just as I was typing this story, I learned from station WJLA7 in Washington of a c.p. video circulating on Facebook, link.  It may have originated in Alabama, but was reported first to Sinclair news in Cincinnati.  Resharing it is illegal and could lead to prosecution under federal law, though Facebook is likely to have removed the video before it gets very far.  The WJLA story notes that resending such a video for "journalistic" purposes would not prevent criminal prosecution.  That statement could, by analogy, raise serious questions about citizen journalism in other areas, like terrorism.  Possession of a c.p. image is itself a crime; possession of bomb-making instructions is not, although possession of the actual materials might be.

Tuesday, January 09, 2018

Tech companies soften opposition to SESTA as liability language is narrowed


Lost in all the attention to net neutrality recently is the progress of SESTA and related House bills to weaken Section 230 protections for services that host sex trafficking ads.


On November 7, 2017, the Washington Post reported (Tom Jackman) that major tech companies were withdrawing objections to SESTA because the Senate seemed willing to narrow the language that could lead to downstream liability.
  
The language about “knowingly” allowing trafficking content was narrowed to “participation in a venture,” but there is also a provision about “knowingly assisting, supporting or facilitating a violation of sex trafficking laws,” and the revision removes the phrase “by any means.” 

Friday, December 22, 2017

Cloud services now seem to be checking consumers for watermarked child pornography images; an invitation to hackers?


Last night, television station WJLA7 in Washington mentioned a story in which a former sheriff’s deputy (Charles County MD, SE of Washington DC) had apparently been prosecuted for possession of child pornography.  The only online story dates back to 2016, here.

But the story was interesting because this time it was announced that the person had been caught by a data cloud service.  This appears to have been related to images on his smartphone.

This is the first time I can recall a data cloud service reporting a consumer to authorities, based on cooperation with the National Center for Missing and Exploited Children in Alexandria VA.  This process must have occurred by automatically checking backed-up images for known digital watermarks in the NCMEC database.  Google has in the past checked Gmail attachments this way, and in one case a Houston TX man was arrested.
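For illustration, here is a toy Python sketch of the general idea of matching backed-up images against a database of known hashes. Real systems reportedly use robust perceptual hashes such as Microsoft’s PhotoDNA, whose algorithm and hash values are not public; the simple average-hash, the hash values, and the threshold below are my own hypothetical placeholders.

```python
# Toy sketch only: real matching against the NCMEC database relies on robust
# perceptual hashes (e.g., Microsoft's PhotoDNA); the KNOWN_HASHES set below
# is purely hypothetical.
from PIL import Image  # Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to 8x8 grayscale and threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits in which two hashes differ."""
    return bin(a ^ b).count("1")

# Hypothetical hashes of known images (the real list is held by NCMEC, not published).
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}

def matches_known(path: str, threshold: int = 5) -> bool:
    """Flag a backed-up image whose hash is within a few bits of a known one."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)
```

Unlike an exact checksum, a perceptual hash of this kind still matches after an image is resized or re-encoded, which is presumably why such techniques are used for this purpose.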

Whatever one’s moral outrage over the issue, in practice consumers who use any cloud backup service for their PCs, laptops or phones should be aware that this possibility now exists.  I would be concerned that devices could be hacked and c.p. placed deliberately in order to get people framed for crimes, as part of retaliation, harassment, or terror-related campaigns.

A related problem is what makes an image fit the legal definition of c.p.  We’ve heard of cases where parents were arrested after turning in photo film for development of pictures of their kids, and the pictures were noticed by clerks (which sounds a bit like the Best Buy issue recently discussed).  I even found some nude footage in old family film of me as a child, which I removed before posting it, out of this “fear,” even though I know my parents had no prurient interest when they took the film in the 1940s, when no one was thinking this way.  Following this line of thought has some implications: maybe an image that is not completely nude could be viewed as c.p. if capable of inducing arousal in the viewer.  There could be risks in possessing copies of foreign explicit films where the same rules (requiring registry of actors to make sure they are over 18) might not have been followed.  Federal law for images in the US sets a minimum age of 18 regardless of state laws on age of consent (often lower) or European country laws (often lower). 
  
It’s pretty easy to imagine that the sex trafficking issue will start mixing with child pornography in enforcement.  

Thursday, December 07, 2017

Sexting case seems to fail in unreasonable intimate search of male teen by police


Here is a bizarre case in Virginia where police went too far in investigating a teenager suspected of sending pornographic images of himself to an underage girl.  Tom Jackman has the story in the Washington Post on Dec. 6.  The incident took place in Manassas, in Prince William County, about 30 miles from Washington.

The case moved to the federal system and to the 4th Circuit as a 4th Amendment issue.  The 4th Circuit ruling is here.

Saturday, November 25, 2017

Case against California doctor dismissed because original image was not illegal and subsequent searches were illegal


The child pornography possession case against a California oncologist was dismissed after a federal judge ruled the original evidence used to justify his home search inadmissible (May 18).  Tom Jackman has his update story in the Washington Post on p. A2 on Friday, November 24, 2017.  The defendant had filed suit with the help of the Electronic Frontier Foundation. 

Again, it’s worth noting that images found in unallocated space (related to deletion) may lack the metadata necessary to show possession in the eyes of the law.

Furthermore, the one image in question, allegedly of a minor female, may have constituted erotica but was not explicit enough to meet the definition of child pornography, according to the judge.  There is some controversy over this, as I know other material on this issue claims that images (at least photos, not animation) intended to arouse could be considered pornographic even if they don’t “show everything.”

But the case was thrown out because all the subsequent searches of the doctor’s house were ruled illegal, and therefore inadmissible, since the original image did not meet the legal definition of child pornography.


There remains a troubling question of whether technicians at the Geek Squad center in Kentucky (where computers needing extensive repairs are sent) were paid by the FBI and may have made gratuitous examinations of the hard drives.  Company policy says that technicians will not look for illegal images but must report them when found.  There also remains a question about a technician’s legal judgment as to what constitutes an illegal image (as in this case).  We’ve covered before the troubling possibility that an illegal image could be placed on a computer by malware (and there is at least one variant of ransomware that does so).

The sex trafficking issue may seem to be part of this, because many sex trafficking victims (especially overseas) are minors.  Viewing an ad for sex trafficking would not be illegal, but responding to one might well be.

Viewing a motion picture, video, or still image (on a computer or smartphone) produced overseas could raise legal questions.  In the United States, actors must certify they are 18 or over in order to act in adult films.  The age may be lower in Europe.  Most reputable production companies enforce the 18-year-old rule overseas, but some might not, exposing the viewer to possible legal liability, it would seem.