Thursday, December 27, 2018

COPPA sets up a situation that parallels Section 230 problems (with Google apps)

The Washington Post offered a significant editorial Thursday that may impact the downstream liability question: “Lessons from Google: Legislators can learn from a complaint about the marketing of apps for children”, link.
The significant issue is that Google had some downstream liability protection from app developers who violated COPPA, the Children’s Online Privacy Protection Act of 1998; the liability reverts to the developers.  But calls to change this run parallel to calls to weaken Section 230 in other areas, as recently with FOSTA.

Tuesday, December 04, 2018

Tumblr's ban on nudity (of adults) seems based on filtering issues which may spread to most other platforms quickly

Late on Monday Eli Rosenberg wrote in the Washington Post about Tumblr’s decision to ban explicit nudity and sex on its platform as of Dec. 17, story here.

The Post notes that Tumblr had been one of the last repositories of adult content online – I’m not sure that’s true.  On YouTube, for example, there is a lot of “soft core” gay adult video (with age verification) which stops short of full nudity and usually stops when physical intimacy might cross a certain boundary (which in one case appears to suggest shaving).  It is true that you don’t usually find full nudity; it is probably banned (as a lot of material related to weapons has been lately).  It's banned on Facebook except in certain medical contexts.  It’s also noteworthy that the closing of gay independent bookstores (because of competition online) means it isn’t as easy to find gay porn in print or on video for hardcopy purchase, as was common from the 70s into the early 2000s. In the 1980s, for example, there was a gay-owned business called the Crossroads Market on Cedar Springs in Dallas (where I lived then), which had all the nice mainstream art items and crafts but also sold porn, pre-wrapped. (There was also a friendly store cat, Gracie.)   This thrived while the AIDS epidemic, with its local political tensions, crested.

Getting back to Tumblr, there is a controversial story (by Lance Whitney) on CNET tracing Tumblr’s decision to its being banned from Apple’s App Store, because some child pornography had allegedly gotten through its filters. 

It must be emphasized that Tumblr's new policy applies even to nude content depicting adults.

There is a detailed discussion, which surprised me, of how good tech companies have gotten at screening for child pornography before any video or image is posted, without the user noticing any slowdown. It might even apply to the cloud. (It also relates to the question of arrests when tech repairmen at a Best Buy facility in Kentucky discover c.p., which they are not supposed to look for – we’ve covered that before.)

It’s true that it is now quite easy and quick to check images and videos against the digital hashes (“fingerprints”) of known images in the ever-expanding National Center for Missing and Exploited Children database.  This capacity seems to be growing rapidly. But it is probably not perfect; no filtering is.
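To make the mechanics concrete, here is a minimal Python sketch of hash-based matching against a blocklist. The blocklist entries here are hypothetical stand-ins; real clearinghouse lists like NCMEC's use specialized perceptual hashes rather than plain SHA-256, which stands in only to show why the check is fast.

```python
import hashlib

# Hypothetical blocklist of SHA-256 hex digests of known flagged files.
# (A real system would load a curated list distributed by a clearinghouse.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",  # sha256(b"foo")
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string (e.g. an uploaded file)."""
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(data: bytes, blocklist: set = KNOWN_HASHES) -> bool:
    """True if the exact bytes hash to an entry in the blocklist."""
    return sha256_hex(data) in blocklist
```

The check costs one digest plus one set lookup, which is why uploaders notice no slowdown. The weakness is that changing a single byte (re-encoding, cropping) yields a completely different digest, which is why production systems prefer perceptual hashes.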

Blogger announced in early 2015 that it would ban explicit nudity by March 21 of that year, but then relented after a popular revolt.  I suppose Blogger may have to reconsider this issue given Tumblr’s action (I do not have an account with Tumblr). Blogs with certain content are supposed to be marked “adult”, and certain videos are supposed to require logging onto a Google account to prove age 18. (I wonder about that – I can’t believe that the Science Fair teens of the world, inventing cancer tests and fusion reactors at age 14, or the Parkland activists, at first under 18, don’t have accounts. David Hogg started what looked like a run for the presidency online when he was still “f---” 17 – he wasn’t 18 yet when he gave that passionate speech in front of thousands in DC.)

Likewise, Automattic, which is so tied to hosting companies like BlueHost and GoDaddy, will have to look at this now. 
Somewhere, as we ponder all this, we have to realize that teens really do mature at very different rates.

But in the meantime we have to watch this sudden issue carefully. 

Monday, December 03, 2018

Incident where Facebook blocks a "journalistic" post about the Charlottesville trial for an offensive meme raises even more questions about lawful content and press credentials

On Sunday, I reported (on my main blog) an incident where Facebook blocked the post, and even the account access, of a Virginia journalist, Hawes Spencer, after he posted a link to his own news story, which in turn caused an image of an offensive meme created by defendant James Alex Fields to show directly in the post.

The post was eventually restored, but it leaves a troubling question: what happens when a news story includes a disturbing image for reporting purposes but there is a risk that careless readers will misconstrue the purpose of the post and act on it?

In fact, reputable and established news sites won’t reproduce some images, particularly illegal ones, most notably child pornography, even for storytelling purposes.  It would be logical to wonder if, under FOSTA, an image promoting trafficking or prostitution would be illegal to embed this way.

What’s even more troubling is that this incident again raises the question: who gets to call himself a journalist?  Would a different standard be applied to an amateur blog post than to one in an established newspaper? 

The recent controversy over Jim Acosta reminds us of this question as well: who is fully accredited as a reporter who stands outside the impulse to take sides?
I don’t have press credentials and generally don’t need them to do what I do.  But I am left wondering whether this could change.

Friday, November 02, 2018

Reviewing the 2007 COPA opinion as to "megaphone without gatekeepers"

Let me go back for a moment to the March 22, 2007 decision by Judge Lowell Reed striking down COPA, the Child Online Protection Act of 1998. 
It is noteworthy that the Third Circuit upheld his opinion on July 22, 2008 (Wired story).

On January 21, 2009, the US Supreme Court refused to hear the case (ironically, one day after Obama’s inauguration). Is there any reason to wonder if the current Supreme Court, with its more conservative makeup, would ever want to reconsider it?  For now, it is settled law.

There is a lingering question, in my mind at least, as to whether the First Amendment (in combination with the Fourteenth) automatically incorporates the right to use a megaphone to reach the entire planet without a gatekeeper controlling what gets published for other considerations, especially controlling fake news or propaganda manipulation.

The actual censorship and de-platforming is the result of the actions of large privately owned (often publicly traded) tech companies, not governments, in conjunction with cultural pressures, which can include international pressure.

Conceivably in the future bodies like ICANN might have to consider this question in a philosophical sense.

One problem is that the capability to broadcast (bypassing the practical economic supervision of the legacy trade publishing industry and literary agents) came to be viewed as part of the free speech right even though it did not exist in a practical sense until the late 1990s, when the WWW opened up to users after Congress passed Section 230 in 1996 (shielding platforms from downstream liability).  AOL opened up Hometown in October 1996 – I remember that Sunday afternoon well. 
Although the COPA opinion does seem to state that Congress cannot provide content-related restrictions on the speech itself (even “hate speech”) once a distribution method has been technologically enabled, the opinion does not preclude the possibility of restrictions on who can have this kind of enabled access based on other factors, like open financial accountability, which might be relevant to stopping fake news. 

One other historical fact is borne out -- the "Smallville Problem" -- minors really do vary in their maturity.  Look at the Parkland H.S. activists and what they have accomplished. 

Friday, October 05, 2018

Accusations against Kavanaugh do show the difficulty of dealing with events in the very distant past

The recent furor over allegations against Brett Kavanaugh, Trump’s nominee to fill Justice Kennedy’s vacancy on the Supreme Court, brings back the question of statutes of limitations.
In Maryland, felony charges have no statute of limitations. However, no one has asked Maryland law enforcement to investigate the purported acts in the way normally required by state law.  The fact that Ford was a minor in 1982 would matter, but so was Kavanaugh.

As a practical matter, it sounds very improbable that anyone could prove beyond a reasonable doubt that an act occurred in an incident so old.

Dan Morse and Erin Cox explain in the Washington Post here.

Susan Collins’s speech before the Senate on the facts is worth listening to.

The remarks are important with regard to Kavanaugh on Roe v. Wade, on gay marriage, on privacy, and on the importance of legal precedent in general. 

Nevertheless, accusations from decades ago can be very hard to refute.

Tuesday, September 04, 2018

People can find themselves served with search warrants after clicking on URL's connected to child pornography

Electronic Frontier Foundation is warning users about the possibility of being subject to a search warrant after even clicking on a URL capable of taking users to child pornography, in a sting.

The EFF press release went out on Friday, August 31, here. EFF has submitted an amicus brief, noting that users often don't know what is in the links they are clicking on (misspellings, hacks, tiny URLs).  

The case is “USA v. Bosyk” in Alexandria, VA.  

But the case seems also connected to P2P and to a practice called “rickrolling”.
Sometimes on YouTube I see softcore gay porn videos offered that might have been filmed overseas with legally underage actors. In one or two cases these videos have quickly been taken down.  

Sunday, August 05, 2018

Woodhull Sexual Freedom Summit has forum on FOSTA, recalls history of COPA

I’ll have a more detailed post on the Woodhull Foundation’s Sexual Freedom Summit forum on FOSTA/SESTA, held Saturday, Aug. 4 in Alexandria, VA, which I watched online.
The forum can be watched retrospectively now from the Woodhull Facebook page, here.  I will review it in more detail on a newer Wordpress blog soon.

I wanted to mention that the early part of the presentation gave the history of the Communications Decency Act, the irony of how Section 230 got written and passed as a counterweight, and the Supreme Court’s striking down of the censorship portions of the CDA in 1997. The speakers predicted FOSTA could have a similar course, although it's hard to say how a more conservative Supreme Court will rule.  The panel discounted the idea that the law was really intended to stop trafficking; instead it wanted to target Internet adult content and undermine individualized free speech -- and it had curious, irresistible bipartisan support from populist bases that disregarded logic.  Free speech is simply not as important to millennials (the way we usually argue it) as it was to previous generations. There is a curious, inconsistent communitarianism.  
It also gave a brief history of COPA, the Child Online Protection Act of 1998, describing how it made two trips to the Supreme Court (in 2002 and 2004) before it was finally overturned in a bench trial in Philadelphia in late 2006 (I attended one day), with the ruling in March 2007.

There was mention of BuffNET, an old case that gives some clues as to how platforms must deal with c.p. when Section 230 would not protect them. 
Stay tuned.

Sunday, July 22, 2018

P2P provides opportunities for law enforcement stings for C.P.

Saturday, at Shenandoah Valley Gay Pride, inside a restaurant called Artful Dodger on Court Square in downtown Harrisonburg, VA, I was seated in a small lounge area where there were some newspapers on a small table.  One small local newspaper was open to a story about a local man who had been arrested for child pornography found on his home computer, about thirty images, simply when an undercover state police officer discovered them through a P2P connection.

I found it interesting that the printed article seemed to have been read and noticed by several people.  That’s a good thing.

People have used P2P for years (I don’t), but it may become even more popular soon as companies promote “the distributed web” and blockchain use.
However, users need to be aware of this.  C.p. can also be detected in cloud backups, email attachments, and image or video uploads by automated hash-matching screening.  And hash-matching technology is rapidly developing.

Tuesday, July 17, 2018

It's well to review the CPPA and its replacement, the Protect Act of 2003

I generally try to keep up with news that happens in my own court, even small incidents that stay relatively private. That remains so even when the media is so filled with sensational international affairs.
This is a good time to review the CPPA of 1996 (presented here Aug. 17, 2015), which would have made it a crime to put simulated c.p. online even when there is no actual minor.  It was struck down as unconstitutional in 2002 (after considerable outcry in the artistic community on First Amendment grounds), but replaced by the “Protect Act” of 2003 under the George W. Bush administration. Simulated material can still be illegal when there is explicit sex shown, and/or when the item is legally obscene or lacks legitimate value.  Here is the Wikipedia reference (look for section 1466A).

In many countries overseas, simulated or hand-drawn c.p. is illegal.  People who have posted images or video that are legal in the U.S. should bear this in mind if they travel overseas and their content is available in the country they visit.

Ironically, possession of c.p. is legal in Russia (CNN story ).  That’s even more surprising given the tone of the 2013 “anti-gay propaganda law” and the Russian idea (even espoused by Putin) that homosexuality is connected to pedophilia.

Wikipedia has a link for the DOJ “Protect Our Children” banner, which was used against Backpage; the site was seized before FOSTA became law.

Saturday, July 14, 2018

Police use K9 dogs to look for digitally stored child pornography (NBC story)

Scott MacFarlane et al have a story on NBC Washington about the use of K-9 dogs in police searches of homes or offices to look for child pornography.
The dogs can detect the scent of thumb drives, SD cards, and the like, which could be hidden away from possible police searches. 

Of course, people may hide thumb drives or put them in safe deposit boxes as part of completely legitimate home security concerns.
A more important development might be the little-reported practice of scanning email attachments (even when someone sends an email to self to move it to a different computer) and even cloud storage for hash-matched child porn images checked against law enforcement databases.  This has resulted in spotty arrests (at least one in Houston, one in southern Maryland).
You wonder if this practice could extend to other illegal behavior, like even storing copyrighted images without publishing them.

Thursday, July 05, 2018

"Child Protection and the Limits of Censorship" paper by Prostasia Foundation urges moderation in platform enforcement of TOS standards

The Prostasia Foundation has an important article, “Child Protection and the Limits of Censorship”, on Medium, by Jeremy Malcolm.

The article discusses the downstream liability issue in general for service platforms, and explains clearly (near the end) why the foundation opposed FOSTA, the Backpage-inspired bill (now under a lawsuit filed June 29 by EFF), as going after parties distant from real sex trafficking.

It discusses Microsoft’s PhotoDNA system, as to its future implementation by platforms to prevent copyright infringement (it mentions the EU Copyright Directive and the Article 13 issue, which thankfully was tabled this morning, at least for a while) and to detect possible child abuse. It also discusses the problem of image hash lists (or watermarks) at a higher level.

It discusses the termination of user accounts when users may have inadvertently handled images flagged by NCMEC; apparently not all of these are technically illegal under child pornography laws.
It also gives other examples of overzealous enforcement of “grey” areas shutting down many users, such as a case with Wikipedia in Britain.

Monday, July 02, 2018

VPN Mentor offers parents' guide for Internet use

A site called VPNMentor contacted me and provided this link to “The Ultimate Parent Guide for Protecting Your Kid on the Internet”, here.
The most obviously important sections are social media (4) and cyberbullying (5).
This may lead to a bigger post on my Wordpress sites later.

Wednesday, June 20, 2018

Prostasia Foundation weighs in on Twitter on aggressive screening for c.p. in cloud and Internet posts

I’ll mention the “Prostasia Foundation”, which has a crowdfunding campaign (on the YouCaring platform) now, on this site.

Today the group answered a tweet concerning the practice of screening images in the cloud (in addition to social media posts) for possible matches to known hashed images in the National Center for Missing and Exploited Children database, which has sometimes resulted in arrests.
Link for their tweet is here.  One observation worth noting is that this mechanism probably would not catch derivative copies (screenshots or re-encodings) of the original images. 
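That limitation is why perceptual hashing exists. Below is a minimal sketch of the public “average hash” technique – much simpler than Microsoft’s proprietary PhotoDNA, but illustrating the same idea – showing how a slightly altered copy defeats an exact byte-level hash while barely moving a perceptual one. The 8x8 “images” are made-up data.

```python
import hashlib

def average_hash(pixels):
    """64-bit average hash of an 8x8 grayscale image (values 0-255):
    each bit is 1 when that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (0 = same picture)."""
    return bin(a ^ b).count("1")

# A made-up 8x8 grayscale gradient, and a "re-encoded" copy brightened by 3.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
brightened = [[min(p + 3, 255) for p in row] for row in original]

# Every byte changed, so the exact hashes no longer match...
exact_match = (hashlib.sha256(bytes(sum(original, []))).digest()
               == hashlib.sha256(bytes(sum(brightened, []))).digest())
# ...but the perceptual hashes stay close (here, identical).
distance = hamming(average_hash(original), average_hash(brightened))
```

Real perceptual hashes tolerate resizing and recompression too, so matching uses a distance threshold rather than exact equality; even then, a screenshot of a screen, heavy cropping, or mirroring can push a derivative outside the threshold, which is consistent with the group's observation.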

Sunday, June 10, 2018

FOSTA consequences seem quiet right now but could erupt with a slippery slope for platforms; the "harmful to minors" concept seems expandable

Two months after Trump signed FOSTA into law, the news on it seems to dwindle, but a few contradictory pieces stand out.

A piece by Elizabeth Nolan Brown on 3-23-2018 in Daily Beast still warns that the law could end the web as we know it, primarily because of the “slippery slope” problem.

In the third paragraph from the end, Brown makes the comparison to liability for promoting terrorism or weapons use, and many platforms have become proactive on that point.

In fact, the whole “harmful to minors” concept makes more sense morally if it does include the issue of promoting violence or weapons use.

We’re facing a world in which some young adults go off the rails and social media may be a major influence.  This may be as critical as the actual availability of weapons (which is arguably too easy, but that’s beyond this particular blog).

Back in February, Inside Sources had reported that FOSTA really is narrow enough for startups to deal with (better than SESTA, which it admits is too vague), but it’s not clear if the last-minute changes to FOSTA added too much ambiguity, as EFF claims.

An article in The Hill (conservative) in late March, upping the ante on the importance of stopping trafficking as a policy priority, claimed that cloud monitoring tools are readily available on the web even for startups.  This sounds optimistic.

But we haven’t yet heard of major instances of mass censorship outside the area of personal hookups (the actions of Craigslist, Reddit, and a few other companies).  Still, downstream liability creep could indeed eventually determine who can easily speak on the web of the future.

An older article in Medium in January had tried to compare what FOSTA and SESTA purported to do legally.  This may be out of date. 

Saturday, May 19, 2018

Is it possible for visitors inadvertently loading or watching an illegal video to "possess" c.p. in the eyes of the law?

Recently I’ve indulged in watching some gay videos on YouTube.  I do notice that the captions for some of them mention “boys” or “teens”, and I hope (generously) that wording means ages 18 or 19. These videos usually have higher viewing counts.  But I spotted at least one video tagline in the suggested list that had “pre-pubescent” in a title, and that would imply child pornography.  That particular video listed a very low view count and might have just been posted.  YouTube might soon remove it with its own monitoring procedures.

There are two issues:  the legality of viewing material according to the age of the viewer (or terms of service rules), and the legality of the content itself – whether it might be criminal for anyone to view or “possess” it.
Google explains its age verification policy here.

Apparently you have to be 18 to view an age restricted video (even if embedded) or at least 13 to have an account.
However, a video showing nudity or explicit intimacy with an actor under 18 at the time of filming might fit the legal definition of child pornography.  When a video is made with an actor whose appearance suggests the likelihood of being under 18, in the US and most western countries some sort of legal record of the actor’s age is required.  It is conceivable that a video made in a non-western country would be more likely to have used actors who are underage by western law, posing a legal risk to viewers (below). 
The main legislation in the US would be the Child Protection and Obscenity Enforcement Act of 1988.  There is more information here about adult film industry regulations. 
The concern would be that a visitor who views a video that he/she could reasonably suspect was illegal (according to the laws requiring actors to be 18 or older) might later be found to have possessed c.p.  It is unclear how easily government agencies (or NCMEC) could detect such viewing or whether government would really want to prosecute.  Some states (like Arizona) seem more aggressive than others.  

Viewing behind TOR or similar platforms might evade detection during viewing (but not necessarily after the fact).  It's unclear whether HTTPS encryption alone would do so.

Since YouTube videos are often UGC (user-generated content), there is no guarantee, from the viewer's knowledge, that the actors were age verified.  But Section 230 would not protect the platform from downstream liability in c.p. cases (and now trafficking has been folded in to the liability risk with FOSTA), so users might be entitled to the belief that the platforms take some reasonable care to remove what they reasonably suspect is c.p. or illegal, even without age verification of actors presented in all cases. 
But obviously videos whose titles brag about illegality should be removed by services like YouTube and Vimeo, after detection by automated monitoring if possible.

Monday, May 07, 2018

Cloud Act could be used to screen unposted, private data for illegal behavior, even given the 4th Amendment

This piece by David Ruiz of the Electronic Frontier Foundation on the Cloud Act and the Fourth Amendment does bear attention. 

The law would allow foreign law enforcement partners to share data they detect in cloud backups with the US. 

It’s pretty clear that cloud backups will be scanned more often in the future for child pornography and now sex trafficking content, which might be a tool to reduce downstream liability exposure due to FOSTA (although this would go further, into private backups that the user didn’t intend to post).
It would also seem possible to scan them for suspicious behavior or interests of various kinds.  Although a long way down the pike, it sounds feasible to scan for low-level copyright infringement – merely illegally saving personal copies of material that was supposed to be purchased.   

Wednesday, April 11, 2018

Activist groups file complaint that YouTube has violated COPPA

Some activist groups have claimed that YouTube is collecting data from users under 13, putatively in violation of the 1998 Children’s Online Privacy Protection Act, or COPPA, as in this CNN story. The CNN story reports that YouTube may be picking up both minors’ and parents’ data when children sign on. 
Over 20 groups have filed a complaint with the FTC.  The story maintains that the groups want YouTube to be able to distinguish kids’ data from parents’.

In December, CNN had reported that YouTube would hire 10,000 people to “clean up YouTube”.

Sunday, March 18, 2018

Some states want to put mandatory filters on all Internet devices sold in their states with a registry of those who unblock; Hollywood seems to up threats on all user content

Wired has an important story on states considering their own variations of the Human Trafficking and Child Exploitation Prevention Act (HTCEPA), which would require every device sold in their states to have porn filters!  Right now, the dishonor roll includes Rhode Island, South Carolina, and Texas. Similar bills are being considered in the UK.  (I couldn't find this act in Wikipedia.) 
Users could pay a fee to remove the filters but then the states would have a registry of users who had.
Wired's story is by Louise Matsakis, who traces the laws back to the original Communications Decency Act of 1996, largely overturned, and the Child Online Protection Act (COPA), which has been the main subject of this particular blog.  The filter issue came up in the COPA litigation, particularly at the trial I covered here in 2006.

Ironically, it is Section 230 of the original 1996 law that survived and is now threatened by FOSTA and SESTA, as covered here before.  This article mentions these, and notes the lack of distinction between consensual adult sex work and trafficking (which often involves minors).  The article’s comments on these are a little overbroad (it affects more than “social media sites”).

Recently, Congress has gotten some letters actually supportive of SESTA/FOSTA from Hollywood (Fox) and some parts of tech (Oracle).  I’ll get into these again, but there is a disturbing undertone to these letters: there is no reason users should be able to post to the whole planet at will, without gatekeepers, unless they give something back (like helping fight trafficking, or volunteering in some intersectional way).  That really isn’t what Senator Portman thinks SESTA says; he still says it is narrow.

Saturday, March 10, 2018

Legally dangerous tweet from Uganda circulating; (don't forward it)

A few weeks ago WJLA7 warned viewers about a child pornography video circulating on Facebook, and that it could be a crime to share it.  The video has surely been removed.

But today I saw in my Twitter feed a post whose title seemed to hint at c.p. filmed in Uganda.  The image in the video showed minimal dress but no nudity.  I simply ignored it as it passed out of sight, but I realized I could have (with a little more presence of mind) reported it and unfollowed the sender. 

Presumably it could be a crime to retweet such a post.  Just a warning or a tip. 

I don’t recall that this has happened in my own feed before.  Let’s hope someone reports it and that Twitter gets rid of it quickly. 

Thursday, March 08, 2018

Geek Squad appears to be working undercover with FBI in a cozy relationship at a repair center to nab child pornography possession

The Electronic Frontier Foundation, in a disturbing article by Aaron Mackey, reports that there was more collusion between a Geek Squad repair center in Kentucky and the FBI looking for child pornography, than had been thought.

Some employees seem to have gone out of their way to look for images in unallocated space that the customer thought had been deleted.

There are tools that can detect the digital hashes of known images identified by NCMEC. But it is hard to imagine how one could find a “needle in a haystack” otherwise.

It would sound plausible to do this also with sex trafficking in the future (related to the FOSTA-SESTA debate).
In August 2014, I had a large Toshiba laptop crash with a burned out motherboard from overheating when trying to upgrade from Windows 8.0 to 8.1, as repeatedly prompted by Microsoft.  The computer was sent to the repair center and was there for six weeks before we gave up on it (the store said “Tennessee”).  I had to replace it and apply the service plan warranty to that replacement.

Update: March 11

A further report on admissions of payments to Geek Squad members.  

Tuesday, March 06, 2018

The question of porn, pirated and placed out of context -- reminds me of the COPA trial in 2007

Stoya has an op-ed in the New York Times Monday, March 5, 2018, p. A27, “Can there be good porn?”
She discusses how adult content needs to be put in context – a discussion that I remember well from the COPA litigation of more than a decade ago. But then, she says, it gets pirated and posted “for free” on YouTube, where kids discover it out of context.
But even when found in its original location, many people will not bother to read the context.

Saturday, March 03, 2018

FOSTA passage and the "should have known" standard

Note well the Wall Street Journal editorial “Political Sex-Trafficking Exploitation”, with the subtitle “Fast-moving legislation could open the web to a lawsuit bonanza”, link.
WSJ admits that Backpage could have been prosecuted under existing law, which already excludes from downstream liability protection criminal activity that websites or service providers know about. A judge in Boston will let another case go forward.
But the rub is with the “should have known” idea, already raised by EFF.

Update: March 22

Senate has passed FOSTA unchanged and the president has signed.  See my BillBoushka blog today. 

Wednesday, February 28, 2018

House passes sex trafficking bill reducing downstream liability protections under Section 230 (Backpage case)

The House late Tuesday passed the “FOSTA” bill, HR 1865, based on the problems with sex-trafficking on the web.

The Wall Street Journal has the best account right now, by John D. McKinnon.  I have links to other accounts on my main blog, as well as on Wordpress where there are many more details. 

The bill seems to have been amended at the last minute to narrow the Section 230 protections of services (websites, social media sites, and possibly hosting companies – the last is not clear) when their users engage in promoting large-scale prostitution or any sex trafficking.  What is unclear is what the legal standard would be for how a service would know this is going on, because it cannot prescreen all content.  Congress obviously believes it is targeting “classified ads” sites known for selling sex ads.  There is some question as to what the “reckless disregard” language means. 

Tuesday, February 20, 2018

Reposting school threats sent to a student could be crime (WJLA-Sinclair warning)

WJLA7, the Sinclair-owned station in Washington DC, is warning parents that students who repost threats they receive against schools could face criminal charges or at least expulsion from school.  Reposting such messages could be treated very much like reposting child pornography (for which there was a scare a week ago).
Some attention to this possibility comes from a story about a Snapchat post at a different Florida school, reported here.  This is even more ironic because Snapchat posts are supposed to disappear. 

Sunday, February 18, 2018

Utah wants to pass a civil liability version of COPA

The Sunday Washington Post has a story by Matthew La Plante, p. A11, maintaining that politicians in Utah and several other states want to frame porn addiction as a public health crisis.  Utah, in particular, is considering a law allowing “victims” to sue publishers for adult content viewed accidentally by children. 
Is this COPA all over again?  Remember the COPA trial a decade ago, and the debate on filters and adult-id? 

Update:  Feb. 22

Florida's legislature passed a resolution declaring pornography – but not military-style weapons in the hands of a deranged person – a public health risk. Holly Yan of CNN reports, with a video interview. 

Tuesday, February 13, 2018

Hosting providers use security companies to scan for illegal content which could be placed by foreign hackers

In conjunction with a recent Cato Institute briefing on the unintended overreach of some state sex offender registry laws, it’s worth noting that hosting providers are now using software to scan customer sites for malware, and this could include illegal content like child pornography (identifiable with digital hashes), which could be placed particularly by foreign enemies as a kind of subtle terrorism or attack on American legal and democratic values.  SiteLock is one of the companies used to scan sites.  Older sites – those not updated or accessed often, or with unused, unneeded capacity (email accounts, for example) – could start to develop a risk. 

The topic came up at the end of the session in an audience question. There was a comment from the panel that “mens rea” is more likely to be useful in defense today when a person’s property is hacked (by an enemy) than it would have been fifteen years ago, when law enforcement and the court systems did not understand the Internet as well.
This problem could be related to issues reported before, when computers are infected with illegal content discovered by repair technicians.

Saturday, February 03, 2018

Arrest of a substitute teacher in Maryland shows the practical risks of Internet abuse; sudden warning about a Facebook video

A substitute teacher in Charles County, Maryland, about 30 miles SE of Washington DC, was charged with showing sexually explicit materials to a minor and with child pornography possession, under state charges, after having worked only 34 days.  A student reported his texting of another student; WTOP story here.
But the incident shows that, despite fingerprint background checks, it is very difficult for school systems to vet substitute teachers well before hire.  They can look at social media, but this may run into First Amendment concerns with public employees.  Still, principals at individual schools can ban substitutes under any suspicion whatsoever, and most school systems have “three strike” rules.
Just as I was typing this story, I learned of a c.p. video circulating on Facebook, from station WJLA7 in Washington, link.  It may have originated in Alabama, but was reported to Sinclair news in Cincinnati first. Resharing it is illegal and could lead to prosecution under federal law. But Facebook is likely to have removed the video before it gets very far. The WJLA story notes that resending such a video for "journalistic" purposes would not prevent criminal prosecution. That statement could, by analogy, raise serious questions about citizen journalism in other areas, like terrorism.  But possession of a c.p. image is itself a crime; possession of bomb-making instructions is not, although possession of the actual materials might be.    

Tuesday, January 09, 2018

Tech companies soften opposition to SESTA as liability language is narrowed

Lost in all the attention to net neutrality recently is the progress of SESTA and related House bills to weaken Section 230 protections for services that host sex trafficking ads.

On November 7, 2017, the Washington Post had reported (Tom Jackman) that major tech companies were dropping objections to SESTA since the Senate seemed willing to narrow the language that could lead to downstream liability.
The language about “knowingly” allowing trafficking content was narrowed to “participation in a venture”, but there is also a provision about “knowingly assisting, supporting or facilitating a violation of sex trafficking laws”, and the bill removes the language “by any means”.