Thursday, December 27, 2018
The Washington Post offered a significant editorial Thursday that may affect the downstream liability question: “Lessons from Google: Legislators can learn from a complaint about the marketing of apps for children.”
The key issue is that Google had some downstream liability protection from app developers if they violated COPPA, the Children’s Online Privacy Protection Act of 1998; the liability reverts to the developers. But calls to change this run parallel to calls to weaken Section 230 in other areas, as happened recently with FOSTA.
Tuesday, December 04, 2018
Tumblr's ban on nudity (of adults) seems based on filtering issues which may spread to most other platforms quickly
Late on Monday, Eli Rosenberg reported in the Washington Post on Tumblr’s decision to ban explicit nudity and sex on its platform as of Dec. 17.
The Post notes that Tumblr had been one of the last repositories of adult content online – I am not sure that is true. On YouTube, for example, there is a lot of “soft core” gay adult video (behind age verification) that stops short of full nudity and usually cuts away when physical intimacy might cross a certain boundary. It is true that you don’t usually find full nudity there; it is probably banned (as a lot of material related to weapons has been lately). It is banned on Facebook except in certain medical contexts.

It is also noteworthy that the closing of gay independent bookstores (because of online competition) means it isn’t as easy to find gay porn in print or on video for hardcopy purchase, as was common from the 1970s into the early 2000s. In the 1980s, for example, there was a gay-owned business called the Crossroads Market on Cedar Springs in Dallas (where I lived then) that carried all the nice mainstream art items and crafts but also sold porn, pre-wrapped. (There was also a friendly store cat, Gracie.) The store thrived while the AIDS epidemic, with its local political tensions, crested.
Getting back to Tumblr, there is a controversial story (by Lance Whitney) on CNET tracing Tumblr’s decision to its being banned from Apple’s App Store, because some child pornography had allegedly gotten through its filters.
It must be emphasized that Tumblr's new policy applies to nude content of adults.
There is a detailed discussion, which surprised me, of how good tech companies have gotten at screening for child pornography before any video or image is posted, without the user noticing any slowdown. It might even apply to cloud storage. (It also relates to the question of arrests after technicians at a Best Buy repair facility in Kentucky discovered such material, which they are not supposed to look for – we’ve covered that before.)
It is now quite easy and quick to check images and videos against the digital hashes of known images in the ever-expanding National Center for Missing and Exploited Children database. This capacity seems to be growing rapidly. But it is probably not perfect; no filtering is.
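The matching described above can be sketched in a few lines. This is a minimal, hypothetical illustration: real systems use proprietary perceptual hashes (such as Microsoft's PhotoDNA), which also catch re-encoded or lightly edited copies, while the plain SHA-256 used here only catches byte-identical files. The `KNOWN_HASHES` set is an invented stand-in for the NCMEC database.

```python
import hashlib

# Hypothetical stand-in for a database of known-image hashes.
# (The entry below is simply the SHA-256 digest of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_image(data: bytes) -> bool:
    """Return True if the uploaded bytes hash to a known entry."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# An upload pipeline would run this check before publishing anything:
print(is_known_image(b"test"))   # matches the entry above
print(is_known_image(b"other"))  # unknown bytes
```

Because a hash lookup is a constant-time set membership test, this kind of pre-publication screening is cheap enough that the user notices no slowdown, which is consistent with the discussion above.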
Blogger announced in early 2015 that it would ban explicit nudity by March 21 of that year, but then relented after a popular revolt. I suppose Blogger may have to reconsider this issue given Tumblr’s action (I do not have a Tumblr account). Blogs with certain content are supposed to be marked “adult,” and certain videos are supposed to require logging into a Google account to prove age 18. I wonder about that requirement, though – I can’t believe that the science-fair teens of the world (inventing cancer tests and fusion reactors at age 14), or the Parkland activists, at first under 18, don’t have accounts. David Hogg started what looked like an online run for the presidency while he was still 17; he wasn’t yet 18 when he gave that passionate speech in front of thousands in DC.
Likewise, Automattic, whose WordPress software is so tied to hosting companies like Bluehost and GoDaddy, will have to look at this now.
Somewhere, as we ponder all this, we have to realize that teens really do mature at very different rates.
But in the meantime we have to watch this sudden issue carefully.
Monday, December 03, 2018
Incident where Facebook blocks a "journalistic" post about the Charlottesville trial for an offensive meme raises even more questions about lawful content and press credentials
On Sunday, I reported (on my main blog) an incident in which Facebook blocked a post by, and even the account access of, a Virginia journalist, Hawes Spencer, after he posted a link to his own news story, which in turn caused an image of an offensive meme created by defendant James Alex Fields to display directly in the post.
The post was eventually restored, but it leaves a troubling question: what happens when a news story reproduces a disturbing image for reporting purposes, but there is a risk that careless readers will misconstrue the purpose of the post and act on it?
In fact, reputable and established news sites won’t reproduce some images, particularly illegal ones, most notably child pornography, even for storytelling purposes. It would be logical to wonder if, under FOSTA, an image promoting trafficking or prostitution would be illegal to embed this way.
What is even more troubling is that this incident again raises the question: who gets to call himself a journalist? Would a different standard be applied to an amateur blog post than to one in an established newspaper?
The recent controversy over Jim Acosta also reminds us of this question: who is fully accredited as a reporter standing apart from the impulse to take sides?
I don’t have press credentials and generally don’t need them to do what I do. But I am left wondering whether this could change.