Monday, February 04, 2019
Maybe you do need a VPN to protect yourself from unexpected government spying on the "context" of your browsing habits?
ThioJoe has a video on why you need a VPN (or else, his title says).
He tells a story from July 2017, when a teenager (younger than 18) made a meme showing Trump getting into a fight, putting a brief "deepfake"-style edit on top of a short CNN video clip.
CNN tracked down the user's IP address, contacted the teen, demanded an apology, and threatened to "dox" him.
What I wonder about is browsing. The previous post referred to a government obtaining a search warrant for a user who allegedly attempted to connect to a site or address thought to host child pornography. There could be an ancillary follow-up: a prosecutor could then look at a pattern of video watching, of images that are technically legal (or posted for special purposes in another context) but possibly viewed for fantasy satisfaction, which might become a legal problem. YouTube will sometimes warn viewers when they go directly to private videos embedded in web pages or blog posts.
Thio also talks about “deep packet inspection” as a possible tool for law enforcement.
Tuesday, January 29, 2019
On January 31 the Electronic Frontier Foundation is arguing a particularly disturbing case before the Fourth Circuit in Richmond.
A search warrant was obtained for a user's home and computer(s) or device(s) after the user had connected to a URL for a file-sharing link that apparently was closely associated with child pornography.
EFF argues that this does not constitute probable cause.
All modern web hosting services keep logs that enable investigators to find every IP address that ever requested any element on a site. Google no longer allows search queries to be displayed (it had back in 2005). Theoretically, when a site is identified by NCMEC or a similar facility, every user who attempted to connect could be found and investigated.
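To make the logging point concrete, here is a minimal sketch of how standard server access logs expose this information. The log lines and the URL path below are invented for illustration; real investigations would work with the hosting provider's actual logs.

```python
import re

# Parse Apache/Nginx-style "combined" access-log lines and collect every
# client IP address that ever requested a given URL path.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<when>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

def ips_for_path(log_lines, target_path):
    """Return the set of client IPs that requested target_path."""
    hits = set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and m.group("path") == target_path:
            hits.add(m.group("ip"))
    return hits

# Made-up example log data (documentation-reserved IP ranges).
sample = [
    '203.0.113.5 - - [04/Feb/2019:10:00:00 +0000] "GET /files/abc HTTP/1.1" 200 512',
    '198.51.100.7 - - [04/Feb/2019:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 1024',
    '203.0.113.9 - - [04/Feb/2019:10:02:00 +0000] "GET /files/abc HTTP/1.1" 404 0',
]
print(sorted(ips_for_path(sample, "/files/abc")))
```

Note that even the 404 request is logged, which matches the concern above: an attempted connection leaves a trace whether or not the content was ever delivered.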
If an illegal image is loaded into a cache on the end user's computer and the user knows that the site has illegal material, a criminal violation for possession is possible. (At one time, some states like Arizona even had "strict liability" in these circumstances.) This might happen when an email preview loads, which makes phishing at least a remote risk of being "framed" for any user.
It would also be possible to land on a site containing illegal images but not on a specific element that has such an image, so an investigation would find no illegal material.
In ambiguous cases, dangerous plea bargains could be possible, including agreeing to give up Internet use. So this situation sets up an invitation to frame and silence political enemies.
One possibility I have wondered about concerns YouTube videos whose titles claim "boys" or "teens" in sexual activity. YouTube has become stricter in recent months about community guidelines (as have other services like Tumblr). Presumably an end user has a "right" to assume that age 18 or 19 is possible and intended. But some of these videos come from overseas and might be illegal to view in the U.S. Could a prosecution happen in these circumstances?
The law has to draw a line somewhere on age. In practice teens and young adults vary enormously on maturity at any given age.
Sunday, January 20, 2019
Someone sent me a link to a product called “ConsumersBase” for parents to use to protect their kids from inappropriate content online.
The product protects against cyber bullying, malware, inappropriate content, phishing, sexual predators, and what is most attention-getting, grooming.
This should not be confused with “consumer base exact data”.
Sunday, January 06, 2019
Vigilante groups with no law enforcement authority entrap people on the Internet for interest in minors and shame them publicly
Brandy Zadrozny writes for NBC a story about vigilante groups that function much like Chris Hansen’s “To Catch a Predator” in the 2000s.
The groups are apparently particularly effective at entrapping young gay men who answer dark-web or even social-media ads purportedly placed by minor gay men.
The result is often a public shaming online, although you would think platforms would take these down.
But the groups do not have the authority of law enforcement and might be guilty of impersonating law enforcement.
Wednesday, January 02, 2019
The Washington Times, admittedly a "conservative" daily newspaper for the DC area, opens the first business day of 2019 with a front-page article by Jeff Mordock on the use of the "Darknet" (Dark Web) and Tor for child pornography.
But the prosecutions and investigations described in this article are particularly shocking, one even involving an unborn child. I’ll leave the details to the article.
The Electronic Frontier Foundation and the libertarian community as a whole have encouraged people to learn to use Tor, especially in non-democratic countries.
Thursday, December 27, 2018
The Washington Post offered a significant editorial Thursday that may affect the downstream-liability question: "Lessons from Google: Legislators can learn from a complaint about the marketing of apps for children" (link).
The significant issue is that Google had some downstream liability protection from app developers who violated COPPA, the Children's Online Privacy Protection Act of 1998; the liability reverts to the developers. But calls to change this run parallel to calls to weaken Section 230 in other areas, as happened recently with FOSTA.
Tuesday, December 04, 2018
Tumblr's ban on nudity (of adults) seems based on filtering issues, and the policy may spread quickly to most other platforms
Late on Monday Eli Rosenberg wrote in the Washington Post about Tumblr’s decision to ban explicit nudity and sex on its platform as of Dec. 17, story here.
The Post notes that Tumblr had been one of the last repositories of adult content online; I'm not sure that's true. On YouTube, for example, there is a lot of "soft core" gay adult video (behind age verification) that stops short of full nudity and usually stops when physical intimacy might cross a certain boundary (which in one case appears to suggest shaving). It is true that you don't usually find full nudity; it is probably banned (as was a lot of material related to weapons lately), and it is banned on Facebook except in certain medical contexts.

It's also noteworthy that the closing of gay independent bookstores (because of online competition) means it isn't as easy to find gay porn in print or on video for hardcopy purchase, as was common from the 1970s into the early 2000s. In the 1980s, for example, there was a gay-owned business called the Crossroads Market on Cedar Springs in Dallas (where I lived then) that had all the nice mainstream art items and crafts but also sold porn, pre-wrapped. (There was also a friendly store cat, Gracie.) This thrived while the AIDS epidemic, with its local political tensions, crested.
Getting back to Tumblr, there is a controversial story (by Lance Whitney) on CNET tracing Tumblr's decision to its being banned from Apple's App Store after some child pornography allegedly got through its filters.
It must be emphasized that Tumblr's new policy would apply to adult nude content. There is a detailed discussion, which surprised me, of how good tech companies have become at screening for child pornography before any video or image is posted, without the user noticing any slowdown. It might even apply to the cloud. (It also relates to the question of arrests when tech repairmen at a Best Buy facility in Kentucky discovered c.p., which they are not supposed to look for; we've covered that before.)
It's true that it is now quite easy and quick to check images and videos against digital hashes ("fingerprints") of known images in the ever-expanding National Center for Missing and Exploited Children database. This capacity seems to be growing rapidly, but it is probably not perfect; no filtering is.
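The basic idea of hash matching can be sketched in a few lines. Note this is a deliberately simplified illustration: real systems (such as Microsoft's PhotoDNA, used with the NCMEC database) rely on perceptual hashes that survive resizing and re-encoding, whereas the exact cryptographic hash below only catches byte-identical copies. The blocklist contents here are invented.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Exact cryptographic fingerprint of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist of known-bad file hashes (for illustration only;
# a real service would sync hashes from a clearinghouse database).
KNOWN_BAD = {sha256_of(b"example-known-bad-file")}

def is_flagged(upload: bytes) -> bool:
    """Check an uploaded file against the blocklist before publishing it."""
    return sha256_of(upload) in KNOWN_BAD

print(is_flagged(b"example-known-bad-file"))  # exact copy -> True
print(is_flagged(b"a harmless photo"))        # unknown file -> False
```

Because the check is a single set lookup on a short hash, it adds essentially no delay at upload time, which fits the observation above that users notice no slowdown. The imperfection also follows: any system, exact or perceptual, can only flag what resembles something already in the database.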
Blogger announced in early 2015 that it would ban explicit nudity by March 21 of that year, but then relented after a popular revolt. I suppose Blogger could have to reconsider the issue given Tumblr's action (I do not have an account with Tumblr). Blogs with certain content are supposed to be marked "adult," and certain videos are supposed to require logging into a Google account to prove age 18. I wonder about that; I can't believe that the Science Fair teens of the world (inventing cancer tests and fusion reactors at age 14), or the Parkland activists, at first under 18, don't have accounts. David Hogg started what looked like a run for the presidency online when he was still "f---" 17; he wasn't 18 yet when he gave that passionate speech in front of thousands in DC.
Likewise Automattic, which is closely tied to hosting companies like BlueHost and GoDaddy, will have to look at this now.
Somewhere, as we ponder all this, we have to realize that teens really do mature at very different rates.
But in the meantime we have to watch this fast-developing issue carefully.