Wednesday, December 18, 2013

Center for Digital Democracy challenges "self-regulation" under COPPA over problems at Marvel, Sanrio

The Center for Digital Democracy has filed complaints with the Federal Trade Commission about the information collection practices of both “Marvel Kids” and Sanrio, which are said to violate recent FTC rules implementing COPPA, the Children’s Online Privacy Protection Act. Slate has a detailed blog posting about the case here.
In general, the companies are said to be enticing children to share information without seeking required parental permissions.  Slate gives some discussion of the Hello Kitty Carnival from Sanrio.
Slate linked to the Digital Democracy complaint here, and also discusses the Children’s Advertising Review Unit. 
COPPA is said to have a de facto “safe harbor” provision that encourages self-regulation.  It is also supposed to apply to sites that intentionally market to children or that know minors are providing information online.  It is not completely clear that it could not also affect a site intended for the “general public”. There was some discussion of this aspect of COPPA here on Jan. 1, 2013. 
This situation can test whether “self-regulation” within the child-site industry can keep problems from spreading. 

Thursday, December 05, 2013

UVa dean arrested on c.p. viewing and possession charges, showing how insidious the problem gets

The Associated Press and Washington television station WJLA report that an associate dean at the University of Virginia in Charlottesville was arrested for viewing child pornography, apparently at least partly from home, between January and October 2013.  It appears that some of the activity occurred through a P2P network. It’s not clear whether this was a sting, but the FBI and local or state police departments do set up stings to see who will visit sites they know to be illegal. 
Again, this case heightens the risk to individuals who visit such sites and then rationalize their behavior because nothing seems to happen for months.  They could also incur civil liability to each victim (previous post).  WJLA has the AP story here. A problem that should be explored is whether this happens with other people's WiFi routers.  

Tuesday, December 03, 2013

Civil liability exists to individual minor victims for possession or viewing of c.p.

The New York Times has a story on p. A12 Tuesday about a case before the Supreme Court to decide whether persons who viewed child pornography bear civil liability to the specific minor involved, as well as criminal liability for possession.  I had not heard that such civil liability for viewing or possession can exist.  The link for the story is here.

Civil liability for possession (even without publication or distribution) is well known from lawsuits against alleged P2P downloaders of copyrighted material. 
Victims often receive repeated letters from attorneys advising them of new individuals who may owe them damages. 

Monday, November 25, 2013

Microsoft helps law enforcement remove c.p. images from web with new digital watermarking technology

Several major media outlets, especially CNN Money, report an increased effort by Google to remove child pornography from search engine results, and a partnership with Microsoft, which will use new digital watermarking technology to tag images and videos so that they can then be removed from the web.

The National Center for Missing and Exploited Children in Alexandria also uses this technology, in conjunction with law enforcement, to locate offending images.
The CNN story is here.
Digital watermarking had been proposed in the past by groups (like ICRA) to rate Internet content for suitability for minors, but that idea has never really caught on, as reported here before.  
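The “watermarking” these reports describe is generally implemented as image fingerprinting: a known illegal image is reduced to a fixed-length hash, and new uploads are checked against a shared list of hashes. A rough, hypothetical sketch of the lookup structure follows; the digest list and function names are made up for illustration, and real systems (such as Microsoft’s PhotoDNA) use robust perceptual hashes rather than the exact SHA-256 shown here:

```python
import hashlib

# Hypothetical fingerprint list of known images (hex digests).
# Real clearinghouses distribute such lists to providers; this value
# is simply the SHA-256 of the bytes b"test", used as a placeholder.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length fingerprint (exact SHA-256 here)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    """Check an upload against the fingerprint list, with no human viewing it."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(is_known_image(b"test"))   # matches the placeholder digest
print(is_known_image(b"other"))  # no match
```

An exact hash changes if an image is re-encoded or resized even slightly, which is why production systems use perceptual (“robust”) hashing instead; this sketch only illustrates how a provider can screen uploads against a list without anyone viewing the content.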

Friday, November 01, 2013

ABC Nightline report on teen porn addiction -- could that bring back a call for COPA-like laws?

Late Halloween night (really, into “All Saints Day”), ABC Nightline presented the issue of teen “porn addiction” that starts on the Internet, and often moves to access through mobile devices.
One problem is that the porn often emphasizes violent or quirky images that are far removed from the world of “real life relationships”. 
The story is at least relevant to the idea of “protecting” minors. 
Physiological studies show certain areas of the brain being stimulated in visual or behavior addictions.  That’s still not substance addiction.
But reports like this could open up calls for COPA-like laws to be reconsidered in the future. 

Thursday, October 17, 2013

Wordpress does offer a voluntary age verification scheme that is COPA-like

I’ve noticed that Wordpress does offer an age-verification plug-in, although it appears to rely on honesty on the part of the viewer.  The plugin asks the visitor to verify his or her age before viewing content.  This would not have completely satisfied the requirements of COPA (Child Online Protection Act) had it been upheld, because it probably can’t verify age fully.  But even credit card or driver’s license verification is far from foolproof anyway.

The link on Wordpress is here.

It’s a good idea to keep track of the capabilities of service providers to provide voluntary verification and filtering schemes, anyway.  All very libertarian. 
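The logic behind a self-attested age gate like this is trivially simple, which is exactly why it couldn’t fully satisfy a COPA-style requirement: it trusts whatever birthdate the visitor types in. A minimal sketch, assuming an 18-year cutoff (the cutoff and function names are my assumptions, not the actual plugin’s code):

```python
from datetime import date

ADULT_AGE = 18  # assumed cutoff; sites and jurisdictions vary

def age_on(birth: date, today: date) -> int:
    """Compute age in whole years as of `today`."""
    years = today.year - birth.year
    # Subtract one year if the birthday hasn't occurred yet this year.
    if (today.month, today.day) < (birth.month, birth.day):
        years -= 1
    return years

def may_view(birth: date, today: date) -> bool:
    """Self-attested gate: trusts whatever birthdate the visitor enters."""
    return age_on(birth, today) >= ADULT_AGE

print(may_view(date(1990, 6, 15), date(2013, 10, 17)))  # True
print(may_view(date(2000, 12, 1), date(2013, 10, 17)))  # False
```

Nothing in that check verifies identity, which is the gap that credit card and driver’s license schemes tried (imperfectly) to close during the COPA litigation.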

Friday, October 11, 2013

UK requires ISPs to default to filtering adult content, a curious recollection of COPA in the US

Britain, under heavy pressure from Prime Minister David Cameron, has been requiring ISPs to turn on porn filtering or adult content filtering as a default option.  Parents have to deliberately turn it off if they want to watch adult material themselves.  This could obviously annoy people who do not have children.
The British newspaper The Independent reported it here on July 22, 2013. 
The concept is interesting, because the effectiveness of filters was an important issue in the COPA trial in the United States (in Philadelphia) in the fall of 2006. 

Friday, October 04, 2013

C.P. concerns can lead to cases where hyperlinks are illegal even without content display

Could linking to a website known to be likely to host child pornography, or particularly linking to a url known to load an illegal image, itself be a crime?
Of course, if one has viewed the image, then one has “possessed” the illegal content and already broken the law.  But what if someone linked to such a url on advice of someone else without ever viewing or loading it?  It sounds like a legally perplexing question, although maybe morally abhorrent behavior. Many states would probably interpret this as “knowingly” distributing illegal content.
It’s legally disturbing because making a hyperlink illegal is always dangerous for the Web free-speech environment.  The issue with respect to copyright law was supposedly settled in 2000, when a judge said that a link was essentially like a bibliographic reference or footnote.  But linking to libelous content carries the remote risk of making one a libel defendant oneself.  And the original 1996 Communications Decency Act (the ancestor of COPA and source of today’s Section 230) might have made it a crime to link to indecent materials, in some interpretations.  

Wednesday, October 02, 2013

Sometimes opening "phishing" emails could result in prosecution for possession of child pornography

On September 23, 2013 I wrote about a Webroot report on the BKA Trojan, which can plant child pornography on a user’s home computer through a “drive-by” download, creating a possible legal bind.
Recently, I’ve received repeated spam with spoofed senders, with “You should take a look at this picture” as the subject line, and with attachments.  Opening such an attachment (maybe even the email itself with HTML enabled) could create a legal risk of falling under the interpretation of “knowing” possession of c.p., at least as the law might be interpreted in many states, because there is heightened reasonable suspicion that this is what the content will be.  At the very least, if an “illegal” image appeared (and could get backed up to the user’s cloud storage quickly), the user might have to completely destroy the computer or erase the hard drive and all backups. 

Such emails should be marked as spam and not opened.  They have stopped, and are not even appearing in the spam folder.  Maybe AOL is screening them before they get to my account and sending them off to NCMEC.  

Thursday, September 12, 2013

Merely clicking on a CP site, even in an email, could draw a police raid; draconian enforcement of possession statutes

ABC affiliate WJLA reports on a guilty plea by a 74-year-old physician (Robert Paul Dickey) in SE Washington, DC, who was arrested for possession of child pornography based on a single tip after he had visited a particular website.  Federal agents found him watching it when they raided his home; the news story is here.  The story is remarkable because, compared to many other cases, it seemed to take very little activity by the person to get into legal trouble. Authorities found 132 illegal images on the defendant’s computer in a May 2013 raid.   

Wikipedia reports that law enforcement sometimes posts links and then raids the homes of people who click on them, which could present an issue with roommates, teenage children, or tenants, for example.  Sometimes activity is detected by technology at the National Center for Missing and Exploited Children in Alexandria, VA.  Other risks could be the discovery of the content by computer repair technicians, or the remote possibility of a “wardriving” poach on an unsecured Internet router (which has led to false arrests in Florida and upstate New York before).   There could be a risk in opening attachments or even enabling HTML in email from unknown sources, possibly with spoofed sender addresses.  (That’s because once the illegal image is on your hard drive, even in a cache, you possess it, and it isn’t easy to remove, short of wiping the hard drive.)  Some states (like Florida) tell consumers that they must call law enforcement if they “accidentally” encounter it in an email or link.  It’s possible on a cell phone, laptop, or tablet to click “accidentally” merely by passing a mouse or finger over a link.  This should get more attention, still, as an Internet safety issue. 

Saturday, August 10, 2013

State attorneys general proposals to undermine Section 230 bring up some old issues known from COPA (age verification)

Friday, I learned about a proposal by state attorneys general to curtail Section 230 of the 1996 Telecommunications Act, or “Communications Decency Act”, presumably allowing service providers to be held to downstream liability standards set in individual state criminal codes.  I wrote about this in the “BillBoushka” blog yesterday, and there will be a lot more to say soon. 
The issue relates to the history of COPA, however.  Part of the motivation for the states is to crack down on sex trafficking, especially of minors (often from overseas).  One of the most important provisions a few states (Washington, New Jersey) wanted was to force “classified advertisements” websites like Craigslist and especially Backpage to screen “workers” for age.  Courts have so far upheld immunity for these sites under Section 230.  There would be serious ramifications of “relaxing” this immunity for the entire web, as I explained yesterday and will explore further.   But the “age screening” requirement definitely reminds me of similar requirements for websites that had been proposed for COPA in 1998, which had attempted to replace the “censorship” provisions of the Communications Decency Act, which the Supreme Court struck down in 1997. 
I had myself at one time thought that age screening would be feasible (when I wrote my first “Do Ask Do Tell” book in 1997), but quickly realized the difficulties that would have ensued had I been required to do this for my own “” and then “” sites, as I have earlier explained in this blog (especially the postings during the COPA trial in Philadelphia in October and November, 2006).
It might be possible for the federal government or FBI to develop a system that “classified ads” sites could use to verify ages, or that ordinary sites could use as plug-ins that users execute once when using a site for the first time, on any standard operating system (Windows, Mac, Linux, and mobile systems).  I think this is possible because I’ve discussed, on another blog, a similarly spirited proposal for a master federal system of verifications to prevent identity theft (go to this link).  When I worked for ING-ReliaStar back in 1998, I developed a system component that would fit into this sort of facility, which could be run by the USPS NCOA (a hint as to how it works is here). 

The main objection to the government’s deploying such mechanisms would obviously be personal privacy and surveillance concerns, as we have seen with the recent debate over the NSA and Wikileaks.  That debate has distracted the public from paying attention to other threats to free speech, such as proposals to gut Section 230.  But deploying such mechanisms might deflect the downstream liability problem and help preserve “free entry” on the Web.

It’s important to notice that the state attorneys general want the ability to implement and enforce state laws regardless of limitations on the federal government as a constitutional matter (which gets us back to discussions of the 14th Amendment and the “Incorporation Doctrine”, another good item for US history tests).  It’s important to remember that many states have or had “harmful to minors” laws that, when applied to the Internet, would have resembled COPA.  Some of these states included Virginia and New Mexico.  I think that these state laws were overturned by their respective supreme courts after the COPA decision in 2007, but I am not completely sure.  The recent letter from NAAG suggests the idea that states could try to implement their own versions of COPA and make them stand up.  

Sunday, August 04, 2013

Russian anti-gay speech law recalls fights in US over COPA and CDA

The latest spin on the much reported anti-gay law recently passed in Russia is that it primarily targets speech about homosexuality (that is, "nontraditional" sexual behavior outside of heterosexual marriage) made in front of minors.  While much attention has been paid to the likely effects of the law in the physical world (especially in view of the upcoming Winter Olympics in Sochi), such as practically shutting down gay pride events like those that occur in western countries, an obvious question is: what about the Internet?
As I’ve noted on the LGBT blog, it appears that the law is motivated by Russia’s low birthrate, and panders to the scientifically dubious (at best) notion that teenagers and minors (especially male) are likely to become less interested in having children and raising families if they learn about “gay values”. 
The obvious question is, how would it apply to the Internet?  In structure (although not exactly purpose) it sounds as though it could work a bit like COPA or even the CDA. If a blog posting were published in a place where minors could find it (through search engines), that would make it illegal and make its author liable to arrest and prosecution.  This could be very serious for foreign guests (not necessarily just for the Olympics) who have written pro-gay blogs from home (or even blogs critical of the Russian government and Putin) which can be accessed by the Internet inside Russia and are reachable by search engines, unless Russia has already blocked the blogs.   Maybe someone planning to visit Russia could block the sites or blogs himself in Russia before going to the country; it’s something that might need to be looked into. It could become a serious problem when employers send contractors to Russia (or to other anti-gay countries like Uganda). 

ABC News had reported that, before the Russian Duma passed the law, some Russian cities or autonomous regions had already passed similar legislation. St. Petersburg, one of the most popular cities because of the museums and convenient access from Poland and Finland, is one such city. 

A few years ago, similar anti-gay sentiment had been reported in Poland, much of it motivated again by low birth rates.    

There's something telling about a government's viewing all public speech as "propaganda".  

Saturday, July 20, 2013

Some states require computer technicians to report suspected child pornography when they find it during consumer repairs

Recently, I’ve commented here on the risks to consumers that can occur if they are “turned in” to police by retail establishments when material they have submitted for development or repair is thought to contain child pornography.
The National Conference of State Legislatures has a list of states where computer repair technicians and information technology workers are required to turn in suspect material and identities of consumers to police or law enforcement, with the link here.

These states include Arkansas, California, Illinois, Michigan, Missouri, North Carolina, Oklahoma, Oregon, South Carolina, and South Dakota.

Generally, these laws don’t require technicians to look for it, only to contact authorities if they find it.
It’s not immediately clear how they would find it, or what their responsibilities would be if they really believe it was planted by a virus without the consumer’s knowledge. Most states now follow federal law and consider the law violated only when a consumer knows that he or she possesses the material, but this has not always been the case.  

As with photo development, there would be issues of how technicians, not legally trained, would react to ambiguous images.  There are many gray areas, such as with animation or computer graphics, or with the use of adults to simulate the appearance of minors, or, of course, normal family intimacies.

Possibly a manager would be trained in the law, and only a store manager would contact police.  Perhaps police would come to the store or have a policy of returning the items within 24 hours if they don’t believe that a state or federal law was actually broken.  Is that asking a lot?

I took up this problem on the Internet Safety blog Nov. 11, 2009.  I expect I will add another entry about the issue there soon.  

Sunday, July 14, 2013

Intimate family photos clash with the way prosecutors, politicians, and retail outlets interpret child pornography laws -- "too literally"?

There is quite a bit of controversy on the Internet over whether intimate family pictures, which used to be commonplace in the 1940's, now meet the definition of child pornography in some states.  Salon has a long article about this by James Kincaid, although it dates back to 2000.

A Findlaw blog entry weighs in on the problem here.

And the Wall Street Journal law blog has a 2009 discussion of a defamation suit against Wal-Mart after it called police on innocent photos, here.

We're deputizing retail clerks and sometimes computer technicians as a posse in the name of protecting children. Such persons won't have legal training in what defines c.p. and will believe they have to take the strictest possible course, whereas aggressive predators remain relatively undeterred in practice.

The problem is that prosecutors and politicians overreach on what used to be common sense.  My own parents took pictures of me in the 1940's (in 8 mm home movies, now on a DVD) that would not be acceptable today, but there is no conceivable lewd intent behind them.  The political culture has changed, not always for the best.

But no, I won't post those pictures. 

Thursday, July 11, 2013

Maryland cyberbullying law may face problems with vagueness of "emotional distress" provision

Maryland has passed an anti-cyberbullying bill, called Grace’s Law, named after Grace McComas.
But the ACLU said that the portion that makes it a crime to inflict emotional distress online may not pass constitutional muster, because (compared to the idea of making a threat) it seems too vague for criminal law.  But it could be the grounds for civil litigation, as emotional distress is a factor in tort law.
The Baltimore Sun article by Arthur Hirsch was reproduced by the Huffington Post in April here

NBC Washington reported on the passage of the bill May 3, and briefly discussed it today July 11, story here.

The text of HB 396 is here. This appears to be what passed. It takes effect Oct. 1, 2013.  It is not clear whether the law applies to communications coming from outside Maryland or going outside.  
The Westminster Patch also reported the signing of the final law on May 2, here. 

Most of the media coverage seems to be relatively unaware of the possible constitutional problems. 
Penalties can include a $500 fine or jail time. 
I still think, personally, that the best way for teens to be popular is to be good at something in the real world first.  Let it be music, chess, sports, drama, a lot of things.   And don’t give in to the pressure to have as large a count of “friends” as possible.  Think about quality. 

Tuesday, July 09, 2013

NBC4 in Washington DC gives high-level primer on safety for minors

Laurie Nathan gave a primer on teen safety on NBC Washington, saying that one in three minors experiences cyberbullying.

The presentation was high-level and didn’t go into legalese.  Given the migration to smart phones and tablets, the idea of a “family computer” in a “public” family place (as part of the family’s “social cement”) already seems outmoded. 

Thursday, June 27, 2013

Federal laws supersede state age of consent in cell phone c.p. cases

An Ars Technica article by Nate Anderson warns that recording (as on a cell phone) a video of a person under 18 in an intimate act may violate federal child pornography laws even though the person under 18 is still over the age of consent in a particular state, link here.

There is a serious case involving Sidney Myers, 20, who videotaped an encounter with a 16 year old girl in South Carolina, where the age of consent is 16. 

Several major states, however, including California, Virginia, Arizona and Wisconsin still enforce 18 as the age of consent. 

Wednesday, June 19, 2013

NBC reports on Google's initiative to keep child pornography off the Web

NBC Washington this afternoon (Wednesday) reported on Google’s plan to build a database of encrypted child pornography images to assist law enforcement, as reported in the NBC News story by Yannick LeJacq here.
In the television interview, Scott Rudin spoke for Google about blocking search results for illegal images. 
Much of the money spent will assist the National Center for Missing and Exploited Children in Alexandria, VA (at the King Street Metro stop). 
Google has an official blog post on the effort, here. The encryption algorithm also allows images to be identified without having to be viewed again by people.

It sounds as though the same kind of technology had been explored by FOSI, earlier called ICRA; but that service was abandoned (reported here Dec. 3, 2010).   
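Identifying an image "without having to be viewed again" typically relies on a perceptual hash, which stays stable under small edits like recompression or brightness shifts. A rough, hypothetical sketch of the idea using a simple average hash over an 8x8 grayscale grid (the actual fingerprints Google and NCMEC use are far more robust; everything below is for illustration only):

```python
def average_hash(pixels):
    """Perceptual hash of an 8x8 grayscale image (list of 64 ints, 0-255).

    Each bit records whether a pixel is brighter than the image's mean,
    so small brightness or compression changes leave most bits intact.
    """
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes (0 = identical)."""
    return bin(a ^ b).count("1")

# A made-up "image": dark left half, bright right half.
img = [20] * 32 + [220] * 32
# A slightly brightened copy, as re-saving the file might produce.
img2 = [p + 5 for p in img]

h1, h2 = average_hash(img), average_hash(img2)
print(hamming(h1, h2))  # prints 0: the edited copy still matches
```

A matching system would flag any upload whose hash falls within a small Hamming distance of a known fingerprint, so analysts never have to re-view the underlying image; an exact byte-level hash, by contrast, would miss the brightened copy entirely.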

Tuesday, June 18, 2013

Self-Censorship Repeal Part II: more postings of old writings from the Army days of the 60s

There’s a little more news in the “self-censorship” issue that I discussed here on May 3.

I’ve posted, in the “content” subdirectory of my main publishing site “”, two more manuscripts.  One of them is a 1981 short story by me, “Expedition”, that deals with strip mining and mountaintop removal.  No, that doesn’t lead to censorship.  (Maybe it could have at one time.)   The main course is “Chapter 4”, not from my first “Do Ask Do Tell” book, but from the 1969 unpublished novel “The Proles”, which I wrote out by hand in a binder notebook in the barracks while I was stationed at Fort Eustis, VA.   The chapter is titled “Interlude” and actually is a detailed, graphic account of my fourteen weeks in Army Basic Combat Training in early 1968, with much more explicit language than was used in my 1997 book.

The language in a few spots is quite explicit, using expletives that were common in Army “proletarian” speech during the Vietnam era.  In a few cases, terms with racial overtones were also used.  The fact that a much higher percentage of African Americans wound up in infantry combat and became casualties in Vietnam, because of the student deferment system at the time, is certainly important and borne out here.  So is the ambivalent attitude toward homosexuality, which was often ignored but could sometimes be deployed as a way to control or demean others who were weaker in the barracks. 

With COPA overturned, I have gone ahead and left the crude language, typical of 1968, in place as a historical record.  

Saturday, June 01, 2013

People who share (P2P) and watch c.p. sued in Maryland

Persons who watched child pornography created by others, up to 180 of them, have been served as defendants in a civil lawsuit filed by a woman near Annapolis, MD, the Baltimore Sun reports, in a story here.

 The content was produced by her children’s father and another party.

A federal law allows victims to sue persons who trade in the content.  Many or most of these persons could and may be prosecuted for possession.  But it is possible to be served in a civil suit without prosecution.

Sharing the material in a P2P network would normally constitute trafficking and possession, resulting in prosecution.  Could a party who watched the material on someone else’s computer be sued successfully?  Such a person might not be involved in what the law considers possession.  That would be an interesting legal question.    

Update: June 29:

It has been reported that the Supreme Court will take up the issue of liability for a "home user" next term.  There are already conflicting appellate decisions. Details to follow. 

Monday, May 06, 2013

Strange history of former teacher Eric Toth when on the lam; can you do good after a terrible crime? A real paradox

The story of former Beauvoir school teacher Eric Toth, who lived on the lam for five years under different identities, pretending to be a tech writer and speaker and even a drug counselor, after fleeing Washington DC in 2008 when a report surfaced of his filming child pornography, does present a certain odd paradox.
Justin Jouvenal ran a detailed account (“The Alias Artist”) of his life on the run in the Washington Post today, link here.   What seems so odd is that, for a time, he (alias David Bussone) had agreed to live in poverty in Phoenix in a homeless shelter, and actually was an effective volunteer working with the “downtrodden”. 
Willingness to do so often lives at the heart of religious (especially Roman Catholic) teaching (look at the values of the new Pope Francis).  However, there seems to be an odd psychological twist.  Toth seemed to eschew intimate relationships with other young adults (male or female) who could take care of themselves,  could perform productively in society on their own, and who could accept or reject him as another “equal” adult.
I’ve written about “upward affiliation” and “radical hospitality” on other blogs, with a certain degree of self-criticism.  Focusing on relationships with adults who can actually judge and criticize you sounds like a healthy thing, important for psychological growth as one moves into adulthood.  But it can come to be seen as selfish, self-contained, and too unable to sustain other people in a way that keeps a free culture going. 

It seems as though Toth wanted relations with people over whom he could wield power in one direction.  It’s hard to believe he could have, under this masquerade, actually helped people – yet apparently he did for a while.  That would be relevant eventually in his sentencing and rehabilitation.   My experience has been that people who can’t talk back and make their own decisions in life, once they’re adults, never can go anywhere and be effectively helped.  I know this statement sounds smug, but that’s what I’ve experienced. 

Toth eventually wound up on the FBI’s “most wanted” (replacing Osama bin Laden) and was arrested in Nicaragua when recognized.  

Friday, May 03, 2013

"Self-censored" text to my DADT 1997 book restored online (was a COPA issue)

I have finally taken the step to restore a few instances of “coarse language” at a few specific points in the online copy of my 1997 book, “Do Ask Do Tell: A Gay Conservative Lashes Back”, in Chapters 1, 2, and 3, accessible here. I have left the replacement text intact following each instance (1 in Chapter 1, 4 each in 2 and 3) in parentheses, delimited by double "++" signs.  
I had been a plaintiff (sponsored by Electronic Frontier Foundation) in opposing the Child Online Protection Act of 1998 which, after a protracted series of litigation events (and two trips to the Supreme Court) was finally overturned in a “trial on merits” by a federal judge in Philadelphia, the Eastern District of Pennsylvania, with the opinion rendered on March 22, 2007 (explained that date on this blog).
I think it is useful to give again the link of my affidavit in the case, as filed (with EFF and the ACLU)  in December, 1998, from Minneapolis where I was living at the time.  Points 18-20 cover the “self-censorship” that I had felt needed to be done at the time in order to be safe under COPA. The link is here

This has become a "forgotten" issue.  But I think it's useful to bring things up to date.  

Thursday, April 18, 2013

"Sexting" cases with teens grow more troubling for prosecutors in VA, MD

A teen “sexting” case goes to trial in Fairfax County, VA, according to a Washington Post story Thursday morning by Justin Jouvenal.  In another case, in Franklin County, VA, a 15-year-old teen with Asperger’s goes on trial.  In many of these cases, teens were not aware of the possible legal consequences of their actions.  The link for the story is here.
When teens are tried as adults, there is little that can be done to avoid prison terms and sex offender registry.  The juvenile system has more options. 

Attempts in both Virginia and Maryland to provide lesser offenses when data is shared only among “willing” teens haven’t gotten far yet.   

Dr. Phil has often remarked that cases like these show the physiological immaturity of the teen brain, which cannot “see around corners” and imagine the possible consequences in the adult world.
Prosecutors say that sexted images and videos wind up on the Internet or in the hands of criminals and can tarnish the (usually female) subjects depicted for life.  But it would be almost impossible for most younger teens to grasp such consequences. 

For what it’s worth, the AP has described another “Lorena Bobbitt” case (1993) in southern California recently.  

Saturday, April 06, 2013

Instagram contests for tweens might violate COPPA, or at least its spirit

Cecilia Kang has a front-page article in the Washington Post Saturday, April 6, about “Instagram” beauty contests that seem to entice pre-teen girls, and which might be in violation of new interpretations of COPPA; story here.
The article described a contest where girls find their online photo presence marked up in red (with an X) the way a teacher might grade an algebra test – with red ink. 

Anyone (complete strangers) can vote, and parents have found young girls entered into "contests" without their knowledge from pictures posted by others (which feeds the concern about photos of people in public, which is generally considered legally protected).  
The article notes that teens and tweens seem a lot more concerned about numerical measures of their popularity, as with counts of “likes”, than are many adults.  Someone of my generation hardly thinks that way at all.
Again, going to the Instagram site, I see the invitation to put it on a smart phone.
I also see how far behind the times I am in learning to use all the features of my own phone.  Is it because I don’t think I’m pretty enough?  

The Times of India has another typical story on parents' concerns about Instagram, here

Tuesday, March 05, 2013

Implicit Content Problem: A statement about my own brush with it in 2005

I wanted to make another statement about the “implicit content” problem as I have come to understand it.  The reader may wish to view the account of the issue I had when I was substitute teaching as described on the “BillBoushka” blog  July 27, 2007.
In that incident, the school system (or at least the high school principal) used a “chilling effect” technique, hinting that, while I had a First Amendment right to post non-obscene content on my own website provocative to others, I might be vulnerable to prosecution under Virginia or other state laws for using an electronic communication for the “purpose” of providing enticement or temptation to minors.
In other words, a screenplay in which a character like me was shown as vulnerable to temptation to an unusually precocious minor (the “ephebophilia problem”) had no apparent “purpose” for its being posted (such as actual compensation from some other party or potential commercial profitability, ironically speaking).  Therefore, someone could reason that it had been posted for the “purpose” of tempting someone to make an approach, and that would be a criminally illegal “purpose”.
However, the First Amendment, as normally interpreted, implies that someone can make a public posting about something for "no reason" (although not for an "illegal reason").  This is a bit analogous to the "employment at will" doctrine, which says an employee can be dismissed for "no reason" but not for an "illegal reason".
The illegal-purpose concept comes into play only when there is a separate primary precipitating event: that is, when the author of the questionable web posting (me) actually tries to contact a specific (or believed-to-exist but possibly fictitious) minor directly, possibly now by Facebook or Twitter (although some of these tools did not exist in late 2005 when this incident happened), email, or even a phone text message.  It could be a normally simple and innocuous message, but inappropriate on its face because of the ages involved or a teacher-student relation.  Once such contact occurs, the presence of such a posting becomes legally significant and possibly illegal on its own; and I believe Virginia's statute presuming "purpose" could apply and add to a prosecution's case.  But I in fact never initiated any such primary contact of any kind. 

I was present at the COPA trial in Philadelphia in October 2006 when the judge made a  verbal comment about the hidden dangers of “implicit content”.  

Monday, February 11, 2013

The PROTECT Act of 2003 deserves discussion

I wanted to point out another detail from Mike Young’s book (post yesterday) that needs to be mentioned specifically on this blog.
This matter is the PROTECT Act of 2003, the "Prosecutorial Remedies and Other Tools to end the Exploitation of Children Today Act". The simplest source for an understanding of the law's provisions is the entry on Wikipedia, here

Young notes that the Supreme Court had at one time ruled that computer-generated images (stills or videos) that simulate child pornography but didn't use actual minors in their creation are protected from prosecution by the First Amendment.  (That case was Ashcroft v. Free Speech Coalition (2002), which struck down parts of the Child Pornography Prevention Act of 1996, not the CDA.)  The 2003 law specifically punishes the use of computer-generated images that look exactly like minors, inasmuch as animation technology is now good enough that this can be done.  An interesting question could arise if a real adult actor were "made up" to look under 18 and then used in explicit images, but that sounds legal.  I wouldn't do it!  Wikipedia notes that non-obscene depictions of fictional beings under 18 do not, in the language of the law, trigger prosecution.  That makes it sound as if the person simulated in an image must exist to trigger prosecution, but that could be a very dangerous assumption for a website to make.
The law also prohibits drawings of minors that meet the “Miller Test" of obscenity, and has some age relationships as explained on Wikipedia. 
The United States Department of Justice has a Fact Sheet on the Act here
The full text of the law can be viewed on Thomas here.
Public Resource has a one-hour video on “Protect Act: The Statement of Reasons” (2010) on YouTube. 

I recall the libertarian opposition to this law back in 2003.
There is another law, the Child Protection and Sexual Predator Punishment Act of 1998, which requires ISPs to report suspected child pornography detected on customers' sites, although ISPs are not required to monitor sites for it.  Their responsibility (as far as future liability) is a bit like the DMCA safe harbor; it usually ends when they report and cooperate with authorities.  Wikipedia doesn't seem to have a page for it.  

Sunday, February 10, 2013

"RTALabel" looks like a promising voluntary content labeling "opportunity"

Soon, I will review a book “Internet Laws” by Mike Young, but I wanted to mention, in advance, a point that he makes about another opportunity to put adult-oriented websites behind verification filters.
The service is called "RTALabel" (link here).  The facility has considerable capability to label entire sites, individual pages in different formats, mobile sites, and WordPress (it doesn't mention Blogger).   The FAQ page on the site is well worth reading. It hints that Congress could try to pass COPA-like laws in the future (even though COPA was overturned in 2007, as documented here).

RTA ("Restricted to Adults") was set up by the Association of Sites Advocating Child Protection, ASACP, link.
There is some material about RTA on YouTube:
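For reference, the RTA label itself is simple to apply: it is a single meta tag with a fixed string, placed in a page's head section so parental-control filters can detect it. A minimal sketch (the label string is the one ASACP publishes; exact placement options are described in the RTALabel documentation):

```html
<!-- RTA ("Restricted to Adults") label: one meta tag in the page head.
     Filtering software looks for this fixed, well-known string. -->
<head>
  <meta name="RATING" content="RTA-5042-1996-1400-1577-RTA" />
</head>
```

The same string can reportedly also be served as an HTTP response header for site-wide labeling, which is how whole-site and mobile labeling would work without editing every page.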

As Young hints, there is also a Guide to sites that use AVS, or “Adult Verification Systems”, here. There are commission arrangements for AVS sign-ups which some might see as seedy or unethical.
Previously on this blog, I covered the Family Online Safety Institute (FOSI) and its ICRA product, which I discovered had been discontinued; I last wrote about it on Feb. 10, 2011. I don't know why it was discontinued or whether it has any connection to RTALabel now.  

It is unclear how these products could affect the constitutional or legal issues surrounding any future attempts to require adult verification for web sites.  

One other problem comes to mind right now with COPPA (not COPA): a regular site could inadvertently collect personal information from minors without parental consent.  I talked about this last on January 1, 2013, but the matter still seems a bit unclear, and Young mentions it in his book.  

Sunday, February 03, 2013

More on protecting kids and culture wars; Fairfax County VA c.p. case shows kids ignorant of risks

Regarding the “slippery slope” in my previous post here about protecting kids, it seems to me there is a big gap in “culture” regarding Internet freedom between those who have kids and those who don’t.
If you draw an analogy (to restricting Internet speech) from the gun control debate, there is a difference.  Many people who want absolutely no government interference with their right to own weapons have large families, tend to live in more rural areas, and tend to believe that they could be on their own in defending their families. 
On the other hand, single people (including women alone and gay men) living alone in large cities or suburbs sometimes are also outspoken on their rights to defend themselves (the “pink pistols” argument).
There’s another “disturbing behavior” story from Fairfax County, VA, where some high school boys were arrested at West Springfield High School (where I have subbed before) for getting some underage girls to make sexually explicit tapes for them (WJLA story and video here ).  It does seem that many teenagers don’t fully understand the legal ramifications of the things they do “under age”, as we know from "illegal" cell phone photos.  As Dr. Phil says, teens don't see around corners.  

Petula Dvorak weighs in on this incident saying teen sought "fame", here, in the Washington Post, on February 5.  She cites modern values, "If you aren't important, you aren't alive" and "Normal life is no life at all in today's value system."  How about real skills?  (Piano counts.)  How about others depending on you?
The kids seem to have thought their activity was "consensual". 

Wednesday, January 23, 2013

Can "protecting minors" and "purpose-driven legislation" create a slippery slope?

We often hear arguments for new legislation to “protect children”.  And we often hear that the freedom to possess or create something can be taken away if it serves no valid “purpose”.

The obvious application of these arguments is in the gun control debate, particularly with respect to possessing assault weapons or rapidly loading magazines, which do seem to have no legitimate use for most civilians.

You could use the same kind of thinking to restrict Internet freedom, particularly in the user-generated content area.  You could try to justify removing Section 230 downstream liability protections for service providers, for example to prevent reputation extortion or cyberbullying.  You could justify exposing ordinary sites or providers to more liability in order to protect minors' privacy (as with COPPA rules).  You could even justify some future reincarnation of COPA itself.

The "purpose" argument (used in gun control) is troubling because it can also play into a legal doctrine called "implicit content", mentioned during the bench trial of COPA in Philadelphia in 2006. A blog posting or website object could be deemed illegal if the likely potential benefit that could be expected to accrue to the poster was illegal.  In that world, sites that earn money might actually be more legally legitimate, because making a living is a legitimate "purpose."

So does gun control represent a legally slippery slope that can spread to other areas?

There's one other remark that has run in my mind about the Newtown tragedy in Connecticut.  It sounds as though Mrs. Lanza may have been considering not only having Adam committed, but also taking away his computer access and his whole world.  It sounds as though she may have introduced him to weapons earlier to "make a man out of him".  It's a horrible thought, but quite common with some parents of disturbed teen boys.  The National Enquirer has claimed that Mrs. Lanza was a "Doomsday Prepper" and had probably preached to her son that the world would end on Dec. 21, 2012.    

Saturday, January 05, 2013

Deletion of personal cell phone video can lead to obstruction of justice charges (Steubenville, Ohio case)

A recent assault case in Steubenville, Ohio brings up a legal point which, although not limited in effect to minors, deserves mention in connection with teenage “cell phone abuse”.

In this case, some bystanders took videos of the incident (without intervening) and then reportedly deleted the videos.  That, according to prosecutors, can lead to obstruction of justice charges.  If you do nothing and never photograph anything at all, no charges would be possible.  This would be true whether the video and photos came from a cell phone or conventional digital camera. 

The story (Greg Mitchell)  in “The Nation” is here.

The influence of the hacktivist group Anonymous is being debated.  The group apparently recovered and exposed some of the video.  But it is also being reported that, because of the posting of these videos, it will be hard for defendants to get a fair trial.
The Columbus Dispatch reported the Reuters account here

Tuesday, January 01, 2013

More concerns have arisen over FTC's definition of "directed at kids" in new COPPA rules

Forbes has a recent article (by Eric Goldman) characterizing the FTC's recent implementation of new rules under COPPA (the Children's Online Privacy Protection Act) as a "big mess".  

One of the problematical issues concerns the idea that app developers will be held responsible when their software is used to collect information from other sites (especially Facebook).   This happens when there is “actual knowledge” that the app will be used this way.  Goldman calls the language “inscrutable” but that part may be common sense.  The problem is that app developers will now have to go through considerable expense to safeguard apps likely to be used by kids on major third party sites.

Goldman also takes issue with the last (or third) tier of the FTC's expanded definition of "directed at kids", which can include more general-purpose websites and apps likely to appeal to everybody, but especially to older minors, particularly precocious minors already performing in public or working in unusual ways.  (We saw this kind of discussion with COPA.)   General-purpose educational materials, for example on the sciences or on sports and hobbies likely to appeal to minors, could fall into this area.  Website operators could be in violation if they track information inadvertently, even IP addresses.  The FTC seems unaware that all hosted website services offer logs with IP addresses, and all offer detailed analytics (such as Urchin) that may fall just short of providing identifiable information if the webmaster looks hard enough.  (As I have noted, I had reason to examine my own user logs in late 2005 after an "incident" when I was substitute teaching.)  Advertising networks may expect webmasters to be facile with detailed web statistics, and it is even possible to cut off specific abusive users (with .htaccess files in Apache).  It's not clear that the FTC was aware of how this really can work.
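To illustrate the point about cutting off specific abusive users: on Apache shared hosting, a webmaster can block individual IP addresses with a few lines in an .htaccess file. A minimal sketch using the Apache 2.2 directives common at the time; the IP addresses shown are placeholders from the documentation range, not real visitors:

```apache
# .htaccess -- block requests from specific abusive IP addresses
# (Apache 2.2 mod_authz_host syntax; the addresses are placeholders)
Order Allow,Deny
Allow from all
Deny from 203.0.113.45
Deny from 198.51.100.0/24
```

Blocked visitors receive a 403 Forbidden response; the same file can also restrict individual directories, which is why hosted services expose it even to webmasters without server access.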

The link for Goldman's article (Dec. 20, 2012) is here

It is true that liability exists for service providers and webmasters only when "actual knowledge" of data collection from kids from sites or apps "directed at kids" exists.  

The precocious-minor issue is interesting.  There are some practical situations where the only people (for employment purposes) with the detailed knowledge of how to deploy a particular site or app effectively are themselves minors.  This can be tricky.