Sunday, November 16, 2008

New age-verification product, eGuardian, provides opportunity, raises questions


There is a new free-market approach to the problem of properly identifying minors and adults on the Internet. A company called “eGuardian” (“Protecting your child on the Internet”) offers lifetime registration for $29 per child, and then invites commercial websites to become partners so that they know which of their visitors are registered minors. The partner websites (especially social networking sites) can then prevent age-inappropriate material from reaching those minors. It is also possible to integrate the product into search engine results. Critics say that the commercial websites will use this to feed children advertisements for products like sweetened cereals, toys, and candy. Somehow the “Alpha-Bits” cereal from the 1970s comes to my own mind.

The New York Times story, by Brad Stone in the “Ping” column, is titled “Online Age Verification for Children Brings Privacy Worries,” on page 4 of Sunday Business, Nov. 16, 2008.
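To make the mechanics a bit more concrete, here is a minimal sketch (in Python) of how a partner site might consult an age-verification service of this kind before deciding what to serve. The endpoint, token parameter, and response fields are entirely hypothetical; eGuardian has not published such an API in the material cited here, so this only illustrates the general idea of the sites "knowing" who is a registered minor.

    import requests  # third-party HTTP library

    # Hypothetical endpoint and response fields; not an actual eGuardian API.
    VERIFICATION_URL = "https://example-verifier.invalid/api/check"

    def visitor_is_registered_minor(visitor_token: str) -> bool:
        """Ask the (hypothetical) verification service whether this visitor
        is registered as a minor. Fail closed if the service cannot answer."""
        try:
            resp = requests.get(VERIFICATION_URL,
                                params={"token": visitor_token},
                                timeout=5)
            resp.raise_for_status()
            return bool(resp.json().get("is_minor", True))
        except requests.RequestException:
            return True  # treat unknown visitors as minors, to be safe

    def serve_page(visitor_token: str, adult_content: str, child_safe_content: str) -> str:
        # The partner site swaps in age-appropriate content for registered minors.
        if visitor_is_registered_minor(visitor_token):
            return child_safe_content
        return adult_content

Note that the same lookup could just as easily drive the targeted "kiddie advertisements" the critics worry about; the technology itself is neutral on that point.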

Tuesday, November 11, 2008

More issues with explicit "text" content (without images)


A post on a blog called “cyb3rcrim3” examines the question of whether a computer file (or, for that matter, a handwritten or manually typewritten letter or paper) that contains text alone, and no pictures, could ever be considered child pornography under the law. The blog is this, and the relevant entry is dated Oct. 27, 2008.

Under the laws of some other countries, including Canada, Britain, and some other Commonwealth countries, the answer is definitely yes, in some circumstances. The blogger discusses in detail a Canadian case, Regina v. Sharpe (2002). Wikipedia also points this out.

She (Susan Brenner, a law professor) quotes the United States statute, 18 U.S.C. § 2256, link here on the Cornell Law School database. The text of the law quite clearly uses the term “visual depiction,” which refers to “visual image,” which itself is not defined. She hints that text promoting the sale of such an “illegal” image could conceivably come under the penumbra of the statute.

It’s also important to note that the concept of “obscenity” does include text in American law. The concept turns on the lack of redeeming social value (“I know it when I see it”). CP, on the other hand, never allows a defense of “redeeming social value.”

She goes on to discuss Ohio v. Dalton, 793 N.E.2d 509 (Ohio App. 2003). There are more details in a book called “Cases in Communications Law” by Paul Siegel, 2007, from Rowman & Littlefield Publishers, Inc. The “books” link is here.

In this situation, someone already convicted of other crimes was charged with, and pleaded guilty to, a CP offense when private textual writings depicting such acts were found in his room. The facts are quite complicated, but eventually the Ohio appeals court allowed the withdrawal of the plea. The charge essentially amounted to punishing someone for fantasies or private thoughts (although these can become relevant if someone is in a treatment program after conviction for an actual offense). Applying the law this way did nothing to prevent the use of actual minors. Some of the analysis drew on the Supreme Court’s ruling declaring the 1996 CPPA provision regarding computer-generated images (not using minors) unconstitutional.

It’s important to note that had the textual materials been legally obscene, private possession of them would not be illegal for that reason, though distribution in public (including on the Internet) would be. But it is illegal to possess CP even in private. The other question is whether distribution of such textual materials on the Internet would be illegal. Wikipedia used to say that in the United States it was not, but that sentence seems to have been removed. It might violate an ISP’s “terms of service,” but that is a private matter, not specifically a matter for criminal law. It could also create issues on a public site when viewed from countries where it would be illegal. Such a textual work might well have redeeming social value, by showing the legal consequences that follow the depicted acts, say in a fictional story or screenplay. But it could run into other novel problems involving “implicit content” or even “enticement.”

These laws (even when viewed just in terms of images in their conventional meaning) could lead to certain exposures to wrongful prosecution, partly because of the “strict liability” doctrine for a possession offense (based on the idea of “conclusive presumption”). Hacking is a possibility, and may have happened in one case in Arizona in 2006 (see particularly the Feb. 3, 2007 entry on my Internet safety blog, here). Bloggers could be inadvertently exposed to embedded images when moderating comments (they have to look at the comments first in order to decide whether to reject them), and even ordinary home email users could click on HTML email with embedded illegal images and then illegally possess them. By and large, frivolous prosecutions have not occurred, but the exposure in the law is disturbing.

Virginia’s statute is here and is fairly specific with the terms “visual material” and “identifiable minor.”

All of this is a body of law somewhat distinct from COPA, although it got mentioned at the 2006 COPA trial.

Another note on Arizona:

This particular state has particularly draconian laws and penalties, and has been accused of practicing "enemy jurisprudence," ignoring the "harm principle" and trying to use the law to destroy "undesirables." Consider the 200-year sentence (with no possibility of parole) in Arizona v. Berger (twenty consecutive ten-year terms, one for each of twenty counts, against former schoolteacher Morton Berger), with this entry on the Law & Society Blog, here. "Enemy" thinking is what drove the ideology of Nazi Germany.

Tuesday, November 04, 2008

Australia creates flap with "mandatory" content filters


Australia has created a bit of a flap by trying to require mandatory content filtering throughout the country.

The filtering resides at two levels. One level filters content deemed illegal under Australian law. The second level, which apparently users must have on their computers, can be opted out of by adults. The filters reportedly degrade network performance by anywhere from 20% to 75%.
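As a rough illustration of that two-tier scheme, the filtering decision might look something like the sketch below. The blocklists, hostnames, and opt-out flag are stand-ins of my own invention; the actual Australian implementation details had not been made public at this writing.

    # Hypothetical two-tier filter, loosely modeled on the scheme described above.
    ILLEGAL_CONTENT_BLOCKLIST = {"illegal-example.invalid"}    # tier 1: always blocked
    UNSUITABLE_CONTENT_BLOCKLIST = {"adult-example.invalid"}   # tier 2: blocked unless an adult opts out

    def is_blocked(host: str, adult_opted_out: bool) -> bool:
        """Return True if a request to `host` should be filtered."""
        if host in ILLEGAL_CONTENT_BLOCKLIST:
            return True        # mandatory tier, no opt-out possible
        if host in UNSUITABLE_CONTENT_BLOCKLIST and not adult_opted_out:
            return True        # optional tier, adults may opt out
        return False

    # Example: an adult who has opted out of the second tier
    print(is_blocked("adult-example.invalid", adult_opted_out=True))    # False
    print(is_blocked("illegal-example.invalid", adult_opted_out=True))  # True

The performance complaints make sense in this light: every request has to be checked against the lists (or, worse, inspected) before it is passed along.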

Electronic Frontiers Australia has a report on “Labor’s Mandatory ISP Blocking Plan” here. Still, two-thirds of parents don’t have the filters installed.

Ars Technica has a report by John Timmer titled “Aussie Govt: Don’t Criticize Our Terrible ‘Net’ Filters,” link here. The current Australian government seems determined to implement the program despite evidence developed by local journalists that it is flawed.

In the United States, the use of voluntary filters is the main way that parents can prevent objectionable content from being viewed by their children. A more progressive system would be voluntary content labeling, with the cooperation of software developers and ISPs, as proposed and documented by the ICRA and discussed here before. It would appear that Australia is attempting to implement something like “COPA” in its filtering system.
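By contrast, voluntary labeling puts the decision in the parent's hands: a site publishes a machine-readable label describing its own content, and filtering software on the home computer compares that label with the family's policy. The category names and policy below are invented for illustration only; the real ICRA vocabulary was richer and was expressed as RDF metadata linked from the page rather than a simple dictionary.

    # Hypothetical client-side check against a self-applied content label.
    # Category names here are invented; ICRA's actual vocabulary differed.
    site_label = {"nudity": False, "violence": True, "gambling": False}

    # policy value = True means the category is allowed in this household
    family_policy = {"nudity": False, "violence": False, "gambling": True}

    def label_allows_viewing(label: dict, policy: dict) -> bool:
        """Allow the page only if every category the site declares as present
        is one the family policy permits."""
        return all(policy.get(category, False)
                   for category, present in label.items() if present)

    print(label_allows_viewing(site_label, family_policy))  # False: violence is declared but not allowed

The design difference is the point: labeling leaves the blocking decision on the end user's machine, under parental control, rather than in a mandatory filter operated at the national or ISP level.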