Web Security, Privacy and Commerce, 2nd Edition


23.1 Pornography Filtering

The first and arguably the most important category of blocking software is software that blocks access to web sites that are considered to be pornographic or otherwise harmful to children.[1] The U.S. Congress has mandated that all schools receiving federal support under the E-Rate program or that purchase computers or Internet access using funds provided under Title III of the Elementary and Secondary Education Act must install software that blocks access to visual depictions that would be considered "obscene, child pornography, or harmful to minors."[2] Libraries that purchase computers or Internet access using funding provided by the Museum and Library Services Act are similarly required to use filtering technology.

[1] We note that not everyone defines pornography the same way, nor does everyone believe that exposure to items defined in the law as pornographic is harmful. Such debates are well beyond the scope of this book.

[2] The legislation was included with the Children's Internet Protection Act (CIPA) that was passed as part of the FCC's end-of-year spending package at the end of 2000. E-Rate recipients are also required to adopt an Internet safety policy after holding a public hearing. For further information, see Section 254(h)(5) of the Communications Act of 1934. The filtering mandate does not apply to institutions receiving E-Rate discounts for purposes other than providing Internet access. CIPA Sec. 1721(a)(5)(A)(ii).

"Harmful to minors" is defined in the Children's Internet Protection Act as:

Any picture, image, graphic image file, or other visual depiction that

(i) taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion;

(ii) depicts, describes, or represents, in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals; and

(iii) taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.[3]

[3] Secs. 1703(b)(2), 3601(a)(5)(F), 1712(a)(2)(f)(7)(B), and 1721(c)(G).

Pornography-filtering software has been installed by many corporations in an attempt to prevent the company's computers from being used for downloading and displaying pornography. One reason is simple productivity: viewing pornography is not part of most employees' job descriptions. Another is liability: pornography downloaded by one employee may be visible (or audible) to other employees or customers, which can result in sexual harassment lawsuits by offended parties.

23.1.1 Architectures for Filtering

Filtering software employs a variety of techniques to accomplish its purposes:

Site exclusion lists

The filtering company makes a list of sites known to contain objectionable content. An initial list is distributed with the filtering software; updates are sold on a subscription basis.

Site and page name keyword blocking

The filtering software automatically blocks access to sites or to web pages that contain particular keywords in the page's URL. For example, censorship software that blocks access to sites of a sexual nature might block access to all sites and pages in which the word "sex" or the letters "xxx" appear.

Content keyword blocking

The filtering software can scan all incoming information to the computer and automatically block the transfer of pages that contain a prohibited word.

Image content blocking

The filtering software analyzes images using image analysis algorithms and flags those that appear to have too many flesh tones or telltale characteristics, such as obscuring circles surrounded by a region of pink. Currently, image content analysis is not very accurate, but many organizations are actively working on this technology.

Transmitted data blocking

Blocking software can be configured so that particular information cannot be sent from the client machine to the Internet. For example, parents can configure their computers so that children cannot upload their names or their telephone numbers.
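The keyword-blocking techniques above amount to little more than substring matching against a vendor-supplied list. As a minimal sketch of site and page name keyword blocking (the keyword list and function name here are illustrative, not taken from any real filtering product):

```python
# Illustrative keyword list; real products ship much larger,
# proprietary lists and update them by subscription.
BLOCKED_KEYWORDS = ["sex", "xxx"]

def is_blocked(url: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the URL."""
    lowered = url.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

print(is_blocked("http://example.com/xxx/index.html"))  # True
print(is_blocked("http://example.com/news.html"))       # False
```

Content keyword blocking works the same way, except that the scan is applied to the bytes of each page as it is downloaded rather than to the URL.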

Filtering software can be installed at a variety of locations: on the end user's own computer, on a proxy server or firewall at an organization's network boundary, or at the online access or network provider. Each successive location is more difficult for the end user to subvert.

Filtering software can be controlled directly by the end user, by the owner of the computer, by the online access provider, or by the wide area network provider. The point of control does not necessarily dictate the point at which the software operates. For example, America Online's "parental controls" feature is controlled by the owner of each AOL account, but is implemented by the online provider's computers.

23.1.2 Problems with Filtering Software

The biggest technical challenge faced by filtering software companies is the difficulty of keeping the database of objectionable material up to date and of distributing that database in a timely fashion. Presumably, the list of objectionable sites is changing rapidly, with new sites being created all the time and old sites becoming defunct. To make things more difficult, some sites are actively attempting to bypass automated censors. Recruitment sites for pedophiles and neo-Nazi groups, for example, may actually attempt to hide the true nature of their sites by choosing innocuous-sounding names for their domains and web pages.[4]

[4] This tactic of choosing innocuous-sounding names is not limited to neo-Nazi groups. "Think tanks" and nonprofit organizations on both sides of the political spectrum frequently choose innocuous-sounding names to hide their true agenda. Consider these organizations: the Progress and Freedom Foundation, the Family Research Council, Fairness and Accuracy in Reporting, People for the American Way. From their names, can you tell what these organizations do or their political leanings?

The need to obtain frequent database updates may be a hassle for parents and educators who are seeking to uniformly deny children access to particular kinds of sites. On the other hand, it may be a boon for stockholders of the filtering software companies.

Another problem is the danger of casting too wide a net and accidentally screening out material that is not objectionable. For example, during the summer of 1996, NYNEX discovered that all of the pages about its ISDN services were blocked by censorship software. The pages had been programmatically generated and had names such as isdn/xxx1.html and isdn/xxx2.html, and the blocking software had been programmed to avoid "xxx" sites. America Online received much criticism for blocking access to breast cancer survivors' online groups because the groups' names contained the word "breast." People with names like "Sexton" and "Cummings" have reported that their email and personal pages have been blocked by overbroad matching rules. Filtering companies may leave themselves open to liability and public ridicule by blocking sites that should not be blocked under the company's stated policies.
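The overbreadth problem follows directly from naive substring matching: the filter has no notion of word boundaries or context. A sketch showing false positives of exactly the kind described above (hypothetical keyword list; the URLs are invented):

```python
# Naive substring matching, as a typical keyword filter might do it.
BLOCKED_KEYWORDS = ["sex", "xxx"]

def is_blocked(url: str) -> bool:
    lowered = url.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

# Both of these innocuous pages are blocked:
print(is_blocked("http://nynex.example/isdn/xxx1.html"))   # True - an ISDN product page
print(is_blocked("http://example.com/~sexton/home.html"))  # True - the surname "Sexton"
```

Matching on whole words rather than substrings would reduce such false positives, but at the cost of missing deliberately obfuscated names, which is one reason vendors accept a degree of overblocking.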

Filtering companies may also block sites for reasons other than those officially stated. For example, there have been documented cases where companies selling blocking software have blocked ISPs because those ISPs have hosted web pages critical of the software. Other cases have occurred where research organizations and well-known groups such as the National Organization for Women were blocked by software that was advertised to block only sites that are sexually oriented. Vendors treat their lists of blocked sites as trade secret information; as a result, customers cannot examine the lists to see what sites are not approved.

Finally, blocking software can be overridden by some sophisticated users. A person who is frustrated by blocking software can always remove it, if need be, by reformatting his computer's hard drive and reinstalling the operating system from scratch. But there are other, less drastic means. Some software can be defeated by using certain kinds of web proxy servers or virtual private networks (VPNs), or by requesting web pages via electronic mail. Software designed to block the transmission of certain information, such as a phone number, can be defeated by transforming the information in a manner that is not anticipated by the program's author. Children can, for example, spell out their telephone numbers ("My phone is five five five, one two one two") instead of typing the number directly. Software that is programmed to prohibit spelled-out phone numbers can be defeated by misspellings.
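The phone-number example illustrates the general weakness of pattern-based outbound filtering. A minimal sketch, assuming a hypothetical outbound filter built around a digit-only regular expression (the pattern and function name are illustrative):

```python
import re

# Hypothetical outbound filter: block any message containing a
# phone number written in digit form (e.g., 555-1212 or 555 1212).
PHONE_PATTERN = re.compile(r"\b\d{3}[-. ]?\d{4}\b")

def allowed(message: str) -> bool:
    """Return True if the message contains no digit-form phone number."""
    return PHONE_PATTERN.search(message) is None

print(allowed("Call me at 555-1212"))                          # False - blocked
print(allowed("My phone is five five five, one two one two"))  # True - slips through
```

Extending the pattern to catch spelled-out digits only pushes the arms race one step further, since "five five five" can become "phive phive phive" and still be understood by a human reader.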

Parents who trust this software to be an infallible electronic babysitter and allow their children to use the computer without any supervision may be unpleasantly surprised.
