To block or not to block? That is the question many librarians contend with when deciding whether to filter users’ internet access. Although people might associate filtering with school and public libraries, “the issue of filtering Internet content is not just a public or school library issue. Academic, community colleges, and special libraries are affected as well” (Maxwell, 2001). Maxwell goes on to cite a survey revealing problems, such as pornographic content, even within law libraries.

Filtering software programs, or “censorware,” block computer users from accessing certain information over the internet via access points such as websites and email. A blacklist is a list of websites or web pages that filtering software blocks automatically, rather than relying on an on-the-fly determination about the nature of the content on each site or page. Blacklists “will block a list of IP addresses and corresponding URLs deemed inappropriate. This list is updated on a regular basis often by an outside software vendor” (Huber, 2005).
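To make this concrete, here is a minimal sketch in Python of how such a list might work. The hostnames, IP address, and function names are this author’s hypothetical illustrations, not any vendor’s actual implementation:

# A minimal sketch of a vendor-supplied blacklist (hypothetical data and names).
BLACKLISTED_SITES = {"example-objectionable.com", "203.0.113.7"}

def is_blocked(host_or_ip: str) -> bool:
    # A simple membership test; no analysis of the page's actual content.
    return host_or_ip in BLACKLISTED_SITES

def apply_vendor_update(vendor_feed: list[str]) -> None:
    # "Updated on a regular basis often by an outside software vendor" (Huber, 2005).
    BLACKLISTED_SITES.update(vendor_feed)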

There are pros and cons to these vendors and their blacklisting methods. On the pro side, Chen et al. (2015) pointed out that “real-time content analysis on the client side is usually an impractical solution,” while “blacklisting is efficient for filtering objectionable content at users’ access time with less computational resources.” In other words, a library’s computer might not have the computing power to filter objectionable content on the fly. Additionally, pre-set URL blacklists offer a less resource-intensive way of accomplishing the filtering task.
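The cost difference Chen et al. describe can be sketched as follows. Both functions are illustrative stand-ins of this author’s devising, since the actual filtering code is not published:

# Illustrative contrast (assumed logic, not Chen et al.'s actual system).
def filter_by_blacklist(url: str, blacklist: set[str]) -> bool:
    # Constant-time membership test: cheap enough for a library terminal.
    return url in blacklist

def filter_by_content_analysis(page_text: str, banned_terms: set[str]) -> bool:
    # Scans the full page text at access time, which costs far more CPU and
    # memory; this is the approach Chen et al. (2015) call impractical on
    # the client side.
    return any(term in page_text.lower() for term in banned_terms)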

Blacklist censorware vendors claim another advantage: using real people to review their blacklists and determine whether a site should appear on the list and be blocked. Humans can make nuanced decisions that a software program’s algorithms might not. For example, an algorithm might block a site for containing pictures of nude people, but a human reviewer might recognize that the site is an online anatomy course and would therefore leave it off the blacklist.

Another pro comes in relation to “white lists,” which are positive lists of sites that have been flagged as appropriate for the intended audience. If a new site of any type is created but isn’t added to the white list, it is automatically blocked. Thus, compared to white lists, blacklists result in fewer unobjectionable sites being blocked. The sheer volume of sites on the internet also makes white-listing impractical to sustain. In fact, Huber explains that the schools in his system switched away from white lists because of this problem and used blacklists instead.
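The key difference is the default behavior: a white list denies anything not on the list, while a blacklist allows anything not on the list. A hypothetical sketch, with made-up site names:

# Hypothetical illustration of the two default behaviors.
WHITELIST = {"khanacademy.org", "loc.gov"}
BLACKLIST = {"example-objectionable.com"}

def whitelist_allows(host: str) -> bool:
    return host in WHITELIST       # default deny: unlisted sites are blocked

def blacklist_allows(host: str) -> bool:
    return host not in BLACKLIST   # default allow: unlisted sites get through

# A harmless site that launched yesterday and appears on neither list:
new_site = "new-science-blog.example"
print(whitelist_allows(new_site))  # False: blocked under a white list
print(blacklist_allows(new_site))  # True: accessible under a blacklist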

On the con side, “the dynamic characteristics of the changing web require such objectionable blacklists to be constantly updated for sustainable blocking performance” (Chen et al., 2015). Filtering company Net Nanny explains that this can result in blacklists under-blocking, “since their list of websites is likely out of date and can’t keep up with user-generated content” (Net Nanny, n.d.). However, Net Nanny also points out that blacklists can over-block “by prohibiting access to entire websites, when singular pages on the website may be perfectly safe.”
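Net Nanny’s over-blocking point follows from how coarse a blacklist’s entries can be. If the list stores whole hostnames rather than individual pages, every page on a listed site is blocked, as this hypothetical sketch shows:

from urllib.parse import urlparse

# Hypothetical: the blacklist stores whole hostnames, not individual pages.
BLACKLISTED_HOSTS = {"bigforum.example"}

def is_blocked(url: str) -> bool:
    return urlparse(url).hostname in BLACKLISTED_HOSTS

print(is_blocked("https://bigforum.example/objectionable-page"))  # True
print(is_blocked("https://bigforum.example/knitting-tips"))       # True: a perfectly
                                                                  # safe page is blocked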

Finkelstein and Tien (n.d.) explain that “Many censorware companies claim their blacklists are human-reviewed,” which would seem to take care of both under- and over-blocking. But the authors did the math and determined that, even using conservative figures, there are so many sites, and so many new sites being added all the time, that it is impossible for any vendor’s human employees to review them all.
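Their reasoning is easy to reproduce. The figures below are this author’s deliberately generous assumptions, not Finkelstein and Tien’s own numbers:

# Illustrative back-of-the-envelope figures (assumed, not from the paper).
pages_on_web = 1_000_000_000            # a modest estimate of existing pages
seconds_per_review = 10                 # a very generous reviewing pace
reviewers = 100                         # a large staff for a filtering vendor
work_seconds_per_year = 2_000 * 3_600   # 2,000 working hours per reviewer

reviews_per_year = reviewers * work_seconds_per_year // seconds_per_review
print(reviews_per_year)                 # 72,000,000 pages reviewed per year
print(pages_on_web / reviews_per_year)  # ~13.9 years for one static snapshot

Even under these generous assumptions, a large review staff would need over a decade to examine a single frozen snapshot of the web, while new pages appear every second.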

“Given the inability to human-review the web, a censorware blacklist necessarily must be created largely by a computer program” (Finkelstein and Tien, n.d.), and these computer programs are inherently flawed. For example, when it comes to determining if a website contains content that meets the legal definition of “obscene,” Finkelstein and Tien state, “it is hard to see how anyone can seriously assert that computer programs could make such a judgment when humans endlessly debate these concepts.”

This endless debate lies at the heart of the issue: the lack of completely objective criteria for determining whether a site should be blocked. When Huber wrote of a site being “deemed inappropriate,” the author notably did not mention who was doing the deeming, what they were supposed to base their deeming on, or that the deeming is ultimately a judgment call with no “right” answer.

Blacklisting of material that does not seem to contain anything harmful or objectionable is a false-positive issue. This can include content that is constitutionally protected. “Libraries should continue to be wary of using internet filtering systems that block constitutionally protected material for adults or minors” (Chmara, 2012), lest they leave themselves open to lawsuits. Once a site is on a blacklist, whether a human or a software program put it there, it is censored. Finkelstein and Tien (n.d.) cite the example of language-translation websites being blacklisted because they “offer capabilities that can be used to escape the control-system” and “let people read forbidden material.” Denying users access to legitimate and useful sites like these is perhaps the biggest problem.

No matter what a professional’s feelings might be about technology use, staff at special libraries might find themselves tasked with deploying filtering software, some of which uses blacklists. To make an informed decision, special library staff must understand what blacklists are and how they work.

References

Chen, H., Lee, L., Juan, Y., Tseng, W., & Tseng, Y. (2015). Mining browsing behaviors for objectionable content filtering. Journal of the Association for Information Science & Technology, 66(5), 930–942. https://doi-org.libaccess.sjlibrary.org/10.1002/asi.23217

Chmara, T. (2012, July 24). Why recent court decisions don’t change the rules on filtering. American Libraries. https://americanlibrariesmagazine.org/2012/07/24/why-recent-court-decisions-dont-change-the-rules-on-filtering/

Finkelstein, S., & Tien, L. (n.d.). Electronic Frontier Foundation white paper 1 for NRC project on tools and strategies for protecting kids from pornography and their applicability to other inappropriate internet content. National Academy of Sciences. https://web.archive.org/web/20060419190143/http://www7.nationalacademies.org/itas/whitepaper_1.html

Huber, J. (2005). Internet filtering update. Media & Methods, 41(5), 16–17. https://search-ebscohost-com.libaccess.sjlibrary.org/login.aspx?direct=true&db=lls&AN=17539371&site=ehost-live&scope=site

Maxwell, N. K. (2001). Alternatives to filters. Library Technology Reports, 37(2), 1. https://link.gale.com/apps/doc/A74223162/AONE?u=csusj&sid=AONE&xid=08dcafb6

Net Nanny. (n.d.). Website blocker. https://www.netnanny.com/features/website-blocker/

Author: Daniel Davis

Editor: Max Gonzalez Burdick

