Donut Age: America's Donut Magazine

Blockage

I was out at a school last week doing a professional development workshop, and I got my first good look at the use of web filtering software. It ain't pretty. The workshop was on web resources for educators, and several things we'd counted on showing to the teachers there were either completely blocked or broken to the point of uselessness. These included:

  • iTunes Music Store (perhaps understandable, but we had wanted to show teachers the podcast section)
  • The Internet Archive (free media including historical video footage)
  • Almost anything associated with Yahoo, including search (the site itself was accessible, but you could not actually follow the links to search results) and Yahoo Maps, whose blockage in turn broke NASA World Wind, which relies on those maps
  • Google Earth (the software ran but no map images appeared)
  • Any number of blogs, including mine, Mark Bernstein's, and (apparently) everything hosted at Blogspot.

Now I understand (though I'm not sure I agree with) the perceived need for some kind of filtering at school sites to prevent access to objectionable content. But the solution used at this school (apparently a combination of local choices, which may have included blocking all JPEG images, and filtering imposed at the state level) was clearly overkill. Crippling applications like World Wind and Google Earth makes no sense and takes a pair of great, free educational tools out of teachers' hands. This amounts to doing surgery with a chain saw.
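
Surf Control won't say exactly how its blocking works, but the behavior I saw is consistent with crude pattern-matching on URLs. Here is a rough sketch of what I mean (the patterns and URLs are hypothetical, since the real blacklist isn't published); a filter built this way literally cannot tell a map tile from a naughty picture:

    # Hypothetical sketch of crude URL filtering. The patterns and URLs
    # below are invented for illustration, not Surf Control's actual rules.
    import re

    BLOCKED_PATTERNS = [
        re.compile(r"\.jpe?g$", re.I),      # "block all jpeg images"
        re.compile(r"yahoo\.com", re.I),    # blanket domain rule
        re.compile(r"blogspot\.com", re.I),
    ]

    def is_blocked(url: str) -> bool:
        # Any match blocks the request: no context, no appeal.
        return any(p.search(url) for p in BLOCKED_PATTERNS)

    # A rule aimed at pictures also swallows the image tiles that
    # Google Earth and World Wind fetch behind the scenes:
    for url in [
        "http://example-tiles.google.com/tile/3/4/2.jpg",  # hypothetical tile
        "http://maps.yahoo.com/tile?x=4&y=2",              # breaks World Wind
        "http://www.archive.org/details/prelinger",        # passes, as it should
    ]:
        print(url, "->", "BLOCKED" if is_blocked(url) else "ok")

The point isn't the particulars; it's that a pile of pattern rules has no notion of educational value, and nobody on site can inspect or override it.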

Also, the Yahoo blocking smells positively fishy. Since Google searches behaved quite normally, it can't be a blanket restriction on search engines; and since Google searches can yield plenty of 'objectionable' material of their own, I can't imagine it's about perceived safety either. With no rational reason to block one and not the other, I have to wonder about the ethics of letting one commercial entity operate on the school network while crippling its main competitor.

The biggest problem with web filtering, though, is the lack of transparency and accountability. Whether sites get blocked on the basis of keywords, the judgment of a part-time college student, or "the latest neural network techniques" (as claimed by Surf Control, the filter in this case), there is no appealing or even analyzing the decision. What criteria are being used to block content? Surf Control says its software "intelligently classif[ies] unknown websites into one of 45 categories" and is "targeted to identify non-business sites that affect productivity in the workplace," but it doesn't provide many more details. Leaving aside the big question of why business productivity should be the guiding principle for filtering a school's web access, my experience suggests the process is far from intelligent. Although we couldn't access the useful educational resources cited above, we had no trouble reaching a parody of the official White House website and a site with detailed instructions for building a thought screen helmet (it turns out you should use Velostat instead of tin foil).

A bad web filter is probably worse than no filter at all. Besides cutting off access to perfectly good, sometimes outstanding, web content, it creates a false sense of security that the sites one can visit are necessarily safer and more reliable than those that are excluded. Meanwhile, content providers who are being blocked may not even know it, let alone be able to redress the problem and get off the blacklist. It all amounts to an insidious form of censorship that, I would maintain, is far more dangerous than explicit language and naughty pictures.