Unable to stop purveyors of child pornography directly, New York Attorney General Andrew Cuomo recently persuaded three major access providers to disable online newsgroups that distribute such images. But rather than cut off those specific newsgroups, all three decided to reduce administrative hassles by also disabling thousands of legitimate groups devoted to TV shows, the New York Mets and other topics.
Gordon Lyon, who runs a site that archives e-mail postings on security, found his domain name suddenly deactivated because one entry contained MySpace passwords obtained by hackers.
He said MySpace went directly to domain provider GoDaddy, which effectively shut down his entire site, rather than contact him to remove the one posting or replace passwords with asterisks. GoDaddy justified such drastic measures, saying that waiting to reach Lyon would have unnecessarily exposed MySpace passwords, including those to profiles of children.
Meanwhile, in response to complaints it would not specify, Network Solutions LLC decided to suspend a Web hosting account that Dutch filmmaker Geert Wilders was using to promote a film critical of the Quran - before the film was even posted and before the company had found any actual violation of its rules.
Service providers say unhappy customers can always go elsewhere, but choice is often limited.
Many leading services, particularly online hangouts like Facebook and News Corp.'s MySpace or media-sharing sites such as Flickr and Google Inc.'s YouTube, have acquired a cachet that cannot be replicated. To evict a user from an online community would be like banishing that person to the outskirts of town.
Other sites "don't have the critical mass. No one would see it," said Scott Kerr, a member of the gay punk band Kids on TV, which found its profile mysteriously deleted from MySpace last year. "People know that MySpace is the biggest site that contains music."
MySpace denies engaging in any censorship and says profiles are generally removed in response to complaints of spam and other abuses. GoDaddy also defends its commitment to speech, saying account suspensions are a last resort.
Few service providers actively review content before it gets posted; most take action only in response to complaints.
In that sense, Flickr, YouTube and other sites consider their complaint reviews "checks and balances" against any community mob directed at unpopular speech - YouTube, for instance, has pointedly refused to delete many video clips tied to Muslim extremists because they did not specifically contain violence or hate speech.
Still, should these sites even make such rules? And how can they ensure the guidelines are consistently enforced?
YouTube has policies against showing people "getting hurt, attacked or humiliated" - a ban broad enough to cover clips acceptable on TV newscasts. But how is YouTube to know whether a clip shows real violence or actors portraying it? Either way, posting the video is legal and may provoke useful discussion of brutality.
"Balancing these interests raises very tough issues," YouTube acknowledged in a statement.
Unwilling to play the role of arbiter, the group-messaging service Twitter has resisted pressure to tighten its rules.