Legal Responses to "Cyber-Bullying":
The Yale Law Journal Pocket Part recently posted an interesting Call for Papers in response to the recent news stories about the XO board and anonymous Internet speech:
  The Yale Law Journal Pocket Part is soliciting essays and commentaries on the role of law, policy, and extralegal tactics in regulating instances of cyber bullying, including defamatory "Google bombing." How, if at all, should regulatory schemes address providers of information who make no endorsement of the information's content?
  I'm no expert in either the relevant law or the relevant technology. But here's my amateurish idea: Would it help to somehow link up provider immunity with search robot exclusion? Under current law, site owners are immune from liability for the speech of others under 47 U.S.C. 230. This means that a site owner can allow anonymous comments, announce that anything goes, and then sit back and watch as the trolls engage in all sorts of foul play. Search engine robots then pick up the foul play, resulting in harm weeks or months later when a third party googles that person or event. A lot of people may be harmed, but the law can't stop it: the provider is immune and the commenters are anonymous.

  If I'm not mistaken, though, the same provider who is immune under Section 230 also controls the scope of the resulting harm. Why? Because, at least as I understand it, the same provider controls whether search engine robots are permitted to come to the site and collect the information in the first place. I believe that blocking search engine robots is pretty easy, or at least could be configured to be easy; it just requires a line of HTML coding.

  Where does that take us? Well, it suggests to me that we might consider conditioning legal immunity on disabling search robots. Providers would be immune from liability relating to particular content only if they had taken technical measures to block search engine robots from collecting that content. So if you wanted to host a free-for-all for others and be immune from liability, you could do that: you would just have to keep the resulting content from being fed into Google. On the other hand, if you wanted Google to pick up the content, for whatever reason, you would need to assume the risk of liability for that content you're letting Google collect.

  What kind of impact would such a rule have? I imagine it would lead a lot of providers to block Google and other search engines from collecting materials from message boards, blog comment threads, and the like. The unmoderated and anonymous comments would still be out there; they just wouldn't be found using search engines.

  Anyway, that's my idea. I may be way, way off, either as a matter of law or technology; I'm not sure that it's so easy to disable the robots, and I'm not sure it would be easy to amend Section 230 to condition immunity on doing so. But I figured I would throw out the idea and get your thoughts.
Steve:
Isn't this, though, dependent upon the search engines all utilizing the same protocol, which is to say, disregarding files which the site owner wants you to disregard? Even if this is the standard Internet convention right now, it feels strange to construct a statute upon such an assumption.

Also, the propriety of "safe harbor" statutes really depends on the legal acumen of the folks being regulated. When we create a "safe harbor" under the federal securities laws, we proceed from the assumption that officers of publicly traded companies will take pains to educate themselves as to what you can and can't say in public. But do we really expect everyone who owns a comment-allowing site on the Internet to educate themselves regarding their legal liability, or do we prefer a more hands-off approach where well-intentioned people can pretty much just keep on doing what they're doing online?
4.16.2007 6:28pm
PatHMV (mail) (www):
I wouldn't want to have to choose between liability for any comments on my blog or keeping my blog out of Google entirely.

With a system set up in a particular manner, it should be possible, technically, to configure robots.txt to avoid just the comments, while still searching the main posts. Some blog software will also insert "rel='nofollow'" into the "a href" tag for every link, so that Google (and other search engines that comply with that very loosely established standard) won't count that link in calculating the linked page's pagerank. Basically, the nofollow attribute says to Google: "here's a link, but I don't endorse it, so don't count this as a vote in favor of it." Google normally calculates pagerank by basically treating each link to a site as an endorsement of that site.
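
For illustration, a link tagged that way (using example.com purely as a placeholder) would look something like this:

    <a href="http://www.example.com/some-page" rel="nofollow">an interesting page</a>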

But on the larger front, I haven't seen sufficiently widespread harm which would justify removing or limiting the legal protections currently in place via section 230.
4.16.2007 6:36pm
PatHMV (mail) (www):
And for "cyber-bullying" by elementary and high school students in particular, what I'd really like to see is for the Supreme Court to revisit some of its school-based First Amendment jurisprudence and allow schools greater leeway to educate, through appropriate rewards and discipline, their students about the responsible exercise of their First Amendment rights.
4.16.2007 6:41pm
Cathy (mail) (www):
First, robots.txt is a voluntary convention. No webmaster, nor any search engine, is required to use or obey it. Thus it would be inappropriate to use it as a standard on which to base liability.

Secondly, as a blog owner with open comments, I don't want to pre-emptively exclude the comments from search engines just in case people behave badly in them. To do so would also exclude them when people behave in an enlightening and/or constructive way.

We already have a similar problem with respect to spam comments. If you don't manage to get rid of them from your site quickly enough, you can end up getting indexed as relevant to their questionable subject matter. Many website owners would probably prefer not to have that happen, but preventing it has high costs, either in terms of having to constantly monitor and delete rogue comments, or by ensuring that NO comments, even the legitimate ones, ever get indexed in search engines. Neither tactic is benign, and a law forcing webmasters to use one of them wouldn't be benign either.
4.16.2007 6:54pm
Bill Sommerfeld (www):
Yes, robots.txt is currently voluntary. But if an authority were to implement some sort of legal policy based on the wishes of a web publisher to fence out robots from part of their site, why wouldn't robots.txt suffice to show the intent of the publisher? Why would the authority need to define a new standard for robot exclusion, and how would they get webcrawler operators to use their new standard in addition to robots.txt?
4.16.2007 7:22pm
James Grimmelmann (mail) (www):
First, a few nits: robots.txt is not "HTML coding." The robots.txt file is, as its extension implies, a text file. Excluding all robots requires at least two lines of coding, not one.
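
For instance, a minimal robots.txt asking all compliant robots to stay out of an entire site is just these two lines:

    User-agent: *
    Disallow: /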

Also, the problem here is not that search engines can find the indexed pages at all; it is that searches on people's names return pages with lies about them. Thus, the issue is that search doesn't work well enough; people looking to learn things about someone wind up being directed to libels that they probably wouldn't have wanted to see. Disabling all search access to those pages is overinclusive; it ends up making search less useful in general. I have a paper in half-written draft form arguing that a central goal of information policy should be making search tools more effective; telling people to disable the searchability of their pages strikes me as a bad idea. If anything, we should probably be telling search engines that they have no legal duty to respect robot exclusion protocols.

Another way to make this point is to note the incredibly broad range of activities protected by s. 230. If your proposed rule had been in place, most of the major s. 230 cases would have come out the other way, finding no safe harbor. As a practical matter, that would mean that many information providers would choose to disable search engine access. Run down the list of s. 230 defendants, and ask how well the Web would work if all of them had strong anti-robot policies in place. Some already do, but many do not. I think that anti-search Web would be worse than the one we have now.

Your point about searchability is a good one, but perhaps there might be other levers to pull to limit the search engine fallout from anonymous online libels.
4.16.2007 8:15pm
Anthony Ciolli (mail) (www):

Your point about searchability is a good one, but perhaps there might be other levers to pull to limit the search engine fallout from anonymous online libels.


I think Jarret Cohen's proposed solution to the search engine issue manages to not only preserve free speech and reduce the fallout of libel, but actually make search engines more effective and useful. He describes his proposal in his HL Record op-ed, beginning on page 3.
4.16.2007 8:30pm
loafer:
"Would it help to somehow link up provider immunity with search robot exclusion?"

Yes, if your goal is to stifle anonymous internet speech. No, if you're willing to stick by the First Amendment and all the valuable anonymous speech it has protected throughout the ages -- including, of course, the Federalist Papers -- even though you recognize that this will mean tolerating some bad speech.

Linking up provider immunity with search robot exclusion will encourage more active policing of speech viewpoints on forums. If you think this is a good result, I don't see why threatening forums with google de-listing is any better than threatening them with a tax or with criminal penalties.

As an aside, has there ever been any proof that the AutoAdmit thread actually hurt anyone's employment prospects? I find it hard to believe, frankly, that an employer would reject a Yale 2L because some anonymous internet user created a thread about her on an internet forum. She claims that all fourteen of her interviewers rejected her. Realistically, everyone involved with legal employment should see that there was something else going on here.

I think the answer here is pretty clearly to ignore your detractors and get on with your life. This is easier and cheaper than resorting to courts and legislation, it keeps with the spirit of the First Amendment, and it gives anonymous internet trolls the amount of attention that they deserve: none.
4.16.2007 9:28pm
Patrick McKenzie (mail):
robots.txt as a defense has the same problem as immunizing people from prosecution for libel if they write the libelous text backwards. Yes, the most straightforward glance at the material won't cause the harm to the injured party, but the protection is weak. An improperly coded web spider, including the Google spider at many times in history, can ignore the robots.txt robot exclusion protocol (which relies on spiders being well-behaved -- it says "Please, do me a favor and don't request any of the following documents.")

Additionally, republication of content, which the web facilitates and which is getting easier with emerging technologies (RSS feeds, etc.), would tend to diminish the utility of robots.txt. For example, my blog's RSS feed is mirrored without my authorization by many spam sites looking for content. Supposing I was worried my commenters tended towards the risque, I could robots.txt away robots' "privilege" to look at the comment pages, which would protect the content from well-behaved robots when it is on my site. But when a spammer picks up my comments RSS feed, or when someone incorporates it into a mashup, they don't pick up my robots.txt exclusion. Google will index their pages, and the complaining party ends up just as harmed.
4.16.2007 11:00pm
Jarret Cohen (mail) (www):
Free speech is a good thing, and words are powerful. Of course, speech has the ability to do much damage just as much as it can liberate minds. But speech isn't truly free unless everyone is elevated to the same platform to speak. That's why open forums are a positive thing; they allow everyone to voice his or her opinion. AutoAdmit, for example, offers everyone the ability to comment on the factual, the offensive, and the absurd. Yet I find that some people often don't think to confront their critics on the message board, which puzzles me, considering they believe it is being seen widely enough to warrant complaining about it. But at least they have a choice.

Many websites do not offer such a level playing field for speech. In the absence of this, and in the spirit of preserving the Internet's tradition of open expression, relevant individuals ought to have the ability to attach statements and explanations to listings in Google, which, like AutoAdmit, republishes third-party speech that may be considered hurtful. Deciding what can and can't be listed in Google only creates another kind of censorship/grey area problem, and I don't think it's the answer. This is not to say that there is a place for libel anywhere, but hurtful speech often isn't libelous. Read my op-ed in HL Record, where I go into more detail.
4.16.2007 11:00pm
logicnazi (mail) (www):
This seems to undermine the very benefits that justified sec. 230 in the first place, to be ineffective if not harmful, and to be an affront to the principles of free speech that we would never tolerate in the offline world. Not to mention being a totally unprincipled distinction.

It would be ineffective because, as others have observed, robots.txt is a voluntary convention. However, it is a convention that is widely respected because it is mutually beneficial. As soon as you pass this law, you create a strong incentive for many websites to block search engines from significant amounts of their material. For instance, Wikipedia or similar projects might be forced to block robots from all their pages. Yet people want to find this content, giving search engines a competitive incentive to just ignore robots.txt. The net result may just be to eliminate all the benefits of the robots.txt system.

Secondly, the very motivation for the safe harbor provision was to encourage public forums. Yet keeping out search engines is effectively the same as making it a private forum. While there have been a few high-profile cases of cyberbullying, like the poor Star Wars Kid, where great harm resulted from wide dissemination, the most common case is of a kid being mocked by others in his school, who can just as easily spread the links through their own MySpace pages. So we eliminate much of the benefit of the safe harbor (imagine no search on Wikipedia) while gaining only limited benefits.

This change would be unprincipled because it creates an unjustified distinction between a website and external search engines. I mean, many forums offer various means to view user posts, including some forms of search. Presumably these would still be covered by the safe harbor, so what justification is there for different treatment when the search is done by an external search engine? Moreover, for some of the most likely locations of cyberbullying (YouTube), it isn't very clear how to distinguish the site itself from major search engines. If we are going to make people liable for letting a search engine index their site, why not just leave the little guys out of it and make the search engine directly responsible for indexing it?

Finally, one of the reasons the safe harbor provisions were passed is that Congress recognized that open forums on the Internet are the equivalent of public squares or free speech zones at universities. Just as it would be toxic to free speech to make a university liable for everything said on its plaza, so too is it unacceptable to make a forum responsible for everything said there. Your proposal is effectively suggesting we let universities have free speech zones only if they exclude the media or other public means of awareness of what is said there. That would be unacceptable suppression of free speech in the physical world, and it's still unacceptable online (even though it is probably constitutional).

Additionally, how would a web hosting company survive? Either they would have to edit every website run on their servers, or places like the VC wouldn't get featured on Google. If you give web hosting companies a free pass, then MySpace just declares it is a web host as well.
4.17.2007 2:17am
_Publius_:
-- I find that some people often don't think to confront their critics on the message board, which puzzles me

I'm sorry, have you read the things that happen to people who complain on your board? You ask us to disassemble the entire system of libel, based on a claim that does not hold up as an empirical matter.

This would be throwing meat to the lions. Every single person who dared complain - on your board or otherwise - about their unwilling "participation" in the "Top 14 Girls" contest had their full name smeared across pages and pages of garbage, filled with insinuations that can't easily be defended against. One who dared confront those who would libel her had an email sent to the entire Yale faculty about her private life, an act so unconscionable that Anthony Cohen himself allegedly resigned over it. (See his message to the AutoAdmit community -- "allegedly" in that he is still a frequent contributor to the board)

How does one "respond to" an intrusion upon seclusion? One a private fact has been improperly revealed, all the "confront[ation]" in the world will just draw more attention to it. One a private photo has been stolen and posted, one cannot "confront" it away. All a confrontation would (and did, in this case for those strong enough to stand up to what they saw as injustice) do was to fan the flames higher.

How does one "respond to" impersonation? One can post all day "no, really, that's not me" and the other person can post "yes it is" just as long. Eventually you could get to an ID showdown, but that requires a victim of impersonation to go through some ridiculous hoop to prove their identity.

How does one "respond to" a claim of herpes? Get a herpes test and post the results? To a claim of sexual looseness? Get affidavits from all possible sexual partners? To a lie about an LSAT score? Post your actual score, in contravention of the social norms of law school?

Basically, you ask that the entire system of libel be dismantled and that victims simply respond to libels. We have long recognized that libel cannot simply be responded away; once a negative aspersion has been cast, it cannot be willed away. People don't have the attention to read through reams of disputed evidence; merely suggesting that someone is loose of sexual mores or willingly participated in a "contest" that degrades women is enough to cast a long shadow over their professional careers.

In this case a few women were brave enough to step up and try to fix a broken system. We should consider those who suffer in silence.
4.17.2007 2:49am
Jarret Cohen (mail) (www):
Publius: You are obviously very emotional about this issue, but the thing is, I don't really know what you're trying to say. Fact is, people who respond, if they do so with a cool head, are usually treated pretty well. I can think of a few examples right off the top of my head. I'm sure you can't only because you aren't really familiar with the message board, and it doesn't sound like you have all the facts straight. I don't even think you understood my main point. And I don't quite get what you're saying about "disassembling the entire system of libel."

I'm sure this will be responded to with another rant.
4.17.2007 11:22am
Anthony Ciolli (mail) (www):
I agree with logicnazi. There are also several other issues to consider:

-- This whole discussion assumes that we're dealing with (1) websites (2) where the individual protected by section 230 has access to robots.txt. What about situations like Barrett v. Rosenthal, where individuals repost defamatory statements on Usenet or other non-website online services? Okay, that one's easy enough: just make X-No-Archive mandatory to retain section 230 immunity. But what about other non-WWW online media? Not to mention that there's still the ISP/webhost issue. And what about situations like #2? Let's say section 230 is amended, but Volokh.com chooses not to take advantage of the immunity. If I repost a third party's defamatory message in a Volokh comment, do I also not have the safe harbor protection, even though I may want the immunity but have no way to edit robots.txt for Volokh? What if Volokh never tells anyone that it has chosen not to take advantage of the safe harbor?
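
(For reference, X-No-Archive is just a header line added to the individual post, along the lines of:

    X-No-Archive: Yes

and archives honor it as a convention rather than a legal obligation.)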

-- As a practical matter this will just never work. Look at what's going on with torrent and file sharing sites right now. All that's happened is that the "bad" sites have left the United States for places like Sweden that have much more liberal laws, and file sharing continues to rise in popularity. If this amendment was actually implemented, the result would be Google either completely relocating overseas, or a foreign upstart taking away its market share and becoming the new leading search engine.

I don't think Section 230 is perfect, but such proposed changes are just going to make a bad situation a lot worse.
4.17.2007 12:23pm
crg (mail):
From a practical standpoint, this won't work to stop Google bombs, because people won't do it. Low-rank content doesn't substantially contribute to a Google bomb, while high-rank content is stuff that's going to hurt you to remove from a search engine. It's also rare that you would have a URL structure that would allow you to eliminate comments from being spidered without also eliminating the posts.

For this site, for example, you would have to exclude the /posts/ directory, which would also break the weighting for links to specific articles, and cause your site's search engine (which respects the robots.txt file) to only return the multiple-post archive page.
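
Concretely, that exclusion would be something like:

    User-agent: *
    Disallow: /posts/

which asks compliant spiders to skip everything under /posts/.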
4.17.2007 2:54pm
lucia (mail) (www):
Jarret,
Publius didn't sound emotional to me. I found the comment pretty clear and would hardly call it "a rant".

Publius is correct: suggesting that the remedy for libel on a forum or board is for the libeled person to add to the feeding frenzy at the board is silly. While that strategy can boost traffic and ad revenue for the forum owner, there is very little benefit for the person who is libeled!

Those who are libeled need real remedies.
4.17.2007 3:49pm
Jarret Cohen (mail) (www):
lucia: I will be the first to agree with you that those who are libeled need real remedies. But what I see as the major problem is that most of what is called "libel" is not libel at all, but rather the kind of childish mockery that exists on the schoolyard playground. For example, if someone posts her picture online, and people comment on it, are you saying this is illegal? The answer is no, leaving these people without "real remedies", if what you mean is legal remedies. The solution at that point is to at least empower the subjects of discussion to set the record straight. Not everything in life can be accomplished through frivolous litigation.

As for situations involving actual libel, a legal remedy already exists: one can pursue a defamation lawsuit against the actual author of the libelous statement. Of course, most people don't pursue such lawsuits because the actual damages individuals suffer from libelous internet postings are virtually non-existent.

I have described this as a social problem more so than a legal one. I believe that the solution will involve, at least in part, enhancements to search engines, not the legislated suppression of speech and opinion. Perhaps we are still awaiting a new "social paradigm", where people re-evaluate the extent to which they take Internet chatter seriously.
4.17.2007 5:18pm
_Publius_:
For example, if someone posts her picture online, and people comment on it, are you saying this is illegal? The answer is no, leaving these people without "real remedies", if what you mean is legal remedies.

Actually, it can be tortious in many situations and criminal in some.

The tort of intrusion upon seclusion seems pretty relevant if the picture was taken improperly and widely disseminated without consent. A reasonable person would be outraged to find their personal photos spread all over the Internet with sexually suggestive and harassing comments. Even if you don't think this case rose to that level, it's a question for the jury.

Copyright infringement can also apply, and again it is tortious. There is simply no argument of fair use when a personal photo is taken without permission and posted for commercial gain (the site used to run AdWords) on a public website.

Harassment can also apply. Same for stalking -- threats to stalk or intimidate somebody are a tort (and a federal crime under the parts of the VAWA that survived Lopez). I believe there were threats to hunt down some of the victims in real life on your site and to take camera-phone photos of them. That may be sufficient to rise to VAWA interstate stalking (which, incidentally, is not covered by CDA 230).

most people don't pursue such lawsuits

This argument is circular: we shouldn't enforce libel laws online because libel laws aren't currently enforced online.

There are many reasons why people don't pursue lawsuits under the current regime. Not the least of which is that site-owners, like yourself, make it near-impossible to get a remedy under the current regime by refusing to keep IP logs (as you have admitted). If the original poster can't be found then a lawsuit is a waste of time.

CDA 230 generally protects site owners no matter how libelous the content. Mr. Kerr would like to change CDA 230 so that victims are more able to get a remedy. Your response is begging the question: since CDA 230 makes lawsuits hard now we shouldn't change CDA 230.

The question posed by Orin is whether site-owners should be liable for the libels on their sites. To say that the lack of existing lawsuits against the original posters proves that site-owners should be immune is completely circular.

because the actual damages individuals suffer from libelous internet postings are virtually non-existent.

As an empirical matter, I suggest that's simply not true. Google casts a long shadow over the careers and lives of many people, not just those women defamed on your site.

not the legislated suppression of speech and opinion.

The protections of the First Amendment are broad and weighty, but they have never been extended to cover libels and intrusions into privacy. Speech is simply not protected when it is used to incite violence, to libel, to harass, or to invade someone's privacy.
4.17.2007 10:29pm
lucia (mail) (www):
Jarret:
I agree that some of what is going on at these forums is not libel. However, libel was the subject Orin raised. So, I took your previous suggestion to be relevant to the topic actually being discussed here.

However, if you wish to bring up other types of bad social behavior at forums, I will agree simple childish mocking occurs. If you mean to suggest you were referring to this when suggesting that those taunted should jump into the fray, I say this: That's an entirely silly suggestion.

Even on the playground, the remedy for childish mocking is rarely to engage the children taunting you. It's generally better to ignore it.

On forums, even if the taunts aren't libel, but simply malicious or embarrassing to those taunted, it is not in the interest of the person being treated badly and teased to add to these anti-social threads, creating more and more search engine fodder that might link their name to additional search terms.

I guess I can see how a forum owner interested in generating traffic might convince himself that the victims taunted on his forum should jump in. I can even see how the forum owner might wish to convince third parties they should jump in.

After all, the additional search engine hits may lead to a boost in traffic, possibly leading to more advertising revenue. This benefits the forum owner; it doesn't benefit the teased and taunted person.

But seriously, if you think about it a bit, you can see jumping in doesn't benefit the person teased one bit! Far from being surprised that some victims don't join the fray at your forum (or those of others), you should admire their good sense!
4.17.2007 10:41pm
Jarret Cohen (mail) (www):
Publius:

You wrote


The tort of intrusion upon seclusion seems pretty relevant if the picture was taken improperly and widely disseminated without consent. A reasonable person would be outraged to find their personal photos spread all over the Internet with sexually suggestive and harassing comments.

...

Copyright infringement can also apply, and again it is tortious. There is simply no argument of fair use when a personal photo is taken without permission and posted for commercial gain (the site used to run AdWords) on public website.



You might want to mention the fact that no one except you has said anything about reposting pictures, and in the case of AutoAdmit, it's impossible to post pictures. The issue as it relates to AutoAdmit is speech; never has the issue been reposting someone's pictures and "disseminating them ... for commercial gain." The issue is the fact that if you post your picture on a website that gets millions of hits, e.g. MySpace or Facebook, don't be surprised when somebody you don't know looks at it, links to it (whether it was the original or a copy on a separate, unaffiliated website, run by an unaffiliated third-party), and wants to talk about it on a forum. The day that people lose the general right to talk about others, or even to ogle over their womanly features if they so desire, will be a very sad day.

It appears to me that this point wasn't so much about Orin's article, though, but a rather bizarre attempt to imply that AutoAdmit may have illegally profiteered from... a hyperlink? An attempt at redirecting the discussion into a twisted criticism, not unlike your camera phone comment.

Your argument about libel and circular reasoning is also horrible, because it's based on the false assumption that people don't pursue litigation because they don't expect they're going to be able to get IP information. On the contrary, people usually assume such information will be available to them so long as they go to the trouble of subpoenaing it. In the three years I've run AutoAdmit, until the past month, I've never revealed any policy on storing IP records. Yet in these three years, no one has ever attempted to subpoena IP records from my server company, let alone pursue a lawsuit. They clearly didn't feel their purported damages warranted actual legal action. It's really easy to complain, and claim irreparable damages, and run to authorities, and send threatening letters, but when it comes down to quantifying actual damages and pursuing a lawsuit, it seems people usually come to the conclusion that the damages aren't there to justify it.

Finally, at the very end of your post, I can come to some agreement with you. "Speech is simply not protected when it is used to incite violence, to libel, to harass, or to invade someone's privacy." That's true. And as I've said many times, there should be and already exist legal remedies against actual torts or criminal behavior, regardless of the medium.
4.18.2007 11:28am
Jarret Cohen (mail) (www):
lucia:

I don't think libel was the subject Orin raised, as a matter of fact. The title of his article refers to "cyber-bullying", a much broader term than libel. Then, if you click through to the YLJ Call for Papers, you'll notice the PDF does not refer to libel, either. Who is talking about libel? Maybe you, but not Orin.

Nevertheless, I did address matters of libel in what I thought was a rather clear fashion: there ought to be legal remedy against libel in the Internet sphere, just as anywhere else.

Then, you go on to say that it's "better to ignore [childish mocking]."

If you say it's best to ignore it, then people are free to do so, and so are their employers. But what I have witnessed is the frequent (and similarly schoolyard) desire to run to the "teacher" (authority figure) and get them to punish the bully. That's a far cry from ignoring things.

You may think fostering a culture of tattle tales is the best solution, but I say that instead of resurrecting this two-way schoolyard behavior in the absence of legal remedies, people ought to stand up for themselves and not throw their hands up calling for help from an authority figure. After all, this person of authority often proves just as helpless to solve the problem. In the spirit of taking matters into one's own hands, there ought to be a mechanism that elevates these relevant parties to the same platform as their detractors, so that they may correct or even preempt misperceptions and clarify the record.

As I have noticed that nearly all of the thread removal requests I've received over the years have been over concerns of Google searchability, I believe that a simple enhancement to search engines could relieve individual web operators of the decision-making burden (and potential abuse of power) in such matters. You should bear in mind, many web operators are free to be far less cooperative than I have been, and in recognition of this, there should be a way to circumvent them without penalizing them or "getting nasty."

As far as the "boosted traffic" argument you keep raising, the impact on traffic from one thread is negligible, and it's just a poor argument all around (I won't go so far as to suggest it's flame). It simply isn't based on any evidence. I have trouble imagining a forum owner encouraging someone to respond to a detractor in the hopes of capturing whatever marginal additional traffic, if any, might be gained from a few more posts. Plus, many, if not most, forums are run non-commercially.
4.18.2007 11:31am
_Publius_:
Jarret said: In the three years I've run AutoAdmit, until the past month, I've never revealed any policy on storing IP records

That is a selective use of facts. Anthony Ciolli, in response to the Leiter controversy, made clear that the site collects IP data only during "incidents of spamming" (not an exact quote). Such a statement is heavily cited within the community. If that is wrong, then the entire community has misinterpreted the words of Anthony Ciolli, but you can't say that no AutoAdmit administrator has stated a policy.

Jarret said: But what I have witnessed is the frequent (and similarly schoolyard) desire to run to the "teacher" (authority figure) and get them to punish the bully. That's a far cry from ignoring things

I'm a little disgusted that the response to libel is to compare the victims to schoolchildren who should just stand up and pick a fight with the libeler. There's a reason why in offline libel we don't think the solution is to "confront" the accuser. It's simply not possible in libel to undo the harm through more speech. Nor is it fair to impose a unilateral burden on a victim to seek out and respond to every possible site which may or may not contain libels about them.

Further, how does one respond to some types of libel? "No, I don't have herpes" or "No, that's not my LSAT" is not very convincing (to use examples from your site).

How does one respond to intrusions into your private life? If I were to find some embarrassing but true private fact about your life or sexual history and post it to this website, that would be an invasion of your privacy. There is nothing that you could do to respond to that in order to take back the damage. That's why the law recognizes a cause of action for intrusion upon seclusion.

And, as other posts on this site have suggested, vigilante justice is rarely justice at all.


Jarret said: in the case of AutoAdmit, it's impossible to post pictures

Technicality -- in that case, "to post a link to an Imageshack-type web host exclusively from AutoAdmit." The fact that the physical file does not reside on AutoAdmit.com in no way lessens the community's responsibility. The community posted the pictures with the intent that AutoAdmit readers could view the files, and created a link from AutoAdmit directly to the files.

Jarret said: if you post your picture on a website that gets millions of hits, e.g. MySpace or Facebook, don't be surprised when somebody you don't know looks at it, links to it (whether it was the original or a copy on a separate, unaffiliated website, run by an unaffiliated third-party), and wants to talk about it on a forum

I don't know where all of the images in the AutoAdmit Top 14 contest came from. But at least one of the victims (in one of the nice emails that got posted and openly mocked) claimed that the photos were taken from a friends-only Facebook account. Friends-only implies a level of expectation of privacy that does not extend to republishing, let alone the types of commentary that happened.

Further, prior publication has never been a defense to copyright infringement.

Looking beyond that, merely having a friends-only Facebook profile does not open one's entire private life to examination (including alleged sexual history, alleged sexual preferences, alleged STD status, alleged inter-personal relations, alleged family matters, etc) the way that happened on your site. It is still an intrusion on seclusion, a recognized tort. Even if you don't concede the tort, it's simply bad manners.

Jarret said: many web operators are free to be far less cooperative than I have been

Please announce to this website when you have deleted the threads about the Top 14 contest, deleted all of the victims' names, and/or handed over whatever IP logs you do keep to the victims. Maybe there has been more going on in the background, but it seems that you have been anything but helpful in this case.

Jarret said: The day that people lose the general right to talk about others, or even to ogle over their womanly features if they so desire, will be a very sad day.

Wow. In the end, this is about the right to "ogle over their womanly features"? I'm very happy to give up such a "right", if it ever existed, in order to enforce existing law regarding copyright, invasion of privacy, and libel.
4.18.2007 3:30pm
lucia (mail) (www):
Jarret said:
I don't think libel was the subject Orin raised, as a matter of fact.

If Orin isn't discussing libel, why would he post a quote containing the term "defamatory"?

I'm not a lawyer, but it's pretty clear Orin isn't discussing methods to handle perfectly legal but simply pesky behavior and speech. He is discussing behavior and speech which might be subject to either civil or criminal penalties but which is done under a shield of anonymity which is difficult to breach given current laws.

So, yes, I think you are changing the subject by wishing to focus on childish anti-social behavior rather than defamatory or stalking-type behavior.

You should bear in mind, many web operators are free to be far less cooperative than I have been,


You are bringing up what you have done and your experiences and possibly being a bit defensive about it? Until I read this thread, I'm afraid I was entirely unfamiliar with your forum. So, for my part, my comments have applied to forums generally. I will take your word for it that you are more co-operative.

However, with regard to your complaint about tattle tales: I think it's entirely appropriate for people who are abused by your forum visitors to try to get you to intervene. Telling them to jump in and fight it out is making a suggestion that is in your interest and not theirs.

If you do want to use the school yard analogy, I think if a kid is being assaulted or beaten up unfairly on a playground, it is appropriate for teachers to be informed so the bullying can be stopped.

Maybe you don't; maybe you think it would be better to let the teachers eat their lunches in peace and quiet? But then I suspect you never ran to the teacher in 6th grade while two boys were literally ripping the pants off one of the girls? I did. (By the time the teacher arrived, the girl had no pants on.)

So, if libelous material appears on a forum, and there is currently no way to have it removed, I see no reason why people shouldn't ask you, the forum administrator, to remove it. Who else should they ask?

As to my experience with my tiny forum and blog: I've had one or two requests to remove comments from my knitting sites, and I have always complied. You know what? It's easy.

I believe that a simple enhancement to search engines could relieve individual web operators of the decision-making burden (and potential abuse of power) in such matters.


What decision-making burden? What potential abuse of power?

Frankly, there is no big "decision making burden". And providing a forum to cloak the identity of those who wish to defame others and later refusing to intervene is already an abuse of power. Suggesting that it's up to Google to help people out is silly: the forum owner provides the forum and has certain power over it. They shouldn't insist that the responsibility to avoid harm resulting from behavior on their forum belongs to search engines like Google.

Every forum owner already makes the decisions they want to make. Some forum owners let people publish anonymously and never delete anything. Others don't. Any forum owner can remove any threads they want to remove. They can forbid anonymity and only allow people with traceable email addresses to post. They can publish IP addresses with each comment. If there is a feeding frenzy, they can write a script to block certain people's names from being posted. If necessary, they can close down forums. It's all their choice; not Google's.

As far as the "boosted traffic" argument you keep raising, the impact on traffic from one thread is negligible, and it's just a poor argument all around (I won't go so far as to suggest it's flame).


Nice way to "not" suggest something. :)

First: What's your point? So what if each individual thread only brings in a little traffic? Who said any individual thread creates most of your traffic?

Forum traffic is generated by the sum total over all possible streams. One of those is searches over the entire collection of threads. Forum owners who want traffic (for any reason) are motivated to keep every single thread in place. They are motivated to encourage threads. If they are shielded from liability, they may feel it's in their interest to keep stuff in place no matter how cruel, defamatory or nasty the threads may be.

And no, I don't think I need to hack into any individual forum owners' or blog owners' site statistics to provide concrete "evidence" that google searches result in traffic coming to a forum. If the searches didn't result in traffic, Orin wouldn't be suggesting we could eliminate this traffic by modifying the robots.txt file!
4.18.2007 4:52pm
Jarret Cohen (mail) (www):
lucia:

Orin started off the page referring broadly to "the role of law, policy, and extralegal tactics in regulating instances of cyber bullying, including defamatory 'Google bombing.'"

Cyber bullying is a broad, non-legal term, but I noticed you narrowed the topic to libel.

Then, I pointed out that often what is the subject of complaints and controversy is not libelous, if tortious at all. Right off the bat, I explained my concern about throwing around such terms:

I wrote, "This is not to say that there is a place for libel anywhere, but hurtful speech often isn't libelous." I repeated, "... What I see as the major problem is that most of what is called "libel" is not libel at all, but rather the kind of childish mockery that exists on the schoolyard playground."

I have also reiterated my stance on actual tortious or criminal behavior:

"As for situations involving actual libel, a legal remedy already exists: one can pursue a defamation lawsuit against the actual author of the libelous statement." I later repeated, "... As I've said many times, there should be and already exist legal remedies against actual torts or criminal behavior, regardless of the medium."

I understand that you're not a lawyer - neither am I - and I also understand that you may not be terribly familiar with this situation, since you hadn't heard about what's transpired with my forum. But since you aren't, I am trying to explain to you that you're stepping into an on-going discussion that reaches far deeper than simple definitions of libel and defamation. The problem, as I feel Orin appropriately identified it, is bullying - bruised egos and feelings getting hurt, aside from noses getting broken - and it's so much broader than the law. And lawsuits against everyone who says something objectionable to anyone on the Internet are not the panacea. This is why I have referred to it as "a social problem more so than a legal one."

I have to ask you, as I proceed, to adjust your frame of mind to one of a "social doctor", trying to diagnose a "social ill" and devise a cure.

You wrote:


If you do want to use the school yard analogy, I think if a kid is being assaulted or beaten up unfairly on a playground, it is appropriate for teachers to be informed so the bullying can be stopped.

  Maybe you don't; maybe you think it would be better to let the teachers eat their lunches in peace and quiet? But then I suspect you never ran to the teacher in 6th grade while two boys were literally ripping the pants off one of the girls? I did. (By the time the teacher arrived, the girl had no pants on.)


But it's loaded comments like these that really make me wonder how genuinely you want to find an answer to the problem, as opposed to just trying to make me look unsympathetic. Certainly, and I say once again, for the millionth time, there should be and is recourse against the bully when he breaks a kid's nose or pulls down a kid's pants. But what about the less egregious, often legal, but still personally hurtful acts of bullying, which I believe make up the majority of the social (not simply legal) problem I describe?

Have you ever noticed how siblings will tease each other endlessly, never coming close to inflicting physical harm, and endlessly tattle to Mom or Dad to spite one another? Have you noticed how, at some point, the exhausted parents begin to ignore the squabbling children's tattles, even when something truly wrong might have occurred? Well, it turns out adults behave not much differently than children. In my role as webmaster, I often feel thrust into the position of the exhausted and fed up parent, the recipient of endless tattles on comments that no one would ever argue were tortious or criminal. The reality is that most cases of "cyber bullying" are actually more akin to this analogy than your recollection of the pantless girl in elementary school.

My concern is that without solid, extralegal remedies, more people resort to frivolous litigation, or are more likely simply left out in the cold, never having achieved a resolution.

My concern is in actually fixing the problem, not assigning blame to particular web providers or expecting every person running a website to abide by social regulations that would somehow please everyone in the world. Put simply, such measures would fail. First, on the Internet, what disappears somewhere reappears elsewhere - like, for example, the Isle of Man, or somewhere else beyond American jurisdiction. Second, any imposition of acceptability standards on web content could create a slippery slope that could potentially compromise the entire nature of the Internet.

You seem very convinced that webmasters have an interest in keeping hurtful content on their websites in order to boost their traffic. Let's assume for a moment that this is true (even though, personally, I don't believe it boosts my traffic). If so, then it only goes to illustrate a conflict of interest between webmaster and the public that places those who complain in an innate position of weakness. My repeated stance is that while litigation is surely not the answer to every stubborn webmaster, webmasters who have the freedom to be stubborn are not the answer to every complaint.

You don't seem to appreciate the difficulty of sifting through and trying to make constant judgments as to what stays and what goes on a forum. The inevitable result of this process is that someone is going to accuse you of "selective moderation." In other words, there will always be people who disagree with the decisions you make. For this reason, I describe it as burdensome, which apparently you don't buy. But with all due respect, you said you run tiny sites on knitting. I would hardly compare the one or two requests you say you've received to the flood of them I receive every day for running a forum with 8,000,000 posts, where the topic of discussion isn't knitting. It does become quite a burden - especially when you're a non-commercial website like me. But I still try my best, striving for consistency.

I continue to recognize that feelings are being hurt all over the Internet, and I would like to develop an enlightened solution that eliminates the vagaries of the webmaster from the equation as much as possible, for all the reasons I've tried to express. I believe that if it can be justified that a website is not liable for the words of its posters - or, maybe in some sense, that it does not "own" their words - then it can be justified that one's last resort in dealing with the words of others need not be the website on which they appear. Google supersedes the individual webmaster, and its system, as well as those of other search providers, could easily allow for a mechanism to prominently attach personal statements to legitimately relevant listings. While this would not eliminate bullying from the realm of the Internet, it could go a long way to mitigate it if executed properly. If anything, it would present an option for the majority of people who find themselves without legal recourse against a "bully."
4.19.2007 12:15am
lucia (mail) (www):
Jarret:
Thanks for clarifying why you are taking this all so personally. If I now understand you correctly: your forum provides a nearly impenetrable cloak of invisibility that makes it easy for even the technologically incompetent to shield their identities and indulge in behaviors that range from simply anti-social to possibly criminal without any fear of liability or just harm to their own reputation. They have done so. Consequently, you believe that Orin's concern is not entirely general, but stems from behaviors that occurred specifically at your site.

Let me assure you: the problem is much more widespread than your little site. As I noted before: I was unaware that this was personal to you. My points are not "loaded" nor are they attempts to make you appear anything at all. My opinions were not based on the specifics of one single story of something that happened at your site; they still aren't. Your making it personal and explaining your personal "difficulties" moderating is only convincing me that what I thought in general is true:

Forum owners who don't rein in immoderate behavior just don't want to.

As to whether or not my comments are trying to address solutions to the social ills that can arise from behaviors at forums or just make you look bad: I am trying to discuss the social ills and solutions in general. Naturally, I am also responding to the arguments and examples you set forth to explain why forum owners in general, and you in particular, should not be expected to expend energy to curb anti-social, tortuous or criminal behavior on the part of their site visitors.

One of your arguments seemed to be to explain that the behavior is not criminal and somehow unworthy of our attention by using a combination of analogy and name-calling. Your specific analogy was to compare a forum to a school-yard and call those who ask authorities to intervene "tattle tales".

How is one to address your argument by combined name-calling and school-yard play analogy other than by describing the sorts of bullying that happen on playgrounds and the actions you are likening to tattling? The fact that if we use your analogy to justify forum owners' refusal to intervene, we are similarly justifying teachers not defending children under attack is not a personal attack on you. It is simply the position you place yourself in if we continue your analogy.

You can admit your analogy doesn't work directly or indirectly. (You chose indirectly and whining.) Anyway, now that your first analogy seems to have broken down, you now want to change analogies and liken yourself, the forum owner, to tired parents who are too exhausted to deal with their kids whining.

Well, exhausted parents have everyone's sympathy.

But, let's face it: just drop this analogy. No forum owner is required to run a forum at all. If dealing with visitors is too much for any forum owner -- including you -- they can get help or close the forum down.

And second, in the long run, like tired parents, if forum owners make systemic changes, they can reduce the burdens of dealing with what you call tattling and whining on the part of the childish visitors to your site. So, let's skip the analogy of what parents can do about their kids and go straight to what forum owners can do about the situation we are concerned with, which is:

How can forum owners curb anti-social (or tortuous or or criminal) comments behavior we witness on forums that permit anonymous comments and which later lead to those who are maligned to "whine" about the comments.

Well, why don't we start by attacking the root of the problem? How to reduce the anti-social (or tortuous or or criminal) comments and behavior by anonymous forum visitors. Well, it's highly likely that if many commenters knew their identities could be traced, they would self-moderate. (What young single guy wants to find links to his troll-like posts on dontdatehimgirl.com?)

So, if a forum owner sincerely wishes to curb this obnoxious behavior, they should at least try to make it difficult for visitors to conceal their identities. And there are simple steps that make it much, much more difficult for visitors to be anonymous. Any forum owner can detect, record and even publish the IP address with every comment. Any forum owner could require a real, traceable email address before permitting visitors to comment, and they could log that information and the IP number used during registration with every registrant. Any forum owner can put in place a list of spam words that block racial slurs or sexually charged language. Forum owners can do these things if they wish to do them. In fact, most free forum software does most of these things by default, partly because many of these steps are necessary for spam control!
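
As a rough sketch of what I mean (hypothetical code, not drawn from any particular forum package), the logging and word-filtering steps amount to something like this:

    import datetime

    BLOCKED_WORDS = {"slur1", "slur2"}  # placeholder list of blocked terms
    comment_log = []                    # stand-in for a real database table

    def accept_comment(text, poster_email, poster_ip):
        # Reject the comment outright if it contains a blocked word.
        if any(word in text.lower() for word in BLOCKED_WORDS):
            return False  # or hold it for moderation instead
        # Otherwise keep it, along with the poster's IP and registration email,
        # so the author can be traced if the post turns out to be actionable.
        comment_log.append({
            "time": datetime.datetime.utcnow().isoformat(),
            "ip": poster_ip,
            "email": poster_email,
            "text": text,
        })
        return True

A real forum would tie the email address to a registration step and keep the log on disk, but the point is that none of this is hard.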

These steps can greatly reduce the amount of requested and/or required moderation. And in my opinion, any forum owner who witnesses bouts of simply anti-social or criminal behavior on his blog and does not implement the widely available forum scripts, and then tries to liken himself to an exhausted parent, gets no sympathy from me!

So to this:

You don't seem to appreciate the difficulty of sifting through and trying to make constant judgments as to what stays and what goes on a forum.

I am saying flat out that you are drastically overstating the difficulty.
4.19.2007 11:45am
lucia (mail) (www):
Yesterday's pesky behavior at AutoAdmit included a death threat. Acting on the advice of the FBI, Dean Nell Newton of the University of California, Hastings College of Law, cancelled classes and evacuated the building.
4.19.2007 7:46pm
Jarret Cohen (mail) (www):
lucia:

You wrote:


Consequently, you believe that Orin's concern is not entirely general, but stems from behaviors that occurred specifically at your site.


Now I'm going to paste the first sentence of Orin's article:


The Yale Law Journal Pocket Part recently posted an interesting Call for Papers in response to the recent news stories about the XO board and anonymous Internet speech:


Assuming you can read, that pretty much spells out why I "believe" my website (XO) is relevant to the discussion.

~

As for the rest of your response, for the most part I couldn't really understand what you were trying to say, even though I tried very hard. I've tried to summarize what I did gather from your post:

You believe that your personal perception of reality, and specifically, what is "immoderate" and "anti-social," is a sound foundation to base an argument on.

You want webmasters to censor words and comments that go against your personal standards of decency and acceptability.

You believe that collecting IP addresses and reducing anonymity will help solve the problem, presumably because you believe it would enable litigation and discourage extreme comments. You also ignore the problem presented by proxies, and the fact that the nastiest people know well how to use them.

Yet:

You define the root of the problem as being "anti-social (or tortuous or or criminal) comments and behavior." Repeatedly, you differentiate between anti-social behavior and "tortuous" [sic] or criminal behavior. This tells me you clearly recognize that the problem is broader than questions of legality, which is what I emphasized. But unfortunately, that being the case, you come up short on a solution. Because collecting IP addresses in order to facilitate litigation will not help all those who are offended by behavior that is considered anti-social, but legal. Nor will it help those who are being bothered by people using proxies, or foreign ISPs, which would only become more popular if strict IP collection standards were imposed.

And collecting IP addresses wouldn't prevent someone like me from leveling perfectly legal insults at you on a message board, because, for example, I think you're just being irrational. The real world is full of bullying, mockery, insults, and criticism against which there is little recourse, so I can't really buy into your theory that reduced anonymity would significantly reduce "cyber bullying." Not to mention, wouldn't you rather be on the receiving end of insults as an anonymous person, if at all possible? It's a two-way street, unless, as I suggested before, you're out to destroy one of the fundamental features of the Internet.

Since you've established that you mean to deal with anti-social behavior just as much as illegal activity, I can confidently say your solution isn't a solution at all. Your "solution" is a criticism of webmasters who you don't feel are running their sites responsibly, but it offers no creative answer to the problem. You want to live in a dreamworld where everyone who manages a forum ought to behave the way you want them to behave. The bottom line is: whether or not they are acting ethically by your standards, by society's, or by God's, as a general solution it is not just to leave individual webmasters the ultimate *power* to decide what's said - and what isn't said - about private people. I don't think it's a fair situation for private citizens, and I believe the enlightened solution is to bypass the webmaster and tackle the true source of 99% of concerns: search engine listings.

You, on the other hand, propose no solution that acknowledges the reality of webmasters who don't need to cooperate, leaving these people in the cold. Your "solution" is to insinuate that all of these webmasters are being unethical and taking comfort in it, and to make the laughable suggestion that they should shut down if they can't moderate their forums the way you want them to.


Two final side-notes:

1) Your experience running a knitting blog is not comparable to moderating literally 8,000,000 posts on a college and law message board full of crazy, young people. If that's "little", there must be a lot of people making sweaters over at your place.

2) You brought up "dontdatehimgirl.com", a site designed solely for the purpose of humiliating men, as part of your scenario where a male "troll's" identity is revealed on the Internet, presumably earning him the shame of being featured on that site. I thought it was both odd and telling that when you brought up a website whose stated mission is to defame men out of being able to attract dates, rather than pointing out how the site is used anonymously by women solely to hurt men's reputations, you focused instead on a hypothetical male Internet user whose written comments, you imply, warrant public humiliation. Comments like these give me a pretty good idea of what you're about.
4.19.2007 9:57pm
lucia (mail) (www):
Jarrett:

You seem to be getting emotional and making a whole bunch of scattered claims about what I may think rather than discussing what I actually suggest in response to your repeated laments that you are overworked moderating your pesky, childish forum visitors. Many of your guesses are wildly off the mark of what I think should be permitted at forums in general. (Some statements are just odd. Where in the heck does dontdatehimgirl.com state their mission is to defame men?)


Anyway, to touch on the portions of your commentary worth even considering: Contrary to your current suggestion, I am primarily concerned that those who are libeled, stalked or threatened with death are deprived of normal legal recourse by forum owners who would provide a cloak of anonymity to their forum visitors. I am not primarily concerned with boorish behavior.

You criticise my suggestion that forum owners log IP addresses because you think it will only address tortuous or criminal behavior but won't address the simply boorish, pesky or anti-social.

Well, guess what? I'd be happy with a solution that only solves the problem of tortuous or criminal behavior! Yep, my "solution" -- which I admit is imperfect -- is that we should link provider immunity for tortuous or criminal behavior on the part of their site visitors to the following two actions:

a) forum owners must log IP addresses for each comment, and
b) forum owners must require registration with a valid email address.

If a forum owner wants immunity, they should be required to do both of these things. (Needless to say, if no tortuous or criminal behavior occurs, they are in no danger. If tortuous or criminal behavior occurs, they can hand over records.)

Is it a perfect solution? No. Could motivated visitors take steps to cloak themselves? Yes. But at least the forum owner's immunity is contingent on forum owners not actually providing the anonymity themselves.

That would be a big improvement over the current system, where forum owners are immune even if they take active steps to provide anonymity. And it's a heck of a lot better than your "solution" of shielding forum owners from all liability and forcing search engines who don't run the forums to come up with a solution!
4.20.2007 12:50am