Law Column: Keep an eye on anonymous comments

At the end of last week, a Sheffield United footballer was sentenced to five years in jail for raping a young woman.  Shortly afterwards, a number of comments were made on Twitter, insulting the woman in question and, in some cases, naming her, or speculating as to her identity.

Understandably, there has been widespread outrage about this activity.  For instance, the charity End Violence Against Women issued a statement on their website:

“It is profoundly disturbing that the rape victim in the…trial has been named and abused on Twitter and other social media sites.  It has long been law that rape complainants are protected by lifetime anonymity and those who have named her have been reported to the police for committing a criminal offence.  This raises serious questions about the adequacy of the criminal justice system to deal with offences that occur online and we are calling for an urgent review of laws and practices.”

You can absolutely understand the sentiment: it is imperative that victims of offences, and of sexual offences in particular, have confidence in the criminal justice system, and its ability to protect them.  It would be nothing short of a tragedy if a victim of an offence did not come forward for fear of disclosure of her identity.

For that reason, we expect the authorities to take swift action against anyone who posted a comment naming the victim in question.

Traditional publishers will be well aware of the lifetime anonymity afforded to victims of sexual offences, and would never carry any sort of comments speculating as to the identity of victims.  Straightforward identification is therefore rarely a risk for publishers.  The more common risk for those publishers is “jigsaw identification”: including a number of small pieces of information which, when pieced together, may allow someone to be identified.

However, the use of Twitter in this case makes it sensible to revisit the steps that can be taken to guard against any risk of identification of victims through user-generated comments (“UGC”).

It is an unfortunate reality that whether through a lack of understanding of the law, or a disregard for it, some users of social media believe that they are free to say whatever they want, whenever they want, without consequence, as evidenced by the events of the last week.  Where a publisher operates a comments facility, it is important to be in a position to react quickly where that facility is abused.

As you would imagine, a publisher can have a defence when someone else posts unlawful comments on its website.  Regulation 19 of the Electronic Commerce (EC Directive) Regulations 2002 provides that defence, as long as the publisher:

  • does not control the person making the post;
  • does not have “actual knowledge” of the unlawful activity; and
  • upon obtaining such knowledge or awareness, “acts expeditiously to remove or to disable access to the information”.

Some publishers will guard against any risk by disabling UGC for sensitive crime stories, and this is the classic sort of story where that is a wise precaution.

However, we have often seen that determined users (and internet trolls) will post comments about one story in the comments facility of another, whether broadly related or completely unrelated.  This sensible precaution therefore cannot completely guard against abuse of the comments facility.

Most publishers will not actively moderate the UGC on their sites, but will operate a notice and takedown procedure.  Those notice and takedown procedures rely on the fact that the regulation 19 defence applies if a publisher acts “expeditiously” on receiving notice of unlawful activity.  What is “expeditious” (i.e. fast enough) depends on the circumstances.

However, if a publisher is on notice of unlawful comments and does not move fast enough, the regulation 19 defence can be lost, and the publisher can itself become liable for the comments.

The encouraging thing to come out of these events is that the Twitter community was self-policing to some extent, with users reporting the abusive comments to the authorities and pointing out their illegality.

However, anonymity is a fragile thing: once it has been breached, it cannot be restored.  The sort of situation that arose last week is therefore exactly the situation in which very fast action is needed.  Those events show why anyone operating a UGC facility must ensure that there is a robust system in place for acting on reports of abuse of that facility.

One comment

  • May 2, 2012 at 2:12 pm

    The papers I edit have a free and open noticeboard and we rely on Reg 19. I was served with a PCC complaint on the basis that we did not move quickly enough to remove a comment posted on a Sunday until the following Tuesday. There were circumstances around this, but the PCC ruled that we had acted in a reasonable amount of time and we were cleared.
