11.100 Internet intermediaries should not be liable under the tort for invasions of privacy committed by third parties using their services, where they have no knowledge of the invasion of privacy. Where they do have knowledge, there does not seem to be any justification to provide a complete exemption from liability. The ALRC therefore sees no need to recommend the enactment of a ‘safe harbour’ scheme, to protect internet intermediaries from liability under the tort.
11.101 There are two reasons why intermediaries are unlikely to be liable under this tort. First, the tort is targeted at positive conduct on the part of the defendant. It is not intended to impose liability for mere omissions—that is, failing to act to stop an invasion of privacy by a third party—and it is difficult to characterise a failure to act as an ‘invasion’ of privacy. Second, the tort is confined to intentional or reckless invasions of privacy.
11.102 A mere intermediary will rarely have this level of intent when third parties use its service to invade someone’s privacy. The operators of a social networking platform, for example, do not intend to invade someone’s privacy when one of the platform’s customers posts private information about another person on the platform.
11.103 In some circumstances, an intermediary may be found to have the requisite fault after they have been given notice of an invasion of privacy. They may be found to have intended an invasion of privacy, or been reckless, if they know that their service has been used to invade someone’s privacy, and they are reasonably able to stop the invasion of privacy, but they choose not to do so.
11.104 For these two reasons, the ALRC does not think it necessary to recommend that safe harbour schemes for internet intermediaries be extended to protect intermediaries from liability under the new tort.
11.105 However, if such a scheme were necessary, amending cl 91 of sch 5 of the Broadcasting Services Act 1992 (Cth) may be one way of protecting intermediaries from liability under the tort. Clause 91 does not currently refer to laws under Commonwealth statutes. It provides that any law of a state or territory, or a rule of common law or equity, has no effect to the extent to which it subjects an internet content host to liability in respect of hosting particular internet content.
11.106 In copyright law, s 116AG of the Copyright Act 1968 (Cth) limits the remedies a court may grant against carriage service providers for infringements of copyright that relate to their carrying out certain online activities. In order to access this scheme, a carriage service provider must meet conditions in s 116AH.
11.107 In the Discussion Paper, the ALRC proposed the introduction of a safe harbour scheme to protect internet intermediaries from liability under the new tort for invasions of privacy for which a third party was primarily responsible. To rely on the defence, the intermediary might be required to meet certain conditions. The defence would not apply to invasions of privacy that intermediaries themselves intentionally or recklessly commit.
11.108 In the US, § 230 of the Communications Decency Act 1996 (US) contains a broad safe harbour scheme. The scheme exempts providers of ‘interactive computer services’ from civil liability under US state and federal law on account of
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1). 
11.109 The provision also imposes a notification obligation on interactive computer service providers:
A provider of an interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.
11.110 The EU safe harbour scheme provides that service providers are not under any ‘general obligation to monitor’ for illegal content. Service providers will not be liable for third party content where the internet intermediary had no ‘actual knowledge of illegal activity or information’ and, ‘upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information’.
11.111 In the UK, the Defamation Act 2013 (UK) provides a defence for ‘operators of websites’. It is a defence for the operator to show that it was not the operator who posted the statement on the website. An operator of a website is understood as a person with effective control over the content of a website who is not the author, editor or publisher of the matter. There are differing degrees of control depending on the form and size of a platform.
11.112 Section 5(12) provides that the act of merely ‘moderating’ a site is not, in and of itself, sufficient to defeat the defence.
11.113 The defence is defeated if the claimant shows that
(a) it was not possible for the claimant to identify the person who posted the statement,
(b) the claimant gave the operator a notice of complaint in relation to the statement, and
(c) the operator failed to respond to the notice of complaint in accordance with any provision contained in regulations.
11.115 This detailed defence is complemented by s 10, which provides that a court does not have jurisdiction to hear and determine an action for defamation brought against a person who was not the author, editor or publisher of the statement complained of, unless the court is satisfied that it is not reasonably practicable for an action to be brought against the author, editor or publisher.
11.116 If a safe harbour scheme were enacted, internet intermediaries should be required to comply with certain conditions to rely on the defence. Examples of such conditions might include requiring internet intermediaries to
remove, or take reasonable steps to remove, material that invades a person’s privacy, when given notice;
provide consumer privacy education or awareness functions, such as warnings about the risk of posting private information; and
comply with relevant industry codes and obligations under the Privacy Act 1988 (Cth).
11.117 Stakeholders suggested other conditions, including requiring internet intermediaries to
reasonably cooperate with and assist the relevant regulator with locating and pursuing a wrongdoer;
take action against individuals who are found liable for serious invasion of privacy, such as blocking their social media accounts;
block users who contravene the service’s terms and conditions from uploading future content; and
show warnings about the risks and potential consequences of posting private information.
11.118 While the ALRC considers a safe harbour scheme unnecessary, if such a defence were to be enacted, consideration should be given to these conditions.
The broad term ‘internet intermediary’ is commonly used to cover: carriage service providers, such as Telstra or Optus; content hosts, such as Google or Yahoo!; and search service and application service providers, such as Facebook, Flickr and YouTube: Peter Leonard, ‘Safe Harbors in Choppy Waters—Building a Sensible Approach to Liability of Internet Intermediaries in Australia’ (2010) 3 Journal of International Media and Entertainment Law 221, 226. There is also a comprehensive definition of ‘carriage service provider’ in the Telecommunications Act 1997 (Cth) s 87. For the purposes of that Act, the provision defines a ‘listed carriage service provider’ as someone who provides carriage service to the public using a network owned by one or more carriers. This definition excludes online search engines and online vendors, as they provide a platform for their customers but not for the public. This definition restricts the scope of the safe harbour scheme.
Defamation is not limited to positive conduct in this way. The Court of Appeal of England and Wales, in Byrne v Deane, held that the proprietors of a golf club could be liable as publishers of a defamatory poem that someone anonymously posted on the wall of the club. The club knew the defamatory poem was posted on the wall and could have taken it down, but did not: Byrne v Deane [1937] 1 KB 818. This principle has been applied to hold internet intermediaries liable as publishers in defamation where they have been given notice of defamatory matter present on their website, but fail to remove it within a reasonable time: Godfrey v Demon Internet Ltd [2001] QB 201; Tamiz v Google Inc [2013] 1 WLR 2151; Trkulja v Google Inc LLC [2012] VSC 533; Rana v Google Australia Pty Ltd [2013] FCA 60.
Byrne v Deane [1937] 1 KB 818.
Broadcasting Services Act 1992 (Cth) Sch 5 cl 91(a).
A safe harbour exemption was recommended by some stakeholders in response to the DPM&C’s 2011 Issues Paper: Peter Leonard and Michael Burnett, Submission No 77 to DPM&C Issues Paper, 2011.
M Lemley, ‘Rationalizing Internet Safe Harbors’ (2007) 6 Journal on Telecommunications and High Technology Law 101, 102.
Communications Decency Act 1996, Title V of the Telecommunications Act 1996, 47 USC.
Ibid s 230(c)–(d).
EU Directive on Electronic Commerce (2000/31/EC) art 15.
Ibid art 14.
Defamation Act 2013 (UK) s 5. This defence is defeated if the defendant showed malice: Ibid s 5(11). The Explanatory Notes to the Act explain that malice may arise in circumstances where the operator of a website had ‘incited the poster to make the posting or had otherwise colluded with the poster’: Explanatory Notes, Defamation Bill 2013 (UK).
Defamation Act 2013 (UK) s 5(2).
Ibid s 5(5).
Ibid s 5(6).
Office of the Australian Information Commissioner, Submission 90.
Women’s Legal Services NSW, Submission 115.
Domestic Violence Legal Service and North Australian Aboriginal Justice Agency, Submission 120.
S Higgins, Submission 82. Other stakeholders also recommended possible conditions: UNSW Cyberspace Law and Policy Community, Submission 98; Australian Sex Party, Submission 92; Interactive Games and Entertainment Association, Submission 86; I Turnbull, Submission 81.