12.39 This section outlines the existing methods employed to address RC content online. The Classification of Media Content Act should provide for similar methods of stopping the distribution of Prohibited content.
12.40 The ACMA is required to investigate complaints made about online content defined as ‘prohibited content’ under the Broadcasting Services Act. As has been explained, the definition of ‘prohibited content’ in the Broadcasting Services Act captures a wider range of content than RC—although RC content is certainly captured.[36] The ACMA may also choose to investigate a matter on its own initiative.[37]
12.41 The ACMA’s trained content assessors then investigate the complaint. The action that the ACMA must then take depends, among other things, on whether the content is hosted in Australia.
Take-down notices
12.42 Currently, if the ACMA assesses that content is substantially likely to be ‘potential prohibited content’ and the content is hosted by a ‘hosting service’,[38] or provided by way of a ‘live content service’,[39] or by a ‘links service’[40] with the appropriate Australian connection, then the ACMA must:
- issue an interim notice directing that certain steps be taken (broadly, that the content be taken down or removed); and
- apply to the Classification Board for classification of the content.[41]
12.43 The content must generally be taken down by 6 pm the next business day.[42] If the content is then classified RC, the ACMA issues a final take-down notice.[43] The requirements to comply with these interim and final take-down notices constitute ‘designated content/hosting service provider rules’,[44] so non-compliance may result in the commission of an offence[45] or the contravention of a civil penalty provision.[46]
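By way of illustration only, the deadline rule described above can be expressed as a short calculation. The sketch below assumes that business days are Monday to Friday and ignores public holidays; the statutory definition of ‘business day’ governs in practice.

```python
from datetime import datetime, time, timedelta

def takedown_deadline(notice_issued: datetime) -> datetime:
    """Illustrative only: the 6 pm next-business-day deadline.

    Assumes business days are Monday to Friday and ignores public
    holidays, which a real compliance system would have to consider."""
    day = notice_issued.date() + timedelta(days=1)
    while day.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        day += timedelta(days=1)
    return datetime.combine(day, time(18, 0))

# A notice issued on a Friday afternoon falls due the following Monday at 6 pm.
print(takedown_deadline(datetime(2011, 8, 19, 15, 30)))  # 2011-08-22 18:00:00
```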
12.44 The notice and take-down scheme has significantly reduced the amount of online child sexual abuse content hosted in Australia.[47] The ACMA reports that it has received ‘100% industry compliance’ with its actions to remove such content.[48]
12.45 However, as the Internet Industry Association (IIA) has explained, for ‘both technical and legal reasons, take-down notices can only apply in relation to content hosted in Australia’.[49]
Notifying law enforcement agencies
12.46 The ACMA has obligations in respect of ‘sufficiently serious’ online content, which has been the subject of complaint, regardless of whether the content is hosted in Australia or overseas. The ACMA considers the following online content ‘sufficiently serious’:
- ‘child abuse material’;
- content that advocates the doing of a terrorist act; and
- content that promotes or incites crime or violence.[50]
12.47 This content ‘mirrors’ some of the content currently within the scope of the RC classification category. Some of this content also falls within the ambit of offences in the Criminal Code (Cth), and so may be broadly understood as ‘illegal content’.
12.48 The ACMA is obliged to refer online content that it considers to be ‘sufficiently serious’ to a member of an Australian police force or, where there is an arrangement in place with the chief of an Australian police force that the ACMA may notify the content to another person or body, to that other person or body.[51]
12.49 There is a Memorandum of Understanding in place between the ACMA and Commonwealth, state and territory police forces to ensure the swift reporting of such content[52] and associated information sharing.[53]
12.50 The ACMA has an arrangement with the Australian Federal Police (AFP) under which online child abuse material hosted in a country that is a member of the International Association of Internet Hotlines (INHOPE) may be referred directly to INHOPE.[54] If the relevant jurisdiction is not an INHOPE member, the ACMA refers the content to enforcement agencies such as the AFP,[55] which in turn liaise with international law enforcement agencies such as INTERPOL.
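The referral pathways described above amount to a simple decision procedure, sketched below for illustration. The membership set and agency descriptions are placeholders, not a statement of the actual arrangements, which are governed by the memorandum of understanding and the ACMA–AFP arrangement.

```python
# Illustrative placeholder only: INHOPE membership changes over time.
INHOPE_MEMBERS = {"NL", "DE", "US"}

def refer_content(hosting_country: str) -> str:
    """Return the body to which 'sufficiently serious' content is referred."""
    if hosting_country == "AU":
        # Australian-hosted content is referred to an Australian police
        # force (the take-down scheme described earlier may also apply).
        return "Australian police force"
    if hosting_country in INHOPE_MEMBERS:
        return "INHOPE hotline in the hosting country"
    # Non-INHOPE jurisdictions: referral to the AFP, which liaises with
    # international agencies such as INTERPOL.
    return "AFP, for liaison with agencies such as INTERPOL"

print(refer_content("XX"))  # AFP, for liaison with agencies such as INTERPOL
```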
12.51 The ACMA refers online content that advocates the doing of a terrorist act to the AFP.[56]
Family friendly filters
12.52 If the ACMA is satisfied that content hosted outside Australia is prohibited content or potential prohibited content, as defined in the Broadcasting Services Act, the ACMA must, among other things,
notify the content to internet service providers so that the internet service providers can deal with the content in accordance with procedures specified in an industry code or industry standard.[57]
12.53 The ACMA notifies filter software makers or suppliers accredited by the IIA in accordance with the industry code in place under sch 5 of the Broadcasting Services Act.[58] For a product or service to be designated an ‘IIA Family Friendly Filter’, the IIA must be satisfied that it meets certain requirements.[59]
12.54 The ACMA informs the filter software providers of the URLs that are to be excluded or ‘blocked’. This list is known as the ‘ACMA blacklist’.[60] The makers or suppliers of the ‘Family Friendly’ filtering products or services have agreed to give effect to the ACMA’s notifications by updating their products or services. The ACMA regularly reviews the URLs on its blacklist, and provides filter providers with revised lists.
12.55 Australian-based ISPs then make these ‘Family Friendly’ filters available to their customers free of charge or on a cost-recovery basis.[61] Australian internet users may choose whether or not to use these filters.[62] Where a user has opted to use one of these filters, blocking occurs at the user’s end, on the user’s computer, rather than at the network level.
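A minimal sketch of this user-end mechanism follows: a local copy of the blocklist is consulted before each request is fetched. It is illustrative only and does not describe any actual Family Friendly Filter product; in particular, the use of URL hashes and the sample entries are assumptions, and real products use their own, often proprietary, list formats and matching logic.

```python
import hashlib

def url_hash(url: str) -> str:
    """Hash a URL; distributing hashes rather than plain URLs stops the
    list itself from serving as a directory of prohibited content."""
    return hashlib.sha256(url.encode("utf-8")).hexdigest()

# Hypothetical local copy of the blocklist, as shipped with the filter
# product and refreshed whenever a revised list is received.
blocklist = {url_hash("http://blocked.example/page")}

def is_blocked(url: str) -> bool:
    """Check a requested URL against the local list before fetching it."""
    return url_hash(url) in blocklist

print(is_blocked("http://blocked.example/page"))  # True: request is blocked
print(is_blocked("http://example.com/"))          # False: request proceeds
```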
12.56 Schedules 5 and 7 of the Broadcasting Services Act are silent about whether the ACMA may also notify ‘Family Friendly’ filter software makers and providers of URLs that overseas organisations, such as the Internet Watch Foundation, INTERPOL and the National Center for Missing and Exploited Children, have determined to contain child sexual abuse content. Such content may not have been the subject of complaint under the Broadcasting Services Act framework. These overseas organisations, and the criteria used to determine whether content should be included on their lists, are discussed in a later section of this chapter.
ISP-level filtering
12.57 The Australian Government has proposed a scheme for mandatory filtering of certain online content by ISPs. Voluntary filtering is also being undertaken by some Australian ISPs. A number of stakeholders commented on ISP-level filtering.
Mandatory filtering
12.58 In December 2009, the Australian Government announced that it planned to introduce legislative amendments to the Broadcasting Services Act to require all ISPs in Australia to filter or ‘block’ RC content hosted on overseas servers. The ‘RC Content List’ is to comprise:
- overseas-hosted online content which has been subject to complaint to the ACMA and which is being classified, or has been classified as RC, by the Classification Board using the classification scheme criteria; and
- international lists of overseas-hosted child sexual abuse material from ‘highly-reputable’ overseas agencies—following the ACMA’s detailed ‘assessment of the rigour and accountability of classification processes used by these agencies’.[63]
12.59 The scheme is intended to help reduce the risk of inadvertent exposure to RC content, particularly by children, and reduce the current inconsistency between the treatment of RC content that is hosted in Australia (which is subject to the notice and take-down scheme) and that hosted overseas.[64]
12.60 The Government announced nine measures to increase accountability and transparency in relation to the scheme.[65] These include measures to ensure that certain content is classified by the Classification Board before it is added to the ‘RC Content List’, and that aggrieved persons may seek review of these decisions.[66] It was also proposed that the ACMA would regularly publish an up-to-date, high-level breakdown of the list by category, and that an independent expert would undertake an annual review of the processes.[67]
12.61 An exemption is being considered for popular overseas sites with high traffic, such as YouTube, if the owners of the sites implement their own systems either to take down RC content or to block Australian access.[68]
12.62 A number of stakeholders expressed views on mandatory ISP-level filtering, with some supporting the policy,[69] and others expressing opposition.[70] Supporting mandatory filtering, the Communications Law Centre submitted that:
A list of all material that has been refused classification should be published, with broad category descriptors explaining why the media content has been refused classification (eg ‘sexual violence’). Such media content should be compulsorily filtered at the ISP level.[71]
12.63 The Australian Christian Lobby likewise said that, ‘despite the limitations and challenges of ISP filtering, there are a range of studies demonstrating that it would be an effective way of filtering Refused Classification material’.[72]
12.64 Among the reasons that were given for opposing mandatory ISP-level filtering were concerns about:
- there being very little child sexual abuse content on the web, because this content is more prevalent in peer-to-peer file sharing and virtual private networks, which will not be filtered;[73]
- the filter not being effective because it can be bypassed;[74]
- the potential cost of the scheme given these limitations;[75]
- the filter potentially giving households a false sense of protection;[76]
- the filter being counterproductive in terms of finding and prosecuting those distributing and/or accessing child sexual abuse content;[77]
- a government list of websites to be filtered being secret,[78] open to abuse[79] (including ‘scope creep’—more categories of content being added over time), and infringing freedom of speech;[80] and
- the potential for over-blocking (that is, content being filtered that should not be filtered, such as creative/artistic works and information).[81]
Identifying content to be blocked
12.65 If ISPs were required mandatorily to filter all Prohibited or RC content, it is likely that certain content would have to be prioritised—and perhaps only a subcategory of Prohibited content would in fact be filtered. For example, the ACMA has recently reported that, of the 1,957 items of prohibited or potentially prohibited content it identified in 2010–11, 1,054 items were determined to be offensive depictions of children, whereas only 68 items depicting a sexual fetish were determined to be RC content.[82]
12.66 In the Discussion Paper, the ALRC proposed that the Classification of Media Content Act should provide that, if content is classified RC, the classification decision should state whether the content comprises real depictions of actual child sexual abuse or actual sexual violence. This content, the ALRC stated, could then be added to any blacklist of content that must be filtered at the ISP level, should such a policy be implemented.[83]
12.67 Some submissions supported the proposal.[84] For example, Telstra stated that it would be a ‘feasible and practical’ approach to implement and could ‘usefully form one element of a multi-faceted approach to this issue’.[85] However, others expressed concern that this would narrow the scope of what must be filtered. The Australian Council on Children and the Media, for example, said that ‘any material that is judged to be RC should be on the blacklist’, and particularly noted material ‘that incites or instructs in matters of crime or violence (especially terrorism)’.[86] Similarly, Collective Shout submitted that the RC classification should be broadened to include ‘any depiction of actual sex’ and material that ‘promotes, encourages or instructs in methods of suicide’.[87]
12.68 In contrast, Civil Liberties Australia stated that:
If the ALRC were prepared to suggest that the only content that could not be contained in the other classification categories is real depictions of actual child sexual abuse or actual sexual violence, then that would be a very strong step forward.[88]
12.69 Some submissions queried the distinction between ‘actual’ abuse and simulations of abuse. For example, Amy Hightower argued that, while the definition of ‘child pornography material’ in the Criminal Code (Cth)
clearly captures abhorrent ‘real’ child sexual abuse material as intended, it also captures material which does not actually involve children at all, including cartoons, textual works or material where all involved parties are demonstrably over the age of eighteen. There is no legal distinction drawn between ‘real’ and ‘fictional’ abuse; to draw such a distinction would presumably require altering the Criminal Code.[89]
12.70 The Justice and International Mission Unit of the Uniting Church submitted that it would like the proposal to be broadened to include simulated depictions of actual child sexual abuse.[90] Others also called for a clear definition of ‘actual sexual violence’.[91]
12.71 Given the volume of Prohibited content on the internet, if ISPs were required mandatorily to filter Prohibited content, the Regulator might recommend that particular subcategories of Prohibited content be prioritised. The selection of such subcategories should be carefully assessed. The ALRC notes in particular the community concerns about actual child sexual abuse and non-consensual sexual violence. In defining such a subcategory, the Regulator might also have regard to the types of content that are now the focus of international efforts to curb the distribution of child abuse material. The subcategory of ‘sufficiently serious content’, discussed above, might also be useful for this purpose.
Voluntary filtering
12.72 In early July 2010, the Australian Government announced that some Australian ISPs had agreed voluntarily to block, at the ISP level, a list of child abuse URLs.[92] The IIA then announced that it would develop a voluntary industry code for ISPs to block ‘child pornography’ websites.[93] On 27 June 2011, the IIA released the framework that would underpin its voluntary code.[94] A key feature of the voluntary scheme is that it uses INTERPOL’s list rather than a list maintained by the ACMA or any other organisation. The criteria for inclusion in the INTERPOL list are stricter than the definition of child pornography material under Australian criminal legislation.[95]
12.73 To join the IIA’s voluntary code of practice, an ISP expresses interest in participation to the AFP and indicates that it has, or is preparing, the technical infrastructure needed to implement blocking of the list. The AFP then issues a ‘request’ to that ISP pursuant to s 313 of the Telecommunications Act 1997 (Cth). Among other things, this provision obliges ‘carriers’ and ‘carriage service providers’ to do their best to prevent relevant telecommunications networks and facilities from being used in, or in relation to, the commission of Commonwealth, state or territory offences, and to give officers and authorities of the Commonwealth and of the states and territories such help as is reasonably necessary for the purpose of enforcing the criminal law. Section 313(5) of the Telecommunications Act provides complying ISPs with a ‘safe harbour’ or ‘immunity’ from civil litigation for any ‘act done or omitted in good faith’ in performance of the duty imposed on them.
12.74 As of November 2011, the AFP had issued five s 313 requests to Australian ISPs,[96] suggesting that five Australian ISPs are voluntarily filtering the INTERPOL blocklist at the ISP level. The ISPs are not required to report their statistics, but Telstra reported more than 84,000 redirections via its network for the period 1 July–15 October 2011.[97]
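Published accounts indicate that blocking under this scheme operates at the domain (DNS) level, with queries for listed domains answered with the address of a ‘stop page’; counting those diverted answers is one way figures such as Telstra’s redirection statistics can be produced. The sketch below illustrates the approach only; the domain, stop-page address and counter are assumptions, not deployment details.

```python
# Simplified, illustrative sketch of DNS-level blocking with a stop page.
BLOCKED_DOMAINS = {"blocked.example"}  # stand-in for the INTERPOL list
STOP_PAGE_IP = "192.0.2.1"             # documentation-range placeholder
redirection_count = 0                  # one possible basis for reported figures

def resolve(domain: str, upstream_lookup) -> str:
    """Answer a DNS query, diverting listed domains to the stop page."""
    global redirection_count
    if domain in BLOCKED_DOMAINS:
        redirection_count += 1
        return STOP_PAGE_IP
    return upstream_lookup(domain)

# A query for a listed domain is answered with the stop-page address.
print(resolve("blocked.example", lambda d: "203.0.113.7"))  # 192.0.2.1
```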
[36] Broadcasting Services Act 1992 (Cth) sch 7 cls 20, 21.
[37] Ibid sch 5 cl 27; sch 7 cl 44.
[38] Defined in ibid sch 7 cl 4.
[39] Defined in ibid sch 7 cl 2.
[40] Defined in ibid sch 7 cl 2.
[41] Ibid sch 7 cls 47(2), 56(2), 62(2).
[42] Ibid sch 7 cls 53(1), 60(1), 68(1).
[43] Ibid sch 7 cls 47(1), 56(1), 62(1).
[44] Ibid sch 7 cls 53(6), 60(4), 68(6).
[45] Ibid sch 7 cl 106.
[46] Ibid sch 7 cl 107.
[47] W Wei, Online Child Sexual Abuse Content: The Development of a Comprehensive, Transferable International Internet Notice and Takedown System (2011), 81.
[48] Australian Communications and Media Authority, The ACMA Hotline—Combating Online Child Sexual Abuse <http://www.acma.gov.au/WEB/STANDARD/pc=PC_90103> at 23 August 2011.
[49] Internet Industry Association, Guide for Internet Users: Information about Online Content (Updated 2011), 8.
[50] Australian Communications and Media Authority, Regulating Online Content: The ACMA’s Role (2011), 3.
[51] Broadcasting Services Act 1992 (Cth) sch 5 cl 40(1)(a) (content hosted offshore); sch 7 cl 69(1) (Australian-hosted content).
[52] Australian Communications and Media Authority, Regulating Online Content: The ACMA’s Role (2011), 3.
[53] W Wei, Online Child Sexual Abuse Content: The Development of a Comprehensive, Transferable International Internet Notice and Takedown System (2011), 47.
[54] Australian Communications and Media Authority, Working Together to Fight Online Child Abuse Material <http://www.acma.gov.au/scripts/nc.dll?WEB/STANDARD/1001/pc=PC_90166> at 11 September 2011.
[55] Australian Communications and Media Authority, Regulating Online Content: The ACMA’s Role (2011), 3.
[56] Ibid, 3.
[57] Broadcasting Services Act 1992 (Cth) sch 5 cl 2(b).
[58] Ibid sch 5 cl 40.
[59] Internet Industry Association, Internet Industry Codes of Practice: Codes for Industry Co-regulation in Areas of Internet and Mobile Content (2005), 23.
[60] Department of Broadband, Communications and the Digital Economy, Mandatory Internet Service Provider (ISP) Filtering: Measures to Increase Accountability and Transparency for Refused Classification Material–Consultation Paper (2009), 3.
[61] Internet Industry Association, Internet Industry Codes of Practice: Codes for Industry Co-regulation in Areas of Internet and Mobile Content (2005), 21.
[62] Internet Industry Association, Guide for Internet Users: Information about Online Content (Updated 2011), 4.
[63] S Conroy (Minister for Broadband, Communications and the Digital Economy), ‘Measures to Improve Safety of the Internet for Families’ (Press Release, 15 December 2009).
[64] Department of Broadband, Communications and the Digital Economy, ISP Filtering—Frequently Asked Questions <www.dbcde.gov.au/funding_and_programs/cybersafety_plan/internet_service_provider_isp_filtering/isp_filtering_live_pilot/isp_filtering_-_frequently_asked_questions> at 16 February 2012.
[65] Department of Broadband, Communications and the Digital Economy, Outcome of Public Consultation on Measures to Increase Accountability and Transparency for Refused Classification Material (2010).
[66] Ibid, Measures 1 and 5.
[67] Ibid, Measures 4 and 7.
[68] Department of Broadband, Communications and the Digital Economy, ISP Filtering—Frequently Asked Questions <www.dbcde.gov.au/funding_and_programs/cybersafety_plan/internet_service_provider_isp_filtering/isp_filtering_live_pilot/isp_filtering_-_frequently_asked_questions> at 16 February 2012.
[69] Eg, FamilyVoice Australia, Submission CI 2509; Australian Christian Lobby, Submission CI 2500; Communications Law Centre, Submission CI 2484; Bravehearts Inc, Submission CI 1175.
[70] A Hightower, Submission CI 2511; I Graham, Submission CI 2507; Confidential, Submission CI 2503; Confidential, Submission CI 2496; Arts Law Centre of Australia, Submission CI 2490; National Association for the Visual Arts, Submission CI 2471; R Harvey, Submission CI 2467; D Mitchell, Submission CI 2461.
[71] Communications Law Centre, Submission CI 2484.
[72] Australian Christian Lobby, Submission CI 2500.
[73] Eg, L Mancell, Submission CI 2492; R Harvey, Submission CI 2467; Civil Liberties Australia, Submission CI 2466; D Mitchell, Submission CI 2461.
[74] Eg, Confidential, Submission CI 2503; A Ameri, Submission CI 2491; Civil Liberties Australia, Submission CI 2466; J Denham, Submission CI 2464; Electronic Frontier Foundation, Submission CI 1174.
[75] Eg, A Ameri, Submission CI 2491; D Mitchell, Submission CI 2461; Electronic Frontier Foundation, Submission CI 1174.
[76] Eg, L Mancell, Submission CI 2492; K Weatherall, Submission CI 2155.
[77] Eg, Confidential, Submission CI 2503.
[78] Eg, A Hightower, Submission CI 2511; I Graham, Submission CI 2507.
[79] Eg, Confidential, Submission CI 2503; Confidential, Submission CI 2496; R Harvey, Submission CI 2467; Civil Liberties Australia, Submission CI 2466; J Denham, Submission CI 2464.
[80] Eg, Lin, Submission CI 2476; Electronic Frontier Foundation, Submission CI 1174.
[81] Eg, Arts Law Centre of Australia, Submission CI 2490; National Association for the Visual Arts, Submission CI 2471; K Weatherall, Submission CI 2155.
[82] Australian Communications and Media Authority, Annual Report 2010–11 (2011), 112–113.
[83] Australian Law Reform Commission, National Classification Scheme Review, ALRC Discussion Paper 77 (2011), Proposal 10–1.
[84] Eg, Uniting Church in Australia, Submission CI 2504; Arts Law Centre of Australia, Submission CI 2490; Telstra, Submission CI 2469.
[85] Telstra, Submission CI 2469.
[86] Australian Council on Children and the Media, Submission CI 2495.
[87] Collective Shout, Submission CI 2477.
[88] Civil Liberties Australia, Submission CI 2466.
[89] A Hightower, Submission CI 2511.
[90] Uniting Church in Australia, Submission CI 2504.
[91] A Hightower, Submission CI 2511; L Bennett Moses, Submission CI 2468.
[92] S Conroy (Minister for Broadband, Communications and the Digital Economy), ‘Outcome of Consultations on Transparency and Accountability for ISP Filtering of RC Content’ (Press Release, 9 July 2010).
[93] Internet Industry Association, ‘IIA to Develop New ISP Code to Tackle Child Pornography’ (Press Release, 12 July 2010).
[94] Internet Industry Association, ‘Internet Industry Moves on Blocking Child Pornography’ (Press Release, 27 June 2011).
[95] Debates, Senate Standing Committee on Legal and Constitutional Affairs—Parliament of Australia, 2 November 2011 (Australian Federal Police answer to Question 25 on notice).
[96] Ibid.
[97] Debates, Senate Standing Committee on Legal and Constitutional Affairs Legislation Committee, 18 October 2011, 94 (N Gaughan).