Key themes in a ‘technology-aware’ framework

10.13 Professor Lawrence Lessig has described four modes of regulation in cyberspace, noting that these modes are reflected in ‘real space’:

  • law—which may include prohibitions and sanctions for online defamation and copyright infringement;

  • social norms—which may involve a user ensuring that the behaviour of their avatar conforms to community expectations in an online world such as Second Life or a social networking site such as Facebook;

  • markets—which regulate the price paid for access to the internet and access to information on the internet; and

  • architecture—which is the code, hardware or software that shapes the appearance of cyberspace.[19]

10.14 Cyberspace regulatory theorists disagree about the role that each of these modalities should play in Lessig’s analysis.[20] Lessig demonstrates, however, that regulation of the internet and other developing technologies must involve measures in addition to conventional law. Otherwise, regulation through law alone can be circumvented or undermined, for example, by the architecture of the internet.

10.15 As a starting point, the ALRC suggests that broadly drafted statutory principles could address the impact of developing technology. A regulatory framework, however, should also accommodate co-regulation between the OPC and agencies and organisations, and it should seek to empower individuals by providing them with the requisite knowledge of how to protect their privacy.

10.16 A technology-aware regulator plays a crucial role in dealing with the impact of technology on privacy. In this chapter, the ALRC recommends that the OPC should provide guidance that outlines how certain requirements in the model UPPs can be met by agencies and organisations that use particular technologies to handle personal information.[21]

10.17 Education is a further important feature of the regulator’s role. In this chapter, the ALRC recommends that the OPC should educate individuals about how privacy-enhancing technologies (PETs) can be used to protect privacy. In addition, education programs that focus on the deployment of technology in a privacy-enhancing way should be directed towards agencies and organisations that design and deploy new and developing technologies.[22]

10.18 In Chapter 47, the ALRC discusses the importance of proactive regulation. This is reflected in the recommendations to empower the OPC to conduct ‘Privacy Performance Assessments’ of organisations, and to direct agencies to conduct Privacy Impact Assessments (PIAs) for new projects and developments.[23] These recommendations are intended, in part, to promote the early implementation of specific PETs and the deployment of technology in a privacy-enhancing way.

Privacy-enhancing technologies

10.19 A number of stakeholders submitted that the ALRC consider the role that PETs could play in a regulatory framework.[24] The term ‘PETs’ is used in a number of different contexts. First, PETs can refer to particular technologies that form part of the architecture of technological systems used by agencies and organisations to deliver services.[25] Chapter 9 includes a discussion of these types of PETs, which may include mandatory access control devices or identity management systems. Secondly, individuals can use PETs to exercise control over the collection of their personal information.[26] Several of these types of PETs, including encryption and RFID signal blockers, are discussed in Chapter 9. Finally, the way that technology is used often determines whether its impact is privacy enhancing or privacy invasive.[27] A holistic approach to regulating technology would encourage agencies and organisations to develop and deploy all technologies in ways that enhance privacy, or at least to ensure that their impact is privacy neutral.
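By way of illustration only, the following minimal sketch shows the kind of mandatory access control check that might sit within the architecture of a service delivery system: a record can be read only by a user whose clearance dominates the record’s sensitivity label. The sketch is written in Python, and the labels, function names and data are hypothetical examples, not a design drawn from this Report.

```python
# Illustrative sketch only: a toy 'no read up' mandatory access control
# check. Labels and names are hypothetical; in practice, mandatory access
# control is enforced by the operating system or platform, not by
# application code alone.

CLEARANCE_LEVELS = {"public": 0, "internal": 1, "sensitive": 2}

def may_read(user_clearance: str, record_label: str) -> bool:
    """Permit a read only when the user's clearance dominates the record's label."""
    return CLEARANCE_LEVELS[user_clearance] >= CLEARANCE_LEVELS[record_label]

print(may_read("internal", "sensitive"))   # False: clearance too low
print(may_read("sensitive", "internal"))   # True: clearance dominates
```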

10.20 In May 2007, the European Commission issued a communication on PETs to the European Parliament and Council, noting that PETs were most effective when ‘applied according to a regulatory framework of enforceable data protection rules’.[28] In the ALRC’s view, PETs can promote enhanced security and trust and are, therefore, an essential component of the regulatory structure. Some PETs, however, can be physically unwieldy and costly to implement. Moreover, use of PETs may require a certain level of technological expertise. PETs alone cannot address the impact of technology on privacy and should complement, rather than replace, the legislative and regulatory structure outlined below.

10.21 Use of PETs by individuals—and education about PETs—can provide individuals with greater control over their personal information when using technologies such as the internet. The OPC submitted that:

Education and PET solutions together will be crucial for dealing with the international nature of the internet and for ensuring that individuals are able to exercise appropriate control of their personal information when its handling falls outside of the national jurisdiction of Australian privacy law.[29]

10.22 A national survey conducted in May 2007 found that Australians were more concerned about online privacy than about the threat of a terrorist attack.[30] PETs, therefore, could play a role in increasing consumer trust in online interactions. Two main types of PETs that may be deployed by individuals to protect their privacy online, encryption and identity management, are discussed in Chapter 9.
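By way of illustration only, the following sketch shows how an individual might use encryption as a PET, so that personal information is transmitted or stored only in encrypted form. The sketch assumes Python and the open-source cryptography package (installed with pip install cryptography); the data and workflow are hypothetical examples, not drawn from this Report.

```python
# Illustrative sketch only: symmetric encryption of personal information
# before it leaves the individual's device, using the 'cryptography'
# package's Fernet recipe. Key handling here is simplified for brevity.

from cryptography.fernet import Fernet

key = Fernet.generate_key()           # secret key retained by the individual
cipher = Fernet(key)

personal_info = b"name=Jane Citizen; dob=1980-01-01"   # hypothetical data
token = cipher.encrypt(personal_info)  # only this ciphertext is sent or stored

# Only the holder of the key can recover the plaintext.
assert cipher.decrypt(token) == personal_info
```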

10.23 Promoting mechanisms that enhance individual control over personal information is one way to deal with the protection of individual privacy in light of technological developments. Emphasising only the responsibility of individuals to protect their information privacy is undesirable. It places a ‘premium’ on the individual

having sufficient interest in protection and the ‘cultural capital’—the ability and the means to comprehend what is happening … to read obscure fine print on the web, and to assert herself in controlling inroads or seeking redress once these threats have been realised.[31]

International engagement

10.24 While this chapter focuses on domestic regulation of developing technology, the ALRC notes the jurisdictional issues presented by developing technologies such as the internet. Some of these issues are discussed in Chapter 31.

10.25 In 2006, the United Kingdom Information Commissioner published a report noting that an effective regulator needs to stay ‘abreast of, and knowledgeable about, new technologies and systems’.[32] Noting the resource implications, the Commissioner suggested that

it is advantageous … to develop a pooled technological knowledge-and-awareness capability, as may be occurring, for instance, at the level of the [European Union], through the Article 29 Working Party and other networks and channels in which many national and sub-national regulators participate.[33]

10.26 The OPC expressed its support for ‘Australia’s involvement in international forums to coordinate data protection schemes’.[34] The Australian Communications and Media Authority (ACMA) suggested that it would be appropriate for both the Australian Government and industry ‘to participate in relevant international fora (including technical standardisation discussions) dealing with privacy issues’.[35]

10.27 The global nature of technology development and deployment requires industry, the OPC, and the Australian Government to coordinate and engage with others in the international arena. International engagement would also assist the OPC to develop technology-specific guidance on the application of the model UPPs.

10.28 Such international discussions could also consider the impact on privacy of laws such as copyright law. For example, Dr Matthew Rimmer submitted that further examination of the intersection between laws that regulate the handling of personal information and laws that prohibit tampering with technological protection measures (TPMs) is warranted.[36] TPMs, which are discussed further in Chapter 9, have the capacity to collect a significant amount of personal information. Because much of the equipment and media that deploy TPMs is designed in, or downloaded from, jurisdictions other than Australia, examination of these types of issues at the international level would be worthwhile.

Proactive regulation

10.29 Early regulatory intervention is desirable to prevent interferences with privacy. In Chapter 47, the ALRC recommends that the Privacy Act be amended to empower the Privacy Commissioner to conduct Privacy Performance Assessments of the records of organisations for the purpose of ascertaining whether the organisation’s records are maintained in compliance with the requirements in the UPPs, privacy regulations and any privacy code that binds the organisation.[37] Further, the ALRC recommends that the OPC should have the power to direct agencies to conduct PIAs for new projects and developments.[38]

10.30 A number of agencies identified the importance of conducting PIAs early in the development of technical systems. The Department of Finance and Deregulation, for example, submitted that:

The application of [the] PIA process in the initial design and architecture stages identifies possible privacy risks and can lead to innovative and privacy enhancing uses of technology.[39]

10.31 ACMA noted that the PIA conducted for its electronic number mapping (ENUM) project informed its consideration of the impact on privacy of ENUM as well as other new and emerging technologies.[40] ACMA also suggested that PIAs

can assist in educating individuals, agencies and organisations about specific privacy enhancing technologies and the privacy enhancing ways in which these technologies can be deployed. The PIA will provide guidance on these issues which in turn can enable the effective dissemination of appropriate educational material to government, industry and consumers.

10.32 In the ALRC’s view, the use of Privacy Performance Assessments and PIAs should result in PETs being incorporated into systems and processes, and should prevent new or emerging technologies from having an adverse impact on privacy. For example, the ‘Anonymity and Pseudonymity’ principle in the model UPPs requires agencies and organisations to design systems that allow for anonymous or pseudonymous transactions where it would be practicable to do so. It has been noted that it may not be practicable to alter systems retrospectively, such as biometric identification systems or transport systems using smart card technology, to allow for anonymity in transactions.[41] PIAs could ensure that agencies and organisations take the impact of technology on privacy into account before a system is developed and, for example, design systems that provide for anonymous or pseudonymous transactions where appropriate.
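By way of illustration only, one way a system can be designed for pseudonymous transactions from the outset is to derive a stable pseudonym from a customer identifier using a keyed hash, so that transaction records can be linked to one another without containing the raw identity. The following sketch uses only the Python standard library; the key and identifier are hypothetical examples, not drawn from this Report.

```python
# Illustrative sketch only: deriving a stable pseudonym with an HMAC so a
# transaction log (eg for a smart card transport system) never holds the
# raw identifier. The secret key would be held by the agency, separately
# from the transaction records.

import hashlib
import hmac

SECRET_KEY = b"agency-held-secret"  # hypothetical; stored apart from the log

def pseudonym(identifier: str) -> str:
    """Return a stable pseudonym for the identifier; same input, same output."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

print(pseudonym("card-12345"))  # the same card always yields the same pseudonym
```

A keyed hash, rather than a plain hash, is used so that a third party who obtains the log cannot recompute pseudonyms from guessed identifiers without also holding the key.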

[19] L Lessig, ‘The Law of the Horse: What Cyberlaw Might Teach’ (1999) 113 Harvard Law Review 501, 507–510.

[20] See, eg, D Post, ‘What Larry Doesn’t Get: Code, Law, and Liberty in Cyberspace’ (2000) 52 Stanford Law Review 1439.

[21] Rec 10–3.

[22] Rec 10–2.

[23] See Recs 47–6, 47–4.

[24] See, eg, Office of the Privacy Commissioner, Submission PR 215, 28 February 2007; CSIRO, Submission PR 176, 6 February 2007; Australian Electrical and Electronic Manufacturers’ Association, Submission PR 124, 15 January 2007; Edentiti, Submission PR 29, 3 June 2006.

[25] Commission of the European Communities, Communication From the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), COM(2007) 228 (2007), 3.

[26] Ibid, 3–4.

[27] See, eg, J Alhadeff, Consultation PC 169, Sydney, 26 April 2007; M Crompton, ‘Under the Gaze, Privacy Identity and New Technology’ (Paper presented at International Association of Lawyers 75th Anniversary Congress, Sydney, 28 October 2002), 9–10.

[28] Commission of the European Communities, Communication From the Commission to the European Parliament and the Council on Promoting Data Protection by Privacy Enhancing Technologies (PETs), COM(2007) 228 (2007), 4.

[29] Office of the Privacy Commissioner, Submission PR 215, 28 February 2007.

[30] Unisys, Unisys Security Index Australia: A Newspoll Survey May 2007, 1 May 2007.

[31] Surveillance Studies Network, A Report on the Surveillance Society (2006) United Kingdom Government Information Commissioner’s Office, 84.

[32] Ibid, 96.

[33] Ibid, 96.

[34] Office of the Privacy Commissioner, Submission PR 215, 28 February 2007.

[35] Australian Communications and Media Authority, Submission PR 522, 21 December 2007.

[36] M Rimmer, Submission PR 379, 5 December 2007.

[37] Rec 47–6.

[38] Rec 47–4.

[39] Australian Government Department of Finance and Deregulation, Submission PR 558, 11 January 2008.

[40] Australian Communications and Media Authority, Submission PR 522, 21 December 2007. ENUM is discussed in Chs 9 and 71.

[41] M Crompton, ‘Biometrics and Privacy: The End of the World as We Know it or the White Knight of Privacy?’ (Paper presented at Biometrics Institute Conference: Biometrics—Security and Authentication, Sydney, 20 March 2002).