Published 3 May 2024

Read the keynote address prepared for delivery by Privacy Commissioner Carly Kind for the CyberCX and Tech Council of Australia Privacy by Design Awards on Thursday 2 May 2024.

Introduction

Good evening.

I am pleased to hear the Attorney-General speak so passionately about the need for privacy reform.

It was in part the Attorney-General’s leadership on the issue of privacy that drew me back to Australia, after 15 years living and working in Europe, to take on the role of Privacy Commissioner. It was so clear to me that he has a real, personal connection to this issue – one for which I am also a passionate advocate.

I too share his belief that the time is right to move ahead and provide Australia with the privacy framework that will stand the test of time.

And I’m so honoured to have the opportunity to address you on the occasion of the Privacy by Design Awards.

As the organisers of today’s event know well, there is an important connection between the concept of privacy by design and privacy regulators such as myself. It was the then privacy regulator of Ontario, Canada, Ann Cavoukian, who conceptualised the privacy by design approach in 1995, almost 30 years ago.

The concept is deceptively simple. At its heart it is a recognition that technology is not exclusively, or even primarily, part of the problem when it comes to the protection of data privacy. Rather, it is an essential part of the solution.

Beyond that nub of the idea, however, the concept of privacy by design speaks to a deeper and more universal truth – that technology should be deployed in the pursuit of societal objectives, and not the other way around. That even as we, as individuals and communities, are shaped by technology, we also have the power to shape technology.

The key word there is power. Notions of power cut in every direction in the digital ecosystem – the power wielded by tech monopolies and duopolies; the power concealed in political microtargeting and misinformation campaigns; the lack of power and agency consumers feel when they’re using digital technologies.

To me, the right to privacy is all about power – the power to control who has what information about you, and the power to hold them to account in how they use it.

I’m speaking to you today on the eve of Privacy Awareness Week, which commences next week. This year, it was important to me that we centre the notion of power in our annual campaign, and so we are urging Australian organisations to power up their existing privacy practices and culture in advance of privacy law reform and changing technology. By powering up – by strengthening – your privacy practices, you are also empowering people.

Given that we’re here today to recognise and celebrate good ‘privacy by design’, I thought I’d take the opportunity to elaborate on what powering up privacy by design might look like. I’d then like to share a few brief thoughts on what’s next when it comes to privacy and power.

The lifecycle of privacy by design

True privacy by design isn’t about a single feature or gimmick. In the words of the European Data Protection Supervisor, privacy by design has ‘a visionary and ethical dimension’.

It’s about ensuring privacy is at the forefront of the entire design lifecycle. It is not a piecemeal approach but one that encompasses legal, governance and societal responsibilities.

So, what does this look like in practice?

Privacy by design begins with leadership

As with everything in business, privacy by design begins with leadership.

Organisations should be making the case for privacy from the get-go, and they should be doing that in the C-suite. Not just because it is the right thing to do – although it is that – but because it makes good business and operational sense.

It starts at a fundamental level with legal compliance – no CEO, board or shareholder wants to preside over an organisation that is breaking the law, especially in an environment where laws are changing, where penalties are increasing, and where the community has made it clear that it wants greater control over its privacy.

But legal compliance is the floor, not the ceiling. Increasingly, organisations need to prove they have a social licence, and a significant component of that is considering and mitigating their role in collective and societal harms. Just as any responsible organisation now has a mechanism for considering the sustainability of its operations, and avoiding contributing to or exacerbating climate change, so too should responsible organisations consider their role in minimising the harms to individuals and groups that flow from poor privacy practices. In recent years we’ve seen the immense harm that major data breaches, for example, can cause, particularly when they involve compromised credentials such as passports and driver licences, raising the risk of identity theft, or involve sensitive health information. What organisation wants to be responsible for the data version of an oil spill?

Ultimately, however, CEOs and boards need to be convinced to invest in privacy as a business offering. Good privacy practices make good business sense. As our Australian Community Attitudes to Privacy Survey has shown, consumers place a high value on privacy when choosing a product or service, with it ranking only after quality and price. They are even prepared to experience some inconvenience if their privacy is guaranteed.

Think about privacy from the start

With any new product or service offering, think about privacy right from the kick-off stage.

You need to think about privacy right from the start, right from your first meeting. Think about whether the community would consider what you’re intending to do as fair and reasonable. Don’t be preoccupied only with whether you can; think first about whether you should.

In a world of increasing cyber threats, from motivated identity thieves to sophisticated threat actors, know that you can’t lose data you don’t hold in the first place. Map what information you need. Only collect the information that is necessary for you to carry out your business. Importantly, know what information you hold. Holding on to personal information about customers or shareholders who haven’t been contacted in many years is pointless, and the risks far outweigh any potential benefits. Do an audit of what information you’re collecting and of your existing data holdings.

I’d strongly suggest organisations begin to take into account the changes forthcoming in the Privacy Act reforms. In particular, as the Attorney-General outlined, the Government has agreed in principle to introduce a new positive obligation that personal information handling is fair and reasonable. This is a fundamental shift in approach, and provides confidence that, like a safety standard, privacy is built into products and services from the start.

The fair and reasonable standard would put the onus on organisations to consider, among other matters, whether consumers would reasonably expect their personal information to be used in particular ways. It would also require them to take into account the risk of unjustified adverse impact or harm, which could include physical, psychological or emotional harm, or negative outcomes with respect to an individual’s eligibility for rights or benefits in employment, credit and insurance, housing, education, professional certification, or the provision of health care and related services.

Because organisations won’t be able to ‘consent out’ of the fair and reasonable requirement to justify their activities, I’m hopeful that this new feature of the Australian framework will set us apart from other jurisdictions worldwide and position the OAIC, as Australia’s privacy regulator, to take on some of the more concerning industry practices we’re seeing, whether those relate to emerging technologies such as biometric recognition, or to persistent problematic practices such as microtargeting and dark patterns.

Organisations can get in front of this now by thinking about how new products and offerings can embody fairness and reasonableness right from the start. One mechanism for facilitating this process is to get into the habit now of undertaking a privacy impact assessment at the commencement of any new technological deployment or novel use of personal data. This can help organisations identify and mitigate risks and think through the ‘should’ as well as the ‘could’.

Build consideration of privacy into research and design

As we move through the product lifecycle, organisations should build consideration of privacy into their user research, and maintain it throughout the research and design phase.

We know that when individuals have the chance to exercise agency over their privacy, they often will. For example, when Apple rolled out its App Tracking Transparency feature across iPhone apps, upwards of 90% of users elected to opt out of tracking.

Identify the moments when people make decisions about their information, and help them make reasonable, informed choices. Only 2 in 5 people feel most organisations they deal with are transparent about how they handle their information, and 58% say they don’t understand how it is used.

Ensure consent, where applicable, is valid. As I have said, in Australia we have relied on placing the burden of consent on consumers who often feel they have no choice but to accept what the service does with their data. Proposed changes to the Privacy Act will seek to address the clarity of collection notices and consent requests, to improve consumer comprehension. There will also be an enhanced legislative definition of consent, which would require that consent be voluntary, informed, current, specific and unambiguous.

The Privacy Act Review is also likely to empower people to exercise their information rights. The proposals seek to support privacy self-management through new and enhanced individual rights, including a right to erasure and a right to de-index internet search results. The proposed rights draw on elements of, and serve a similar function to, rights under the GDPR. The changes would be supported by the introduction of a direct right of action and a statutory tort for serious invasions of privacy, providing individuals with additional avenues of redress to protect their privacy.

Carry privacy into deployment

Privacy should then be carried right through from research and design, to deployment.

Here, I want to emphasise the real role that privacy-preserving technologies can play. We hear a lot about the challenges that end-to-end encryption poses, but in terms of ensuring that individuals’ privacy is protected in their day-to-day use of products and services, encryption is a key element.

Each year we are seeing somewhere in the vicinity of 1,000 data breaches with serious implications for individuals, and the large majority of those are due to malicious or criminal attack. Phishing, compromised credentials, ransomware, hacking, malware and brute-force attacks are among the many tactics deployed to access personal data and proprietary systems. Encryption, at rest and in transit, is one part of the puzzle when it comes to taking reasonable steps to protect the privacy and security of personal information.
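
For illustration only, here is a minimal sketch of what encryption at rest can look like in practice, using the Python cryptography library’s Fernet recipe (authenticated symmetric encryption). The record contents are hypothetical, and a real deployment would manage keys in a secrets manager or hardware security module rather than in application code.

```python
from cryptography.fernet import Fernet

# Assumption for this sketch: in practice the key lives in a secrets
# manager or HSM, never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a (hypothetical) record before writing it to storage –
# this is encryption "at rest".
token = fernet.encrypt(b"name=Jane Citizen; licence=NSW123456")

# Decrypt only at the moment the record is legitimately needed.
print(fernet.decrypt(token).decode())
```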

Another part of the puzzle may be the government’s new Digital ID scheme.

Services and products that involve the collection of personal identity information can create serious privacy risks and harms.

Product and legal teams should strictly scrutinise such processes, considering the minimum information that needs to be collected and the minimum retention periods required. Organisations should also be looking to implement privacy-preserving digital systems wherever possible.

The Digital ID scheme will let us prove who we are online more easily and securely – and crucially, without sharing identity documents with every organisation. This will be good for individuals, and also good for businesses, which generally don’t want to store this sort of information.

The OAIC will be the independent privacy regulator for the scheme and will enforce its privacy safeguards.

Many of the organisations in this room will be thinking about how to deploy large language models (LLMs) and other forms of generative AI in your services and products, and I’d remind you of the immense privacy challenges that are associated with the use of such tools. I think we’re going to see increasing pressure on regulated entities to think about privacy-preserving approaches to AI, including the use of homomorphic encryption, synthetic data, or differential privacy approaches. In the meantime, I’d suggest a precautionary approach to using personal information in the context of LLMs.
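
To illustrate one of those approaches, here is a minimal, conceptual sketch of differential privacy using the Laplace mechanism in Python. The dataset, query and epsilon value are hypothetical; this is not a production-grade implementation.

```python
import numpy as np

def private_count(values, threshold, epsilon=1.0):
    """Differentially private count of values above a threshold.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon provides epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if v > threshold)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical example: report how many customers are over 60
# without the output revealing any single customer's record.
ages = [34, 67, 45, 72, 58, 61]
print(private_count(ages, threshold=60, epsilon=1.0))
```

A lower epsilon adds more noise and gives stronger privacy; that trade-off between utility and privacy is exactly the kind of design decision a privacy impact assessment should surface.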

Continuous improvement and monitoring are essential

Finally, then, what does privacy by design mean once your product has gone to market?

If you have done all of the above, then you can be congratulated for engaging in best practice privacy. But continuous improvement and monitoring are essential.

Even the most secure organisations can be victims of sophisticated attacks, and for that reason you should be familiar with the specifics of the Notifiable Data Breaches (NDB) scheme, and what is required should you experience a breach.

The NDB scheme has been in operation for six years now – given the maturity of the scheme, we absolutely expect that organisations are aware of and comply with the requirements.

In the event of an incident, your organisation must also be able to expeditiously assess whether an eligible data breach has occurred, how it occurred and what information has been affected – and notify affected individuals as quickly as possible.

The response matters. By responding quickly, organisations can substantially decrease the impact of a breach on affected individuals, reduce the costs associated with dealing with a breach, and reduce potential reputational damage.

Conclusion

So, then, privacy by design has salience at each stage of the product lifecycle. This means taking a holistic approach. Importantly, it means breaking down silos at the structural, operational and leadership levels to advance privacy by design and good privacy outcomes. Privacy is no longer the preserve of the IT nerds, or the legal team, but needs to be mainstreamed from the board room to the lunchroom.

Here in Australia, we stand on the precipice of some big changes in the privacy space. With any luck, the reforms to the Privacy Act will be introduced to Parliament this year, finally making our framework fit for purpose to enable the effective regulation of organisational practices in this new AI era. Without these reforms, we will struggle to ensure regulation keeps pace with new developments, particularly when it comes to the protection of children and vulnerable individuals, and the protection of privacy in the online realm.

I know that many of you in the room are impatient for this change. Despite the best efforts of many people, including the Attorney-General, reform is overdue. Unbelievably, we’re now in a position where the US legislature is considering adopting its own modern privacy legislation – an eventuality those of us who have worked in tech policy for some time thought unlikely to happen this year. Let’s hope Australia can pull out in front in that neck-and-neck contest.

After all, legal reform is the ultimate power move when it comes to privacy by design. It will strengthen the power of the OAIC as a regulator, strengthen the hand of individuals vis-à-vis the organisations that collect and use their data, and strengthen the Australian market, ensuring that entities can innovate with data technology in a trusted context.

I hope we are all agreed that we need to power up privacy – and privacy by design – to meet the challenges of the data age head on.

We are on the cusp of a new era of privacy reform and we need to be ready for it.