“Even in Australia, I am perceived as a woman who has strong opinions,” Rachel Dixon says.

To do her job, she must be. Dixon is Privacy and Data Protection Deputy Commissioner in the state of Victoria, where she’s taking on business and government leaders to ensure they protect customers’ information and use it responsibly.

Any organisation with data needs to understand the risks associated with it. Data is both “an asset and a liability”, she says. It can do great wonders, but can be dangerous in the wrong hands – either through a breach of privacy or security.

Make leaders responsible

Victoria has been a leader in this regard with a risk-based approach, which all organisations in the state are legally mandated to use. It requires organisations to have an inventory of their information – you can’t protect what you don’t know you have – and an understanding of the impact if that data gets stolen, she says.

The framework is based on the principle that not every organisation will have the same security risk profile, Dixon adds. It allows each to understand the level that is best suited to it – a public garden, for instance, will be required to manage data differently from a healthcare provider.

Under the law, the CEO of a business or Secretary of a government department is directly responsible for information security. They attest to the Office of the Victorian Information Commissioner that the organisation has actually taken the measures that it is required to. “If they mislead a commissioner, that’s an offence, so they’re the ones on the hook,” Dixon says. This was one of the most controversial moves, she adds, but has proven popular as it has brought leaders’ attention to the issue.

This is important for countries like Singapore, where millions of health records were stolen last year. The country had to conduct a lengthy review into what went wrong and who was responsible. The inquiry’s first recommendation was that “cybersecurity must be viewed as a risk management issue” and “decisions should be deliberated at the appropriate management level”. Victoria’s risk-based approach and strong accountability measures are one way to enforce this.

Victoria’s framework includes requirements beyond the usual cybersecurity measures. “We look at physical security: can people just wander into your building off the street? How hard is it to tailgate them at the door?” It also provides checks for personnel security, looking at whether employees handling sensitive data could be easily compromised.

Privacy principles for cities

One of the most complex arenas for data protection is smart cities – where technology collides with physical space, in many cases using sensors and cameras in public places without citizens being aware of it. How can people be given the choice not to share their data every time they step outside? “You can’t give informed consent to enter a street,” says Dixon, arguing that the current approach to smart cities is fundamentally flawed.

A popular model for cities has been to share data with startups and companies to analyse, in order to provide new and better services to residents. This is particularly problematic because there is no recourse for residents who do not wish for their data to be shared, Dixon believes. “There’s a certain coercive power there, particularly when the private sector is involved.” Where consent is involved at all, citizens have to opt out – or their data will be collected by default.

A perfect example is Alphabet’s Sidewalk Labs initiative to turn Toronto’s waterfront into a tech-run district. Its privacy chief quit on learning that the company was sharing identifiable information with third parties. The project is now on the verge of being shut down.

Smart cities must be based on an opt-in consent model, Dixon says: data is collected, stored and used only if the resident has explicitly allowed it. Cities should provide non-digital alternatives for those who don’t wish to share their data, so that they’re not penalised for their choice.

The data that is collected should be managed in a controlled “lab” environment where strict governance can be enforced, she adds. This is because experts have not yet perfected methods of anonymising data – chances are that the anonymisation can be reversed, or that the process makes the data less accurate, defeating the purpose of collecting it, she says.

Victoria’s state capital Melbourne has been setting up “innovation districts” to trial sensors. “We’ll be having some interesting discussions with them about the consent model, and the ability of people to still visit that public space without consenting,” Dixon says.

Tighter privacy laws

On the whole, governments must develop more “mature” laws for how companies collect, store and use customers’ data. “There is a case for tighter regulation globally,” she says. The current privacy norms place the burden on users to understand complex and often vague terms and conditions of how their data is being used. “We’re requiring a high degree of awareness by citizens,” she says.

Instead, consent should be based on the principle of a certain minimum guarantee of privacy for everyone, she says. Some consumers might be comfortable with a lower degree of privacy than others, but “there should be a floor below which people can’t trade off consent”, she says.

For this to happen, three key changes are needed. The first is that companies that collect users’ data should not be allowed to share it with third parties. This is often vaguely worded in privacy policies as users agreeing to share data with “marketing partners”, without any information on who they are or what they will do with the data. “Without the context, how can I have informed consent? That’s where the consent should stop”, says Dixon.

The second is to ensure that companies track how data is collected and for what purpose. This will allow regulators to enforce that companies use the data only for that purpose, failing which they must destroy it. “That’s going to be very expensive, because businesses will need to change a lot of their fundamental systems,” Dixon says.

Finally, the world needs privacy regulators with teeth, as they are called on to do new and complex work. They need greater resources to keep up with the colossal amount of data generated and the pace of change in technology, particularly with complex areas like AI, biometrics, encryption and social media on the rise, she adds. “If people want to make some headway there, they will have to apply resources to making it better.”

Dixon is a woman of strong opinions, but her opinions must be well informed. To take on businesses and agencies, she must show empathy to the people she works with. “You have to come at this very much prepared to hear what their issues are as well. I want my opinion to be formed on the basis of facts from other people,” she adds.

Having been an advertising tech professional in a previous life, Dixon is personally all too aware of how powerful data is. Governments and businesses must treat that power responsibly and with great care. “I guess I’m atoning for my sins”, she jokes.