English philosopher and social theorist Jeremy Bentham believed a panopticon — a tower from which a watchman can observe prisoners without their knowing when they are being watched — would encourage inmates to act as though they are under observation at all times, leading to better behaviour.
In today’s digital age, the panopticon can easily become a metaphor for modern surveillance. Citizens can feel like they are “always being monitored, measured, watched, reviewed,” says Rachel Dixon, Privacy and Data Protection Deputy Commissioner at the Office of the Victorian Information Commissioner.
She discusses how governments can continue to preserve citizens’ privacy in an age of smart cities, pandemic contact tracing, and AI.
Privacy-preserving smart cities
There’s an “inevitable tension” if smart cities are funded by advertising or marketing, says Dixon. “Because you are mining the data of your citizens in order to make the cities smart. And there’s inevitably a privacy problem with that.”
Alphabet’s Sidewalk Labs initiative, for instance, aimed to turn Toronto’s waterfront into a high-tech utopia laden with sensors. But, according to The Atlantic, a privacy expert resigned after Sidewalk Labs and Waterfront Toronto “refused to unilaterally ban participating companies from collecting non-anonymous user data”.
Sidewalk Labs’ CEO said the project was shuttered in May 2020 because of economic uncertainty caused by the pandemic.
“If you want a smart city, you have to think of other ways to fund that infrastructure that preserves privacy,” Dixon emphasises.
Victoria’s state capital, Melbourne, has built ‘innovation districts’ to trial 5G and sensors that collect data, which can be used, for example, to inform citizens of walking routes with comfortable temperatures.
Dixon’s team is working with one of these test precincts, in Carlton, to ensure the ethical use of data. “There’s no reason why data scientists shouldn’t be able to work on data, provided there’s a governance board saying ‘yes, we can be ethical.’ But you can’t take that data away, you can work on it inside our lab.”
“We are big believers in having labs” as they provide a controlled environment for strict governance rules to be enforced, she adds. Researchers should not release data sets openly, even if they are anonymised, because it can be “really easy” to re-identify a data set with other details like location and timestamps.
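The re-identification risk Dixon describes can be sketched with a toy linkage attack: an “anonymised” data set is matched against auxiliary data an attacker might hold (say, public social media posts) using shared quasi-identifiers such as location and timestamp. All names, places, and records below are invented for illustration.

```python
# Toy linkage attack: re-identify "anonymised" records by joining on
# quasi-identifiers (location + timestamp). All data here is fabricated.

anonymised = [
    {"id": "u1", "suburb": "Carlton", "checkin": "2021-09-01T08:05"},
    {"id": "u2", "suburb": "Fitzroy", "checkin": "2021-09-01T09:30"},
]

# Auxiliary data an attacker might scrape, e.g. geotagged public posts.
auxiliary = [
    {"name": "Alice", "suburb": "Carlton", "seen_at": "2021-09-01T08:05"},
]

def reidentify(anon, aux):
    """Match anonymised IDs to named records via shared quasi-identifiers."""
    matches = {}
    for a in anon:
        for b in aux:
            if a["suburb"] == b["suburb"] and a["checkin"] == b["seen_at"]:
                matches[a["id"]] = b["name"]
    return matches

print(reidentify(anonymised, auxiliary))  # {'u1': 'Alice'}
```

Even two quasi-identifiers are enough here, which is why controlled lab environments, rather than open release of “anonymised” data, are the safer default.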
The Victorian Centre for Data Insights and the Chief Data Officer assist agencies in managing such risks. This is a “great capability” that “more governments around the country should be doing”, she advises. “To have an expert data scientist able to advise people on what’s safe and what isn’t is very helpful and makes our job a lot easier”.
Privacy and contact tracing
There has been “a lot of discussion in Australia” on whether law enforcement should get access to contact tracing data, Dixon says.
The Australian Information Commissioner in September called for personal contact tracing data not to be used for any other purposes, such as law enforcement or marketing. This came after various police agencies attempted to access check-in data without a warrant, The Sydney Morning Herald reported.
The use of contact tracing data by law enforcement also became a heated debate in Singapore last year. It was revealed in January 2021 that the police could use data from the TraceTogether app for criminal investigations, despite initial assurances that data would only be used for contact tracing.
Privacy concerns then prompted new legislation restricting the use of contact tracing data in criminal investigations to serious crimes such as murder, rape, and terrorism, CNA reported.
Dixon believes contact tracing data should not be used for law enforcement. “It would damage people’s confidence … they’re checking in various places, because they want to fight the pandemic. They’re not checking in because they want to tell the police where they are.”
The Victorian Information Privacy Principles stipulate that personal contact tracing data be released to law enforcement only under a court order, she says.
“Truth be told, the police have a lot of other data sets for fighting crime,” she adds, like telecommunications and location data. “They don’t need the Covid check in data. Just saying.”
AI as a ‘business unit’
More governments are turning to AI to build personalised and effective services. But one of the greatest challenges of AI is that “people tend to think of it as a project”, she says. “They’ll put a team together, put some data scientists on it, produce the thing, and then that team will dissolve.”
But AI is more like an “automated business unit”, she adds. “It keeps on going” – data inputs might change slightly or become inaccurate. Regular testing is required to ensure the model doesn’t generate the “wrong kind of results” and cause users to make faulty decisions.
By building AI as projects, people are “not necessarily paying enough attention” to its full lifecycle and the continued work that is required, Dixon says. “And the real issue there is that it means there is no money for that retesting and assurance further down the track.”
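The kind of ongoing assurance Dixon describes can be sketched as a scheduled drift check: periodically compare live input data against a training-time baseline, and flag when the distribution has shifted enough that the model needs retesting. The threshold, feature, and numbers below are illustrative assumptions, not any agency’s actual monitoring setup.

```python
# Minimal sketch of ongoing model assurance: flag input drift when live
# data deviates from the training baseline by more than a z-score
# threshold. Feature values and the threshold are illustrative only.

from statistics import mean, stdev

def drift_check(baseline, live, z_threshold=3.0):
    """Return True if the live mean sits > z_threshold sigmas from baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    z = abs(mean(live) - mu) / sigma
    return z > z_threshold

training_ages = [34, 41, 29, 38, 45, 31, 36, 40]  # inputs at training time
live_ages = [71, 68, 75, 70, 73, 69]              # inputs months later

if drift_check(training_ages, live_ages):
    print("Input drift detected: retest the model before trusting outputs")
```

Running a check like this on a schedule is exactly the “continued work” that a project-shaped budget tends to leave unfunded.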
Trust is, and will always be, the bedrock of good digital services. In today’s digital age, governments will need to tread the line carefully and ensure any new technologies continue to preserve citizens’ privacy.