Contact tracing tech has renewed the debate on surveillance and privacy.

Covid-19 contact tracing apps released by governments are coming under greater scrutiny, and governments and companies alike are now keen to show how their apps protect privacy. The most high-profile example is the rare collaboration between Google and Apple to ensure that exposure tracking remains anonymous.

Lesly Goh, Senior Technology Advisor at the World Bank, pointed to five of the latest data privacy techniques at GovInsider’s Festival of Innovation. Many of these innovations, while “still in research mode”, have great potential for enforcing security, she said.

1. Mask data with maths

First up is differential privacy, which uses mathematical formulae to protect datasets. A differential privacy tool adds a small amount of statistical noise to data. After analysis, it would be impossible to tell what an individual’s data contains, or if their data was used for analysis at all.

This method protects the individual’s identity while keeping aggregate results accurate enough to be useful. “An attacker should not be able to learn any more about you from the statistics we publish using your data than from statistics that did not use your data,” according to the US Census Bureau.
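To make that concrete, here is a minimal Python sketch of the Laplace mechanism, one common way differential privacy is put into practice. The survey data, the counting query and the epsilon value of 0.5 are illustrative assumptions, not drawn from any real deployment.

```python
import random

def laplace_noise(scale):
    # The difference of two independent Exponential draws with mean `scale`
    # follows a Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon=0.5):
    # Adding or removing one person changes a count by at most 1
    # (sensitivity 1), so Laplace noise of scale 1/epsilon gives
    # epsilon-differential privacy for this query.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical survey: roughly 10% of 10,000 respondents report a positive test.
responses = [{"positive": random.random() < 0.1} for _ in range(10_000)]
print(private_count(responses, lambda r: r["positive"]))
# Prints a number close to 1,000 -- accurate in aggregate, yet no single
# respondent's answer can be inferred from the published figure.
```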

The Bureau was the first organisation to apply differential privacy in its services, using it for its residential population data tracker. Apple, Google and Uber have applied the same principle to secure user data on their iPhones, browsers and apps respectively, wrote Dr John M. Abowd, Chief Scientist and Associate Director for Research and Methodology at the US Census Bureau.

Earlier this year, Microsoft announced that it had collaborated with Harvard to develop an open source platform for differential privacy. The platform lets governments and businesses share their data safely with researchers, so more value can be drawn from it.

2. Need-to-know only

The second technique is zero-knowledge proofs. Users prove that they know or hold a piece of data without sharing the data itself, Goh explained. When applied to digital identity, for instance, people could prove they meet a minimum age requirement without revealing their date of birth, wrote Wired.
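As a rough illustration, the Python sketch below implements a toy Schnorr proof of knowledge, one classic zero-knowledge construction, made non-interactive with the Fiat-Shamir heuristic. The tiny group parameters (p = 23, q = 11) and the “secret credential” are assumptions chosen for readability; real systems use groups hundreds of bits wide.

```python
import hashlib
import secrets

# Deliberately tiny public parameters: g = 2 generates the subgroup of
# prime order q = 11 inside Z_23*. Real systems use far larger groups.
p, q, g = 23, 11, 2

x = secrets.randbelow(q - 1) + 1   # the prover's secret (e.g. a credential)
y = pow(g, x, p)                   # public value everyone may see

def challenge(t):
    # Fiat-Shamir: derive the verifier's challenge from a hash of the transcript.
    return int(hashlib.sha256(f"{g}:{y}:{t}".encode()).hexdigest(), 16) % q

def prove(secret):
    # Prove knowledge of `secret` with y = g^secret mod p, without revealing it.
    r = secrets.randbelow(q)
    t = pow(g, r, p)                       # commitment
    s = (r + challenge(t) * secret) % q    # response
    return t, s

def verify(t, s):
    # Accept only if g^s == t * y^c (mod p), which requires knowing x.
    return pow(g, s, p) == (t * pow(y, challenge(t), p)) % p

print(verify(*prove(x)))   # True -- yet the proof reveals nothing about x itself
```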

Last year, the US announced that its military would start using this technique. It could, for example, state that a digital system has a vulnerability without exposing details of the flaw itself. It could also attribute a cyberattack to an entity without revealing classified information or either side’s specific hacking capabilities.

3. Eggs in multiple baskets

Third is a technique known as secure multiparty computation. This method spreads data analysis across multiple parties so that no single party can see the complete dataset, said Goh.
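A minimal Python sketch of one common building block, additive secret sharing, shows the idea. The three “private inputs” and the modulus are illustrative assumptions; real protocols add machinery for multiplication, malicious parties and networking.

```python
import secrets

MODULUS = 2**61 - 1   # all arithmetic is done modulo a large prime

def split(value, n_parties):
    # Split `value` into n random additive shares that sum back to it.
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Three people each hold a private number (say, a symptom score).
private_inputs = [3, 7, 2]
n_parties = 3

# Each person splits their input and sends one share to each computing party,
# so every party only ever receives random-looking fragments.
all_shares = [split(v, n_parties) for v in private_inputs]
partial_sums = [sum(column) % MODULUS for column in zip(*all_shares)]  # one per party

total = sum(partial_sums) % MODULUS
print(total)   # 12 -- the overall sum is revealed, the individual inputs are not
```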

Scientists recently used this method to predict the risk of Covid-19 infection, according to data security company Enya.ai. Researchers collected 3.6 million submissions of symptoms and test results from Enya.ai’s Covid screening tool, then analysed the information in such a way that unprotected data never left an individual’s phone or computer.

Countries and healthcare organisations across the world have been trying to balance public health concerns with individual rights. This study shows the possibilities for understanding the coronavirus without compromising on privacy.

4. Share analysis without sharing data

Next comes federated analysis, where analysts share insights from data analysis without sharing the data itself, said Goh. Each device’s data is processed locally, and only the results are combined. Data analysts see the combined results, never any particular device’s data.

For instance, Google uses federated analytics to show Google Pixel phone owners what songs are playing around them. Engineers can improve the database of popular songs without knowing which songs any individual phone played, Google explained.
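The Python sketch below mimics that setup in miniature: each “phone” summarises its own listening history locally, and only the small summaries are combined. The song names and play counts are made up, and a real deployment would further protect the per-device summaries with techniques such as secure aggregation or differential privacy.

```python
from collections import Counter

# Hypothetical listening histories held by three phones; in reality this raw
# data would never leave the devices.
phone_histories = [
    ["song_a", "song_b", "song_a"],
    ["song_b", "song_c"],
    ["song_a", "song_c"],
]

def local_aggregate(history):
    # Runs on the device: summarise the raw data before anything is sent.
    return Counter(history)

def combine(per_device_summaries):
    # Runs on the server: sees only small per-device summaries, never raw plays.
    total = Counter()
    for summary in per_device_summaries:
        total.update(summary)
    return total

chart = combine(local_aggregate(h) for h in phone_histories)
print(chart.most_common(1))   # [('song_a', 3)] -- learned without collecting raw logs
```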

Besides protecting data privacy, analysing data close to its source also avoids the logistical issues of moving it all into one place. Bandwidth constraints can sometimes make moving large amounts of data tricky, wrote CIO.com.

5. Analyse encrypted data directly

Typically, data has to be decrypted before it can be analysed. Homomorphic encryption makes it possible to analyse data while it is still encrypted, so sensitive information such as financial or healthcare details remains protected even as it is processed.

This could be very useful in elections, wrote Forbes. Officials can tabulate votes while keeping the identities of the voters secret, and third parties can verify the results.
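A toy vote tally in Python illustrates the idea using the Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts produces an encryption of the sum of the underlying plaintexts. The hard-coded primes and the ballot list are illustrative assumptions only; production schemes use keys thousands of bits long and far more safeguards.

```python
import math
import random

# Toy Paillier keypair built from small hard-coded primes (illustration only;
# real deployments use primes of 1024 bits or more).
p, q = 293, 433
n, n_sq, g = p * q, (p * q) ** 2, p * q + 1
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)       # lcm(p-1, q-1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n_sq)), -1, n)   # precomputed decryption constant

def encrypt(m):
    # c = g^m * r^n mod n^2, with r random and coprime to n.
    while True:
        r = random.randrange(2, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c):
    return (L(pow(c, lam, n_sq)) * mu) % n

# Illustrative ballots: 1 = "yes", 0 = "no". Multiplying the ciphertexts
# adds the plaintexts, so the tally is computed without decrypting any vote.
ballots = [1, 0, 1, 1, 0, 1]
encrypted_tally = 1
for b in ballots:
    encrypted_tally = (encrypted_tally * encrypt(b)) % n_sq

print(decrypt(encrypted_tally))   # 4 -- total "yes" votes; individual ballots stay encrypted
```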

There’s no denying the good that data sharing and analysis can bring for public services, but most governments are rightfully nervous about releasing their datasets. These advancements in tech offer hope that governments don’t have to choose between serving citizens better and protecting their privacy.