The Covid-19 pandemic made cloud migration an inevitability for many governments. Moving to the cloud allowed public sector agencies to scale quickly, which became critical when the need for digital government services skyrocketed during the crisis.

The cloud is here to stay. It lets governments focus on their core missions, unlocks cost savings, and offers pre-built services they can adopt. Yet, as delegates told the audience of AI x GOV 2022, agencies need to ensure data remains secure even as they tap into the cloud's many benefits.

As agencies put policies and regulations in place to keep data secure on the cloud, it is critical that they consider their wider data security strategy and adopt best practices for erasing data that is no longer needed.

Who’s responsible for what?

Migrating data to the cloud brings scalability and flexibility, but agencies need to understand exactly what they remain accountable for: which responsibilities for protecting data stored on the cloud stay with them, and which reside with the cloud service provider. This is where the shared responsibility model can shed some light.

“All the hardware lifecycle management and maintenance of hardware components are covered by the cloud service provider,” Masayuki Morita, Vice President of APAC at Blancco Technology Group, explains.

Cloud service providers – like Amazon Web Services – manage the security of the cloud, such as the physical security of the data centres as well as the security of the operating system. This frees agencies from the chore and costs of hardware maintenance, says Morita.

In turn, government agencies remain responsible for managing security within the cloud – that is, ensuring data remains secure. This can be done through means such as prudently managing access rights, keeping firewalls up to date, and erasing data that is no longer needed.

“Even though you use a public cloud system, you are not walking away from the accountability of protecting your customers’ data. The data layer on top of the cloud infrastructure, that’s what the user has to manage,” Morita says.

What’s the deal with data erasure?

Part and parcel of managing data storage is removing data that is no longer needed – storing unnecessary data can increase attack surfaces and vulnerability. Regulations like the EU’s General Data Protection Regulation, Singapore’s Personal Data Protection Act, and Japan’s amended Personal Information Protection Act also require responsible data erasure management.

After all, data erasure is unavoidable – wherever information is stored, data erasure will eventually be needed, be it at the end of individual projects, when systems move to a different environment, or when regulations call for data removal. In some jurisdictions like the EU, Thailand, and South Korea, individuals also have the right to request for their personal data to be erased in certain cases.

When the need to erase data arrives, cloud users have the responsibility of erasing data so that it can no longer be restored, and producing auditable proof that erasure has occurred.

Cloud service providers need to provide the tools to do so, and third parties like data security firm Blancco Technology Group can help agencies verify that erasure has taken place properly, generating tamper-proof reports and ensuring compliance with relevant regulations.

But how can governments plan for cloud data erasure from the beginning of their cloud journey? This is where encryption keys come into play.

Encryption keys – the key to data erasure

Unlike with hard drives or on-premises servers, agencies rarely have access to the physical data centres that host the public cloud, which may be distributed across the world. Physical destruction methods such as shredding devices are not an option, so agencies have to rely on logical data erasure methods, explains Morita.

When data goes into the cloud, agencies need to encrypt it with an encryption key. This ensures that only users with access to the key can decode the data; without it, the data is unreadable. Establishing encryption keys from the get-go is a best practice for erasure management, Morita says.
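This delete-the-key principle (often called crypto-erasure or crypto-shredding) can be illustrated with a minimal, standard-library-only Python sketch. The XOR keystream below is a toy construction for illustration only, not production cryptography; real deployments would use a vetted cipher such as AES, with keys held in a managed key service:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a key (toy construction for illustration)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream: applying it twice with the same key round-trips,
    # so one function serves as both "encrypt" and "decrypt".
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = secrets.token_bytes(32)      # the encryption key, held by the agency
record = b"citizen record: 1234"
stored = xor_crypt(key, record)    # the ciphertext is what lands in cloud storage

assert xor_crypt(key, stored) == record  # readable while the key exists

key = None  # "erase" the key: without it, `stored` is indistinguishable from noise
```

The point the example makes is that the ciphertext can stay wherever the provider put it, across however many data centres; destroying the one small key is what renders the data unrecoverable.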

For example, the Japanese government’s Information System Security Management and Assessment Program outlines that users must manage encryption keys and erase them to protect data in the cloud. When data is no longer needed, an agency simply deletes the encryption key and obtains verification that the key has been successfully removed. This way, the original data will no longer be retrievable, explains Morita.

Accordingly, public cloud service providers are required to give users tools for encryption key management and erasure, as well as information on how to use them. One example is the key management service offered by Amazon Web Services, which lets users create and manage encryption keys and encrypt their data with them.
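Under this model, erasing data reduces to destroying the key. A hedged sketch of how that might look with the AWS Key Management Service via boto3 (boto3 assumed installed and configured with credentials; the key alias is illustrative). Note that KMS deliberately enforces a 7-to-30-day waiting period before a key is actually deleted, which gives agencies a window to catch mistakes:

```python
def deletion_request(key_id: str, window_days: int = 7) -> dict:
    """Build the parameters for KMS ScheduleKeyDeletion."""
    if not 7 <= window_days <= 30:
        raise ValueError("KMS only accepts a waiting period of 7 to 30 days")
    return {"KeyId": key_id, "PendingWindowInDays": window_days}

def schedule_erasure(kms_client, key_id: str, window_days: int = 7) -> str:
    # Schedule the key for deletion; once it is gone, all data encrypted
    # under it becomes unrecoverable. Return the key state for audit records.
    kms_client.schedule_key_deletion(**deletion_request(key_id, window_days))
    return kms_client.describe_key(KeyId=key_id)["KeyMetadata"]["KeyState"]

# Usage (requires AWS credentials; the alias is a hypothetical example):
#   import boto3
#   print(schedule_erasure(boto3.client("kms"), "alias/agency-data-key"))
```

The `KeyState` returned after scheduling can be recorded as part of the auditable proof of erasure the article describes.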

Amazon Web Services’ storage services, such as Amazon Simple Storage Service, support integration between Blancco’s cloud erasure solution, Blancco Cloud Storage Eraser, and virtual environments. From the very beginning, agencies can create virtual storage spaces, or buckets, encrypted with an encryption key that users create and upload themselves.

With this integration, Blancco facilitates the erasure of all objects in a bucket, as well as the bucket itself, and produces tamper-proof erasure reports and an audit trail of the erasure process.

As with erasing data from hard drives, it is critical that agencies can be confident their erasure processes for data hosted on the cloud are reliable, verifiable, and auditable. Through Blancco’s erasure solutions, data hosted on the cloud can be rendered permanently inaccessible, and agencies can obtain certification to prove it.