Australia wants law to ban de-anonymisation of anonymous data

Attorney-General Brandis says it's important for privacy - but what about researchers who probe privacy, George?

Yet again, the Australian government has announced a proposal that could outlaw academic research.

In the wake of the privacy concerns that surrounded Australia's 2016 Census, attorney-general George Brandis has said the government will make it illegal to de-anonymise data sets that have been de-identified.

In the lead-up to the Census, citizens and privacy groups warned against the Australian Bureau of Statistics' (ABS') intention to retain names for future research efforts.

The ABS responded that names would not be used to link data sets, but rather, the data would be anonymised with an as-yet-unspecified cryptographic hash.

Researchers have warned that de-anonymisation remains possible, so Brandis has announced doing so will become a criminal offence.

From his media release: “The amendment to the Privacy Act will create a new criminal offence of re-identifying de-identified government data. It will also be an offence to counsel, procure, facilitate, or encourage anyone to do this, and to publish or communicate any re-identified dataset.

“The legislative change, which will be introduced in the [southern] Spring sittings of Parliament, will provide that these offences will take effect from today's announcement.”

+Comment: The aim of the legislation is worthwhile, but there are serious collateral risks to researchers.

Anonymity research is the first, and most obvious, risk. The only way to harden anonymous systems against attack is to research which vectors can trace a data record back to an individual.

The Tor network is a case in point: a researcher who publishes how a 'Sybil' node can identify individuals is helping harden the network against that attack.

The second risk is less obvious: a badly-drafted Act could impair cryptography research.

Let's assume, for example, that a data set is anonymised using a cryptographic hash.

The only way ordinary people can have confidence that their names are protected is for researchers to test the implementation.

If the implementation is faulty – a bad hash is chosen, it's not salted, or it's done using a buggy software library – a badly-drafted law would stop a researcher testing it, while having no practical impact on bad actors (particularly if they're overseas).
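To see why an unsalted hash offers so little protection, consider a minimal Python sketch. (The ABS never published its scheme; the hash choice and names here are invented for illustration.) Because an unsalted hash always maps the same name to the same digest, anyone with a list of candidate names can rebuild the mapping and reverse the "anonymisation":

```python
import hashlib

def anonymise(name: str) -> str:
    # Hypothetical scheme: an unsalted SHA-256 of the name.
    return hashlib.sha256(name.encode()).hexdigest()

# A record "protected" by the hypothetical scheme.
record = anonymise("Jane Citizen")

# Dictionary attack: hash every name on a public list (electoral
# roll, phone book) and match digests. No salt means identical
# inputs always produce identical digests.
candidate_names = ["John Smith", "Jane Citizen", "Mary Jones"]
lookup = {anonymise(n): n for n in candidate_names}

print(lookup.get(record))  # recovers "Jane Citizen"
```

A per-record random salt (or a keyed construction such as HMAC with a secret key) defeats this precomputation, which is exactly the kind of weakness researchers would want to test for.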

If the law's badly drafted, it'll make sure Australians can't take part in research to protect data against “black hats”. ®