Voteware source code requester labelled 'vexatious'

Australian Electoral Commission plays hardball with Freedom of Information


Australia's Electoral Commission (AEC) has again denied a request to reveal the source code of the software used to count votes in Senate elections, and told the man seeking it that he may be a vexatious applicant abusing the freedom of information process.

Hobart lawyer Michael Cordover first tried to view the source code back in November 2013. The request was highly topical because Australia's most recent election was a very close-run affair in which at least one seat turned on a handful of votes. Australia's Senate voting scheme uses a complex optional preferential system, so a small error in software could have big repercussions.
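The danger is easy to see even in a toy preference count. The sketch below is purely illustrative and is not the AEC's algorithm — the real Senate count uses the far more complex single transferable vote with fractional surplus transfers — but it shows how a single ballot, or a single bug in transferring preferences, can flip a close result:

```python
from collections import Counter

def instant_runoff(ballots):
    """Simplified preferential count: eliminate the lowest-polling
    candidate each round until someone holds a majority.
    (Illustrative only -- not the AEC's Senate counting method.)"""
    remaining = {c for ballot in ballots for c in ballot}
    while True:
        tally = Counter()
        for ballot in ballots:
            # Each ballot counts for its highest-ranked remaining candidate;
            # ballots with no remaining preference are exhausted.
            for choice in ballot:
                if choice in remaining:
                    tally[choice] += 1
                    break
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > sum(tally.values()):
            return leader
        remaining.discard(min(tally, key=tally.get))

# Five ballots, written as preference orderings:
print(instant_runoff([("A", "C"), ("A", "C"), ("B", "C"), ("B",), ("C", "B")]))
```

Here C is eliminated first and C's preference flows to B, who wins 3-2. Change that one ballot's second preference from B to A and the same count elects A instead — which is why the correctness of preference-transfer code matters so much in a seat decided by a handful of votes.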

Cordover therefore argued that releasing the source code was in the public interest.

The AEC rebuffed his request, arguing that the code is commercially sensitive even though it is used only by the Commission. Cordover therefore tried again.

The AEC has since written (PDF) to Cordover, explaining in greater detail why it won't release the source code.

The letter also says the AEC will apply to have Cordover declared a “vexatious applicant”, a status that could prevent him from lodging future freedom of information requests.

Cordover's crowdfunding effort to raise cash for his ongoing efforts has sailed past its targets, and he has taken to Twitter to highlight his plight and call for support.

Crucially, he has also denied colluding with others on his requests. Collusion is one reason the AEC says it thinks he is worthy of being declared vexatious. ®

