'What's up, Skip?' asks paraglider – before 'roo beats the snot out of him

He was landing. Marsupials have not yet developed flight

As if we needed another reminder that everything in Australia wants to kill you, the ongoing turf war between kangaroos and people has claimed another victim.

Innocent paraglider Jonathan Bishop was landing at Orroral Valley in Namadgi National Park, near the capital city of Canberra, when his GoPro helmet cam caught an approaching contingent of marsupial thugs.

[YouTube video]

Though kangaroos would have no difficulty kicking a human's head off, a clueless Bishop fell back on the media's fondest mischaracterisation of the marsupial.

"What's up, Skip?" 35-year-old Bishop can be heard asking the oncoming knot of muscle and rage.

To which the 'roo replied with a flurry of fists and claws.

"Hey fuck off, ahh fuck off! Go away!" Bishop screamed, concluding "fuckin' kangaroos" once the hit-and-run gang had scarpered.

9 News said that the extreme sports fan was none the worse for wear despite the brief pummelling.

Unconfirmed reports (ie, we're making it up) suggest the attack was a reprisal for the brutal hook a man inflicted on a kangaroo that had his dog in a headlock back in 2016.

As yet, there is no end to the violence in sight. However, maybe some JD Sports vouchers might encourage the sluggers to put their fists down. ®

Other stories you might like

  • Samsung fined $14 million for misleading smartphone water resistance claims
    Promoted phones as ready for a dunking – forgot to mention known problems with subsequent recharges

    Australia’s Competition and Consumer Commission has fined Samsung Electronics AU$14 million ($9.6 million) for making misleading water resistance claims about 3.1 million smartphones.

    The Commission (ACCC) says that between 2016 and 2018 Samsung advertised its Galaxy S7, S7 Edge, A5, A7, S8, S8 Plus and Note 8 smartphones as capable of surviving short submersions in the sea or fresh water.

    As it happens The Register attended the Australian launch of the Note 8 and watched in wonder as it survived a brief dunking, even as bubbles appeared to emerge from within the device. Your correspondent recalls Samsung claiming that the waterproofing reflected the aim of designing a phone that could handle Australia's outdoors lifestyle.

  • Five Eyes alliance’s top cop says techies are the future of law enforcement
    Crims have weaponized tech and certain States let them launder the proceeds

    Australian Federal Police (AFP) commissioner Reece Kershaw has accused unnamed nations of helping organized criminals use technology to commit crimes and launder the proceeds, and called for international collaboration to develop technologies that counter the threats that behaviour creates.

    Kershaw’s remarks were made at a meeting of the Five Eyes Law Enforcement Group (FELEG), the forum in which members of the Five Eyes intelligence sharing pact – Australia, New Zealand, Canada, the UK and the USA – discuss policing and related matters. Kershaw is the current chair of FELEG.

    “Criminals have weaponized technology and have become ruthlessly efficient at finding victims,” Kershaw told the group, before adding: “State actors and citizens from some nations are using our countries at the expense of our sovereignty and economies.”

  • Police lab wants your happy childhood pictures to train AI to detect child abuse
    Like the Hotdog, Not Hotdog app but more Kidnapped, Not Kidnapped

    Updated Australia's federal police and Monash University are asking netizens to send in snaps of their younger selves to train a machine-learning algorithm to spot child abuse in photographs.

    Researchers are looking to collect images of people aged 17 and under in safe scenarios; they don't want any nudity, even if it's a relatively innocuous picture like a child taking a bath. The crowdsourcing campaign, dubbed My Pictures Matter, is open to those aged 18 and above, who can consent to having their photographs be used for research purposes.

    All the images will be amassed into a dataset managed by Monash academics in an attempt to train an AI model to tell the difference between a minor in a normal environment and an exploitative, unsafe situation. The software could, in theory, help law enforcement better automatically and rapidly pinpoint child sex abuse material (aka CSAM) in among thousands upon thousands of photographs under investigation, avoiding having human analysts inspect every single snap.


Biting the hand that feeds IT © 1998–2022