Fruit flies' brains at work: Decision-making? They use their eyes

Before you scoff, consider how much we rationalise away what's in front of us

Scientists hunting for the secret of how boffin scalpel-fodder favourite Drosophila melanogaster (aka the fruit fly) makes decisions have found that some of the brain circuitry active when it makes choices can be linked to what it has already seen.

The research is being undertaken to one day help scientists better understand the much larger brains of mammals, including our own.

The problem with studying mammalian brains, such as those of primates or humans, is that they contain so many neurons – the human brain has about 100 billion – of so many different types. Although plenty of prior experiments have recorded the brain activity of mammals (for example, by using electrodes), Hokto Kazama, a systems neuroscientist at the RIKEN Brain Science Institute in Saitama, Japan, told The Register that it's very difficult to get to the scale of large swaths of individual neuron types.

However, there are plenty of specialized genetic tools for tagging the far fewer neurons found in fruit flies, so they can be observed much more easily and precisely during experiments than mammalian neurons. Short-lived, genetically modified fruit flies with clear neuron tags are also easy to breed and maintain, he says.

Previously, neuroscientists have found that D melanogaster can remember its orientation, places or landmarks it has visited as it navigates, and even associate consequences (heat) with what it has clapped its compound eyes on. In the new study, appearing this week in Nature Neuroscience, Kazama's team ran a task similar to these earlier experiments while simultaneously recording the activity of neuron types that previous research had linked to navigation.

The team collected a few hundred specially genetically modified D melanogaster fruit flies and tethered them to a plate in front of a white screen with their heads fixed, leaving their wings free.

For two seconds, the flies would see a black vertical bar in their left or right field of vision. The bar would disappear for eight seconds, and then they would see two black bars – one left, one right.

Consistently, the untrained fly would flap its wings toward the bar on the opposite side to the one it had first seen (for reasons unknown), suggesting that it factored the original image into its decision, Kazama says.
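The trial structure and the "turn opposite the cue" behaviour described above can be sketched as a toy simulation. The cue/delay/choice sequence and the anti-cue tendency come from the article; everything else (trial count, the 80 percent probability, function names) is made up for illustration and is not the team's actual analysis code.

```python
import random

def run_trial(cue_side: str, p_opposite: float = 0.8) -> str:
    """Return the direction a simulated fly 'turns' toward.

    cue_side: 'left' or 'right' -- where the first bar appeared
    p_opposite: assumed probability of picking the bar opposite the cue
    """
    opposite = "right" if cue_side == "left" else "left"
    return opposite if random.random() < p_opposite else cue_side

random.seed(0)
# Each trial: show a cue bar on one side, wait, then offer both bars.
trials = [random.choice(["left", "right"]) for _ in range(1000)]
choices = [run_trial(side) for side in trials]

# Fraction of trials where the fly turned away from the cued side.
opposite_rate = sum(c != s for c, s in zip(choices, trials)) / len(trials)
print(f"fraction of anti-cue turns: {opposite_rate:.2f}")
```

With the assumed 80 percent bias, the printed fraction lands near 0.8, mirroring the consistent anti-cue turning the team reported.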

The flies were also genetically modified so that the selected type of neurons in their brain would glow a fluorescent green when active. One group of these neurons consistently glowed after the fly was shown the first bar, another as it made its decision of where to "turn" when shown the two bars.

This suggests the team had pinpointed the neural activity involved when the fly uses visual information from its past (the first bar's location) to make a decision (which way to turn).

"All of us wonder how we make the decisions we make," Ashburn, Virginia-based Howard Hughes Medical Institute's Janelia Research Campus neuroscientist Vivek Jayaraman, who was not involved in the study but researches fruit flies, told The Register.

He says structural patterns in the neural activity of fruit flies might act as a pointer for understanding structural patterns in mammalian brains, although he cautioned that more work is needed to determine the significance of the observed activity pattern and how it shapes the fly's behaviour.

Kazama says the lab's next step is to try to determine how the fly combines the two separate groups of active neurons to aid in navigation. ®
