Competition crowdsources blisteringly-fast software

TopCoder challenge helps immune system research


If you want a massive improvement in the software you use, the cheapest way to get it is to host a competition on TopCoder.

That seems to be at least one of the discoveries made when a group of research biologists staged a competition on the MIT-operated site. A two-week contest with regular prizes of $US500 ended up costing the researchers just $US6,000, and yielded new – and hugely efficient and effective – software for analysing immune system genes.

The real-world problem presented by the researchers was to analyse the genes involved in producing antibodies and T-cell receptors – a decidedly non-trivial problem in genetic research. As Nature puts it:

“These genes are formed from dozens of modular DNA segments located throughout the genome, and they can be mixed and matched to yield trillions of unique proteins, each capable of recognizing a different pathogen or foreign molecule.”

With that kind of complexity, the problem is demanding on computing resources and software.
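The mix-and-match assembly Nature describes is easy to sketch with a back-of-the-envelope count. The pool sizes below are rough, illustrative figures for human antibody heavy chains, not numbers from the study:

```python
from math import prod

# Toy illustration: a heavy chain is assembled by picking one
# segment from each of the V, D and J pools. Pool sizes here are
# approximate, for illustration only.
segment_pools = {"V": 50, "D": 25, "J": 6}

basic_combinations = prod(segment_pools.values())
print(basic_combinations)  # 7500 ordered V-D-J picks

# Junctional variation where segments join, plus pairing with a
# light chain, multiplies this far further – which is how the
# repertoire reaches the trillions Nature cites, and why matching
# a sequenced gene back to its source segments is demanding.
```

Even this crude count shows why the search space – and the compute bill – balloons so quickly.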

Hence the competition: lead researcher Eva Guinan (of the Dana-Farber Cancer Institute) and her collaborators asked TopCoder participants if they could do better: “The researchers offered TopCoder what they thought would be an impossible goal: to develop a predictive algorithm that was an order of magnitude better than either a custom solution developed by Arnaout [Ramy Arnaout of the Beth Israel Deaconess Medical Center] or the NIH’s standard approach (MegaBLAST)”.

The result was a huge success: entrants submitted 84 solutions, 16 of which outperformed MegaBLAST.

The best of the best was 970 times faster than either MegaBLAST or Arnaout’s software, which should go some way towards Guinan’s perfect world, in which researchers could run this kind of analysis on their laptops instead of on supercomputers.
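To put that 970× figure in scale, here is a purely hypothetical bit of arithmetic – the article gives no absolute runtimes, so the baseline below is an assumption for illustration:

```python
# Hypothetical baseline: suppose a MegaBLAST analysis takes 10 hours
# on big iron. At the reported 970x speedup, the winning entry would
# finish the same job in well under a minute.
megablast_hours = 10          # assumed, not from the study
speedup = 970                 # reported best-entry speedup

winner_seconds = megablast_hours * 3600 / speedup
print(f"{winner_seconds:.0f} s")  # ~37 s – laptop territory
```

A three-orders-of-magnitude speedup is the difference between queuing for cluster time and re-running an analysis over coffee.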

There were 733 participants in the competition, of whom 122 submitted code; 44 percent of them were software professionals, and the rest were students at various levels.

For The Register, this is a killer observation: to make the problem accessible for the competition, “they had to first reframe the problem, translating it so that it could be accessible to individuals not trained in computational biology.”

In other words, if you ask the right question, you can get the right answer – remarkably cheaply. ®
