US committee green-lights CRISPR-Cas9 human cancer cell trials

Genetics builders and breakers hope to bolster treatments


A United States advisory committee has green-lighted the use of the ground-breaking CRISPR gene-editing technique in human trials.

The committee, which sits within the US National Institutes of Health, approved the use of CRISPR-Cas9 for a cancer treatment in which tests will be conducted on immune T cells extracted from melanoma patients.

The CRISPR-Cas9 technique is described as ideally suited to cutting and modifying DNA, and it promises one of the most effective means yet of treating some of the world's most intractable conditions.

Study leader Edward Stadtmauer of the University of Pennsylvania told Nature that gene editing could improve cancer treatment.

“Cell therapies [for cancer] are so promising but the majority of people who get these therapies have a disease that relapses,” he said.

The short DNA sequences that make up CRISPR were first identified in 1987, while the Cas9 editing technique was developed in 2012 by Dr Jennifer A Doudna.

The initial trial by the University of Pennsylvania is backed by US$250 million (£169 million, A$333 million) from the immunotherapy foundation formed by former Facebook president Sean Parker.

It will examine the effectiveness of CRISPR in humans, with T cells extracted from 18 patients.

The technique will remove the T cells, insert an engineered gene to target cancer cells, and remove separate genes that can interfere with the process.

It follows the approval of CRISPR-Cas9 human trials in the UK, the first to be formally approved by a government.

Blighty's Human Fertilisation and Embryology Authority approved London's Francis Crick Institute to explore the earliest moments of human life using the technique.

The Institute's Dr Kathy Niakan will conduct research on week-old embryos, which consist of about 250 cells.

That approval in February came five months after breakthrough CRISPR research from China was announced. ®

Other stories you might like

  • NASA circles August in its diary to put Artemis I capsule in Moon orbit
    First steps by humans to recapture planet's natural satellite

    NASA is finally ready to launch its unmanned Orion spacecraft and put it in the orbit of the Moon. Lift-off from Earth is now expected in late August using a Space Launch System (SLS) rocket.

    This launch, a mission dubbed Artemis I, will be a vital stage in the Artemis series, which has the long-term goal of ferrying humans to the lunar surface using Orion capsules and SLS technology.

    Earlier this week NASA held a wet dress rehearsal (WDR) for the SLS vehicle – fueling it and getting within 10 seconds of launch. The test uncovered 13 problems, including a hydrogen fuel leak in the main booster, though NASA has declared that everything's fine for a launch next month.

    Continue reading
  • Photonic processor can classify millions of images faster than you can blink
    We ask again: Has science gone too far?

    Engineers at the University of Pennsylvania say they've developed a photonic deep neural network processor capable of analyzing billions of images every second with high accuracy using the power of light.

    It might sound like science fiction or some optical engineer's fever dream, but that's exactly what researchers at the American university's School of Engineering and Applied Sciences claim to have done in an article published in the journal Nature earlier this month.

    The standalone light-driven chip – this isn't another PCIe accelerator or coprocessor – handles data by simulating brain neurons that have been trained to recognize specific patterns. This is useful for a variety of applications including object detection, facial recognition, and audio transcription to name just a few.

    Continue reading
  • World’s smallest remote-controlled robots are smaller than a flea
    So small, you can't feel it crawl

    Video Robot boffins have revealed they've created a half-millimeter wide remote-controlled walking robot that resembles a crab, and hope it will one day perform tasks in tiny crevices.

In a paper published in the journal Science Robotics, the boffins said they had in mind applications like minimally invasive surgery or manipulation of cells or tissue in biological research.

    With a round tick-like body and 10 protruding legs, the smaller-than-a-flea robot crab can bend, twist, crawl, walk, turn and even jump. The machines can move at an average speed of half their body length per second - a huge challenge at such a small scale, said the boffins.

    Continue reading
  • Half of developers still at screens even during breaks
    Going for a walk: Good. Doomscrolling: Bad

    What are your peers doing to stave off burnout? Research from Stack Overflow suggests about half of developers are still spending their breaks in front of a screen.

    The Q&A programming resource surveyed 800 devs, and found most of the top five things they do when they need a break involve screens: listening to music (46 percent), visiting Stack Overflow (41 percent), browsing social media (37 percent), and watching videos (36 percent).

    Actually talking with fellow humans did not make the top five, and 4 percent of respondents had some other outlet for stress (possibly angrily banging some really terse comments into the source).

    Continue reading
  • Can AI transformer models help design drugs and treat incurable diseases?
    From protein prediction to drug generation, neural networks are revolutionizing medication

    Special report AI can study chemical molecules in ways scientists can't comprehend, automatically predicting complex protein structures and designing new drugs, despite having no real understanding of science.

    The power to design new drugs at scale is no longer limited to Big Pharma. Startups armed with the right algorithms, data, and compute can invent tens of thousands of molecules in just a few hours. New machine learning architectures, including transformers, are automating parts of the design process, helping scientists develop new drugs for difficult diseases like Alzheimer's, cancer, or rare genetic conditions.

    In 2017, researchers at Google came up with a method to build increasingly bigger and more powerful neural networks. Today, transformer-based models are behind some of the largest AI systems and typically learn patterns from vast amounts of text. They're versatile and can process different forms of language from code to ancient scripts scribbled thousands of years ago.

    Continue reading
  • Algorithm can predict pancreatic cancer from CT scans well before diagnosis
    Software picks up subtle clues human doctors miss

    AI algorithms can predict whether a patient will develop pancreatic cancer years before an official diagnosis, or so this research suggests.

    Tens of thousands of people in the US are diagnosed with pancreatic ductal adenocarcinoma – the most common type of pancreatic cancer – every year. Less than 10 percent of patients live more than five years after diagnosis.

Detecting the disease earlier could boost survival rates by up to 50 percent, it is believed. But doctors currently have no methods to screen patients for early signs of pancreatic cancer. Now, a team of researchers led by Cedars-Sinai Medical Center, a top non-profit hospital based in Los Angeles, California, believe AI could be up to the task.

    Continue reading
  • Intel: Our fabs can mass produce silicon qubit devices
    If conventional silicon manufacturing processes can be repurposed, it could help create practical quantum systems

    Updated Intel and QuTech claim to have created the first silicon qubits for quantum logic gates to be made using the same manufacturing facilities that Intel employs to mass produce its processor chips.

    The demonstration is described by the pair as a crucial step towards scaling to the thousands of qubits that are required for practical quantum computation.

According to Intel, its engineers working with scientists from QuTech have successfully created the first silicon qubits at scale at Intel's D1 manufacturing facility in Hillsboro, Oregon, using a 300mm wafer similar to those the company uses to mass produce processor chips.

    Continue reading
  • 'Virtually no difference' between AI and humans in diagnosing prediabetes
    Is that... a good or bad thing?

Deep-learning algorithms have shown themselves equal to humans in detecting patients at high risk of developing Type-2 diabetes by analyzing CT scans of their pancreases, according to a research paper published on Tuesday.

Type-2 diabetes is estimated to affect 11.3 percent of the US population, or at least 37 million people. It can lead to issues with the circulatory, nervous, and immune systems, increasing the risk of heart disease and strokes.

Those with the initial form, prediabetes, can reverse their body's insulin resistance and avoid developing the full-blown condition if they change their diets and exercise habits. American health officials reckon 38 percent of the US adult population, some 96 million people, have prediabetes.

    Continue reading
  • TACC Frontera's 2022: Academic supercomputer to run intriguing experiments
    Plus: Director reveals 10 million node hours, 50-70 million core hours went into COVID-19 research

    The largest academic supercomputer in the world has a busy year ahead of it, with researchers from 45 institutions across 22 states being awarded time for its coming operational run.

Frontera resides at the University of Texas at Austin's Texas Advanced Computing Center (TACC), which said it has allocated time for 58 experiments through its Large Resource Allocation Committee (LRAC), the body that handles the largest proposals. To qualify for an LRAC grant, proposals must justify effective use of a minimum of 250,000 node hours and show that the research could not be done otherwise.

Two additional grant types are available for smaller projects as well, but LRAC projects utilize the majority of Frontera's nodes: an estimated 83 percent of Frontera's 2022-23 workload will be LRAC projects.

    Continue reading
  • Scientists make spin ice breakthrough
    Artificial spin ice with smallest features ever created could be part of novel low-power HPC

    Researchers at the Paul Scherrer Institute and ETH Zurich in Switzerland have managed to accomplish a technological breakthrough that could lead to new forms of low-energy supercomputing.

    It's based around something called artificial spin ice: think of water molecules freezing into a crystalline lattice of ice, and then replace the water with nanoscale magnets. The key to building a good spin ice is getting the magnetic particles so small that they can only be polarized, or "spun," by dropping them below a certain temperature. 

When those magnets are frozen, they align into a lattice shape, just like water ice, but with the added potential of being rearranged into a near infinity of magnetic combinations. Here the use cases begin to emerge, and a couple of breakthroughs from this experiment could move us in the right direction.

    Continue reading
