A Gamified Pain Management Intervention for Adults With Chronic Pain in Mainland China: Single-Arm Pre-Post Pilot Study With Machine Learning Predictive Modeling

Background: The widespread prevalence of chronic pain (CP) significantly impacts daily functioning worldwide. In mainland China, maintaining engagement in biopsychosocial interventions remains challenging. Gamification, designed based on self-determination theory, can enhance motivation, while machine learning (ML) algorithms can assist clinicians in dynamically optimizing pain management. Objective: This study aimed to (1) evaluate the preliminary effectiveness of a gamified pain management (GPM) program on CP and psychological outcomes and (2) identify key factors of significant pain improvements through the application of ML to guide intervention adjustments. Methods: A single-arm, pre-post study was conducted with 16 participants with CP in mainland China, recruited via social media using convenience sampling. Participants engaged in a 10-week web-based GPM intervention consisting of education, physical activities, and gamified elements, including points, avatars, and feedback. Primary outcomes were pain intensity and interference measured by the Brief Pain Inventory. Secondary outcomes included anxiety, depression, and quality of life. Analysis included paired tests, and ML models were trained to predict clinically meaningful pain reductions. Shapley additive explanations, least absolute shrinkage and selection operator regression, association rule mining, and Kaplan-Meier survival analysis were used to identify key predictors and optimal sessions and intervention durations across subgroups. Results: A total of 16 participants were engaged, with a mean age of 27.63 (SD 9.584) years. Paired tests showed significant improvements in pain intensity (decreased by 27.3%, 95% CI 1.061 to 3.064; P=.001), pain interference (decreased by 27.3%, 95% CI 8.159-17.216; P<.001), and psychological distress, including anxiety (t=3.538, 95% CI 0.969 to 3.906; P=.003) and depression (t=4.559, 95% CI 2.230 to 6.145; P<.001).
The gradient boosting model demonstrated the highest predictive accuracy (area under the curve=0.89 and accuracy=0.82). Least absolute shrinkage and selection operator regression identified session 3 (β=−0.45, 95% CI −0.68 to −0.22; P<.001) and session 5 (β=−0.32, 95% CI −0.59 to −0.05; P=.02) as most predictive of clinical success, while association rule mining revealed effective session combinations for different patient subgroups. Time-to-event analyses indicated that individuals with low back pain and higher baseline severity required longer intervention durations for improvement (5 wk; P=.03). Conclusions: This pilot study presents an innovative method that combines ML with dynamic engagement data from a GPM program during interventions, rather than relying on the static baseline data used in prior studies. The results show preliminary efficacy and identify specific optimal session combinations and personalized treatment durations for different pain subgroups. These exploratory findings contribute to the field by providing a data-driven method for adaptive, personalized digital health interventions that move beyond one-size-fits-all strategies, potentially enabling clinicians to modify content and dosage to improve engagement and outcomes if validated in larger sample trials. Trial Registration: Chinese Clinical Trial Registry ChiCTR2400094247; https://www.chictr.org.cn/showprojEN.html?proj=245138
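The modeling approach described in the abstract, training a gradient boosting classifier on per-session engagement data to predict clinically meaningful pain reduction, then using lasso regression to flag which sessions carry the predictive signal, can be sketched roughly as follows. This is a minimal illustration on simulated data: the dataset, the number of sessions, and the assumption that sessions 3 and 5 drive the outcome are all hypothetical stand-ins, not the study's data or code.

```python
# Sketch of the abstract's ML workflow on synthetic data (all values simulated).
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import Lasso
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients, n_sessions = 200, 10
X = rng.uniform(0, 1, size=(n_patients, n_sessions))  # engagement score per session

# Hypothetical ground truth: outcome driven mainly by sessions 3 and 5 (indices 2, 4)
logit = 3.0 * X[:, 2] + 2.0 * X[:, 4] - 2.5 + rng.normal(0, 0.5, n_patients)
y = (logit > 0).astype(int)  # 1 = clinically meaningful pain reduction

# Gradient boosting classifier evaluated by area under the ROC curve
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)
gbm = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1])

# Lasso zeroes out uninformative sessions, mimicking the predictor selection step
lasso = Lasso(alpha=0.05).fit(X, y)
key_sessions = [i + 1 for i, b in enumerate(lasso.coef_) if abs(b) > 1e-6]
print(f"test AUC: {auc:.2f}; sessions retained by lasso: {key_sessions}")
```

With only 16 real participants, a model like this would need careful cross-validation; the sketch simply shows how engagement-derived features, rather than static baseline covariates, can feed both prediction and session selection.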

Biobanks Set the Stage for Scaling Precision Medicine

Dating back more than a century, biobanks have outgrown their beginnings as small, local collections to become large, global facilities that store and handle millions of samples and serve thousands of researchers at any given time. Over the years, biobanks have transformed from passive repositories into active research infrastructures that are increasingly bridging the gap between medical research and clinical applications.

“Today’s biobanks have evolved far beyond sample storage,” said Yan Zhang, PhD, president of proteomic sciences at Thermo Fisher Scientific. “They are automated, digitally connected systems integrated with hospitals and health networks to ensure appropriate consent, longitudinal clinical context, and the ability to re-engage participants over time.”


As custodians of clinical samples, biobanks fulfill a central role in the advancement of precision medicine. Access to the right samples can make or break a research project, with most researchers reporting that they have had to limit their scope of work because of difficulties obtaining the samples they need.

“Robust, population-scale biobanking enables precision medicine to move from isolated findings toward broader clinical relevance,” said Zhang. “Modern biobanks combine genomics, proteomics, and other high-dimensional omics platforms with robust data architecture, high-performance computing, and artificial intelligence (AI)-driven modeling. Dedicated data science teams integrate molecular data, longitudinal health records, and curated public datasets to generate biologically meaningful interpretations.”

Biobanks now provide the infrastructure needed to support population-scale, longitudinal studies that allow scientists to uncover molecular drivers of disease and understand their evolution over time to ultimately identify biomarkers, develop targeted treatments, and inform clinical decisions.

“We’re seeing researchers design studies with scale in mind,” Zhang noted. “They’re combining proteomics, genomics, and clinical data to generate insights that are both statistically powerful and relevant to real-world populations. There’s also a clear shift from searching for a single biomarker to building a more complete, systems-level understanding of disease.”

To navigate today’s rapidly shifting landscape and meet their core purpose of supporting cutting-edge clinical research, biobanks have to keep up with fast-moving targets. Going forward, moving from initial discovery to translation will remain the number one challenge in precision medicine. “Generating discovery insight is no longer the limiting factor,” said Zhang. “Validating, standardizing, and implementing those insights at scale is.”

A matter of scale


One of the most transformative shifts in biobanking over the past decade has been an exponential increase in the scale of data collection and sample storage. At the forefront of this expansion is the UK Biobank, which currently stores around 18 million samples from 500,000 participants, together with imaging and biomarker data, healthcare records, questionnaires, physical measurements, demographics, lifestyle, and environmental data collected over the course of 20 years. This depth of phenotyping is what makes the data so valuable to researchers worldwide, said Martin K. Rutter, MD, professor of cardiometabolic medicine at the University of Manchester and deputy chief scientist at the UK Biobank. “When you link all that together, you can get amazing insights into the biology of disease.”

To keep up with increasing storage needs and researcher requests, the UK Biobank is now getting ready to move more than 10 million samples currently stored in its main laboratory to a new building in central Manchester by the end of the year. The new storage facility is designed to quadruple sample retrieval speed while making the whole infrastructure more energy-efficient and environmentally friendly.

The scale at which facilities like the UK Biobank operate today would have been unthinkable when the biobank was established two decades ago. Such massive growth has been driven by rapid technological advances across genomics, transcriptomics, and proteomics, with costs continuing to fall while coverage, speed, and accuracy keep surging.

Partnerships with the pharmaceutical industry have also been instrumental in nurturing this exponential growth. This can be seen in initiatives like the UK Biobank Pharma Proteomics Project (UKB-PPP), a collaboration between the UK Biobank and 14 biopharmaceutical companies with the goal of analyzing proteomics data from 600,000 samples.

In the long run, scale provides the backbone to enable increasingly ambitious, statistically powerful studies. However, as they grow, biobanks face the challenge of navigating a constantly shifting landscape while making sure the samples and data they collect, store, and maintain are valuable to the entire research community they serve.

“Our job is to make the data available to researchers,” said Rutter. “We are involved now more than ever in connecting with research teams and trying to understand what their needs are.”

Through surveys and consultations, the UK Biobank actively gathers information to design prospective data collection programs that anticipate researcher needs. Next year, the biobank is planning a repeat assessment of its whole cohort, focusing on measurements of aging. The goal is to support researchers looking into causal pathways and mechanisms driving age-related diseases, empowering the development of preventive interventions and new diagnostics and treatments for age-related conditions.

Keeping pace with the evolving demands of researchers, industry, and the broader public is essential for biobanks to secure the funding necessary not only to operate but also to expand such vast enterprises, which remains a major challenge across this resource-intensive field.

Diversity takes the spotlight

Historically, samples collected by biobanks have been biased in favor of participants who are white, middle-class, and highly educated. This creates major disparities in the applicability of clinical research. In fact, studies have shown that patients from non-European ancestry backgrounds have not benefited equally from precision drugs approved by the U.S. Food and Drug Administration (FDA) to treat a range of cancer indications.

Even within biobanks dedicated to sampling the population of a specific region, ethnic minorities, low-income, or elderly people are often underrepresented, skewing results away from the real-world populations they strive to serve. As the research community increasingly recognizes the importance of more diverse and representative patient cohorts, demand is rising for resources that address these barriers.

Representation is at the heart of All of Us, a program launched by the National Institutes of Health in 2018 to address the gap present at the time in many biobanks and sample repositories. This precision medicine initiative was designed to enroll participants who reflect the full range of populations found within the U.S., including individuals of varied ancestry backgrounds as well as those living in rural communities, which are rarely represented in biorepositories due in part to longstanding barriers to research participation, such as the logistical challenges of collecting samples and data from participants in remote locations.


“A lack of diversity impoverishes discovery and applicability of findings for all,” said Joshua C. Denny, MD, CEO of the All of Us Research Program.

For instance, data collected by All of Us has been used to investigate APOL1 gene variants linked to kidney disease, which are more common among people of West African ancestry. This research led to the identification of a novel APOL1 variant that can reduce the risk of kidney disease in individuals carrying high-risk variants.

The program has so far enrolled about 870,000 participants across all U.S. states, with about 80% of them representing communities that have historically been underrepresented in biomedical research. This has been achieved by emphasizing accessibility and flexible participation models; participants can enroll digitally and choose whether to share access to their electronic health records, donate biospecimens, and complete demographics and lifestyle surveys. They may also opt to provide saliva samples, simplifying logistics in rural areas with limited access to blood collection facilities.

“What works in a rural location is different from what works in a big city like New York,” said Denny. Whether it comes to location, age, or language, he emphasized the importance of adapting how the program approaches and engages each population.

Democratizing access to patient data across the research ecosystem is another major biobanking challenge that All of Us is committed to addressing. The program has established a streamlined access model that enables researchers to access the data they need in less than two hours if they belong to one of the 1,300 already approved institutions across the world. Together with central data storage and cloud-based analysis tools, their setup is designed to make the data accessible to researchers lacking the resources and local infrastructure for high-performance computing.

Towards global integration

With precision medicine studies steadily escalating both in size and complexity, researchers increasingly seek to bring together data stored across diverse biobanks to power larger, more ambitious studies with broader scientific and societal impact. However, building the infrastructure needed to enable cross-biobank studies is still a challenge, starting with convening stakeholders to harmonize data collection standards and establish international guidelines.

Anticipating this need, in 2013 the European Union established the Biobanking and Biomolecular Resources Research Infrastructure – European Research Infrastructure Consortium (BBMRI-ERIC), which currently coordinates the activity of about 500 biobanks across 32 countries.


“Precision medicine can only move forward with a strong starting point for research,” said Jens K. Habermann, MD, PhD, professor for translational surgical oncology and biobanking at the University of Lübeck and director general of the BBMRI-ERIC. “It can be very difficult for scientists to get all the information they need in one place, and this is what biobanks can enable.”

Pulling together data from all its members, the BBMRI-ERIC has set up a central catalogue for biobanks, biomolecular resources, and other data and sample collections, which users can employ to identify relevant resources and build virtual cohorts tailored to their research needs. The consortium also works with international committees to set guidelines and support members working towards compliance with international standards.

Despite ongoing progress, there are still obstacles ahead when it comes to harmonizing biobanking practices worldwide, including data collection, annotation, storage, and sharing. Tackling differences in data protection, consent, ethical standards, and regulatory requirements across borders will be another necessary step towards broader standardization. Finally, biobanks will need to invest in cybersecurity to ensure patient data can be shared between institutions safely.

Funding will be key to successfully addressing all these challenges. On this front, biobanks face the difficult task of maintaining their existing infrastructure, staying up to date and relevant to the research community, and investing in cross-biobank initiatives. All this must be balanced with growing financial pressure on research centers, hospitals, and the governments supporting them.

As part of its 10-year roadmap, the BBMRI-ERIC is setting the goal of forming international networks that bring together more diverse biobank types, such as environmental, wildlife, veterinary, and plant biodiversity repositories. The overarching aim is to move towards a One Health approach to biobanking, where samples and data that expand beyond monitoring human populations are brought together to tackle overlapping challenges that simultaneously affect human, animal, and environmental health.

Data-driven horizons

As the field forges ahead, biobanks are undergoing broad transformations in the way they operate. On the technology side, these changes are being propelled by the rise of multi-omics techniques in precision medicine research, as well as by rising demand from the research community for non-invasive patient monitoring data and longitudinal sample collection. All of these will be critical for the development of the next generation of personalized therapies and diagnostics.

“Over the next decade, biobanks are expected to become increasingly integrated into clinical and translational workflows,” said Zhang. “Proteomics, in particular, will play a growing role in helping us understand the dynamic biology of disease, enabling earlier detection, better prediction of recurrence, and more precise therapeutic strategies.”

A key driver of this shift will be AI. No longer just a supporting tool, AI is now becoming an integral part of biobank operations, contributing to real-time sample monitoring, predictive maintenance, risk management, and decision making.

On the data analysis side, Zhang has seen how AI is redirecting the focus from data generation to data interpretation. She said, “Biobanking has already enabled the collection of high-quality biospecimens linked to large-scale molecular and clinical datasets. The challenge now is extracting meaningful biological insight from that complexity.”

Although still in its early days, AI is becoming central to how researchers make use of biobank data, noted Rutter. Drawing from the UK Biobank data, recent studies have developed AI models that can predict a patient’s risk of stroke based on retinal images, calculate the risk of future disease by looking at an individual’s disease history, or spot neurodegenerative diseases like Alzheimer’s and Parkinson’s early using brain scans and physical activity data.

Going forward, Rutter expects to see biobanks moving away from static cohorts and in favor of continuous data collection, enabling more powerful predictions. For example, the UK Biobank is developing a mobile app that can track a participant’s physical activity and monitor their location and sleep patterns, offering an in-depth look at how a variety of factors affect their health with much more accuracy than self-reported surveys.

Over time, all these advances will steer clinical practice from treatment to prevention, allowing healthcare professionals to act early in the patient journey, when interventions are most effective, and eventually, even before disease develops. Ultimately, addressing complex diseases will require coordinated contributions from all stakeholders, including AI innovators, drug developers, clinicians, technology providers, and policymakers.

“The next decade will be incredibly exciting,” said Denny. “It will be all about leveraging the huge scale of resources that are just emerging today.”

 

Clara Rodríguez Fernández is a science journalist specializing in biotechnology, medicine, deeptech, and startup innovation. She previously worked as a reporter at Sifted and editor at Labiotech, and she holds an MRes degree in bioengineering from Imperial College London.

The post Biobanks Set the Stage for Scaling Precision Medicine appeared first on Inside Precision Medicine.

Gilead to Acquire Tubulis for Up to $5B, Expanding Cancer ADC Capabilities

Gilead Sciences has agreed to acquire Germany-based Tubulis for up to $5 billion, the companies said today, in a deal designed to expand the buyer’s antibody–drug conjugate (ADC) capabilities with a focus on fighting cancer.

Headquartered in Munich, privately held Tubulis has developed next-generation ADC candidates based on its own conjugation, linker, and payload technologies intended to more selectively deliver diverse payloads to tumors in indications deemed to be of high unmet need. The companies said Tubulis’ programs and platforms have broad potential across multiple tumor types, complementing Gilead’s development and commercialization expertise in oncology.

“We like the strategic fit and deal terms of the Tubulis (private) acquisition,” Daina M. Graybosch, PhD, senior managing director, immuno-oncology and a senior research analyst at Leerink Partners, wrote this morning in a research note. “This is more than an oncology bolt-on; we see real platform value in application of Tubulis’ ADC technologies to other therapeutic areas, namely virology.”

Tubulis’ lead pipeline candidate, TUB-040, is a sodium-dependent phosphate transport protein 2B (NaPi2b)-targeting topoisomerase-I inhibitor (TOPO1i) ADC that is now under study in the Phase Ib/II NAPISTAR1-01 trial (NCT06303505) assessing its safety, pharmacokinetics, and preliminary efficacy as a treatment for platinum-resistant ovarian cancer and non-small cell lung cancer (NSCLC).

In October at the European Society for Medical Oncology (ESMO), Graybosch noted, Tubulis presented data for TUB-040 showing a confirmed 50% overall response rate (ORR) and a 60% unconfirmed ORR across dose levels and irrespective of target antigen—results that were competitive with more mature datasets from leading TOPO1i ADCs.

“Though the dataset was early, and our primary outgoing question was how durability would mature, we suspect that Gilead saw durability maturing positively in their diligence,” Graybosch added. “If TUB-040 proves active in NSCLC, the program could complement their Trodelvy and IO [immune-oncology] lung programs. We wonder if Gilead saw early clinical NSCLC data in their diligence and if excitement around the emerging signal drove some of Tubulis’ valuation.”

Another Tubulis pipeline candidate, TUB-030, is a 5T4-targeting ADC that according to the companies has shown promising initial clinical data across various solid tumor types. TUB-030 is currently under study in the Phase I/IIa 5-STAR 1-01 trial (NCT06657222), a first-in-human study which aims to evaluate the safety, tolerability, pharmacokinetics, and efficacy of TUB-030 as a monotherapy in patients with advanced solid tumors. Tubulis has said it is developing TUB-030 for up to 13 undisclosed solid tumor indications.

Partners since 2024

The acquisition deal follows a two-year, up-to-$465 million collaboration with Tubulis launched in December 2024. Gilead gained access to Tubulis’ Tubutecan and Alco5 platforms after signing an exclusive option and license agreement to discover and develop an ADC against a solid tumor target.

At the time, Gilead agreed to pay Tubulis $20 million upfront and received an option that, if exercised, would give Tubulis an additional $30 million, plus up to $415 million in payments tied to achieving development and commercialization milestones, as well as mid-single to low double-digit tiered royalties on sales of marketed products resulting from the collaboration.

“Today’s agreement follows a two-year collaboration with Tubulis, which has given us strong conviction in their programs and research capabilities,” Gilead Chairman and CEO Daniel O’Day said in a statement. “The agreement to acquire Tubulis is a significant milestone in Gilead’s progress in oncology. The company brings a clinical-stage candidate that is a potential new treatment for ovarian cancer, as well as a next-generation ADC platform and a promising early pipeline.”

“Bringing this potential into Gilead would further expand what is already the strongest and most diverse pipeline in our company’s history,” O’Day declared.

Investors appeared less enthusiastic about the acquisition, as shares of Gilead dipped 1.7% in early Tuesday trading to $137.80 as of 12:01 p.m. ET.

Tubulis is Gilead’s third announced acquisition this year. The biotech giant announced plans in March to buy Ouro Medicines for up to $2.18 billion, and in February agreed to acquire Arcellx for up to $7.8 billion—for which it agreed last week to extend its tender offer until 5 p.m. ET on April 24.

Under the acquisition deal, Gilead agreed to acquire all of the outstanding equity of Tubulis for $3.15 billion in upfront cash payable at closing, and up to $1.85 billion in payments tied to milestones.

The transaction is expected to close in the second quarter, subject to the expiration or termination of waiting periods under specified regulatory filings and other customary conditions.

Upon closing of the deal, Tubulis will operate as a dedicated ADC research organization within Gilead, with the Munich site serving as a hub for ADC innovation, building on its integrated discovery, manufacturing, and clinical capabilities to advance next-generation ADCs.

Gilead said it plans to finance the transaction with a combination of cash on hand and senior unsecured notes. Gilead finished 2025 with $10.605 billion of cash, cash equivalents and marketable debt securities, up from $9.991 billion as of December 31, 2024.

The post Gilead to Acquire Tubulis for Up to $5B, Expanding Cancer ADC Capabilities appeared first on GEN – Genetic Engineering and Biotechnology News.

STAT+: Trump budget’s ‘America First’ drug policy proposals

You’re reading the web edition of D.C. Diagnosis, STAT’s twice-weekly newsletter about the politics and policy of health and medicine. Sign up here to receive it in your inbox on Tuesdays and Thursdays.

The 2026 STAT Madness competition was stacked with research on topics like smart dental floss that monitors stress, Baby KJ’s personalized gene therapy, and an artificial intelligence model designed to predict cell behavior. Check out the winner, unveiled this morning. And as always, send news tips to John.Wilkerson@statnews.com or John_Wilkerson.07 on Signal.

Budget reruns

The 2027 budget that the Trump administration released on Friday is in many ways a repeat of last year’s proposal: It includes deep cuts to the National Institutes of Health, the elimination of a health research agency, and the creation of a new agency devoted to chronic diseases called the Administration for a Healthy America.

Continue to STAT+ to read the full story…

U.S. Tumor Testing Suffers Under Access Disparities

Genomic testing for advanced cancer in the U.S. is hampered by unequal access across demographic groups and needs targeted policy solutions, researchers report.

Next-generation sequencing (NGS) was not carried out for thousands of the patients studied across the five most prevalent types of solid tumor, the team revealed in JAMA Network Open.

And among those who did undergo this testing, wait time significantly varied according to race, insurance status, and practice setting.

“Our findings highlight the underrepresentation of certain patient demographics in tumor genomic profiling, revealing disparities in access to standard-of-care diagnostic modalities,” reported researcher Chadi Hage Chehade, MD, from the University of Utah, and coworkers.

“These results emphasize the need for healthcare policies to mitigate these gaps.”

Precision oncology has defined a new era in cancer treatment, enabling clinicians to tailor care based on the specific clinicogenomic features of a patient’s tumor, allowing more effective and less toxic treatment strategies.

NGS has emerged as a transformative technology, enabling comprehensive genomic profiling and uncovering alterations for targeted therapies.

To examine equity of care in the field, the researchers studied electronic health record data, collected between 2018 and 2022, for patients with common advanced or metastatic cancers treated at more than 800 community and academic sites across the U.S.

The team examined time to first NGS testing and frequency of testing for 63,294 patients, including those with metastatic breast (19.1%), prostate (6.9%), pancreatic (9.7%), colorectal (21.6%), and non–small cell lung cancer (42.7%, NSCLC).

The median age in the group was 68 years, and 53.7% were female. In terms of race and ethnicity, 2.7% were Asian, 10.0% were Black, 6.0% were Hispanic, 61.0% were White, and 20.3% were other races and ethnicities.

The frequency of testing increased over the four-year span across all cancer types, but by the final year of the study, 40% to 50% of patients were still not receiving NGS testing.

Results showed there were differential rates of testing and longer waiting times to NGS testing in some groups.

Patients with lower socioeconomic status (SES), non-Hispanic Black or Hispanic patients, those covered by Medicare, Medicaid, or other government programs, and those treated at an academic practice setting were significantly less likely to be tested in some of the cancers than patients with high SES, non-Hispanic White patients, those covered by a commercial health plan, or those treated in community practice, respectively.

Among specific cancers, Hispanic patients were significantly less likely to be tested in metastatic breast or prostate cancer, and non-Hispanic Black patients were less likely to receive NGS in advanced NSCLC, metastatic colorectal cancer, or metastatic pancreatic cancer.

The findings highlight the need to improve access to standard-of-care diagnostic modalities and serve as a call to improve NGS testing rates nationwide, said Igor Makhlin, MD, in an accompanying Commentary article.

“While the accelerating pace of research and AI-driven technology is poised to herald the next generation of discoveries that translate into greater survival for patients with cancer, we cannot ignore the increased burden to stay up to date, largely borne by community oncologists who manage a wide gamut of solid and liquid cancers,” he maintained.

“Creation and adoption of innovative strategies to support clinicians in implementing breakthrough advances into their practice regardless of zip code, practice site, or other factors will require a concerted effort by all relevant stakeholders, but closing this gap in GCC is absolutely necessary. Our patients are depending on us.”

The post U.S. Tumor Testing Suffers Under Access Disparities appeared first on Inside Precision Medicine.

STAT+: Many cancer patients don’t get genomic tests to guide treatment, study finds

For some advanced cancers, sequencing the tumor genome should be one of the first steps patients and physicians take. But a new study finds that many patients never receive genomic testing and so never get the chance to know if they might have benefitted from newer, more targeted therapies.

The study, published on Tuesday in JAMA Network Open, examined how many patients diagnosed with one of five different metastatic cancers received genetic sequencing for the cancers. For most cancers in the study, roughly half of patients in the cohort received genetic sequencing. Patients with low income, Medicare or Medicaid coverage, and Black or Hispanic race or ethnicity were also less likely to receive sequencing.

Cancer medicine and research have made enormous progress over the last few decades. The overall five-year survival rate has pushed up to 70% as of 2026, and the five-year survival rate for metastatic cancer has doubled since the 1960s. That’s in large part thanks to advances in medicines and technologies that can help treat cancer, like targeted therapies that work by exploiting key cancer mutations.

Continue to STAT+ to read the full story…

Autoimmune Disease-Related Inflammation Reduced with ENDOtollins Drug

In a new study published in Nature Chemical Biology, titled “Munc13-4–STX7 inhibitors impair endosomal TLR activation and systemic inflammation,” scientists from Scripps Research describe a new class of drug compounds, called ENDOtollins, that reduce harmful inflammation while maintaining the body’s ability to fight infections. The results offer new directions to treat autoimmune diseases, such as lupus and rheumatoid and juvenile arthritis, which together affect more than 15 million Americans.

“A key component of our approach is to begin by understanding the biological mechanisms at play,” said Sergio Catz, PhD, professor at Scripps Research and corresponding author of the study. “By accomplishing this first, we can more easily target the pathway driving inflammation without affecting other important processes.” 

Current autoimmune disease treatments, such as hydroxychloroquine, function by broadly blocking endosomes. While effective, this approach can produce significant side effects that lead patients to stop treatment, including gastrointestinal problems and, less commonly, vision damage.

The authors focused on two proteins, Munc13-4 and syntaxin 7, that bind together to activate Toll-like receptors (TLRs), immune sensors located in endosomes. This mechanism plays a key role in detecting foreign DNA and RNA from viruses and bacteria. In autoimmune diseases, TLRs become overactive and trigger chronic, damaging inflammation in the absence of a threat.

The team screened roughly 32,000 compounds and identified molecules that specifically block the Munc13-4–syntaxin 7 interaction without disrupting other cellular functions. Given that Munc13-4 is found mainly in immune cells, the compounds offer a targeted approach to reduce inflammation. 

“Most treatments for autoimmune diseases manage symptoms; they don’t change the underlying course of the disease,” said Hugh Rosen, MD, PhD, professor at Scripps Research and co-author of the study. “What’s exciting about this approach is its potential to be disease-modifying: targeting the specific molecular machinery that drives inflammation, rather than broadly suppressing the immune system.” 

Notably, the study screened compounds in an intact cellular environment, which contrasts with many drug screening approaches that extract proteins from the cell.

“By maintaining the proteins in their natural environment, we increase the likelihood that compounds we find will actually work in living cells,” said Jennifer Johnson, PhD, first author and senior staff scientist at Scripps Research. 

The most potent compound, ENDO12, reduced inflammation in animal models challenged with a TLR-activating molecule. Blood levels of inflammatory markers, including the immune system activators IL-6 and IFN-γ and the enzyme myeloperoxidase, dropped significantly in treated animals.

ENDO12-treated animals mounted a normal antiviral immune response when exposed to a virus. This selectivity addresses the concern that dampening inflammation with immunosuppressive drugs may leave patients vulnerable to infection.

Looking ahead, the team will test ENDOtollins in models that more closely mimic human autoimmune diseases and evaluate the compounds’ chemistry for potential clinical use. 

Beyond autoimmune conditions, the researchers suggest ENDOtollins might help treat cytokine storms, the dangerous immune overreactions seen in patients with severe COVID-19 and as a side effect of CAR T cancer therapy. Both involve excessive IL-6 and runaway inflammation. 

While translating these findings into treatments for patients remains a long-term goal, Catz emphasizes that the mechanistic insights are valuable in their own right. ENDOtollins can serve as precision tools to probe other cellular processes regulated by endosomes and lysosomes, including pathways implicated in neurodegeneration and immune dysfunction.  

The post Autoimmune Disease-Related Inflammation Reduced with ENDOtollins Drug appeared first on GEN – Genetic Engineering and Biotechnology News.

Rewriting Life Before Birth: Entering the Fetal Genetic Intervention Era

A woman lies on an exam table, holding her partner’s hand tightly with anticipation, as a technician glides an ultrasound probe across her abdomen. On the screen, shifting staticky shadows resolve into a skull, a liver, and the flicker of a beating heart. For many families, this moment brings joy and relief. For others, it’s paralyzing, as doctors detect signs that something is wrong.

A single nucleotide change can cause neurodevelopmental delays and dysmorphism, failing livers, and arrhythmia-ridden hearts. For decades, medicine could only identify these conditions, usually after birth. Prenatal screening has made it easier to detect progressive diseases like Duchenne muscular dystrophy, which damages and degenerates muscle before symptoms typically appear in childhood. But treating before birth could preserve tissue prior to the onset of irreversible deterioration.

Once unthinkable, genetic diseases can now be treated before birth. Fetal genetic intervention—including early screening, in utero gene therapy, stem cell transplantation, and even embryo editing—aims not just to diagnose disease but to correct it at its earliest stages. It is a rapidly advancing frontier, defined by technological promise and profound ethical questions.

It starts with detection

Jennifer Hoskovec, vice president of medical affairs at BillionToOne, has spent more than 20 years in prenatal genetics, an era dominated by risk assessment rather than intervention.

Historically, prenatal genetic screening has fallen into two main categories. Aneuploidy testing determines the risk of Down syndrome and other trisomies, sex chromosome abnormalities, and specific microdeletions. Screening is essential for these de novo mutations, which have no U.S. Food and Drug Administration (FDA)-approved genetic interventions. Pregnancies at high risk for Down syndrome may warrant a fetal echocardiogram, closer ultrasound monitoring, or delivery at a tertiary care center with neonatal support. The standard practice is to screen, monitor, and manage.

Jennifer Hoskovec
Vice President
BillionToOne

The second category involves inherited recessive conditions like cystic fibrosis (CF), spinal muscular atrophy (SMA), and phenylketonuria. If both parents are carriers for the same genetic mutation, then their child has a 25% chance of being affected. Testing typically requires samples from both parents. If both are carriers, chorionic villus sampling (CVS) and amniocentesis can detect fetal abnormalities in the first and second trimesters, respectively. However, getting each partner to follow up is a major hindrance. “When people go through a screening process and are found to be carriers, less than 50% of their partners complete the testing,” Hoskovec told Inside Precision Medicine. “Half of U.S. carriers of these genetic conditions, whether common or rare, don’t know what it means for their pregnancy. That limits their ability to get diagnostic testing because we do not have all the pieces of the puzzle.”

Hoskovec’s team developed a workaround: a single-gene noninvasive prenatal test that analyzes fetal cell-free DNA (cfDNA) circulating in maternal blood. Around nine weeks into pregnancy, fragments of fetal DNA shed from the placenta can be sequenced and quantified. If a mother is a carrier for a condition like CF or sickle cell disease, the test looks for a second variant that is not present in her own DNA, which would indicate a paternal contribution.

“For example, if a mother has [the] sickle cell trait, we first sequence the full beta-globin gene in the cfDNA, which contains a mixture of maternal and fetal DNA,” Hoskovec said. “We look for a second variant not present in the mother that would indicate paternal contribution.”

While it does not replace CVS or amniocentesis, Hoskovec said the test is highly sensitive, identifying 95% of affected pregnancies in the conditions it covers. Crucially, it does not require partner testing. “This is a stepping stone,” Hoskovec explained. “This earlier detection will likely accelerate the field by increasing the number of eligible patients for clinical studies and registries, improving equitable access across ethnic groups, and advancing precision medicine in prenatal care.”

Avoiding germline editing

David H. Stitelman
David H. Stitelman, MD
Associate Professor
Yale-New Haven Children’s Hospital and Yale School of Medicine

As screening opens the door, fetal surgeons and gene therapy researchers are taking their first steps through it. David H. Stitelman, MD, a pediatric surgeon at the Yale School of Medicine, believes prenatal treatment has distinct advantages. The fetus is small, so it can receive higher doses relative to its weight. Its immune system is still developing and therefore more tolerant, its stem cells are dividing rapidly, and its organs are still forming, so problems can be fixed before they become permanent. And because the placenta handles oxygen exchange, lung conditions like congenital diaphragmatic hernia can be treated during fetal life, whereas once a newborn takes a first breath, defective lungs can spell immediate crisis.

Fetal therapy is not new. Specialized centers have performed open fetal surgery for spina bifida, interventions to promote lung growth in diaphragmatic hernia, and blood transfusions for fetal anemia dating back to the 1960s. What is new is the molecular toolkit. Stitelman’s lab is investigating gene editing methods that use the cell’s repair machinery to fix one- to three-base-pair DNA errors. Another team, led by pediatric and fetal surgeon Tippi MacKenzie, MD, at the University of California, San Francisco, is using viruses to replace genes for lysosomal storage diseases and fetal stem cells for alpha thalassemia.

Some diseases require only modest correction. In hemophilia, expressing just 1% of normal clotting factor levels improves outcomes greatly. Increasing the expression of functional CFTR protein to 15% of wild-type levels may cure CF or at least make it manageable. Even a small number of corrected liver cells in hereditary tyrosinemia can boost growth and repopulate the organ. However, some situations, such as congenital cancer syndromes, may require nearly 100% correction. At present, Stitelman’s team achieves single-digit percentage editing in models of CF and beta thalassemia. “We’re in the optimization phase,” Stitelman told Inside Precision Medicine. “We are testing different nanoparticles and generations of editing strategies to incrementally reach therapeutic levels.”

Stitelman draws a clear ethical boundary: this is somatic editing, not germline editing. The aim is to treat the fetus as a patient, not to create heritable genetic changes. Instead of editing embryos in vitro, systemic therapeutic agents are delivered to avoid reproductive cell damage.

Unintended germline modification remains a concern. Editing a target gene could inadvertently disrupt developmental genes and affect future generations. But, Stitelman argues, medicine always carries risk. “In 1950, children with leukemia all died,” said Stitelman. “Today, some forms have a 98% long-term survival rate with chemotherapy. We know chemotherapy can cause germline mutations, yet we accept that risk because it saves lives. With gene editing, the issue is not zero risk but understanding and quantifying the risk. Ideally, there would be no measurable off-target effects. In the places we have examined, we have not seen off-target effects.”

One pregnancy, two patients

In a landmark trial known as the Management of Myelomeningocele Study, published in 2011, investigators found that fetal surgery for severe spina bifida (myelomeningocele) achieved better results than postnatal repair. Surgically closing the spinal defect in utero improved motor function and reduced the need for shunting to relieve hydrocephalus. The benefit was so clear that the trial was stopped early, and the results changed how doctors treat structural birth defects.

Aijun Wang
Aijun Wang, PhD
Professor
University of California, Davis

At the University of California, Davis, biomedical engineer Aijun Wang, PhD, is working closely with fetal surgery pioneer Diana L. Farmer, MD, to evolve fetal intervention from heroic surgery to cellular and molecular therapy. Wang and Farmer launched the Cellular Therapy for In Utero Repair of Myelomeningocele (CuRe) trial, combining fetal surgery with stem cell transplantation. The goal is to not only close the spinal defect but also restore neural tissue and improve long-term function.

The lens that Wang has used to focus his research is fetal and maternal safety. “The fetus is the patient, but treatment inevitably carries some risk to the mother,” Wang told Inside Precision Medicine. “Open fetal surgery, in particular, poses significant maternal risk. Genetic treatments introduce additional uncertainties because the long-term effects of DNA modification are not fully understood. Safety must remain the highest priority.”

Genetic medicine delivery is a critical challenge for all life stages, but the stakes are particularly high for a developing fetus. In fetal development, targeting stem cell populations is especially important because these cells are highly active, proliferating, and migrating. If edited successfully at the right developmental window, their progeny will carry the correction. The danger is an edit that is not merely unsuccessful but harmful, since that damage would propagate as well.

Wang’s lab focuses on delivery systems, particularly lipid nanoparticles carrying mRNA that encodes gene-editing enzymes. For genetic manipulation and high-throughput screening, Wang’s lab utilizes mouse models. Fetal sheep are used for scaling and dosing, while human organoids are used for human-specific editing and functional outcomes.

“In our clinical work, we have engaged with the FDA and conducted extensive preclinical studies,” said Wang. “Using multiple complementary models is essential. Combining small animal models, large animal translational models, and human organoid systems provides a comprehensive framework for product development, from early screening to human-focused therapeutic design.”

Although the field is highly exciting and progressing rapidly, Wang warns against premature application, which could be dangerous. Safety, developmental biology, ethical considerations, and multidisciplinary collaboration are all essential. “Despite the excitement in the field, we must proceed cautiously,” said Wang. “There is strong potential for correcting specific mutations, especially point mutations, using precise gene editing approaches such as base editing. However, safety evaluation must precede rapid clinical application.”

Effective progress requires a village of physicians, surgeons, researchers, engineers, and ethicists working together, along with caution, responsibility, and thorough evaluation before clinical use.

The earlier, the better

If fetal intervention treats a diagnosed fetus, embryo editing operates even earlier—at the blastocyst stage in in vitro fertilization (IVF). Norbert Gleicher, MD, a fertility specialist known for treating some of the oldest and most difficult IVF patients in the United States, approaches genetic technologies with caution. Due to biological mosaicism, sampling limitations, and his belief that many abnormal embryos self-correct or develop normally, Gleicher opposes preimplantation genetic testing for aneuploidy.

Norbert Gleicher
Norbert Gleicher, MD
Founder & Medical Director
Center for Human Reproduction

But when it comes to single-gene diseases, he sees a different calculus. Couples with recessive mutations may have one-in-four embryos affected, and in dominant or X-linked diseases, half may carry the mutation. For patients who produce few embryos—especially older women—discarding affected embryos can mean losing precious chances at pregnancy. “If you can cure an embryo rather than discard it,” Gleicher told Inside Precision Medicine, “that makes a lot of sense.”

For single-gene diseases, Gleicher believes genetic editing with CRISPR or other platforms is the most straightforward intervention. He points to the 2025 work at the Children’s Hospital of Philadelphia on Baby KJ as a recent milestone. Even partial correction, which Gleicher believes is likely the case with Baby KJ—though no liver biopsies have been performed—can transform prognosis. Gleicher said, “Correcting some cells was enough to clinically cure the baby, at least for the time being, from symptoms of a disease that historically kills affected children within a few years. However, we do not know whether the treated baby, who likely still has many affected cells, might become symptomatic again later in life.”

To Gleicher, success in a newborn is all the more reason to apply genetic intervention to fetal stages. “If this can be successful in a full human being, imagine how much easier it would be at the blastocyst stage, or even earlier at the cleavage stage, when the embryo consists of only six to eight cells,” said Gleicher. “If [CRISPR] is applied at that point, correcting those six to eight cells would mean that all their daughter cells would also be corrected. The result would be a normal baby at birth. That is the much stronger argument in this case.”

Just because something is possible doesn’t mean it should be done, and Gleicher establishes a clear ethical boundary. Editing to prevent a devastating single-gene disease is one thing. Editing for traits—eye color, intelligence, polygenic risk scores—is another. Polygenic predictions explain only a fraction of trait variance, and embryo implantation itself is uncertain. To him, offering polygenic selection in IVF is not only scientifically dubious but also ethically troubling. “It is surprising that professionals, particularly in genetics, would suggest such an approach,” said Gleicher. “It is worse than snake oil, because while snake oil may occasionally work by accident, this carries a real risk of causing serious harm.”

A pretty penny

What ultimately sets fetal genetic intervention apart is timing. Early screening increases experimental trial eligibility, and early treatment may preserve organ development before irreversible damage. In conditions like CF and SMA, where postnatal gene therapies are expensive and delivered after injury, fetal intervention could change outcomes. Frontline screening can identify high-risk pregnancies at 11 weeks regardless of family history or ethnicity, expanding trial access.

Yet fetal genetic interventions require specialized teams, advanced delivery systems, counseling, and long-term follow-up. Without careful planning and reimbursement policies, only a few top-tier centers could offer them, widening disparities. Ethical scrutiny remains inseparable from progress. Innovation must balance maternal risk, fetal benefit, and future consequences with safety, appropriate use, and clear limits. As prenatal care shifts from prediction to prevention, restraint and evidence will determine its future.

 

Jonathan D. Grinstein, PhD, North American editor for Inside Precision Medicine, investigates the most recent research and developments in a wide range of human healthcare topics and emerging trends, such as next-generation diagnostics, cell and gene therapy, and AI/ML for drug discovery. He is also the host of the Behind the Breakthroughs podcast, featuring people shaping the future of medicine. Jonathan earned his PhD in biomedical science from the University of California, San Diego, and a BA in neural science from New York University.

The post Rewriting Life Before Birth: Entering the Fetal Genetic Intervention Era appeared first on Inside Precision Medicine.

The Download: AI’s impact on jobs, and data centres in space

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The one piece of data that could actually shed light on your job and AI 

Within Silicon Valley’s orbit, an AI-fueled jobs apocalypse is spoken about as a given. Now even economists who have downplayed the threat are coming around to the idea.  

Alex Imas, based at the University of Chicago, is one of them. He believes that any plan to address AI’s impact will depend on collecting one vital piece of data: price elasticity. 

Imas argues that “we need a Manhattan Project” for this. Read the full story to find out why.

—James O’Donnell 

This article is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday. 

Four things we’d need to put data centers in space 

In January, Elon Musk’s SpaceX applied to launch up to 1 million data centers into Earth’s orbit. The goal? To fully unleash the potential of AI—without triggering an environmental crisis on Earth. 

SpaceX is among a growing list of tech firms pursuing orbital computing infrastructure. But can their plans really work? Here are four must-haves for making space-based data centers a reality.

—Tereza Pultarova 

This story is part of MIT Technology Review Explains, our series untangling the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here. 

The must-reads 

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology. 

1 Trump has again proposed major cuts to US science and tech spending 
He wants to slash nearly every science-focused agency. (Ars Technica)
+ If Trump gets his way, the US could face a costly brain drain. (NYT $)
+ Top research talent is already fleeing the country. (Guardian)
+ Basic science deserves our boldest investment. (MIT Technology Review)

2 Sam Altman lobbied against AI regulations he publicly welcomed  
A bombshell report reveals many OpenAI insiders don’t trust him. (The New Yorker $) 
+ Some have called him a sociopath. (Futurism)
+ OpenAI’s CFO fears it won’t be IPO-ready this year. (The Information $)
+ A war over AI regulation is brewing in the US. (MIT Technology Review)

3 NASA’s Artemis II has broken humanity’s all-time distance record 
The astronauts have flown farther than any humans before them. (BBC)
+ Their mission includes MIT-developed technology. (Axios)

4 Chinese tech firms are selling intel “exposing” US forces 
It comes from combining AI with open-source data. (WP $)
+ AI is turning the Iran conflict into theater. (MIT Technology Review)

5 War is pushing countries to ditch hyperscalers 
Driven by Iran naming tech giants as military targets. (Rest of World)
+ No one wants a data center in their backyard. (MIT Technology Review)

6 OpenAI, Anthropic, and Google have united against China’s AI copying 
They’re sharing information on “adversarial distillation.” (Bloomberg $)

7 Anduril and Impulse Space are working on Trump’s “Golden Dome” 
They’re developing space-based missile tracking for the project. (Gizmodo)  

8 OpenAI has urged California to probe Elon Musk’s “anti-competitive behavior.” 
It accuses Musk of trying to “take control of the future of AGI.” (Reuters $) 
+ And claims he coordinated attacks with Mark Zuckerberg. (CNBC)
+ A former Tesla president has revealed how he survived working for Musk. (WP $) 

9 DeepSeek’s new AI model will run on Huawei chips 
It’s expected to launch in the next few weeks. (The Information $) 

10 Memes have nuked our culture 
Internet “brain rot” has escaped our phones to take over everything. (NYT $) 

Quote of the day 

“I must say, it was actually quite nice.” 

 —Astronaut Victor Glover tells President Donald Trump what it was like when Artemis II was out of communication with the rest of humanity, The New York Times reports. 

One More Thing 


Inside the controversial tree farms powering Apple’s carbon-neutral goal  

In 2020, Apple set a goal to become net zero by the end of the decade. To hit that target, the company is offsetting its emissions by planting millions of eucalyptus trees in Brazil. 

Apple is betting that the strategy will lead to a greener future. But critics warn that the industrial tree farms will do more harm than good. 

Find out why the plans have sparked a backlash. 

—Gregory Barber 

We can still have nice things 

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line.) 

+ Japan’s automated bike garage is a cyclist’s dream come true.  
+ This deep dive into bird behavior reveals the secrets of their dining habits. (Big thanks to reader Terry Gordon for the find!) 
+ The first photo from the Artemis astronauts vividly captures the glow of our atmosphere. 
+ There’s a new contender for the world’s most gorgeous website: RobertDeNiro.com. 

Synaptic remodeling and the female depression exposome: a mini-review of neuroendocrine, epigenetic, and social determinants

Depression is a multifactorial, chronic disorder and represents a leading cause of disability, with women exhibiting nearly twice the lifetime prevalence compared to men. Growing evidence indicates that this disparity cannot be explained by hormonal or psychosocial factors alone, but rather by dynamic interactions between environmental exposures, neuroendocrine signaling, and epigenetic regulation across development. This mini-narrative review aimed to examine how sex-specific exposome components interact with epigenetic mechanisms and synaptic remodeling processes to influence vulnerability to Major Depressive Disorder in women. The reviewed evidence demonstrates that fluctuations in ovarian hormones modulate HPA axis responsivity, neuroinflammatory signaling, and glutamatergic transmission through epigenetic regulation of stress-responsive genes such as NR3C1, SLC6A4, and BDNF, consequently influencing synaptic remodeling within corticolimbic circuits. Environmental and social exposures, particularly early-life adversity and psychosocial stressors, further interact with microglial activation and chromatin remodeling to produce long-lasting alterations in hippocampal and prefrontal plasticity. Collectively, these findings support a model in which sex-dependent neuroendocrine sensitivity amplifies exposome-driven epigenetic programming across the lifespan. Future research directions emerging from this synthesis include longitudinal life-course studies integrating multi-omic biomarkers, quantitative exposome assessment, and neuroimaging approaches to identify modifiable environmental targets and advance precision, sex-informed preventive and therapeutic strategies in depression.