How an Alzheimer’s Risk Gene Rewires the Brain Decades Before Symptoms

For millions of people worldwide, carrying the APOE4 gene variant means a significantly higher risk of developing Alzheimer’s disease. Yet one of the biggest unanswered questions has been when, and how, that risk begins to take hold in the brain.

New research from the Gladstone Institutes, published in Nature Aging, suggests that the effects of APOE4 emerge far earlier than previously understood. The study shows that subtle but important changes in brain activity occur long before memory loss begins, offering a potential window for early intervention.

Early changes in a seemingly healthy brain

Alzheimer’s disease is typically diagnosed after cognitive symptoms appear, but growing evidence suggests that the disease process begins decades earlier. The new study adds to this picture by demonstrating that brain circuits in young individuals carrying APOE4 are already functioning differently.

“We found fundamental changes in brain circuits occurring in young mice that still had normal learning and memory, and importantly, that those changes predicted the development of cognitive deficits at older ages,” said Misha Zilberter, PhD, principal staff research scientist at Gladstone and senior author of the study.

The researchers observed increased neuronal activity in the hippocampus, a brain region essential for learning and memory. Similar patterns of hyperactivity have been reported in human APOE4 carriers, even before clinical symptoms arise.

According to the scientists, this suggests that Alzheimer’s risk is not simply a matter of late-stage degeneration, but may instead involve long-term changes in how brain circuits are wired and function.

Smaller neurons, stronger signals

To understand what drives this early hyperactivity, the team examined individual brain cells. They found that neurons in key regions of the hippocampus were physically smaller in APOE4 carriers compared to those with the more common, lower-risk APOE3 variant.

While this might seem like a minor structural difference, it has functional consequences. Smaller neurons are more easily activated, meaning they fire more readily in response to stimuli. This heightened sensitivity can lead to persistent hyperactivity within neural circuits.

Over time, this imbalance may place stress on the brain and contribute to the gradual decline seen in Alzheimer’s disease.

A surprising source of dysfunction

For years, researchers believed that APOE4’s effects were primarily driven by astrocytes, support cells in the brain that produce most of the APOE protein. However, the new findings challenge this assumption.

The team discovered that the disruptive effects on brain activity were instead linked to APOE4 produced directly by neurons themselves. When APOE4 was removed from neurons, their size and activity returned to normal. Removing it from astrocytes, by contrast, had little effect.

This shift in understanding refocuses attention on neurons as key drivers of early disease processes, rather than passive victims of surrounding dysfunction.

A reversible pathway—and a new target

Perhaps the most striking finding of the study is that these early changes may not be permanent.

The researchers identified a protein called Nell2 as a central player in the process. Levels of Nell2 were elevated in APOE4 neurons and appeared to drive both the reduction in cell size and the increase in neuronal activity.

By reducing Nell2 levels in adult mice, the team was able to restore normal neuron structure and function—even after the changes had already occurred.

“What’s exciting about Nell2 is that we were able to reverse the disease manifestations in adult mice by lowering its level,” said Yadong Huang, co-senior author of the study. “That tells us the damage is not irreversible […].”

This raises the possibility of developing therapies that target Nell2, potentially slowing or preventing disease progression in individuals at high genetic risk.

Implications for early intervention

APOE4 is present in roughly one in four people and in the majority of Alzheimer’s patients. Despite this, current treatments largely focus on late-stage symptoms rather than early prevention.

The new findings suggest that intervening earlier, before cognitive decline begins, could be key. If brain circuit changes can be detected and corrected at an early stage, it may be possible to delay or even prevent the onset of Alzheimer’s disease.

The study also highlights the importance of understanding how genetic risk translates into functional changes in the brain. Rather than acting as a simple risk marker, APOE4 appears to actively reshape neural activity over time.

A shift in perspective

More broadly, the work reflects a growing shift in Alzheimer’s research, from focusing solely on hallmark features such as amyloid plaques and tau tangles to examining earlier, subtler changes in brain function.

By identifying a concrete pathway linking genetic risk to altered brain activity, the study provides a clearer framework for understanding how the disease develops.

“This study is a big breakthrough for the field of Alzheimer’s research,” Huang said. “It opens the door to a better understanding of how APOE4 alters the function of neurons at a young age to increase risk of cognitive decline, and to the development of therapies that could block the detrimental effects of APOE4 early on.”

While the findings are based on mouse models, they align closely with observations in humans and offer a strong foundation for future research. The next steps will involve determining whether targeting Nell2 or similar pathways can produce similar benefits in human patients.

If successful, such approaches could transform how Alzheimer’s disease is treated, not as an inevitable consequence of aging, but as a process that can be detected early and potentially reversed.


Epigenetic Strategy Restores Tumor Suppressor in Acute Myeloid Leukemia Models

Scientists from The Jackson Laboratory (JAX) and collaborators at other institutions have found a potential way to treat acute myeloid leukemia (AML) that involves turning a key cancer-fighting gene back on. Besides potentially treating AML without harsh chemotherapy regimens, their work also highlights a promising strategy for studying gene-silencing mechanisms in other diseases. Full details of the study, which was done in mice, are available in a paper published in Science Translational Medicine titled “Epigenetic reactivation of the tumor suppressor ZBTB7A by KDM4 inhibition in human acute myeloid leukemia.”

Normally, tumor suppressor genes work to prevent cells from becoming cancerous. But in cancers like AML, some of these genes are switched off epigenetically. These changes to gene activity are difficult to track because standard DNA sequencing technologies are designed to find mutated DNA. “If we can identify which genes have been silenced and understand how to turn them back on, that could open up entirely new therapeutic possibilities,” said Eric Wang, PhD, an assistant professor at JAX who led the research. “Instead of only trying to kill these cells, we may be able to restore the mechanisms that normally keep them under control.”

Though scientists have made great strides in developing therapies for AML, the prognosis for the disease is still relatively poor. Part of the challenge is that AML cells remain in an immature, stem cell-like state. According to the paper, Wang and his team developed a tool that combines fluorescence in situ hybridization and flow cytometry with CRISPR gene editing technology to map gene activity in cells. They used the tool, called FISHnCRISP, to identify a tumor-suppressing gene called ZBTB7A that is silenced in AML patients. By restoring ZBTB7A expression, the scientists forced the cancer cells into a state where they grew less aggressively.

Digging into the details, the researchers found that AML cells produce a longer version of ZBTB7A’s regulatory tail that contains sites attracting a protein called ZFP36L2, which reduces the gene’s activity. Additionally, a family of enzymes known as KDM4 modifies how DNA is packaged inside AML cells, which effectively silences ZBTB7A expression. Data from experiments in mice with AML showed that when KDM4 enzymes were blocked, ZBTB7A regained its expression, reducing leukemia burden while leaving normal blood formation largely unaffected.

Importantly, “there are drug candidates out there to inhibit KDM4, and in our study we just repurposed one of them to treat AML cells,” Wang said. “We won’t know unless we test it in clinical trials, but this approach could be better than chemotherapy, because we showed it’s not toxic at all to normal blood cells.” 

Future studies will focus on refining the approach and determining whether it might be combined with existing treatments. The team plans to evaluate an experimental drug that targets KDM4 and is currently in a clinical trial for solid tumors.

“We demonstrated that downregulating ZBTB7A causes this hyperinflammatory state that promotes cancer growth” and “now, we’re proposing this epigenetic approach to force AML cells to differentiate into white blood cells that eventually undergo cell death,” Wang said. “We could potentially translate our research into an early phase clinical trial more readily than developing a whole new compound from scratch.” 


STAT+: NIH would get $5 billion cut under Trump’s 2027 budget, but Congress unlikely to go along

The White House is asking Congress to cut $5 billion from the National Institutes of Health and to reduce the number of its institutes and centers from 27 to 22 — a plan that is expected to receive a chilly reception from lawmakers of both parties.

The president’s fiscal year 2027 budget request, released Friday, asks for $41 billion for the NIH and eliminates the National Center for Complementary and Integrative Health, the Fogarty International Center, and the National Institute on Minority Health and Health Disparities. The 2027 budget also proposes consolidating two institutes focused on research on drug and alcohol abuse into a new entity called the National Institute of Substance Use and Addiction Research, as well as relocating the National Institute of Environmental Health Sciences into the Centers for Disease Control and Prevention. 

The White House proposal also asks Congress to slash the budget for the Advanced Research Projects Agency for Health (ARPA-H), which funds cutting-edge science, from its current $1.5 billion to $945 million.


Four things we’d need to put data centers in space


In January, Elon Musk’s SpaceX filed an application with the US Federal Communications Commission to launch up to one million data centers into Earth’s orbit. The goal? To fully unleash the potential of AI without triggering an environmental crisis on Earth. But could it work?

SpaceX is the latest in a string of high-tech companies extolling the potential of orbital computing infrastructure. Last year, Amazon founder Jeff Bezos said that the tech industry will move toward large-scale computing in space. Google has plans to loft data-crunching satellites, aiming to launch a test constellation of 80 as early as next year. And last November Starcloud, a startup based in Washington State, launched a satellite fitted with a high-performance Nvidia H100 GPU, marking the first orbital test of an advanced AI chip. The company envisions orbiting data centers as large as those on Earth by 2030.

Proponents believe that putting data centers in space makes sense. The current AI boom is straining energy grids and adding to the demand for water, which is needed to cool the computers. Communities in the vicinity of large-scale data centers worry about increasing prices for those resources as a result of the growing demand, among other issues.

In space, advocates say, the water and energy problems would be solved. In constantly illuminated sun-synchronous orbits, space-borne data centers would have uninterrupted access to solar power. At the same time, the excess heat they produce would be easily expelled into the cold vacuum of space. And with the cost of space launches decreasing, and mega-rockets such as SpaceX’s Starship promising to push prices even lower, there could be a point at which moving the world’s data centers into space makes sound business sense. Detractors, on the other hand, point to a variety of technological hurdles, though some concede these may be surmountable in the not-so-distant future. Here are four of the must-haves we’d need to make space-based data centers a reality.

A way to carry away heat 

AI data centers produce a lot of heat. Space might seem like a great place to dispel that heat without using up massive amounts of water. But it’s not so simple. To get the power needed to run 24/7, a space-based data center would have to be in a constantly illuminated orbit, circling the planet from pole to pole and never passing through Earth’s shadow. And in that orbit, the temperature of the equipment would never drop below 80 °C, which is far too hot for electronics to operate safely in the long term.

Getting the heat out of such a system is surprisingly challenging. “Thermal management and cooling in space is generally a huge problem,” says Lilly Eichinger, CEO of the Austrian space tech startup Satellives.

On Earth, heat dissipates mostly through the natural process of convection, which relies on the movement of gases and liquids like air and water. In the vacuum of space, heat has to be removed through the far less efficient process of radiation. Safely removing the heat produced by the computers, as well as what’s absorbed from the sun, requires large radiative surfaces. The bulkier the satellite, the harder it is to send all the heat inside it out into space.
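To see why those radiative surfaces get so large, it helps to run the numbers on the Stefan-Boltzmann law, which caps how much heat a surface can shed by radiation alone. Below is a back-of-envelope sketch; the 1 MW load and 0.9 emissivity are illustrative assumptions rather than figures from any of the proposals discussed here, and absorbed sunlight is ignored, which flatters the result.

```python
# Stefan-Boltzmann back-of-envelope: radiator area needed to reject
# 1 MW of waste heat at 80 degrees C by radiation alone.
# Assumptions: 0.9 emissivity (typical of radiator coatings) and no
# absorbed sunlight, which in reality would enlarge the radiator further.
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W / (m^2 * K^4)
EMISSIVITY = 0.9          # assumed
T_RADIATOR = 80 + 273.15  # radiator temperature, kelvin

flux = EMISSIVITY * SIGMA * T_RADIATOR**4  # watts radiated per square meter
area_per_mw = 1e6 / flux                   # square meters per megawatt
print(f"{flux:.0f} W/m^2 -> about {area_per_mw:,.0f} m^2 of radiator per MW")
# -> 794 W/m^2, roughly 1,260 m^2 of radiator per megawatt
```

On these assumptions, a gigawatt-class facility would need more than a square kilometer of radiator surface, which helps explain why the concepts discussed below dwarf the International Space Station.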

But Yves Durand, former director of technology at the European aerospace giant Thales Alenia Space, says that technology already exists to tackle the problem.

The company previously developed a system for large telecommunications satellites that can pipe refrigerant fluid through a network of tubing using a mechanical pump, ultimately transferring heat from within a spacecraft to radiators on the exterior. Durand led a 2024 feasibility study on space-based data centers, which found that although challenges exist, it should be possible for Europe to put gigawatt-scale data centers (on par with the largest Earthbound facilities) into orbit before 2050. These would be considerably larger than those envisioned by SpaceX, featuring solar arrays hundreds of meters in size—larger than the International Space Station.

Computer chips that can withstand a radiation onslaught

The space around Earth is constantly battered by cosmic particles and lashed by solar radiation. On Earth’s surface, humans and their electronic devices are protected from this corrosive soup of charged particles by the planet’s atmosphere and magnetosphere. But the farther away from Earth you venture, the weaker that protection becomes. Studies show that aircraft crews have a higher risk of developing cancer because of their frequent exposure to high radiation at cruising altitude, where the atmosphere is thin and less protective.

Electronics in space are at risk of three types of problems caused by high radiation levels, says Ken Mai, a principal systems scientist in electrical and computer engineering at Carnegie Mellon University. Phenomena known as single-event upsets can cause bit flips and corrupt stored data when charged particles hit chips and memory devices. Over time, electronics in space accumulate damage from ionizing radiation that degrades their performance. And sometimes a charged particle can strike the component in a way that physically displaces atoms on the chip, creating permanent damage, Mai explains.

Traditionally, computers launched into space had to undergo years of testing and were specifically designed to withstand the intense radiation present in Earth’s orbit. These space-hardened electronics are much more expensive, though, and their performance lags years behind that of state-of-the-art devices for Earth-based computing. Launching conventional chips is a gamble. But Durand says cutting-edge computer chips use technologies that are by default more resistant to radiation than past systems. And in mid-March, Nvidia touted hardware, including a new GPU, that is “bringing AI compute to orbital data centers.”

Nvidia’s head of edge AI marketing, Chen Su, told MIT Technology Review that “Nvidia systems are inherently commercial off the shelf, with radiation resilience achieved at the system level rather than through radiation-hardened silicon alone.” He added that satellite makers increase the chips’ resiliency with the help of shielding, advanced software for error detection, and architectures that combine the consumer-grade devices with bespoke, hardened technologies.
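One classic form of the system-level resilience Su describes is triple modular redundancy: run the same computation three times and let a majority vote mask a corrupted result. The sketch below illustrates the general technique only; it is not Nvidia’s or any satellite maker’s actual error-detection stack.

```python
from collections import Counter

def tmr_vote(replicas):
    """Majority vote over redundant results: a single corrupted
    replica is outvoted; total disagreement raises an error."""
    value, count = Counter(replicas).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: all replicas disagree")
    return value

# Simulate a single-event upset: one replica gets a flipped bit.
clean = 0b1011_0010
upset = clean ^ (1 << 4)                # bit 4 flipped by a charged particle
print(tmr_vote([clean, upset, clean]))  # -> 178, the uncorrupted value
```

Real systems pair voting like this with error-correcting memory and periodic scrubbing, since a vote can only mask a fault, not repair the underlying storage.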

Still, Mai says that the data-crunching chips are only one issue. The data centers would also need memory and storage devices, both of which are vulnerable to damage by excessive radiation. And operators would need the ability to swap things out or adapt when issues arise. The feasibility and affordability of using robots or astronaut missions for maintenance is a major question mark hanging over the idea of large-scale orbiting data centers.

“You not only need to throw up a data center to space that meets your current needs; you need redundancy, extra parts, and reconfigurability, so when stuff breaks, you can just change your configuration and continue working,” says Mai. “It’s a very challenging problem because on one hand you have free energy and power in space, but there are a lot of disadvantages. It’s quite possible that those problems will outweigh the advantages that you get from putting a data center into space.”

In addition to the need for regular maintenance, there’s also the potential for catastrophic loss. During periods of intense space weather, satellites can be flooded with enough radiation to kill all their electronics. The sun has just passed the most active phase of its 11-year cycle with relatively little impact on satellites. Still, experts warn that since the space age began, the planet has not experienced the worst the sun is capable of. Many doubt whether the low-cost new space systems that dominate Earth’s orbits today are prepared for that.

A plan to dodge space debris

Both large-scale orbiting data centers such as those envisioned by Thales Alenia Space and the mega-constellations of smaller satellites proposed by SpaceX give space sustainability experts a headache. The space around Earth is already quite crowded with satellites. Starlink satellites alone perform hundreds of thousands of collision avoidance maneuvers every year to dodge debris and other spacecraft. The more stuff in space, the higher the likelihood of a devastating collision that would clutter the orbit with thousands of dangerous fragments.

Large structures with hundreds of square meters of solar arrays would quickly suffer damage from small pieces of space debris and meteorites, which would over time degrade the performance of their solar panels and create more debris in orbit. Operating one million satellites in low Earth orbit, the region of space at altitudes up to 2,000 kilometers, might be impossible to do safely unless all satellites in that area are part of the same network and can communicate effectively to maneuver around each other, Greg Vialle, the founder of the orbital recycling startup Lunexus Space, told MIT Technology Review.

“You can fit roughly four to five thousand satellites in one orbital shell,” Vialle says. “If you count all the shells in low Earth orbit, you get to a number of around 240,000 satellites maximum.”

And spacecraft must be able to pass each other at a safe distance to avoid collisions, he says. 

“You also need to be able to get stuff up to higher orbits and back down to de-orbit,” he adds. “So you need to have gaps of at least 10 kilometers between the satellites to do that safely. Mega-constellations like Starlink can be packed more tightly because the satellites communicate with each other. But you can’t have one million satellites around Earth unless it’s a monopoly.”
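Vialle’s ceiling is simple multiplication: satellites per shell times the number of shells that fit in low Earth orbit. The sketch below reproduces the arithmetic; the altitude band and the 34-kilometer shell spacing are assumptions chosen to land near his quoted total, since the article gives only the per-shell figure and the roughly 240,000 ceiling.

```python
# Back-of-envelope LEO capacity, following Vialle's reasoning.
# The per-shell figure is his; the altitude band and shell spacing
# are assumptions chosen to land near his quoted ~240,000 ceiling.
SATS_PER_SHELL = 4_500         # Vialle: "four to five thousand" per shell
LEO_BAND_KM = 2_000 - 200      # assumed usable LEO altitude band
SHELL_SPACING_KM = 34          # assumed spacing between shells

n_shells = LEO_BAND_KM // SHELL_SPACING_KM
total = n_shells * SATS_PER_SHELL
print(f"{n_shells} shells x {SATS_PER_SHELL:,} = {total:,} satellites")
# -> 52 shells x 4,500 = 234,000 satellites, near the quoted maximum
```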

On top of that, Starlink would likely want to regularly upgrade its orbiting data centers with more modern technology. Replacing a million satellites perhaps every five years would mean even more orbital traffic—and it could increase the rate of debris reentry into Earth’s atmosphere from around three or four pieces of junk a day to about one every three minutes, according to a group of astronomers who filed objections against SpaceX’s FCC application. Some scientists are concerned that reentering debris could damage the ozone layer and alter Earth’s thermal balance.
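The reentry figure in that objection also follows from simple division, as a quick check shows; the fleet size and five-year replacement cycle are the numbers cited above, and everything else is derived.

```python
# Reentry cadence implied by replacing 1,000,000 satellites every
# 5 years, the figures cited in the astronomers' FCC objection.
FLEET_SIZE = 1_000_000
LIFETIME_YEARS = 5

reentries_per_day = FLEET_SIZE / (LIFETIME_YEARS * 365.25)
minutes_between = 24 * 60 / reentries_per_day
print(f"{reentries_per_day:.0f} reentries/day, one every "
      f"{minutes_between:.1f} minutes")
# -> about 548 a day, one every 2.6 minutes
```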

Economical launch and assembly

The longer hardware survives in orbit, the better the return on investment. But for orbital data centers to make economic sense, companies will have to find a relatively cheap way to get that hardware into orbit. SpaceX is betting on its upcoming Starship mega-rocket, which will be able to carry up to six times as much payload as the current workhorse, Falcon 9. The Thales Alenia Space study concluded that if Europe were to build its own orbital data centers, it would have to develop a similarly potent launcher.

But launch is only part of the equation. A large-scale orbital data center won’t fit in a rocket—even a mega-rocket. It will need to be assembled in orbit. And that will likely require advanced robotic systems that do not exist yet. Various companies have conducted Earth-based tests with precursors of such systems, but they are still far from real-world use.

Durand says that in the short term, smaller-scale data centers are likely to establish themselves as an integral part of the orbital infrastructure, by processing images from Earth-observing satellites directly in space without having to send them to Earth. That would be a huge help for companies selling insights from space, as many of these data sets are extremely large, and competition for opportunities to downlink them to Earth for processing via ground stations is growing.

“The good thing with orbital data centers is that you can start with small servers and gradually increase and build up larger data centers,” says Durand. “You can use modularity. You can learn little by little and gradually develop industrial capacity in space. We have all the technology, and the demand for space-based data processing infrastructure is huge, so it makes sense to think about it.”

Smaller facilities probably won’t do much to offset the strain that terrestrial data centers are placing on the planet’s water and electricity, though. That vision of the future might take decades to come to fruition, some critics think—if it even gets off the ground at all. 


STAT+: White House proposes 12% cut to federal health agencies in 2027 budget request

WASHINGTON — The White House wants Congress to cut spending on the Department of Health and Human Services by more than 12%, according to its proposed 2027 federal budget, released Friday. 

The budget is broadly similar to what the Trump administration proposed last year. That includes deep cuts to the National Institutes of Health, the elimination of a health research agency, and the creation of a new agency devoted to chronic diseases called the Administration for a Healthy America. 

The president’s budget is an agenda-setting document, offering a sense of what the administration hopes to focus on in the coming year. Congress, however, is ultimately responsible for passing laws that set federal spending.



STAT+: Up and down the ladder: The latest comings and goings

Hired someone new and exciting? Promoted a rising star? Finally solved that hard-to-fill spot? Share the news with us, and we’ll share it with others. That’s right. Send us your changes, and we’ll find a home for them. Don’t be shy. Everyone wants to know who is coming and going.

And here is our regular feature in which we highlight a different person each week. This time around, we note that Proxygen hired Chiara Conti as chief scientific officer. Previously, she worked at Blueprint Medicines, where she was senior director.

But all work and no play can make for a dull chief scientific officer.
