Is fake grass a bad idea? The AstroTurf wars are far from over.

A rare warm spell in January melted enough snow to uncover Cornell University’s newest athletic field, built for field hockey. Months before, it was a meadow teeming with birds and bugs; now it’s more than an acre of synthetic turf roughly the color of the felt on a pool table, almost digital in its saturation. The day I walked up the hill from a nearby creek to take a look, the metal fence around the field was locked, but someone had left a hallway-size piece of the new simulated grass outside the perimeter. It was bristly and tough, but springy and squeaky under my booted feet. I could imagine running around on it, but it would definitely take some getting used to.

My companion on this walk seemed even less favorably disposed to the thought. Yayoi Koizumi, a local environmental advocate, has been fighting synthetic-turf projects at Cornell since 2023. A petite woman dressed that day in a faded plum coat over a teal vest, with a scarf the colors of salmon, slate, and sunflowers, Koizumi compulsively picked up plastic trash as we walked: a red Solo cup, a polyethylene Dunkin’ container, a five-foot vinyl panel. She couldn’t bear to leave this stuff behind to fragment into microplastic bits—as she believes the new field will. “They’ve covered the living ground in plastic,” she said. “It’s really maddening.” 

The new pitch is one part of a $70 million plan to build more recreational space at the university. As of this spring, Cornell plans to install something like a quarter million square feet of synthetic grass—what people have colloquially called “astroturf” since the middle of the last century. University PR says it will be an important part of a “health-promoting campus” that is “supportive of holistic individual, social, and ecological well-being.” Koizumi runs an anti-plastic environmental group called Zero Waste Ithaca, which says that’s mostly nonsense.

This fight is more than just the usual town-versus-gown tension. Synthetic turf used to be the stuff of professional sports arenas and maybe a suburban yard or two; today communities across the United States are debating whether to lay it down on playgrounds, parks, and dog runs. Proponents say it’s cheaper and hardier than grass, requiring less water, fertilizer, and maintenance—and that it offers a uniform surface for more hours and more days of the year than grass fields, a competitive advantage for athletes and schools hoping for a more robust athletic program.

But while new generations of synthetic turf look and feel better than that mid-century stuff, it’s still just plastic. Some evidence suggests it sheds bits that endanger users and the environment, and that it contains PFAS “forever chemicals”—per- and polyfluoroalkyl substances, which are linked to a host of health issues. The padding within the plastic grass is usually made from shredded tires, which might also pose health risks. And plastic fields need to be replaced about once a decade, creating lots of waste.

Yet people are buying a lot of the stuff. In 2001, Americans installed just over 7 million square meters of synthetic turf, a little shy of 11,000 metric tons. By 2024, the annual figure had grown to 79 million square meters, almost 120,000 metric tons: enough to carpet all of Manhattan and then some. Synthetic turf covers 20,000 athletic fields and tens of thousands of parks, playgrounds, and backyards. And the US is just 20% of the global market.
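A quick back-of-the-envelope check of those figures, using Manhattan's land area of roughly 59 square kilometers (an approximation not stated in the text):

```python
# Sanity-check the US turf-market figures quoted above.
MANHATTAN_KM2 = 59       # approximate land area of Manhattan (assumption)

sq_m_2001 = 7_000_000    # "just over 7 million square meters" installed in 2001
sq_m_2024 = 79_000_000   # square meters installed in 2024

km2_2024 = sq_m_2024 / 1_000_000   # 1 km^2 = 1,000,000 m^2
print(f"2024 installations: {km2_2024:.0f} km^2, "
      f"about {km2_2024 / MANHATTAN_KM2:.1f}x the area of Manhattan")
print(f"Growth since 2001: roughly {sq_m_2024 / sq_m_2001:.0f}x by area")
```

The 2024 total works out to 79 square kilometers, comfortably more than Manhattan, and an elevenfold increase by area over 2001.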


Those increases worry folks who study microplastics and environmental pollution. Any actual risk is hard to parse; the plastic-making industry insists that synthetic fields are safe if properly installed, but lots of researchers think that isn’t so. “They’re very expensive, they contain toxic chemicals, and they put kids at unnecessary risk,” says Philip Landrigan, a Boston College epidemiologist who has studied environmental toxins like lead and microplastics.

But at Cornell, where real estate is limited and demand for athletic facilities is high, synthetic turf was a tempting option. As Frank Rossi, a professor of turf science at Cornell, told me: “It all comes down to land and demand.”


In 1965, Houston’s new, domed baseball stadium was an icon of space-age design. But the Astrodome had a problem: the sun. Deep in the heart of Texas, it shone brightly through the Astrodome’s skylights—so much so that players kept missing fly balls. So the club painted over the skylights. Denied sunlight, the grass in the outfield withered and died.

A replacement was already in the works. In the late 1950s a Ford Foundation–funded educational laboratory determined that a soft, grasslike surface material would give city kids more places to play outside and had prevailed upon the Monsanto corporation to invent one. The result was clipped blades of nylon stuck to a rubber base, which the company called ChemGrass. Down it went into Houston’s outfield, where it got a new, buzzier name: AstroTurf.

Workers lay artificial turf at the Astrodome in Houston on July 13, 1966. Developed by Monsanto, the material was originally known as ChemGrass but was later renamed AstroTurf after the stadium.
AP PHOTO/ED KOLENOVSKY, FILE

That first generation of simulated lawn was brittle and hard, but quality has improved. Today, there are a few competing products, but they’re all made by extruding a petroleum-based polymer—that’s plastic—through tiny holes and then stitching or fusing the resulting fibers to a carpetlike bottom. That gets attached to some kind of padding, also plastic. In the 1970s the industry started layering that over infill, usually sand; by the 1990s, “third generation” synthetic turf had switched to softer fibers made of polyethylene. Beneath that, they added infill that combined sand and a soft, cheap shredded rubber made from discarded automobile tires, which pile up by the hundreds of millions every year. This “crumb rubber” provides padding and fills spaces between the blades and the backing.

In the early 1980s, nearly half the professional baseball and football fields in the US had synthetic turf. But many players didn’t like it. It got hotter than real grass, gave the ball different action, and seemed to be increasing the rate of injuries among athletes. Since the 1990s, most pro sports have shifted back toward grass—water and maintenance costs pale in comparison to the importance of keeping players happy or sparing them the risk of injury. 

But at the same time, more universities and high schools are buying the artificial stuff. The advantages are clear, especially in places where it rains either too much or not enough. A natural-grass field is usable for a little more than 800 hours a year at the most, spread across just eight months in the cooler, wetter northern US. An artificial-turf field can see 3,000 hours of activity per year. For sports like lacrosse, which begins in late winter, this makes artificial turf more appealing. Most lacrosse pitches are now synthetic. So are almost all field hockey pitches; players like the way the even, springy turf makes the ball bounce.

Furthermore, supporters say synthetic turf needs less maintenance than grass, saving money and resources. That’s not always true; workers still have to decompact the playing surface and hose it off to remove bird poop or cool it down. Sometimes the infill needs topping up. But real grass allows less playing time, and because grass athletic fields often need to be rotated to avoid damage, synthetic ground cover can require less space. Hence the market’s explosive growth in the 21st century.


The city and town of Ithaca—two separate political entities with overlapping jurisdiction over Cornell construction projects—held multiple public meetings about the university’s new synthetic fields: the field hockey pitch and a complex called the Meinig Fieldhouse. Koizumi’s group turned up in force, and a few folks who worked at Cornell came to oppose the idea too—submitting pages of citations and studies on the risks of synthetic grass.

At two of those meetings, dozens of Cornell athletes turned out to support the turf. Representatives of the university and the athletic department declined to speak with me for this story, citing an ongoing lawsuit from Zero Waste Ithaca. But before that, Nicki Moore, Cornell’s director of athletics, told a local newspaper that demand from campus groups and sports teams meant the fields were constantly overcrowded. “Activities get bumped later and later, and sometimes varsity teams won’t start practicing until 10 at night, you know?” Moore told the paper. “Availability of all-weather space should normalize scheduling a great deal.”

That argument wasn’t universally convincing. “It’s a bad idea, but that’s from the environmental perspective,” says Marianne Krasny, director of Cornell’s Civic Ecology Lab and one of the speakers at those hearings. “Obviously the athletic department thinks it’s a great idea.”

A square patch of artificial turf.

GETTY IMAGES

Members of Cornell on Fire, a climate action group with members from both the university and the town, joined in opposing the use of artificial turf, citing the fossil-fuel origins of the stuff. They described the nominal support of the project from student athletes as inauthentic, representing not grassroots support but, yes, an astroturf campaign. 

Sorting out the actual science here isn’t simple. Over time, the plastic that synthetic turf is made of sheds bits of itself into the environment. In one study, published in 2023 in the journal Environmental Pollution, researchers found that 15% of the medium-size and microplastic particles in a river and the Mediterranean Sea outside Barcelona, Spain, came from artificial turf, mostly in the form of tiny green fibers. Back in 2020, the European Chemicals Agency estimated that artificial-turf fields in the European Union were contributing 16,000 metric tons of microplastics to the environment each year—38% of all annual microplastic pollution. Most of that came from the crumb rubber infill, which Europe now plans to ban by 2031.
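One implied figure worth making explicit: if turf fields' 16,000 metric tons represent 38% of the microplastic pollution the agency was counting, the total comes to roughly 42,000 metric tons a year:

```python
infill_tons = 16_000   # annual microplastics from artificial-turf fields (ECHA, 2020)
share = 0.38           # that figure's share of the EU annual total

total_tons = infill_tons / share
print(f"Implied EU total: ~{total_tons:,.0f} metric tons of microplastics per year")
```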

This pollution worries the Cornell activists. Ithaca is famous for scenic gorges and waterways. The new field hockey pitch is uphill from a local creek that empties into Cayuga Lake, the longest of the Finger Lakes and the source of drinking water for over 40,000 people.

And it’s not just the plastic bits. When newer generations of synthetic turf switched to durable high-density polyethylene, the new material gunked up the extruders used in the manufacturing process. So turf makers started adding fluorinated polymers—a type of PFAS. Some of these environmentally persistent “forever chemicals” cause cancer, disrupt the endocrine system, or lead to other health problems. Research in several different labs has found PFAS in many types of plastic grass.

But the key to assessing the threat here is exposure. Heather Whitehead, an analytical chemist then at the University of Notre Dame, found PFAS in synthetic turf at levels around five parts per billion—but estimated it’d be in water running off the fields at three parts per trillion; for context, the US Environmental Protection Agency’s legal drinking-water limit on one of the most widespread and dangerous PFAS chemicals is four parts per trillion. “These chemicals will wash off in small amounts for long periods of time,” says Graham Peaslee, Whitehead’s advisor and an emeritus nuclear physicist who studies PFAS concentrations. “I think it’s reason enough not to have artificial turf.”
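Whitehead's numbers can be put on one scale (a rough conversion, treating parts per billion and parts per trillion as simple mass ratios):

```python
# Whitehead's measurements, expressed in a common unit (parts per trillion).
turf_ppt = 5 * 1_000   # ~5 parts per billion measured in the turf itself
runoff_ppt = 3         # estimated concentration in water running off the fields
epa_limit_ppt = 4      # EPA drinking-water limit for one widespread PFAS

print(f"Runoff is ~{turf_ppt / runoff_ppt:.0f}x more dilute than the turf material")
print(f"Estimated runoff ({runoff_ppt} ppt) sits just under the "
      f"{epa_limit_ppt} ppt drinking-water limit")
```

The point of the comparison: the estimated runoff concentration is low, but it is within striking distance of the strictest drinking-water limit, and as Peaslee notes, it persists for the life of the field.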

This gets confusing, though. There are over 16,000 different types of PFAS, few have been well studied, and different companies use different manufacturing techniques. Companies represented by the Synthetic Turf Council now “use zero intentionally added PFAS,” says Melanie Taylor, the group’s president. “This means that as the field rolls off the assembly line, there are zero PFAS-formulated materials present.”

Some researchers are skeptical of the industry’s assurances. They’re hard to confirm, especially because there are a lot of ways to test for PFAS. The type of synthetic turf going onto the new field hockey pitch at Cornell is called GreenFields TX; the university had a sample tested using an EPA method that looks for 40 different PFAS compounds. It came back negative for all of them. The local activists countered that the test doesn’t detect the specific types they’re most concerned about, and in 2025 they paid for three more tests on newly purchased synthetic turf. Two clearly found fluorine—the F in “PFAS”—and one identified two distinct PFAS compounds. (The company that makes GreenFields TX, TenCate, declined to comment, citing ongoing litigation.)

PFAS isn’t the only potential problem. There’s also the crumb rubber made from tires. A billion tires get thrown out every year worldwide, and if they aren’t recycled they sit in giant piles that make great habitats for rats and mosquitoes; they also occasionally catch fire. Lots of the tires that go into turf are made of styrene-butadiene rubber, or SBR. In bulk, that’s bad. Butadiene is a carcinogen that causes leukemia, and fumes from styrene can cause nervous system damage. SBR also contains high levels of lead.

But how much of that comes out of synthetic-turf infill? Again, that’s hotly debated. Researchers around the world have published suggestive studies finding potentially dangerous levels of heavy metals like zinc and lead in synthetic turf, with possible health risks to people using the fields. But a review of many of the relevant studies on turf and crumb rubber from Canada’s National Collaborating Centre for Environmental Health determined that most well-conducted health risk assessments over the last decade found exposures below levels of concern for cancer and certain other diseases. A 2017 report by the European Chemicals Agency—the same people who found all those microplastics in the environment—“found no reason to advise people against playing sports on synthetic turf containing recycled rubber granules as infill material.” And a multiyear study from the EPA, published in 2024, found much the same thing—although the researchers said that levels of certain synthetic chemicals were elevated inside places that used indoor artificial turf. They also stressed that the paper was not a risk assessment.

The problem is, the kinds of cancers these chemicals can cause may take decades to show up. Long-term studies haven’t been done yet. All the evidence available so far is anecdotal—like a series for the Philadelphia Inquirer that linked the deaths of six former Phillies players from a rare type of brain cancer called glioblastoma to years spent playing on PFAS-containing artificial turf. That’d be about three times the usual rate of glioblastoma among adult men, but the report comes with a lot of cautions—small sample size, lots of other potential causes, no way to establish causation.

Synthetic turf has one negative that no one really disputes: It gets very hot in the sun—as hot as 150 °F (66 °C). This can actually burn players, so they often want to avoid using a field on very hot days.

A field hockey player from Cornell University passes the ball during a game played on artificial turf at Bryant University in 2025. Cornell’s own turf field will be ready for the 2026 season.
GETTY IMAGES

Athletes playing on artificial turf also have a higher rate of foot and ankle injuries, and elite-level football players seem to be more predisposed to knee injuries on those surfaces. But other studies have found rates of knee and hip injury to be roughly comparable on artificial and natural turf—a point the landscape architect working on the Cornell project made in the information packet the university sent to the city. Athletic departments and city parks departments say that the material’s upsides make it worthwhile, given that there’s no conclusive proof of harm.

Back in Ithaca, Cornell hired an environmental consulting firm called Haley & Aldrich to assess the evidence. The company concluded that none of the university’s proposed installations of artificial turf would have a negative environmental impact. People from Cornell on Fire and Zero Waste Ithaca told me they didn’t trust the firm’s findings; representatives from Haley & Aldrich declined to comment.

Longtime activists say that as global consumption of fossil fuels declines, petrochemical companies are desperate to find other markets. That means plastics. “There’s a big push to shift more petrochemicals into plastic products for an end market,” says Jeff Gearhart, a consumer product researcher at the Ecology Center. “Industry people, with a vested interest in petrochemicals, are looking to expand and build out alternative markets for this stuff.”

All that and more went before the decision-makers in Ithaca. In September 2024, the City of Ithaca Planning Board unanimously issued a judgment that the Meinig Fieldhouse would not have a significant environmental impact and thus would not need to complete a full environmental impact assessment. Six months later, the town made the same determination for the field hockey pitch.

Zero Waste Ithaca sued in New York’s supreme court, which ruled against the group. Koizumi and lawyers from Pace University’s Environmental Litigation Clinic have appealed. She says she’s still hopeful the court might agree that Ithaca authorities made a mistake by not requiring an environmental impact statement from the college. “We have the science on our side,” she says.


Ithaca is a pretty rarefied place, an Ivy League university town. But these same tensions—potential long-term environmental and public health consequences versus the financial and maintenance concerns of the now—are pitting worried citizens against their representatives and city agencies around the country. 

New York City has 286 municipal synthetic-turf fields, with more under construction. In Inwood, the northernmost neighborhood in Manhattan, two fields were approved via Zoom meetings during the pandemic, and Massimo Strino, a local artist who makes kaleidoscopes, says he found out only when he saw signs announcing the work on one of his daily walks in Inwood Hill Park, along the Hudson River. He joined a campaign against the plan, gathering more than 4,300 signatures. “I was canvassing every weekend,” Strino says. “You can count on one hand, literally, the number of people who said they were in favor.”

But that doesn’t include the group that pushed for one of those fields in the first place: Uptown Soccer, which offers free and low-cost lessons and games to 1,000 kids a year, mostly from underserved immigrant families. “It was turning an unused community space into a usable space,” says David Sykes, the group’s executive director. “That trumped the sort of abstract concerns about the environmental impacts. I’m not an expert in artificial turf, but the parks department assured me that there was no risk of health effects.”


New York City councilmember Christopher Marte disagrees. He has introduced a bill to ban new artificial turf from being installed in parks, and he hopes the proposal will be taken up by the Parks Committee this spring. Last session, the bill had 10 cosponsors—that’s a lot. Marte says he expects resistance from lobbyists, but there’s precedent. The city of Boston banned artificial turf in 2022.  

Upstate, in a Rochester suburb called Brighton, the school district included synthetic-turf baseball and softball diamonds in a wide-ranging February 2024 capital improvement proposition. The measure passed. In a public meeting in November 2025, the school board acknowledged the intent to use synthetic grass—or, as concerned parents had it, “to rip up a quarter million square feet of this open space and replace it with artificial turf,” says David Masur, executive director of the environmental group PennEnvironment, whose kids attend school in Brighton. Parents and community members mobilized against the plan, further angered when contractors also cut down a beloved 200-year-old tree. School superintendent Kevin McGowan says it’s too late to change course. Masur has been working to oppose the plan nevertheless—he says school boards are making consequential decisions about turf without sharing information or getting input, even though these fields can cost millions of dollars of taxpayer money.

In short, the fights can get tense. On Martha’s Vineyard, in Massachusetts, a meeting about plans to install an artificial field at a local high school had to be ended early amid verbal abuse. A staffer for the local board of health who voiced concern about PFAS in the turf quit the board after discovering bullet casings in her tote bag, she said, which she perceived as a death threat. After an eight-year fight, the board eventually banned artificial turf altogether. 


What happens next? Well, outdoor artificial turf lasts only eight to 12 years before it needs to be taken up and replaced. The Synthetic Turf Council says it’s at least partially recyclable and cites a company called BestPLUS Plastic Lumber as a purveyor of products made from recycled turf. The company says one of its products, a liner called GreenBoard that artificial turf can be nailed into, is at least 40% recycled from fake grass. Joseph Sadlier, vice president and general manager of plastics recycling at BestPLUS, says the company recycles over 10 million pounds annually. 

Yet the material is piling up. In 2021, a Danish company called Re-Match announced plans to open a recycling plant in Pennsylvania and began amassing thousands of tons of used plastic turf in three locations. The company filed for bankruptcy in 2025.

In Ithaca, university representatives told planning boards that it would be possible to recycle the old artificial turf they ripped out to make way for the Meinig Fieldhouse. That didn’t happen. An anonymous local activist tracked the old rolls to a hauling company a half-hour’s drive south of campus and shared pictures of them sitting on the lot, where they stayed for months. It’s unclear what their ultimate fate will be.

That’s the real problem: Artificial turf just doesn’t go away. “You’re going to be paying to get rid of it,” says Peaslee, the PFAS expert. “Somebody will have to take it to a dump, where it will sit for a thousand years.” By contrast, real grass is a net carbon sink, even accounting for installation and maintenance. Synthetic turf releases greenhouse gases. One life-cycle analysis of a 2.2-acre synthetic field in Toronto determined that it would emit 55 metric tons of carbon dioxide over a decade. Plastic fields need less water to maintain, but it takes water to make plastic, and natural grass lets rainwater seep into the ground. Synthetic turf sends most of it away as runoff.
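The Toronto life-cycle figure works out to a modest but steady per-acre emission rate:

```python
field_acres = 2.2    # size of the Toronto field in the life-cycle analysis
total_t_co2 = 55     # metric tons of CO2 emitted over the study period
years = 10           # length of the study period

rate = total_t_co2 / field_acres / years
print(f"~{rate:.1f} metric tons of CO2 per acre per year")
```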

It’s a boggling set of issues to factor into a decision. Rossi, the Cornell turf scientist, says he can understand why a school in the northern United States might go plastic, even when it cares about its students’ health. “It was the best bad option,” he says. Concerns about microplastics and PFAS are “significant issues we have not fully addressed.” And they need to be. 

Douglas Main is a journalist and former senior editor and writer at National Geographic.

A Social Justice Approach to Assistive Technology and Well-Being of People With Visual Disabilities in Low- and Middle-Income Countries: Qualitative Narrative Study

Background: The United Nations’ third Sustainable Development Goal emphasizes ensuring healthy lives and promoting well-being (WB) for all, which requires effective assistive technology (AT) for persons with disabilities. In low- and middle-income countries (LMICs), however, AT remains largely inaccessible, and high abandonment rates indicate that many existing solutions fail to meet users’ needs. To improve AT design and effectiveness, a deeper understanding of users’ lived experiences and the ways AT influences WB is essential. Objective: This study aimed to explore how technology creates opportunities or barriers in the daily lives of persons with visual disabilities in LMICs and how it affects their WB. Methods: We conducted a qualitative narrative study guided by deductive qualitative analysis, using the capability approach (CA) and disadvantage theory (DT) as theoretical frameworks. Nineteen adults with visual disabilities from Cali, Colombia, participated in in-depth, semistructured interviews. A focus group (n=5) deepened the exploration of shared experiences. Data analysis followed three stages: (1) deductive coding using Nussbaum’s list of central capabilities and key CA constructs (functionings, conversion factors, and agency); (2) recoding through DT concepts (insecure functioning, corrosive disadvantages, and fertile functionings); and (3) inductive analysis to capture emergent sociocultural themes. Results: AT shaped both opportunities and constraints in participants’ lives. While functionings such as employment, mobility, and affiliation were highly valued, they often remained insecure due to systemic barriers. Corrosive disadvantages—such as unemployment, exclusion, and limited spatial autonomy—undermined multiple capabilities simultaneously. Conversely, fertile functionings such as equitable employment, adaptive sports, and access to well-designed AT supported agency and resilience.
The inductive analysis revealed 3 interconnected themes: the aspiration to explore and expand movement, the desire to appear attractive, and the adoption of nonconfrontational strategies to maintain social harmony. These findings highlight how emotional, aesthetic, and cultural dimensions shape the experience and meaning of AT. Conclusions: While AT research in LMICs often emphasizes availability, it rarely addresses how social norms, structural violence, and fear affect meaningful use. The combined CA and DT lens reveals that AT can either enable or constrain WB depending on how it aligns with users’ lived contexts. Designing for fertile functionings—those that support agency, safety, and resilience—is essential. Participatory, context-sensitive design must prioritize not only functionality, but also aesthetic dignity, cultural relevance, and emotional security. Including the voices—and silences—of persons with disabilities in the Global South is crucial for transforming AT from a mere tool into a catalyst for real freedom and WB.

Faster Process Development via “Transfer Learning”

An emerging artificial intelligence technique called “transfer learning” could help drug makers use data to speed up the development of biopharmaceutical manufacturing processes, according to a new analysis.

In transfer learning, predictive models trained on historical data are reused to improve performance on a new, related task.

Unlike machine learning (ML)—where the training process begins from scratch—transfer learning applies existing knowledge to new but related problems, reducing the amount of data and time required to build the model.
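The contrast described above can be sketched in a few lines. This is a hypothetical illustration, not the Karlsruhe team's actual method: a linear model is pretrained on plentiful "source" process data, then fine-tuned on a handful of "target" batches by continuing from the learned weights (scikit-learn's `warm_start` option) rather than retraining from scratch.

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)

# Source process: plenty of historical data relating three sensor
# readings (say pH, temperature, gas flow) to an outcome such as titre.
X_src = rng.normal(size=(500, 3))
y_src = X_src @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=500)

# Related target process: only a few batches, with slightly shifted behavior.
X_tgt = rng.normal(size=(20, 3))
y_tgt = X_tgt @ np.array([1.1, -1.9, 0.6]) + rng.normal(scale=0.1, size=20)

# Pretrain on the source process.
model = SGDRegressor(warm_start=True, max_iter=1000, tol=1e-4, random_state=0)
model.fit(X_src, y_src)

# Fine-tune: with warm_start=True, fit() continues from the source-trained
# weights instead of reinitializing, so the scarce target data only has to
# correct a small shift between the two processes.
model.fit(X_tgt, y_tgt)
print("Fine-tuned coefficients:", np.round(model.coef_, 2))
```

Because the two simulated processes are closely related, the pretrained weights land near the target solution and twenty points suffice; if the processes were dissimilar, this is exactly where the "negative transfer" problem discussed below would appear.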

Researchers at the Karlsruhe Institute of Technology in Germany, who examined the approach, identified several potential biopharma applications; lead author Daniel Barón Díaz cites reactor modeling as an example.

“Transfer learning models can be used to predict critical outcomes like viable cell density (VCD) and product titre from online sensor data—for example, pH, temperature, gas flow—from historical data from a different, but related process.”

The approach can also optimize process monitoring. Díaz tells GEN that “transfer learning-enhanced soft sensors can be established to monitor protein concentrations in real time by leveraging existing models from related fermentations.”

Data limitations

When compared with other model-building techniques, transfer learning offers potential cost and time savings, according to Díaz, who cites a reduced experimentation burden as an example.

“Conventional machine learning requires large, structured datasets that are often unavailable in biopharma due to the high cost and labor-intensive nature of experiments. Transfer learning allows companies to leverage historical data and existing models to build reliable predictors for new processes with very limited data.

“By reusing prior knowledge, transfer learning can significantly decrease the number of experiments required—sometimes needing only one to three batches to achieve robust simulations,” he says.

However, the ultimate benefit is that transfer learning speeds up process model development, according to Díaz, who adds, “It can make model adaptation faster than retraining from scratch, facilitating quicker process design and digital twin deployment.”

Challenges

So, transfer learning has the potential to create predictive models for manufacturing development. However, the key caveat is that the processes involved must be sufficiently similar for it to be effective, Díaz says.

“For transfer learning to be effective, the source and target domains must be meaningfully related. If the processes are too different, the assumptions and learned representations may not align, leading to negative transfer, where the transferred knowledge actually degrades the model’s performance.

“Data sets obtained at different scales or under varying conditions are often inconsistent, which can hinder the successful transfer of knowledge. Fine-tuning complex neural network architectures on very small target datasets can lead to overfitting, where the model fails to generalize to new data,” he says.

To address this, manufacturers will need to establish metrics to determine similarity, Díaz explains.

“There are currently no standardized metrics for measuring domain similarity in bioprocessing, nor are there comprehensive benchmark datasets to easily compare different transfer learning techniques.”

Another challenge is the current lack of AI expertise in the industry, Díaz says.

“There is often a disciplinary knowledge gap between process engineers and data scientists, and ML models without a mechanistic backbone may be perceived as opaque black boxes, hindering trust and industrial adoption,” he tells GEN.


Lung Screening Incidental Findings May Guide Follow-Up for Other Cancers

An analysis of the US National Lung Screening Trial (NLST) has found that the presence of certain types of abnormalities in regions outside of the lungs on low-dose computed tomography (LDCT) images may be associated with a significantly increased risk for extrapulmonary cancer.

The abnormalities, termed significant incidental findings (SIFs), could help clinicians decide when follow-up care is likely to catch extrapulmonary cancer early and when it may not be necessary.

“In this paper, we provide an evidence base for making decisions on abnormalities outside of the lungs that might be seen at lung screening,” said study author Ilana Gareen, PhD, a professor of epidemiology at Brown University School of Public Health. “The goal is to give physicians and patients better data so that they can make more informed choices about those abnormalities that should be considered for follow-up and those that most likely can be ignored.”

Writing in JAMA Network Open, Gareen and co-authors explain that LDCT lung cancer screening frequently detects SIFs unrelated to lung cancer; in the NLST, 34% of 26,455 patients screened with LDCT had SIFs reported but the nature of the SIFs varied.

And although there are recommendations for reporting and addressing SIFs, there is limited evidence for an association between SIFs detected at LDCT lung cancer screening and extrapulmonary cancer diagnoses.

To address this, Gareen and team analyzed data from 75,104 LDCT screening rounds performed in 26,455 individuals (mean age, 61 years; 59.0% men) who were randomly assigned to receive LDCT during the NLST. The participants had a history of heavy smoking (≥30 pack–years), meaning they were also at high risk for several extrapulmonary cancers, including pancreatic, bladder, and kidney cancer.

The researchers focused on SIFs that were labeled as potentially indicative of extrapulmonary cancer (cancer SIFs), rather than those possibly indicating emphysema or cardiovascular disease.

They report that cancer SIFs were recorded for 2265 (3.0%) screening rounds in 1807 (6.8%) participants across the three screening rounds they received.

Participants with cancer SIFs were significantly older than those with no cancer SIF (mean 62.1 vs. 61.4 years) and significantly more likely to have a history of a smoking-related disease (68.6% vs. 65.7%).

Within one year of a screening round, 1025 participants were diagnosed with an extrapulmonary cancer. Of these, 67 (6.5%) had a cancer SIF on LDCT, corresponding to 3.0% of the screening rounds with a cancer SIF.

Overall, the risk for extrapulmonary cancer among the people with a cancer SIF was 29.6 per 1000 screening rounds compared with 13.3 per 1000 screening rounds in those without a cancer SIF. After adjustment for potential confounders, the marginal risk difference between the two groups was 13.9 per 1000 participants, suggesting that for every 1000 people screened, the presence of a cancer SIF is associated with 13.9 additional cases of extrapulmonary cancer.
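The crude (unadjusted) rates can be reconstructed from the counts reported above; a brief sketch follows. Note that the adjusted difference of 13.9 comes from the authors' confounder-adjusted model, which this simple arithmetic does not reproduce, and the crude no-SIF rate lands slightly below the published 13.3, presumably due to exclusions in the original analysis:

```python
# Crude rate reconstruction from the reported NLST counts (per 1,000 screening rounds).
rounds_total = 75_104
rounds_with_sif = 2_265
cancers_total = 1_025        # extrapulmonary cancers within one year of a round
cancers_after_sif = 67

rate_sif = cancers_after_sif / rounds_with_sif * 1000
rate_no_sif = (cancers_total - cancers_after_sif) / (rounds_total - rounds_with_sif) * 1000

print(round(rate_sif, 1))    # ~29.6, matching the reported figure
print(round(rate_no_sif, 1)) # ~13.2, close to the reported 13.3
```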

When the researchers looked at specific cancer types, they found that the marginal risk difference was substantially higher for urinary cancers, at 17.0 per 1000 participants. It was 5.0 for digestive cancer, 12.3 for breast cancer, and 13.8 for other cancers including lymphoma and leukemia.

“In general, if an abnormality is found that might indicate cancer, the patient receives additional imaging to evaluate that abnormality,” Gareen told Inside Precision Medicine. “Our paper provides additional information as to those abnormalities that should be considered to increase the risk of a cancer diagnosis.”

Importantly, mortality from extrapulmonary cancer accounted for 22.3% of the certified deaths in the LDCT arm of the NLST. Therefore “early detection of these cancers may facilitate early treatment and potentially reduce associated morbidity and mortality,” the authors write. “Identification of cancer SIFs associated with extrapulmonary cancers in NLST participants could be used to plan appropriate diagnostic evaluations for patients undergoing lung cancer screening.”

Gareen said the next step will be to determine if the findings are replicated in lung screening in the community, or if the rate in community screening is higher or lower.

In an accompanying comment, Patrick Senior and Andrew Creamer, both from Gloucestershire Hospitals NHS Foundation Trust in Gloucester, United Kingdom, point out that the false positive rate for a cancer SIF was 97%, but say “it is hard to imagine a scenario in which an incidental finding with even a possibility of representing cancer would be disregarded.”

However, they note that “when considered in the context of the numbers of people eligible for lung cancer screening programs around the world, acting on such findings poses a considerable additional burden on the health systems that must investigate them.”

Senior and Creamer say that the results “underscore the importance of both a robust health economics analysis of how screening programs manage such incidental findings and patient-centered research to understand the impact that such unexpected results may have on the individual. Further research is needed to ensure that screening programs are confident when faced with information they did not ask for.”

The post Lung Screening Incidental Findings May Guide Follow-Up for Other Cancers appeared first on Inside Precision Medicine.

From Reactive to Proactive: Reimagining Hypertension Management in the Precision Medicine Era

According to the World Health Organization, an estimated 1.4 billion adults aged 30–79 worldwide had hypertension in 2024, around one-third of the global population in that age range. Of these, 44% were unaware they had the condition, a leading risk factor for premature death and poor health worldwide through its association with myocardial infarction, stroke, and kidney disease.

Despite the size of the hypertension problem, its diagnosis and treatment pathway has remained largely the same for decades.

A 60-year-old pathway

“The current pathway in hypertension diagnosis and treatment has really not changed in over 60 years,” said Sandosh Padmanabhan, MD, PhD, chair of pharmacogenomics and professor of cardiovascular genomics and therapeutics at the University of Glasgow in Scotland.

He explained that it is based on opportunistic detection of hypertension, traditionally defined as a clinic blood pressure (BP) of 140/90 mmHg or higher, although thresholds vary by measurement method and guideline. Out-of-office measures, for example, typically use lower cut-points of 135/85 mmHg for home or daytime ambulatory averages.

Sandosh Padmanabhan
Sandosh Padmanabhan, MD, PhD
Professor
University of Glasgow

Diagnosis typically occurs when a patient visits their primary care physician (PCP) or has a pharmacy BP check. Confirmation follows, ideally with out-of-office BP monitoring to avoid misclassification caused by one-off measurements.

Patients are then stratified by predicted 10-year cardiovascular risk, using calculators such as QRISK or the PREVENT score, and treatment follows a stepwise algorithm. First, patients are generally given lifestyle advice: reducing salt, alcohol, and caffeine intake, improving sleep, managing stress, and increasing exercise. This gives them a chance to lower their BP without pharmacologic intervention.

If unsuccessful, depending on local guidelines, patients may be offered an angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker if under 55 years of age. Those over 55 years or of Black African or Caribbean origin are started on a calcium channel blocker. The next steps combine ACE inhibitors and calcium channel blockers, then add a thiazide-like diuretic, followed by spironolactone or other drugs.

However, this approach uses “a population-level logic,” said Padmanabhan. Although age and ethnicity are considered, “these are broad demographic proxies that don’t include any understanding of the individuals’ underlying pathophysiology or the genetic makeup.”

He stresses that, on a public health basis, the system works. There are multiple effective, low-cost antihypertensive drug classes and many generic options available that effectively lower BP. Despite this, control rates are poor. “Fewer than one in four hypertensive adults globally have their BP adequately controlled,” he said.

The measurement problem

Part of the issue lies in how BP is measured. “To give you an idea about the scale of inertia, we diagnose BP using a device that was introduced in the late 19th century,” Padmanabhan noted, referring to the sphygmomanometer invented by Scipione Riva-Rocci in 1896. Not only that, the technique can also be flawed. Variables such as incorrect cuff size, improper positioning, and patient movement can distort readings. Even talking during measurement can increase BP values by 5–9 mmHg or even higher.

Crucially, a single measurement provides little insight into cumulative lifetime exposure to high BP and can be skewed by issues like white coat hypertension or masked hypertension. “We look at the BP number, but the patients don’t experience that number. What they experience is a lifelong vascular risk,” Padmanabhan explained. “Treatment is not about a short-term reduction in a number. It’s about long-term sustained risk reduction.”

Yet the current system remains reactive and is not working well enough. “We have to move away from reactive diagnosis to proactive identification,” Padmanabhan said. “The earlier we measure accurately and respond systematically, the fewer surprises we’ll see later.”

Continuous monitoring

The pitfalls of opportunistic, or even planned, BP measurement are driving the emergence of new technologies capable of continuous monitoring.

Josep Solà
Josep Solà, PhD
CTO and Co-founder
Aktiia

Josep Solà, PhD, began working on optical sensing technology in 2004 at the Centre for Electronics and Microtechnology in Switzerland. By analyzing subtle changes in reflected light caused by arterial dilation, it became clear that BP could be measured using these light signals. In 2018, this research was spun out into Aktiia, where Solà is CTO and co-founder. The company has developed and commercialized the Hilo™ band: a CE-certified wearable medical device designed for continuous, cuffless, BP monitoring that has been clinically validated against traditional ambulatory BP monitoring.

The band tracks BP and heart rate automatically, about 25 times per day, without requiring any action from users. Paired with an app, the device shows users daily, nightly, and long-term BP trends. It is currently available as a certified medical device across Europe, Australia, and Canada, and, following FDA approval in July 2025, the company is preparing for a U.S. launch.

Solà said he and co-founder Mattia Bertschi, PhD, were convinced they could change how hypertension is being managed today. He believes there is no good reason why most people with hypertension cannot control the condition. The medication is cheap and effective; the problem is that there has been no technology that patients can use to properly manage their condition.

“No one wants to use a cuff every day for the next 30 years,” said Solà. “They’re just so inconvenient, and you cannot expect people to proactively measure something they don’t feel.”

The Hilo band gives wearers a feedback loop that has historically been missing from BP measurement. Users can immediately see that reducing their salt or alcohol intake, for example, lowers their BP. “We are empowering people,” said Solà. “We are empowering them to look at the intervention, or combination of interventions, with or without medication, to see what is effective for them, and this reinforces their willingness to continue with the changes they are making.”

Hilo product
Credit: Hilo

Data published by Aktiia has shown that this approach works. A study of 8,950 U.K.-based Hilo users indicated that individuals who monitored their BP continuously showed better control over time. Specifically, users over 50 years of age appeared able to prevent the age-related rise in systolic BP typically seen in the general population, which the researchers say “may reflect greater awareness, stronger treatment adherence, and lifestyle changes prompted by continuous feedback.”

Wearables at scale: Opportunity and caution

Beyond dedicated monitoring devices like the Hilo band, smartwatches and other wearables are increasingly capable of detecting physiological signals associated with cardiovascular risk. The Apple Watch can flag potential signs of chronic hypertension by analyzing heart rate sensor data over 30-day periods; the Huawei Watch D provides on-demand and 24-hour ambulatory BP monitoring using an air-filled strap; and the team behind the Oura ring is developing a “Blood Pressure Profile” feature to detect early signs of hypertension.

Although this represents a significant step toward embedding cardiovascular monitoring into everyday life, the increasing use of these devices raises important questions about accuracy, interpretation, and clinical integration, particularly as they often rely on indirect signals rather than direct BP measurement.

Adam Bress
Adam Bress, PharmD
Researcher
University of Utah

As Adam Bress, PharmD, from the Spencer Fox Eccles School of Medicine at the University of Utah, and colleagues have recently shown, translating wearable-derived signals into meaningful clinical information is not straightforward.

They evaluated the hypertension alert feature of the Apple Watch, which has a published sensitivity of 41% and specificity of 92%. This means that approximately 59% of individuals with undiagnosed hypertension would not receive an alert, while about 8% of those without hypertension would receive a false alert.

“The problem there, is that this data only tells you how the alert works in a very controlled, limited population,” said Bress. “In order to understand how it’s going to work in the real world, we need to know how the true prevalence of undiagnosed hypertension varies in the population and in subgroups and to what degree.”

Using data from nearly 4,000 adults in the U.S., Bress and colleagues showed that the pretest probability of having hypertension has a significant impact on the reliability of the alert. For example, among adults under 30 years of age, the pretest probability of having hypertension is 14%. A positive alert on the Apple Watch would increase this probability to 47%, whereas no alert reduces the probability to 10%.

However, for adults aged 60 years and older, an alert increases the probability of an individual having hypertension from a pretest level of 45% to 81%, whereas the absence of an alert only lowers it to 34%. This translates to large numbers of false negatives when applied across millions of users.
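These pretest-to-posttest shifts follow directly from Bayes' theorem applied to the published sensitivity (41%) and specificity (92%). A sketch follows; small discrepancies from the article's figures (e.g., ~0.45 vs. the reported 47% for under-30s) presumably reflect rounding of the published inputs:

```python
def posttest_probability(pretest, sensitivity, specificity, alert):
    """Probability of hypertension after an alert (or its absence),
    via Bayes' theorem on the published test characteristics."""
    if alert:
        true_pos = pretest * sensitivity
        return true_pos / (true_pos + (1 - pretest) * (1 - specificity))
    false_neg = pretest * (1 - sensitivity)
    return false_neg / (false_neg + (1 - pretest) * specificity)

# Under-30s (pretest 14%): alert -> ~0.45, no alert -> ~0.09
# 60-plus  (pretest 45%): alert -> ~0.81, no alert -> ~0.34
```

The asymmetry is the key point: in older, higher-pretest groups, the absence of an alert still leaves a roughly one-in-three chance of hypertension, which is why false reassurance is the main concern.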

In Apple’s validation study, the company stresses that the watch is not intended to replace traditional diagnosis methods or to be used as a method of BP surveillance, and that the absence of a notification does not indicate the absence of hypertension.

“The concern is, if you’re not getting an alert, will people interpret that as them not having hypertension,” said Bress. “That’s the worry. … The groups in which the negative alert is the least trustworthy contain the people with the highest risk. We’re most worried about people being falsely reassured.”

At the same time, he is clear that wearables should not be dismissed. “This technology is an important step forward; we need more wearable tech that can screen,” he said.

Unfortunately, access to these devices is not universal. Advanced monitoring technologies are often first adopted by the “worried well”—people who are more affluent and health-conscious—rather than those at highest risk.

“The only thing that can change this is a clear political decision to make awareness of hypertension large scale,” said Solà. Devices like the Hilo band could be used much like the continuous glucose monitors for diabetes. The difference is that if someone with diabetes doesn’t keep their blood glucose levels under control through regular monitoring, they can become ill very quickly. With hypertension, the effects of poor control don’t become apparent for decades.

“We need the policymakers to understand that investing in this technology today will have a return on investment in 10 years from now, not in one year from now,” Solà remarked.

Targeted drug selection

Even when hypertension is detected early and monitored closely, treatment remains largely empirical and can lead to therapeutic inertia, one of the biggest current challenges in hypertension care. “BP is not like diabetes, it doesn’t cause symptoms, and because of that, we don’t escalate treatment often enough,” said Padmanabhan.

At the same time, treatment selection remains largely trial-and-error. Clinicians cycle through medications sequentially, adjusting regimens based on response rather than underlying biology. The issue is that failed attempts risk side effects and can erode trust. That lack of trust can then impact adherence and, therefore, cardiovascular risk.

Instead, Padmanabhan believes that we need to move toward mechanistically informed drug selection.

This approach is common in oncology, where targeted therapies are matched to specific mutations, but the picture is more complex for BP. More than 30 genes have been linked to monogenic forms of hypertension or hypotension, and genome-wide association studies (GWAS) have identified more than 2,100 single nucleotide polymorphisms associated with BP regulation, underscoring its highly polygenic nature.

This, combined with the strong influence of environmental factors, means that there is no single pathway or biomarker that can be easily targeted to reduce BP.

Padmanabhan’s work on the uromodulin gene (UMOD), however, shows that GWAS data can translate into therapy. His team identified a signal on chromosome 16 linked to uromodulin, a protein that is expressed only in one part of the kidney and plays a role in salt regulation. In a clinical trial comparing people with low BP to those with high BP, they found that people with the UMOD allele that increases protein expression experienced a sustained reduction in BP when treated with the loop diuretic torasemide, whereas the effect was only temporary and followed by rebound in those carrying the UMOD allele that lowers protein expression.

Approximately two-thirds of the population carry the UMOD allele that increases protein expression, meaning that loop diuretics like furosemide or torasemide, which are more commonly used to treat heart failure, could potentially be used in hypertension personalized by the patient’s genotype.

So far, “this is the only clinical trial from a GWAS-identified genetic variant in hypertension,” Padmanabhan noted, highlighting both the promise and challenge of pharmacogenomics in hypertension.

Although clinical translation from GWAS of hypertension has been limited, research has shown that genetic variation in drug-metabolizing enzymes can significantly impact hypertension treatment efficacy and toxicity. For example, variants of CYP2D6 affect metoprolol metabolism whereas those in CYP2C9 influence responses to losartan. Research is needed to determine whether testing for these variants or others could reduce trial-and-error prescription, minimize side effects, and thus increase patient confidence and long-term engagement.

Teresa Castiello
Teresa Castiello, MD
Director
MIAL Healthcare

On a more fundamental level, biological sex differences remain a significant consideration in cardiovascular medicine. “Biological factors are an integral part of the clinical picture,” noted Teresa Castiello, MD, consultant cardiologist and director of MIAL Healthcare in London. She points out that clinical trials have historically seen a predominance of male participants; as a result, many standard medication dosages are based on data primarily derived from men.

This can lead to challenges with tolerability and a higher incidence of side effects in women as the therapeutic dose required for efficacy often tends to be lower in female patients.

Castiello suggests that this area of management warrants further refinement in clinical practice. She also emphasizes that key aspects of female cardiovascular risk, including reproductive history, menopause, and conditions like polycystic ovary syndrome, are nuances that may not always receive the necessary focus in routine care.

Toward a precise, preventative system

Ultimately, transforming hypertension care will require more than new technologies or therapies. It will require a fundamental change in how care is delivered.

Padmanabhan argues that hypertension should be managed through a “precision prevention service” that integrates early detection, continuous monitoring, and personalized treatment, and involves more than just PCPs.

This approach recognizes that the disease is not just a clinical condition but a societal one, influenced by factors such as diet, socioeconomic status, work patterns, and access to care. Equity remains another critical issue. “We treat the ideal average patient under ideal circumstances but that’s not reality,” said Padmanabhan.

There also needs to be a cultural shift, said Castiello. “It’s not just the doctor’s responsibility; we also need to take responsibility for our own health.”

Solà shares a similar vision for the future: he would like BP measurement to become as routine as brushing your teeth, supported by technologies that empower individuals and reduce the burden on healthcare systems.

If realized, this shift could transform hypertension from a silent, progressive disease into a manageable, preventable condition, saving millions of lives in the process.

 

Laura Cowen is a freelance medical journalist who has been covering healthcare news for over 10 years. Her main specialties are oncology and diabetes, but she has written about subjects ranging from cardiology to ophthalmology and is particularly interested in infectious diseases and public health.

The post From Reactive to Proactive: Reimagining Hypertension Management in the Precision Medicine Era appeared first on Inside Precision Medicine.

The Download: AI’s impact on jobs, and data centers in space

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

The one piece of data that could actually shed light on your job and AI 

Within Silicon Valley’s orbit, an AI-fueled jobs apocalypse is spoken about as a given. Now even economists who have downplayed the threat are coming around to the idea.  

Alex Imas, based at the University of Chicago, is one of them. He believes that any plan to address AI’s impact will depend on collecting one vital piece of data: price elasticity. 

Imas argues that “we need a Manhattan Project” for this. Read the full story to find out why.

—James O’Donnell 

This article is from The Algorithm, our weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday. 

Four things we’d need to put data centers in space 

In January, Elon Musk’s SpaceX applied to launch up to 1 million data centers into Earth’s orbit. The goal? To fully unleash the potential of AI—without triggering an environmental crisis on Earth. 

SpaceX is among a growing list of tech firms pursuing orbital computing infrastructure. But can their plans really work? Here are four must-haves for making space-based data centers a reality.

—Tereza Pultarova 

This story is part of MIT Technology Review Explains, our series untangling the complex, messy world of technology to help you understand what’s coming next. You can read more from the series here. 

The must-reads 

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology. 

1 Trump has again proposed major cuts to US science and tech spending
He wants to slash nearly every science-focused agency. (Ars Technica)
+ If Trump gets his way, the US could face a costly brain drain. (NYT $)
+ Top research talent is already fleeing the country. (Guardian)
+ Basic science deserves our boldest investment. (MIT Technology Review)

2 Sam Altman lobbied against AI regulations he publicly welcomed
A bombshell report reveals many OpenAI insiders don’t trust him. (The New Yorker $)
+ Some have called him a sociopath. (Futurism)
+ OpenAI’s CFO fears it won’t be IPO-ready this year. (The Information $)
+ A war over AI regulation is brewing in the US. (MIT Technology Review)

3 NASA’s Artemis II has broken humanity’s all-time distance record
The astronauts have flown farther than any humans before them. (BBC)
+ Their mission includes MIT-developed technology. (Axios)

4 Chinese tech firms are selling intel “exposing” US forces
It comes from combining AI with open-source data. (WP $)
+ AI is turning the Iran conflict into theater. (MIT Technology Review)

5 War is pushing countries to ditch hyperscalers
Driven by Iran naming tech giants as military targets. (Rest of World)
+ No one wants a data center in their backyard. (MIT Technology Review)

6 OpenAI, Anthropic, and Google have united against China’s AI copying
They’re sharing information on “adversarial distillation.” (Bloomberg $)

7 Anduril and Impulse Space are working on Trump’s “Golden Dome” 
They’re developing space-based missile tracking for the project. (Gizmodo)  

8 OpenAI has urged California to probe Elon Musk’s “anti-competitive behavior”
It accuses Musk of trying to “take control of the future of AGI.” (Reuters $)
+ And claims he coordinated attacks with Mark Zuckerberg. (CNBC)
+ A former Tesla president has revealed how he survived working for Musk. (WP $)

9 DeepSeek’s new AI model will run on Huawei chips 
It’s expected to launch in the next few weeks. (The Information $) 

10 Memes have nuked our culture 
Internet “brain rot” has escaped our phones to take over everything. (NYT $) 

Quote of the day 

“I must say, it was actually quite nice.” 

—Astronaut Victor Glover tells President Donald Trump what it was like when Artemis II was out of communication with the rest of humanity, The New York Times reports.

One More Thing 

eucalyptus forest

PABLO ALBARENGA

Inside the controversial tree farms powering Apple’s carbon-neutral goal  

In 2020, Apple set a goal to become net zero by the end of the decade. To hit that target, the company is offsetting its emissions by planting millions of eucalyptus trees in Brazil. 

Apple is betting that the strategy will lead to a greener future. But critics warn that the industrial tree farms will do more harm than good. 

Find out why the plans have sparked a backlash. 

—Gregory Barber 

We can still have nice things 

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line.) 

+ Japan’s automated bike garage is a cyclist’s dream come true.  
+ This deep dive into bird behavior reveals the secrets of their dining habits. (Big thanks to reader Terry Gordon for the find!) 
+ The first photo from the Artemis astronauts vividly captures the glow of our atmosphere. 
+ There’s a new contender for the world’s most gorgeous website: RobertDeNiro.com. 

STAT+: Biotech investors’ plea to Trump, and a busy M&A week

Want to stay on top of the science and politics driving biotech today? Sign up to get our biotech newsletter in your inbox.

The Trump administration is using newly announced 100% tariffs as leverage to push both large and small drugmakers into confidential pricing and manufacturing agreements.

Also, the burgeoning peptide craze is highlighting a trust gap in medicine, in which patients increasingly favor unproven treatments over well-established drugs.

Continue to STAT+ to read the full story…