Abortions Before Fetal Viability Are Legal: Might Science and the Change on the Supreme Court Undermine That?
This article is part of the magazine, "The Future of Science In America: The Election Issue," co-published by LeapsMag, the Aspen Institute Science & Society Program, and GOOD.
Viability—the potential for a fetus to survive outside the womb—is a core dividing line in American law. For almost 50 years, the Supreme Court of the United States has struck down laws that ban all or most abortions, ruling that women's constitutional rights include choosing to end pregnancies before the point of viability. Once viability is reached, however, states have a "compelling interest" in protecting fetal life. At that point, states can choose to ban or significantly restrict later-term abortions provided states allow an exception to preserve the life or health of the mother.
This distinction between a fetus that could survive outside its mother's body, albeit with significant medical intervention, and one that could not, is at the heart of the court's landmark 1973 decision in Roe v. Wade. The framework of viability remains central to the country's abortion law today, even as some states have passed laws in the name of protecting women's health that significantly undermine Roe. Over the last 30 years, the Supreme Court has upheld these laws, which have the effect of restricting pre-viability abortion access, imposing mandatory waiting periods, requiring parental consent for minors, and placing restrictions on abortion providers.
Today, the Guttmacher Institute reports that more than half of American women live in states whose laws are considered hostile to abortion, largely as a result of these intrusions on pre-viability abortion access. Nevertheless, the viability framework stands: while states can pass pre-viability abortion restrictions that (ostensibly) protect the health of the woman or that strike some kind of balance between women's rights and fetal life, it is only after viability that they can completely favor fetal life over the rights of the woman (with limited exceptions when the woman's life is threatened). As a result, judges have struck down certain states' so-called heartbeat laws, which tried to prohibit abortions after detection of a fetal heartbeat (as early as six weeks of pregnancy). Bans on abortion after 12 or 15 weeks' gestation have likewise been struck down.
Now, with a new Supreme Court Justice expected to be hostile to abortion rights, advances in the care of preterm babies and ongoing research on artificial wombs suggest that the point of viability is already earlier than many assume and could soon be pushed radically earlier in gestation, potentially providing a legal basis for earlier and earlier abortion bans.
Viability has always been a slippery notion on which to pin legal rights. It represents an inherently variable and medically shifting moment in the pregnancy timeline that the Roe majority opinion declined to firmly define, noting instead that "[v]iability is usually placed at about seven months (28 weeks) but may occur earlier, even at 24 weeks." Even in 1973, this definition was an optimistic generalization. Every baby is different, and while some 28-week infants born the year Roe was decided did indeed live into adulthood, most died at or shortly after birth. The prognosis for infants born at 24 weeks was much worse.
Today, a baby born at 28 weeks' gestation can be expected to do much better, largely due to the development of surfactant treatment in the early 1990s, which helps premature babies' immature lungs inflate and take in air. Now, most babies born at 24 weeks survive, and several extremely premature babies, born just shy of 22 weeks' gestation, have lived into childhood. All this variability raises the question: Should the law take a very optimistic, if largely unrealistic, approach to defining viability and place it at 22 weeks, even though the overall survival rate for those preemies remains less than 10% today? Or should the law recognize that keeping a premature infant alive requires specialist care, meaning that actual viability differs not just from pregnancy to pregnancy but also from healthcare facility to healthcare facility and from country to country? A 24-week premature infant born in a rural area or in a developing nation may not be viable as a practical matter, while one born in a major U.S. city with access to state-of-the-art care has a greater than 70% chance of survival. And just as some extremely premature newborns survive, some full-term babies die before, during, or soon after birth, regardless of whether they have access to advanced medical care.
To be accurate, viability should be understood as pregnancy-specific and should take into account the healthcare resources available to that woman. But state laws can't capture this degree of variability with the fixed gestational limits they write into their abortion statutes. Instead, many draw a somewhat arbitrary line at 22, 24, or 28 weeks' gestation, regardless of the particulars of the pregnancy or the medical resources available in that state.
As variable and resource-dependent as viability is today, science may soon move that point even earlier. Ectogenesis is a term coined in 1923 for the growth of an organism outside the body. Long considered science fiction, this technology has made several key advances in the past few years, with scientists announcing in 2017 that they had successfully gestated premature lamb fetuses in an artificial womb for four weeks. Currently in development for use in human fetuses between 22 and 23 weeks' gestation, this technology will almost certainly seek to push viability earlier in pregnancy.
Ectogenesis and other improvements in managing preterm birth deserve to be celebrated, offering new hope to the parents of very premature infants. But in the U.S., and in other nations whose abortion laws are fixed to viability, these same advances also pose a threat to abortion access. Abortion opponents have long sought to move the cutoff for legal abortions, and it is not hard to imagine a state prohibiting all abortions after 18 or 20 weeks by arguing that medical advances render this stage "the new viability," regardless of whether that level of advanced care is available to women in that state. If ectogenesis advances further, the limit could be moved to keep pace.
The Centers for Disease Control and Prevention reports that over 90% of abortions in America are performed at or before 13 weeks, meaning that in the short term, only a small number of women would be affected by shifting viability standards. Yet these women are in difficult situations and deserve care and consideration. Research has shown that women seeking later terminations often did not recognize that they were pregnant or had their dates quite wrong, while others report that they had trouble accessing a termination earlier in pregnancy, were afraid to tell their partner or parents, or only recently received a diagnosis of health problems with the fetus.
Shifts in viability over the past few decades have already affected these women, many of whom report struggling to find a provider willing to perform a termination at 18 or 20 weeks out of concern that the woman may have her dates wrong. Ever-earlier gestational limits would continue this chilling effect, making doctors leery of terminating a pregnancy that might be within 2–4 weeks of each new ban. Some states' existing gestational limits on abortion are also inconsistent with prenatal care, which includes genetic testing between 12 and 20 weeks' gestation, as well as an anatomy scan to check the fetus's organ development performed at approximately 20 weeks. If viability moves earlier, prenatal care will be further undermined.
Perhaps most importantly, earlier and earlier abortion bans are inconsistent with the rights and freedoms on which abortion access is based, including recognition of each woman's individual right to bodily integrity and decision-making authority over her own medical care. Those rights and freedoms become meaningless if abortion bans encroach into the weeks that women need to recognize they are pregnant, assess their options, seek medical advice, and access appropriate care. Fetal viability, with its shifting goalposts, isn't the best framework for abortion protection in light of advancing medical science.
Ideally, whether to have an abortion would be a decision that women make in consultation with their doctors, free of state interference. The vast majority of women already make this decision early in pregnancy; the few who come to the decision later do so because something has gone seriously wrong in their lives or with their pregnancies. If states insist on drawing lines based on historical measures of viability, at 24 or 26 or 28 weeks, they should stick with those gestational limits and admit that they no longer represent actual viability but correspond instead to some form of common morality about when the fetus has a protected, if not absolute, right to life. Women need a reasonable amount of time to make careful and informed decisions about whether to continue their pregnancies precisely because these decisions have a lasting impact on their bodies and their lives. To preserve that time, legislators and the courts should decouple abortion rights from ectogenesis and other advances in the care of extremely premature infants that move the point of viability ever earlier.
[Editor's Note: This article was updated after publication to reflect Amy Coney Barrett's confirmation. To read other articles in this special magazine issue, visit the e-reader version.]
Jamie Rettinger was still in his thirties when he first noticed a tiny streak of brown running through the thumbnail of his right hand. It slowly grew wider and the skin underneath began to deteriorate before he went to a local dermatologist in 2013. The doctor thought it was a wart and tried scooping it out, treating the affected area for three years before finally removing the nail bed and sending it off to a pathology lab for analysis.
"I have some bad news for you; what we removed was a five-millimeter melanoma, a cancerous tumor that often spreads," Jamie recalls being told on his return visit. "I'd never heard of cancer coming through a thumbnail," he says. None of his doctors had ever mentioned it either. "I just thought I was being treated for a wart." But nothing was healing and it continued to bleed.
A few months later a surgeon amputated the top half of his thumb. A lymph node biopsy tested negative for spread of the cancer, and when the bandages finally came off, Jamie thought his medical issues were resolved.
Melanoma is the deadliest form of skin cancer. About 85,000 people are diagnosed with it each year in the U.S. and more than 8,000 die of the cancer when it spreads to other parts of the body, according to the Centers for Disease Control and Prevention (CDC).
There are two peaks in melanoma diagnosis: one is in younger women ages 30-40 and often is tied to past use of tanning beds; the second is in men over 60 and is related to outdoor activity, from farming to sports. Light-skinned people have a twenty-times greater risk of melanoma than people with dark skin.
Jamie had a follow-up PET scan about six months after his surgery. A suspicious spot on his lung led to a biopsy that came back positive for melanoma. The cancer had spread. Treatment with a monoclonal antibody (nivolumab/Opdivo®) didn't prove effective, and he was referred to the Hillman Cancer Center at the University of Pittsburgh Medical Center, a four-hour drive from his home in western Ohio.
An alternative monoclonal antibody treatment brought on such bad side effects (diarrhea as often as 15 times a day) that it took more than a week of hospitalization to stabilize his condition. The only options left were experimental approaches in clinical trials.
"When I graduated from medical school, in 2005, melanoma was a death sentence" with a cure rate in the single digits, says Dr. Diwakar Davar, 39, an oncologist at Hillman who specializes in skin cancer. That began to change in 2010 with the introduction of the first immunotherapies for cancer, monoclonal antibodies that attach to PD-1, a receptor found on the surface of the immune system's T cells and on cancer cells. Antibody treatment boosted the melanoma cure rate to about 30 percent. The search was on to understand why some people responded to these drugs and others did not.
At the same time, there was a growing understanding of the role that bacteria in the gut, collectively known as the gut microbiome, play in helping to train and maintain the function of the body's various immune cells. Perhaps these bacteria also play a role in shaping the immune response to cancer therapy.
One clue came from genetically identical mice. Animals ordered from different suppliers sometimes responded differently to the experiments being performed. That difference was traced to differing compositions of their gut microbiomes; transferring the microbiome from one animal to another, in a process known as fecal microbiota transplant (FMT), could change their responses to disease or treatment.
When researchers looked at humans, they found that patients who responded well to immunotherapies had gut microbiomes resembling those of healthy people, while patients who didn't respond had missing or reduced strains of bacteria.
Davar knew that FMT had a very successful cure rate in treating the gut dysbiosis of C. difficile infection and he wondered if a fecal transplant from a patient who had responded well to cancer immunotherapy treatment might improve the cure rate of patients who did not originally respond to immunotherapies for melanoma.
"It was pretty weird, I was totally blasted away. Who had thought of this?" Jamie first thought when the hypothesis was explained to him. But Davar's explanation that the procedure might restore some of the beneficial bacteria his gut was lacking convinced him to try. He quickly signed on in October 2018 to be the first person in the clinical trial.
Fecal donations go through the same kind of safety procedures, screening for and inactivating pathogens, that are used to make donated blood safe for transfusion. The procedure itself uses a standard hollow colonoscope of the kind designed to screen for colon cancer and remove polyps; the transplant is inserted through the center of the flexible tube.
Most patients are sedated for procedures that use a colonoscope but Jamie doesn't respond to those drugs: "You can't knock me out. I was watching them on the TV going up my own butt. It was kind of unreal at that point," he says. "There were about twelve people in there watching because no one had seen this done before."
A test two weeks after the procedure showed that the FMT had engrafted and the once-missing bacteria were thriving in his gut. More importantly, his body was responding to another monoclonal antibody (pembrolizumab/Keytruda®), and his melanoma lesions began to shrink. Every three months he made the four-hour drive from home to Pittsburgh for six rounds of treatment with the antibody drug.
"We were very, very lucky that the first patient had a great response," says Davar. "It allowed us to believe that even though we failed with the next six, we were on the right track. We just needed to tweak the [fecal] cocktail a little better" and enroll patients in the study who had less aggressive tumor growth and were likely to live long enough to complete the extensive rounds of therapy. Six of 15 patients responded positively in the pilot clinical trial that was published in the journal Science.
Davar believes they are beginning to understand the biological mechanism of why some patients do not initially respond to immunotherapy but later can after an FMT. It is tied to the background level of inflammation produced by the interaction between the microbiome and the immune system. That paper is not yet published.
It has been almost a year since the last in his series of cancer treatments and Jamie has no measurable disease. He is cautiously optimistic that his cancer is not simply in remission but is gone for good. "I'm still scared every time I get my scans, because you don't know whether it is going to come back or not. And to realize that it is something that is totally out of my control."
"It was hard for me to regain trust" after being misdiagnosed and mistreated by several doctors, he says. But his experience at Hillman helped to restore that trust "because they were interested in me, not just fixing the problem."
He is grateful for the support provided by family and friends over the last eight years. After a pause and a sigh, the ruggedly built 47-year-old says, "If everyone else was dead in my family, I probably wouldn't have been able to do it."
"I never hesitated to ask a question and I never hesitated to get a second opinion." But Jamie acknowledges the experience has made him more aware of the need for regular preventive medical care and a primary care physician. That person might have caught his melanoma at an earlier stage when it was easier to treat.
Davar continues to work on clinical studies to optimize this treatment approach. Perhaps down the road, screening the microbiome will be standard for melanoma and other cancers prior to immunotherapy, and FMT will be as simple as swallowing a handful of freeze-dried capsules off the shelf rather than undergoing a colonoscopy.
In Sydney, Australia, in the basement of an inner-city high-rise, lives a mass of unexpected inhabitants: millions of maggots. The insects are far from unwelcome. They are there to feast on the food waste generated by the building's human residents.
Goterra, the start-up that installed the maggots in the building in December, belongs to the rapidly expanding insect agriculture industry, which is experiencing a surge of investment worldwide.
The maggots – the larvae of the black soldier fly – are voracious, unfussy eaters. As adult flies, they don't eat, so the young fatten up swiftly on whatever they can get. Goterra's basement colony can munch through 5 metric tons of waste in a day.
"Maggots are nature's cleaners," says Bob Gordon, Head of Growth at Goterra. "They're a great tool to manage waste streams."
Their capacity to consume presents a neat response to the problem of food waste, which contributes up to 8% of global greenhouse gas emissions each year as it rots in landfill.
"The maggots eat the food fairly fresh," Gordon says. "So, there's minimal degradation and you don't get those methane emissions."
Alongside their ability to devour waste, the soldier fly larvae hold further agricultural promise: they yield an incredibly efficient protein. After the maggots have binged for about 12 days, Goterra harvests and processes them into a protein-rich livestock feed. Their excrement, known as frass, is also collected and turned into soil conditioner.
"We are producing protein in a basement," says Gordon. "It's urban farming – really sustainable, urban farming."
Goterra's module in the basement at Barangaroo, Sydney.
Supplied by Goterra
Goterra's founder Olympia Yarger started producing the insects in "buckets in her backyard" in 2016. Today, Goterra has a large-scale processing plant and has developed proprietary modules – in shipping containers – that use robotics to manage the larvae.
The modules have been installed on site at municipal buildings, hospitals, supermarkets, several McDonald's restaurants, and a range of smaller enterprises in Australia. Users pay a subscription fee and simply pour in the waste; Goterra visits once a fortnight to harvest the bugs.
Insect agriculture is well established outside of the West, and the practice is gaining traction around the world. China has mega-facilities that can process hundreds of tons of waste in a day. In Kenya, a program recently trained 2,000 farmers in soldier fly farming to boost their economic security. French biotech company InnovaFeed, in partnership with US agricultural heavyweight ADM, plans to build "the world's largest insect protein facility" in Illinois this year.
But the concept is still not to everyone's taste.
"This is still a topic that I say is a bit like black liquorice – people tend to either really like it or really don't," says Wendy Lu McGill, Communications Director at the North American Coalition of Insect Agriculture (NACIA).
Formed in 2016, NACIA now has over 100 members – including researchers and commercial producers of black soldier flies, mealworms and crickets.
McGill says there have been a few iterations of insect agriculture in the US – beginning with worms produced for bait after World War II, then shifting to food for exotic pets. The current focus – "insects as food and feed" – took root about a decade ago, with the establishment of the first commercial farms for this purpose.
"We're starting to see more expansion in the U.S. and a lot of the larger investments have been for black soldier fly producers," McGill says. "They tend to have larger facilities and the animal feed market they're looking at is potentially quite large."
InnovaFeed's Illinois facility is set to produce 60,000 metric tons of animal feed protein per year.
"They'll be trying to employ many different circular principles," McGill says of the project. "For example, the heat from the feed factory – the excess heat that would normally just be vented – will be used to heat the other side that's raising the black soldier fly."
Although commercial applications have started to flourish recently, scientific knowledge of the black soldier fly's potential has existed for decades.
Dr. Jeffery Tomberlin, an entomologist at Texas A&M University, has been studying the insect for over 20 years, contributing to key technologies used in the industry. He also founded Evo, a black soldier fly company in Texas, which feeds its larvae the waste from a local bakery and distillery.
"They are science fiction on earth," he says of the maggots. "Watching them work is awe-inspiring."
Tomberlin says fly farms can work effectively at different scales, and present possibilities for non-Western countries to shift towards "commodity independence."
"You don't have to have millions of dollars invested to be successful in producing this insect," he says. "[A farm] can be as simple as an open barn along the equator to a 30,000-square-foot indoor facility in the Netherlands."
As the world's population balloons, food insecurity is an increasing concern. By 2050, the UN predicts that to feed our projected population we will need to ramp up food production by at least 60%. Insect agriculture, which uses very little land and water compared to traditional livestock farming, could play a key role.
Insects may become more common human food, but the current commercial focus is animal feed. Aquaculture is a key market, with insects presenting an alternative to fish meal derived from over-exploited stocks. Insect meal is also increasingly popular in pet food, particularly in Europe.
While recent investment has been strong – NACIA says 2020 was the best year yet – reaching a scale that can match existing agricultural industries and providing a competitive price point are still hurdles for insect agriculture.
But COVID-19 has strengthened the argument for new agricultural approaches, such as the decentralized, indoor systems and circular principles employed by insect farms.
"This has given the world a preview – which no one wanted – of [future] supply chain disruptions," says McGill.
As the industry works to meet demand, Tomberlin predicts diversification and product innovation: "I think food science is going to play a big part in that. They can take an insect and create ice cream." (Dried soldier fly larvae "taste kind of like popcorn," if you were wondering.)
Tomberlin says the insects could even become an interplanetary protein source: "I do believe in that. I mean, if we're going to colonize other planets, we need to be sustainable."
But he issues a word of caution about the industry growing too big, too fast: "I think we as an industry need to be very careful of how we harness and apply [our knowledge]. The black soldier fly is considered the crown jewel today, but if it's mismanaged, it can be relegated back to a past."
Goterra's Gordon also warns against rushing into mass production: "If you're just replacing big intensive animal agriculture with big intensive animal agriculture with more efficient animals, then what's the change you're really effecting?"
But he expects the industry will continue its rise through the next decade, and Goterra – fuelled by recent $8 million Series A funding – plans to expand internationally this year.
"Within 10 years' time, I would like to see the vast majority of our unavoidable food waste being used to produce maggots to go into a protein application," Gordon says.
"There's no lack of demand. And there's no lack of food waste."