When Rattan Lal was awarded the Japan Prize for Biological Production, Ecology in April—the Asian equivalent of a Nobel—the audience at Tokyo's National Theatre included the emperor and empress. Lal's acceptance speech, however, was down-to-earth in the most literal sense.
"I'd like to begin, rather unconventionally, with the conclusion of my presentation," he told the assembled dignitaries. "And the conclusion is four words: In soil we trust."
That statement could serve as the motto for a climate crisis-fighting strategy that has gained remarkable momentum over the past five years or so—and whose rise to international prominence was reflected in that glittering award ceremony. Lal, a septuagenarian professor of soil science at Ohio State University, is one of the foremost exponents of carbon farming, an approach that centers on correcting a man-made, planetary chemical imbalance.
A Solution to Several Problems at Once?
The chemical in question is carbon. Too much of it in the atmosphere (in the form of carbon dioxide, a potent greenhouse gas) is the main driver of global heating. Too little of it in the soil is the bane of farmers in many parts of the world, and a threat to our ability to feed a ballooning global population. Advocates say agriculture can mitigate both problems—by adopting techniques that keep more soil carbon from escaping skyward, and draw more atmospheric carbon down into fields and pastures.
The potential impacts go beyond slowing climate change and boosting food production. "There are so many benefits," says Lal. "Water quality, drought, flooding, biodiversity—this is a natural solution for all these problems." That's because carbon, in its proper place, holds landscapes and ecosystems together. Plants extract it from the air and convert it into sugars for energy; they also transfer it to the soil through their roots and in the process of decomposition. In the ground, carbon feeds microbes and fungi that form the basis of complex food webs. It helps soil absorb and retain water, resist erosion, and hold onto nitrogen and phosphorus—keeping those nutrients from running off into waterways and creating toxic algal blooms.
Government and private support for research into carbon-conscious agriculture is on the rise, and growing numbers of farmers are exploring such methods. How much difference these methods can make, however, remains a matter of debate. Lal sees carbon farming as a way to buy time until CO2 emissions can be brought under control. Skeptics insist that such projections are overly optimistic. Some allies, meanwhile, think Lal's vision is too timid. "Farming can actually fix the climate," says Tim LaSalle, co-founder of the Center for Regenerative Agriculture at California State University, Chico. "That should be our only focus."
Yet Can soil solve the climate crisis? may not be the key question in assessing the promise of carbon farming, since it implies that action is worthwhile only if a solution is ensured. A more urgent line of inquiry might be: Can the climate crisis be solved without addressing soil?
A Chance Meeting Leads to the Mission of a Lifetime
Lal was among the earliest scientists to grapple with that question. Born in Pakistan, he grew up on a tiny subsistence farm in India, where his family had fled as refugees. The only one of his siblings who learned to read and write, he attended a local agricultural university, then headed to Ohio State on scholarship for his PhD. In 1982, he was working at the International Institute of Tropical Agriculture in Nigeria, trying to develop sustainable alternatives to Africa's traditional slash-and-burn farming, when a distinguished visitor dropped by: oceanographer Roger Revelle, who 25 years earlier had published the first paper warning that fossil fuel combustion could throw the climate dangerously off-kilter.
Rattan Lal, Distinguished University Professor of Soil Science at Ohio State, received the Japan Prize at a ceremony in April.
(Photo: Ken Chamberlain. CFAES.)
Lal showed Revelle the soil in his test plots—hard and reddish, like much of Africa's agricultural land. Then (as described in Kristin Ohlson's book The Soil Will Save Us), he led the visitor to the nearby forest, where the soil was dark, soft, and wriggling with earthworms. In the forest, the soil's carbon content was 2 to 3 percent; in Lal's plots, it had dwindled to 0.5 percent. When Revelle asked him where all that carbon had gone, Lal confessed he didn't know. Revelle suggested that much of it might have floated into the atmosphere, adding to the burden of greenhouse gases. "Since then," Lal told me, "I've been looking for ways to put it back."
Back at Ohio State, Lal found that the United States Department of Agriculture (USDA) and Environmental Protection Agency (EPA) were also interested in the connection between soil carbon and climate change. With a small group of other scientists, he began investigating the dimensions of the problem, and how it might be solved.
Comparing carbon in forested and cultivated soils around the globe, the researchers calculated that about 100 billion tons had vanished into the air since the dawn of agriculture 10,000 years ago. The culprits were common practices—including plowing, overgrazing, and keeping fallow fields bare—that exposed soil carbon to oxygen, transforming it into carbon dioxide. Yet the process could also be reversed, Lal and his colleagues argued. Although there was a limit to the amount of carbon that soil could hold, they theorized that it would be possible to sequester several billion tons of global CO2 emissions each year for decades before reaching maximum capacity.
Lal set up projects on five continents to explore practices that could help accomplish that goal, such as minimizing tillage, planting cover crops, and leaving residue on fields after harvest. He organized conferences, pumped out papers and books. As other researchers launched similar efforts, policymakers worldwide took notice.
But before long, recalls Colorado State University soil scientist Keith Paustian (a fellow carbon-farming pioneer, who served with Lal on the UN's Intergovernmental Panel on Climate Change), official attention "kind of faded away. The bigger imperative was to cut emissions." And because agriculture accounted for only about 13 percent of greenhouse gas pollution, Paustian says, the sectors that emitted the most—energy and transportation—got the bulk of funding.
A Movement on the Rise
In recent years, however, carbon farming has begun to look like an idea whose time has come. One factor is that efforts to reduce emissions haven't worked; in 2018 alone, global CO2 output rose by an estimated 2.7 percent, according to the Global Carbon Project. Last month, researchers from the Scripps Institution of Oceanography reported that atmospheric CO2—under 350 ppm when Lal began his quest—had reached 415 ppm, the highest in 3 million years. And with the world's population expected to approach 10 billion by 2050, the need for sustainable technologies to augment food production has grown increasingly pressing.
Today, carbon-conscious methods are central to the burgeoning movement known as "regenerative agriculture," which also embraces other practices aimed at improving soil health and farming in an ecologically sound (though not always strictly organic) manner. In the United States, the latest Farm Bill includes $25 million to incentivize soil-based carbon sequestration. State and local governments across the country are supporting such efforts, as are at least a dozen nonprofits. The Department of Energy's Advanced Research Projects Agency-Energy (ARPA-E) is working to develop crops and technologies aimed at increasing soil carbon accumulation by 50 percent. General Mills recently announced plans to advance regenerative farming on 1 million acres by 2030, and many smaller companies have made their own commitments.
Internationally, the biggest effort is the French-led "4 per 1,000" initiative, which aims to increase the amount of carbon in the soil of farms and rangelands worldwide by 0.4 percent per year—a rate that the project's website contends would "halt the increase of CO2 (carbon dioxide) concentration in the atmosphere related to human activities."
Given the current pace of research, Lal believes that goal—which equates to sequestering 3.6 billion tons of CO2 annually, or 10 percent of global emissions—is doable. The toughest challenge, he suggests, may be persuading farmers to change their ways. Although carbon farming can reduce costs for chemical inputs such as herbicides and fertilizers, while building rich topsoil, agriculturalists tend to be a conservative lot.
And getting low-income farmers to leave crop residue on fields, instead of using it for fuel or animal feed, will require more than speeches about melting glaciers. Lal proposes a $16 per acre subsidy, totaling $64 billion for the world's 4 billion acres of cropland. "That's not a very large amount," he says, "if you're investing in the health of the planet."
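The figures quoted above hang together arithmetically. A quick back-of-envelope check (note: the ~36-billion-ton annual global emissions total is inferred here from the "10 percent" claim, not stated directly in the article):

```python
# Sanity-check the carbon-farming numbers quoted in the text.
# The global emissions figure is inferred from the "10 percent" claim.

subsidy_per_acre = 16          # dollars per acre, Lal's proposal
cropland_acres = 4e9           # world cropland, acres
total_subsidy = subsidy_per_acre * cropland_acres
print(f"Total subsidy: ${total_subsidy / 1e9:.0f} billion")   # $64 billion, as quoted

sequestered_tons = 3.6e9       # tons CO2/year under the "4 per 1,000" goal
share_of_emissions = 0.10      # quoted as 10 percent of global emissions
implied_emissions = sequestered_tons / share_of_emissions
print(f"Implied global emissions: {implied_emissions / 1e9:.0f} billion tons CO2/year")
```

Both quoted figures check out: $16 across 4 billion acres is indeed $64 billion, and 3.6 billion tons is 10 percent of roughly 36 billion tons of annual global CO2 emissions.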
Experimental Methods Attract Supporters and Skeptics
Some experts question whether enough CO2 can be stashed in the soil to prevent the rise in average global temperature from surpassing the 2°C mark—set by the 2015 Paris Agreement as the limit beyond which climate change would become catastrophic. But others insist that carbon farming's goal should be to reverse climate change, not just to put it on pause.
"That's the only way out of this predicament," says Tim LaSalle, whose Center for Regenerative Agriculture supports the use of experimental methods ranging from multi-species cover cropping to fungal-dominant compost solutions. Using such techniques, a few researchers and farmers claim to be able to transfer carbon to the soil at rates many times higher than with established practices. Although several of these methods have yet to be documented in peer-reviewed studies, LaSalle believes they point the way forward. "We can't fix the climate, or even come close to it, using Rattan's numbers," he says, referring to Lal. "If we can replicate these experiments, we can fix it."
Even scientists sympathetic to regenerative ag warn that relying on unproven techniques is risky. "Some of these claims are beyond anything we've seen in agricultural science," says Andrew McGuire, an agronomist at Washington State University. "They could be right, but extraordinary claims require extraordinary evidence."
Still, the assorted methods currently being tested—which also include amending soil with biochar (made by heating agricultural wastes with minimal oxygen), planting long-rooted perennial crops instead of short-rooted annuals, and deploying grazing animals in ways that enrich soil rather than depleting it—offer a catalogue of hope at a time when environmental despair is all too tempting.
Last October, the National Academies of Sciences, Engineering, and Medicine issued a report acknowledging that it was too late to stave off apocalyptic overheating just by reducing CO2 emissions; removing carbon from the atmosphere would be necessary as well. The document laid out several options for doing so—most of which, it cautioned, had serious limitations.
One possibility was planting more forests. To absorb enough carbon dioxide, however, trees might have to replace areas of farmland, reducing the food supply. Another option was creating biomass plantations to fuel power plants, whose emissions would be stored underground. But land use would be a problem: "You'd need to cover an area the size of India," explains Paustian, who was a co-author of the report. Yet another alternative was direct-air capture, in which chemical processes would be used to extract CO2 from the air. The technology was still in its infancy, though—and the costs and power requirements would likely be astronomical.
The report took up agriculture-based methods on page 95. Those needed further research as well, the authors wrote, to determine which approaches would be most effective. But of all the alternatives, this one seemed the least problematic. "Soil carbon is probably what you can do first, cheapest, and with the most additional co-benefits," says Paustian. "If we can make progress in that area, it's a huge advantage."
In any case, he and other researchers agree, we have little choice but to try. "Soil is a bridge to the future," Lal says. "We can't do without it."
Astronauts at the International Space Station today depend on pre-packaged, freeze-dried food, plus some fresh produce delivered by regular resupply missions. That supply chain, however, will not be available on trips farther out, to the moon or Mars. So what are astronauts on long missions going to eat?
Going by the options available now, says Christel Paille, an engineer at the European Space Agency, a lunar expedition is likely to have only dehydrated foods. “So no more fresh product, and a limited amount of already hydrated product in cans.”
For the Mars mission, the situation is a bit more complex, she says. Prepackaged food could still constitute most of their food, “but combined with [on site] production of certain food products…to get them fresh.” A Mars mission isn’t right around the corner, but scientists are currently working on solutions for how to feed those astronauts. A number of boundary-pushing efforts are now underway.
The logistics of growing plants in space, of course, are very different from those on Earth. There is no gravity, sunlight, or atmosphere. High levels of ionizing radiation stunt plant growth. Plus, plants take up a lot of space, something that is, ironically, at a premium up there. These constraints, along with the special nutritional requirements of spacefarers, have given scientists some specific and challenging problems.
To study fresh food production systems, NASA runs the Vegetable Production System (Veggie) on the ISS. Deployed in 2014, Veggie has been growing salad-type plants on “plant pillows” filled with growth media, including a special clay and controlled-release fertilizer, and a passive wicking watering system. They have had some success growing leafy greens and even flowers.
NASA also runs a larger farming facility on the ISS, the Advanced Plant Habitat (APH), to study how plants grow in space. This fully automated, closed-loop system has an environmentally controlled growth chamber and is equipped with sensors that relay real-time information about temperature, oxygen content, and moisture levels back to the ground team at Kennedy Space Center in Florida. In December 2020, the ISS crew feasted on radishes grown in the APH.
“But salad doesn’t give you any calories,” says Erik Seedhouse, a researcher at the Applied Aviation Sciences Department at Embry-Riddle Aeronautical University in Florida. “It gives you some minerals, but it doesn’t give you a lot of carbohydrates.” Seedhouse also noted in his 2020 book Life Support Systems for Humans in Space: “Integrating the growing of plants into a life support system is a fiendishly difficult enterprise.” As a case in point, he referred to the ESA’s Micro-Ecological Life Support System Alternative (MELiSSA) program, which has been running since 1989 to integrate plant cultivation into a closed life support system such as a spacecraft.
Paille, one of the scientists running MELiSSA, says that the system aims to recycle the metabolic waste produced by crew members back into the metabolic resources required by them: “The aim is…to come [up with] a closed, sustainable system which does not [need] any logistics resupply.” MELiSSA uses microorganisms to process human excretions in order to harvest carbon dioxide and nitrate to grow plants. “Ideally, we would like a system which has zero waste and, therefore, needs zero input, zero additional resources,” Paille adds.
Microorganisms play a big role as “fuel” in food production in extreme places, including in space. Last year, researchers discovered Methylobacterium strains on the ISS, including some never-before-seen species. Kasthuri Venkateswaran of NASA’s Jet Propulsion Laboratory, one of the researchers involved in the study, says, “[The] isolation of novel microbes that help to promote the plant growth under stressful conditions is very essential… Certain bacteria can decompose complex matter into a simple nutrient [that] the plants can absorb.” These microbes, which have already adapted to space conditions—such as the absence of gravity and increased radiation—boost various plant growth processes and help plants withstand the harsh physical environment.
MELiSSA, says Paille, has demonstrated that it is possible to grow plants in space. “This is important information because…we didn’t know whether the space environment was affecting the biological cycle of the plant…[and of] cyanobacteria.” With the scientific and engineering aspects of a closed, self-sustaining life support system becoming clearer, she says, the next stage is to find out if it works in space. They plan to run tests recycling human urine into useful components, including those that promote plant growth.
The MELiSSA pilot plant uses rats currently, and needs to be translated for human subjects for further studies. “Demonstrating the process and well-being of a rat in terms of providing water, sufficient oxygen, and recycling sufficient carbon dioxide, in a non-stressful manner, is one thing,” Paille says, “but then, having a human in the loop [means] you also need to integrate user interfaces from the operational point of view.”
Growing food in space comes with an additional caveat that underscores its high stakes. Barbara Demmig-Adams from the Department of Ecology and Evolutionary Biology at the University of Colorado Boulder explains, “There are conditions that actually will hurt your health more than just living here on earth. And so the need for nutritious food and micronutrients is even greater for an astronaut than for [you and] me.”
Demmig-Adams, who has worked on increasing the nutritional quality of plants for long-duration spaceflight missions, also adds that there is no need to reinvent the wheel. Her work has focused on duckweed, a rather unappealingly named aquatic plant. “It is 100 percent edible, grows very fast, it’s very small, and like some other floating aquatic plants, also produces a lot of protein,” she says. “And here on Earth, studies have shown that the amount of protein you get from the same area of these floating aquatic plants is 20 times higher compared to soybeans.”
Aquatic plants also tend to grow well in microgravity: “Plants that float on water, they don’t respond to gravity, they just hug the water film… They don’t need to know what’s up and what’s down.” On top of that, she adds, “They also produce higher concentrations of really important micronutrients, antioxidants that humans need, especially under space radiation.” In fact, duckweed, when subjected to high amounts of radiation, makes nutrients called carotenoids that are crucial for fighting radiation damage. “We’ve looked at dozens and dozens of plants, and the duckweed makes more of this radiation fighter…than anything I’ve seen before.”
Despite all the scientific advances and promising leads, no one really knows what the conditions so far out in space will be and what new challenges they will bring. As Paille says, “There are known unknowns and unknown unknowns.”
One definite “known” for astronauts is that growing their food is the ideal scenario for space travel in the long term since “[taking] all your food along with you, for best part of two years, that’s a lot of space and a lot of weight,” as Seedhouse says. That said, once they land on Mars, they’d have to think about what to eat all over again. “Then you probably want to start building a greenhouse and growing food there [as well],” he adds.
And that is a whole different challenge altogether.
We are sticking our heads into the sand of reality on Omicron, and the results may be catastrophic.
Omicron is over four times more infectious than Delta. The Pfizer two-shot vaccine offers only 33 percent protection from infection. A Pfizer booster raises protection to about 75 percent, but that protection wanes to around 30 to 40 percent within 10 weeks of the booster.
The much faster transmission and vaccine escape undercut Omicron’s milder overall profile. That’s why hospitals stand a large chance of being overwhelmed in this major Omicron wave, as the Centers for Disease Control and Prevention warned.
Yet despite this very serious threat, real action is lacking. The federal government has tightened international travel guidelines and is promoting boosters. Certainly, it’s crucial to get as many people as possible boosted – and through their initial vaccine doses – as soon as possible. But the government is not taking the steps that would be real game-changers.
Pfizer’s anti-viral drug Paxlovid decreases the risk of hospitalization and death from COVID by 89 percent. Because of this effectiveness, the FDA allowed Pfizer to end its trial early, since it would have been unethical to withhold the drug from people in the control group. Yet the FDA chose not to expedite the approval process when Omicron emerged in late November, only granting emergency authorization in late December, after Omicron had taken over. That delay meant Paxlovid was unavailable during the height of the Omicron wave, since production takes many weeks to ramp up, resulting in an unknown number of unnecessary deaths.
Widely available at-home testing would enable people to test themselves quickly, so that those with mild symptoms can quarantine instead of infecting others. Yet the federal government did not make tests available when Omicron emerged in late November. That’s despite the obviousness of the coming wave, given the precedent of South Africa, the UK, and Denmark, and despite the fact that the government had made vaccines freely available. Its best effort was to mandate that insurance reimburse purchases of test kits – far too high a barrier for most people. By the time Omicron took over, the federal government had recognized its mistake and ordered 500 million tests to be made available in January. That, however, is far too late. The FDA also played a harmful role here: its excessive focus on accuracy, going back to mid-2020, blocked the widespread availability of cheap at-home tests. By contrast, Europe has a much better supply of tests, thanks to its approval of quick, slightly less accurate tests.
Neither do we see meaningful leadership from employers. Some are bringing out the tired old “delay the office reopening” play. For example, Google, Uber, and Ford, along with many others, have delayed the return to the office by several months. Those that have already returned are calling for stricter pandemic measures, such as more masking and social distancing, but are not changing their work arrangements or adding sufficient ventilation to address the spread of COVID.
Despite plenty of warnings from risk management and cognitive bias experts, leaders are repeating the same mistakes we fell into with Delta. And so are regular people. For example, surveys show that Omicron has had very little impact on the willingness of unvaccinated Americans to get a first vaccine dose, or of vaccinated Americans to get a booster. That’s despite Omicron having taken over from Delta in late December.
What explains this puzzling behavior on both the individual and society level? We humans are prone to falling for dangerous judgment errors called cognitive biases. Rooted in wishful thinking and gut reactions, these mental blindspots lead to poor strategic and financial decisions when evaluating choices.
These cognitive biases stem from the more primitive, emotional, and intuitive part of our brains that ensured survival in our ancestral environment. This quick, automatic reaction of our emotions represents the autopilot system of thinking, one of the two systems of thinking in our brains. It makes good decisions most of the time but also regularly makes certain systematic thinking errors, since it’s optimized to help us survive. In modern society, our survival is much less at risk, and our gut is more likely to compel us to focus on the wrong information to make decisions.
One of the biggest challenges relevant to Omicron is the cognitive bias known as the ostrich effect. Named after the myth that ostriches stick their heads into the sand when they fear danger, the ostrich effect refers to people denying negative reality. Delta illustrated the high likelihood of additional dangerous variants, yet we failed to pay attention to and prepare for such a threat.
We want the future to be normal. We’re tired of the pandemic and just want to get back to pre-pandemic times. Thus, we greatly underestimate the probability and impact of major disruptors, like new COVID variants. That cognitive bias is called the normalcy bias.
When we learn one way of functioning in any area, we tend to stick to that way of functioning. You might have heard of this as the hammer-nail syndrome: when you have a hammer, everything looks like a nail. That syndrome is called functional fixedness. This cognitive bias causes those used to their old ways of action to reject any alternatives, including to prepare for a new variant.
Our minds naturally prioritize the present. We want what we want now, and downplay the long-term consequences of our current desires. That fallacious mental pattern is called hyperbolic discounting, where we excessively discount the benefits of orienting toward the future and focus on the present. A clear example is focusing on the short-term perceived gains of trying to return to normal over managing the risks of future variants.
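Hyperbolic discounting has a standard one-parameter form, V = A / (1 + kD): the perceived value V of a payoff A shrinks with delay D, steeply at first and then more slowly, which is why immediate rewards loom so large. A minimal sketch (the discount rate k and the payoff numbers are illustrative, not drawn from this article):

```python
# Hyperbolic discounting: value drops steeply for short delays, then
# flattens -- so near-term payoffs dominate decisions. The discount
# rate k and the example numbers below are purely illustrative.

def hyperbolic_value(amount, delay_weeks, k=0.1):
    """Standard one-parameter hyperbolic discount: V = A / (1 + k*D)."""
    return amount / (1 + k * delay_weeks)

# "Return to normal now" vs. a larger benefit from preparing for
# future variants, realized a year out:
now = hyperbolic_value(100, delay_weeks=0)     # 100.0 -- undiscounted
later = hyperbolic_value(300, delay_weeks=52)  # ~48.4 -- heavily discounted
print(now, later)  # the smaller immediate payoff feels larger
```

With these illustrative numbers, a benefit three times larger feels less than half as valuable simply because it arrives a year later; that is the mechanism behind favoring "normal now" over variant preparedness.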
The way forward into the future is to defeat cognitive biases and avoid denying reality by rethinking our approach to the future.
The FDA requires a serious overhaul. It’s designed for a non-pandemic environment, where the goal is a highly conservative, slow-moving, risk-averse approach, so that the public feels confident trusting whatever it approves. That’s simply unacceptable in a fast-moving pandemic, and we are bound to face more pandemics in the future.
The federal government needs to have cognitive bias experts weigh in on federal policy. Putting all of its eggs in one basket – vaccinations – is not a wise move when we face the risks of a vaccine-escaping variant. Its focus should also be on expediting and prioritizing anti-virals, scaling up cheap rapid testing, and subsidizing high-filtration masks.
For employers, the answer is to replace a dictated, top-down approach to collaboration with a decentralized, team-led one. Each team leader should determine what works best for their team; after all, team leaders tend to know much more about what their teams need. Moreover, they can respond to local emergencies like COVID surges.
At the same time, team leaders need to be trained to integrate best practices for hybrid and remote team leadership. Companies transitioned to telework abruptly as part of the March 2020 lockdowns. They fell into the cognitive bias of functional fixedness and transposed their pre-existing, in-office methods of collaboration on remote work. Zoom happy hours are a clear example: The large majority of employees dislike them, and research shows they are disconnecting, rather than connecting.
Yet supervisors continue to use them, despite much better, proven methods of facilitating collaboration, such as virtual water cooler discussions, virtual coworking, and virtual mentoring. Leaders also need to facilitate innovation in hybrid and remote teams through techniques such as virtual asynchronous brainstorming. Finally, team leaders need to adjust performance evaluation to the needs of hybrid and remote teams.
On an individual level, people built up certain expectations during the first two years of the pandemic, and those expectations don’t apply with Omicron. For example, most people still think a cloth mask offers adequate protection. In reality, you need an N95 mask, since Omicron is so much more infectious. Another example: many people don’t realize that symptom onset is much quicker with Omicron, and they aren’t prepared for the consequences.
Remember that a huge number of people are infected but asymptomatic, often without knowing it, because Omicron is so much milder. About 8 percent of people admitted to hospitals for other reasons in San Francisco test positive for COVID without symptoms, a pattern we can assume holds in other cities. That means many people may think they’re fine when they’re actually infectious. The result is a much higher chance of one person getting many others sick.
During this time of record-breaking cases, you need to be mindful about your internalized assumptions and adjust your risk calculus accordingly. So if you can delay higher-risk activities, January and February might be the time to do it. Prepare for waves of disruptions to continue over time, at least through the end of February.
Of course, you might also choose to not worry about getting infected. If you are vaccinated and boosted, and do not have any additional health risks, you are very unlikely to have a serious illness due to Omicron. You can just take the small risk of a serious illness – which can happen – and go about your daily life. If doing so, watch out for those you care about who do have health concerns, since if you infect them, they might not have a mild case even with Omicron.
In short, instead of trying to turn back the clock to the lost world of January 2020, consider how we might create a competitive advantage in our new future. COVID will never go away: we need to learn to live with it. That means reacting appropriately and thoughtfully to new variants and being intentional about our trade-offs.