The U.S. has approximately 58 percent of the market share in the biotech sector, followed by China with 11 percent. However, this market share is the result of years of prior research and development (R&D) – it is a present-day picture of past investment. In the future, that share will decline unless the federal government invests to improve the quality and quantity of U.S. biotech research.
The effectiveness of current R&D can be evaluated in a variety of ways, such as the money invested and the number of patents filed. According to the UNESCO Institute for Statistics, the U.S. spends approximately 2.7 percent of GDP on R&D (about $476.5 billion), whereas China spends 2 percent (about $346.3 billion). However, investment levels do not necessarily translate into goods that end up contributing to innovation.
Patents are a better indication of innovation. The biotech industry relies on patents to protect its investments, making patenting a key tool in translating scientific discoveries into products that can ultimately benefit patients. In 2020, China filed 1,497,159 patents, a 6.9 percent increase over the previous year. In contrast, the U.S. filed 597,172, a 3.9 percent decline. When it comes to patents filed, China holds approximately 45 percent of the world share, compared with 18 percent for the U.S.
So how did we get here? The nature of science in academia allows scientists to specialize by dedicating several years to advance discovery research and develop new inventions that can then be licensed by biotech companies. This makes academic science critical to innovation in the U.S. and abroad.
Academic scientists rely on government and foundation grants to pay for R&D, which includes salaries for faculty, investigators and trainees, as well as money for infrastructure, support personnel and research supplies. Of particular interest to academic scientists seeking to cover these costs are Research Project Grants, also known as R01 grants, the oldest grant mechanism at the National Institutes of Health. Unfortunately, this funding mechanism is extremely competitive: applications have a success rate of only about 20 percent. To maximize their chances of getting funded, investigators tend to limit the innovation of their applications, since a project that seems overambitious is discouraged by grant reviewers.
This approach affects the future success of the U.S. R&D enterprise. Pursuing less innovative work tends to produce scientific results that are more obvious than groundbreaking, and when a discovery is obvious, it cannot be patented, resulting in fewer inventions that go on to benefit patients. Although there are government funding options for academic scientists focused on more groundbreaking and translational projects, those options are less coveted by scientists trying to obtain tenure and the long-term funding needed to cover salaries and other laboratory expenses. And since only a small percentage of projects gets funded, the number of scientists interested in pursuing academic science, or research in general, keeps declining over time.
Efforts to raise the number of individuals who pursue a scientific education are paying off. However, the number of job openings for those trainees to carry out independent scientific research once they graduate has proved harder to increase. These limitations are not just in the number of faculty openings in academic science, which are in part related to grant funding, but also in the low salaries available to pay those scientists after they obtain their doctoral degrees, which range from $53,000 to $65,000, depending on years of experience.
Thus, considering the difficulty in obtaining funding, the limited number of opportunities for scientists to become independent investigators capable of leading their own scientific projects, and the salaries available to pay for scientists with a doctoral degree, it is not surprising that the U.S. is progressively losing its workforce for innovation, which results in fewer patents filed.
Perhaps instead of encouraging scientists to propose less innovative projects in order to increase their chances of getting grants, the U.S. government should give serious consideration to funding investigators based on their potential for success, or on the success they have already achieved in contributing to the advancement of science. Such a funding approach should be tiered by career stage or years of experience, considering that 42 is the median age at which the first R01 is obtained. This suggests that after finishing their training, scientists spend about 10 years establishing themselves as independent academic investigators with the funds to train the next generation of scientists who will help the U.S. maintain or even expand its market share in the biotech industry for years to come. Patenting should also be given more weight in academic promotion decisions, or governmental investment in research funding should be increased to support more than just 20 percent of projects.
Remaining at the forefront of biotech innovation will not just generate more jobs; it will also allow us to attract the brightest scientists from all over the world. This talented workforce will go on to train future U.S. scientists and improve our standard of living by producing the next generation of therapies intended to improve human health.
This problem has no single solution, but one thing is certain: unless there are more creative changes in how academic scientists are funded, we may eventually find ourselves asking, “Remember when the U.S. was at the forefront of biotech innovation?”
One of the Netherlands’ most famous pieces of pop culture is “Soldier of Orange.” It’s the title of the country’s most celebrated war memoir, movie and epic stage musical, all of which detail the exploits of the nation’s resistance fighters during World War II.
Willem Johan Kolff was a member of the Dutch resistance, but he doesn’t rate a mention in the “Soldier of Orange” canon. Yet his wartime toils in a rural backwater changed not only medicine, but the world.
Kolff had been a physician less than two years before Germany invaded the Netherlands in May 1940. He had been engaged in post-graduate studies at the University of Groningen but withdrew because he refused to accommodate the demands of the Nazi occupiers. Kolff’s Jewish supervisor made an even starker choice: He committed suicide.
After his departure from the university, Kolff took a job managing a small hospital in Kampen. Located 50 miles from the heavily populated coastal region, the facility was far enough from the prying eyes of the Germans that Kolff could not only care for patients but also hide fellow resistance fighters and even Jewish refugees in relative safety. Kolff coached many of them to feign convincing terminal illnesses so the Nazis would allow them to remain in the hospital.
Despite the demands of practicing medicine and resistance work, Kolff still found time to conduct research. He had been haunted and inspired when, not long before the Nazi invasion, one of his patients died in agony from kidney disease. Kolff wanted to find a way to save future patients.
He broke his problem down to a simple task: If he could remove 20 grams of urea from a patient’s blood in 24 hours, they would survive. He began experimenting with ways to filter blood and return it to a patient’s body. Since the war had brought all non-military manufacturing to a halt, he was mostly forced to make do with materials he could find at the hospital and around Kampen. Kolff eventually built a device from washing machine parts, juice cans, sausage casings, a valve from an old Ford automobile radiator, and even scrap from a downed German aircraft.
The world’s first dialysis machine was hardly imposing; it resembled a rotating drum for a bingo game or raffle. Yet it carried on the highly sophisticated task of moving a patient’s blood through a semi-permeable membrane (about a 50-foot length of sausage casings) into a saline solution that drew out urea while leaving the blood cells untouched.
Kolff began using the machine to treat patients in 1943, most of whom had lapsed into comas due to their kidney failure. But like most groundbreaking medical devices, it was not an immediate success. By the end of the war, Kolff had dialyzed more than a dozen patients, but all had died. He briefly suspended use of the device after the Allied invasion of Europe, but he continued to refine its operation and the administration of blood thinners to patients.
In September 1945, Kolff dialyzed another comatose patient, 67-year-old Sofia Maria Schafstadt. She regained consciousness after 11 hours, and would live well into the 1950s with Kolff’s assistance. Yet this triumph contained a dark irony: At the time of her treatment, Schafstadt had been imprisoned for collaborating with the Germans.
With a tattered Europe struggling to overcome the destruction of the war, Kolff and his family emigrated to the U.S. in 1950, where he began working for the Cleveland Clinic while undergoing the naturalization process so he could practice medicine in the U.S. His intent was twofold: Advocate for a wider adoption of dialysis, and work on new projects. He wildly succeeded at both.
By the mid-1950s, dialysis machines had become reliable and life-saving medical devices, and Kolff had become a U.S. citizen. About that time he invented a membrane oxygenator that could be used in heart bypass surgeries. This was a critical component of the heart-lung machine, which would make heart transplants possible and bypass surgeries routine. He also invented one of the very first practical artificial hearts, which in 1957 kept a dog alive for 90 minutes.
Kolff moved to the University of Utah in 1967 to become director of its Institute for Biomedical Engineering. It was a promising time for such a move, as the first successful transplant of a donor heart to a human occurred that year. But he was interested in going a step further and creating an artificial heart for human use.
It took more than a decade of tinkering and research, but in 1982, a team of physicians and engineers led by Kolff succeeded in implanting the first artificial heart in dentist Barney Clark, whose failing health disqualified him from a heart transplant. Although Clark died in March 1983 after 112 days tethered to the device, the fact that it kept him alive generated international headlines. While graduate student Robert Jarvik received the named credit for the heart, he was directly supervised by Kolff, whose various endeavors in artificial organ research at the University of Utah were divided among numerous teams.
Forty years later, several artificial hearts have been approved for use by the Food and Drug Administration, although all serve as a “bridge” that allows patients to survive while they wait for a transplant.
Kolff continued researching and tinkering with biomedical devices – including artificial eyes and ears – until he retired in 1997 at the age of 86. When he died in 2009, the medical community acknowledged that he was not only a pioneer in biotechnology, but the “father” of artificial organs.
For millions of people with macular degeneration, treatment options are slim. The disease causes loss of central vision, the part of sight that allows us to see straight ahead, and is highly age-dependent, with people over 75 facing an approximately 30% risk of developing the disorder. The BrightFocus Foundation estimates 11 million people in the U.S. currently have one of three forms of the disease.
Recently, ophthalmologists including Daniel Palanker at Stanford University published research showing advances in the PRIMA retinal implant, which could help people with advanced, age-related macular degeneration regain some of their sight. In a feasibility study, five patients had a pixelated chip implanted behind the retina, and three were able to see using their remaining peripheral vision and—thanks to the implant—their partially restored central vision at the same time.
Should people with macular degeneration be excited about these results?
“Every week, if not every day, patients come to me with this question because it's devastating when they lose their central vision,” says retinal surgeon Lynn Huang. About 40% of her patients have macular degeneration. Huang tells them that these implants, along with new medications and stem cell therapies, could be useful in the coming years.
That implant, a pixelated chip, works together with a tiny video camera on a specially designed pair of eyeglasses, which can be adjusted for each patient’s prescription. The video camera relays processed images to the chip, which electrically stimulates inner retinal neurons. These neurons, in turn, relay information to the brain’s visual cortex through the optic nerve. The chip restores patients’ central sight, but not completely. The artificial vision is basically monochromatic (whitish-yellowish) and fairly blurry; patients were still legally blind even after the implant, except when using a zoom function on the camera, but those with proper chip placement could make out large letters.
“The goal here is to replace the missing photoreceptors with photovoltaic pixels, basically like little solar panels,” Palanker says. These pixels, located on the implanted chip, convert light into pulsed electrical currents that stimulate retinal neurons. In time, Palanker hopes to improve the chips, resulting in bigger boosts to visual acuity.
The pixelated chips are surgically implanted in a process that Palanker admits still involves “a surgical learning curve.” In the study, three chips were implanted correctly, one was placed incorrectly, and another patient’s chip moved after the procedure because he did not follow post-surgical recommendations. One patient passed away during the study for unrelated reasons.
University of Maryland retinal specialist Kenneth Taubenslag, who was not involved in the study, said that subretinal surgeries have become less common in recent years, but expects implants to spur improvements in these techniques. “I think as people get more experience, [they’ll] probably get more reliable placement of the implant,” he said, pointing out that even the patient with the misplaced chip was able to gain some light perception, if not the same visual acuity as other patients.
Retinal implants have come under scrutiny lately. IEEE Spectrum reported that Second Sight, manufacturer of the Argus II implant used for people with retinitis pigmentosa, a genetic disease that causes vision loss, would no longer support the product. After selling hundreds of the implants at $150,000 apiece, company leaders announced in March 2020 that they’d “decided to pursue an orderly wind down” of Second Sight in the wake of financial issues. Last month, the company announced a merger, shifting its focus to a new retinal implant and raising questions for patients who have Argus II implants.
Retinal surgeon Eugene de Juan of the University of California, San Francisco, was involved with early studies of the Argus implants, though his participation ended over a decade ago, before the device was marketed by Second Sight. He says he would consider recommending future implants to patients with macular degeneration, given the promise of the technology and the lack of other alternatives.
“I tell my patients that this is an area of active research and development, and it's getting better and better, so let's not give up hope,” de Juan says. He believes cautious optimism for Palanker’s implant is appropriate: “It's not the first, it's not the only, but it's a good approach with a good team.”