Historically, the U.S. Government has played a major role in advancing technological innovations, but times are changing. Today, we analyze its role in scaling vital innovative projects.
Since its inception in the 18th century, the U.S. Government has played a central role in advancing disruptive innovations — from agricultural practices to automation, from defining new tools to developing new schools of thought.
However, to understand its role in unleashing innovation, we must first explore how its approach to investing in Research & Development (R&D) — the bedrock of technological advances — has evolved throughout history.
According to the Bureau of the Fiscal Service’s Data Lab (whose mission is to promote transparency of Government finances by providing engaging and informative data-driven analyses of federal spending data), research and development activities are part of a process to help us find solutions to problems using science, engineering, and technology. The three subcategories of R&D are basic research, applied research, and experimental development.
The Congressional Joint Economic Committee (JEC, one of the four joint committees of Congress, responsible for reporting on the economic condition of the U.S. and suggesting improvements) defines basic research as that which contributes to our fundamental stock of knowledge but lacks specific applications. Applied research, on the other hand, involves understanding how technology addresses a specific need or application.
In other words, basic research attempts to answer questions such as, “How did the universe begin?” or “What are the causative factors of cancer?” while applied research involves more actionable questions like, “How can cybersecurity be improved to prevent election fraud?” or “What are ways to make car tires last longer?”
Data from the U.S. R&D Funding and Performance Fact Sheet shows that the U.S. Government became the global R&D leader in the 20th century, funding up to 69% of global R&D initiatives post-World War II — research that covered everything from vaccines to electronics, to mention a few.
The Government’s share of R&D funding began to decline in the early years of the 21st century. Data shows a steep drop, from 69% in the mid-20th century to 21.9% by 2018. Meanwhile, private funding for R&D has increased each year since 2010, reaching an all-time high of 69.7% in 2018.
Much is lost when we look at data in aggregate, so we dove deeper into the numbers and found some important nuances. The first category of R&D — basic research — is perpetually underfunded by private industry precisely because it tends to be conducted with no specific commercial applications in mind. Businesses, for obvious reasons, tend to concentrate their R&D spending on developing products and processes with direct and immediate commercial value.
For private firms, the early stages of R&D are risky and capital-intensive. This creates organizational constraints for tech entrepreneurs, who are forced to focus on low-hanging fruit and viable or proven innovations. Lack of funding for startups developing truly disruptive technologies remains one of the main reasons why 90% of tech startups fail today.
Surprisingly, the U.S. Government, which funded early R&D projects without knowing whether they would yield any results, played a decisive role in enabling the comprehensive research behind some of the biggest breakthroughs in history (see: the internet, Google’s search engine). Silicon Valley’s success story, starting with the development and funding of the semiconductor industry, is a testament to this (more on this in a moment).
If publicly-funded R&D had performed poorly in the past, we wouldn’t question the motives behind its dramatic decline. However, publicly-funded R&D has proven to be directly responsible for major advances in technology, the global economy, and global welfare.
Federally-Funded Innovative Breakthroughs
We can trace the U.S. Federal Government’s support of innovation back to 1797, when the first U.S. armory opened in Springfield, Massachusetts. Notably, the U.S. Army was then both the sole producer and the sole customer in that market.
By the late 1930s, during the nascent years of modern computing, defense contracts from the Navy and the Army provided the only market for R&D focused on hardware and software as we know them today. The Department of Defense even supported research on semiconductors and subsidized production facilities for private industry. That industry grew into what is known today as Silicon Valley, the epicenter of global innovation, entrepreneurship, and venture capital investment.
In 1958, the newly created Advanced Research Projects Agency (now DARPA) began investing massive amounts of public funds in response to the launch of the USSR’s Sputnik — the world’s first artificial satellite. This funding kicked off the chapter of the Cold War that came to be known as the “Space Race.” The best-known invention to come out of DARPA is the Internet. Still, many other equally important technologies, like GPS, onion routing, and Siri (that’s right, Apple’s Siri!), came through the agency’s heavy investment in early, high-risk, non-market-based research.
Fast-forwarding to today, findings from the Knowledge Portal of Innovation and Access to Medicine show that the U.S. Government was one of the two largest investors, along with Germany, in vaccine R&D, with public funding representing 90% of the $6.6 billion tracked globally.
This does not include an additional public investment of over $2.3 billion to scale Covid-19 vaccine efforts, which injected (pun intended!) seed funds during Q1 of 2020 to create a more flexible regulatory path for the Food and Drug Administration (FDA) to approve vaccines. This early investment of public funds resulted in the first fully approved Covid-19 vaccine (Pfizer-BioNTech) and the vaccination of over 55% of the country’s population as of October 2021.
The more one digs into the data, the more evident it becomes how the role of the Government in R&D investment simply cannot be replaced by the private sector. Only the former has the muscle and leverage to mobilize enormous sums of capital so boldly.
Is The U.S. Government Still Leading Innovation Today?
So we arrive back at our original question.
During the Covid-19 pandemic, restricted mobility and increased dependence on digital technologies for providing and accessing basic public services exposed how existing Government infrastructure and operations failed to guarantee one of the basic principles of public services — continuity. Without that principle upheld, millions experienced incomplete, reduced, and sometimes impossible access to basic services, such as healthcare, education, social programs, justice, and more.
In fact, a report from Insight Public Sector found that only 18% of IT professionals in the public sector felt their organization was “extremely prepared” to handle the IT changes that resulted from the pandemic. Furthermore, nearly two-thirds (61%) of public sector agencies experienced one to four weeks of downtime while transitioning to remote work, and for 30% of them the downtime lasted three to four weeks.
Given this history, the U.S. Government’s starring role in the creation of new technologies, and its recent leadership in combating a global pandemic, it would seem well-positioned to embrace, adapt, and innovate tools to modernize governmental operations, processes, and decision making. Unfortunately, this is hardly the case.
Government agencies across the U.S. continue to run on shockingly outdated technology. By holding onto legacy systems for their internal processes, agencies put the public workforce at a great disadvantage: workers face the nearly impossible mission of doing their jobs while responding to people’s needs, making critical decisions in real time, understanding complex regulations, ensuring their activities comply with those regulations, and confronting new challenges, all at the same time.
No doubt, it is difficult for public officers to act on these critical activities, and obsolete tools inhibit them further. Because basic governmental IT infrastructure has been chronically underfunded, our public officers and workers tend to be unaware of the shoulders they stand on — those of pioneers and moonshot-makers, not the risk-averse figures that popular narratives would have us believe.
Today's U.S. Government agencies might not be funding and embracing innovation as they once were, but with a new wave of stimulus funds flowing from Washington, D.C., and backed by the American Rescue Plan Act (ARPA), State and Local government agencies have a once-in-a-lifetime opportunity to invest in the technology infrastructure needed to become high-performing entities that empower the public workforce to deliver outstanding public services to their communities.
Three plug-and-play tools that could be adopted today using ARPA funds, and that can dramatically help State and Local government agencies in their innovation and digital transformation journeys, are:
We started with a big question (“Is the U.S. Government still leading innovation today?”), and the simple answer is: no, not like before at least. But we believe a better answer is: No, but that’s about to change, thanks to the pressing needs and renewed can-do spirit emerging from battling the Covid-19 pandemic. Time is of the essence, and with forward-thinking public officials and elected leaders who are willing to decisively seize the moment, there is a new chance for the U.S. Government to resume its role as the leader in global innovation.
1. Data Lab. (2020). Research & Development in Contract Funding | U.S. Treasury Data Lab. https://datalab.usaspending.gov/rd-in-contracting/