Yet universities struggle to convert their research overhead into revenue centers. (Hint: it's not tech transfer)
Not long ago, I wrote about The Benefits of a Distributed Network for Pandemic Monitoring. In part one of this article, I extend that thinking to university research services.
My fundamental thesis is that every life ever saved started with some form of basic research. However, we often fail to consciously recognize this in our collective frontal cortex. Let me repeat that: every disease cured and every life saved can be traced back to basic research discoveries. From the advent of sanitary medical practice and clean water to the discovery of penicillin and advances in newborn screening, these breakthroughs, accidental or not, all began with basic discoveries. Sometimes the process was slow and arduous, but the inventors persevered until their dreams were realized. A recent example is the development of the COVID vaccine in just "9 months," which was built on 20 years of research that had previously gone largely unnoticed.
“Every Life Ever Saved Started With Basic Research”
The question I pose is: why focus on building the house when your foundation is in quicksand? Let me explain. I fully agree that for the last 50 years, the focus on new research tools and assay development has created immense value in the research, diagnostic, and therapeutic markets. However, this somewhat myopic approach risks a future where waves of new scientific technologies come and go like tides on the beach. I challenge the conventional thinking that basic discoveries come from building faster scientific instruments, from more accurate or sensitive assays, or from sleeker software that runs them or analyzes the data coming off of them. All of these help, but they miss the fundamental problem.
“The fundamental challenge we face in science is not the technologies we can build, but the infrastructure on which they are built.”
To better understand this point, let's look back at history. In 1956, Eisenhower signed the Federal-Aid Highway Act, which funded the interstate highway system in the US; at the time, he expected the highways would be used to transport military equipment. Then in 1960, the CDC 1604, an early supercomputer designed by Seymour Cray at Control Data Corporation, went into service at institutions that could afford to buy and house such machines. Originally, these computers were built to handle what we would today consider basic computational functions; now that computing power fits in our smartphones. They were essentially on digital islands. Few people knew they existed. There was no Internet, no WiFi, no cloud computing, and no search engines to find or index these assets. The challenge of moving data between these behemoths eventually led to ARPANET (the Advanced Research Projects Agency Network), which grew into the Internet as we know it today, though it was originally intended for defense applications. It was the Internet that unleashed the power of computing, not the other way around. So I ask: can we, as the scientific community, expect to keep driving discoveries without our own version of the Internet?
“The creation of the fundamental infrastructure components required to make resources in life sciences accessible has yet to happen”
Consider this: the creation of the fundamental infrastructure components required to make resources in life sciences accessible has yet to happen. Much as the supercomputers of decades past sat on digital islands, all of our biggest research assets today sit in silos I euphemistically call university research labs. Just like those early supercomputers, today's scientific resources and services are not interconnected in any public way. Our thought leaders in science simply don't know where all of the tools we need to do our work are. Most Vice Presidents of Research or CFOs at a major university would struggle to answer the following questions: How many shared resource facilities do you have on campus? How many instruments and services do they offer? What is the annual revenue run rate on those? Our struggle to answer such fundamental questions points to the main challenge we face in science: how do we create transparency and availability across the wide range of scientific equipment and services we need? Research universities have an opportunity to change this by moving their products and services in front of their own firewalls.
“We can’t continue to rely on tribal knowledge to find the very tools that drive life-saving discovery.”
We can’t continue to rely on tribal knowledge to find the tools we need. We need to move faster, the way the tech sector does. We have to stop and build the infrastructure that lets us access the tools we need as seamlessly as CPU time in the cloud. Why can’t we access multi-million dollar scientific equipment on demand, the way we book a home or buy something on Amazon? I’ve already heard what the skeptics say… “But it’s science…it’s more complex than consumer products and services.” Skeptics rarely disrupt industries.
In the next 10 years, science will no longer be confined by the ownership of equipment or live on the digital islands I euphemistically refer to as University Core Facilities. Why can’t a scientist at a biotech see every instrument and service a research university offers and purchase access to them? Why can’t any scientist? Universities already understand the power of moving educational assets onto online platforms, which bring in new revenue without compromising their independence or academic integrity. That transformation has already happened in online education. Why not with research services?