In the late 1970s, computer interfaces were mostly text-based. A simple, unwelcoming prompt let you control the computer by typing individual text commands.
If you knew how to program, this new tool was incredibly powerful. But for the vast majority of people who didn't, computers were inaccessible.
Enter the graphical user interface (GUI), which Xerox pioneered at its Palo Alto Research Center (PARC). Suddenly, everyday users could manipulate on-screen elements using a mouse.
We take this innovation for granted, but at the time, it wasn't obvious to everyone that this was the future. To some, it looked like a toy – incapable of doing real work.
As the story goes, Steve Jobs understood the GUI's power the first time he laid eyes on it during a tour of Xerox's facility. The tale is mostly apocryphal, but there's a kernel of truth to it: Jobs, along with many talented engineers and designers at Apple, recognized the GUI as a powerful layer of abstraction. They believed that layer could enable everyday people to leverage the power of a computer.
Though Xerox PARC pioneered the computer interface that billions of people would one day use, Apple and Microsoft ultimately brought it to the masses.
So who gets credit? Certainly Xerox deserves historical recognition. But an invention is useless if it can't be put in the hands of the people who need it.
This tension between innovation and distribution is also present in vaccine development. Creating an effective mRNA vaccine, for example, is a modern miracle.
Just as challenging, however, is getting shots into billions of people's arms in every corner of the globe. It's a herculean effort and a staggering logistical operation spanning supply chains, transportation, public messaging, IT, and more.
Ideas and inventions are priceless. But only if they can be scaled to effect change in the world.
Hat tip to Steven Johnson for the vaccines example.