Here’s Why It Takes So Long to Move From Concept to Commercial Success


The computer mouse was invented in 1963, demonstrated in 1968, shown off in a lab in 1973, introduced on a personal computer in 1984, and finally widely adopted in the early 90s. That's three decades.

That might seem like a long time, but as computer scientist Bill Buxton has argued, thirty years is actually a typical amount of time for a breakthrough computing invention to go from the first laboratory prototype to commercial ubiquity.

The first packet-switched network, the ARPANET, was launched in 1969. It took about 30 years, until the turn of the millennium, for Internet access to be widely adopted by American consumers.

…Why does it take so long? In all of these cases, it took a decade or longer for the new techniques to spread and mature inside the research community…. Once a computing concept has been refined in the laboratory, it can take another decade to turn it into a viable commercial product.

…This 30-year rule of thumb can help to form an educated guess about when future innovations will reach the mass market. For example, the first car capable of driving itself long distances was created in 2005, and the technology has been maturing in academic and corporate labs over the last eight years. If self-driving technology follows the same trajectory as previous computing innovations, commercial self-driving cars will be introduced sometime in the 2020s, and the technology will become widely adopted in the 2030s.

That’s Tim Lee, and I’d add one more thing: a lot of these inventions depend on computing power. A mouse isn’t very useful without a graphical user interface, and you can’t run a useful GUI on a Z80. You can do it—barely, with a small black-and-white display—on a Motorola 68000. And then, finally, you can do it at reasonable cost with a decent display on the microprocessors of the late 80s and early 90s.

Driverless cars are following the same arc. Obviously software is a huge issue too, but sufficient computing power at a reasonable price is a bare minimum. We’re still a decade or so away from that.
