Perceived hardware innovations come from end applications, not semiconductors

During my 13+ years in the semiconductor industry, I was always working on at least two to three projects simultaneously, each in a different stage of development. The development cycle per project shortened from 2.5 years in the early 2000s to a bit more than a year before I left the industry in 2013. Every subsequent project performed better, was more power-efficient and was, most of the time, cheaper than the previous one.

Most people refer to this as Moore's Law, which was never a scientific law but rather an observation-induced business imperative. I see it more as an outdated continuation of Andy Grove's mantra "Only the Paranoid Survive".

One would have thought that with the tremendous and relentless improvement in processing and communication brought by all these semiconductor chips, we human beings would be enjoying a similar growth in productivity.

The reality couldn't be further from that expectation, however you cut it.

Below is a graph charting Intel's per-CPU transistor count against the per-hour labor output for non-farm businesses in the USA since 1971, the year Intel rolled out the ground-breaking 4-bit 4004 CPU with 2,300 transistors.

The per-hour labor output represents productivity, the fundamental driver of growth in real economic welfare. It has been normalized in this chart to share the same starting point as the transistor count. Note that the Y-axis is in log scale, a natural consequence of plotting anything semiconductor-related over time.

The increasing gap between per-CPU transistor count and labor productivity

As can be seen, unlike the legendary exponential growth in per-CPU transistor count, real-term productivity has merely doubled over the past 40 years, leaving it as a more or less flat curve on a log-Y chart.
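For readers curious about how such a comparison is put together, here is a minimal Python/matplotlib sketch. The transistor counts are approximate public figures for a few Intel CPUs, and the productivity index is an illustrative series that simply doubles over the period, as described above; neither is the exact data behind the original chart.

import matplotlib.pyplot as plt

# Approximate per-CPU transistor counts for a few Intel generations (illustrative).
years = [1971, 1982, 1993, 2000, 2006, 2011]
transistors = [2_300, 134_000, 3_100_000, 42_000_000, 291_000_000, 1_160_000_000]

# Illustrative productivity index (output per hour, 1971 = 100); it roughly
# doubles over the period, which is the point the chart makes.
productivity_index = [100, 118, 135, 160, 185, 200]

# Normalize productivity so both series share the same 1971 starting point.
scale = transistors[0] / productivity_index[0]
productivity_normalized = [p * scale for p in productivity_index]

plt.plot(years, transistors, marker="o", label="Transistors per CPU")
plt.plot(years, productivity_normalized, marker="s", label="Labor productivity (normalized)")
plt.yscale("log")  # on a log scale, exponential growth shows as a straight line
plt.xlabel("Year")
plt.ylabel("Count / normalized index (log scale)")
plt.legend()
plt.show()

With the Y-axis in log scale, the exponential transistor curve climbs steadily while the merely-doubled productivity series stays essentially flat, which is exactly the widening gap the chart illustrates.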

Now a semiconductor veteran could argue that transistor count doesn't equal processing power. An economics-trained person would further argue that many things are simply not captured by traditional labor-output accounting.

All plausible. Still, the gap is simply too huge to be explained by such technicalities, given that semiconductor-driven technology seems to be everywhere in our lives today.

As a long-time semiconductor practitioner, I would like to propose another hypothesis that could help explain this gap as well as the emergence of this new generation of hardware innovation. I think the semiconductor industry has been chasing Moore's Law for too long without enough reflection. As a result, end applications fail to fully exploit the potential of each new generation of chips.

Truly innovative end applications always entail a much longer life cycle, simply because customer adoption takes time and always lags far behind the semiconductor industry's inherent roadmap. Even the iPhone took a couple of years and several generations to conquer the mass market, despite its apparent superiority as a device even in its crappy first-generation form.

The nature of "human beings" slowly getting used to a brand new "application" implies that productivity would always lag behind the semiconductor progress. And the gap just gets bigger as semiconductor relentlessly rolls ahead while applications scramble to appease users frustrated by the latest UI change.

If this is true, it means there is a lot of potential trapped in existing, off-the-shelf semiconductor chips, waiting to be unlocked by creative end-application developers whose products do not hinge on specifications.

This is in fact what the new generation of hardware entrepreneurs around the world are doing today.

They source available semiconductor chips and/or functional modules that might not be the latest or the most powerful, but whose capabilities have not been fully utilized by previous clients. They then leverage these chips to build awe-inspiring applications such as Keecker and Giroptic.

Semiconductor chips become a silent partner in this hardware revolution.

As a result, the perceived hardware innovation today comes mostly from end applications, unlike the early days when the "Intel Inside" sticker meant something to a PC buyer.

The forever frontier-pushing semiconductor R&D teams might scorn the rather marketing-flavored notion of "perceived". That wouldn't change the fact that we might be standing at the beginning of a 10-year golden period where truly differentiating and satisfying hi-tech products could come from anywhere in the world, finally enabling consumers to fully enjoy the cumulative progress of the semiconductor industry.

Someday end applications will finally catch up with semiconductor progress, which is itself slowing down along a dangerous path. Then we will start to worry about an overall stoppage of innovation. Until that day, however, I believe the ones best positioned to claim the biggest share of the value chain will be this new generation of nimble and creative hardware entrepreneurs.

And all these thoughts come from someone who spent 13+ years in the semiconductor industry. Go figure!
