Why, and how, we’ll all be making our own computer chips in the future… (Part 2)

If you read the tech news, it can’t have slipped your attention that significant changes are afoot in the computer chip industry. New chief at Intel; Apple making their own chips for laptops and phones; big mergers — ARM, Nvidia, Xilinx, Qualcomm; the inexorable rise of RISC-V…

This is a multi-part blog about what’s going on and where these changes will take us. In the first part, I discuss the macroeconomics driving the change in the semiconductor ecosystem. Here I cover what effect this is having on the industry.

You also can’t have failed to notice that Intel, the once-majestic silverback of Silicon Valley, is gradually becoming a shadow of its former self. Intel’s problem now stems from something that used to be a strength: for a long time it has relied on a single major design to serve all of its market segments.

This approach worked very well while a new manufacturing process gave Intel a performance boost every couple of years, differentiating its newer products from its older ones. Once each new process stopped delivering such a boost, the approach made less and less sense.

The market has realised that to get the necessary performance, chips need to be tailor-made for how they will be used. A perfect example is Amazon with its ‘Graviton’ processors.

Amazon builds data centres at an astounding rate. Its main costs are the servers that go into them and the electricity needed to power and cool them. Amazon had already driven down the cost of servers by having its own server designs built by Far East manufacturers, which meant the Intel processors inside them were the dominant cost on both fronts.

To solve this issue, in 2015 Amazon bought an Israeli chip startup called Annapurna Labs. The first thing Amazon did was use Annapurna’s chips in all of its new custom network cards. Then it put the team to work in secret, designing a single chip that would become its entire new server platform, replacing Intel.

By designing their own server chip, they could optimise it specifically for their major revenue-generating workloads, resulting in big wins on both *operating* and *capital* costs.

This isn’t an isolated story. The same drivers and outcomes can be seen at Google, Microsoft, Western Digital, Seagate, Sony, Facebook, Baidu, Huawei, Alibaba …

In the next part of this series, I will cover the implications of this transition for the tech industry as a whole.




Rob is a serial entrepreneur in Open Source and deep tech. He is applying the learning from the world of Open Source software to hardware design.
