Nvidia, Groq, HBM & What's Ahead for Semis (Doug O'Laughlin, Founder of Fabricated Knowledge)

The most important trend powering AI is semiconductors, and that trend is not slowing down anytime soon. We need to understand what is happening at the semiconductor level to project forward what we think will happen in AI. This episode is the best way to learn what’s ahead for semis.

Doug O’Laughlin is the first two-time guest on Software Snack Bites! He is the Founder of Fabricated Knowledge, one of the fastest-growing and highest-grossing newsletters out there. The reason is that each of his write-ups is packed with insights that even a layperson can understand. We recorded this podcast right before Nvidia GTC, so you’ll have to read Fabricated Knowledge to get Doug’s takes on the new announcements. In this episode we talk about inference-specific chips (Groq & Cerebras), why HBM (high bandwidth memory) is so important to the future semi story, Nvidia’s current dominant strategy and why it is well positioned to keep winning, where ASICs and Broadcom fit in, and some other trends outside of semis that Doug is watching.

If you had listened to Doug’s episode last year, you would be up 250% on Nvidia. If you had read his writing, you would be up even more. And if you had read his writing on Intel, you would have had a 65% gain in less than 6 months.

Of course, this is not investment advice (do your own research), but Doug’s knowledge will at least help you figure out where to start your diligence. Below are links to the terminology discussed and to Doug’s free articles on several of these topics. The best way to support him is to subscribe to Fabricated Knowledge.

Where to Find Doug:

Where to Find Shomik:

In this episode, we cover:

(02:06) – Why GPUs & CUDA are So Important to LLMs

(03:58) – Why GPUs Are More Specialized Than We Think

(06:47) – The Incentive for AI Teams to Stay with Nvidia

(08:07) – Why the Market is Excited About Groq

(09:17) – SRAM not Scaling and Why That Matters

(12:44) – Why HBM is So Important to Scaling

(15:05) – Why Massive Chips with SRAM Have Constraints (Cerebras, Groq)

(17:55) – The Challenge with Inference Specific Chips (Cerebras, Groq)

(19:27) – What is InfiniBand and How Nvidia is Embracing Ethernet

(22:18) – Why Google is the Best Bet to Take on Nvidia

(24:00) – AMD’s Challenges vs Nvidia’s Tick-Tock Cadence

(25:37) – Nvidia’s Strategy Around GPU Clouds

(27:44) – Why TSMC Isn’t As Big of a Bottleneck as You Might Think

(31:14) – 14 Nanometer Chips vs Leading Edge as a Strategy

(33:00) – VLIW Challenges in Working with Various Models (Cerebras, Groq)

(36:57) – Why SemiCap Companies May Not Be Overdone in Markets

(42:35) – Broadcom’s Place in This World Given ASICs

(47:15) – What is Still Underhyped in Semis

(49:55) – Doug’s Call On Intel Outperforming in 2023 (Backside Power Delivery)

(53:10) – Why Yield is the Challenge in Building New Chips

(53:50) – Calling Our Shot on Google in March 2024

(55:34) – Other Trends That Doug is Watching Outside of Semis

Show Notes:

Datacenter is the New Compute Unit (Why Nvidia Will Continue to Dominate)

Why HBM is the Hottest Thing in Memory (and Semis)

Detailed Nvidia GTC Analysis

Is This The Intel Inflection (June 2023)

Terminology:

Backside Power Delivery

Hybrid Bonding

High Bandwidth Memory

Static Random Access Memory

Very Long Instruction Word Architecture

Nvidia Spectrum-X Ethernet Networking

Software Snack Bites
A podcast series exploring all things enterprise software. We’ll interview experts from all areas of company building, explore the enterprise buyer perspective and investor viewpoints, and deep dive into specific technology trends.