Generative AI Might Be Slamming Right Into A Resource Wall

Throwing more data, compute, and energy at the problem may run its course. What’s next?

Alex Kantrowitz
Apr 26, 2024


Each new generation of large language models consumes a staggering amount of resources.

Meta, for instance, trained its new Llama 3 models with about 10 times more data and 100 times more compute than Llama 2. Amid a chip shortage, it used two 24,000-GPU clusters, with each chip running around the price of a luxury car. It employed so much data in its AI…
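To get a rough feel for that scale, here is a back-of-envelope sketch of the hardware bill implied by those cluster numbers. The ~$30,000-per-GPU figure is an assumption standing in for "the price of a luxury car," and the total ignores networking, power, and facilities:

```python
# Back-of-envelope cost of the Llama 3 training hardware described above.
# Assumption (not from the article): ~$30,000 per GPU, a stand-in for
# "the price of a luxury car"; real pricing varies widely.
clusters = 2
gpus_per_cluster = 24_000
price_per_gpu_usd = 30_000  # assumed

total_gpus = clusters * gpus_per_cluster
hardware_cost_usd = total_gpus * price_per_gpu_usd
print(f"{total_gpus:,} GPUs ≈ ${hardware_cost_usd / 1e9:.1f}B in chips alone")
# 48,000 GPUs ≈ $1.4B in chips alone
```

Even under conservative assumptions, the chips alone land north of a billion dollars, before counting the data and energy costs the article goes on to discuss.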

This post is for paid subscribers
