Nvidia's H100 AI GPUs cost up to four times more than AMD's competing MI300X — AMD's chips cost $10 to $15K apiece; Nvidia's H100 has peaked beyond $40,000: Report (2024)

According to Citi's price projections for AMD's MI300 AI accelerators, Nvidia currently charges up to four times more for its competing H100 GPUs, highlighting its incredible pricing power as a shortage of H100 GPUs continues. We contacted AMD to confirm Citi's pricing projection, but an AMD representative told Tom's Hardware the company doesn't share that pricing publicly.

AMD has formally started volume shipments of its CDNA 3-based Instinct MI300X accelerators and MI300A accelerated processing units (APUs), and some of the first customers have already received their MI300X parts. Pricing varies from customer to customer based on volumes and other factors, but in all cases the Instinct parts are massively cheaper than Nvidia's H100.

Citi (via SeekingAlpha) estimates that AMD sells its Instinct MI300X 192GB to Microsoft for roughly $10,000 a unit, as the software and cloud giant is believed to be the largest consumer of these products at this time (and it has managed to bring up GPT-4 on MI300X in its production environment). Other customers have to pay around $15,000 for an Instinct MI300X GPU for artificial intelligence (AI) and high-performance computing (HPC) applications. Both prices are massively lower than what Nvidia charges for its hugely popular H100 80GB AI and HPC GPU.

Just like AMD, Nvidia does not officially disclose the pricing of its H100 80GB products, as it depends on numerous factors, such as the size of the batch and the overall volumes that a particular client procures from Nvidia. But over recent quarters, we have seen Nvidia's H100 80GB HBM2E add-in card sell for $30,000, $40,000, and even much more on eBay. Meanwhile, the more powerful H100 SXM with 80GB of HBM3 memory tends to cost more than the H100 80GB AIB.

In general, prices for Nvidia's H100 vary greatly, but they are not even close to $10,000 to $15,000. Furthermore, given the memory capacity of the Instinct MI300X 192GB HBM3, it makes more sense to compare it to Nvidia's upcoming H200 141GB HBM3E and to Nvidia's special-edition H100 NVL 188GB HBM3 dual-card solution, which is designed specifically to train large language models (LLMs) and probably sells for an arm and a leg.

Given that tons of AI applications and workloads are optimized for Nvidia's CUDA software stack, demand for its compute GPUs is overwhelming, which is why the company can sell its Hopper-based products at a huge premium. Meanwhile, AMD is trying to attract clients to its CDNA 3-based Instinct MI300-series products, so it might have decided to sell them at a relatively low price.

AMD expects sales of its data center GPUs — which include MI300-series devices — to exceed $3.5 billion this year, and the company says it still has some supply available, which stands in contrast to Nvidia's rumored 52-week wait times. In any case, analysts from Citi deem AMD's $3.5 billion figure an underestimation. Christopher Danely, an analyst with Citi, believes that AMD could generate $5 billion from data center GPUs this year and $8 billion in 2025.


Anton Shilov

Freelance News Writer

Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers and from modern process technologies and latest fab tools to high-tech industry trends.


5 comments from the forums

  • Pierce2623

    It’s kinda crazy that companies are so lazy they’ll pay 4x for the same performance just for an easier to use software stack. If AMD put a real push behind their software stack, it still wouldn’t matter because Nvidia just has the mindshare period.

    Reply

  • oofdragon

    So the craze about Artificial Intelligence is basically because most people lack Natural Intelligence, seeing as everyone is paying 4x more for the same performance. There we have it.. why RTX sells more than XTX... lack of NI

    Reply

  • Neilbob

    I'm sure said companies are just erring on the side of caution, trying to keep competition alive...

    And after all, everyone be concerned. Nvidia are right on the edge of being completely destitute, so bad they're approaching Apple levels of poverty. Doesn't it make your heart break?

    Reply

  • Freestyle80

    oofdragon said:

    So the craze about Artificial Intelligence is basically because most people lack Natural Intelligence, seeing as everyone is paying 4x more for the same performance. There we have it.. why RTX sells more than XTX... lack of NI

    yeah why dont they worship AMD like you, AMD are gods, more people should be bowing down to them and buy anything they release

    Reply

  • Argolith

    Talking about the article... Hopefully with more money coming in they will have more to invest on the gaming side of things and maybe use these accelerators of theirs to build up a strong(er) alternative to DLSS... but I feel like they have little to no incentive at the moment (after all despite being similar to GPUs this is AI accelerators we're talking about and they sell to enterprise at much steeper prices) and probably we will just end up seeing more production capacity shifted away from gaming. Who knows, one day some cool feature might trickle down the product stack... Maybe?
    Unfortunately I'm starting to forget the days Radeon moved a decent amount of units or introduced cool stuff like HBM to GPUs your average Joe might buy.

    Reply
