Intel could have a plan for its future GPUs to better challenge AMD and Nvidia, as patent hints at new chiplet design

Intel has plans for future GPUs that aren’t monolithic, but are built from separate chiplets, or at least there’s some thinking along those lines at Team Blue.

We’ve gathered that because TechSpot noticed that Underfox, a denizen of X, had flagged up a patent filed by Intel for a “disaggregated GPU architecture,” which will likely be the “first commercial GPU architecture with logic chiplets” from the chip giant.

“Earlier this month, Intel was finally granted a patent for its disaggregated GPU architecture, which will likely be the first commercial GPU architecture with logic chiplets, also allowing for the power-gate of chiplets not used to process workloads.” pic.twitter.com/XsNjjdVIOu — October 26, 2024

What does this mean exactly? Every consumer GPU to date is monolithic, meaning it has a single graphics chip with everything inside. A disaggregated architecture splits that single chip up, so you have multiple chiplets instead.

This won’t happen with Battlemage, the next generation of Arc graphics cards expected to arrive in early 2025. If it were something in the works for Battlemage, we’d surely have heard about it on the rumor mill by now.

So, this might be the plan down the line for Celestial, Druid, or one of the future generations of Arc GPUs – assuming Intel gets that far with its discrete graphics card line-up.

As ever with patents, we must bear in mind that they are often filed speculatively, and a lot of them don’t see the light of day in finished products on shelves.


Analysis: The benefits – and pitfalls – of disaggregation

Why go for a disaggregated GPU design like this? Chiplets offer certain advantages in terms of chip design flexibility (modularity) and better power efficiency, with the latter being particularly important for high-end graphics cards now that we’re getting into the realm of some truly wattage-sucking monsters.

The tricky bit, though, is that splitting a monolithic chip into multiple chiplets leaves the problem of ensuring those chiplets have interconnects fast enough that performance doesn’t drop by going this route.

AMD was rumored to be looking at a chiplet design for the RDNA 4 flagship, before seemingly canning it (and as we know, Team Red has purportedly retreated to just mid-range graphics cards for its next-gen RX 8000 GPUs now). Nvidia, too, was rumored to be looking at a multi-chip design for the Blackwell GeForce flagship, the RTX 5090, but any chatter from the grapevine around this idea has died down to nothing.

One way or another, we’re likely to see chiplet designs for consumer GPUs in the future, perhaps from AMD, Nvidia, and indeed Intel as evidenced by the patent here.

There are broader worries about how far Intel is going to push with its discrete Arc GPUs, mind you, and Battlemage graphics cards are likely to be low-end only. While work has apparently started on Celestial, it’s notable that Team Blue isn’t really talking about its Arc line of GPUs much these days (outside of integrated graphics, that is).

You might also like

These are the best GPUs right now
Check out the best cheap monitor deals
Want a great gaming PC? Look no further
