Image: AMD

AMD users who plan to build a new system around the company’s next generation of Ryzen chips may not necessarily have to worry about coupling it with a dedicated graphics solution.

Chips and Cheese has shared documents that suggest all upcoming Ryzen processors based on the Zen 4 architecture will feature iGPUs. (See the “on-chip graphics” row in the compatibility table below, which indicates integrated graphics.)

It isn’t clear whether these iGPUs will actually be enabled in all models, but the news is good for Ryzen users who need only a moderate level of graphics performance.

The documents stem from the recent GIGABYTE leak, which has revealed plenty of other interesting information such as the specifications for AMD’s upcoming Ryzen Threadripper processors.

A new block diagram shared by Chips and Cheese has created some confusion as to whether the Socket AM5 platform supports DisplayPort 2.0 or not.

A block diagram previously shared by TechPowerUp indicates that it does, but the new one doesn’t. Perhaps the latter is for Zen 4 CPUs that don’t have their iGPUs enabled.

Image: Chips and Cheese

AM5 now features an additional 4x PCIe Gen 4 lanes, which might be useful for attaching more M.2 slots or implementing a discrete USB4 controller. These extra PCIe lanes won’t be available with all AM5 CPUs, though. Unlike with Genoa, there’s no mention of PCIe 5 support on AM5 CPUs. AMD seems to have chosen more lanes instead of faster lanes.
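For readers who want to see how a board actually hands out its lanes once a system is up and running, the short sketch below (purely illustrative, and assuming a Linux machine where the standard sysfs PCIe attributes are exposed) prints each PCIe device’s negotiated link speed and width:

    # Minimal sketch, assuming Linux and the standard sysfs PCIe attributes.
    # Lists each PCIe device's negotiated link speed and width, which is one
    # way to see how a platform's lanes are actually allocated.
    import os

    PCI_ROOT = "/sys/bus/pci/devices"

    def read_attr(dev_path, name):
        """Return a sysfs attribute as a stripped string, or None if absent."""
        try:
            with open(os.path.join(dev_path, name)) as f:
                return f.read().strip()
        except OSError:
            return None

    for bdf in sorted(os.listdir(PCI_ROOT)):
        dev = os.path.join(PCI_ROOT, bdf)
        speed = read_attr(dev, "current_link_speed")
        width = read_attr(dev, "current_link_width")
        if speed and width:
            print(f"{bdf}: {speed}, x{width}")

Whether the extra lanes show up at all will, of course, depend on the CPU and board in question.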

Source: Chips and Cheese


Comments

  1. Hmm..

    I thought there was already a lineup with integrated graphics, they just called it an APU.

    What’s the difference here?

  2. Just what I don’t need – integrated graphics on a high performance CPU. If I want basic graphics, I’ll toss in a 710.

    How about using that die space for extra cache instead?

  3. Well, these days it will let you at least use the system while you wait your turn in the Newegg shuffle.

  4. [QUOTE=”Paul_Johnson, post: 40140, member: 2″]
    Well, these days it will let you at least use the system while you wait your turn in the Newegg shuffle.
    [/QUOTE]
    lol! That said, I don’t think the 710 ever went out of stock anywhere. Even the RX 560 and 1030ti were almost always in stock when I was shopping back in Jan.

  5. [QUOTE=”Brian_B, post: 40118, member: 96″]
    Hmm..

    I thought there was already a lineup with integrated graphics, they just called it an APU.

    What’s the difference here?
    [/QUOTE]
    An APU [I]should[/I] have more grunt, at least in my imagination of what it’s supposed to mean. Sadly, Intel has done that better, and mobile SoCs have really embodied the form more than anything else.

    A lot of that comes down to software integration, though. AMD continues to make strides here so these may have more ‘impact’ than we might otherwise expect!

    [QUOTE=”Endgame, post: 40121, member: 1041″]
    Just what I don’t need – integrated graphics on a high performance CPU.
    [/QUOTE]
    If they follow what Intel has done, the highest (and lowest) CPU brackets will have the more limited implementations. And with AMD’s chiplet approach, that may mean they can use different dies with different balances of, say, CUs and cache, if I’m correct in assuming that they’d situate the graphics logic on the cache / uncore die.

    Outside of that, I’m betting that the lack of integrated graphics by design has hurt them a bit, and the GPU market has likely exacerbated the effect. I’ll also venture to say that the graphics logic that AMD includes could easily have more features than your average low-end GPU, looking in particular toward things like video encoding and decoding performance. Stuff that end-users are likely to do that just isn’t efficient on CPU cores.

  6. [QUOTE=”Uvilla, post: 40163, member: 397″]
    At this point it makes sense. The graphics part is probably a small part of the die.
    [/QUOTE]

    Well, this is an Ice Lake CPU. I wouldn’t call the IGP small exactly.
    For AMD though – they can use one (or two) chiplets for graphics and they don’t really need to change their CPU chiplet die at all. Which is what I thought they were doing with APUs

    [IMG]https://en.wikichip.org/w/images/d/d6/ice_lake_die_%28quad_core%29_%28annotated%29.png[/IMG]

  7. [QUOTE=”Brian_B, post: 40164, member: 96″]
    Well, this is an Ice Lake CPU. I wouldn’t call the IGP small exactly.
    For AMD though – they can use one (or two) chiplets for graphics and they don’t really need to change their CPU chiplet die at all. Which is what I thought they were doing with APUs

    [IMG]https://en.wikichip.org/w/images/d/d6/ice_lake_die_%28quad_core%29_%28annotated%29.png[/IMG]
    [/QUOTE]
    Now just think how much better Ice Lake would have been had they dedicated that whole GPU block to L3 cache…

  8. AMD is feeling pressure from Intel, who are serious about their iGPU strategy. If AMD doesn’t reply, it may have to play catch-up in the near future.

  9. [QUOTE=”Stoly, post: 40195, member: 1474″]
    AMD is feeling pressure from Intel, who are serious about their iGPU strategy. If AMD doesn’t reply, it may have to play catch-up in the near future.
    [/QUOTE]
    Well, thinking about it strategically —

    People who buy an APU (or CPU with IGP – not sure there is a distinction) and plan on using the video probably aren’t buying a dGPU anytime soon, and AMD is very much in the business of wanting to sell you both.

    The market for IGP/APUs has mostly been mobile and business devices – low-margin commodity markets, where you can really only make any money with large volumes – and AMD isn’t going to get that overnight.

    And Intel really has only been pushing IGP in the past… decade+ because it didn’t have a strong dGPU product to push, and it allowed them to gain in those marketspaces and get established. I think IGP is fine, so long as I never ~need~ to use it for anything serious.

  10. I’m torn. On one hand, a basic functioning 2D desktop would be nice for stuff where you just need basic video. OTOH, I wouldn’t want to give up die space that could be used for better performance like L3 cache, etc.

    I miss the old AM3 motherboards that had built-in crappy graphics. You couldn’t game on them, but they were perfectly fine to build on or run a server.

  11. [QUOTE=”Burticus, post: 40239, member: 297″]
    I’m torn. On one hand, a basic functioning 2D desktop would be nice for stuff where you just need basic video. OTOH, I wouldn’t want to give up die space that could be used for better performance like L3 cache, etc.

    I miss the old AM3 motherboards that had built-in crappy graphics. You couldn’t game on them, but they were perfectly fine to build on or run a server.
    [/QUOTE]
    I kinda agree. Why can’t we get video pushed back out onto the chipset controller? I get that it will suck donkey balls since it will be removed from pretty much everything… But we aren’t exactly wanting it to drive games – it’s just to drive a basic desktop and base-level functionality.

  12. [QUOTE=”Brian_B, post: 40251, member: 96″]
    I kinda agree. Why can’t we get video pushed back out onto the chipset controller? I get that it will suck donkey balls since it will be removed from pretty much everything… But we aren’t exactly wanting it to drive games – it’s just to drive a basic desktop and base-level functionality.
    [/QUOTE]
    In this case, just use a GeForce 710 or equivalent. I’m sure there are scenarios where you would really like to use the PCIE slot for something else, but a 1x riser should be fine for basic graphics.

    Edit to say: I’m also the guy that is annoyed at the Raspberry Pi for including graphics hardware that I just turn off because I never use it. I’d rather have 8 cores and 0 iGPU… so I may not be the best person to find value in the iGPU.

  13. [QUOTE=”Endgame, post: 40259, member: 1041″]
    In this case, just use a GeForce 710 or equivalent. I’m sure there are scenarios where you would really like to use the PCIE slot for something else, but a 1x riser should be fine for basic graphics.
    [/QUOTE]
    No, its value is in not needing to plug in a card at all – mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

    Sure, you could use a bottom tier discrete card, but if there’s already some rudimentary video available – why even do that?

    As for the RPi – if it didn’t have a GPU at all, your only option would be to SSH into the unit; it wouldn’t have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) – and that’s exactly the type of capability I’m saying is nice to have here. I wouldn’t call it essential, but nice in those situations where even a 710 is pretty much overkill.

  14. [QUOTE=”Brian_B, post: 40261, member: 96″]
    No, its value is in not needing to plug in a card at all – mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

    Sure, you could use a bottom tier discrete card, but if there’s already some rudimentary video available – why even do that?

    As for the RPi – if it didn’t have a GPU at all, your only option would be to SSH into the unit; it wouldn’t have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) – and that’s exactly the type of capability I’m saying is nice to have here. I wouldn’t call it essential, but nice in those situations where even a 710 is pretty much overkill.
    [/QUOTE]

    Yeah, and for those who use a Pi for an emulator, of which there are a few, that would be kind of a problem with no graphics output.

  15. [QUOTE=”Brian_B, post: 40261, member: 96″]
    No, its value is in not needing to plug in a card at all – mostly troubleshooting, or Linux boxes that sit at the CLI anyway, or things like that.

    Sure, you could use a bottom tier discrete card, but if there’s already some rudimentary video available – why even do that?

    As for the RPi – if it didn’t have a GPU at all, your only option would be to SSH into the unit; it wouldn’t have any local video output capability at all (I guess someone could invent a GPIO GPU of sorts, but it would be complicated) – and that’s exactly the type of capability I’m saying is nice to have here. I wouldn’t call it essential, but nice in those situations where even a 710 is pretty much overkill.
    [/QUOTE]
    I can use one card for multiple hosts. I can put my 710 in my FreeNAS tomorrow if I needed the minimal interface it has, and then the day after I could put it in my pfSense box, etc. For the once every 3 years I might need a console, I can toss in a card. The benefit I would have is higher continuous performance during those 3 years where I didn’t need a GPU at all.

    As for the Pi, I have a number of Pi 4s at the house – around 20 of them. My Kodi boxes are the only 3 that have ever actually used the video output. The rest just get SSH’d to, including immediately after setup.

  16. [QUOTE=”Endgame, post: 40278, member: 1041″]
    I can use one card for multiple hosts. I can put my 710 in my FreeNAS tomorrow if I needed the minimal interface it has, and then the day after I could put it in my pfSense box, etc. For the once every 3 years I might need a console, I can toss in a card. The benefit I would have is higher continuous performance during those 3 years where I didn’t need a GPU at all.

    As for the Pi, I have a number of Pi 4s at the house – around 20 of them. My Kodi boxes are the only 3 that have ever actually used the video output. The rest just get SSH’d to, including immediately after setup.
    [/QUOTE]
    Well the other part of my statement was putting the GPU in the chipset controller…

    But to each their own. I’d rather not juggle cards at all if I can help it.

  17. As long as I get the performance I want, and the capabilities I will use, in addition to no real additional cost… I don’t care?

  18. [QUOTE=”Grimlakin, post: 40293, member: 215″]
    As long as I get the performance I want, and the capabilities I will use, in addition to no real additional cost… I don’t care?
    [/QUOTE]
    What? You mean common sense? Naaaaaaaaah

  19. [QUOTE=”Grimlakin, post: 40293, member: 215″]
    As long as I get the performance I want, and the capabilities I will use, in addition to no real additional cost… I don’t care?
    [/QUOTE]
    Does it have no cost? Look at that Intel block diagram – half is iGPU. They could just drop the iGPU, make double the chips, and sell them for 45% less. So there is a real monetary cost to having that GPU.

    Alternately, it’s an opportunity cost. They have dropped a significant portion of cache and extra cores to free up die space to give you a GPU you’ll never use.

    It sucks, but you just have to live with it, because what other choice will you have?

  20. [QUOTE=”Endgame, post: 40328, member: 1041″]
    Does it have no cost? Look at that Intel block diagram – half is iGPU. They could just drop the iGPU, make double the chips, and sell them for 45% less. So there is a real monetary cost to having that GPU.
    [/QUOTE]
    While that picture is likely correct, note that Intel doesn’t put their ‘biggest’ GPUs in their desktop CPUs, and that the CPU pictured is a quad-core unit.

    Smaller GPU, four or more additional CPU cores, and it perhaps doesn’t look so lopsided of a comparison.

  21. [QUOTE=”Endgame, post: 40328, member: 1041″]
    Alternately, it’s an opportunity cost. They have dropped a significant portion of cache and extra cores to free up die space to give you a GPU you’ll never use.
    [/QUOTE]
    I do not think this is true. At least with respect to it being a higher performing CPU versus one with IGP.

    I think they hit a TDP wall with the cores way back when they first moved IGP from the chipset to the CPU. Once they hit that wall – they still had die space and figured why the heck not – it makes the IGP a lot faster, doesn’t hurt the CPU really, and opens up a huge market on the mobile/commercial side that doesn’t want to fool with discrete graphics anyway.

    I don’t think removal of the IGP gets you a faster CPU – Xeons aren’t really faster than Pentiums, after all. If anything the reverse has been true – some dies with extra cache for the IGP have been faster than non-IGP or lesser IGP variants (Crystalwell).

    Would they be cheaper? Probably. That I can’t dispute.

  22. [QUOTE=”Brian_B, post: 40346, member: 96″]
    Would they be cheaper? Probably. That I can’t dispute.
    [/QUOTE]
    A die without an IGP would be cheaper to produce, yes – but that doesn’t mean cheaper overall necessarily, right?

    Obviously conjecture, but for Intel it may very well have been cheaper to ‘waste’ die space on GPUs that won’t get used versus having to build a GPU-less part, remembering that these are all consumer-level dies for the most part. Intel has labeled some of these as Xeons, for example, but I don’t think they’ve put any IGPs into their HEDT / actual Xeon sockets since at least Skylake.

    And to point out where AMD differs here: AMD has taken the modular approach with their chiplets. How much AMD differs will depend on where they situate their GPUs in that architecture. If it’s in the I/O die, as I myself suspect, then it won’t necessarily be included in TR and Epyc releases. On the other hand, if the GPU logic is in the CCD, then a Ryzen CPU could theoretically have two GPUs and a TR could have four, and so on!

  23. Well, with DDR5 we may get good graphics speeds. This, I agree, will of course cover a huge market’s needs by default.
    I do agree it may be the I/O die – makes sense. Very interesting CPU if all true.
