Image: Intel

Intel’s initial batch of Xe-HPG (high-performance gaming) DG2 graphics cards will reportedly comprise six models, according to a puzzle decoded by hardware enthusiast and occasional leaker harukaze5719, which lists six variants and their purported specifications. Intel’s flagship Xe-HPG DG2 GPU will supposedly leverage 512 Execution Units (4,096 Streaming Processors) and 8 or 16 GB of GDDR6 memory, while the bottom end will be occupied by a model featuring 96 Execution Units (768 Streaming Processors) and 4 GB of GDDR6 memory.

  • 512 EU (4096 SP) / 256bit-bus / 8 or 16G VRAM
  • 384 EU (3072 SP) / 192bit-bus / 6 or 12G VRAM
  • 256 EU (2048 SP) / 128bit-bus / 4 or 8G VRAM
  • 192 EU (1536 SP) / 128bit-bus / 4G VRAM
  • 128 EU (1024 SP) / 64bit-bus / 4G VRAM
  • 96 EU (768 SP) / 64bit-bus / 4G VRAM
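Each Xe Execution Unit contains eight FP32 ALUs, which is how the EU counts above map to the SP counts. A quick sketch of that arithmetic in Python, with a purely hypothetical 1.8 GHz clock for the throughput column, since DG2 clock speeds have not leaked:

```python
# Map the leaked DG2 EU counts to SP counts and a rough FP32 ceiling.
# Assumes 8 FP32 ALUs per EU (the Xe-LP ratio); the 1.8 GHz clock is a
# placeholder, not a leaked figure.

ALUS_PER_EU = 8
HYPOTHETICAL_CLOCK_GHZ = 1.8  # illustrative assumption only

for eu in [512, 384, 256, 192, 128, 96]:
    sp = eu * ALUS_PER_EU
    # 2 FLOPs per SP per clock (one fused multiply-add)
    tflops = sp * 2 * HYPOTHETICAL_CLOCK_GHZ / 1000
    print(f"{eu:>3} EU -> {sp:>4} SP -> ~{tflops:4.1f} TFLOPS")
```

The SP column matches the list above; the TFLOPS column is only a theoretical ceiling and says nothing about delivered game performance.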

The purported specifications suggest that Intel’s flagship Xe-HPG DG2 graphics card might be able to go toe to toe with AMD’s Radeon RX 6800 XT (4,608 SP, 16 GB GDDR6), which has already proven its mettle against NVIDIA’s GeForce RTX 3080 (8,704 SP, 10 GB GDDR6X). Rumors suggest that Intel’s DG2 GPUs will be manufactured on TSMC’s 7-nanometer process.

“Intel introduced a new Xe microarchitecture variant, Xe-HPG, a gaming-optimized microarchitecture, combining good performance-per-watt building blocks from Xe-LP, leveraging the scale from Xe-HP for a bigger configuration and compute frequency optimization from Xe-HPC,” a fact sheet reads. “A new memory subsystem based on GDDR6 is added to improve performance per dollar and Xe-HPG will have accelerated ray tracing support. Xe-HPG is expected to start shipping in 2021.”


11 Comments

  1. Hmm… top end is 4096 SP… that’s smack in between what a 6800 and a 6800 XT have. Fewer than the new 3060 Ti, though, and there’s no word on efficiency or clock speeds. So yeah… great theorycrafting, but no news until benchmarks.

  2. Maybe sorta interesting if these actually make it into the wild with any retail stock.

    Also… I wonder what the drivers are going to be like (probably rocky at first) since Intel hasn’t produced a real video card since the i740 (90’s)

  3. [QUOTE=”Burticus, post: 30305, member: 297″]
    Maybe sorta interesting if these actually make it into the wild with any retail stock.

    Also… I wonder what the drivers are going to be like (probably rocky at first) since Intel hasn’t produced a real video card since the i740 (90’s)
    [/QUOTE]
    Well, they do have drivers now, for IGP. I can remember a bit more than a year ago, they vowed to make their GPU drivers awesome… I think that was a lot of what Kyle and others were over there doing for a bit, was beefing up their enthusiast market. And… nothing ever really seemed to come from it.

    Yeah, Intel drivers have always been stable, but… that isn’t saying a whole lot. They’ve never been high performance, never sought to keep up with a release cadence, and never chased newly released games with updates.

  4. [QUOTE=”Brian_B, post: 30304, member: 96″]
    Hmm.. top end is 4096 SP
    [/QUOTE]
    SPs are whatever the company says they are; they are only very loosely useful for comparison.

    I don’t think we can at all say what performance will look like. We don’t know any of the other details that make a significant difference in end-product performance either, such as memory subsystem performance, driver tuning, scaling, and so on.

    [QUOTE=”Burticus, post: 30305, member: 297″]
    Maybe sorta interesting if these actually make it into the wild with any retail stock.

    Also… I wonder what the drivers are going to be like (probably rocky at first) since Intel hasn’t produced a real video card since the i740 (90’s)
    [/QUOTE]

    Intel has the largest desktop market share by a very wide margin. They also have stable Linux drivers in shipping Linux kernels months before the hardware that will use those drivers ships itself.

    The main reason to be skeptical is that Intel GPUs simply aren’t compared to Nvidia and AMD because their performance classes rarely overlap and certainly don’t overlap in stressful scenarios. So this is going to be new!

  5. Spot on the mark. They already hold the majority at a certain tier level. At 4096 SPs they get their foot in the door for another. The real questions are price and availability. These days anyone can show proof of product, but whether or not anyone can actually buy it is another matter altogether.

  6. [QUOTE=”LazyGamer, post: 30310, member: 1367″]
    SPs are whatever the company says they are; they are only very loosely useful for comparison.

    I don’t think we can at all say what performance will look like. We don’t know any of the other details that make a significant difference in end-product performance either, such as memory subsystem performance, driver tuning, scaling, and so on.
    [/QUOTE]
    Well… there is a good deal of liberty when it comes to the definition, but I think it’s not too far from the mark. Very true that it’s not exactly analogous to CPU cores, as architectures vary a good deal in efficiency and supported instruction sets… just look at the example I had: 4096 is more than a 6800 but less than a 3060 Ti, and the 3060 Ti is slower (rasterizing) than a 6800, even though it has more SPs.

    But… but! It gets you in the ballpark. I think we can assume we will be looking at something mid-tier. So I think you can safely assume that a GPU with 768 SPs is going to be slower than one with 4096… pretty much regardless of architecture within a defined time span / generation. Or, taking it to a less extreme example, I would bet that Intel doesn’t come anywhere near a 3090 (with its 10,496 cores) but can probably beat out a 2060 (with its 1,920 cores).

    My guess will be, on rasterization… slower than a RX 6800, probably about on par with whatever AMD was going to release for the 6700. Roughly equal to a 3060 Ti. No clue on ray tracing or about a DLSS equivalent.

    But I could be wrong – it’s an entirely new architecture. One could probably look at the Xe IGP on Rocket Lake and try to make some inferences there – it would probably be running lower power per SP, but someone ambitious enough could make some educated guesses.

  7. [QUOTE=”Peter_Brosdahl, post: 30316, member: 87″]
    Spot on the mark. They already hold the majority at a certain tier level. At 4096 SPs they get their foot in the door for another. The real questions are price and availability. These days anyone can show proof of product, but whether or not anyone can actually buy it is another matter altogether.
    [/QUOTE]
    I assume that there’ll be some market for whatever they make in this current climate, but I also expect those buying for gaming performance are going to be hesitant.

    I will be, and I’ve been gaming on Intel IGPs for a decade. My younger brother has been for two.

    And there are enough factors cutting both ways that we can’t really even posit which way this release will go!

    I’ll say that I’m somewhat hopeful, but I will also caution against being [I]too[/I] hopeful. Stock availability or not, Intel is stepping into territory that they haven’t seriously approached since the i740. It’s one thing to run a decade-old game or something like Civilization on an IGP at 1080p60, and an entirely different thing to target modern games with high-resolution, high-refresh-rate requirements. It’s conceivable for Intel to pull this off, and there are also so very many things that can go wrong.

    [QUOTE=”Dogsofjune, post: 30318, member: 168″]
    Bring on the benches, we need options for upgrades
    [/QUOTE]
    I care about the benches, to a degree, but actually getting stuff running well is the first hurdle. And by well I mean stably, and with competitive frametime performance. I do not want to see another ‘FPS queen’ scenario like we got with AMD Crossfire, where the actual frametimes were [I]worse[/I] with Crossfire enabled. Games need to run, and they need to [I]feel[/I] good on these cards.

    I’ll compromise framerates for that. I’ve done it.

    [QUOTE=”Brian_B, post: 30321, member: 96″]
    But… but! It gets you in the ballpark. I think we can assume we will be looking at something mid-tier. So I think you can safely assume that a GPU with 768 SPs is going to be slower than one with 4096… pretty much regardless of architecture within a defined time span / generation. Or, taking it to a less extreme example, I would bet that Intel doesn’t come anywhere near a 3090 (with its 10,496 cores) but can probably beat out a 2060 (with its 1,920 cores).
    [/QUOTE]

    Absolutely!

    I was a bit rushed for time (by my hungry wife) when I typed out my original post, and it comes off colder than I intended. You have my apology for that, I should have saved it for later.

    I do agree that we can get within the ballpark judging by the number of SPs. There is, after all, a basic unit of work being done, so the product of the number of SPs and the clockspeed should get us a comparable estimate of performance. I’ll just say that I am cautious assigning any certainty smaller than ‘ballpark’ to it!
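The ballpark estimate described above, SP count times clock speed times two FLOPs per fused multiply-add, can be sketched as follows. Every clock speed here is a rounded, assumed figure for illustration; the DG2 clock in particular is a pure guess:

```python
# Ballpark FP32 throughput: SPs * 2 FLOPs (one FMA) * clock.
# All clocks below are rounded assumptions, not measured values.

def ballpark_tflops(sp_count: int, clock_ghz: float) -> float:
    """Theoretical FP32 TFLOPS; ignores memory, drivers, and scaling."""
    return sp_count * 2 * clock_ghz / 1000

cards = {
    "DG2 512 EU (rumored)": (4096, 1.8),  # clock is a guess
    "RX 6800":              (3840, 2.1),  # rounded boost clock
    "RTX 3060 Ti":          (4864, 1.7),  # rounded boost clock
}

for name, (sp, clk) in cards.items():
    print(f"{name:<22} ~{ballpark_tflops(sp, clk):.1f} TFLOPS")
```

As the thread points out, architectures spend those theoretical FLOPS very differently, so this only places cards in the same rough tier, nothing finer.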

  8. [QUOTE=”LazyGamer, post: 30322, member: 1367″]

    I care about the benches, to a degree, but actually getting stuff running well is the first hurdle. And by well I mean stably, and with competitive frametime performance. I do not want to see another ‘FPS queen’ scenario like we got with AMD Crossfire, where the actual frametimes were [I]worse[/I] with Crossfire enabled. Games need to run, and they need to [I]feel[/I] good on these cards.

    I’ll compromise framerates for that. I’ve done it.
    [/QUOTE]
    In the AMD example, it was the software that hurt; the hardware was there. Maybe not as much for Crossfire, but drivers soon eased the pain of microstutter. It did create some bad attention, though. I know I had high hopes for CF. I do still miss those 7950s.

    I am very curious how Intel tackles drivers. I honestly anticipate Intel’s drivers to be fairly generic leaving a lot of hardware potential on the table. Does it have to take the fps crown? No, but I’d like to see how it deals with real world games and applications. Hopefully, like Nvidia has shown, newer drivers can help raise performance.

    I am with you on your sentiments. =)

    I’m just looking for something that would feel like an upgrade to either a 1080 Ti or 1070 Ti, without selling off my children, three dogs that I borrowed from various neighbors, and all the catalytic converters in the county, in the hope of not making a sidegrade move in hardware.

  9. [QUOTE=”Dogsofjune, post: 30325, member: 168″]
    In the AMD example, it was the software that hurt; the hardware was there. Maybe not as much for Crossfire, but drivers soon eased the pain of microstutter. It did create some bad attention, though. I know I had high hopes for CF. I do still miss those 7950s.
    [/QUOTE]
    I got burned (literally on occasion) with HD6950s. Reporting on the issue was there, but sparse, as it did vary by game, configuration, hardware, and a host of other things.

    On paper the combo was unbeatable; you’d pay 50% more for 5% more performance, more or less, so it was ‘topped out’ from a value perspective.

    And the difference in ‘feeling’ was also a bit weird. When the frametimes were consistent it was butter, but at their worst, compared to turning Crossfire off, a single card just felt ‘different’, though certainly more responsive.

    My next step was GTX 670s in SLI, and that actually went very well. Then GTX 970s in SLI, which also went well so long as the game didn’t try to use that last 512 MB of VRAM – drivers were eventually released to essentially disallow that. I still have one of those GTX 970s, and it still runs great today.

    Now, the HD 7950s are when AMD really committed to fixing Crossfire, after Nvidia called them on the carpet. I wish that didn’t sound fanboyish, but that’s exactly what happened. Nvidia had been developing their entire SLI solution around optimizing frametimes, which AMD had been straight-up ignoring because no one had pinned down why Crossfire just didn’t deliver the expected experience in some situations.

    When I’m critical of AMD’s drivers, it’s grounded in history that they keep repeating. They have a lot of work to do to earn back trust.

    And that brings us to Intel, who hasn’t even [I]started[/I] earning that trust yet.

    [QUOTE=”Dogsofjune, post: 30325, member: 168″]
    I am very curious how Intel tackles drivers. I honestly anticipate Intel’s drivers to be fairly generic leaving a lot of hardware potential on the table. Does it have to take the fps crown? No, but I’d like to see how it deals with real world games and applications. Hopefully, like Nvidia has shown, newer drivers can help raise performance.
    [/QUOTE]
    That’s fairly representative of what I’m thinking too. They don’t need to wring out all of the performance; they need to bring a solid experience. Dependability will go much further than a one-trick pony, I think.
    [QUOTE=”Dogsofjune, post: 30325, member: 168″]
    I’m just looking for something that would feel like an upgrade to either a 1080 Ti or 1070 Ti, without selling off…
    [/QUOTE]
    Yup!
    Like, Nvidia has already sold me on that 20 GB 3080 Ti that was teased at the introduction of this generation. MSRP of the 10 GB model, plus US$100 for double the VRAM, and I’ll take a Founders Edition right now.

    I guess I’m lucky that Nvidia hasn’t gotten around to actually producing that rumored card, because I probably would have paid a scalper for one by now.
