Image: Intel

During today’s Intel Architecture Day 2021 event, Intel shared a short video demonstrating how its new high-quality super sampling technology, XeSS, might compare to NVIDIA and AMD’s respective DLSS and FidelityFX Super Resolution technologies for enabling higher resolutions at a lower performance cost. It seems impressive.

As indicated in the video’s comparison sequences, Intel’s XeSS super sampling technology is able to upscale a 1080p image to 4K with what appears to be minimal quality loss. Intel’s head of graphics software Lisa Pearce goes a little further, boldly claiming that there is “no visible quality loss” at all in the 1080p image compared to its native 4K counterpart.

Intel XeSS’ results are made possible by deep learning. “XeSS uses deep learning to synthesize images that are very close to the quality of native high-res rendering,” Intel explained, noting that the reconstruction is performed by a “neural network trained to deliver high performance and great quality.”
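Neither XeSS's network architecture nor its training details were shared, but the general recipe behind learned super sampling (a cheap spatial upscale followed by a refinement pass whose weights come from training) can be sketched in a few lines of Python. In this toy sketch, a fixed 3x3 blur kernel stands in for the trained weights; every function name here is invented for illustration, and none of this reflects Intel's actual implementation:

```python
import numpy as np

def nearest_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Plain spatial upscale: replicate each pixel factor x factor times."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def conv3x3(frame: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """One 3x3 convolution with edge padding. This stands in for the
    trained reconstruction network that refines the raw upscale."""
    h, w = frame.shape
    padded = np.pad(frame, 1, mode="edge")
    out = np.zeros_like(frame)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out

def toy_super_sample(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Upscale, then refine. The refinement weights here are a fixed blur,
    whereas a real system would apply millions of trained weights."""
    upscaled = nearest_upscale(frame, factor)
    placeholder_weights = np.full((3, 3), 1.0 / 9.0)  # stand-in, not trained
    return conv3x3(upscaled, placeholder_weights)

low_res = np.random.rand(540, 960)       # a 960x540 grayscale "frame"
high_res = toy_super_sample(low_res, 2)  # -> 1920x1080
```

The point of the sketch is the split: the upscale step is trivial, and all of the perceived quality comes from the learned refinement, which is why vendors run it on dedicated matrix hardware.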

Pearce added that the cost of Intel XeSS is “relatively small” and that the AI-assisted scaling allows for performance boosts of up to 2x. Intel’s XeSS SDK will be available this month.

Image: Intel

The contents and game levels shown in this demo were created by Rens. Rens is a 3D artist, environment artist and technical art director. He is known for his outstanding photogrammetry techniques and high-end rendering skills, and has worked with top game development studios like DICE, Epic Games and Sony.

Source: Intel


Join the Conversation

14 Comments

  1. Interesting, I didn’t see anything saying that this was required to be run on Intel’s cards. Could this be the resolution devolution we need to ‘feel like’ we get to play at 4K?

  2. [QUOTE=”Grimlakin, post: 39917, member: 215″]
    Interesting, I didn’t see anything saying that this was required to be run on Intel’s cards. Could this be the resolution devolution we need to ‘feel like’ we get to play at 4K?
    [/QUOTE]
    I’ve read conflicting reports on whether or not Nvidia can run XeSS, as it’s done with AI cores. What’s clear is that AMD can’t.

    This could give a distinct advantage to nvidia as it would be the only one that supports all upscaling techniques.

    … Just read the TechSpot article on the matter and it claims XeSS indeed runs on AMD cards.

  3. [QUOTE=”Stoly, post: 39928, member: 1474″]
    I’ve read conflicting reports on whether or not Nvidia can run XeSS, as it’s done with AI cores. What’s clear is that AMD can’t.

    This could give a distinct advantage to nvidia as it would be the only one that supports all upscaling techniques.
    [/QUOTE]

    All OTHER than… oh wait, yeah, all of them. Damn, yeah, that would be a pretty distinct advantage. I don’t think Intel will actually want this though. Would be better if it was a software layer for everyone. It would suck for Intel if they spent the time and money to do the AI upscaling only to have Nvidia be better at running it on their own accelerators.

    On another note though… who’s to say they are not simply contracting with Nvidia for access to their AI upscaling engine and results, giving them a larger base of games that will work on launch day for the new hardware.

    Wouldn’t surprise me… a lot of vendors licensed their ‘shadowplay’ or whatever their VDI gaming boxes were called. They just rebranded it.

  4. Intel could absolutely not afford to have a proprietary solution on their GPUs at launch.

    It has to work on everything out there made in, say, the last two years or so.

    This might be the end of DLSS. Or maybe not.
    Time will tell. I hope not, I like DLSS and I think it could have a lot to give yet.

  5. [QUOTE=”Auer, post: 39941, member: 225″]
    This might be the end of DLSS. Or maybe not.
    Time will tell. I hope not, I like DLSS and I think it could have a lot to give yet.
    [/QUOTE]
    I do too. It’s not perfect but none of them are. However, I think it’s awesome that NV provided a means to offload some of the processing to other hardware for increased performance. Here’s hoping that if worse comes to worst they can somehow update it so those tensor cores can be programmed for other solutions as well. I know they don’t want to give their market share away but if another standard dominates a decision will have to be made.

  6. This is Nvidia. They’ll double down on DLSS, sign exclusive deals with game developers and force Intel’s hand. Simply because they have the majority of the market.

  7. [QUOTE=”Riccochet, post: 39968, member: 4″]
    This is Nvidia. They’ll double down on DLSS, sign exclusive deals with game developers and force Intel’s hand. Simply because they have the majority of the market.
    [/QUOTE]
    Well, I’m partial to DLSS as I think it’s a good solution, so I’m OK with that.

  8. [QUOTE=”Peter_Brosdahl, post: 39967, member: 87″]
    I do too. It’s not perfect but none of them are. However, I think it’s awesome that NV provided a means to offload some of the processing to other hardware for increased performance. Here’s hoping that if worse comes to worst they can somehow update it so those tensor cores can be programmed for other solutions as well. I know they don’t want to give their market share away but if another standard dominates a decision will have to be made.
    [/QUOTE]
    Rumor says Nvidia is working on AI texture upscaling and using AI to offload even more RT rendering besides denoising.

  9. [QUOTE=”Stoly, post: 39970, member: 1474″]
    Rumor says Nvidia is working on AI texture upscaling and using AI to offload even more RT rendering besides denoising.
    [/QUOTE]

    How… I don’t understand how this would be done. Let’s be very clear: there is no AI happening on the RTX cards today. Nvidia is running scenes through their AI engine and it is building instructions for tensor or whatever cores to upscale content based on predetermined factors for a more efficient and clean display at the end. There is no intelligence happening while you’re playing a game with DLSS in the graphics pipeline.

    The best they could do is use their in-house API to pre-generate specialized instructions bundled with RTX card drivers to teach it optimized routines to upscale graphics, again based on predetermined settings.

  10. [QUOTE=”Grimlakin, post: 39980, member: 215″]
    How… I don’t understand how this would be done. Let’s be very clear: there is no AI happening on the RTX cards today. Nvidia is running scenes through their AI engine and it is building instructions for tensor or whatever cores to upscale content based on predetermined factors for a more efficient and clean display at the end. There is no intelligence happening while you’re playing a game with DLSS in the graphics pipeline.
    [/QUOTE]

    From Nvidia:

    “Powered by dedicated AI processors on [URL=’https://www.nvidia.com/en-us/geforce/20-series/’]GeForce RTX GPUs[/URL] called Tensor Cores, DLSS 2.0 is a new and improved deep learning neural network that boosts frame rates while generating beautiful, crisp game images. It gives gamers the performance headroom to maximize ray tracing settings and increase output resolutions.”

    [LIST]
    [*][B]Great Scaling Across All GeForce RTX GPUs and Resolutions -[/B] A new AI network more efficiently uses Tensor Cores to execute 2X faster than the original. This improves frame rates and eliminates previous limitations on which GPUs, settings, and resolutions could be enabled.
    [*][B]One Network For All Games - [/B]The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games.
    [/LIST]

    [URL unfurl=”true”]https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/[/URL]

  11. [QUOTE=”Auer, post: 39996, member: 225″]
    From Nvidia:

    “Powered by dedicated AI processors on [URL=’https://www.nvidia.com/en-us/geforce/20-series/’]GeForce RTX GPUs[/URL] called Tensor Cores, DLSS 2.0 is a new and improved deep learning neural network that boosts frame rates while generating beautiful, crisp game images. It gives gamers the performance headroom to maximize ray tracing settings and increase output resolutions.”

    [LIST]
    [*][B]Great Scaling Across All GeForce RTX GPUs and Resolutions -[/B] A new AI network more efficiently uses Tensor Cores to execute 2X faster than the original. This improves frame rates and eliminates previous limitations on which GPUs, settings, and resolutions could be enabled.
    [*][B]One Network For All Games - [/B]The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games.
    [/LIST]

    [URL unfurl=”true”]https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/[/URL]
    [/QUOTE]

    Yep… and I bet you that your games do not look better with the same driver set no matter how much you play them and how many screenshots of in game IQ you get.

    Yes… the AI network generates the knowledge for the game. And now the AI network can deliver algorithms that are more generic and not game specific but still work well based on learned pattern recognition.

    Nowhere in their statements are you actually running local AI to improve game upscaling. Their AI networks use tensor cores… YOUR RTX card is not part of their AI network.

    Am I the only one that can read what they are saying here or am I missing something?

  12. [QUOTE=”Grimlakin, post: 40016, member: 215″]
    Yep… and I bet you that your games do not look better with the same driver set no matter how much you play them and how many screenshots of in game IQ you get.

    Yes… the AI network generates the knowledge for the game. And now the AI network can deliver algorithms that are more generic and not game specific but still work well based on learned pattern recognition.

    Nowhere in their statements are you actually running local AI to improve game upscaling. Their AI networks use tensor cores… YOUR RTX card is not part of their AI network.

    Am I the only one that can read what they are saying here or am I missing something?
    [/QUOTE]

    I can read what they are saying just fine.

    [URL unfurl=”true”]https://www.nvidia.com/en-us/data-center/tensor-cores/[/URL]

  13. [QUOTE=”Auer, post: 40023, member: 225″]
    I can read what they are saying just fine.

    [URL unfurl=”true”]https://www.nvidia.com/en-us/data-center/tensor-cores/[/URL]
    [/QUOTE]

    Yep, tensor cores can do that. The tensor cores in your PC are not using AI to accelerate anything in a game today with DLSS.

  14. [QUOTE=”Grimlakin, post: 40028, member: 215″]
    Yep, tensor cores can do that. The tensor cores in your PC are not using AI to accelerate anything in a game today with DLSS.
    [/QUOTE]
    This is how I understand it to work as well.
