Image: NVIDIA

NVIDIA’s next flagship graphics card is going to offer a pretty astonishing level of performance that blows away the current flagship.

That’s what prominent leaker kopite7kimi has suggested, anyway, having shared today what is alleged to be the first benchmark score for the GeForce RTX 4090. The Ada-based graphics card has supposedly scored 19,000 points in Time Spy Extreme, 3DMark’s 4K DirectX 12 benchmark test.

Scores rounded up by VideoCardz suggest that the GeForce RTX 4090 could be over 60% faster than the GeForce RTX 3090 Ti. The GeForce RTX 4090’s alleged 3DMark Time Spy Extreme score also implies that the graphics card is over 80% faster than the GeForce RTX 3090, although there are some finer points to be made.

A quick comparison between the three xx90 flagship SKUs from the Ampere and Ada generations suggests that the RTX 4090 would be 66% faster than the RTX 3090 Ti and 82% faster than the RTX 3090 in this one benchmark alone.

However, just one synthetic benchmark does not tell the full story. Although the result points to a massive improvement at 4K resolution (the resolution used by the Extreme preset), the card may perform differently in other benchmarks. For starters, we still do not know anything about the RTX 40 series' ray tracing performance.
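As a quick sanity check on those percentages, the Ampere baselines they imply can be back-calculated from the leaked 19,000-point score. This is a minimal sketch: the baseline scores below are derived purely from the quoted uplifts, not from published 3DMark results.

```python
# Back-calculate the Time Spy Extreme scores implied by the quoted uplifts.
# Only the 19,000 figure comes from the leak; everything else is arithmetic.
leaked_4090 = 19_000
uplifts = {"RTX 3090 Ti": 0.66, "RTX 3090": 0.82}  # 66% / 82% faster

for card, uplift in uplifts.items():
    # If the 4090 is (1 + uplift) times faster, the baseline is score / (1 + uplift).
    implied = leaked_4090 / (1 + uplift)
    print(f"Implied {card} score: ~{implied:,.0f}")
# → Implied RTX 3090 Ti score: ~11,446
# → Implied RTX 3090 score: ~10,440
```

Both implied baselines land in the range typical high-end Ampere cards post in this test, which at least means the quoted percentages are internally consistent with the leaked score.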

NVIDIA is rumored to launch its GeForce RTX 4090 graphics card in October, the GeForce RTX 4080 in November, and the GeForce RTX 4070 in December. The GeForce RTX 4060 will supposedly be unveiled at CES 2023.

The NVIDIA GeForce RTX 4090 is rumored to feature 16,384 CUDA cores, a significant increase over the GeForce RTX 3090 Ti, which features 10,752 CUDA cores.

Source: kopite7kimi (via VideoCardz)


9 comments

  1. Uh-huh!

    I've squashed, over and over, the BS claim (pushed by psycho fans, shills, and tainted analysts just to jack up NVIDIA's share price) that the 4090 will be over 2x the performance of the 3090 (hell, some *lies* even claimed that the 4070 would have 2x the speed of the 3090 😏), but the NVIDIA nuts considered my opinion blasphemy and whatnot. 🤣

    We sold our OG 3090 FE last February for more than we paid for it in 4Q20 and reinstalled our 2080 Ti. We just purchased two 3090 Ti Strix cards for $1,499 apiece a couple of days ago.

    We're good. The 4000/7000 series will be a circus generation, and the only benefit it will bring is the features DELIBERATELY cut from Ampere/RDNA 2, e.g. DisplayPort 2.0.

    You know, the NVIDIA Way! 😏
  2. If only you could vote with your wallet... :rolleyes: :rolleyes:
  3. If only you could vote with your wallet... :rolleyes: :rolleyes:
    We did! Two for the price of one.

    If anything changes with these SKUs in the next few weeks, we are ready to respond.

    But if anyone *thinks* that the 3090 Ti's price will drop below a stack this year, well, good luck with that. 🤣
  4. I find it odd that, first, NVIDIA is the de facto king here. Which, OK, is understandable.

    That said, the general consensus is that AMD has to be both faster and cheaper for "many people" to show any interest or lend any credence to them as a worthwhile contender. And even that oftentimes isn't good enough for them to be considered legitimate competition.

    Yet, Intel... they just need to release some pictures of a card, a couple of mock-up builds at a conference, and some canned benchmarks, and Intel becomes this dark horse that will save the industry with its boundless competition.

    I'm not saying that everyone in this forum is guilty of this; it's just my observation of internet denizens in general. I guess hope just springs eternal in this population of super-optimists.
  5. I find it odd that, first, NVIDIA is the de facto king here. Which, OK, is understandable.

    Yeah, Nvidia has a long history of putting out more solid, reliable products. They also currently have the feature advantage, due to them pushing ray tracing pretty hard. (I personally don't think RT is all that remarkable, but once you convince game makers to include it, every GPU maker has to support it well or they automatically become second tier.)

    When kids max out all the settings in the latest games and they run reasonably on latest-gen Nvidia GPUs but are a slideshow on latest-gen AMD GPUs, that's pretty much guaranteed to give AMD GPUs a poor reputation.

    That said, the general consensus is that AMD has to be both faster and cheaper for "many people" to show any interest or lend any credence to them as a worthwhile contender. And even that oftentimes isn't good enough for them to be considered legitimate competition.

    That's how market perceptions tend to work. As dumb as it may seem, halo products set a company's reputation, and thus sell lower-end products to people who will never be able to afford (or want to spend the money for) a halo product.

    How many "Ultimate Driving Machines" does BMW sell because customers see their M cars and are impressed? Then they go into a dealership and buy a base-model 528i and feel smug because they have the "Ultimate Driving Machine," even though my 25-year-old base-model Volvo wagon out-corners a base 5 Series.

    Halo products sell the entire product line. It's dumb, but it is psychological.

    AMD has spent several generations without any answer to Nvidia's halo products, probably since the launch of the first Titan in 2013. Then, right when they were getting close with the 6900 XT and 6950 XT, Nvidia added RT to the mix and relegated them to second-tier status again.

    If AMD adds functionally usable RT (that doesn't turn most games that use it into slideshows) and can maintain halo-product parity (sometimes leading, sometimes slightly behind) for a few generations in a row, we will see the reputation change and AMD GPUs no longer be considered second tier, just like what happened with their CPUs after the launch of Ryzen.

    Yet, Intel... they just need to release some pictures of a card, a couple of mock-up builds at a conference, and some canned benchmarks, and Intel becomes this dark horse that will save the industry with its boundless competition.

    Yes, Intel entering the market is exciting, not because their products are likely to be particularly class leading, but because they are the first entrant to the market in over 20 years. And of all the corporations out there, I can't think of anyone else who both has the money to throw at product development like this AND the experience of working with silicon to pull it off. I mean, Apple probably could, but they are not likely to sell PC-compatible discrete GPUs any time soon. Same with Google.

    No one is expecting Intel to immediately jump to the forefront of GPU performance, but they can do what neither AMD nor Nvidia can: add another player to the discrete GPU market. That WILL have positive impacts for the consumer. Even if they remain low-tier, entry- to mid-level products, their very existence will improve pricing of entry- to mid-tier GPUs across the board.

    Intel was obviously drawn in this direction by the insane GPU pricing over the last couple of years. Let's see if they keep up the fight and follow through now that demand is softening a little. This generation is mostly sunk cost, so it will probably come out in some form sooner or later, but it remains to be seen whether they can stay motivated to remain in the market when profit margins are no longer as extreme as they have been over the last couple of years.
  6. As an AMD card owner of the current generation, I have no FOMO for RT. Some things I would like to tinker with just are not present on the AMD cards (look at the recent compute task comparison between the AMD 6950 and Nvidia 3090 Ti for an example). At this rate, next generation I'll probably flip back to Nvidia due to some of the stuff I want to do with the cards that isn't specifically gaming focused.

    That is where Nvidia is missing out. People want that **** because schmucks like me are buying condenser microphones and **** like that to mess with, and we want the functions that Nvidia brings in sound correction and background removal and the like.
