Image: NVIDIA

NVIDIA may be prepping an especially monstrous product for PC gamers.

Tweets shared by leaker kopite7kimi today have alluded to “the beast,” a flagship Ada Lovelace product that will feature not only a new board design, the PG137, but also a particularly extreme set of specifications.

These include 18,176 CUDA cores, 48 GB of GDDR6X memory running at 24 Gbps, and an incredible TBP of 800 watts. That figure is nearly double the 450-watt rating NVIDIA lists for its current flagship, the GeForce RTX 3090 Ti.

By comparison, the GeForce RTX 3090 Ti features 10,752 CUDA cores and 24 GB of GDDR6X memory.

Kopite7kimi didn’t have much more to share regarding NVIDIA’s “beast,” but the alleged power demands suggest that the graphics card will require dual 16-pin power connectors. This product may also be one of the chief reasons why NVIDIA decided to develop a new triple-fan cooler design for its GeForce RTX 40 Series, which was teased by kopite7kimi back in June.

In a separate tweet, kopite7kimi has suggested that NVIDIA might be sharing official details relating to the GeForce RTX 4090 “soon.”

NVIDIA’s “beast” is speculated to be a Titan-class graphics card. The company’s most recent Titan product aimed at gamers was the Titan RTX, which launched in December 2018 at $2,499.

Source: kopite7kimi



37 comments

  1. I caught a JayzTwoCents video in which he said, citing one of his sources, that these cards could be delayed. This would not surprise me. Also, 800W is starting to get into space heater territory...
  2. That is truly bonkers. We need someone to make a BIG efficiency leap.
    I think neither major camp has been ignoring efficiency. It's just that we're in that weird spot where the nodes aren't jumping as quickly as they were, so manufacturing delivers smaller gains than it once did.

    That said, the competition hasn't stopped, and our appetite for progress has only gotten larger.

    So the only avenue left is to stack on more cores and crank up the power. Eventually that too will hit limits - monolithic chips will get too big, multichip module interposers will get too complicated, data transfer speeds will eventually start to cap things off... something.

    But yeah, I'm pretty sure if you were to look at stock designs over the last few generations, for a given level of performance, efficiency has gone up - often by a good bit. It's just that performance alone hasn't kept pace with our appetite, so they throw in more cores and push the power limits.
  3. Going to need a PSU just for your GPU...

    Kind of reminds me of that time Intel demoed their 28 core Xeon at Computex 2018...


    ...and neglected to tell anyone it required an industrial chiller to stay cool.

  4. That is truly bonkers. We need someone to make a BIG efficiency leap.
    They get more efficient most generations. What we're looking at is not an efficiency regression - that is, lower performance per watt - but rather an increase in the maximum power-draw limits that accompanies some of the performance gains.

    Rather than an efficiency regression - or even a slowdown in the rate of efficiency improvement - we're just seeing higher-performance options become available for those willing to foot the cost of purchasing and running them.

    For those worried about efficiency or just energy thriftiness, lower-powered models are certainly available :)
  5. GeForce RTX 4090? Titan?
    Great way to turn those unused hydronic baseboard heaters into a central GPU heating system. Not terribly efficient but MOAR FPS!
  6. GeForce RTX 4090? Titan?
    Great way to turn those unused hydronic baseboard heaters into a central GPU heating system. Not terribly efficient but MOAR FPS!
    All of this has happened before, and all of it will probably happen again - with the improvement in both wired and wireless interlinks, for those who need on-site computing power, a move back toward centralized big iron starts making sense. Even at home!

    And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
  7. All of this has happened before, and all of it will probably happen again - with the improvement in both wired and wireless interlinks, for those who need on-site computing power, a move back toward centralized big iron starts making sense. Even at home!

    And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
    I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell. ;)
  8. And why not integrate it into HVAC? If there's a heating need but not a compute need - sell the compute!
    I don't think I posted about it, but I remember reading, about six months ago, about a data center - I think somewhere in the EU - that was plumbed into the local community heating setup.
  9. GeForce RTX 4090? Titan?
    They need to bring the Titan branding back. It was never cheap, which should alleviate some of the confusion about their pricing and performance. I know the x90 cards that predated them were halo products unto themselves and used to feature two GPUs on the same PCB, but the Titan nomenclature always set it apart from the pack. The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NVIDIA has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.
  10. They need to bring the Titan branding back.
    This I agree with; it would have made much more sense for the 3090 / 3090 Ti, and would have highlighted the main driver for the product - the extra VRAM for content creators. Though I do get the 8K angle from a marketing perspective. Don't agree with it, but it was valid.
    The whole stack is just so confusing now between x80, x90, Ti, and Super, and then the absence of the Titan. From Maxwell to now, NVIDIA has really gotten messy with its branding, especially when the refresh cycles kick in toward the end of a product generation.
    Makes it confusing for reviewers and consumers - as well as for competitors and competitor marketing teams.

    Not sure if that really works for Nvidia, but it does allow for some flexibility for the marketing / branding folks when dealing with competition, reception, and production issues.
  11. Nope - serious, if not somewhat far-fetched at the moment.
    Not really - there were companies making 'heaters' for offices that were nothing but compute engines; they would pay you to host them for the benefit of generating heat in colder climates, effectively for free. They even looked like fancy wall-mounted radiators. Not sure if it ever went anywhere. Let's see if my Google-fu can find them...

    No, I can't find it, but it was a thing... dang.
    They get more efficient most generations. What we're looking at is not an efficiency regression - that is, lower performance per watt - but rather an increase in the maximum power-draw limits that accompanies some of the performance gains.

    Rather than an efficiency regression - or even a slowdown in the rate of efficiency improvement - we're just seeing higher-performance options become available for those willing to foot the cost of purchasing and running them.

    For those worried about efficiency or just energy thriftiness, lower-powered models are certainly available :)
    Look, I HAVE the power to spare... it just seems like we need some gains in efficiency... otherwise the power issue is going to go off the rails... if it hasn't already.


    Like, we need someone - be it AMD or NVIDIA or, heaven forbid... Intel - to come along and say:

    Here is our newest top-tier video card. With the current BIOS it will run every triple-A game, even Cyberpunk with RT, at 4K and a 100 Hz refresh. This is a limit we put on the card by design. You can go into the video card BIOS and unlock it for your own purposes, but this card is designed to play any current game at 100 Hz. Oh, and it will only use a 250-watt TDP to do it.

    While sporting a MASSIVE heat sink and power chokes/delivery clearly able to go far beyond that.

    Someone needs to draw a line in the sand... and really, I think targeting that value would be appreciated, especially if you could unlock more power as more demanding games come along. Maybe - oooh - a clock profile PER big game. Hmmm... yeah, I like that.
  13. I sure hope there's an implicit smiley face in there. Sometimes it's hard to tell. ;)
    I remember three or four years ago someone was selling a crypto-mining space heater. It was just a headless PC for the most part - you connected it to Wi-Fi, and it had a thermostat that would start/stop computations around whatever temperature you set.

    I should see if I can find it again.
  14. Look I HAVE the power to spare... it just seems like we need some gains in efficiency... Otherwise the power issue is going to go off the rails... if it hasn't already.
    I think one of two things will happen first.

    First off, there's a hard limit on how much power most people can pull from the wall without hiring an electrician. Most typical US homes run their bedrooms / offices on 15 A, 120 V circuits. Going beyond that will be a big leap - it would require a disclaimer that you can't just plug the card into any old outlet and have it run - so I think, at least for consumer purposes, that represents a ceiling.

    The second will be the heat / noise that a consumer is willing to put up with. An 800 W card, together with a CPU cranking 100-200 W, a 150-250 W monitor, plus anything else in the room... you have what amounts to a space heater. The card will either be very large or very noisy, and it's going to throw off a lot of heat. Enthusiasts may put up with it just so they can post those sweet screenshots of crazy benchmark numbers, but ~most~ consumers won't fool with it.

    Just like most consumers don't drive exotic sports cars, even among those who could afford them - but we all like to drool over them.
  15. I think one of two things will happen first.

    First off, there's a hard limit on how much power most people can pull from the wall without hiring an electrician. Most typical US homes run their bedrooms / offices on 15 A, 120 V circuits. Going beyond that will be a big leap - it would require a disclaimer that you can't just plug the card into any old outlet and have it run - so I think, at least for consumer purposes, that represents a ceiling.

    The second will be the heat / noise that a consumer is willing to put up with. An 800 W card, together with a CPU cranking 100-200 W, a 150-250 W monitor, plus anything else in the room... you have what amounts to a space heater. The card will either be very large or very noisy, and it's going to throw off a lot of heat. Enthusiasts may put up with it just so they can post those sweet screenshots of crazy benchmark numbers, but ~most~ consumers won't fool with it.

    Just like most consumers don't drive exotic sports cars, even among those who could afford them - but we all like to drool over them.

    All of what you say is true, but on the flip side MOST consumers won't be buying the flagship card this story is reporting on.

    I presume their more mid-level parts will have more conventional power draws.
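For what it's worth, the wall-circuit ceiling discussed in the thread can be sketched as a quick back-of-envelope calculation. The component draws, circuit rating, and PSU efficiency below are assumptions pulled from the comments, not measured figures:

```python
# Back-of-envelope check of the wall-circuit argument above.
# Assumptions (illustrative, not measured): a typical US 15 A / 120 V
# branch circuit, the NEC's 80% guideline for continuous loads, the
# component draws tossed around in the thread, and a ~90% efficient PSU.

CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15
CONTINUOUS_DERATE = 0.80   # NEC guideline for continuous loads
PSU_EFFICIENCY = 0.90      # assumed 80 PLUS Gold-ish supply

circuit_limit = CIRCUIT_VOLTS * CIRCUIT_AMPS          # 1800 W absolute
continuous_limit = circuit_limit * CONTINUOUS_DERATE  # 1440 W sustained

# Hypothetical build around the rumored 800 W card
psu_loads_w = {"GPU": 800, "CPU": 200, "rest of system": 100}
monitor_w = 250  # plugged into the wall directly, not through the PSU

# Losses in the PSU inflate what the system side pulls from the outlet
wall_draw = sum(psu_loads_w.values()) / PSU_EFFICIENCY + monitor_w

print(f"Circuit limit:    {circuit_limit} W")
print(f"Continuous limit: {continuous_limit:.0f} W")
print(f"Estimated draw:   {wall_draw:.0f} W")
print("Over the continuous limit!" if wall_draw > continuous_limit
      else "Still under the limit")
```

Under these assumptions the whole setup lands around 1,470 W at the outlet, which already brushes up against the 1,440 W continuous ceiling of a single 15 A circuit - supporting the point that 800 W cards leave very little headroom in a typical room.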
