Image: NVIDIA

NVIDIA CEO Jensen Huang is aware that gamers and other enthusiasts aren’t thrilled with the pricing of the GeForce RTX 40 Series, but they should probably get used to it because prices will only continue to increase, according to remarks that the executive made during a recent conference call Q&A with reporters.

“Is there anything you would like to say to the community regarding pricing on the new generation of parts, as well as, can they expect to see better pricing at some point and basically address all the loud screams that I’m seeing everywhere?” asked PCWorld’s Gordon Mah Ung.

“Moore’s law is dead,” Huang responded. “And the ability for Moore’s Law to deliver twice the performance at the same cost, or the same performance at half the cost every year and a half is over. It’s completely over.”

“And so the idea that the chip is going to go down in cost over time, unfortunately, is a story of the past,” the executive added.

From a PC Gamer report:

To be fair to Nvidia, we are looking at a complete switch in chip supplier—from Samsung to TSMC—between the 30-series and 40-series cards. Jensen points out that “a 12-inch wafer is a lot more expensive today than it was yesterday. And it’s not a little bit more expensive; it is a tonne more expensive.”

Since TSMC’s 4nm production capacity is highly sought after right now, we can only speculate just how much Nvidia is being charged for manufacturing the Ada Lovelace processors. Ultimately, we can’t expect Nvidia to swallow all the extra manufacturing costs if that’s been the case; some of it will inevitably trickle down to the consumer.

NVIDIA announced the GeForce RTX 40 Series this week, revealing new graphics cards in the form of the GeForce RTX 4090 and the GeForce RTX 4080, the latter of which will be available in 16 GB and 12 GB options. The GeForce RTX 4090 will be the first to launch, releasing on October 12 with a starting MSRP of $1,599.



11 comments

  1. Moore's Law may be dead, but that doesn't necessarily mean that consumers will throw more money at it.

    Even Wall Street is worried nVidia screwed this one up -- if these cards are just too expensive, then rather than sucking it up and paying more (which is the message Jensen is sending with his comments), consumers may just wait until something becomes affordable.

    For a company that plays more to Wall Street than to its consumers, Jensen screwed this one up royally. He tried to assuage shareholders by reassuring them that nVidia had set the right price despite customer sentiment, but Wall Street appears to be reading between the lines, and the stock has dropped significantly since the announcement, although most analysts remain bullish on the overall nVidia outlook.

  2. Well, discrete GPUs are now priced in multiples of current-gen consoles... so the price argument is sus. The PS5 and Xbox are plenty capable... and they include graphics, a CPU, a power supply, memory, a motherboard, a fast SSD, a case, and a controller. Sure, Nvidia's parts are top-of-the-line silicon, but I still think they are squeezing the market a bit much. I wonder whether PC gaming is still growing or shrinking. I am thinking in units, not revenue, though even in units, who knows what is crypto and what isn't... just die already, crypto. Is there any reliable way of measuring the trajectory of the PC gaming community? Steam, maybe?
  3. With energy prices in Europe being what they are (i.e., 12x or even more), I see little profit in mining over here, so there is that silver lining.

    Of course, people are having to choose between heating and eating, so that's a lot less interesting (the news quoted a family who paid €50 a month for gas last year, had to pay €530 to settle the year, which is not uncommon, and are now paying €1,500 a month). So sure, we're looking at €1,500 for a 4080 16GB that, core-wise, is two-thirds of what it should be, at double the price of the previous gen. Suuure.
  4. Moore's Law is dead (in its traditionally misinterpreted form*), and indeed, manufacturing and supplier pricing has gone up, both due to the pandemic and due to the difficulty of manufacturing silicon chips at ever-smaller gate sizes, with their inevitable current-leakage problems.

    That said, this had already been factored in. ~9xx gen pricing adjusted for inflation (which helps account for the cost increases due to the pandemic) should be reflective of actual current costs. Anything above that is really just chip makers manipulating things to try to get higher profit margins.

    So, the 4090 should be at ~Titan pricing of ~$1,199,
    the 16GB 4080 should be at about xx80 pricing of ~$749,
    and the 12GB 4080 should be at ~xx70 pricing of ~$450 (a rough version of this inflation math is sketched after the thread).

    Any amount above that is "just because they can and they want the money".

    Both the GPU and CPU industries are broken. A market needs 3-5 viable competitors to work. We had that in the early 2000s. We don't today.

    As I've said before, these ****ers need a Microsoft-style date with the DOJ.

    *Moore's actual observation was about the number of transistors doubling (roughly every two years, in its revised form), not about performance increasing at that rate, so it technically mostly still stands thanks to the addition of more cores and other things.
  5. *Moore's actual observation was about the number of transistors doubling (roughly every two years, in its revised form), not about performance increasing at that rate, so it technically mostly still stands thanks to the addition of more cores and other things.
    Generally speaking, the more paths in a core, the more transistors it will have. There is some fluctuation, with modern instruction sets reducing the number needed to cover a wider gamut. The real key here is flexible instruction-set programming and hardware, and that is where GPUs shine: thousands of cores that are, in essence, programmable.

    So if your program only calls a subset of instructions, you can run it through the GPU, where you get to specify on the fly how many cores execute which instruction on the general compute units, and you get better performance. Clearing that hurdle is what has led to modern GPUs being so good at so many tasks.

    It's bonkers that a video card costs as much as a mid-range laptop. That will always be bonkers.
  6. Moore's first law never said anything about price though. It's strictly about technical ability.

    In fact, there was a "second" law (also called Rock's Law) that did deal with price -- the price to build the fab for manufacturing doubles with every other generation (every 3-4 years).

    I don't know if the first law is done or not - it's very obvious that performance gains have slowed, but process gains continue to march on (at least for everyone except Intel). But 1/2 of an already small number is a small number, so the magnitude of those gains goes down each generation... and engineers are figuring out clever ways around strict process-node improvements: chiplets, etc.

    Now, maybe we have hit the intersection of those two laws - where we haven't hit a dead end on the technical side, but it's just gotten to the point that we don't have enough volume / economy to make it economical.
  7. Moore's law never said anything about price though. It's strictly about technical ability.

    In fact, there was a "second" law (also called Rock's Law) that did deal with price -- the price to build the fab for manufacturing doubles with every other generation (every 3-4 years).

    I don't know if the first law is done or not - it's very obvious that performance gains have slowed, but process gains continue to march on (at least for everyone except Intel). But 1/2 of an already small number is a small number, so the magnitude of those gains goes down each generation... and engineers are figuring out clever ways around strict process-node improvements: chiplets, etc.

    Now, maybe we have hit the intersection of those two laws - where we haven't hit a dead end on the technical side, but it's just gotten to the point that we don't have enough volume / economy to make it economical.
    That's a good point. Exactly how much compute and GPU power does modern gaming need? If we max out 4K at 120 Hz for the most demanding of games... short of scientific modeling, what would need that kind of compute power running locally?

    We all know AI subscription services will be a thing, but the cost of the compute needed to drive AI will keep it in hosted solutions as opposed to something running locally... (short of those of us who will get into that arena and write our own).
  8. That's a good point. Exactly how much compute and GPU power does modern gaming need? If we max out 4K at 120 Hz for the most demanding of games... short of scientific modeling, what would need that kind of compute power running locally?

    Game development has an insatiable appetite for GPU and CPU capacity. If you are drastically outperforming current games at 4K120, just wait; that won't be the case for long. Next-gen games will adapt and crank up the quality (or whatever) knob.
  9. Game development has an insatiable appetite for GPU and CPU capacity. If you are drastically outperforming current games at 4K120, just wait; that won't be the case for long. Next-gen games will adapt and crank up the quality (or whatever) knob.
    Unfortunately this is true. Even if you do get to the point of photographic fidelity - there's still the curious fact that given sufficient resources programmers will just get lazy and start cranking out inefficient code.
  10. Unfortunately this is true. Even if you do get to the point of photographic fidelity - there's still the curious fact that given sufficient resources programmers will just get lazy and start cranking out inefficient code.

    Yep. We aren't quite there yet for games, but this is happening all across the software industry.

    The conclusion is that high-level, low-effort languages and tools save a metric ton of money in programmer hours, testing, and other QA.

    As the saying goes, programmers are expensive, CPU cycles and RAM are relatively cheap.

    On the one hand, I find this kind of offends my old-school sensibilities about doing the most with limited hardware.

    I used to participate in the "Demo Scene" back in the day, when groups competed over who could make the most impressive real-time audio-visual demonstration on a fixed piece of hardware. (This is where MadOnion and later Futuremark came from as well, born out of the Finnish demo group Future Crew.) Picture a convention center or sports stadium filled with rows of tables, like a LAN party, but no one is playing games. Everyone is coding or doing artwork for the demo competition at the end, with participants voting for the winner, and the winner taking the pot.

    On the other hand, however, this probably allows a ton of software to be made and enjoyed by users that would otherwise be too costly to bring to market.

    So it's a tradeoff.

    Some indie games already do this. You don't need a 16-core CPU or an RTX 4090 to play a sprite-based platformer.

    I think we are pretty far from this point in AAA titles though.
  11. Unfortunately this is true. Even if you do get to the point of photographic fidelity - there's still the curious fact that given sufficient resources programmers will just get lazy and start cranking out inefficient code.
    It’s not really that it’s lazy development. There are a few spins on the phrase, but in general, “the last 20% takes 80% of the time” holds true in my experience working as a performance engineer. And at least with my co-workers at the Fortune 100 companies I’ve worked for, the stuff each individual developer does isn’t generally where the problems lie. It’s when you have multiple teams integrating code with different objectives (e.g., the security ops team injects a feature or step that the application does not integrate well with) that problems pop up, and making it all work well takes significant effort. If brute-forcing the integration works, it can be worth it to save 1,000 hours of developer time at the cost of 3 fps (or whatever).
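For reference, here is a rough, back-of-the-envelope version of the inflation argument from comment 4, written as a short Python sketch. The GTX 900-series launch MSRPs, the ~25% cumulative 2014-2022 inflation factor, and the announced RTX 40-series prices used for comparison are assumptions for illustration, not figures taken from the article above.

```python
# Sketch of the "900-series MSRP adjusted for inflation" argument.
# All numbers below are assumptions for illustration: approximate
# 900-series launch MSRPs, an assumed ~25% cumulative inflation
# factor (late 2014 to late 2022), and the announced 40-series prices.

INFLATION_2014_TO_2022 = 1.25  # assumed cumulative inflation factor

tiers = {
    # tier: (assumed 900-series launch MSRP, announced 40-series MSRP)
    "top die (Titan X -> RTX 4090)":   (999, 1599),
    "xx80 (GTX 980 -> RTX 4080 16GB)": (549, 1199),
    "xx70 (GTX 970 -> RTX 4080 12GB)": (329, 899),
}

for tier, (old_msrp, new_msrp) in tiers.items():
    adjusted = old_msrp * INFLATION_2014_TO_2022
    premium = new_msrp - adjusted
    print(f"{tier}: ${old_msrp} then is roughly ${adjusted:.0f} today; "
          f"announced at ${new_msrp} (about ${premium:.0f} above a pure inflation adjustment)")
```

Under those assumptions, each announced price lands a few hundred dollars above a straight inflation adjustment of its 900-series counterpart, which is roughly the gap the comment is pointing at.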
