Image: NVIDIA

News broke yesterday of NVIDIA's alleged launch schedule for its upcoming GeForce RTX 40 Series, including a supposed CES 2023 unveiling date for a newly mentioned (but predictable) GeForce RTX 4060 graphics card. Prominent leaker @kopite7kimi has now taken to Twitter to share a bit of insight on the lower mid-range GPU, claiming that it will consume more power than the Ampere-based GeForce RTX 3070. If true, this would mean that the GeForce RTX 4060 features a TDP greater than 220 watts, the graphics card power that NVIDIA officially lists for the GeForce RTX 3070. NVIDIA fans who have been following the company's gaming products closely may not be too surprised, however, as the x60-class members of the GeForce lineup have grown more and more power hungry over the past few generations.

  • GeForce RTX 4060: 220W+
  • GeForce RTX 3060: 170W
  • GeForce RTX 2060: 160W
  • GeForce GTX 1060: 120W
  • GeForce GTX 960: 120W
  • GeForce GTX 760: 170W

The AD102 and AD103 GPUs are expected to utilize the same PCB, PG139, though the SKUs will vary slightly: the RTX 4090 uses the PG139-330 SKU, while the RTX 4080 uses PG139-360. The RTX 4070, on the other hand, will be based on a completely different board. The NVIDIA GeForce RTX 4060 will be the first Ada Lovelace card of 2023, being unveiled at CES 2023 and launching a few weeks later. The mainstream card is also expected to utilize the AD106 GPU.

Sources: @kopite7kimi (via VideoCardz), Wccftech



13 comments

  1. My question is: how are they going to target lower tiers? What will be the 'level' of a GPU that can rely entirely off of PCIe slot power like a 1050Ti can?
  2. They'll release a 4030 that'll be gimped to oblivion. Look at the 1630 they are set to release - a 64-bit memory bus is rumored to be in play. If that comes to fruition, then I wouldn't be surprised to see a 4030 built the same way.
  3. nVidia's idea of efficiency = throw MOAR POWAH at it! Sure, it'll run laps around the competition, but the efficiency will blow, and not in a good way!
    Both major companies have teetered in both directions - this is really the first instance in quite some time that we're looking at AMD advancing in both performance and efficiency enough to highlight Nvidia's potential lack thereof, IMO.

    So, I hope they don't do that - I hope they take the low-end segment more seriously than they have been (and than AMD seems to be doing), and hopefully Intel is taking it seriously enough to spur some competition here too.

    It'd be nice for a US$100 video card to actually be worth... US$100.
  4. Oh.. I know AMD is all too guilty of this as well. It's been a LONG time since nVidia has been forced to do it, though, so it just stands out more. AMD's video card selection still hasn't fully caught up, and nVidia is trying to make sure the nail is firmly planted in their coffin.
  5. I think AMD is doing pretty well, IMO. They're behind on RT, but not materially, as their tech works - it just needs more grunt. They've made progress on smart upsampling despite lacking a technological equivalent to DLSS, so the big question in my eyes is ongoing support for video encoding for streaming. But even Intel will beat Nvidia to the punch here if they deliver Arc before the RTX 4000-series GPUs hit.
  6. Over AMD, Nvidia actually has some production uses that AMD is lacking in its cards: known compatibility with various encoders, the whole background sound and video elimination enhancements that you can run for free on RTX cards... stuff like that. As an AMD card owner, that's stuff I would want to tinker with but can't... and though I don't run a webcam on my desktop, I've considered it... if I were running an RTX card. I'd like to see AMD eventually have some parity in those feature spaces as well.
  7. This! AMD needs to step up to the plate with some real encoder support and utilities that make use of the power they've got. I steered clear of their video cards strictly because of this.
  8. Using antiquated GPU IP in their APUs has been at least partly to blame for this - outdated tech with questionable software support is not particularly conducive to drumming up support from the software development community.

    AMD is rectifying this with Zen 3+ (currently mobile-only) and with Zen 4 heading to the desktop next. If the hardware and drivers are there, then we're likely to get more interest from software houses.
  9. Well, the flip side of that is... the tech is mature, and the software ~should~ be plentiful with widespread support.

    The fact that it never really was is, I think, indicative of the problem AMD has.
  10. Even if the tech were perfect (this being pre-RDNA, I can't draw conclusions for current hardware) - and all indications point to it not having been - a lack of sync between APUs and GPUs works against adoption. APUs lagging works against it even more, as we'd expect those to be more plentiful - except that AMD chose not to include graphics capabilities in their main SKUs.

    And we're comparing against Intel, where SKUs without IGPs are the exception, and Nvidia, the market leader and standard bearer - both of which are broadly supported across consumer and professional product ranges.

    Essentially, AMD's tech stood little chance of community acceptance and adoption - outside of the FOSS community, at least. Commercially it was a non-starter, with houses like Blender ignoring them entirely and Adobe not bothering with compatibility testing. The latter has been a point of pain for me personally; Adobe products stopped crashing constantly on the desktop in question when I swapped an RX 460 for what should be an inferior GTX 1050 Ti.

    AMD choosing to put GPUs in their 'uncore' dies is a huge step forward here, as is the release of Ryzen 6000 mobile SKUs using RDNA IP. And with Nvidia pummeling them with feature sets that extend beyond raster performance, and Intel chomping at the bit to consume the bottom half of the discrete GPU market with a similar feature set, AMD seems to be in a 'now or never' position - and they're acting with a material response.
