Bringing 2019 to a close in the GPU space, we have one final video card review for the year: NVIDIA’s GeForce GTX 1650 Super. The last of the company’s mid-generation kicker cards to refresh the lineup in the second half of 2019, the GTX 1650 Super is designed to shore up NVIDIA’s position in the sub-$200 video card market, offering solid performance for 1080p gaming without breaking the bank. At the same time, it’s also NVIDIA’s response to AMD’s new Radeon RX 5500 XT series of cards, which, having landed last week, significantly outperform the original GTX 1650.

Like the other Super cards, these refresh parts serve to shore up NVIDIA’s competitive positioning against AMD’s Radeon RX 5000 series cards, offering improved performance-per-dollar at every tier. NVIDIA has taken some flak for uncompetitive pricing, and not without cause. This goes double for the original GTX 1650, which, although easily the best 75W card on the market, has always been surpassed in value by AMD’s cards – first the last-generation Polaris cards, and now the new Navi cards. So with the GTX 1650 Super, NVIDIA and its partners finally get a chance to rectify this with a more competitive part that, while no longer fitting into the original’s 75W niche, offers better performance all around.

NVIDIA GeForce Specification Comparison
  GTX 1660 GTX 1650 Super GTX 1650 GTX 1050 Ti
CUDA Cores 1408 1280 896 768
ROPs 48 32 32 32
Core Clock 1530MHz 1530MHz 1485MHz 1290MHz
Boost Clock 1785MHz 1725MHz 1665MHz 1392MHz
Memory Clock 8Gbps GDDR5 12Gbps GDDR6 8Gbps GDDR5 7Gbps GDDR5
Memory Bus Width 192-bit 128-bit 128-bit 128-bit
VRAM 6GB 4GB 4GB 4GB
Single Precision Perf. 5 TFLOPS 4.4 TFLOPS 3 TFLOPS 2.1 TFLOPS
TGP 120W 100W 75W 75W
GPU TU116 (284 mm2) TU116 (284 mm2) TU117 (200 mm2) GP107 (132 mm2)
Transistor Count 6.6B 6.6B 4.7B 3.3B
Architecture Turing Turing Turing Pascal
Manufacturing Process TSMC 12nm "FFN" TSMC 12nm "FFN" TSMC 12nm "FFN" Samsung 14nm
Launch Date 03/14/2019 11/22/2019 04/23/2019 10/25/2016
Launch Price $219 $159 $149 $139

From a pure hardware perspective, perhaps the most interesting thing about the GTX 1650 Super is that, unlike the other Super cards, NVIDIA is giving the new Super card a much bigger jump in performance over its predecessor. With a 46% increase in GPU throughput and faster 12Gbps GDDR6 memory, the GTX 1650 Super is much farther ahead of the GTX 1650 than what we saw with October’s GTX 1660 Super launch, relatively speaking.

The single biggest change here is the GPU. While NVIDIA is calling the card a GTX 1650, in practice it’s more like a GTX 1660 LE; NVIDIA has brought in the larger, more powerful TU116 GPU from the GTX 1660 series to fill out this card. There are cost and power consequences to this – the 284mm2 TU116 is a very large chip to be selling in a $159 card – but the payoff is that it gives NVIDIA a lot more SMs and CUDA Cores to work with. Coupled with that is a small bump in clockspeeds, which pushes the on-paper shader/compute throughput numbers up by just over 46%.

Such a large jump in GPU throughput also requires a lot more memory bandwidth to feed the beast. As a result, just like the GTX 1660 Super, the GTX 1650 Super gets the GDDR6 treatment as well. Here NVIDIA is using slightly lower (and lower power) 12Gbps GDDR6, which is attached to the GPU via a neutered 128-bit memory bus. Still, this one change gives the GTX 1650 Super 50% more memory bandwidth than the vanilla GTX 1650, very close to its increase in shader throughput.

Do note, however, that not all aspects of the GPU are being scaled out to the same degree. In particular, the GTX 1650 Super still only has 32 ROPs, with the rest of TU116’s ROPs getting cut off along with its spare memory channels. This means that while the GTX 1650 Super has 46% more shader performance, it only has 4% more ROP throughput for pushing pixels. Counterbalancing this to a degree will be the big jump in memory bandwidth, which helps to keep those 32 ROPs well-fed, but at the end of the day the GPU is getting an uneven increase in resources, and gaming performance gains do reflect this at times.
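For the curious, these paper figures can be sanity-checked with some quick arithmetic. Below is a minimal sketch using only the boost clocks and memory speeds from the spec table above; note that the ~46% shader figure quoted here presumably comes from the rounded spec-sheet TFLOPS (4.4 vs. 3.0), while the raw clock math lands closer to 48%:

```python
# Back-of-the-envelope throughput math for the GTX 1650 vs. GTX 1650 Super,
# using boost clocks and memory speeds from the spec table.

def fp32_tflops(cuda_cores, boost_mhz):
    # 2 FLOPs per CUDA core per clock (fused multiply-add)
    return cuda_cores * boost_mhz * 2 / 1e6

def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    # Per-pin data rate x bus width, converted to GB/s
    return data_rate_gbps * bus_width_bits / 8

gtx1650       = {"flops": fp32_tflops(896, 1665),  "bw": mem_bandwidth_gbs(8, 128)}
gtx1650_super = {"flops": fp32_tflops(1280, 1725), "bw": mem_bandwidth_gbs(12, 128)}

print(f"GTX 1650:        {gtx1650['flops']:.1f} TFLOPS, {gtx1650['bw']:.0f} GB/s")
print(f"GTX 1650 Super:  {gtx1650_super['flops']:.1f} TFLOPS, {gtx1650_super['bw']:.0f} GB/s")
print(f"Shader uplift:   {gtx1650_super['flops'] / gtx1650['flops'] - 1:+.0%}")
print(f"Bandwidth uplift: {gtx1650_super['bw'] / gtx1650['bw'] - 1:+.0%}")
# ROP throughput scales only with clockspeed, since both cards have 32 ROPs
print(f"ROP uplift:      {1725 / 1665 - 1:+.0%}")
```

The last line is the uneven part: with the ROP count held at 32, pixel throughput only grows with the ~4% clockspeed bump, even as shader throughput and memory bandwidth jump by roughly half.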

The drawback to all of this, then, is power consumption. While the original GTX 1650 is a 75 Watt card – making it the fastest thing that can be powered solely by a PCIe slot – the Super-sized card is officially a 100 Watt product. This gives up the original GTX 1650’s unique advantage, and it means builders looking for even faster 75W cards won’t get their wish, but it’s the price to pay for the GTX 1650 Super’s higher performance. Traditionally, NVIDIA has held pretty steadfast at 75W for their xx50 cards, but then again, at the end of the day, despite the name, this is closer to a lightweight GTX 1660 than it is a GTX 1650.

Speaking of hardware features, besides giving NVIDIA a good deal more in the way of GPU resources to play with, the switch from the TU117 GPU to the TU116 GPU also has one other major ramification that some users will want to pay attention to: video encoding. Unlike TU117, which got the last-generation NVENC Volta video encoder block for die space reasons, TU116 gets the full-fat Turing NVENC video encoder block. Turing’s video encode block has been turning a lot of heads for its level of quality – while not archival grade, it’s competitive with x264 Medium – which is important for streamers. This also led to TU117 and the GTX 1650 being a disappointment in some circles, as an otherwise solid video card was made far less useful for video encoding. So with the GTX 1650 Super, NVIDIA is resolving this in a roundabout way, thanks to the use of the more powerful TU116 GPU.

Product Positioning & The Competition

As is always the case in the lower-end segment of NVIDIA’s product stack, the GTX 1650 Super is a pure virtual launch. This means that NVIDIA hasn’t put together a retail reference design, and all of the cards are based on partner designs.

At this point the partners have been shipping TU116- and TU117-based cards for over 8 months, so they have had ample time to hone their GTX 16-series designs. This means they’ve been able to hit the ground running, with existing designs ready for quick modification or straight reuse. The net result is that the newest GTX 1650 Supers look like, and are built like, the GTX 1660 and GTX 1650 cards that preceded them.

Within NVIDIA’s product stack then, the GTX 1650 Super is not a wholesale replacement for the GTX 1650 – that card is still sticking around – but the GTX 1650 Super is going to be the value option for this performance segment. For consumers and OEMs who need 75W cards (for cooling or power reasons), then the $149 GTX 1650 remains the best choice. For everyone else, the GTX 1650 Super offers a whole lot more performance for $10 more. And while it’s not going to be performing on the same level as NVIDIA’s $200+ GTX 1660 cards, the GTX 1650 Super packs enough horsepower that it’s not going to be too far behind.

NVIDIA GeForce 20/16 Series (Turing) Product Stack
RTX 20 Series GTX 16 Series
RTX 2080 Ti GTX 1660 Ti
RTX 2080 Super GTX 1660 Super
RTX 2070 Super GTX 1660
RTX 2060 Super GTX 1650 Super
RTX 2060 GTX 1650

It’s looking outside of NVIDIA’s product stack where we find the real competition for the GTX 1650 Super: AMD’s new Radeon RX 5500 XT series cards, particularly the $169 4GB model. While NVIDIA did not directly call out AMD when first revealing the GTX 1650 Super, the timing – between the RX 5500 series announcement and the RX 5500 XT launch – leaves no doubt in that respect. NVIDIA has been seemingly content to let AMD hold the $150-$200 market with their RX 580/570 cards, but with the latest AMD launch, that passive positioning has come to an end.

As I noted in last week’s RX 5500 XT review, there’s not quite a 1:1 match between Radeon and GeForce parts right now. The 4GB RX 5500 XT is $10 more expensive than the GeForce GTX 1650 Super, and as Ian Cutress noted when we were discussing this article earlier this week, in the sub-$200 market customers are typically buying what they can afford – so even $10 matters in some cases. Still, it’s in NVIDIA’s best interests to meet or beat the RX 5500 XT 4GB on performance, to deny AMD that edge. Meanwhile, NVIDIA generally has the edge on energy efficiency, though it’s no longer the one-sided fight it was against AMD’s Polaris-based RX 500 series cards.

Finally, the wild card factor here is once again going to be gaming bundles. NVIDIA doesn’t offer one, but AMD does. Along with a 3-month trial of Microsoft’s Xbox Game Pass program, the company is bundling the forthcoming “Master Edition” of Monster Hunter: Iceborne. We don’t often see game bundles with sub-$200 cards, so the inclusion of one can be a powerful factor in this segment of the market, since a bundled game represents a more significant fraction of the card’s value.

Though with much of Newegg’s video card selection anything but in stock, just getting a card is a challenge right now. The first wave of GTX 1650 Super cards has done pretty well sales-wise, so the video card retailer has all of two models of GTX 1650 Super in stock, and Amazon is much the same.

Holiday 2019 GPU Pricing Comparison
AMD Price NVIDIA
Radeon RX 5700 $319 GeForce RTX 2060
  $279 GeForce GTX 1660 Ti
  $229 GeForce GTX 1660 Super
Radeon RX 5500 XT 8GB $199/$209 GeForce GTX 1660
Radeon RX 5500 XT 4GB $169/$159 GeForce GTX 1650 Super
  $149 GeForce GTX 1650

67 Comments


  • Korguz - Sunday, December 22, 2019 - link

    why do you think the games will target ps4 ?? is this just your own opinion?? Reply
  • Kangal - Sunday, December 22, 2019 - link

    Because there are a lot of PS4 units hooked up to TVs right now, and they will still be hooked up until 2022. When the PS4 launched, the PS3 was slightly ahead of the Xbox 360, yet sales were nothing like the PS4's. And the PS3 was very outdated back in 2014, whereas in 2020, the PS4 is not nearly as outdated... so there's more longevity in there.

    So with all those factors and history, there's a high probability (certainty?) that Game Publishers will still target the PS4 as their baseline. This is good news for Gaming PC's with only 8GB RAM and 4GB VRAM, and performance below that of a RX 5700. Regardless, it's always easier to upgrade a PC's GPU than it is to upgrade the entire console.

    ...that's why Ryan is not quite right
    Reply
  • Korguz - Sunday, December 22, 2019 - link

    um yea ok sure... and you have numbers to confirm this ?? seems plausible, but also, just personal opinion Reply
  • Kangal - Monday, December 23, 2019 - link

    During the launch of the PS4 back in 2014, the older PS3 was 8 YEARS OLD at the time, and hadn't aged well, but it did commendable sales of 85 Million consoles.

    I was surprised by the Xbox 360 which was 9.5 YEARS OLD, which understandably was more outdated, and it did surprising sales of 75 Million consoles.

    Because both consoles weren't very modern/quite outdated, and marketing was strong, the initial sales of the PS4 and Xbox One were very strong in 2014. Despite this there was about another, 5 Million PS3 and Xbox 360, budget sales made in this period. And it took until Early-2016 for Game Publishers to ditch the PS3 and Xbox 360. So about 1.5 Years, and about 40 Million sales (PS4) or 25 Million sales (Xbox 360) later. During this period people using 2GB VRAM Graphic Cards (GTX 960, AMD R9 370X) were in the clear. Only after 2016 were they really outdated, but it was a simple GPU Swap for most people.

    So that's what happened, that's our history.
    Now let's examine the current/upcoming events!
    The PS4 has sold a whopping 105 Million consoles, and the Xbox One has a commendable 50 Million units sold. These consoles should probably reach 110 Million and 55 Million respectively when the PS5 and Xbox X release. And within 2 years they will probably settle on a total of 120 Million and 60 Million sales total. That's a huge player base for companies to ignore, and is actually better than the previous generation. However, this current gen will have both consoles much less outdated than the previous gen, and it's understandable since both consoles will only be 6 YEARS OLD. So by the end of 2022, it should (will !!) be viable to use a lower-end card, something that "only" has 4GB VRAM such as the RX 5500XT or the GTX 1650-Super. And after that, it's a simple GPU Swap to fix that problem anyway so it's no big deal.

    Ryan thinks these 4GB VRAM cards will be obsolete within 6 Months. He's wrong about the timing. It should take 2 Years, or about x4 as much time. If he or you disagree, that's fine, but I'm going off past behavior and other factors. I will see Ryan in 6 Months and see if he was right or wrong.... if I remember to revisit this article/comment that is : )
    Reply
  • Korguz - Monday, December 23, 2019 - link

    and yet... i know some friends that sold their playstations.. and got xboxes... go figure....
    for game makers to make a game for a console to port it to a comp = a crappy game for the most part.. supreme commander 2, is a prime example of this....
    Reply
  • flyingpants265 - Sunday, December 22, 2019 - link

    Most benchmarks on this site are pretty bad and missing a lot of cards.

    Bench is OK but the recent charts are missing a lot of cards and a lot of tests.

    Pcpartpicker is working on a better version of bench, they've got dozens of PCs running benchmarks, 24/7 year-round, to test every possible combination of hardware and create a comprehensive benchmark list. Kind of an obvious solution, and I'm surprised nobody has bothered to do this for... 20-30 years or longer..
    Reply
  • Korguz - Sunday, December 22, 2019 - link

    hmmmmmm could it be because of, oh, let me guess... cost ????????????????? Reply
  • sheh - Saturday, December 21, 2019 - link

    In the buffer compression tests the 1650S fares worse than both the non-S cards and the 1050 Ti.
    How come?

    Curiously, the 1660S is even worse than the 1650S.
    Reply
  • catavalon21 - Saturday, December 21, 2019 - link

    Guessing it's ratio differences not rated to absolute performance. A more comprehensive chart in BENCH of the INT8 Buffer Compression test shows the 2080Ti with a far lower score than any of the recent mid-range offerings.

    https://www.anandtech.com/bench/GPU19/2690
    Reply
  • catavalon21 - Sunday, December 22, 2019 - link

    * not related to Reply
