In the closing months of 2018, NVIDIA finally released the long-awaited successor to the Pascal-based GeForce GTX 10 series: the GeForce RTX 20 series of video cards. Built on the company's new Turing architecture, these GPUs represented NVIDIA's biggest architectural update in at least half a decade, leaving almost no part of the design untouched.

So far we’ve looked at the GeForce RTX 2080 Ti, RTX 2080, and RTX 2070 – and along with the highlights of Turing, we’ve seen that the GeForce RTX 20 series is designed on a hardware and software level to enable realtime raytracing and other new specialized features for games. While the xx70 part is traditionally NVIDIA's value-oriented enthusiast offering, the higher price tags this time around meant that even the RTX 2070 came in at $500 and was not especially value-oriented. Instead, it would seem that the role of the enthusiast value offering is going to fall to the next member in line of the GeForce RTX 20 family. And that part is coming next week.

Launching next Tuesday, January 15th is the 4th member of the GeForce RTX family: the GeForce RTX 2060 (6GB). Based on a cut-down version of the same TU106 GPU that's in the RTX 2070, this new part shaves off some of RTX 2070's performance, but also a good deal of its price tag in the process. And for this launch, like the other RTX cards last year, NVIDIA is taking part by releasing their own GeForce RTX 2060 Founders Edition card, which we are taking a look at today.

NVIDIA GeForce Specification Comparison

|                        | RTX 2060 Founders Edition | GTX 1060 6GB (GDDR5) | GTX 1070 (GDDR5) | RTX 2070 |
|------------------------|---------------------------|----------------------|------------------|----------|
| CUDA Cores             | 1920 | 1280 | 1920 | 2304 |
| ROPs                   | 48 | 48 | 64 | 64 |
| Core Clock             | 1365MHz | 1506MHz | 1506MHz | 1410MHz |
| Boost Clock            | 1680MHz | 1709MHz | 1683MHz | 1620MHz (FE: 1710MHz) |
| Memory Clock           | 14Gbps GDDR6 | 8Gbps GDDR5 | 8Gbps GDDR5 | 14Gbps GDDR6 |
| Memory Bus Width       | 192-bit | 192-bit | 256-bit | 256-bit |
| VRAM                   | 6GB | 6GB | 8GB | 8GB |
| Single Precision Perf. | 6.5 TFLOPS | 4.4 TFLOPS | 6.5 TFLOPS | 7.5 TFLOPS (FE: 7.9 TFLOPS) |
| "RTX-OPS"              | 37T | N/A | N/A | 45T |
| SLI Support            | No | No | Yes | No |
| TDP                    | 160W | 120W | 150W | 175W (FE: 185W) |
| GPU                    | TU106 | GP106 | GP104 | TU106 |
| Transistor Count       | 10.8B | 4.4B | 7.2B | 10.8B |
| Architecture           | Turing | Pascal | Pascal | Turing |
| Manufacturing Process  | TSMC 12nm "FFN" | TSMC 16nm | TSMC 16nm | TSMC 12nm "FFN" |
| Launch Date            | 1/15/2019 | 7/19/2016 | 6/10/2016 | 10/17/2018 |
| Launch Price           | $349 | MSRP: $249, FE: $299 | MSRP: $379, FE: $449 | MSRP: $499, FE: $599 |

Like its older siblings, the GeForce RTX 2060 (6GB) comes in at a higher price-point than previous generations: at $349, it is a far cry from the GeForce GTX 1060 6GB’s $299 Founders Edition and $249 MSRP split, let alone the GeForce GTX 960’s $199. At the same time, it still features Turing RT cores and tensor cores, making it a new entry point for those interested in utilizing GeForce RTX platform features such as realtime raytracing.

Diving into the specs and numbers, the GeForce RTX 2060 sports 1920 CUDA cores, meaning we’re looking at a 30 SM configuration, versus RTX 2070’s 36 SMs. As the core architecture of Turing is designed to scale with the number of SMs, this means that all of the core compute features are being scaled down similarly, so the 17% drop in SMs means a 17% drop in the RT Core count, a 17% drop in the tensor core count, a 17% drop in the texture unit count, a 17% drop in L0/L1 caches, etc.

Unsurprisingly, clockspeeds are going to be very close to NVIDIA’s other TU106 card, the RTX 2070. The base clockspeed is down a bit to 1365MHz, but the boost clock is up a bit to 1680MHz. So on the whole, the RTX 2060 is poised to deliver around 87% of the RTX 2070’s compute/RT/texture performance, which is an uncharacteristically small gap between an xx70 card and an xx60 card. In other words, the RTX 2060 is in a good position to punch above its weight in compute/shading performance.
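For anyone who wants to sanity-check that figure, the napkin math is simple. Below is a minimal sketch (in Python, using only the numbers from the specification table above, and assuming the standard 2 FP32 operations per CUDA core per clock); it illustrates how the theoretical ratios fall out, rather than serving as any kind of performance model.

```python
# Rough throughput scaling from the spec table above; real-world performance
# will also depend on memory bandwidth, ROPs, and actual sustained clocks.
rtx_2060 = {"cuda_cores": 1920, "sms": 30, "boost_mhz": 1680}
rtx_2070 = {"cuda_cores": 2304, "sms": 36, "boost_mhz": 1620}

def fp32_tflops(gpu):
    # 2 FP32 ops (one FMA) per CUDA core per clock
    return gpu["cuda_cores"] * 2 * gpu["boost_mhz"] * 1e6 / 1e12

ratio = (rtx_2060["sms"] * rtx_2060["boost_mhz"]) / (rtx_2070["sms"] * rtx_2070["boost_mhz"])

print(f"RTX 2060: {fp32_tflops(rtx_2060):.1f} TFLOPS")  # ~6.5 TFLOPS
print(f"RTX 2070: {fp32_tflops(rtx_2070):.1f} TFLOPS")  # ~7.5 TFLOPS
print(f"SM x clock ratio: {ratio:.0%}")                 # ~86%, i.e. roughly the 87% figure above
```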

However, the backend has taken a bigger trim, and in workloads that aren’t pure compute, the drop will be a bit harder. The card is shipping with just 6GB of GDDR6 VRAM, as opposed to 8GB on its bigger brother. To get there, NVIDIA is not populating 2 of TU106’s 8 memory controllers, resulting in a 192-bit memory bus; so even with the same 14Gbps GDDR6, the RTX 2060 only offers 75% of the memory bandwidth of the RTX 2070. Or to put this in numbers, the RTX 2060 will offer 336GB/sec of bandwidth to the RTX 2070’s 448GB/sec.

And since the memory controllers, ROPs, and L2 cache are all tied together very closely in NVIDIA’s architecture, this means that ROP throughput and the amount of L2 cache are also being shaved by 25%. So for graphics workloads the practical performance drop is going to be greater than the 13% mark for compute throughput, but also generally less than the 25% mark for ROP/memory throughput.
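As a quick check on the bandwidth side of that math, here is the same kind of back-of-the-envelope sketch (again built only from the spec-table figures, assuming nothing beyond per-pin data rate and bus width):

```python
# Peak memory bandwidth: per-pin data rate (Gb/s) x bus width (bits) / 8 bits per byte
def mem_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    return data_rate_gbps * bus_width_bits / 8

rtx_2060_bw = mem_bandwidth_gbs(14, 192)  # 336.0 GB/s
rtx_2070_bw = mem_bandwidth_gbs(14, 256)  # 448.0 GB/s

print(f"{rtx_2060_bw:.0f} GB/s vs {rtx_2070_bw:.0f} GB/s ({rtx_2060_bw / rtx_2070_bw:.0%})")
# -> 336 GB/s vs 448 GB/s (75%)
```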

Speaking of video memory, NVIDIA has simply called this card the RTX 2060, but early indications are that there will be other RTX 2060 configurations with less VRAM, and possibly fewer CUDA cores and other hardware resources. Hence, it seems forward-looking to refer to the product in this article as the RTX 2060 (6GB). As you might recall, the GTX 1060 6GB launched as just the ‘GTX 1060’ and appeared as such in our launch review, right up until the release of the ‘GTX 1060 3GB’ a month later – a name that gives no indication of that card’s cut-down GPU configuration, which goes beyond the smaller frame buffer. Combined with the GTX 1060 naming shenanigans that followed, the GTX 1050 variants, and AMD’s own Polaris renames, it seems prudent to make this clarification now in the interest of future accuracy and consumer awareness.

NVIDIA GTX 1060 Variants Specification Comparison

|                  | GTX 1060 6GB | GTX 1060 6GB (9Gbps) | GTX 1060 6GB (GDDR5X) | GTX 1060 5GB (Regional) | GTX 1060 3GB |
|------------------|--------------|----------------------|-----------------------|-------------------------|--------------|
| CUDA Cores       | 1280 | 1280 | 1280 | 1280 | 1152 |
| Texture Units    | 80 | 80 | 80 | 80 | 72 |
| ROPs             | 48 | 48 | 48 | 40 | 48 |
| Core Clock       | 1506MHz | 1506MHz | 1506MHz | 1506MHz | 1506MHz |
| Boost Clock      | 1708MHz | 1708MHz | 1708MHz | 1708MHz | 1708MHz |
| Memory Clock     | 8Gbps GDDR5 | 9Gbps GDDR5 | 8Gbps GDDR5X | 8Gbps GDDR5 | 8Gbps GDDR5 |
| Memory Bus Width | 192-bit | 192-bit | 192-bit | 160-bit | 192-bit |
| VRAM             | 6GB | 6GB | 6GB | 5GB | 3GB |
| TDP              | 120W | 120W | 120W | 120W | 120W |
| GPU              | GP106 | GP106 | GP104* | GP106 | GP106 |
| Launch Date      | 7/19/2016 | Q2 2017 | Q3 2018 | Q3 2018 | 8/18/2016 |

Moving on, NVIDIA is rating the RTX 2060 for a TDP of 160W. This is down from the RTX 2070, but only slightly, as that card is rated for 175W (185W for the Founders Edition). Cut-down GPUs have limited options for reducing their power consumption, so it’s not unusual to see a card like this rated to draw almost as much power as its full-fledged counterpart.

All in all, the GeForce RTX 2060 (6GB) is quite the interesting card, as the value-enthusiast segment tends to be more attuned to price and power consumption than the performance-enthusiast segment. Additionally, as a value-enthusiast card and potential upgrade option it will also need to perform well on a wide range of older and newer games – in other words, on traditional rasterization performance rather than hybrid rendering performance.

Meanwhile, when it comes to evaluating the RTX 2060 itself, measuring generalizable hybrid rendering performance remains difficult. DXR, which is tied to the Windows 10 October 2018 Update (1809), has only recently rolled out. 3DMark’s DXR benchmark, Port Royal, is due on January 8th, while Battlefield V remains the sole title with realtime raytracing for the moment, and optimization efforts there are ongoing, as seen in recent driver releases. Meanwhile, it seems that some of Turing's other advanced shader features (Variable Rate Shading) are currently only available in Wolfenstein II.

Of course, RTX support for a number of titles has been announced and many are due this year, but there is no centralized resource to keep track of availability. It’s true that developers are ultimately responsible for this information and their games, but on the flipside, enabling these features has required very close cooperation between NVIDIA and developers for quite some time. In the end, RTX is a technology platform spearheaded by NVIDIA and inextricably linked to their hardware, so leaving potential RTX 20 series owners to research and collate for themselves which current games can make use of the specialized hardware features they purchased works to their detriment.

Planned NVIDIA Turing Feature Support for Games

| Game | Real Time Raytracing | Deep Learning Supersampling (DLSS) | Turing Advanced Shading |
|------|----------------------|------------------------------------|-------------------------|
| Anthem | | Yes | |
| Ark: Survival Evolved | | Yes | |
| Assetto Corsa Competizione | Yes | | |
| Atomic Heart | Yes | Yes | |
| Battlefield V | Yes (available) | Yes | |
| Control | Yes | | |
| Dauntless | | Yes | |
| Darksiders III | | Yes | |
| Deliver Us The Moon: Fortuna | | Yes | |
| Enlisted | Yes | | |
| Fear The Wolves | | Yes | |
| Final Fantasy XV | | Yes (available in standalone benchmark) | |
| Fractured Lands | | Yes | |
| Hellblade: Senua's Sacrifice | | Yes | |
| Hitman 2 | | Yes | |
| In Death | | | Yes |
| Islands of Nyne | | Yes | |
| Justice | Yes | Yes | |
| JX3 | Yes | Yes | |
| KINETIK | | Yes | |
| MechWarrior 5: Mercenaries | Yes | Yes | |
| Metro Exodus | Yes | | |
| Outpost Zero | | Yes | |
| Overkill's The Walking Dead | | Yes | |
| PlayerUnknown Battlegrounds | | Yes | |
| ProjectDH | Yes | | |
| Remnant: From the Ashes | | Yes | |
| SCUM | | Yes | |
| Serious Sam 4: Planet Badass | | Yes | |
| Shadow of the Tomb Raider | Yes | | |
| Stormdivers | | Yes | |
| The Forge Arena | | Yes | |
| We Happy Few | | Yes | |
| Wolfenstein II | | | Yes, Variable Shading (available) |

So the RTX 2060 (6GB) is in a better situation than the RTX 2070. With comparative GTX 10 series products either very low on stock (GTX 1080, GTX 1070) or at higher prices (GTX 1070 Ti), there’s less potential for sales cannibalization. And as Ryan mentioned in the AnandTech 2018 retrospective on GPUs, with leftover Pascal inventory due to the cryptocurrency bubble, there’s much less pressure to sell Turing GPUs at lower prices. So the RTX 2060 leaves the existing GTX 1060 6GB (1280 cores) and 3GB (1152 cores) with breathing room. That being said, $350 is far from the usual ‘mainstream’ price-point, and even more expensive than the popular $329 enthusiast-class GTX 970.

Across the aisle, AMD has the recent Radeon RX 590 in the mix, though its direct competition is the GTX 1060 6GB. Otherwise, the Radeon RX Vega 56 is likely the closer matchup for the RTX 2060 in terms of performance. Even then, AMD and its partners are going to have little choice here: either they drop prices to accommodate the introduction of the RTX 2060, or they essentially wind down Vega sales.

Unfortunately we've not had the card in for testing as long as we would've liked, but regardless, RTX platform performance testing is in the same situation it was in at the RTX 2070 launch. Because the technology is still in its early days, we can’t accurately determine the performance suitability of the RTX 2060 (6GB) as an entry point for the RTX platform. So the same caveats apply to gamers considering taking the plunge.

Q1 2019 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|-----|-------|--------|
| Radeon RX Vega 56 | $499 | GeForce RTX 2070 |
| | $449 | GeForce GTX 1070 Ti |
| | $349 | GeForce RTX 2060 (6GB) |
| | $335 | GeForce GTX 1070 |
| Radeon RX 590 | $279 | |
| | $249 | GeForce GTX 1060 6GB (1280 cores) |
| Radeon RX 580 (8GB) | $200/$209 | GeForce GTX 1060 3GB (1152 cores) |
Comments
  • Bluescreendeath - Monday, January 7, 2019 - link

    And take a look at the article's 4K benchmarks. The 2060 6GB performs equal to or better than the Vega 64 or GTX 1070 GPUs with 8GB of VRAM, even at 4K resolution. You are overly hung up on paper specs and completely missing what is actually important - real world performance. The 2060 clearly has sufficient VRAM, as it performs better than its closest-priced competitors and spanks the RX 580 8GB by 50-60%.
  • ryrynz - Tuesday, January 8, 2019 - link

    This guy gets it.
  • TheJian - Tuesday, January 8, 2019 - link

    ROFL. AMD is aiming first 7nm cards at GTX 1080/1070ti perf. This is not going to kill NV's cash cows, which 80% of NET INCOME comes from ABOVE $300. IE, 2060 on up and workstation/server cards at 80% of NV's INCOME (currently at ~3B a year, AMD <350mil).

    NV is one of the companies listed as having already made LARGE orders at TSMC, along with Apple, AMD etc. The only reason AMD is first (July, MAYBE, So NV clear sailing for 6 more months on 2060+ pricing and NET INCOME), is Nvidia is waiting for price to come down so they can put out a LARGER die (you know brute force you mentioned), AND perhaps more importantly their 12nm 2080ti will beat the snot out of AMD 7nm for over a year as AMD is going small with 7nm because it's NEW. NV asked TSMC to make 12nm special for them...LOL, as you can do that with 3B net INCOME. It worked as it smacked around AMD 14nm chips and looks like they'll be fine vs. 7nm that is SMALL die sizes. If you are NOT competing above $300 you'll remain broke vs. Intel/NV.
    https://www.extremetech.com/computing/283241-nvidi...
    As he says, not likely NV will allow more than 6-12 on a new node without a response from NV, though if you’re not battling above $300, no point for an NV response as not much of their money is under $250.
    “If Navi is a midrange play that tops out at $200-$300, Nvidia may not care enough to respond with a new top-to-bottom architectural refresh. If Navi does prove to be competitive against the RTX 2070 at a lower price point, Nvidia could simply respond by cutting Turing prices while holding a next-generation architecture in reserve precisely so it can re-establish the higher prices it so obviously finds preferable. This is particularly true if Navi lacks GPU ray tracing and Nvidia can convince consumers that ray tracing is a technology worth investing in.”
    AGREED. No response needed if you are NOT chasing my cash cows. Again, just a price cut needed even if NAVI is really good (still missing RT+DLSS) and then right back to higher prices with 7nm next gen Q1 2020? This is how AMD should operate, but management doesn’t seem to get chasing RICH people instead of poor.

    BRUTE FORCE=LARGER than your enemy, and beating them to death with it (NV does this, reticle limits hit regularly). NV 12nm is 445mm^2, AMD 7nm looks like 1/2 that. The process doesn't make up that much, so no win for AMD vs. top end NV stuff obviously and they are not even claiming that with 1080 GTX perf...LOL. 10-15% better than Vega64 or 1080 (must prove this at $249)? Whatever you're smoking, please pass it. ;)

    Raytracing+DLSS on RTX 2060 is 88fps. Without both 90fps. I call that USEFUL, not useless. I call it FREE raytracing.
    https://wccftech.com/amd-rx-3080-3070-3060-navi-gp...
    Not impressed for no RT/DLSS tech likely for another year as AMD said they won't bother for now. This card doesn't compete vs ANY RTX card, as they come with RT+DLSS as NV shows releasing new 10 series cards, probably to justify the RTX pricing too (showing you AMD is in a different rung by rehashing 10 series). Which as you see below DLSS+RT massively boosts perf so it's free. It's clear using tensor cores to boost RT is pretty great.

    https://wccftech.com/nvidia-geforce-rtx-2060-offic...
    Now that is impressive. Beats my 1070ti handily, looks better while doing it, and shaves off 20-30w from 180. That will be 120w with 7nm and likely faster and BEFORE xmas if AMD 7nm is anything worth talking about on gpu (cpu great, gpu aimed too low, like xbox1/ps4 needing another rev or two...LOL). 5Grays/s (none for AMD, this year) which is plenty for 1080p as shown. It's almost like getting a monitor upgrade for free (1080p DLSS+RT is 1440p or better looking with no perf loss). What do you think happens with games DESIGNED for RT/DLSS instead of patched in like BF5? LOL. These games will be faster than BF5 unless designed poorly right? It only took 3 weeks to up perf 50% in a game NOT made for the tech (due to a bug in the hardware EA found I guess as noted in their vid - NV fixed, and bam, perf up 50%). 6.5TF, which even without RT+DLSS is looking tough for AMD 7nm currently based on announcements above. AMD will be 150w for 3080 it seems, vs. NV 150-160w 12nm 2060 etc. Good luck, no wiping away NV here.

    "If you’re having a hard time believing this, don’t worry, you’re not alone because it does sound unbelievable."
    Yep. I agree, but we'll see. Not too hard to understand why AMD would have to price down, as it has no RT+DLSS. You get more from NV, so it's likely higher if rumor pricing on AMD is correct.

    Calling 60% barely faster is just WRONG. Categorically WRONG. Maybe you could say that at 10%, heck I'll give 15% to you too. But at 60% faster, you're not even on the same playing field now. RTX cards (2060+ all models total) will sell MILLIONS in the next year and likely have already sold a million (millions? All of xmas sales) with 2080/2080ti/2070 out for a while. It only takes 10mil for a dev to start making games for consoles, and I’m guessing it’s the same roughly for a PC technology (at 7nm they’ll ALL be RTX probably to push the tech). The first run of Titans was 100k and sold out in an F5 olympics session...LOL. Heck the 2nd run did the same IIRC and NV said they couldn’t make them as fast as they sold them. I'm pretty sure volume on a 2060 card is MUCH higher than Titan runs (500k? A million run for midrange?). I think we’re done here, as you have no data to back up your points ;) Heck we can’t even tell what NV is doing on 7nm yet, as it’s all just RUMOR as the extremetech article shows. Also, I HIGHLY doubt AMD will decimate the new 590 with prices like WCCFTECH rumored. The 3070 price would pretty much KILL all 590 sales if $199. Again, I don’t believe these prices, but we’ll see if AMD is this dumb. NONE of these “rumors” look like they are AMD slides etc. Just words on a page. I think they’ll all be $50 or more higher than rumored.
  • Bp_968 - Tuesday, January 8, 2019 - link

    I think you're confused about DLSS. DLSS is fancy upscaling. It won't make a 1080p image look like 1440p, it will make a 720p image "sorta" look like a 1080p image (does it even work in 1080p now? It was originally only available on 1440p and 4k, for obvious reasons.).

    The 2060 is the best value 20 series card so far, but compared to historical values it's absolute garbage. In the $350 price bracket you had the 4GB 970, then the 8GB 1070/1070 Ti, and now the *6GB* 2060. The 1070 and 1070 Ti were a huge improvement over the equivalently priced 970, while the 2060 (in the same price bracket) is a couple of percent better (if that) and at a 2GB memory deficit!

    Nvidia shoveled an enterprise design onto gamers to try and recover some R&D costs from gamers itching for a new generation card. There are already rumors of a 2020 release of an entirely new design based on 7nm. This seems quite plausible considering they are going to be facing new 7nm cards from AMD and *something* from Intel in 2020. And underestimating Intel has always been a very, very dangerous thing to do.
  • D. Lister - Tuesday, January 8, 2019 - link

    "DLSS is fancy upscaling. It won't make a 1080p image look like 1440p, it will make a 720p image "sorta" look like a 1080p image (does it even work in 1080p now? It was originally only available on 1440p and 4k, for obvious reasons.)."

    Em no, it's not upscaling, but rather the opposite.

    https://www.digitaltrends.com/computing/everything...
    "DLSS also leverages some form of super-sampling to arrive at its eventual image. That involves rendering content at a higher resolution than it was originally intended for and using that information to create a better-looking image. But super-sampling typically results in a big performance hit because you’re forcing your graphics card to do a lot more work. DLSS however, appears to actually improve performance."

    That is where the tensor AI comes in. It is trained in-house by Nvidia on the game, with higher-res images (up to 64X resolution), which the AI then uses to change the image you see by effectively applying supersampling (aka DOWNscaling).

    Example:
    If object A looks like A1 at 1080p, which looks like A2 if downsampled from resolution(n*1080p)
    If object B looks like B1 at 1080p, which looks like B2 if downsampled from resolution(n*1080p)
    If object C looks like C1 at 1080p, which looks like C2 if downsampled from resolution(n*1080p)

    Then a scene at 1080p which has objects A, B and C which, without DLSS, would look like A1B1C1, would look like A2B2C2 with DLSS.
  • Gastec - Saturday, January 19, 2019 - link

    Intel will release low-end cards, from a gaming point of view, considering 1440p as the new standard. Nvidia will still have the high end and they will raise the prices of their high-performance cards even further. I see nothing good in the near future for our pockets. The wealthy trolls will be even more aggressive.
  • ryrynz - Tuesday, January 8, 2019 - link

    What? Pointless? You don't use this card for 4K...
  • CiccioB - Wednesday, January 9, 2019 - link

    Why not?
    If DLSS allows for the right performance, I can't see why 4K is excluded from the possible uses of this card.
    Or do you think that 4K means 8+GB of RAM, 3000+ shaders and 64+ ROPs with 500+GB/s of bandwidth for any particular reason?
    They are all there for *increasing performance*. If you can do that with another kind of work (like using DLSS), then what's the problem?
    The fact that AMD has to create a card with the above resources to do (maybe) the same work, because they have not invested a single dollar in AI up to now and are at least 4 years behind the competition on this particular feature?
  • DominionSeraph - Monday, January 7, 2019 - link

    No, the 580 has a significantly worse price/performance ratio. All of you here are making the common mistake of calculating price/performance as though a card works without a PC to put it in, instead of improving the performance of said PC. If we calculated price/performance by the cost of the individual component then every discrete card would be infinitely worse than integrated graphics, which costs $0. We can intuitively understand that this is flat out false.

    If you've just dumped $3000 into a high end machine and its peripherals, with an RX 580 the cost goes to $3200. With an RTX 2060 it's $3350. The RTX 2060 is 4.7% more expensive for 50% more speed.
    This price/performance holds true until we hit a $100 PC. If you pulled a $100 Sandy Bridge rig off eBay, then now at $450 with an RTX 2060 it is 50% more expensive than at $300 with an RX 580.

    As long as your rig is worth more than $100 the RTX 2060 stomps the RX 580 in price/performance.
  • AshlayW - Monday, January 7, 2019 - link

    You're ignoring the point that not everyone wants to spend $350 on a GPU. 1080p gaming is perfectly acceptable on cards costing less than $300 and at the moment AMD offers the best value here. Especially with RX 570. Turing is just overpriced.
