Zotac GeForce GT 640 DDR3 Review: Glacial Gaming & Heavenly HTPC
by Ryan Smith & Ganesh T S on June 20, 2012 12:00 PM EST

Two weeks ago NVIDIA formally launched the retail GeForce GT 640, catching up to their OEM and laptop offerings with their first GK107 based video card for the retail desktop market. GT 640 is designed to be NVIDIA’s entry-level Kepler video card, joining several Fermi rebadges as a member of the GT 6xx series. With the enthusiasm behind Intel’s Ivy Bridge in the laptop market and the boost in sales it has provided for NVIDIA’s mobile GPUs, NVIDIA is hoping to accomplish the same thing in the desktop market with GT 640.
Today we’ll finally be taking a look at the GT 640 in action. We’re expecting NVIDIA will launch a GDDR5 variant at some point, but, for the first round of cards, GT 640 is exclusively DDR3. This has important performance repercussions. Meanwhile, as is common with entry-level video cards, there is no reference design intended for retail sale and NVIDIA isn’t sampling any such card. However, NVIDIA’s partners are stepping up to sample cards to the press. Our sample comes from Zotac, who sent over their single slot based Zotac GeForce GT 640.
| | GTS 450 | GT 640 DDR3 | GT 630 GDDR5 | GT 630 DDR3 |
|---|---|---|---|---|
| Previous Model Number | N/A | New | GT 440 GDDR5 | GT 440 DDR3 |
| Stream Processors | 192 | 384 | 96 | 96 |
| Texture Units | 32 | 32 | 16 | 16 |
| ROPs | 16 | 16 | 4 | 4 |
| Core Clock | 783MHz | 900MHz | 810MHz | 810MHz |
| Shader Clock | 1566MHz | N/A | 1620MHz | 1620MHz |
| Memory Clock | 3.6GHz GDDR5 | 1.782GHz DDR3 | 3.2GHz GDDR5 | 1.8GHz DDR3 |
| Memory Bus Width | 128-bit | 128-bit | 128-bit | 128-bit |
| Frame Buffer | 1GB | 2GB | 1GB | 1GB |
| GPU | GF106 | GK107 | GF108 | GF108 |
| TDP | 106W | 65W | 65W | 65W |
| Transistor Count | 1.17B | 1.3B | 585M | 585M |
| Manufacturing Process | TSMC 40nm | TSMC 28nm | TSMC 40nm | TSMC 40nm |
| Launch Price | $129 | $99/$109 | N/A | N/A |
Diving right into the GT 640’s specifications, this is the same GK107 GPU as the 640M and other laptop/OEM GK107 products, so we’re looking at the same basic features and specifications. GT 640 ships with all of GK107’s functional units enabled, which means 384 CUDA cores organized into 2 SMXes sharing a single GPC. Further attached to that lone GPC is a pair of ROP blocks and memory controllers, giving the GT 640 16 ROPs, 256KB of L2 cache, and a 128-bit memory bus. All of this is augmented by the common features of the Kepler family, including the NVENC hardware H.264 encoder, VP5 video decoder, FastHDMI support, and PCIe 3.0 connectivity.
Because this is a retail desktop product, the GT 640 ships at a fairly high clockspeed of 900MHz, which puts it ahead of its OEM DDR3 counterpart but behind the GDDR5 version. As with the laptop and OEM versions, GPU Boost is not present, so performance is rather straightforward here. Similarly, for those of you looking to make Fermi comparisons, GT 640 and other Kepler based video cards do away with the separate shader clock in favor of additional CUDA cores, so GT 640 has a much lower shader clock and far more CUDA cores than its predecessors.
Unfortunately things don’t look nearly as good on the memory front. NVIDIA is launching with DDR3 here, which means that even with the 128-bit memory bus this card is extremely memory bandwidth starved. At just shy of 1.8GHz it only has 28.5GB/sec of memory bandwidth. DDR3 versus GDDR5 has been a recurring issue in this market segment, and as both GPU performance and GDDR5 performance have increased over time the gap between DDR3 and GDDR5 card variants has continued to grow. By the time we’re up to this many ROPs and shaders the memory bandwidth requirements are simply enormous. In traditional fashion these DDR3 cards are outfitted with more memory overall – the DDR3 GT 640 ships with 2GB – so it has a memory pool every bit as large as the GTX 680’s but lacks the memory bandwidth to make effective use of it. So expect the obligatory 1GB GDDR5 version to be much faster here.
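For reference, the bandwidth figure above falls straight out of the effective memory clock and bus width; a quick back-of-the-envelope sketch (the helper function is ours, not NVIDIA's) using the numbers from the spec table:

```python
# Peak memory bandwidth = effective memory clock (transfers/sec) x bus width (bytes).
# DDR3/GDDR5 clocks in the spec table are already effective data rates.
def peak_bandwidth_gbps(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Return theoretical peak memory bandwidth in GB/s."""
    return effective_clock_ghz * (bus_width_bits / 8)

gt640_ddr3 = peak_bandwidth_gbps(1.782, 128)   # GT 640 DDR3
gts450_gddr5 = peak_bandwidth_gbps(3.6, 128)   # GTS 450 GDDR5

print(f"GT 640 DDR3:   {gt640_ddr3:.1f} GB/s")   # ~28.5 GB/s
print(f"GTS 450 GDDR5: {gts450_gddr5:.1f} GB/s")  # ~57.6 GB/s
```

The same math is why a hypothetical GDDR5 GT 640 would roughly double the card's available bandwidth on the same 128-bit bus.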
As for physical specifications, the official TDP on the GT 640 is 65W, putting this card distinctly into the PCIe slot powered category. Idle power consumption on the other hand is spec’d at 15W, which at least on paper is actually a bit worse than GT 440 and competing 28nm cards. Meanwhile the die size on GK107 is 118mm², virtually identical to the 116mm² die size of GF108. Product naming aside, due to the similar TDP and GPU die sizes the GT 640 is clearly going to be the direct successor to the GT 440 from a hardware perspective.
All things taken into account, GT 640 (or rather GK107) is a rather powerful video card given the entry-level market segment it will be occupying. GT 440 (GF108) only had 4 ROPs so NVIDIA is no less than quadrupling their theoretical pixel throughput here. At the same time the CUDA core count is greatly expanding thanks to the smaller manufacturing process and Kepler architectural changes, and after compensating for those architectural changes NVIDIA has effectively doubled their shading and texturing performance. A healthy boost in the number of functional units is of course typical with any new manufacturing process, but because NVIDIA pared down Fermi so much for GF108 the additional functional units on GK107 should significantly improve performance.
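Those "quadrupled" and "doubled" figures can be sanity-checked from the spec table. A rough sketch of the theoretical throughput comparison (our own arithmetic, not NVIDIA's guidance), remembering that Fermi shaders run at the hot clock while Kepler shaders run at the core clock, with 2 FLOPs per core per clock (FMA) in both cases:

```python
# Theoretical throughput, GT 440 (GF108) vs. GT 640 (GK107).
def gflops(cores: int, shader_clock_mhz: int) -> float:
    """Single-precision GFLOPS: cores x shader clock x 2 FLOPs (FMA)."""
    return cores * shader_clock_mhz * 2 / 1000

def pixel_fill_gpix(rops: int, core_clock_mhz: int) -> float:
    """Peak pixel fillrate in Gpixels/s."""
    return rops * core_clock_mhz / 1000

gt440_flops = gflops(96, 1620)        # ~311 GFLOPS (hot clock)
gt640_flops = gflops(384, 900)        # ~691 GFLOPS (core clock)
gt440_fill = pixel_fill_gpix(4, 810)  # ~3.2 Gpix/s
gt640_fill = pixel_fill_gpix(16, 900) # ~14.4 Gpix/s

print(f"Shader throughput: {gt640_flops / gt440_flops:.1f}x")  # ~2.2x
print(f"Pixel fillrate:    {gt640_fill / gt440_fill:.1f}x")    # ~4.4x
```

In other words, shading roughly doubles while pixel throughput more than quadruples, which matches the architectural accounting above.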
NVIDIA’s official guidance on performance is that GT 640 should come in just behind GTS 450, which makes sense given the similar theoretical shader and ROP performance of the two cards. Ultimately we wouldn’t be surprised if a GK107 card surpassed GTS 450 thanks to the former’s higher clockspeeds, but that will have to wait for a GDDR5 GT 640 or something similar. As it stands the DDR3 performance handicap will keep GT 640 from catching up to the GTS 450, even with the clockspeed advantage. Consequently while NVIDIA is pitching GT 640 as a solid performer at 1680x1050 it can really only achieve this at lower quality settings. So for most of our readers GT 640’s real strength is going to be HTPC usage thanks to its combination of video features (VP5/NVENC), ample shader performance for post-processing, and its sub-75W TDP.
Wrapping things up, let’s quickly talk about pricing and availability. NVIDIA’s goal for the GT 640 DDR3 was for it to be a $99 card, but with their partners free to design their own cards and set their own prices, they have for the most part not gone along with this. Currently only a single GT 640 is available at Newegg for $99, with everything else (including the Zotac card we’re reviewing today) going for $109. The better news is that unlike the GTX 670/680, availability shouldn’t be an issue, as GK107 is far easier for NVIDIA to produce in volume even with TSMC’s capacity constraints. Cards have been readily available for over 2 weeks now and that’s not expected to change.
Finally, because of its de facto price of $109 the GT 640 DDR3 is in direct competition with AMD’s equally priced Radeon HD 7750, along with last-generation cards such as the GTS 450, GTX 550 Ti, and Radeon HD 5750. Unfortunately for the GT 640 it’s the only card packing DDR3 at this price, so it should come as no great surprise that its performance significantly lags the competition. As we alluded to earlier, any significant success for GT 640 is going to have to rely on its role as a sub-75W card, where it’s one of the more powerful cards in that segment.
Spring 2012 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| Radeon HD 6870 | $159 | GeForce GTX 560 |
| Radeon HD 6850 | $139 | |
| Radeon HD 7770 | $129 | |
| | $119 | GeForce GTX 550 Ti |
| Radeon HD 7750 | $109 | GeForce GT 640/GTS 450 |
| Radeon HD 6750 | $99 | |
60 Comments
cjs150 - Thursday, June 21, 2012 - link
"God forbid there be a technical reason for it...." Intel and NVIDIA have had several generations of chips to fix any technical issue and didn't (HD 4000 is good enough though). AMD has been pretty close to the correct frame rate for a while.

But it is not enough to have the capability to run at the correct frame rate if you make it too difficult to change the frame rate to the correct setting. That is not a hardware issue, just bad design of software.
UltraTech79 - Wednesday, June 20, 2012 - link
Anyone else really disappointed in 4K still being standardized around 24 fps? I thought 60 would be the minimum standard by now, with 120 in higher-end displays. 24 is crap. Anyone that has seen a movie recorded at 48+ FPS knows what I'm talking about. This is like putting shitty unleaded gas into a super high-tech racecar.
cjs150 - Thursday, June 21, 2012 - link
You do know that Blu-ray is displayed at 23.976 FPS? That looks very good to me. Please do not confuse screen refresh rates with frame rates. Screen refresh on most large TVs runs at between 60 and 120 Hz; anything below 60 tends to look crap. (If you want real crap, try running American TV on a European PAL system - I mean crap in a technical sense, not creatively!)
I must admit that having an FPS of 23.976 rather than some round number such as 24 (or higher) is rather daft, and some new films are coming out with much higher frame rates. I have a horrible recollection that the reason for such an odd FPS is very historic - something to do with the length of 35mm film that would be needed per second. The problem is I cannot remember whether that was simply because 35mm film was expensive and this was the minimum to provide smooth movement, or whether it goes right back to the days when film had a tendency to catch light, and it was the maximum speed you could put a film through a projector without friction causing the film to catch light. No doubt there is an expert on this site who could explain precisely why we ended up with such a silly number as the standard.
UltraTech79 - Friday, June 22, 2012 - link
You are confusing things here. I clearly said 120 FPS would need higher-end (120Hz) displays. I was rounding up 23.976 FPS to 24 - give me a break. "It looks very good /to you/" is wholly irrelevant. Do you realize how many people said "it looks very good to me" about SD when resisting the HD movement? Or how many will say it again about 1080p, thinking 4K is too much? It's a ridiculous mindset.

My point was that we are upping the resolution but leaving another very important aspect in the dust that we need to improve. Even audio is moving faster than frame rates in movies, and now that most places are switching to digital, the cost to go to the next step has dropped dramatically.
nathanddrews - Friday, June 22, 2012 - link
It was NVIDIA's choice to only implement 4K @ 24Hz (23.xxx) due to limitations of HDMI. If NVIDIA had optimized around DisplayPort, you could then have 4K @ 60Hz.

For computer use, anything under 60Hz is unacceptable. For movies, 24Hz has been the standard for a century - all film is 24fps and most movies are still shot on film. In the next decade, there will be more and more films that will use 48, 60, even 120fps. Cameron was cock-blocked by the studio when he wanted to film Avatar at 60fps, but he may get his wish for the sequels. Jackson is currently filming The Hobbit at 48fps. Eventually all will be right with the world.
karasaj - Wednesday, June 20, 2012 - link
If we wanted to use this to compare a 640M or 640M LE to the GT 640, is this doable? If it's built on the same chip (both have 384 CUDA cores), can we just reduce the numbers by a rough % of the core clock speed to get rough numbers that the respective cards would put out? I.e. the 640M LE has a clock of 500MHz, the 640M is ~625MHz. Could we expect ~55% of this for the 640M LE and 67% for the 640M? Assuming DDR3 on both so as not to have that kind of difference.

Ryan Smith - Wednesday, June 20, 2012 - link
It would be fairly easy to test a desktop card at a mobile card's clocks (assuming memory type and functional unit count were equal), but you can't extrapolate performance like that because there's more to performance than clockspeeds. In practice performance shouldn't drop by that much, since we're already memory bandwidth bottlenecked with DDR3.

jstabb - Wednesday, June 20, 2012 - link
Can you verify if creating a custom resolution breaks 3D (frame-packed) Blu-ray playback?

With my GT 430, once a custom resolution has been created for 23/24Hz, that custom resolution overrides the 3D frame-packed resolution created when 3D Vision is enabled. The driver appeared to have simple fall-through logic: if a custom resolution is defined for the selected resolution/refresh rate it is always used; failing that it will use a 3D resolution if one is defined; failing that it will use the default 2D resolution.
This issue made the custom resolution feature useless to me with the GT430 and pushed me to an AMD solution for their better OOTB refresh rate matching. I'd like to consider this card if the issue has been resolved.
Thanks for the great review!
MrSpadge - Wednesday, June 20, 2012 - link
It consumes about as much as the HD 7750-800, yet performs miserably in comparison. This is an amazing win for AMD, especially comparing the GTX 680 and HD 7970!

UltraTech79 - Wednesday, June 20, 2012 - link
This performs about as well as an 8800 GTS for twice the price, or half the performance of a GTX 460 for the same price. These should have been priced at $59.99.