With this year’s Gamescom event now in full swing, the German games show seems to be taking on an ever-larger presence in the worlds of gaming and hardware. Along with a slew of games announcements (and a Google Stadia event tucked in between), NVIDIA is also using the show to launch an unexpected new GeForce driver. The first of the Release 435 family, the 436.02 driver adds and revises several features, including GPU integer scaling, a new sharpening filter for NVIDIA’s Freestyle system, a rework of their pre-rendered frame limiter (now called Low Latency Mode), and 30-bit color for OpenGL applications, along with the usual suite of game fixes and performance improvements.

There are several different things going on in NVIDIA’s latest driver, and overall it feels a lot like a reaction to last month’s Radeon launch. In particular, there’s the focus on low latency gaming (Radeon Anti-Lag), shader-based image sharpening (Contrast Adaptive Sharpening), and NVIDIA’s choice of games for performance optimizations (a couple of which are in our 2019 suite). Which is not to downplay the driver – if anything, it’s the most interesting driver out of NVIDIA in a long while – but it’s definitely laser-focused in certain ways on features that arch-rival AMD has introduced or otherwise emphasized in the last month.

Integer Image Upscaling At Last

At any rate, let’s start with what I feel is by far the most interesting aspect of today’s announcement, which is integer display scaling support. This is a feature that gamers have been requesting for quite a number of years now – I’ve been asking for it off and on since early this decade – and the wheels finally began moving earlier this year when Intel casually announced that they’d support the feature on their Gen 11 GPUs, aka Ice Lake. However, with those parts not hitting retail until next month, it looks like NVIDIA is technically going to beat Intel to the punch with this release.

Bundled in the new driver is NVIDIA’s take on integer scaling. Because the announcement for this driver has gone out in advance of the driver itself – it is due at 9am ET, after this was written – I haven’t had a chance to try the feature. But according to NVIDIA, it behaves as you’d expect it to: doing nearest neighbor scaling of lower-resolution images to an integer multiple of their original resolution, producing a sharp, pixelated image. In essence, a lower resolution image is displayed on a higher resolution monitor as if it were a native lower resolution monitor. Importantly, this mode is very different from traditional bilinear(ish) image scaling, which produces a softer, blurrier image without pixelization.
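
To make the distinction concrete, here’s a minimal sketch of nearest-neighbor integer upscaling in Python/NumPy. This is purely illustrative – NVIDIA’s implementation runs in the GPU’s display pipeline, not in software – but the underlying operation is the same: every source pixel becomes a solid block of identical pixels.

```python
import numpy as np

def integer_upscale(image: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbor upscale by an integer factor: each source pixel
    becomes a factor-by-factor block of identical pixels, so edges stay
    perfectly sharp rather than being blended as bilinear scaling would."""
    if factor < 1:
        raise ValueError("factor must be a positive integer")
    # Repeat rows, then columns; works for (H, W) or (H, W, C) arrays.
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# A 1920x1080 frame scaled 2x fills a 3840x2160 (4K) panel exactly
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
assert integer_upscale(frame, 2).shape == (2160, 3840, 3)
```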

Neither integer scaling nor bilinear scaling is always the right solution; depending on the situation, each method can produce the better result. NVIDIA has opted to focus their own blog post on using integer scaling for pixel art games, where the pixelated look is very intentional, though these games typically (but not always) do integer scaling on their own to begin with.


Simulated Upscaling @2x Zoom: Integer Scaling (Left) vs. Bilinear Scaling (Right)

The more interesting use for the feature, I feel, is in gaming on 4K and 5K monitors, especially with sub-RTX 2080 class GPUs. This is because the high resource demands of 4K+ gaming are difficult for all but NVIDIA’s most powerful GPUs to keep up with (and even then…), which necessitates rendering a game at a sub-native resolution – which in turn introduces the blurriness caused by bilinear upsampling. Integer scaling, on the other hand, would allow a game to be rendered at 1080p and then perfectly upscaled to 4K (2160p); it forgoes the pixel density benefits of a 4K monitor when gaming, but it retains the sharpness of native resolution rendering. It’s not quite a “have your cake and eat it too” solution, but especially for laptop users, where 4K panels are increasingly common yet 4K gaming isn’t a realistic option, the potential is huge.
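
The catch, of course, is that the render resolution has to divide evenly into the panel resolution for a perfect upscale. A quick illustrative check (again just a sketch of the arithmetic, not verified driver behavior):

```python
def integer_factor(panel_w: int, panel_h: int,
                   render_w: int, render_h: int):
    """Return the scale factor if the render resolution divides the panel
    resolution evenly (and equally) in both dimensions, otherwise None."""
    if panel_w % render_w or panel_h % render_h:
        return None
    if panel_w // render_w != panel_h // render_h:
        return None
    return panel_w // render_w

print(integer_factor(3840, 2160, 1920, 1080))  # 2    -> clean 2x upscale
print(integer_factor(3840, 2160, 1280, 720))   # 3    -> clean 3x upscale
print(integer_factor(3840, 2160, 2560, 1440))  # None -> not an integer multiple
```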

What remains to be seen then is how well this works in practice, both with respect to NVIDIA’s drivers as well as games themselves. While NVIDIA can control the former, they have less control over the latter, so there are still subtle ways that games can interact poorly with integer scaling. In particular there’s UI/text size, since this is sometimes tied to render resolution. Also, as NVIDIA notes in their own release notes, integer scaling doesn’t currently play well with HDR; and in fact the whole feature is still classified as being in beta, even if the drivers themselves are not.

At any rate, the feature is being rolled out today for Turing owners – and just Turing owners. Specifically, the feature is available on GeForce RTX 20 series and GTX 16 series cards, but not NVIDIA’s prior Pascal (GTX 10 series) and Maxwell (GTX 900 series) cards. According to NVIDIA’s announcement, the feature hinges on the “hardware-accelerated programmable scaling filter available in Turing”; however, to be honest, I don’t know how accurate this statement is, or how much of a blocker it might be for past cards. NVIDIA has a history of rolling out new features for their latest generation parts first, and then backporting the feature for older cards a few months down the line, so that may yet end up being the case here.

Improved Image Sharpening for NVIDIA Freestyle

Moving on, this driver release is also adding a new image sharpening filter for Freestyle, the company’s post-processing filter feature that’s baked into GeForce Experience. While the company already had a sharpening filter in Freestyle, according to NVIDIA the new filter offers better image quality while also halving the performance impact of the prior filter. In practice, this latest addition seems to be NVIDIA’s counter to AMD’s new Contrast Adaptive Sharpening – itself pitched as a counter to NVIDIA’s Deep Learning Super Sampling – offering another, more generic shader-based approach that’s functionally similar to AMD’s.

While I’ll keep DLSS comparisons to a minimum here since I haven’t tested the driver itself, DLSS support is still limited to a dozen or so games – and while these are popular games, they are still only a small portion of the overall gaming ecosystem. A post-processing shader-based approach, on the other hand, can work with most games (i.e. anything Freestyle works with) and most APIs, with NVIDIA enabling it for DX9 through DX12, along with Vulkan.

As for how the image quality will compare to DLSS or AMD’s own solution, that remains to be seen. Post-processing alone cannot entirely recover data that has been lost from lower resolution rendering, and this is true for shader and deep learning-based approaches alike; native resolution rendering remains the best approach for image clarity. However, as far as post-processing goes, performance and image quality are variables on a continuum rather than fixed values, so there are tradeoffs and benefits going both directions, and depending on the game, the right algorithms with the right settings can produce some good results. Meanwhile, as image sharpness seems to be a battleground that both AMD and NVIDIA are interested in fighting over, I would fully expect both of them to continue working on their algorithms.
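
For the curious, the general idea behind these shader-based filters is easy to sketch. Below is a generic unsharp-mask-style sharpen in Python/NumPy – emphatically not NVIDIA’s (or AMD’s) actual algorithm, just an illustration of how amplifying high-frequency detail creates the appearance of added sharpness – where strength stands in for a filter’s intensity slider.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sharpen(image: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Unsharp-mask sharpening for a single-channel 8-bit image: boost the
    high-frequency detail (original minus blurred) by 'strength'."""
    img = image.astype(np.float32)
    blurred = uniform_filter(img, size=3)  # cheap 3x3 box blur
    detail = img - blurred                 # high-frequency component
    return np.clip(img + strength * detail, 0, 255).astype(np.uint8)
```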

“Max Pre-Rendered Frames” Becomes “Ultra-Low Latency” Mode

Also receiving a makeover in NVIDIA’s latest driver is their Max Pre-Rendered Frames feature, which again seems to be a response to AMD’s Radeon Anti-Lag functionality. The rarely noticed feature has been present in NVIDIA’s drivers for a decade – which, as NVIDIA likes to remind everyone, makes them first – and allows users to control how many frames the CPU can prepare and queue up ahead of the GPU rendering and displaying them. In 436.02, the feature is being redeployed as Low Latency Mode, and it’s getting a new mode as well.

Overall, the rechristened feature is being simplified some, both in name and in functionality. Along with what NVIDIA is undoubtedly expecting to be a more approachable name, Low Latency Mode will have just 3 settings – Off, On, and Ultra – which is down from 5 for the previous Max Pre-Rendered Frames implementation.

In terms of functionality then, while Off does exactly what it says on the tin (which is to say nothing, leaving queuing up to the game), On and Ultra have a bit more nuance. On essentially compresses the previous incarnation’s settings down to a single option; instead of being able to select a queue size from 1 to 4 pre-rendered frames, On simply locks the queue at 1 frame. Ultra, meanwhile, is more-or-less new, and goes one step further by reducing the queue size to 0, meaning frames are submitted to the GPU on a just-in-time basis and no pre-rendered frames are held in reserve.

Ultra mode potentially offers the lowest latency, but the flip side is that all the usual caveats to manually adjusting the rendering queue size still apply. The rendering queue exists to help smooth out frame pacing on both the display and rendering/submission sides of matters; however, it costs latency to hold those frames. Even keeping the queue at a smaller size can throw things off, and just-in-time rendering is trickier still, since bad submission timing cannot be hidden. Which is why it’s an optional feature to begin with, rather than set to Ultra by default. Still, for latency-sensitive uses (and latency-sensitive gamers), being able to adjust the rendering queue size was (and remains) a useful feature to have.
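
To see why a deeper queue costs latency in a GPU-bound scenario, here’s a toy producer/consumer simulation in Python – a deliberately crude stand-in for real driver behavior, with made-up frame times – in which every queued frame inherits the wait of the frames ahead of it.

```python
import queue
import threading
import time

def simulate(queue_depth: int, cpu_time: float = 0.002,
             gpu_time: float = 0.004, frames: int = 100) -> float:
    """Average submit-to-present latency with a bounded frame queue.
    queue_depth=1 roughly approximates 'On'; larger depths approximate
    the old Max Pre-Rendered Frames settings. (Toy model only.)"""
    q = queue.Queue(maxsize=queue_depth)
    latencies = []

    def gpu() -> None:
        for _ in range(frames):
            submitted = q.get()     # pull the oldest queued frame
            time.sleep(gpu_time)    # "render + present" it
            latencies.append(time.time() - submitted)

    t = threading.Thread(target=gpu)
    t.start()
    for _ in range(frames):
        time.sleep(cpu_time)        # CPU prepares the frame
        q.put(time.time())          # blocks once the queue is full
    t.join()
    return sum(latencies) / len(latencies)

# GPU-bound case: the queue fills up, so every extra queue slot adds
# roughly one GPU frame time of latency.
print(f"depth 1: {simulate(1) * 1000:.1f} ms")
print(f"depth 4: {simulate(4) * 1000:.1f} ms")
```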

Meanwhile, perhaps the oddest part of all of this is that it isn’t the first time NVIDIA has offered what amounts to Ultra mode. Until earlier this decade, NVIDIA’s drivers also supported a queue size of 0, which is why I’m not sure this entirely counts as a new feature. However, given the tricky nature of queuing and the evolution of OSes, it’s also entirely possible that NVIDIA has implemented a newer algorithm for pacing frame submission.

At any rate, as with its predecessor, Low Latency Mode is limited to DX9/DX10/DX11 games. Low-level APIs like DX12 and Vulkan give games very explicit control over queue sizes, so drivers cannot (or at least really should not) override the queue sizes on these newer APIs. On the plus side, unlike integer scaling, this feature is not being restricted to Turing-based video cards, so all NVIDIA GPU owners get access to it right away.

OpenGL 30-bit Color, More G-Sync Compatible Displays, & More

Wrapping things up, the 436.02 drivers also include some other feature improvements. Besides the usual slate of performance improvements – with NVIDIA focusing particularly on Apex Legends, Strange Brigade, Forza Horizon 4, World War Z, and Battlefield V – the new driver also incorporates support for 30-bit color in OpenGL applications. This ability was previously announced and rolled out for GeForce Studio Driver users last month, and as the name says, it allows OpenGL applications to output images with 30-bit (10 bits per channel) color to the GPU. Up until now, NVIDIA has purposely restricted the feature to its Quadro family of video cards, as a means of segmenting the product families and driving content creation users towards Quadro cards. Now it’s available to GeForce and Quadro users alike across both the Studio and Game Ready driver families, allowing for the use of deep color modes with all APIs.
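
For developers wondering what this looks like from the application side, requesting a deep color framebuffer is just a matter of asking for 10 bits per channel at context creation. A minimal sketch using the Python GLFW bindings (illustrative only; whether the app actually gets true 10bpc output still depends on the driver and display granting it):

```python
import glfw

if not glfw.init():
    raise RuntimeError("GLFW failed to initialize")

# Ask for a 10/10/10/2 framebuffer: 10 bits per color channel plus a
# 2-bit alpha channel, for 32 bits per pixel total.
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(1280, 720, "10bpc OpenGL test", None, None)
glfw.make_context_current(window)
# Previously, GeForce drivers would quietly fall back to 8bpc here for
# OpenGL, reserving true 10bpc output for Quadro cards.
```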

Meanwhile, on the monitor side of matters, NVIDIA has added another 3 monitors to their G-Sync Compatible program: ASUS’s VG27A and Acer’s CP3271 & XB273K GP monitors.

Source: NVIDIA

Comments

  • evilspoons - Tuesday, August 20, 2019 - link

    Doh. I just finished replaying a game that'd look way better with integer scaling vs bilinear. Oh well. (Red Alert 2 - with some modded exes it runs at native resolution during gameplay, but cutscenes and menus are switched to 640x480, 800x600, or 1024x768, all of which look nasty on my setup.)
  • AshlayW - Tuesday, August 20, 2019 - link

    Having just switched to nvidia for the first time in 4 years (2060 Super) this is a pretty nice treat for me. Especially the integer scaling. Performance gains are always nice, and it's good to see AMD pressing competition hard to stir up new features as standard.

    Now if nvidia could pull the control panel interface out of being stuck in 1999, that would be great ^.^
  • Alistair - Tuesday, August 20, 2019 - link

    As long as they keep the left and right design. The new AMD panel is very responsive, but I hate the layout.

    nVidia could use a Win10 application for settings like Realtek etc., I actually like that. Opens and works instantly.
  • StrangerGuy - Tuesday, August 20, 2019 - link

    NV's driver control panel has always been sluggish to load and to apply settings for no good reason. I cannot even remember when it wasn't.
  • lilkwarrior - Wednesday, August 21, 2019 - link

    Intel actually pushed them this time more than anything else, as far as integer scaling goes
  • Mr Perfect - Tuesday, August 20, 2019 - link

    Image sharpening looks interesting. I've been using Reshade's image sharpening, but if the driver can do it then all the better.
  • TristanSDX - Tuesday, August 20, 2019 - link

    Hopefully you (Anandtech site) will review integer scaling soon
  • Monstieur - Tuesday, August 20, 2019 - link

    I just tested integer scaling in several games on my Predator X27 with the new driver. 1080p should be perfectly upscaled to 2160p, but it still looks blurry and much worse than a 27" 1080p panel.
  • Phynaz - Tuesday, August 20, 2019 - link

    You’ve done something wrong.
  • Monstieur - Wednesday, August 21, 2019 - link

    It does seem to be working with integer scaling. If I select 1440p on my 2160p monitor it just centers the image without scaling, but 1080p and 720p are scaled to the full screen.
