BatteryBoost: Gaming Battery Life x 3

First things first, let's talk about the typical FPS (Frames Per Second) before we get to BatteryBoost. Plugged in, GRID Autosport at our 1080p High settings averages around 145 FPS, making it a good candidate for BatteryBoost. Perhaps more importantly, even when running on battery power GRID Autosport is able to average nearly 120 FPS (117 to be precise). Obviously, enabling a BatteryBoost FPS target results in the average FPS equaling that target – we tested 30, 40, 50, and 60 – and testing without BatteryBoost but with VSYNC results in a 60 FPS average.

As for the other games, Tomb Raider at High (not the benchmark, but actually running the full game for our battery test) gets around 107 FPS on AC power but drops to 70-73 FPS on battery power, so we wouldn't expect nearly as much of a benefit from BatteryBoost, especially at the 60FPS target. Borderlands: The Pre-Sequel falls roughly between those two, getting 135-140 FPS on AC power and dropping to around 88 FPS on battery power.

If BatteryBoost is simply benefiting from lower FPS, our VSYNC results should be the same as the BatteryBoost 60FPS results, but as we'll see in a moment that's not the case. Figuring out exactly what NVIDIA is doing is a bit more complex, and we'll discuss this more on the next page. First, let's start with GRID Autosport and run some detailed tests at 10FPS intervals and see how battery life scales.
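The basic idea behind any FPS cap can be sketched in a few lines: do the frame's work, then sleep out the remainder of the frame's time budget so the hardware idles instead of racing ahead. This is a hypothetical software limiter for illustration, not NVIDIA's actual implementation (which works at the driver level):

```python
import time

def run_capped(render_frame, target_fps=30, frames=30):
    """Run `frames` iterations, sleeping so the loop never exceeds target_fps.
    The idle time spent sleeping is where the power savings come from."""
    budget = 1.0 / target_fps  # seconds allotted per frame
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()  # stand-in for the game's per-frame work
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle out the rest of the budget
    return frames / (time.perf_counter() - start)  # achieved FPS

# A trivial 2ms "frame" easily fits a 30 FPS (33.3ms) budget,
# so the loop settles at roughly the target rate:
fps = run_capped(lambda: time.sleep(0.002), target_fps=30, frames=30)
```

Note that a real driver-level limiter can also lower GPU clocks and voltage to just meet the target, which saves more power than simply idling at full clocks.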

Interestingly, while BatteryBoost on its own does a good job of improving battery life – the GT72 goes from just 55 minutes without BatteryBoost to 112 minutes with a 30FPS target – tacking on VSYNC adds a bit more battery life on top of that. Our best result is at 30FPS with VSYNC enabled, where the GT72 manages 124 minutes, just surpassing NVIDIA's target of two hours. Of course, you'll probably want to stop a few minutes earlier to make sure your game progress is saved (if applicable), and 30FPS isn't the best gaming experience. Moving to higher FPS targets, BatteryBoost offers diminishing returns, but that's sort of expected. Even at 60FPS, however, BatteryBoost still manages 90 minutes compared to 70 minutes without BatteryBoost but with VSYNC.
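One way to frame the tradeoff is total frames rendered per battery charge, using the GRID runtimes above (117 FPS uncapped for 55 minutes versus a 30FPS cap for 124 minutes):

```python
def frames_per_charge(fps, minutes):
    """Total frames the GPU delivers before the battery runs out."""
    return fps * 60 * minutes

uncapped = frames_per_charge(117, 55)   # GRID at full speed on battery
capped30 = frames_per_charge(30, 124)   # GRID with the 30FPS BatteryBoost cap
# Uncapped delivers more total frames, but the cap more than doubles
# the wall-clock time you actually get to play.
```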

Given VSYNC appears to help even when BatteryBoost is enabled, for our remaining tests we simply left VSYNC on (except for the one non-BatteryBoost test). We ended up running four tests: no BatteryBoost and without VSYNC, no BatteryBoost but with VSYNC, and then BatteryBoost at 60 and 30 FPS targets with VSYNC. Here's the same data from the above chart, but confined to these four test results.

GRID shows a nearly linear progression going from 30FPS to 60FPS to no BatteryBoost with VSYNC, and then finally to fully unlimited performance. What's interesting is that the other two games we tested don't show this same scaling…

Borderlands: The Pre-Sequel has lower frame rates by default, so BatteryBoost isn't able to help quite as much. Normal performance on battery power without VSYNC results in 54 minutes of gaming, which is pretty similar to the result with GRID Autosport. That actually makes sense, as in both games we're basically running the system as fast as it will go. With a 60FPS cap in effect via VSYNC, battery life only improves by a few minutes, while tacking on BatteryBoost with a 60FPS target gets us up to 62 minutes. Since we're starting at just under 90FPS with no frame rate cap, the smaller gains in battery life with a 60FPS target aren't a surprise, but the very modest 15% improvement is less than I expected. Dropping to a 30FPS target, we're not quite able to get two hours, but we do come quite close at 112 minutes – essentially double the battery life compared to running at full performance.
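The percentage gains quoted above follow directly from the measured runtimes:

```python
def improvement(base_minutes, boosted_minutes):
    """Percent battery-life gain relative to the uncapped baseline."""
    return (boosted_minutes / base_minutes - 1) * 100

# Borderlands TPS runtimes from the text (uncapped baseline: 54 minutes):
gain_60fps = improvement(54, 62)    # the "very modest 15%" at a 60FPS target
gain_30fps = improvement(54, 112)   # roughly double at a 30FPS target
```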

Last is Tomb Raider, and as the game with the lowest starting FPS (on battery power) I expected to see the smallest gains in battery life. Interestingly, battery life without BatteryBoost and VSYNC starts at 57 minutes, slightly more than the other two games, but Tomb Raider is known for being more of a GPU stress test than something that demands a lot of the CPU, so perhaps the Core i7-4710HQ just doesn't need to work as hard. Turning on VSYNC does almost nothing (the one minute increase is basically within the margin of error), and BatteryBoost targeting 60FPS is only slightly better (six minutes more than without BatteryBoost). Once we target 30FPS, the end result is about the same as Borderlands TPS: 113 minutes, just missing a 100% improvement in battery life.

Just for kicks, I ran a separate test with Tomb Raider using 1080p and Normal quality with the BatteryBoost 30FPS setting to see if I could get well over two hours by further reducing image quality. While there's still a lot going on that requires power from the system – remember we're dealing with a 45W TDP CPU and around a 100W maximum TDP GPU, plus various other components like the motherboard, LCD, storage, and RAM – at these moderate quality settings I was able to get 125 minutes out of Tomb Raider.

In essence, the less work the GPU has to do and the higher the starting frame rates, the more likely BatteryBoost is to help. It's little wonder then that NVIDIA's discussion of BatteryBoost often makes mention of League of Legends. The game is definitely popular, and what's more it's fairly light on the GPU. By capping FPS at 30 it's easy to see how such a light workload can reach into the 2+ hour range. Interestingly, with Tomb Raider managing 2.08 hours at Normal quality, and given the GT72 uses an ~87 Wh battery, that means the power draw of the notebook during this test is only around 41-42W – not bad for a notebook with a theoretical maximum TDP (under AC power) of roughly 150W.
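The back-of-the-envelope power figure comes straight from the battery capacity and runtime:

```python
def average_draw_watts(battery_wh, runtime_hours):
    """Average system power draw implied by draining a full battery."""
    return battery_wh / runtime_hours

# Tomb Raider at Normal quality: 125 minutes on the GT72's ~87 Wh battery
draw = average_draw_watts(87, 125 / 60)
# Works out to just under 42W for the entire notebook while gaming.
```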


Comments

  • inighthawki - Friday, October 24, 2014 - link

    Yeah, at such high framerates, it wouldn't be uncommon to not always be at the max queue depth, so you'll get the illusion that it's always continuously rendering. But in this case you're really just rendering frames ahead. One nice advantage is that if you hit the queue depth, you'll actually get more consistently smooth motion, since the frame rate is more consistent. Having the game wake up consistently every vblank and render one frame provides a more fixed timestep for things like animation, compared to having a variable rate by rendering as fast as you can. Most people will likely never notice though.

    It's unfortunate that Windows forces the games into that model, since sometimes I'd love the triple buffering model instead. I like the lower latency of that mode, while also removing screen tearing. Given that I run a GTX780 plugged into my wall socket, I'm not too concerned about the power savings - especially considering I usually disable vsync anyway, so I'm not really wasting any more than normal.
  • HiTechObsessed - Friday, October 24, 2014 - link

    If what you're saying is true, battery life would decrease when turning on VSync... Looking at the results here, with BatteryBoost off, turning VSync on increases battery life.
  • thepaleobiker - Thursday, October 23, 2014 - link

    Yes, please read the article good sir.
  • limitedaccess - Thursday, October 23, 2014 - link

    Is there any actual difference in terms of thermal performance? Either lower temps and/or fanspeed (fan noise)? I would assume if the GPU itself is consuming significantly less power its average heat output should be lower as well and less stress placed upon the cooling system.

    As an extension of this are you able to ask Nvidia to comment on whether or not it is technically possible to extend a variation of this to desktop GPUs and if there is any plan to? This would enable the flexibility of building a system that is extremely low noise (or even passive) for certain gaming workloads yet still have performance on demand.
  • nevertell - Thursday, October 23, 2014 - link

    As there is less energy consumed, there is less energy dissipated. Ultimately, all energy that is used by any computer that isn't then used to power LEDs or displays will be turned into heat.
  • limitedaccess - Thursday, October 23, 2014 - link

    Yes I'm aware of the theory. However I am curious as to what the actual tested impact would be in this case and how significant (or insignificant) the difference might be.
  • Brett Howse - Thursday, October 23, 2014 - link

    When I tested the Razer Blade, I noticed a significant decrease in temperatures and of course noise when playing with Battery Boost enabled, which is what you would expect since it is working far less.
  • JarredWalton - Thursday, October 23, 2014 - link

    Yup. Running the GPU at lower clocks and reducing power consumed means the fans don't have to work as hard to keep the system cool. Targeting 30FPS, the GT72 is pretty quiet -- not silent, but not loud at all. I didn't take measurements (I'll try that for the final full review), but there's nothing too shocking: lower performance => less heat => less noise.
  • CrazyElf - Thursday, October 23, 2014 - link

    All in all, this new Battery Boost feature seems to indicate a modest incremental improvement in battery life. It's not as good as say, the leap in performance per watt that Maxwell gave, but it's welcome nonetheless.

    The issue has always been that there's a tradeoff between size, mobility, and battery life, especially for a large hungry gaming GPU.

    Jarred, by any chance, are you aware that there are going to be variants of the GT72 with an IPS monitor coming out in the coming months? It's already up for pre-order at many of the laptop sellers. Downside is there's a pretty big price premium.
  • sonicmerlin - Thursday, October 23, 2014 - link

    Don't these laptops have nvidia Optimus and Haswell processors? Why is their non gaming runtime so low despite their large batteries?
