A Closer Look at Clock Speeds and Power

Wrapping things up, while we've shown that BatteryBoost can certainly improve battery life, there's still the question of what exactly NVIDIA is doing behind the scenes. We know they're playing with the maximum FPS of course, but a frame rate cap alone isn't (always) able to match what BatteryBoost can deliver. To try and shed some additional light on what's going on internally, I logged performance data while running our three BatteryBoost gaming tests. This time, however, the goal was not to fully drain the battery but rather to try and find out what's going on in terms of clock speeds and power draw at a lower level; that means the tests were shorter and there may be more variance, but the numbers are generally in agreement.

There are four tests for each game where I logged data: AC power is the baseline, then I tested DC power without BatteryBoost, with BatteryBoost and a 60FPS target, and finally with BatteryBoost and a 30FPS target. I also tested all four settings with and without VSYNC. I won't guarantee the numbers are 100% accurate, as I have to rely on a utility to report clock speeds and other items, so I won't create any potentially misleading charts; nonetheless, the results are rather interesting to discuss.
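To make the test matrix concrete: three games, four power configurations, and VSYNC on/off works out to 24 logged runs. Below is a purely hypothetical sketch of how the per-run logs could be summarized; the configuration and field names are mine for illustration and don't come from any particular logging utility.

```python
# Hypothetical sketch of the BatteryBoost test matrix and how logged samples
# might be averaged per run; field names are illustrative only.
from statistics import mean

GAMES = ["Tomb Raider", "Borderlands: The Pre-Sequel", "GRID Autosport"]
POWER_MODES = ["AC", "DC (no BatteryBoost)", "BatteryBoost 60FPS", "BatteryBoost 30FPS"]
VSYNC_STATES = [False, True]

def summarize(samples):
    """samples: list of dicts, one per logged interval, from whatever utility records them."""
    return {
        "avg_cpu_mhz": mean(s["cpu_mhz"] for s in samples),
        "avg_cpu_pkg_w": mean(s["cpu_pkg_w"] for s in samples),
        "avg_gpu_mhz": mean(s["gpu_mhz"] for s in samples),
        "avg_gpu_util_pct": mean(s["gpu_util_pct"] for s in samples),
    }

# One summary per (game, power mode, VSYNC) combination -- 24 runs in total.
for game in GAMES:
    for mode in POWER_MODES:
        for vsync in VSYNC_STATES:
            pass  # load the logged samples for this run and call summarize()
```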

First, under AC power the CPU is basically left to run free, and in most cases it will run near its maximum Turbo Boost clocks (3.2-3.3GHz); it also consumes quite a bit of power (25-35W) when VSYNC is off. The GTX 980M meanwhile is running basically full tilt (1100MHz plus or minus ~25MHz on the Core thanks to GPU Boost 2.0, and 5000MHz RAM). Turning VSYNC on gives us a taste of things to come, however: average CPU clocks are typically much lower (1800-2000MHz, with spikes up to 3400MHz and lows of 800MHz) and average CPU package power is likewise substantially lower (10-30W). The GPU clocks don't change much, but GPU utilization drops from close to 100% (95-99%, depending on the game) to 32-55%. Switch to battery power and things start to get a bit interesting.

Let's discuss the three games I tested in turn, starting with Tomb Raider. The CPU clock speeds and power tend to vary substantially based on the game, and the GPU varies a bit as well though not as much as the CPU. Even without BatteryBoost, CPU clocks are often at their lowest level (800MHz), and turning on VSYNC actually resulted in higher average CPU clocks but lower average CPU power – the logging data may not be capturing fully accurate CPU clocks, though I suspect the power figures are pretty accurate. GPU clocks show some similarly odd behavior: without VSYNC the average GPU clock is 479MHz with 3200MHz GDDR5, but utilization is at 97%; with VSYNC the average GPU clocks are a bit higher (~950/3200 core/RAM) but utilization is just under 52%.

Enabling BatteryBoost with 60FPS and 30FPS targets continues to generate somewhat unexpected results. At 60FPS, the CPU is generally close to the base 800MHz, but it does average slightly higher when VSYNC is on; power draw from the CPU is pretty consistent at around 6.1-6.5W for the package. Average GPU clocks meanwhile make a bit more sense (they're slightly lower with VSYNC enabled), while average GPU utilization is slightly higher with VSYNC. Overall, however, system power use is much lower with BatteryBoost than without, which is what we'd expect from our earlier battery testing results. It looks like in Tomb Raider the GPU (plus the rest of the system except for the CPU) draws around 60-65W without BatteryBoost, and that drops to 50-55W with BatteryBoost at 60FPS. Our 30FPS BatteryBoost numbers meanwhile don't show a significant change in CPU clocks (still close to the minimum 800MHz), but with the lower FPS the CPU doesn't have to work as hard so CPU package power is now down to around 4.6-4.7W. On the GPU front, the core clocks are around 670-700MHz with close to 50% utilization, but the GDDR5 memory is now running at 1620MHz, so there are some definite power savings there. Average power draw from the GPU and system (again, minus the CPU) is around 35-40W.
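As a rough cross-check of the Tomb Raider figures, total system draw is simply the CPU package power plus the "GPU and everything else" estimate. Here's a minimal sketch using the midpoints of the ranges quoted above (approximations for illustration, not additional measurements):

```python
# Rough sanity check of the Tomb Raider numbers: total system power is the CPU
# package figure plus the "GPU and everything else" figure. Values are midpoints
# of the ranges quoted in the text.
tomb_raider = {
    "BatteryBoost 60FPS": {"cpu_pkg_w": 6.3,  "gpu_plus_rest_w": 52.5},
    "BatteryBoost 30FPS": {"cpu_pkg_w": 4.65, "gpu_plus_rest_w": 37.5},
}

for name, figures in tomb_raider.items():
    total_w = figures["cpu_pkg_w"] + figures["gpu_plus_rest_w"]
    print(f"{name}: ~{total_w:.0f}W total system draw")
# BatteryBoost 60FPS: ~59W total system draw
# BatteryBoost 30FPS: ~42W total system draw
```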

Borderlands: The Pre-Sequel behaves quite differently on battery power. The AC results are about the same (CPU and GPU basically running as fast as they can), but now on DC power without BatteryBoost the CPU continues to run at relatively high clocks (3.0-3.4GHz), and as you'd expect power draw remains pretty high as well (20-25W). With BatteryBoost at 60FPS, enabling VSYNC actually resulted in substantially higher CPU clocks (and CPU power use – 14.6W with VSYNC compared to 11.2W without, though the test didn't last as long so there's more chance for variance), but at 30FPS things start to look a lot more like Tomb Raider: the CPU runs at 800-1500MHz, with a 1.0GHz average with VSYNC and 1.125GHz average without; CPU power is 6-7W as well (slightly lower with VSYNC). As for the GPU, things aren't all that different; there's a hard cap of 3.2GHz on the GDDR5 when running off the battery, and while the 980M is frequently at that mark when striving for 60FPS, it's mostly at 1620MHz on the 30FPS setting. The GPU (and system other than CPU) draw close to 50W at 60FPS and 35W at 30FPS, while running without BatteryBoost puts things closer to 60W.

With GRID Autosport, the results on AC power and on DC without BatteryBoost are broadly similar to the other two games, though the CPU apparently isn't working as hard as in Borderlands. On AC power it uses 35W and that drops to 23W with VSYNC; on DC without BatteryBoost the CPU is drawing 25W and 15W with VSYNC. The GPU plus other system components meanwhile look to be drawing around 66W without BatteryBoost and 56W with VSYNC enabled. Turn on BatteryBoost and again at 60FPS we see higher CPU clocks (and higher CPU power use) when VSYNC is enabled, but we're talking about 10.7W without VSYNC and 13.7W with VSYNC, and apparently other factors can make up for the difference. The GPU and other components draw around 42W without VSYNC and 39W with VSYNC, so it balances out. Last but not least, at 30FPS the CPU package power averages ~7.3W without VSYNC and ~7.8W with VSYNC, while the GPU and remaining components use 35.7W without VSYNC and 31.8W with VSYNC.

Based on our testing of three different games, it appears BatteryBoost is most effective in games that don't hit the CPU as hard, though with caveats. Tomb Raider for example is known to be pretty easy on the CPU (i.e. a slower AMD APU would likely get close to the same frame rates as a fast Core i7 when paired with the same GPU). However, the type of calculations each game uses (including AI) mean that in some cases a game that doesn't appear to be very CPU intensive may still draw a fair amount of power from the CPU. In general, it looks like the GTX 980M under most gaming workloads will draw at least 25-30W of power (and another 5W or so for the motherboard, RAM, LCD, etc.), which means the lower the CPU load the better. In some cases it should be possible to get the entire GT72 notebook close to 35W while gaming, which would mean the 87Wh battery might last nearly 2.5 hours; more realistically, I'd expect most games will pull 40-45W even at the 30FPS target with BatteryBoost, which equates to 1.9 to 2.2 hours at most. Obviously if you have a game that's more taxing (e.g. Metro: Last Light), you'll get even less battery life.
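For those keeping score, the runtime estimates above are straightforward arithmetic: battery capacity divided by average system draw. A minimal sketch (assuming the full rated 87Wh is usable and ignoring conversion losses and any low-battery cutoff):

```python
# Illustrative runtime estimates: battery capacity divided by average system draw.
BATTERY_WH = 87  # GT72 battery capacity

for draw_w in (35, 40, 45):
    print(f"{draw_w}W average draw -> {BATTERY_WH / draw_w:.1f} hours")
# 35W average draw -> 2.5 hours
# 40W average draw -> 2.2 hours
# 45W average draw -> 1.9 hours
```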

With that said, one other interesting piece of information is that in our Light battery test (Internet surfing) using the same Balanced power profile, with the GTX 980M enabled the GT72 manages around 220 minutes of mobility. (Our Heavy battery test drops it to 165 minutes, if you're wondering.) While two hours of gaming isn't going to be enough for a LAN party, it's still quite impressive to see the GTX 980M effectively drawing about as much power as a GT 750M when BatteryBoost is enabled – though in most cases it's also providing roughly the same level of performance as a GT 750M (under AC power).

Comments

  • JarredWalton - Thursday, October 23, 2014

    This particular laptop does not have Optimus; you can manually enable/disable the GPU, though it requires a reboot. Since I'm testing games on the GPU, however, I wanted to compare battery life gaming to battery life not gaming (but with the GPU still active). It looks like the 980M uses around 8W idle, give or take, so turning it off and using the HD 4600 will improve battery life into the 6 hour range.
  • sonicmerlin - Tuesday, October 28, 2014

    Given that these things have much larger batteries than ultrabooks, which can last significantly longer than 6 hours, you'd think they would get longer run times when using the IGP.
  • Krysto - Friday, October 24, 2014

    Not a bad idea, this feature.
  • Calista - Friday, October 24, 2014

    You can already today have a decent gaming experience with a 4 hour battery life. But you won't get it running a modern game at full tilt. We have the technology already; it's all about how the market works. More efficient components also allow for faster components. But those will consume more energy. And we've come full circle. My advice - return to games made five years ago and they will run very well on an Intel GPU while giving long battery life.

    Long battery life/High framerates/Good graphics - feel free to pick two of those. But you will never get all three.
  • RoninX - Friday, October 24, 2014

    Or carry a spare battery.

    I just bought a new MSI GT60 Dominator with the GTX 970M. The main reason I picked this over the smaller, lighter GS60 Ghost is that the GT60 comes with a removable 9-cell battery, where the GS60 has a non-removable 6-cell battery.

    I get over 2 hours of runtime with Borderlands: The Pre-Sequel at high settings, 30 fps, and 1920x1080. With a spare battery, that's over 4 hours, which is plenty for my primary use case for battery gaming (gaming while waiting for airline flights).

    I was also impressed with the GT60's full performance plugged into AC, which comes close to my desktop (i7-2700k with GTX 680) using 3D Mark. The fan does sound a bit like a hovercraft when the CPU/GPU is running at full tilt, but I can live with that.
  • jann5s - Tuesday, October 28, 2014

    I love these type of articles, thank you AT!

    If I may propose another topic: The visual impact of game quality settings (e.g. FSAA) compared to the cost in performance.
