For those who keep a close eye on consumer hardware, AMD has recently been involved in a minor uproar with some of its most vocal advocates over the newest Ryzen 3000 processors. Some users are reporting turbo frequencies much lower than advertised, and a number of conflicting AMD partner posts have generated a good deal of confusion. AMD has since posted an update identifying an issue and offering a fix, but part of all of this comes down to what turbo means and how AMD processors differ from Intel’s. We’ve been living on Intel’s definitions of perceived standards for over a decade, so it’s a hard nut to crack if everyone assumes there can be no deviation from what we’re used to. In this article, we’re diving into those perceived norms to shed some light on how these processors work.

A Bit of Context

Since the launch of Zen 2 and the Ryzen 3000 series, depending on which media outlet you talk to, there has been a peak turbo issue with the new hardware. This turbo frequency issue has been permeating the ecosystem since Zen 2 was launched, with popular outlets like Gamers Nexus noting that on certain chips, the advertised turbo frequency was only achieved under extreme cooling conditions. For other outlets, being within 50 MHz of the peak turbo frequency has been considered chip-to-chip variation, or a function of early beta firmware. People have put varying amounts of weight behind this, ranging from cries of conspiracy to not being bothered about it at all.

However, given recent articles by some press, as well as some excellent write-ups by Paul Alcorn over at Tom’s Hardware*, we saw that the assumed public definitions of processor performance actually differ between Intel and AMD. The default standard definitions we have been using, which are based on Intel’s, are not the same under AMD, and this is confusing everyone. No one likes a change to the status quo, and even with articles out there offering a great breakdown of what’s going on, a lot of the general enthusiast base is still trying to catch up to all of the changes.

This confusion – and the turbo frequency discussion in general – was then brought to the forefront of the news at the beginning of September 2019. In a two-week span, several things happened to AMD essentially all at once:

  1. Popular YouTuber der8auer ran a public poll of turbo frequency reporting that painted AMD in a very bad light, with some users over 200 MHz down on turbo frequency,
  2. The company settled for $12.1m in a lawsuit about marketing Bulldozer CPUs,
  3. Intel made some seriously scathing remarks about AMD performance at a trade show,
  4. AMD’s enterprise marketing team was comically unaware of how its materials would be interpreted.

Combined with all of the drama that the computing industry is known for – and the desire for an immediate explanation, even before the full facts were in – this made for a historically bad week for AMD. Of course, we reported on some of these issues, such as the lawsuit, because they are interesting to share. Others we ignored: (4), because knowing the individuals behind the materials we saw nothing other than an honest mistake, and (3), because it just wasn’t worth drawing attention to.

The discussion about peak turbo came to a head because of (1). Der8auer’s public poll, taken from a variety of users with different chips, different motherboards, different cooling solutions, and different BIOS versions, showed that fewer than 6% of 3900X users were able to achieve AMD’s advertised turbo frequency in real-world use. Any way you slice it, without context, that number sounds bad.

Meanwhile, between this data coming out and AMD’s eventual response, a couple of conflicting forum posts appeared from AMD partner employees and experts in the field. This greatly exacerbated the issue, particularly among the vocal members of the community. We’ll go into detail on those later.

AMD’s response, on September 10th, was a new version of its firmware, called AGESA 1003-ABBA. It was released along with a blog post explaining that a minor firmware issue, responsible for a 25-50 MHz drop in turbo frequency, had now been fixed.

Naturally, that doesn’t help users who are down 300 MHz, but part of the answer comes down to how well the user understands how AMD’s hardware works. This article is designed to shed some light on the timeline here, as well as to explain a few nuances of AMD’s turbo tech, which are different to what the public has come to understand from Intel’s use of specific terms over the last decade.
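One recurring wrinkle in this story is how hard it is to even observe peak turbo: boost residency can be a matter of milliseconds, so a tool that polls too slowly will under-report the peak. As a rough, hedged illustration only (Linux-specific, reading the kernel’s /proc/cpuinfo values; the function names are our own, and this is not the tooling AMD or any reviewer necessarily uses), a high-rate sampler might look like:

```python
import re
import time

def parse_core_mhz(cpuinfo_text):
    """Extract the per-core 'cpu MHz' readings from /proc/cpuinfo text."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([0-9.]+)", cpuinfo_text)]

def sample_peak_mhz(samples=200, interval_s=0.005):
    """Poll /proc/cpuinfo at a high rate and keep the highest single-core
    reading seen. Boost bursts can last only milliseconds, so any burst
    that falls between samples is missed -- the result is a lower bound."""
    peak = 0.0
    for _ in range(samples):
        with open("/proc/cpuinfo") as f:
            readings = parse_core_mhz(f.read())
        if readings:
            peak = max(peak, max(readings))
        time.sleep(interval_s)
    return peak
```

Even a sampler like this only gives a lower bound on the true peak under a bursty single-threaded load, which is part of why different users and tools can report such different turbo numbers for the same chip.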

*Paul’s articles on this topic are well worth a read:
Ryzen 3000, Not All Cores Are Created Equal
Investigating Intel’s Claims About Ryzen Reliability
Testing the Ryzen 3000 Boost BIOS Fix

This Article

In this article we will cover:

  • Intel’s Definition of Turbo
  • AMD’s Definition of Turbo
  • Why AMD is Binning Differently to Intel, relating to Turbo and OC
  • A Timeline of AMD’s Ryzen 3000 Turbo Reporting
  • How to Even Detect Turbo Frequencies
  • AMD's Fix

Defining Turbo, Intel Style
Comments

  • Exodite - Wednesday, September 18, 2019 - link

    I'll take the opportunity to free-ride eastcoast_pete's comment to second its content! :)

    Awesome article Ian, this is the kind of stuff that brings me to AnandTech.

    Also, in particular I found it fascinating to read about AMD's solution to electromigration - Zen seems to carry around a lot of surprises still! Adding to pete's ask re: overclocking vs. lifespan I'd be very interested to read more about how monitoring and counteracting the effects of electromigration actually works with AMD's current processors.

    Thanks again!
  • HollyDOL - Wednesday, September 18, 2019 - link

    I used to have a factory-OCed GTX 580 (EVGA hydro model, bought when it was fresh new merchandise)... For more than half of its lifetime I was also running BOINC on it. I swapped it for a GTX 1080 when that was fresh and new, so by the time it was replaced with a faster card it was 5-6 yrs old.

    From this one case, I'd guess that unless you go extreme OC or fail otherwise (condensation on subambient, very bad airflow, wrongly fitted cooler, etc. etc.), you'll replace it with a new one first anyway, since the component will get to an age where no OC saves it from being obsolete.

    Though I'd be curious about more reliable numbers as well.
  • Gondalf - Tuesday, September 17, 2019 - link

    Intel does not guarantee its turbo either, yet it still delivers it well; AMD, at least for now, does not. Fix or no fix, it is pretty clear that current 7nm processes are slower than 14nm, and nothing can change this. For decades a new process has always had a higher drive current than the older one; this time that did not happen. It is pretty striking to see a server CPU with 20% lower ST performance only because the low-power process utilized is unable to deliver a clock speed near 4 GHz, an absurd thing considering that Intel's 14nm LP gives 4 GHz at 1 V without struggle. Anyway..... this is the new world in the upcoming years.
  • Korguz - Tuesday, September 17, 2019 - link

    intel also does not guarantee your cpu will use only 95 watts when at max speed... what's your point? cap that cpu at the watts intel specifies, and look what happens to your performance.
  • Gondalf - Tuesday, September 17, 2019 - link

    My point is that power consumption is not a concern in the current desktop landscape, which is made up of enthusiasts with an SSD full of games; they want top ST perf at any cost, no matter if that means 200 W of power consumption.
    The story is absolutely different in mobile and server, but definitely not in all of the workloads around.
  • vanilla_gorilla - Tuesday, September 17, 2019 - link

    > they want top ST perf at any cost

    They actually don't. Because no one other than the Farmville-level gamer is CPU bound; everyone is GPU bound. The only exception is possibly people playing at 1080p (or less) with framerates of 200-300 or more. There are no real situations where you will see any perceptible difference between a high-end AMD or Intel CPU for gaming while using a modern discrete GPU.

    The difference is buying AMD is cheaper, both the CPU and the platform, which has a longer lifetime by the way (AM4) and you get multicore performance that blows Intel away "for free".
  • N0Spin - Monday, October 21, 2019 - link

    I have seen reviews of demanding current-generation gaming titles like Battlefield 5 in which reviewers definitely noted that the CPU tier and number of cores influence performance. I am not stating that this is always the case, but CPUs/cores can and do matter in a number of instances, even if all you do is game after running a kill-all-extraneous-processes script.
  • Xyler94 - Tuesday, September 17, 2019 - link

    You're speaking for yourself here...

    I don't care if my CPU gets me 5 more FPS when I'm already hitting 200+ FPS; I care that the darn thing A: doesn't cook itself to death, and B: doesn't slow down when I'm hitting it with more tasks.

    People have done the test, and you can too if you have an overclocking friendly PC. disable all but 1 core, and run it at 4GHZ, and see how well your PC performs. Then, enable 4 cores, and set them at 1GHZ, see how well the PC feels. It was seen that 4 cores at 1GHz was better than 1 core at 4ghz. The reality? More cores do more work. It's that simple.

    You either don't pay for electricity or are in a spot where the electricity cost of your computer doesn't factor into your monthly bill. Some people do care if a single part of their PC draws 200W of power. I certainly care, because the lower the wattage, the less I have to spend on a super expensive UPS to power my device. Also, gaming is becoming more multi-threaded, so eventually ST performance won't matter anyway.
  • Korguz - Tuesday, September 17, 2019 - link

    Gondalf, sorry, but nope.. for some, how much power a cpu uses is a concern, especially when choosing an HSF to cool that cpu; they buy one, only to find that it isn't enough to keep the cpu cool enough to run at the specs Intel says. And labeling a cpu as using 95 watts when it uses 200 or more is a HUGE difference. But you are speaking for yourself on the ST performance, as Xyler94 mentioned.
  • evernessince - Tuesday, September 17, 2019 - link

    How about no. 200w for a few FPS sounds like a terrible trade off unless you are cooking eggs on your nipples with the 120 F room you are sitting in after that PC is running for 1 hour.
