Thanks to the continued proliferation of 8Gb GDDR5 memory modules, the memory capacities of professional graphics cards have risen over the last few months. For the professional graphics market this is always a welcome development, as datasets are already massive and always growing, especially in the content creation field.

Due to various technical considerations – primarily a larger memory bus – AMD has traditionally offered the highest-capacity professional graphics cards, with the current FirePro W9100 topping out at 16GB. More recently, last month NVIDIA surpassed AMD with the launch of the 24GB Quadro M6000. This week, however, ahead of the 2016 NAB Show, AMD is firing back and retaking the top spot with a capacity bump of their own, updating the FirePro W9100 to 32GB.

AMD FirePro W Series Specification Comparison
  AMD FirePro W9100 (32GB) AMD FirePro W9100 (16GB) AMD FirePro W9000 AMD FirePro W8100
Stream Processors 2816 2816 2048 2560
Texture Units 176 176 128 160
ROPs 64 64 32 64
Core Clock 930MHz 930MHz 975MHz 824MHz
Memory Clock 5Gbps GDDR5 5Gbps GDDR5 5.5Gbps GDDR5 5Gbps GDDR5
Memory Bus Width 512-bit 512-bit 384-bit 512-bit
Double Precision 1/2 1/2 1/4 1/2
Transistor Count 6.2B 6.2B 4.31B 6.2B
TDP 275W 275W 274W 220W
Manufacturing Process TSMC 28nm TSMC 28nm TSMC 28nm TSMC 28nm
Architecture GCN 1.1 GCN 1.1 GCN 1.0 GCN 1.1
Warranty 3-Year 3-Year 3-Year 3-Year
Launch Price (List) $4999 $3999 $3999 $2499
Launch Date Q2 2016 April 2014 August 2012 July 2014

The updated FirePro W9100 picks up right where the previous model left off. Based around a fully enabled version of AMD’s Hawaii GPU, the specifications outside of memory capacity are unchanged. As for the memory itself, this update sees AMD replace their 4Gb GDDR5 chips with 8Gb chips, moving from a 32 x 4Gb configuration to a 32 x 8Gb configuration. Consequently, any performance impact depends on data set size: performance is essentially unchanged for data sets that already fit within 16GB, while sets between 16GB and 32GB that were slow before because they didn’t fit on the card can now be loaded in their entirety.
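The capacity and bandwidth figures above follow directly from the chip configuration and the table's specs. As a quick back-of-the-envelope sketch (the helper function names are ours, not AMD's; "5Gbps" is the per-pin effective GDDR5 data rate):

```python
# Sanity-check the W9100's memory figures from first principles.

def capacity_gb(num_chips, density_gbit):
    """Total capacity in GB: chip count x per-chip density (Gbit), 8 bits per byte."""
    return num_chips * density_gbit / 8

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width x per-pin data rate, over 8."""
    return bus_width_bits * data_rate_gbps / 8

old_cap = capacity_gb(32, 4)      # 32 x 4Gb chips -> 16 GB (original W9100)
new_cap = capacity_gb(32, 8)      # 32 x 8Gb chips -> 32 GB (updated W9100)
bw      = bandwidth_gbs(512, 5)   # 512-bit bus at 5Gbps -> 320 GB/s, unchanged

print(old_cap, new_cap, bw)  # 16.0 32.0 320.0
```

Since the bus width and data rate are untouched, only capacity changes; bandwidth stays at 320 GB/s either way.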

With their latest capacity bump, AMD becomes the first company to ship a 32GB pro graphics card, and consequently retakes their top spot in the market. At the same time AMD will have final bragging rights for this generation, as AMD and NVIDIA have now both maxed out the memory capacity of their current cards.

The 32GB FirePro W9100 will be launching this quarter through AMD’s usual distribution and OEM partners. The MSRP will be $4999, which closely aligns with competitor NVIDIA’s own pricing, though it is also higher than the 16GB card it supplants. Meanwhile AMD will continue to ship the 16GB card as well, and while there isn’t a current MSRP attached to it, it’s currently available from retailers for around $3000.

Comments Locked

  • Achaios - Friday, April 15, 2016 - link

    Seriously, where the heck did this guy come from and why does such a post get thumbs up at Anandtech? Was this guy living in a cave somewhere?

    This guy is saying that modern games do not even utilize 4 GB on the GPU, however, Grand Theft Auto V (GTA V) is utilizing 100% of the 3072 MB GDDR5 of my GTX 780 TI AND 8 GB System Memory AND is in addition reserving AN ADDITIONAL 14 GB of system memory! And that is at 1920x1080, and I don't have enough GPU RAM left to maximize Graphics Settings (i.e. "Advanced Graphics" switched off) and even one or two settings in the Graphics section of GTA V were not set to ULTRA because I am using 100% of my GPU RAM.

    So, indeed we are living in an era where we are stuck with lesser technology GPUs and we are also stuck with GPUs that have far less RAM than they ought to. IMO, top-end mainstream gaming GPUs should have come with 12 GB memory.

    For the record, the only GPU that can play GTA V with everything set to max at 4k is the 12 GB Titan variant. They tried with a GTX980TI 6GB and the card was choking due to lack of memory. So much for the OP's claim that no game can utilize 4 or 8 GB memory, ahahaha.
  • Manch - Friday, April 15, 2016 - link

    I don't think that +1 means what you think it means. That was for the critical reply to the dude's rant. Nobody gave him a thumbs up....
  • mapesdhs - Friday, April 15, 2016 - link

    Modded games also use a lot of RAM. Skyrim is a good example, people with complex setups easily use 6GB+.

    Heck, my customised Crysis config uses almost 4GB. :D
  • minasnoldo - Friday, April 15, 2016 - link

    If I recall correctly, PC cards have issues with modern console games due to the Heterogeneous Memory Architecture of the consoles (I believe Anandtech did an excellent article on HMA a year or 2 back).

    The idea is that game developers are given a large pool of memory to carve up as they see fit. Turns out that a lot of triple-A game makers have been choosing to devote large portions of it to textures, hence the issues on even top-tier cards. It seems it isn't a power issue so much as it is a resource availability issue.
  • Samus - Sunday, April 17, 2016 - link

    Poor programming partially, but the root issue stands. We need more GPU memory. The GPU handles a lot of post processing now, and with textures and resolutions increasing, we will only need more. 4GB is the minimum for modern AAA gaming at 1080p in full detail.
  • Gonemad - Tuesday, April 19, 2016 - link

    Exactly. I had a 1GB graphics card that COULD run GTA, but not at 1080p without those nagging errors that GTA spits out. As a stopgap, I bought a R9 200 with 3GB of RAM, so I can at least bump to 1080p and keep a lot of things below ultra to make it playable at an average of 35fps with dips into 25fps. Using a relic i7 920 CPU...

    I don't mind the cinema standard of 24fps, but it turned out to be a good parameter to know when the GFX card is choking. On the bright side, I have 12GB of RAM, and the GTA executable sucks a whole 8GB for itself, leaving the rest for the system.

    So yeah, theory disproven. I need every ounce of GPU and CPU power I can muster to play GTA 5 at least @ 1080p.

    I'd have to dump my whole rig and buy a fresh machine with a Titan to dial everything up.
  • minasnoldo - Tuesday, April 19, 2016 - link

    Is it possible there is something else going on? I am not a programmer so I am completely out of my depth here (but I am thinking out loud so please be patient).

    What I mean by the "something else": I am reminded of the DX9 requirement (IIRC) that everything in video RAM be mirrored in system memory, which would seriously cut into your available RAM if you were running a 32-bit program.

    A year or 2 ago, when Anandtech was writing some articles about HMA and hUMA, they talked about some of the finer points of memory reservations/sharing between systems and GPUs. My uninformed thoughts on high system memory usage have me wondering if it is possible that there is something about how the instructions on the consoles work that ends up causing a lot of memory duplication when the easiest porting methods are used (as opposed to ripping the game apart and practically remaking the various bits specifically for traditional PC architecture, something that would cost them more money).

    Anyway, those are just my thoughts. Anybody have insight deeper than just "poor optimization"?
  • 06GTOSC - Friday, April 15, 2016 - link

    People who can't afford $1500-2000 gaming rigs apologize to you for "holding back" your entertainment.
  • BurntMyBacon - Friday, April 15, 2016 - link

    @06GTOSC: "People who can't afford $1500-2000 gaming rigs apologize to you for "holding back" your entertainment."

    Apology accepted, but there is really no need; it's just a game. ... Oh, wait. You weren't talking to me.
  • BrokenCrayons - Friday, April 15, 2016 - link

    Hey, games are serious business. People spend a lot of time arguing about nearly every aspect of video games because they play an important role in the formation of the lens through which they view the world. For some people, absolutely nothing else has ever or will ever matter more than giggling gleefully into a microphone about how amazing it was to cause the fictional death of another person's digital representation in the confines of a game's artificial world. Just because the rest of us muted all the audio and aren't even sitting behind our screens because we're busy living our lives in the material world doesn't diminish the importance to that one person.
