Wolfenstein II: The New Colossus (Vulkan)

id Software is popularly known for a few games involving shooting stuff until it dies, just with a different 'stuff' for each one: Nazis, demons, or other players while scorning the laws of physics. Wolfenstein II is the latest of the first kind, the sequel in a modern reboot series developed by MachineGames and built on id Tech 6. While the tone is significantly less pulpy nowadays, the game is still a frenetic FPS at heart, succeeding DOOM as a modern Vulkan flagship title and arriving as a pure Vulkan implementation, unlike DOOM, which originally shipped on OpenGL.

Set in a Nazi-occupied America of 1961, Wolfenstein II is lushly designed yet not oppressively demanding on hardware, something that suits its pacing, where bursts of action emerge suddenly from level design flush with alternate-history details.

The highest quality preset, "Mein leben!", was used. Wolfenstein II also features Vega-centric GPU Culling and Rapid Packed Math, as well as Radeon-centric Deferred Rendering; in accordance with the preset, neither GPU Culling nor Deferred Rendering was enabled.

[Benchmark charts: Wolfenstein II average frame rates at 2560x1440 and 1920x1080, and 99th percentile frame rates at 2560x1440 and 1920x1080.]
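The 99th percentile figures capture consistency rather than raw speed: they report the frame rate corresponding to the slowest 1% of frames, so a small gap between the average and the 99th percentile indicates smooth frame delivery. As a rough sketch of how such a figure can be derived from a frame-time capture (the frame times below are invented for illustration, and the nearest-rank method here is one of several percentile conventions):

```python
# Hypothetical per-frame render times in milliseconds from a benchmark pass.
frame_times_ms = [9.2, 10.1, 8.8, 11.5, 25.0, 9.9, 10.4, 9.5, 30.2, 10.0]

def percentile_fps(times_ms, pct=99.0):
    """Return the frame rate corresponding to the pct-th percentile
    frame time (i.e. the threshold of the slowest frames), nearest-rank."""
    ordered = sorted(times_ms)                      # slowest frames last
    rank = min(len(ordered) - 1, int(len(ordered) * pct / 100.0))
    worst_case_ms = ordered[rank]                   # pct-th percentile frame time
    return 1000.0 / worst_case_ms                   # convert ms/frame to fps

print(round(percentile_fps(frame_times_ms), 1))
```

Real reviews capture thousands of frames per run; with a sample this small, the 99th percentile simply picks out the single slowest frame.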

As we've seen before, Turing and Vega tend to run well in Wolfenstein II. Among our games, these results are actually the closest the RX 590 can get to the GTX 1660 Ti, and even here the GTX 1660 Ti is a solid 13-14% ahead. The GTX 1660 Ti also pulls its biggest lead over the GTX 1060 6GB here, coming in at more than 1.5X faster, but it also loses to the RX Vega 56 by a wider margin than in our other games.

The 6GB of framebuffer doesn't seem to be holding the GTX 1660 Ti back. The GTX 960's 2GB framebuffer, on the other hand, is asphyxiating.


157 Comments


  • Yojimbo - Saturday, February 23, 2019 - link

    My guess is that in the next (7 nm) generation, NVIDIA will create the RTX 3050 to have a very similar number of "RTX-ops" (and, more importantly, actual RTX performance) as the RTX 2060, thereby setting the capabilities of the RTX 2060 as the minimum targetable hardware for developers to apply RTX enhancements for years to come.
  • Yojimbo - Saturday, February 23, 2019 - link

    I wish there were an edit button. I just want to say that this makes sense, even if it eats into their margins somewhat in the short term. Right now people are upset over the price of the new cards. But that will pass, assuming RTX actually proves to be successful in the future. However, if RTX does become successful but the people who paid money to be early adopters for lower-end RTX hardware end up getting squeezed out of the ray-tracing picture, that is something people will be upset about, and which NVIDIA wouldn't overcome so easily. To protect their brand image, NVIDIA needs a plan to make present RTX purchases useful in the future, given that they aren't all that useful in the present. They can't betray the faith of their customers. So with that in mind, disabling perfectly capable RTX hardware on lower-end cards makes sense.
  • u.of.ipod - Friday, February 22, 2019 - link

    As a SFFPC (mITX) user, I'm enjoying the thicker, but shorter, card as it makes for easier packaging.
    Additionally, I'm enjoying the performance of a 1070 at reduced power consumption (20-30W), and therefore less noise and heat!
  • eastcoast_pete - Friday, February 22, 2019 - link

    Thanks! Also a bit disappointed by NVIDIA's continued refusal to "allow" a full 8 GB VRAM in these middle-class cards. As to the card makers omitting the VR required USB3 C port, I hope that some others will offer it. Yes, it will add $20-30 to the price, but I don't believe I am the only one who's like the option to try some VR gaming out on a more affordable card before deciding to start saving money for a full premium card. However, how is VR on Nvidia with 6 GB VRAM? Is it doable/bearable/okay/great?
  • eastcoast_pete - Friday, February 22, 2019 - link

    "who'd like the option". Google keyboard, your autocorrect needs work and maybe some grammar lessons.
  • Yojimbo - Friday, February 22, 2019 - link

    Wow, is a USB3C port really that expensive?
  • GreenReaper - Friday, February 22, 2019 - link

    It might start to get closer once you throw in the circuitry needed for delivering 27W of power at different levels, and any bridge chips required.
  • OolonCaluphid - Friday, February 22, 2019 - link

    >However, how is VR on Nvidia with 6 GB VRAM? Is it doable/bearable/okay/great?

    It's 'fine' - the GTX 1050 Ti is VR-capable with only 4GB of VRAM, although it's not really advisable (see Craft Computing's 1050 Ti VR assessment on YouTube - it's perfectly usable and a fun experience). The RTX 2060 is a very capable VR GPU with 6GB of VRAM. It's not really VRAM that is critical to VR GPU performance anyway - more the raw compute performance of rendering the same scene from two viewpoints simultaneously. So I'd assess that the 1660 Ti is a perfectly viable entry-level VR GPU. Just don't expect miracles.
  • eastcoast_pete - Saturday, February 23, 2019 - link

    Thanks for the info! About the miracles: Learned a long time ago not to expect those from either Nvidia or AMD - fewer disappointments this way.
  • cfenton - Friday, February 22, 2019 - link

    You don't need a USB C port for VR, at least not with the two major headsets on the market today.
