Ashes of the Singularity: Escalation (DX12)

A veteran from both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games tailoring and designing the Nitrous Engine around such low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides a good common ground between the forward-looking APIs of today. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by offering such a tool publicly and as part-and-parcel of the game, it's an example that other developers should take note of.
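
To make the submission model concrete, here is a minimal Direct3D 12 sketch of the pattern the article credits Oxide with exploiting: several threads record their own command lists in parallel, and the results are handed to the GPU in one batched call. This is purely illustrative boilerplate (empty command lists, a hypothetical thread count), not the Nitrous Engine's actual code.

```cpp
// Minimal sketch: multi-threaded command list recording + batched submission.
// Illustrative only -- empty command lists stand in for real draw recording.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    const unsigned kThreads = 4;  // hypothetical worker count
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        // Command lists are created in the recording state.
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each worker records into its own list; a real engine would issue
        // draw calls and state changes here before closing the list.
        workers.emplace_back([&lists, i] { lists[i]->Close(); });
    }
    for (auto& t : workers) t.join();

    // Hand every recorded list to the GPU in a single batched submission.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    return 0;
}
```

The same queue model extends to asynchronous compute: compute work is recorded into command lists submitted on a second queue of type D3D12_COMMAND_LIST_TYPE_COMPUTE, which can overlap with graphics submissions.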

Settings and methodology remain identical to their usage in the 2016 GPU suite. To note, we are utilizing the original Ashes Extreme graphical preset, which is equivalent to the current preset but with MSAA dialed down from 4x to 2x and Texture Rank (MipsToRemove in settings.ini) adjusted.
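
For readers who want to replicate the older preset, the tweak lives in the game's settings.ini. The fragment below is only a hypothetical illustration: MipsToRemove is the key named above, but the second key name and both values are assumptions rather than the game's verbatim defaults.

```ini
; Hypothetical settings.ini fragment (key names/values assumed for illustration)
MipsToRemove=1        ; "Texture Rank" -- removing top mip levels lowers texture detail
MSAASampleCount=2     ; 2x MSAA as in the older Extreme preset (the current preset uses 4x)
```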

We've updated some of the benchmark automation and data processing steps, so results at 1080p may differ slightly from previously published data.

[Benchmark charts: Ashes of the Singularity: Escalation at Extreme Quality. Average frame rates at 2560x1440 and 1920x1080, plus 99th percentile frame rates at both resolutions.]

Interestingly, Ashes shows the smallest improvement in the suite for the GTX 1660 Ti over the GTX 1060 6GB. Similarly, the GTX 1660 Ti lags behind the GTX 1070 here, a card that elsewhere is already a close match for the newer Turing part. With the GTX 1070 FE and RX Vega 56 neck-and-neck, the GTX 1660 Ti splits the gap between the RX 590 and RX Vega 56.

157 Comments

  • eva02langley - Friday, February 22, 2019 - link

    At $280 for a Vega 56 bundled with 3 games, it's a no-brainer and one of the best values as of late. Can't wait for Navi to disrupt this overdue, stagnant market even more.
  • CiccioB - Friday, February 22, 2019 - link

    Yes, it will be a new black hole in AMD's quarterly results if the production cost/performance is the same as the old GCN line...
    You see, selling an HBM monster like Vega at that price simply means that the project is a complete flop (as Fiji was), and nvidia can continue selling its mainstream GPUs at whatever price it wants despite the not-so-good market period.
  • eva02langley - Friday, February 22, 2019 - link

    Final Fantasy XV is another game gimping AMD due to its GameWorks implementation.
  • eddman - Friday, February 22, 2019 - link

    They disable those before benchmarking. From the article: "For our testing, we enable or adjust settings to the highest except for NVIDIA-specific features"
  • CiccioB - Friday, February 22, 2019 - link

    All games gimp nvidia, as their engines are written for the consoles, which mount obsolete AMD HW.
  • Oxford Guy - Saturday, February 23, 2019 - link

    It's hardly difficult to add in a bit of special slowdown sauce for the "PC" versions.
  • Comagnum - Friday, February 22, 2019 - link

    This is such a joke. Vega 56 is now the same price and outperforms this terrible product, and the 1070 (AIB versions) performs similarly enough that the 1660 Ti has no real place in the market right now. Nvidia is a greedy, terrible company. What a joke.
  • Falcon216 - Friday, February 22, 2019 - link

    I followed your advice and bought a Vega 56 instead of a 1660 Ti, and now my power supply has been making those weird noises animals make when they're suffering and need help. What do I do?
  • Cooe - Friday, February 22, 2019 - link

    Fanboy nonsense alert!!! Unless you bought your power supply at a Chinese flea market, ignore this dude.

    (Granted there are totally cases where you'd want something like a 1660Ti over a V56 for efficiency reasons [say ultra SFF], but this guy's spitting nonsense)
  • Falcon216 - Friday, February 22, 2019 - link

    My point
    ========
    Your Head

    The V56 uses ~200W nominally depending on your choice of settings; in the detailed Tom's review it goes as low as 160W at the lowest performance level and as high as 235W depending on the choice of power BIOS. The 1660 Ti is then shown to use ~125W in BF1, and (assuming Tom's tested the V56 at stock settings) Anand's BF1 test shows a 9 FPS lead (11%) over the 1660 Ti. I'll trade that 11% performance for 40% less (absolute scale) power usage any day. My PSU ain't getting any younger, and "lol just buy another one" is dumb advice that dumb people give.

    Happy now?
