Ashes of the Singularity: Escalation (DX12)

A veteran of both our 2016 and 2017 game lists, Ashes of the Singularity: Escalation remains the DirectX 12 trailblazer, with developer Oxide Games designing its Nitrous Engine around these low-level APIs. The game makes the most of DX12's key features, from asynchronous compute to multi-threaded work submission and high batch counts. And with full Vulkan support, Ashes provides good common ground between today's forward-looking APIs. Its built-in benchmark tool is still one of the most versatile ways of measuring in-game workloads in terms of output data, automation, and analysis; by shipping such a tool publicly as part-and-parcel of the game, Oxide sets an example other developers should take note of.

Settings and methodology remain identical to their usage in the 2016 GPU suite. To note, we are utilizing the original Ashes Extreme graphical preset, which differs from the current one in that MSAA is dialed down from x4 to x2, and Texture Rank (MipsToRemove in settings.ini) is adjusted.
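Since the tweak above is applied through the game's settings.ini rather than the in-game menus, a rough sketch of what the relevant entries might look like follows. Only the MipsToRemove key is confirmed by the game's configuration; the MSAA key name and the specific values shown are assumptions for illustration:

```ini
; settings.ini - hypothetical excerpt; key names other than
; MipsToRemove, and the values shown, are assumptions
MSAA=2          ; original Extreme preset level, vs. x4 in the current preset
MipsToRemove=1  ; "Texture Rank": higher values discard the largest mip levels
```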

We've updated some of the benchmark automation and data processing steps, so results may vary at the 1080p mark compared to previous data.
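The 99th percentile figures charted below are derived from per-frame timing data that the benchmark tool exports. A minimal sketch of that kind of calculation, using a generic linear-interpolation percentile over frame times (this is illustrative only, not necessarily Oxide's exact method or output format):

```python
def percentile_frametime(frametimes_ms, pct=99.0):
    """Return the pct-th percentile frame time (ms) from a list of
    per-frame times, using linear interpolation between ranks."""
    xs = sorted(frametimes_ms)
    k = (len(xs) - 1) * pct / 100.0          # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def percentile_fps(frametimes_ms, pct=99.0):
    """Convert the pct-th percentile frame time into an FPS figure,
    i.e. the frame rate sustained for the smoothest pct% of frames."""
    return 1000.0 / percentile_frametime(frametimes_ms, pct)
```

Reporting the 99th percentile as an FPS number (rather than a raw frame time) keeps it directly comparable with the average-FPS charts: a large gap between the two indicates stuttering that an average alone would hide.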

Ashes of the Singularity: Escalation - 2560x1440 - Extreme Quality

Ashes of the Singularity: Escalation - 1920x1080 - Extreme Quality

Ashes: Escalation - 99th Percentile - 2560x1440 - Extreme Quality

Ashes: Escalation - 99th Percentile - 1920x1080 - Extreme Quality

Interestingly, Ashes offers the GTX 1660 Ti its smallest improvement in the suite over the GTX 1060 6GB. Likewise, the GTX 1660 Ti lags behind the GTX 1070 here, a card the newer Turing part otherwise stays close to. With the GTX 1070 FE and RX Vega 56 neck-and-neck, the GTX 1660 Ti splits the gap between the RX 590 and RX Vega 56.


157 Comments


  • Psycho_McCrazy - Tuesday, February 26, 2019 - link

    Given that 21:9 monitors are also making great inroads into the gamer's purchase lists, can benchmark resolutions also include 2560.1080p, 3440.1440p and (my wishlist) 3840.1600p benchies??
  • eddman - Tuesday, February 26, 2019 - link

    2560x1080, 3440x1440 and 3840x1600

That's how you write it, and the "p" should not be used when stating the full resolution, since it's only supposed to denote video format resolutions.

    P.S. using 1080p, etc. for display resolutions isn't technically correct either, but it's too late for that.
  • Ginpo236 - Tuesday, February 26, 2019 - link

    a 3-slot ITX-sized graphics card. What ITX case can support this? 0.
  • bajs11 - Tuesday, February 26, 2019 - link

    Why can't they just make a GTX 2080Ti with the same performance as RTX 2080Ti but without useless RT and dlss and charge something like 899 usd (still 100 bucks more than gtx 1080ti)?
    i bet it will sell like hotcakes or at least better than their overpriced RTX2080ti
  • peevee - Tuesday, February 26, 2019 - link

    Do I understand correctly that this thing does not have PCIe4?
  • CiccioB - Thursday, February 28, 2019 - link

No, it does not have a PCIe 4.0 bus.
Do you think it should?
  • Questor - Wednesday, February 27, 2019 - link

Why do I feel like this was a panic plan, an attempt to bandage the bleeding from the RTX failure? No support at launch, and months later still abysmal support, for a non-game-changing and insanely expensive technology.

    I am not falling for it.
  • CiccioB - Thursday, February 28, 2019 - link

Yes, a "panic plan" that required about 3 years to create the chips.
So 3 years ago they already knew they would panic at the RTX launch, and made the RT-less chip as well. They didn't know that RT couldn't perform well with the low CUDA core counts of lower-end cards.
They didn't know that the competitor would play the only card it had left, price, since they couldn't have guessed the competitor wasn't ready with a beefed-up architecture capable of the same functionality.
So, yes, they panicked for sure. They were not prepared for anything that is happening.
  • Korguz - Friday, March 1, 2019 - link

"that required about 3 years to create the chips. So 3 years ago they already knew they would panic at the RTX launch, and made the RT-less chip as well. They didn't know that RT couldn't perform well with the low CUDA core counts of lower-end cards."

And where did you read this? You do understand and realize it IS possible to disable or remove parts of an IC without spending "about 3 years" to create the product, right? Intel does it with the IGP in their CPUs, and AMD did it back in the Phenom days with chips like the Phenom X4 and X3...
  • CiccioB - Tuesday, March 5, 2019 - link

So they created the TU116, a completely new die without RT and tensor cores, to shrink die size while giving up about 15% of performance relative to the 2060, all in 3 months, because they panicked?
You clearly have no idea of the effort it takes to create a new 280mm^2 die.
Judging by this and your previous posts, you have no idea what you are talking about at all.
