After being derailed by the SARS-CoV-2 coronavirus pandemic not once but twice, NVIDIA’s GTC 2020 keynote address has finally been rescheduled. The virtual keynote is now set to be broadcast on YouTube on May 14th, at 6am Pacific (13:00 UTC).

With GTC being one of the many technology trade shows impacted by the now-global pandemic, NVIDIA resorted to breaking up its annual GPU Technology Conference into multiple pieces. A number of presentations and sessions originally scheduled to be given at the show have instead been moved online as part of NVIDIA’s digital GTC 2020 conference. Meanwhile the show-defining keynote speech, as always to be delivered by CEO Jensen Huang, was previously rescheduled as a digital event for March before being postponed entirely.

And while NVIDIA isn’t saying much new about the contents of the keynote itself, it’s still expected to be one of the company’s most important presentations in years. In particular, this year’s keynote is widely anticipated to include the announcement of a next-generation compute GPU architecture.

NVIDIA's current Volta architecture-based GV100 GPU is now a few years old, and supercomputer planning announcements have already tipped that NVIDIA will have a new Tesla accelerator ready later this year. The current generation of Tesla accelerators has been a huge success story for NVIDIA, so there's a great deal of interest in seeing how NVIDIA will keep up that momentum, especially in the face of stiff competition from all directions, from FPGA suppliers to Intel's Xe GPU family.

Source: NVIDIA

Comments

  • Slash3 - Saturday, April 25, 2020 - link

    If that was a 2080 Super (and not a 2070 Super) for $530, you made out like a highwayman on that one.
  • drexnx - Saturday, April 25, 2020 - link

    Might explain why I was able to snag a Gigabyte Gaming OC 2080 Ti for $900 as well.
  • drexnx - Saturday, April 25, 2020 - link

    (with $50 MIR to come later as well)
  • eva02langley - Sunday, April 26, 2020 - link

    Still a scam for what it is. It is about 30-35% faster than a 5700 XT while costing almost three times as much. I got my MSI 5700 XT Gaming X for $365.
  • drexnx - Sunday, April 26, 2020 - link

    A 5700 XT would have been almost a sidegrade from my 1080, and I have a 1440p 144 Hz G-Sync monitor anyway. Kinda locked into team green until 4K144 is possible/standard

    plus "iT hAs RaYtRaCinG!"
  • eva02langley - Sunday, April 26, 2020 - link

    It was a good deal; it's barely $10-20 more expensive than a 2070 Super's MSRP.
  • dwade123 - Saturday, April 25, 2020 - link

    RIP AMD. Have fun with console peasants while Nvidia gobbles up 99% of PC gamers.
  • eva02langley - Sunday, April 26, 2020 - link

    Consoles are going to push 2160p, not the halo segment of the PC dGPU. I am way more interested in a PS5 than any new product from Nvidia. Once again, consoles are innovating to the point that the PC industry is playing catch up.
  • BenSkywalker - Sunday, April 26, 2020 - link

    You have to be in awe of RDNA: with a full node advantage and an extra year of development time they somehow managed to not obliterate the 2080 Ti in performance, they didn't even match it, but they went really above and beyond by not matching the 2080 Super and in fact struggle to compete with the 2070 Super while barely winning on power usage.....

    The mind boggles, obviously a team of actual engineers couldn't fail that hard, they must have been trying to set a new standard for shockingly poor product management to fool nVidia into taking a nap.

    Anything less than 60% faster than a 2080 Ti in traditional raster and 300% faster in ray tracing will be a humiliating failure for RDNA2.
  • Vesperan - Monday, April 27, 2020 - link

    Different tier cards. You may as well reframe your comment to say that Nvidia failed because the 2070 didn't manage to obliterate the 2080 Ti. Of course it didn't, it was never meant to.

    Don't take that as me saying Navi is the best, most amazing card ever made and that AMD's tech is better than Nvidia's. It's simply not: for one thing, AMD's power consumption is way too high given their process node advantage. But I don't criticise a mid-weight Navi for not obliterating a heavyweight Turing.

    For my 10 cents, I'm hoping Ampere is fantastic in the midrange, driving down AMD prices so I can get a cheaper RX 5700 (FreeSync monitor).
