Advanced displays clearly drive sales of higher-end GPUs, so NVIDIA has multiple initiatives to make high-end gaming monitors available. One such initiative is the BFGD (big format gaming display) project, designed to bring ultra-large gaming LCDs to market. BFGD is currently supported by several display suppliers, but at present only HP's 64.5-inch Omen X Emperium is available. This is set to change later this year, as ASUS is prepping to launch its own BFGD.

The ASUS ROG Swift PG65UQ uses a 64.5-inch 8-bit 4K Ultra-HD AMVA panel featuring 750-1000 nits brightness (typical/HDR), a 3200:1-4000:1 contrast ratio (minimum/typical), 178° viewing angles, a 120 - 144 Hz refresh rate (normal/overclocked), and a 4 ms GtG response time with overdrive enabled. Since the monitor belongs to the G-Sync HDR range, it has a 384-zone full-array backlight enhanced with quantum dots to ensure high contrast as well as precise reproduction of 95% of the DCI-P3 color space.

While the ROG Swift PG65UQ uses the same panel as HP's Omen X Emperium, the monitor is configured differently. Unlike HP, ASUS decided not to equip its 64.5-inch gaming display with a high-end soundbar or an integrated NVIDIA Shield set-top box (STB). Meanwhile, the display still has a USB hub and an IR receiver.

The decision not to offer a soundbar and an STB is logical, as hardcore gamers (especially among the ROG clientele) tend to use speakers of their own choosing. Besides, without the soundbar and the STB the monitor costs less to build, potentially enabling ASUS to sell it at a lower price point than HP.

So far, ASUS has not announced a firm launch date or MSRP for the ROG Swift PG65UQ, but NVIDIA is already demonstrating the monitor, so the unit appears to be more or less ready.


29 Comments


  • boeush - Friday, May 31, 2019 - link

    Ever tried using a TV as a computer monitor? The lack of contrast on text tends to make it nigh-unusable. In general, TVs play fast and loose with the image they're trying to display; they are designed to produce a visually 'equivalent' image at a large viewing distance.

    Monitors aren't allowed to mess with the picture content - so no lossy compression, no hinky games with local edge or contrast enhancement to "improve picture quality", etc. Monitors would have DisplayPort and other PC-oriented I/O ports, whereas a TV won't have any such things.

    TVs will probably also have worse GtG response times and generally worse ghosting characteristics, because they're geared toward displaying movie content, where frame-to-frame differences tend to be small and gradual - which isn't the case with the type of content that computers tend to generate.
    Reply
  • GrandTheftAutoTune - Friday, October 4, 2019 - link

    You know nothing about modern TVs, it seems. Modern TVs have a Game Mode and a PC Mode that disable all the nonsense that screws up text sharpness and accuracy. Heck, most of that is caused by post-processing effects and sharpness settings that any intelligent person who wishes to calibrate their display to studio accuracy disables anyway.

    There is zero compression used apart from chroma subsampling with HDR - which all displays, including HDR monitors, use. If you actually look at the raw GtG measurements of both the best TVs and the best monitors, you would realise they're both near the edge of what's possible with LED/LCD. Don't ever trust manufacturers' quoted response times; they all measure GtG differently. Using RTINGS measurements, you will realise that you can achieve low blur with a high-end Samsung TV, for example. You can actually achieve much less blur than a monitor with BFI (Black Frame Insertion), which is much better than overdrive. You can also disable and/or reduce local dimming if you don't like it.

    On a properly set up Samsung, colour calibrated with the nonsense off, it looks as clear and as accurate as my phone screen for web browsing. Movies and video games look way better due to higher brightness, higher contrast and lower levels of blur. On a Sony OLED, colour calibrated with the nonsense turned off in Game Mode and with BFI enabled in HDR, it was the best visual experience I've had. It easily destroys any monitor in every category: no issues with sharpness or dimming, plasma levels of motion clarity, studio-level colour accuracy, insane pitch-black-room contrast for horror games, high brightness levels, superb UHD clarity, no banding in 10-bit colour. It looked perfect.

    Next time educate yourself before you comment nonsense.
    Reply
  • colecodez - Sunday, June 2, 2019 - link

    Lots of reasons why TVs are not even close to a good monitor:
    - No Displayport
    - Max refresh rate: 120Hz
    - Input lag 4k@120Hz: 18.4ms
    - 4k Freesync maxes out at 60Hz
    - 4k@120Hz skips frames in game mode. Without game mode we're talking 57ms input lag.
    - TVs are optimized for TV formats, PCs are not.
    - Ads

    TL;DR: If you don't see input lag advertised on the box, pass.

    https://www.rtings.com/tv/reviews/samsung/q90-q90r...
    Reply
  • NunayaBiz - Thursday, September 19, 2019 - link

    The best TVs are extremely close to a good monitor, and at a fraction of the price
    - No DisplayPort - yes, that's why they said to wait until GPUs get HDMI 2.1
    - Max refresh rate: 120Hz is NOT that different from 144Hz. This is mostly a non-issue. AND the ROG is also 120Hz; it can only reach 144Hz with an overclock.
    - Input lag 4k@120Hz: 18.4ms - after an update, the input lag is now 7.1ms
    - 4k Freesync maxes out at 60Hz - Yes, this will be one thing you'll be giving up
    - 4k@120Hz skips frames in game mode - This is fixed.
    - TVs are optimized for TV formats - Explain why this is an issue?
    - Ads - Def. a problem with TV use, but idk if they are visible during use as a monitor

    If you don't see input lag advertised on the box - they NEVER put input lag on the box. The advertised 1ms or 2ms on the box refers to the GtG response time, NOT the input lag. Even good gaming monitors have ~4-5ms of input lag, basically indistinguishable from the 7.1ms of input lag seen on the Q90.

    OR you could go with an LG C9 OLED, with sub-7ms input lag times (it hasn't been tested at 4k yet, as there aren't GPUs with HDMI 2.1)

    TL;DR: The ROG Swift gets you a 22Hz faster refresh rate after overclocking and variable refresh that goes up to 120Hz - and for that you're paying ~$5,000-$6,599 vs $2,000.
    Up to you if the expanded variable refresh range and the 22Hz faster refresh rate are worth $3,000-$4,599.
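    To put that 22Hz figure in perspective, here is a quick back-of-the-envelope frame-time calculation (my own sketch, not from the review; it follows directly from frame time = 1/refresh rate):

```python
# Frame time in milliseconds for a given refresh rate in Hz.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

ft_120 = frame_time_ms(120)  # ~8.33 ms per frame
ft_144 = frame_time_ms(144)  # ~6.94 ms per frame
print(f"120Hz: {ft_120:.2f} ms/frame")
print(f"144Hz: {ft_144:.2f} ms/frame")
print(f"Difference: {ft_120 - ft_144:.2f} ms")  # ~1.39 ms
```

    In other words, the overclocked 144Hz mode shaves roughly 1.4ms off each frame compared to 120Hz - a gap smaller than the input lag differences being argued about here.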
    Reply
  • Metalingus - Friday, October 4, 2019 - link

    1. DisplayPort isn't needed; HDMI 2.1 is good enough and will be for many years.
    2. The perceptual difference between 120Hz and 144Hz is minute. Even if you really care, many Samsung TVs can be overclocked to 144Hz.
    3. Actually, new firmware updates for TVs have reduced input lag further. Some are as low as 7.1ms at 120Hz, which is a minuscule 2ms slower than the best PC monitor for input lag right now. I forget the name, but it's a 240Hz monitor with a measured input lag of 5ms. You won't feel a 2ms difference.
    4. That's true; I would imagine next year's TVs will support VRR up to 120Hz.
    5. That's irrelevant. The only use for 120Hz is video games, and you're supposed to play in Game Mode. Self-explanatory, and it's a non-issue.
    6. TVs by standard are optimised for 3 formats: Rec.709/SDR colour, DCI-P3/SDR/HDR and Rec.2020 HDR. The first is almost identical to web colour standards, so your point is moot with regards to internet content. The second is for movies, and the last is for movies and video games. PC games target Rec.2020 HDR too, so your uneducated point is false. Plus, EVEN for niche standards that many colour artists use, you CAN tune the TV for them. You can buy a colorimeter and calibrate to whatever standard you like and save it as a preset.
    7. Relevant point, but not all TVs have ads. Plus you only ever see them when accessing the Smart Menus, so it's a moot point for PC use.
    8. No monitor companies state the input lag of their products either. Reviewers have to use a Leo Bodnar device to measure it and post the results. Another moot point.
    Reply
  • Guspaz - Friday, May 31, 2019 - link

    Which specs vastly exceed any TV on the market? Compared to a standard LG OLED, the peak brightness is slightly higher, and the "overclocked" framerate of 144Hz is a bit higher than the 120Hz on a TV, but this display has a worse contrast ratio, worse pixel response time, worse coverage of the DCI-P3 colour space... The only metric where this display beats out conventional OLED TVs by a significant margin is in supporting G-Sync HDR, except game consoles don't support G-Sync, they support FreeSync... which the LG OLEDs do support.
    Reply
  • jeremyshaw - Friday, May 31, 2019 - link

    The LG OLEDs support HDMI 2.1 VRR, not Freesync.

    This is important, since no Freesync-only source seems to be able to activate Freesync over HDMI on them. The Xbox One S/X implemented a more standards-correct HDMI 2.1 VRR setup (compatible with both Freesync and HDMI VRR), but AMD's unofficial HDMI modification for Freesync over HDMI doesn't work.

    Note: neither the Xbox One S/X nor the PS4 uses AMD's own HDMI implementation. Sony uses its own logic, fabricated at Panasonic. MSFT uses a TI DP-to-HDMI converter.

    Either way, they have more input lag (tested) than my U2711.
    Reply
  • jeremyshaw - Friday, May 31, 2019 - link

    Whoops, I meant to remove my last line before posting. Later retests showed that the high input lag was a fault in the input lag test, not actual high input lag.
    Reply
  • colecodez - Sunday, June 2, 2019 - link

    No, the input lag is way higher than monitors' in all tests I looked at. Game mode cuts it to 18ms but drops frames at 4k.
    Reply
  • Metalingus - Friday, October 4, 2019 - link

    Well, that's changed. Look at the results now with the latest firmware; input lag is below 10ms.
    Reply
