Today ASUS announced a new monitor targeted at the gaming market. The ASUS MG278Q is a 27" TN panel with a resolution of 2560x1440 and a 144Hz refresh rate. In the chart below you can see further information about the monitor and its specifications.

ASUS MG278Q
Resolution: 2560x1440
Refresh Rate: 144Hz
Panel Size: 27"
Peak Brightness: 350 nits
Response Time: 1ms (GtG)
Viewing Angle (H/V): 170° / 160°
Inputs / Outputs: 1 x DisplayPort 1.2
                  1 x Dual-link DVI
                  1 x HDMI 2.0
                  1 x HDMI 1.4
                  1 x 3.5mm audio
                  3 x USB 3.0 (1 upstream, 2 downstream)
Color Depth: 16.7 million colors (likely 6-bit + AFRC)
Dimensions: 625 x 563 x 233mm (with stand)

As a gaming-oriented display, the MG278Q prioritizes low response time and a high refresh rate over color accuracy. Since it's a TN panel, it's likely that the panel has a native 6-bit color depth per subpixel and uses temporal dithering to emulate 16.7 million colors, although this has not been confirmed. In addition to the 144Hz refresh rate, the MG278Q supports AMD's FreeSync technology, which uses the Adaptive-Sync feature of DisplayPort 1.2a to enable a variable refresh rate synchronized to the GPU's rendering of frames. More information about FreeSync and how it works can be found here.
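
To give a rough sense of what a variable refresh rate buys you, here is a conceptual sketch (not ASUS's or AMD's actual implementation; the function names and the assumed 40-144Hz FreeSync range are illustrative) comparing when a finished frame can appear on a fixed 144Hz display with vsync versus an Adaptive-Sync display that refreshes as soon as the frame is ready:

    # Conceptual sketch only: compares frame presentation timing on a fixed
    # 144Hz refresh (with vsync) against Adaptive-Sync/FreeSync. Function
    # names and the 40-144Hz range are illustrative assumptions, not a real API.
    import math

    def fixed_refresh_present_ms(render_ms, refresh_hz=144.0):
        """With a fixed refresh rate and vsync, a finished frame waits for the
        next scanout boundary, so a late frame slips a full refresh interval."""
        interval = 1000.0 / refresh_hz              # ~6.94 ms at 144Hz
        return math.ceil(render_ms / interval) * interval

    def adaptive_sync_present_ms(render_ms, min_hz=40.0, max_hz=144.0):
        """With Adaptive-Sync, the display refreshes when the frame is ready,
        clamped to the panel's supported refresh range."""
        return min(max(render_ms, 1000.0 / max_hz), 1000.0 / min_hz)

    for render_ms in (5.0, 8.0, 12.0, 20.0):
        print(f"render {render_ms:4.1f} ms -> "
              f"fixed: {fixed_refresh_present_ms(render_ms):5.2f} ms, "
              f"adaptive: {adaptive_sync_present_ms(render_ms):5.2f} ms")

In this toy model, a frame that takes 8ms to render must wait until roughly 13.9ms on a fixed 144Hz display, while an Adaptive-Sync display can show it at 8ms, which is the source of the smoother frame pacing these technologies advertise.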

ASUS has yet to announce pricing for the MG278Q, but we've seen TN displays with similar specifications from Acer and BenQ for $500-600. The MG278Q will be available in North America in early September.

Source: ASUS

Comments

  • T1beriu - Wednesday, August 19, 2015 - link

    nVidia has the liberty to adopt FreeSync anytime they please. It's a royalty-free standard. Nothing's stopping them. But they won't.
  • Guspaz - Wednesday, August 19, 2015 - link

    They could adopt it, sure, but each implementation (G-Sync versus FreeSync) does things that the other does not. If nVidia simply adopted FreeSync, it would be worse in some ways, and if AMD adopted G-Sync, the same is true.
  • foxalopex - Wednesday, August 19, 2015 - link

    Umm, I think you're missing a REALLY important point here. Nvidia's G-Sync is a proprietary interface. Nvidia is NOT going to let anyone adopt it without paying a royalty at the very least, assuming they let you adopt it in the first place. AMD's FreeSync has been adopted into the VESA standard, which is free for anyone to use. You could literally create your own FreeSync monitor by reading the specification and building the circuits yourself. You cannot with G-Sync: you MUST use the scaler chip that Nvidia provides.
  • medi03 - Wednesday, August 19, 2015 - link

    Not really.
    FreeSync is superior: no performance hit, easier to implement, no port restrictions, and it can even work over HDMI.
  • Nenad - Thursday, August 27, 2015 - link

    That is not true.
    1) FreeSync is not superior; it has minor pros and cons compared to G-Sync.
    2) No performance difference: the "performance hit" is around 2% according to AMD, or nonexistent according to AnandTech.
    3) FreeSync DOES have a port restriction: it too works only over DisplayPort (it is the "DisplayPort Adaptive-Sync" standard, after all). The monitor itself can support other ports, but not for FreeSync.

    BTW, the only 'noticeable' advantage of FreeSync is that it is cheaper, while the only 'noticeable' advantage of G-Sync is that it supports ULMB blur reduction.
    And, while I have a G-Sync monitor, I'm really glad that FreeSync monitors are now available too, since it will put pressure on Nvidia to reduce the price of their G-Sync adapter.
  • ppi - Wednesday, August 19, 2015 - link

    nVidia sure can implement it ... with Pascal. If they want, that is. Maxwell apparently does not have the right HW to support the necessary VESA standard.

    Unlike AMD's partners, however, nVidia clearly caught the ghosting issue and implemented a solution in its scaler chip. I suppose for FreeSync monitors we will have to wait for next-gen scaler chips that resolve the ghosting issue, and then for an AMD driver solution for low-fps scenarios.
  • anubis44 - Monday, August 24, 2015 - link

    Nothing's stopping nVidia from enabling support in their drivers but Jen-Hsun Huang's ego. That, unfortunately, is a very large obstacle. Judging by the first DX12 benchmarks and information, however, there is no reason to go nVidia at this point:

    http://arstechnica.com/gaming/2015/08/directx-12-t...
  • tamalero - Friday, August 28, 2015 - link

    Except they hardly ever do; they want to shove their proprietary tech in your face, à la Apple.
    CUDA is a fine example.
  • Murloc - Wednesday, August 19, 2015 - link

    AMD is already following the VESA standard. It's nvidia that's not doing so.
  • Guspaz - Wednesday, August 19, 2015 - link

    G-Sync predates the VESA standard, and the VESA standard is in some respects not as capable. An updated standard that included some of the better aspects of G-Sync (such as actual hardware changes to the scaler) would be the best of both worlds.
