As the various board partners finally begin unveiling their custom GTX 1000 series designs, GIGABYTE has arrived with their latest entry in the Mini ITX game. GIGABYTE’s GeForce GTX 1070 Mini ITX OC is a compact 17cm (7in) in length, allowing it to fit into Mini ITX gaming builds. The cooler on this card houses a 90mm fan with custom blades, which GIGABYTE claims will enhance airflow. In the heatsink itself we find three heat pipes that make direct contact with the GPU core. The aim of this cooler is of course to bring higher performance at lower temperatures, though from my own experience, alongside good performance, a less audible GPU fan behind the television is greatly appreciated as well.

The board itself is built from higher quality components in an effort to increase lifespan while improving performance. To go with the cooler and component choices, the card uses a 5+1 phase power delivery system, compared to the 4+1 phase design of the GTX 1070 reference PCB.

GTX 1070 Specification Comparison

| | GIGABYTE GeForce GTX 1070 Mini ITX OC (OC Mode) | GIGABYTE GeForce GTX 1070 Mini ITX OC (Gaming Mode) | GTX 1070 Founders Edition |
|---|---|---|---|
| Core Clock | 1556MHz | 1531MHz | 1506MHz |
| Boost Clock | 1746MHz | 1721MHz | 1683MHz |
| Memory Clock | 8Gbps GDDR5 | 8Gbps GDDR5 | 8Gbps GDDR5 |
| VRAM | 8GB | 8GB | 8GB |
| TDP | 150W? | 150W? | 150W |
| Launch Date | ? | ? | 6/10/2016 |
| Launch Price | ? | ? | $449 |

On the topic of overclocking, this card comes with two performance profiles: Gaming Mode and OC Mode. Gaming Mode is the default profile, while the slightly higher clocked OC Mode is enabled through GIGABYTE's Xtreme Engine utility. Both options offer a mild factory overclock over the Founders Edition card.
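
For a rough sense of how mild that overclock is, here is a quick back-of-the-envelope calculation (a small Python sketch, using only the clock speeds from the specification table above):

```python
# Compare GIGABYTE's two factory profiles against the Founders Edition,
# using the clock speeds (in MHz) from the specification table above.
founders = {"core": 1506, "boost": 1683}
profiles = {
    "Gaming Mode": {"core": 1531, "boost": 1721},
    "OC Mode":     {"core": 1556, "boost": 1746},
}

for name, clocks in profiles.items():
    for kind in ("core", "boost"):
        pct = (clocks[kind] / founders[kind] - 1) * 100
        print(f"{name} {kind} clock: +{pct:.1f}% over Founders Edition")

# Gaming Mode works out to roughly +1.7% core / +2.3% boost,
# and OC Mode to roughly +3.3% core / +3.7% boost.
```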

As more Mini ITX cards come along, it will be interesting to see how powerful compact gaming rigs become. With cards such as the AMD Radeon R9 Nano last generation, enthusiasts who wanted to save on space gained a lot of flexibility while maintaining big performance. This generation should improve performance in mITX systems by a good amount, and I don't expect this is the last mITX GTX 1070 we're going to see.

Source: GIGABYTE

Comments
  • JoeyJoJo123 - Tuesday, July 5, 2016 - link

    Neat! *Snaps picture*
  • Chaitanya - Tuesday, July 5, 2016 - link

    availability and price are crap.
  • Morawka - Tuesday, July 5, 2016 - link

    same goes for any 16nm/14nm GPU. RX480's are even selling for +$150 over MSRP right now
  • tuxfool - Tuesday, July 5, 2016 - link

I don't know where you live but those RX480s definitely aren't over $150 over MSRP where I checked.
  • bj_murphy - Tuesday, July 5, 2016 - link

    In Canada they are at some vendors. Probably not hitting $150 price premiums in US dollars though I'd imagine.
  • ToTTenTranz - Tuesday, July 5, 2016 - link

    I don't know if this is going to eat a lot into Nano's niche market or not, but this card is quite a bit wider than the PCIe shield's width, whereas the Nano is almost a perfect fit.
    There are probably more than a couple of mITX cases where the Nano fits and this doesn't.

    OTOH, Fiji's hardware video codecs and output options are getting a bit terrible for future-proof HTPCs, so..
  • ImSpartacus - Tuesday, July 5, 2016 - link

    That's an interesting perspective.

    I hadn't thought of how this would compare to a nano.
  • Death666Angel - Tuesday, July 5, 2016 - link

    Had to choose a video card for my HTPC, because I bought a new 4k TV. Ended up with an ITX sized Zotac GTX960 for my 20cm case. I needed/wanted native HDMI 2.0 without the need for adapters, so anything AMD was out of the question unfortunately. I'll see what I end up with next upgrade cycle (I haven't started a single game on the HTPC though, so maybe the GTX 960 stays a while... until room scale VR becomes a thing)....
  • 8steve8 - Tuesday, July 5, 2016 - link

    would prefer more than 1 DP port... no need for 2x DVI.
  • nevcairiel - Tuesday, July 5, 2016 - link

    Port configurations are sometimes so weird. Which GPU still has 2 DVI these days?

    There is really no reason to not offer the same array of outputs the "big" GPUs have, 3xDP, 1x HDMI, 1x DVI, or 2xDP/2xHDMI/1xDVI, whichever combination you prefer more.
