Long ago, in a land far, far away…actually, just a few years ago, probably in your own home or office, you could find yourself sitting in front of a tiny 14" monitor that paled in comparison to even your smallest 20" TV. Unfortunately, the prices of computer monitors at the time kept most 17" and larger displays out of the hands of the mainstream consumer, although most enthusiasts had dreams of those $600 17" wonders.

Times do change, and owning a 17" monitor isn’t that big of a deal anymore. Monitor prices have dropped tremendously, and what used to cost anywhere between $600 and $900 can now be had for under $400. If quality isn’t your top concern, a 17" monitor can become your new best friend for a little over $200; it may not be the best looking friend you’ve ever had, but it’ll get the job done.

However, back when 14" monitors were the most affordable displays out there, a few companies decided to take advantage of something that has quietly become a household necessity: the television. Companies like ATI began offering TV output ports directly on their video cards, which meant that, at a sacrifice in visual quality, you could make use of a much larger screen for your computer. Since those days, the quality of TV output on video cards has improved dramatically; more importantly, including the feature no longer drives the price of a card to unreasonable levels.

In the past, the only video cards that featured TV output were ones most hardcore gamers or knowledgeable enthusiasts wouldn’t dream of touching. Now, virtually every video card produced since the 3dfx Banshee supports TV output, and at a quality that was once only available on dedicated $300+ TV output cards. But what’s the catch? You’ve been "in" the industry long enough to know that no dream technology comes without one, and there’s a definite reason users haven’t dropped their 17" CRT monitors in favor of 20" TVs.

The Tradeoffs

The first thing to keep in mind is that a computer monitor is capable of displaying much higher resolutions than a standard NTSC or PAL TV. The NTSC standard calls for 525 scan lines, but of those 525 only 483 are active. PAL, the European counterpart to the US standard, calls for a slightly higher resolution (625 scan lines, 576 of them active) at a slightly lower refresh rate (50Hz versus NTSC’s 60Hz). What this boils down to is that your standard TV set, whether NTSC or PAL compliant, can barely display the lowest resolution your desktop probably supports, 640 x 480.
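To put those numbers in perspective, here is a quick back-of-the-envelope check (a hypothetical sketch for illustration only, not anything a driver actually runs) comparing a few common desktop resolutions against the active scan lines of each standard; the NTSC figures come from the text above, while the PAL figures (625 total lines, 576 active, 50Hz) are the standard values.

```python
# Compare common desktop resolutions against the active scan lines of each TV standard.
# NTSC: 525 total scan lines, 483 active; PAL: 625 total, 576 active.
TV_STANDARDS = {
    "NTSC": {"active_lines": 483, "refresh_hz": 60},
    "PAL":  {"active_lines": 576, "refresh_hz": 50},
}

DESKTOP_MODES = [(640, 480), (800, 600), (1024, 768)]

for name, spec in TV_STANDARDS.items():
    for width, height in DESKTOP_MODES:
        verdict = "fits" if height <= spec["active_lines"] else "exceeds"
        print(f"{name}: {width}x{height} {verdict} the "
              f"{spec['active_lines']} active lines available")
```

Only 640 x 480 squeezes under either standard's active line count; 800 x 600 and above simply have more vertical pixels than the TV has lines to draw them on.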

The resolution difference between TVs and computer monitors is the source of quite a few limitations placed on TV output modes. It used to be the case that enabling TV output forced your desktop into a 640 x 480 resolution at a fixed refresh rate, usually around 60Hz or lower; this was the most straightforward way of taking what you see on your desktop and displaying it on a TV. This brings you to the first tradeoff you must make when enabling TV output: desktop resolution/refresh rate degradation. Most manufacturers have taken steps to make this tradeoff less noticeable, but their methods of doing so can differ greatly, so this is one of the first things you’ll want to consider.
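As a rough illustration of that older behavior (purely a hypothetical sketch, not how any particular vendor's driver is written), enabling TV output amounted to ignoring the mode the desktop asked for and clamping everything to the one TV-safe mode:

```python
# Hypothetical sketch: early TV output simply forced a TV-safe mode
# (640x480 at ~60Hz) whenever the TV was enabled, regardless of what
# resolution and refresh rate the desktop requested.
TV_SAFE_MODE = (640, 480, 60)  # width, height, refresh rate (Hz)

def effective_mode(requested_mode, tv_out_enabled):
    """Return the display mode actually driven to the screen."""
    if tv_out_enabled:
        return TV_SAFE_MODE    # desktop resolution/refresh rate is sacrificed
    return requested_mode      # monitor only: the request is honored as-is

print(effective_mode((1024, 768, 85), tv_out_enabled=True))   # -> (640, 480, 60)
print(effective_mode((1024, 768, 85), tv_out_enabled=False))  # -> (1024, 768, 85)
```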
