A couple of weeks ago, Aldi had a 42" LCD for AUD$1500. It's quite a good deal: the image quality is good compared to other sets in that price range. Sure, it only has an analogue tuner, with lots of ghosting (on a signal that our older LG CRT displayed perfectly), but who cares? For $1.5k, you get a screen that's perfect for hooking up to a media centre machine, tuner be damned.
Unfortunately, it's not all brightness and sunshine. One of the key reasons for putting HDMI on a screen is to feed video information to the screen accurately: I want my computer to be able to turn the top-left pixel white. Now, this screen has a native resolution of 1366x768. No big deal; using Powerstrip, you can easily configure your video card to output these odd resolutions. When connected via VGA, there is no problem (apart from the quantisation noise introduced by converting from digital to analogue and back again), but heaven forbid you try to connect your computer via the HDMI port. When fed a 1366x768 signal via DVI/HDMI, the screen sees fit to scale the image down, so you get a tiny compressed image in the middle.
It gets better though: if you feed it the somewhat more standard signal of 1280x720 via the HDMI port, it scales it up . . . up past the edges of the screen. So now you have a 1366x768 panel displaying around 680 lines of useful information. Why don't the manufacturers of these sets understand that there is absolutely no need to perpetuate that legacy of analogue CRT sets, the nightmare of overscan?
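A rough back-of-the-envelope sketch of that loss. The 5.5% overscan figure here is my assumption, picked because it matches the roughly 680 lines I'm seeing; real sets typically crop anywhere from about 2% to 7% per axis:

```python
def visible_lines(source_lines: int, overscan_fraction: float) -> int:
    """Lines of the source image still on screen after the set scales
    the picture up so that overscan_fraction of it (top and bottom
    combined) is pushed past the edges of the panel."""
    return round(source_lines * (1.0 - overscan_fraction))

# 720-line source, assumed 5.5% total vertical overscan:
print(visible_lines(720, 0.055))  # -> 680
```

Note that the other ~88 lines of the 768-line panel aren't wasted so much as filled with interpolated, resampled versions of the source lines, which is exactly why the picture looks soft.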
Overscan was introduced early on to combat distortions in the picture that were particularly evident in early sets. In particular, there were nonlinearities on the edges, where each line would not start directly below the one above it, and blooming, where the whole picture would stretch outwards as the brightness increased.
Now, with a pure digital signal, there is no need for overscan. The data stream contains information for each and every pixel in the final image. If the resolution of the signal matches the physical resolution of the screen, you will have a 1:1 pixel mapping. An obvious example of this is an LCD monitor connected via DVI, or a laptop LCD screen. When these screens are displaying a signal at the native resolution of the panel, the computer has individual control over each pixel on the screen.
You can test this on your own screen using the excellent (free) tools at http://tft.vanity.dk/. If you are running it within a browser, rather than standalone, make sure you put your browser in fullscreen mode before continuing. Launch the utility and bring the mouse to the top of the window. A popup menu will appear, from which you can select the "1:1 pixel mapping" test. If your screen is really giving you a 1:1 pixel mapping, you will see an even tone across the whole screen (actually, a checkerboard if you look closely). If not, you will see scaling artifacts, some examples of which can be found at http://pixelmapping.wikispaces.com/Pixel+mapping+explained.
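If you'd rather roll your own test image than use the tool, the same idea is easy to sketch: a single-pixel black/white checkerboard, saved at the panel's native resolution (1366x768 for this set). Viewed with true 1:1 mapping it reads as a uniform grey; any scaling at all turns it into obvious moiré and banding. This is just my minimal sketch of that pattern, written out as a binary PGM, not the vanity.dk implementation:

```python
WIDTH, HEIGHT = 1366, 768  # native resolution of the panel under test

def checkerboard_row(y: int, width: int) -> bytes:
    # Alternate black (0) and white (255), shifted by one pixel each row
    # so that adjacent rows are out of phase (a true checkerboard).
    return bytes(255 if (x + y) % 2 else 0 for x in range(width))

with open("checkerboard.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))  # binary greyscale PGM header
    for y in range(HEIGHT):
        f.write(checkerboard_row(y, WIDTH))
```

Open the resulting file in any viewer that can show it fullscreen without scaling; if the set resamples the signal, the single-pixel alternation is the first thing to smear.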
So, how do you fix this? Well, Samsung have released updated firmware for one of my workmates' TVs (a 40" 1080P set), which provides this functionality that should have been there in the first place. Another workmate has an LG rear-projection 720P set, and all he has won for his efforts is the loss of his set for weeks on end, while technicians looked for faults that are fundamental to the design of the set. As for me, I'm going to give support another call tomorrow, and try to get through to someone who understands the problem, rather than blaming the computer.