I totally agree with you.
We have 9 PCs in our home and almost all of them have both AMD cards and flat-panel displays connected with HDMI cables. Like you, every single time I update the video driver I have to walk around and correct this underscan glitch.
It doesn't happen on every PC, though, and I've discovered it seems to be the monitor's EDID that AMD cards are incorrectly interpreting. For example, the PC I'm sitting at right now has an I-Inc monitor, and this system defaults to 0% underscan, so I can just update the driver and that's it.
Two PCs have HP w2338h monitors. The monitors must have been manufactured at different times, because one has the problem while the other does not. I compared their EDIDs and found that they are different, even though the two monitors have identical part numbers and look identical.
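For anyone curious how two "identical" monitors can behave differently: the EDID can carry a CEA-861 extension block that advertises TV-style capabilities, including an "underscan supported" bit, and a driver deciding "this is a TV" may key off fields like these. AMD's actual heuristics aren't public, so this is just a sketch of where those bits live in the EDID bytes (tested against a synthetic EDID, not real monitor data):

```python
# Sketch: inspect the EDID fields a driver *might* use to decide a
# display is a TV (and therefore underscan). AMD's real logic is not
# public; this only shows where the relevant bits live.

CEA_EXT_TAG = 0x02  # tag byte identifying a CEA-861 extension block

def checksum_ok(block: bytes) -> bool:
    """Each 128-byte EDID block must sum to 0 modulo 256."""
    return len(block) == 128 and sum(block) % 256 == 0

def parse_edid(edid: bytes) -> dict:
    base = edid[:128]
    info = {
        # All EDIDs start with this fixed 8-byte header.
        "valid_header": base[:8] == bytes([0x00, 0xFF, 0xFF, 0xFF,
                                           0xFF, 0xFF, 0xFF, 0x00]),
        "extension_blocks": base[126],   # byte 126 = extension count
        "has_cea_extension": False,
        "underscan_supported": False,
    }
    # Extension blocks are 128 bytes each, following the base block.
    for i in range(info["extension_blocks"]):
        ext = edid[128 * (i + 1): 128 * (i + 2)]
        if ext and ext[0] == CEA_EXT_TAG:
            info["has_cea_extension"] = True
            # CEA-861 block, byte 3, bit 7: display supports underscan.
            info["underscan_supported"] = bool(ext[3] & 0x80)
    return info
```

Dumping and diffing the raw EDIDs of two same-model monitors with a tool like this is how you'd confirm a mid-production EDID revision like the one I saw on the two w2338h units.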
In another case the PC is hooked up to an Olevia HDTV, and it too always underscans. This is where the glitch really annoys me. The only reason AMD would ever need to underscan is when the card is hooked up to a TV that overscans heavily. The TV in this case is smart enough to detect that a PC is driving it, so it turns off its own overscan, yet the AMD video card doesn't get it and underscans anyway.
Another PC is hooked up to a Toshiba HDTV, and it too always underscans. This HDTV defaulted to about 8% overscan when new, and ATI video cards underscan by default far more than that, so plug and play it is not. I disabled the HDTV's overscan when we bought it a few years ago, so now every time I install a new video driver I have to once again open the Catalyst Control Center and manually turn off the underscan.
If anyone at AMD is reading this, please change the default to 0 underscan when HDMI is being used.
As anttimonty states above, 99% of PCs are hooked up to PC monitors, so there is no need to underscan. To make matters even sillier, most HDTVs are either smart enough to turn off overscan when hooked up to a PC, or the consumer disabled overscan when they first connected their PC, so even HDTVs do not require a default 15% underscan and would benefit from a 0% underscan/overscan default. At a minimum the underscan/overscan setting should stick between video driver updates, but 0% would make MUCH more sense as the default.
BTW, one workaround is to purchase fairly expensive HDMI to DVI cables and plug the HDMI end into the monitor/HDTV and the DVI end into the PC. When this is done the AMD video card interprets the display as a monitor and defaults to 0% underscan/overscan.
One similar and related glitch I fight with regularly is the color range output by AMD video cards. Most HDTVs expect a black-to-full-brightness range of 16-235, while monitors expect 0-255. Recorded TV content is typically in the 16-235 range, while photographs, video games, desktop apps, etc. use 0-255. This means that if the AMD video card detects the display as an HDTV, it compresses 0-255 content down to 16-235 and leaves TV content at 16-235. If the display actually is an HDTV calibrated for 16-235 this is ideal, but unfortunately most monitors are calibrated for 0-255, so you end up with washed-out colors and reduced contrast. Most people may not notice it, but the problem is there alongside the underscan problem.
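The two ranges are related by a simple linear scaling, which is roughly what the driver is doing when it compresses full-range content (the exact rounding AMD uses is my assumption):

```python
def full_to_limited(x: int) -> int:
    """Compress a full-range (0-255) value into limited range (16-235)."""
    return 16 + round(x * 219 / 255)

def limited_to_full(y: int) -> int:
    """Expand a limited-range (16-235) value back to full range (0-255),
    clamping the out-of-range 'blacker than black' and 'whiter than
    white' codes below 16 and above 235."""
    return min(255, max(0, round((y - 16) * 255 / 219)))
```

This is exactly why it looks washed out on a 0-255 monitor: black (0) comes out as code 16, a dark grey, and white (255) comes out as 235, a dimmer white, so the whole image loses contrast at both ends.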
AMD introduced a setting under "video" where you can tell the card to output video (only) in a 16-235 or 0-255 range, but this doesn't fully fix the problem because it doesn't apply to photographs, games, desktop applications, etc.
To make matters even more annoying, we have one older HDTV with a DVI port and no HDMI that AMD video cards incorrectly detect as a monitor (Nvidia cards correctly detect it as an HDTV). This means the video card outputs everything in a 0-255 range. Even Recorded TV (Media Center), which is recorded at 16-235, is by default expanded to 0-255 before being output. Since we also view photographs on this HDTV, we have to tweak the brightness and contrast in "My Digital Flat Panels" to get close to the correct levels, but this isn't an ideal solution.
What AMD needs to add is an output setting in "My Digital Flat Panels" that lets the user select between 0-255 and 16-235 output, and it needs to be available regardless of whether a monitor or an HDTV is "detected".
The "Video" setting for colorspace should really describe the video format input to the card, not the output. The output should be controlled in "My Digital Flat Panels". Video should have 3 options:
1. Video source is 0-255 (typical of some PC video sources such as game captures).
2. Video source is 16-235 (typical of Recorded TV and other TV captures).
3. Auto detect video source (if the video contains only values in the 16-235 range then treat it as 16-235; if it goes beyond this then interpret and process it as 0-255).
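Option 3 above could be as simple as scanning sample values from the video. This naive sketch is just my own guess at what "auto detect" would mean; a real driver would presumably sample many frames over time rather than a single pass:

```python
def detect_source_range(samples) -> str:
    """Classify pixel code values as limited (16-235) or full (0-255)
    range. If any value falls outside 16-235 the source must be full
    range; otherwise assume limited range, as option 3 describes."""
    if any(v < 16 or v > 235 for v in samples):
        return "full"     # treat and process as 0-255
    return "limited"      # treat and process as 16-235
```

The obvious weakness is a low-contrast full-range source that happens to stay within 16-235 would be misclassified, which is exactly why the manual options 1 and 2 still need to exist alongside auto detect.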
These issues have been around with ATI/AMD video cards since people started regularly hooking PCs up to TVs (more than 10 years ago). I keep hoping ATI will add a colorspace output setting in "My Digital Flat Panels", but they just don't seem to understand colorspace, overscan/underscan defaults, or what HDTVs vs monitors require.