I recently bought a new ATI HD 7770 and installed its drivers, which are all up to date now.
However, I have a problem when I set the resolution to 1080p, or even 720p. It shouldn't be a problem with the monitor, since its native resolution is 1080p, but when I set the overscan to 0%, the image is larger than the screen.
If I leave it at the default, I get black borders. So you may say I shouldn't set it to 0%. Okay, but then the screen isn't sharp, and I can't read text as clearly as when it's at 0%.
I'm using Windows 7 64-bit, 4 GB of RAM, an Intel Core 2 Quad Q9300, and CCC version 13.1, and I'm connecting the graphics card over an HDMI-to-HDMI cable to a 23" Full HD 1080p TV.
It's your television. It's treating the signal as overscanned and only showing part of it (there's a rough-numbers sketch of what that costs you at the end of this answer). You need to change its aspect ratio from 16:9 to the option that displays the entire source frame. What that option is called varies from one manufacturer to the next, and even among models from the same maker.
My 2008 Samsung, for example, calls it Just Scan. Here are some common names to look for, each used by at least one major manufacturer:
Just Scan
Screen Fit
Dot by Dot
Full Pixel
1:1
Some HD televisions simply don't have that option. If yours is one of those, you're out of luck, at least over HDMI. Some such limited models will automatically disable overscan for VGA connections.
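To put rough numbers on what the TV is doing, here's a quick sketch of the two bad options you're choosing between. The 5% crop is an assumption for illustration; real sets trim anywhere from roughly 2% to 8% per axis.

```python
# Overscan arithmetic; the 5% figure is an illustrative assumption.
SRC_W, SRC_H = 1920, 1080   # the frame the GPU sends over HDMI
OVERSCAN = 0.05             # fraction of each axis the TV crops

# Choice 1: underscan slider at 0%. The GPU sends a full-size frame,
# the TV crops it, and the edges of the desktop fall off screen.
visible_w = round(SRC_W * (1 - OVERSCAN))   # ~1824 px
visible_h = round(SRC_H * (1 - OVERSCAN))   # ~1026 px
print(f"Visible desktop: {visible_w}x{visible_h} of {SRC_W}x{SRC_H}")

# Choice 2: the driver underscans to compensate. It shrinks the
# 1920x1080 desktop into a ~1824x1026 box inside the signal, and the
# TV then stretches the cropped region back up to fill the panel.
# Two non-integer rescales are exactly why text looks soft.
down = visible_w / SRC_W    # ~0.950
up = SRC_W / visible_w      # ~1.053
print(f"Every desktop pixel is resampled by {down:.3f}, then by {up:.3f}")
```

A mode like Just Scan sidesteps both problems by mapping the 1920x1080 signal 1:1 onto the panel's 1920x1080 pixels, with no cropping and no rescaling.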
I've been checking my TV's aspect ratio settings, and the only options available are 4:3, 16:9, two Zoom modes, and Automatic.
I've also been checking other resolutions, and this happens at every resolution where I can change the under/overscan option (0%, or 10% at most), across the whole range from 640x480 up to 1920x1080; no matter which resolution it is, I get the same issue (an image larger than the screen).
Conversely, at the resolutions where I can't change the under/overscan option, the picture looks fine; this happens with only a small number of resolutions, one of them being 1280x800. (EDIT: It happens with all resolutions when GPU scaling is enabled; when it's disabled, it behaves just as I wrote before.) The same goes for the VGA connector: there I can't change the under/overscan option at all (the software disables it by default).
That's why I think the problem comes from the software's scaling option. So, is there any way to completely disable that auto-scaling?
If your television forces overscan, it won't work optimally regardless of your graphics card. That's a cold hard fact. If you were to provide the make and model of your TV, I might be able to check whether or not that's actually the case.
There's no difference in quality between VGA and HDMI, provided you successfully avoid overscan in the HDMI case. VGA adds a digital-to-analog conversion at the card and an analog-to-digital conversion at the TV, but unless you're using the worst VGA cable on the planet, the resulting digital pixel arrangement will be identical.
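As for killing the driver-side auto-scaling entirely: I don't know of a supported CCC switch beyond the per-resolution slider, but Catalyst stores its per-display settings in the registry under the video class keys. Below is a read-only sketch (Python, standard library) that lists any value whose name mentions scaling or underscan, so you can see what your driver version actually stores. The subkey layout is standard Windows; which value names appear, and what they mean, varies between Catalyst releases, so treat anything it prints as a lead to research, not a setting to change blindly.

```python
# List registry values under the display adapters whose names hint at
# scaling/underscan. Read-only: it changes nothing. The value names
# themselves are driver-version specific, hence the broad search.
import winreg

BASE = r"SYSTEM\CurrentControlSet\Control\Video"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as video:
    i = 0
    while True:
        try:
            guid = winreg.EnumKey(video, i)   # one GUID per adapter
        except OSError:
            break
        i += 1
        for sub in ("0000", "0001"):          # numbered device subkeys
            path = rf"{BASE}\{guid}\{sub}"
            try:
                key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
            except OSError:
                continue
            with key:
                j = 0
                while True:
                    try:
                        name, value, _ = winreg.EnumValue(key, j)
                    except OSError:
                        break
                    j += 1
                    if any(s in name.lower() for s in ("scal", "scan")):
                        print(path, name, value)
```

If nothing useful turns up, the reliable fixes remain on the TV side: a set with a 1:1/Just Scan mode, or the VGA input, which (as noted above) many sets don't overscan.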