So did AMD ever get rid of all the underscan shenanigans? I have had nothing but headaches with underscanning in all my different games. I suspect it has something to do with the fact that I run my monitor through a KVM switch. CCC still seems to detect the monitor model correctly, though.
Anytime I use a new resolution I have to go into CCC and change the underscan, which for some reason defaults to 15% instead of 0%. I have to do this for every resolution and every refresh rate I use. And anytime I update my drivers, all the settings (which take me hours to get right) get reset. I don't even own a TV, so why can't I just turn this underscan crap off globally? If AMD hasn't bothered to do something about this (I see threads on various forums going back to at least 2007), then I think I'll just go nVidia on my next build. I went to replay Mirror's Edge and bumped the resolution down to 1280x720. I've got underscan. I keep playing with CCC and adjusting the underscan for 720p at various refresh rates, but those changes aren't reflected in game. This happens all the time (WiC and Crysis most recently). It would be comical if it weren't so damned frustrating.
From what I can tell, CCC has gotten worse. The GUI certainly took longer to sort through when I tried updating my drivers (and rolled back). Then again, I haven't updated my drivers in a good, long while. It shouldn't matter, but I'll pre-empt the question of what I have by saying my card is a 5970 on an HP w2338h running through an Aten CS-1764 KVM.