Topic Title: Underscan Issue
Topic Summary: Perpetual Underscanning
Created On: 10/24/2011 12:42 AM
Status: Read Only
 10/24/2011 12:42 AM
605scorpion
n00b

Posts: 1
Joined: 10/24/2011

So did AMD ever get rid of all the underscan shenanigans?  I've had nothing but headaches with underscanning in all my different games.  I suspect it has something to do with the fact that I run my monitor through a KVM switch.  CCC still seems to detect the monitor model correctly, though.

 

Anytime I use a new resolution I have to go into CCC and change the underscan, which for some reason defaults to 15% instead of 0%.  I have to do this for every resolution and every frequency I use, and anytime I update my drivers, all the settings (which take me hours to get right) get reset.  I don't even own a TV; why can't I just turn this underscan crap off globally?  If AMD hasn't bothered to do something about this (I see threads on various forums going back to at least 2007), then I think I'll just go nVidia on my next build.  I went to replay Mirror's Edge and bumped the resolution down to 1280x720: underscan again.  I keep going back into CCC and adjusting the underscan for 720p at various frequencies, but those changes aren't reflected in game.  This happens all the time (WiC and Crysis most recently).  It would be comical if it weren't so damned frustrating.
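
To put that 15% default in numbers: assuming the slider's percentage shrinks each axis linearly (an assumption on my part; the exact rule CCC applies isn't documented in this thread), the arithmetic sketches out like this in Python:

# Rough arithmetic for what an underscan percentage costs, assuming the
# percentage shrinks each axis linearly (that scaling rule is an
# assumption here, not something stated in this thread).

def underscanned_size(width, height, underscan_pct):
    """Pixel area the image actually occupies after underscan."""
    scale = 1.0 - underscan_pct / 100.0
    return round(width * scale), round(height * scale)

print(underscanned_size(1280, 720, 15))   # (1088, 612): black border all round
print(underscanned_size(1280, 720, 0))    # (1280, 720): fills the screen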

 

From what I can tell, CCC has gotten worse.  The GUI certainly took longer to sort through when I tried updating my drivers (before rolling back).  Then again, I haven't updated my drivers in a good long while.  It shouldn't matter, but I'll pre-empt the question of what I have by saying my card is a 5970 on an HP w2338h, running through an Aten CS-1764 KVM.



Edited: 10/24/2011 at 12:54 AM by 605scorpion
 10/24/2011 05:59 AM
Thanny
Alpha Geek

Posts: 1417
Joined: 07/13/2009

There's no such thing as underscan.  What you're talking about is the driver defaulting to a reasonable level of overscan compensation when connected via HDMI, which will provide a complete image on most HDMI-connected displays (i.e. HD televisions).

There is a simple way to avoid dealing with overscan issues - don't use HDMI.  Computer monitors use VGA, DVI, or DisplayPort.  HDMI is for televisions.  If the only digital connection your monitor has is HDMI, buy a better monitor.

Otherwise, deal with the fact that you're using a connection type which has baggage attached.  I doubt nVidia handles things any "better".
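
To restate that concretely, the default being described boils down to a rule like the one below.  This is only a guess at the behaviour reported in this thread, not actual AMD driver code; the connector names and the 15% figure come from the posts above.

# A guess at the scaling default described in this thread; not actual
# AMD driver code.  The idea: an HDMI sink is presumed to be an
# overscanning TV, while VGA/DVI/DisplayPort sinks are presumed to be
# monitors that show every pixel.

def default_underscan_pct(connector):
    """Default overscan-compensation (underscan) percentage for a
    given connector type."""
    return 15.0 if connector.upper() == "HDMI" else 0.0

assert default_underscan_pct("HDMI") == 15.0
assert default_underscan_pct("DVI") == 0.0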

 

 12/03/2011 08:26 AM
Herowin
n00b

Posts: 1
Joined: 12/03/2011

Thanny, your post was utterly worthless and totally unhelpful. The fact is, this is a driver issue. Telling people to get a new monitor because AMD can't seem to get their drivers to save a setting reeks of blatant fanboyism.

If it weren't for the 9th post located here: http://forums.amd.com/game/messageview.cfm?catid=260&threadid=107707&STARTPAGE=1&FTVAR_FORUMVIEWTMP=Linear I'd have sold my ATI card and picked up something from NVIDIA quite a while ago. As it stands, I just have to go through the huge inconvenience of setting the scaling for a single resolution and refresh rate, applying, rebooting, and then repeating that for ALL resolutions AND refresh rates I will be using, every time I update drivers. It's totally ridiculous.
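
For a sense of how many combinations that workaround has to cover, here is a small Windows-only Python sketch using the standard Win32 EnumDisplaySettingsW API via ctypes (nothing AMD- or CCC-specific; it only lists modes, it doesn't change any scaling):

import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Layout of the Win32 DEVMODEW structure, truncated after the
    # legacy display fields; dmSize tells the API how much we hold.
    _fields_ = [
        ("dmDeviceName",       ctypes.c_wchar * 32),
        ("dmSpecVersion",      wintypes.WORD),
        ("dmDriverVersion",    wintypes.WORD),
        ("dmSize",             wintypes.WORD),
        ("dmDriverExtra",      wintypes.WORD),
        ("dmFields",           wintypes.DWORD),
        ("dmUnion1",           ctypes.c_byte * 16),   # printer/display union
        ("dmColor",            ctypes.c_short),
        ("dmDuplex",           ctypes.c_short),
        ("dmYResolution",      ctypes.c_short),
        ("dmTTOption",         ctypes.c_short),
        ("dmCollate",          ctypes.c_short),
        ("dmFormName",         ctypes.c_wchar * 32),
        ("dmLogPixels",        wintypes.WORD),
        ("dmBitsPerPel",       wintypes.DWORD),
        ("dmPelsWidth",        wintypes.DWORD),
        ("dmPelsHeight",       wintypes.DWORD),
        ("dmDisplayFlags",     wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

def list_display_modes():
    user32 = ctypes.windll.user32
    modes = set()
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    i = 0
    # EnumDisplaySettingsW returns FALSE once iModeNum runs past the
    # last mode of the primary display (lpszDeviceName=None).
    while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
        modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
        i += 1
    return sorted(modes)

if __name__ == "__main__":
    for w, h, hz in list_display_modes():
        print(f"{w} x {h} @ {hz} Hz")   # every combination to be fixed by hand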

This has absolutely nothing to do with HDMI; it has to do with the drivers being broken for well over a year. I certainly won't be buying an AMD/ATI video card for my next build, solely due to this immense waste of my time.

Can the driver team just fix this stupid problem already?

 12/03/2011 07:08 PM
Thanny
Alpha Geek

Posts: 1417
Joined: 07/13/2009

Congratulations on your ability to completely avoid comprehending what someone else has written.

Defaulting to a level of overscan compensation when HDMI is being used is a driver design decision.  It's not a bug.

Putting an HDMI-to-DVI converter between the monitor and card may prevent the need to compensate for your connection choice.

 

 12/03/2011 08:28 PM
black_zion
80 Column Mind

Posts: 12541
Joined: 04/17/2008

The reason AMD programs the drivers to default to a level of underscan is that many HDMI displays are TVs; good-quality monitors use either DisplayPort or DVI (and may support HDMI in addition, but that's meant for game consoles).  TVs are programmed with a certain amount of overscan.  You can see why if you have a TV that lets you customize the amount of over- and underscanning: there is a strip of picture you normally don't see, which holds information useful to the TV station, such as timing information.  Modern TVs and HDMI monitors advertise their overscan behaviour in their EDID data, and like Thanny said, using a converter can screw with this.
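
For reference, that EDID detail is easy to check: a CEA-861 extension block advertises whether the display supports underscan in bit 7 of its fourth byte.  Below is a minimal Python sketch for decoding an already-captured EDID blob; how you obtain the bytes (the registry on Windows, /sys/class/drm on Linux, an i2c dump) is left out.

# Minimal sketch: check the "underscan supported" flag in an EDID blob.
# The CEA-861 extension block (tag 0x02) carries this flag in bit 7 of
# its byte 3.

def cea_underscan_flag(edid):
    """Return True/False for the CEA underscan bit, or None if the EDID
    carries no CEA-861 extension block."""
    if len(edid) < 128 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID base block")
    for i in range(1, edid[126] + 1):                # byte 126: extension count
        block = edid[128 * i : 128 * (i + 1)]
        if len(block) == 128 and block[0] == 0x02:   # CEA-861 extension tag
            return bool(block[3] & 0x80)             # bit 7: underscan supported
    return None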

-------------------------
ASUS Sabertooth 990FX/Gen3 R2, FX-8350 w/ Corsair H60, 8 GiB G.SKILL RipjawsX DDR3-2133, XFX HD 7970 Ghz, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, HP ZR2440w, Win 7 Ultimate x64