Topic Title: Graphics card mistakes monitor for TV
Topic Summary:
Created On: 07/06/2013 07:57 AM
 07/06/2013 07:57 AM

Posts: 2
Joined: 07/06/2013

I just copied the template out of the sticky thread, I hope this is fine

Graphics Card
Gigabyte RADEON HD 7870 OC

AMD Catalyst Driver Version, and Driver History
CCC 13.2

Operating System
Windows 8 SP1 64-bit

Issue Details
The problem is that when I connect my Samsung SyncMaster XL2370 to my PC with an HDMI cable, the resolution is automatically set to 1080p. The native resolution of the monitor is 1920x1080. I went to the CCC to change the resolution, but under Basic only resolutions up to 1680x1050 are listed. If I change the resolution via Windows, 1920x1080 IS listed, but when I select it, it switches back to 1080p.

To clarify, if 1080p gave me the right picture (googling showed me that it does on some monitors), I'd be fine, but if I set the thing to 1080p it basically melts my eyes out of their sockets. It really isn't good for any use. Also, the signal is completely fine as long as I use a DVI cable (which, BTW, I can't use in this particular situation).

Motherboard or System Make & Model
Gigabyte GA-890XA-UD3 v2.0

Power Supply

be quiet! 550 W

Display Device(s) and Connection(s) Used
Samsung SyncMaster XL2370 with HDMI cable

Applications and Games

Occurs at all times as long as I use the HDMI cable.

CPU Details
AMD Phenom II X4 Black Edition 965

Motherboard BIOS Version
- the BIOS version is displayed as soon as your system is powered on (ex: ASUS BIOS 1015)

System Memory Type & Amount
2x 2 GB Kingston DDR3-1333

So far I have been able to work around this issue by using an adapter. I don't have the adapter right now, though, and am wondering why I can't use this monitor the way it is intended to be used. I hope someone can help me with this!

Thank you for your time!

 07/06/2013 11:00 AM
80 Column Mind

Posts: 13530
Joined: 04/17/2008

Does it still happen with Catalyst 13.4?

AMD FX-8350 w/ Corsair H105, ASUS Sabertooth 990FX Gen3/R2, 8GiB G.SKILL DDR3-2133, XFX HD 7970 GHz Edition, 1TB Samsung 850 Evo, 256GB OCZ Vector, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, Corsair C70, HP ZR2440w, Win 7 Ult Ed x64
 07/06/2013 12:31 PM
Alpha Geek

Posts: 1558
Joined: 07/13/2009

The problem is that your monitor thinks it's a TV when you use the HDMI input, and is reporting that to the graphics card.  That means it's treating the signal as overscanned, causing the horrible text quality you see.

If you can't get the display to turn off overscan and set the graphics driver overscan value to 0%, then your "monitor" cannot be used properly via HDMI.

Just stick with DVI and you won't have any of those problems.
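As a side note on why overscan wrecks text (my own illustration, not from the original posts): with TV-style overscan the panel shows only a central crop of the incoming frame, so the 1:1 mapping between desktop pixels and panel pixels is lost and everything gets rescaled. A rough sketch of the arithmetic, assuming a symmetric per-edge crop:

```python
def visible_area(width, height, overscan_pct):
    """Pixels actually shown when the display crops overscan_pct
    percent from each edge (a common TV behaviour)."""
    crop_w = int(width * overscan_pct / 100)
    crop_h = int(height * overscan_pct / 100)
    return width - 2 * crop_w, height - 2 * crop_h

# A typical 5% overscan leaves only the centre of a 1080p frame visible,
# so the 1920x1080 desktop must be rescaled to fit, blurring fine text:
print(visible_area(1920, 1080, 5))  # (1728, 972)
```

Setting the driver's overscan compensation to 0% (or disabling overscan on the display itself, as suggested above) restores the 1:1 pixel mapping.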


 07/07/2013 04:54 AM

Posts: 2
Joined: 07/06/2013

Thanks for giving me the idea that it's a monitor problem! That led me in the right direction. I switched the monitor signal from PC to AV, adjusted the overscan, and now 1080p gives a picture quite similar to the 1920x1080 resolution over the PC-DVI signal. I might have to adjust brightness and sharpness a little, but otherwise it works fine.


93039 users are registered to the AMD Support and Game forum.

FuseTalk Hosting Executive Plan v3.2 - © 1999-2015 FuseTalk Inc. All rights reserved.