Topic Title: Can't get 1920x1080 on Samsung Full HD monitor
Created On: 08/23/2014 06:20 PM
 08/23/2014 06:20 PM
yasserxy
Peon

Posts: 5
Joined: 08/23/2014

This issue happened after changing my DVI-to-VGA adapter.

Everything worked fine for a year and a half. These are my PC specs:

Monitor: Samsung S22B300B 21.5-inch

Intel Core i5-3470

Gigabyte H61M-S2P

Kingston DDR3-1333 4GB

SAPPHIRE HD 7770 GHz Edition 1GB GDDR5

Catalyst Version 14.6 RC

I was able to get 1920x1080 without any problem, but today for some reason my monitor stopped responding, so I tried the onboard graphics (without a DVI-to-VGA adapter) and it worked. So I bought a new DVI-to-VGA adapter, and since then I can't find the 1920x1080 resolution anymore; the maximum resolution I can get is 1600x1200!



Another thing: I can't get any info about the monitor in CCC.

[screenshot attached]

 08/23/2014 08:20 PM
Thanny
Alpha Geek

Posts: 1494
Joined: 07/13/2009

Something's preventing DDC from working.  Did you try overriding the maximum resolution by unchecking the box you show in your screenshot?
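
For context, DDC is the serial channel the monitor uses to send its EDID (a block of data describing its supported modes) to the graphics card. When the read fails, the driver falls back to a generic mode list, which is why 1920x1080 disappeared. Here is a minimal Python sketch that checks whether a readable EDID is present and prints the native mode; the sysfs path is a hypothetical Linux example (this thread is on Windows, where CCC reads the same data through the driver), so treat it as illustrative only.

# Minimal EDID sanity check: verifies the fixed 8-byte header and prints the
# preferred (native) mode from the first Detailed Timing Descriptor.
# The sysfs path is an assumed Linux connector name, not from this thread.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-DVI-D-1/edid")  # assumed connector name

def preferred_mode(edid: bytes) -> str:
    # Every valid EDID base block is 128 bytes and starts with 00 FF ... FF 00.
    if len(edid) < 128 or edid[:8] != bytes.fromhex("00ffffffffffff00"):
        return "no valid EDID read; DDC is likely not working"
    dtd = edid[54:72]  # first 18-byte Detailed Timing Descriptor
    pixel_clock_mhz = int.from_bytes(dtd[0:2], "little") / 100  # stored in 10 kHz units
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)  # low 8 bits plus high 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return f"native mode {h_active}x{v_active}, {pixel_clock_mhz:.2f} MHz pixel clock"

if __name__ == "__main__":
    print(preferred_mode(EDID_PATH.read_bytes()))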

 

 08/23/2014 08:21 PM
black_zion
80 Column Mind

Posts: 13081
Joined: 04/17/2008

Why the crap are you using VGA when both your monitor and card support straight DVI?

-------------------------
AMD FX-8350 w/ Corsair H105, ASUS Sabertooth 990FX Gen3/R2, 8GiB G.SKILL DDR3-2133, XFX HD 7970, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, Corsair C70, HP ZR2440w, Win 7 Ultimate x64
 08/23/2014 08:33 PM
yasserxy
Peon

Posts: 5
Joined: 08/23/2014

Yep, I get all kinds of resolutions now.

So what's causing this issue, the adapter or something else?

 08/23/2014 08:41 PM
yasserxy
Peon

Posts: 5
Joined: 08/23/2014

Originally posted by: black_zion Why the crap are you using VGA when both your monitor and card support straight DVI?


It was working perfectly at 1920x1080 for nearly two years, so I didn't think a DVI cable was necessary.

 08/23/2014 09:18 PM
black_zion
80 Column Mind

Posts: 13081
Joined: 04/17/2008

Uh yea, use DVI, you'll notice the increased quality.

-------------------------
AMD FX-8350 w/ Corsair H105, ASUS Sabertooth 990FX Gen3/R2, 8GiB G.SKILL DDR3-2133, XFX HD 7970, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, Corsair C70, HP ZR2440w, Win 7 Ultimate x64
 08/24/2014 03:36 PM
yasserxy
Peon

Posts: 5
Joined: 08/23/2014

Originally posted by: black_zion Uh yea, use DVI, you'll notice the increased quality.

Thank you.

I just bought a DVI-D single-link cable. There's no noticeable quality improvement, but at least CCC recognizes the monitor now.

Will a DVI-D dual-link cable make any difference in quality?

 08/25/2014 03:27 PM
Thanny
Alpha Geek

Posts: 1494
Joined: 07/13/2009

Dual link is only useful for higher resolutions and/or refresh rates: it doubles the number of TMDS data pairs, lifting the single-link pixel clock ceiling of 165 MHz.

As for the quality difference between VGA and DVI, it's generally zero for a digital display, provided the monitor's ADC isn't complete garbage.
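
To put numbers on that: 1920x1080 at 60 Hz with standard CEA-861 blanking (2200x1125 total pixels) needs a 148.5 MHz pixel clock, well inside the 165 MHz single-link ceiling, so dual link buys nothing on this monitor. A quick back-of-the-envelope sketch:

# Does 1920x1080 @ 60 Hz fit in single-link DVI? Totals include blanking;
# 2200x1125 is the standard CEA-861 timing for 1080p60.
SINGLE_LINK_LIMIT_MHZ = 165.0  # single-link DVI pixel clock ceiling

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

clock = pixel_clock_mhz(2200, 1125, 60)  # 148.5 MHz
verdict = "fits on single link" if clock <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
print(f"1920x1080@60 needs {clock:.1f} MHz: {verdict}")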

 

 08/26/2014 02:42 PM
yasserxy
Peon

Posts: 5
Joined: 08/23/2014

Originally posted by: Thanny Dual link is only useful for higher resolutions and/or refresh rates: it doubles the number of TMDS data pairs, lifting the single-link pixel clock ceiling of 165 MHz.

As for the quality difference between VGA and DVI, it's generally zero for a digital display, provided the monitor's ADC isn't complete garbage.

Thank you, this is 100% correct; I see no difference either with the DVI or the DVI-VGA cable. But what was preventing DDC from working in the first place?



Edited: 08/26/2014 at 02:52 PM by yasserxy
 08/26/2014 06:32 PM
Thanny
Alpha Geek

Posts: 1494
Joined: 07/13/2009

Bad cable or bad connection would be my first guess as to why DDC wasn't working.
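
One way to tell a marginal cable from a dead one: a flaky DDC line often returns garbled data rather than nothing at all, and garbled data fails the EDID checksum (every 128-byte block must sum to 0 mod 256). A sketch, reusing the same hypothetical Linux path as above:

# EDID integrity check: each 128-byte block must sum to 0 (mod 256).
# A marginal DDC connection tends to show up here as a corrupt block.
from pathlib import Path

EDID_PATH = Path("/sys/class/drm/card0-DVI-D-1/edid")  # assumed connector name

edid = EDID_PATH.read_bytes()
for i in range(0, len(edid), 128):
    block = edid[i:i + 128]
    ok = len(block) == 128 and sum(block) % 256 == 0
    print(f"block {i // 128}: {'checksum OK' if ok else 'corrupt or truncated'}")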

 
