Topic Title: Wide Gamut Monitors and 10bit with Display Port
Topic Summary: Are all Radeon GPUs enabled to use 10bit per color channel?
Created On: 08/18/2012 12:42 PM
 08/18/2012 12:42 PM
Seba_F80
n00b

Posts: 3
Joined: 08/18/2012

Hi all,

I'm quite new to this kind of subject, so I hope I'm not asking any newbie questions.

I want to buy a new Radeon HD based graphics card for my new Windows 7 PC, and I'll add a wide-gamut IPS monitor because I'm interested in digital photography.

I'm not sure yet, but I think I will use an Intel i5 or i7 CPU (Sandy Bridge or Ivy Bridge; please tell me if this is not the right choice).

So, my question is about the 10-bit per color channel support needed for high-precision, high-fidelity color when using calibrated IPS monitors.

My target is a Dell U2410 or an NEC SpectraView Reference display (both cover more than the sRGB gamut, but less than 100% of Adobe RGB).

Do all Radeon GPU based video cards allow 10 bits per color channel if I use a DisplayPort cable? What if I used a DVI connection?

Are there any system requirements (minimum Radeon GPU type, minimum amount of RAM, motherboard chipset, Windows 7 drivers, video card drivers and so on ...) to enable this kind of "high-fidelity viewing" on an IPS monitor?

Thank you in advance, Seba_F80

 08/18/2012 01:49 PM
black_zion
80 Column Mind

Posts: 12205
Joined: 04/17/2008

Starting with the HD 5000 series, all Radeon boards support 10 bit per channel DisplayPort displays and 12 bit per channel HDMI displays, while DVI is limited to 8 bits per channel due to bandwidth limitations. For digital photography, AMD or Intel won't matter. You should be using Windows 7 x64 with at least 8GB of RAM, since high-megapixel RAW photos can use a lot of memory.
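
For completeness: the connector and driver only make 10 bits per channel possible; an application still has to request a deep-colour surface. Below is a minimal Win32/OpenGL sketch of such a request, assuming a valid window device context hdc (an assumption, not shown here). ChoosePixelFormat and DescribePixelFormat are standard GDI calls, but whether a 30-bit format is actually offered depends on the GPU, the driver and the DisplayPort link; many programs use the WGL_ARB_pixel_format extension for this instead.

#include <windows.h>
#include <stdio.h>

/* Ask the driver for a 10-bit-per-channel (30-bit colour) pixel format.
   'hdc' is assumed to be a valid window device context. Returns 1 if the
   format the driver actually selected has at least 10 red bits. */
int request_10bit_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 30;   /* 10 bits each for red, green and blue */
    pfd.cRedBits   = 10;
    pfd.cGreenBits = 10;
    pfd.cBlueBits  = 10;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return 0;                          /* nothing suitable offered */

    PIXELFORMATDESCRIPTOR got;             /* what the driver really gave us */
    DescribePixelFormat(hdc, format, sizeof(got), &got);
    printf("selected format: %d/%d/%d bits per channel\n",
           got.cRedBits, got.cGreenBits, got.cBlueBits);
    return got.cRedBits >= 10;             /* may silently fall back to 8 bpc */
}
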

-------------------------
ASUS Sabertooth 990FX/Gen3 R2, FX-8350 w/ Corsair H60, 8 GiB G.SKILL RipjawsX DDR3-2133, XFX HD 7970 Ghz, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, HP ZR2440w, Win 7 Ultimate x64
 08/19/2012 07:24 AM
Seba_F80
n00b

Posts: 3
Joined: 08/18/2012

Thank you black_zion,

So I think I'll buy one of the Radeon HD 6000 series graphics cards, as I've read this is the GPU the Apple iMac uses.

Thanks so much. I'll link this post on the official Nikon Italy forum, as many users have been wondering about this, and we have a thread on the topic.

Bye, Seb

 10/09/2012 08:55 PM
Seba_F80
n00b

Posts: 3
Joined: 08/18/2012

Originally posted by: black_zion Starting with the HD 5000 series, all Radeon boards support 10 bit per channel DisplayPort displays and 12 bit per channel HDMI displays, while DVI is limited to 8 bits per channel due to bandwidth limitations. For digital photography, AMD or Intel won't matter. You should be using Windows 7 x64 with at least 8GB of RAM, since high-megapixel RAW photos can use a lot of memory.

Hi, can you confirm that HD 6xxx series boards (for example the HD 6670, which I would like to buy) can output 10 bits per channel via DisplayPort, and that the Windows 7 64-bit drivers will allow you to use 10 bit?

Some people are telling me they cannot find where to enable this feature in the system (Windows 7).
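
For what it's worth, there is no single system-wide 10-bit switch in Windows 7 itself; the option, where the driver offers one, sits in the Catalyst driver settings, and the application still has to ask for a deep-colour surface. As a rough way to check what the installed driver actually exposes, here is a small diagnostic sketch using only documented GDI calls (a hypothetical helper, not an official AMD tool); note that formats offered only through the WGL extensions may not appear in this enumeration.

#include <windows.h>
#include <stdio.h>

/* Enumerate the pixel formats the installed driver exposes on the desktop
   and report the deepest per-channel colour depth found. */
int main(void)
{
    HDC screen = GetDC(NULL);        /* device context for the whole desktop */
    PIXELFORMATDESCRIPTOR pfd;
    int count = DescribePixelFormat(screen, 1, sizeof(pfd), NULL);
    int best = 0;
    int i;

    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(screen, i, sizeof(pfd), &pfd);
        if (pfd.cRedBits > best)
            best = pfd.cRedBits;
    }

    printf("Deepest format offered: %d bits per colour channel\n", best);
    ReleaseDC(NULL, screen);
    return 0;
}
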

Thanks again, Seba_F80
