Originally posted by: BloodyIron As the article outlines, all third-party adapters could do this properly too. The only thing preventing it is a built-in I2C EEPROM chip that identifies the device, that's it. Plenty of other adapters are properly wired and meet the necessary specs, yet AMD's driver intentionally shuts them out.
Because supporting them costs more money than using this little chip. It's that simple.
Also, they probably knew those cards would be moved to legacy status soon, which is one more good reason to do it. AFAIK all non-legacy cards should have a native HDMI output by now.
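The gating described above can be sketched in a few lines. This is purely illustrative, not AMD's actual driver code: the ID bytes, the EEPROM offset, and the helper names are all made up. The point is that the check keys on the identification chip, not on whether the adapter is wired correctly.

```python
# Hypothetical sketch of an adapter whitelist check: the driver reads an
# identification blob from an I2C EEPROM on the adapter and enables the
# output only if the expected bytes come back. All values are invented.

EXPECTED_ID = b"\x01\xa5"  # made-up vendor identification bytes

def adapter_supported(read_i2c_eeprom) -> bool:
    """read_i2c_eeprom(offset, length) returns bytes from the adapter's
    EEPROM, or None if no EEPROM responds on the bus."""
    data = read_i2c_eeprom(0x00, len(EXPECTED_ID))
    return data == EXPECTED_ID

# A correctly wired third-party adapter without the chip still fails:
no_chip = lambda off, n: None
with_chip = lambda off, n: EXPECTED_ID[off:off + n]

print(adapter_supported(no_chip))    # False - rejected despite correct wiring
print(adapter_supported(with_chip))  # True
```

As the sketch shows, electrical correctness never enters the decision; only the presence of the ID chip does, which matches the complaint in the quoted post.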
CPU: AMD Phenom II X4 810 @ 3250MHz | RAM: Kingmax 2x2GB DDR2 800 @ 833MHz | MoBo: MSI K9A2 CF v1.0 (BIOS: 1.D) | GPU: Asus HD 6850 1024MB (DirectCu) @ 850/1150MHz | Display: L24FHD | PSU: PC Power & Cooling Silencer 750 Quad | OS: MS Windows 7 Pro x64
CPU: Pentium 4 Northwood S478 @ 3200MHz | RAM: 1.5GB DDR 400 | MoBo: Gigabyte GA-8S661FXMP-RZ | GPU: ASUS Radeon X1650 Pro 256MB | Display: Dimarson 19" CRT | PSU: Noname 400W | OS: changing twice every week...