My setup is as follows: I have a 21" monitor connected to the DVI port and a surround receiver connected to the HDMI port, with the receiver feeding a 32" LCD TV. I do general computing on the monitor and play games through the TV. I usually have one screen on at a time, but occasionally both together.
After updating the driver to 13.1, the graphics card would only enable the DVI port if there was an active connection on the HDMI port. I could only use my monitor if both the TV and the receiver were on; if either was off (and not powering the HDMI link), my monitor would turn off entirely (as opposed to just going blank). I had to physically disconnect the HDMI cable to use the monitor with the receiver or TV off. This did not happen in the BIOS, only in Windows, and only with an ATI driver installed. I used the ATI cleaner utility to remove the driver; once Windows was free of ATI drivers after a restart, I could use the monitor while the HDMI connection was inactive. I even reinstalled the older driver (12.11), but the problem persisted. I had to perform a system restore to the day before to get everything working properly again.
Since the card kept switching off the DVI port even after I cleaned the system and reinstalled the older driver, I'm inclined to think there's some obscure setting I missed while futzing around with the drivers. That would explain why only a system restore got my setup back to the way it was before. Anybody know what I'm talking about? I can't upgrade my drivers until I know what was causing the auto-switch from DVI to HDMI.
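In case a diagnostic helps: below is a minimal sketch of my own (using the Win32 EnumDisplayDevicesW API through Python's ctypes, nothing from the ATI tools) that lists each display output and whether Windows currently considers it attached to the desktop. My expectation is that when the problem occurs, the DVI output would show attached: False while HDMI stays True, which would confirm the driver is disabling the port rather than just blanking it.

import ctypes
from ctypes import wintypes

# StateFlags bit meaning the output is currently part of the desktop
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
i = 0
# Enumerate every display device Windows knows about
while ctypes.windll.user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    active = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    print(dev.DeviceName, "-", dev.DeviceString, "- attached:", active)
    i += 1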
I'm using a Sapphiretech HD 7850 2GB OC card on Windows 7 64-bit.