Okay, so I recently bought a new Gigabyte Radeon 7870 OC edition from Amazon on Wednesday, meaning it comes out of the box at 1100 MHz core with VRAM set to 1200 MHz. I'm running Windows 7 64-bit, with a CX600M power supply and a 1080p 60Hz Sharp Aquos 22/23-inch LCD. The rest of the system is a GA-970 DS3 motherboard, 8 GB of Corsair RAM, and an FX-6300 with stock cooling. I have tried two HDMI cables (but not DVI or mini HDMI), and neither cable helped. The problem is this: my screen stutters/flashes every so often, at intervals ranging from 1 to 5 seconds. First a thin horizontal line appears near the middle of the screen, then 60-70% of the bottom half of the screen flashes. I have tried the 13.4 drivers and the drivers that came on the 7870's CD (I'm not sure of the version, but I think it's a 2012 driver). After a clean sweep with Driver Sweeper, installing just the disc-based drivers seemed to fix the problem. I played all night, with maybe one or two slight artifacts in Arma 3, over a session of about 5 hours. Anyway, I turned my computer off, and when I turned it back on about 8 hours later, the problem started up again. I'm not sure whether it's the drivers, the card, the cables, or the display. I'm going to try hooking the computer up to my 65-inch 1080p plasma with the same two HDMI cables, but for now I don't think it's the display, because I used a really old GPU before and it was fine.
Final words: the flashing at the bottom of the screen and the flashing horizontal line don't happen in 16-bit color or in 720p mode; they only happen at 1080p 60Hz in 32-bit true color. Also, my temps are fine: idle is 26°C, and FurMark load maxes out at around 62-63°C. So my question is, what is the problem? I also can't download/install the 13.6 beta drivers; I get an error message saying I can't install because of a damaged download or something like that. Anyway, some help would be nice.
Also, I tried lowering the core clock to around 900 MHz and 1000 MHz, but it didn't change anything, probably because it's happening even at idle :/
I tried it on my 65-inch plasma at 1080p 60Hz, and instead of the flashing I got shakiness/stuttering that I don't see while playing Arma 3 on the LCD. For example, if I pan my view left and right in the game (at around 30-60 FPS), the grass and other detail flickers on the plasma but not on my 22-inch LCD. I tried playing at the lowest settings, which gave me around 80-120 FPS, and I didn't see any problem there, but the textures were horrible, so I can understand why there wouldn't be one. None of this happens at 1080p 50Hz on my LCD; my plasma doesn't support 50Hz. So my question now is: why can't I run my screen at 1080p 60Hz? Is it the card, or possibly the screen? I think it's the card, because at 1080p 60Hz the plasma showed different problems. Could it be my HDMI cables? I tried two, and both show the same problem. I don't mind playing at 50Hz, but if my card is bad I'm going to RMA it. Sorry for talking so much, but it's just very confusing and puzzling to me.
What are your CPU temps? Maybe the new card is utilizing your CPU more than the old graphics adapter you were using.