Topic Title: How Long Would This GPU Last In Terms of Gaming?
Topic Summary: AMD Radeon HD 7970 Ghz Edition on 1920x1080p, high quality?
Created On: 02/25/2013 09:27 PM
Status: Post and Reply
 02/25/2013 09:27 PM
vElectrixx
Peon

Posts: 1
Joined: 02/25/2013

I'm talking about Crysis, Batman, and future releases that have much better graphics. I was told it would "last" for about 4 years until I'd need to change the settings to lower quality.

 02/25/2013 11:30 PM
Thanny
Elite

Posts: 1207
Joined: 07/13/2009

Well, there are a handful of games today in which a single 7970 can't maintain a steady 60fps at 1920x1080. Even though almost all games are console ports, some developers make the extra effort to add PC-only features that are more demanding on the hardware. One example is Batman: AC, which, with all DX11 features at the highest levels, is too much for even a pair of 7970s at 2560x1600 to hold a steady 60fps. Add in high PhysX (which requires a spare nVidia card and some mucking about), and it's even more demanding.

Beyond that, as far as straight console ports go, much will depend on just how much more capable the next generation of consoles is. The big two will most likely both be released within a year. The published specs for the PS4 reveal less than half the raw rendering power of a 7970, but fast unified memory and the CPU and GPU sharing a die mean performance will likely be quite high.

I'd guess that within four years, most new games will be more than a single 7970 can handle with all settings set to max.  Time will tell.

 

 02/26/2013 08:06 AM
Immortal Lobster
Forum Moderator

Posts: 219
Joined: 08/28/2012

At the current rate of game development, I don't think any GPU can last more than 6 months anymore.

 

Personally, I think it's getting ridiculous, and game developers need to program their games just a tad more humbly. Just sayin'.



-------------------------

 02/26/2013 08:51 AM
black_zion
Nanotechnology Guru

Posts: 11585
Joined: 04/17/2008

Agreed, it's going to depend on the games you play. Look at games such as the latest in the Battlefield series or even The Witcher 2, which are programmed very well, so a single 7970 handles them at very high detail levels. Then look at games such as WoW, Far Cry 3, Crysis 3, Skyrim, and any of the console ports that are rushed out quickly; they are very underwhelming.

But then again, your definition of "max settings" may be different from ours. For example, AA can cause a large performance hit, yet with the higher resolution textures used in modern games you may not subjectively notice a difference between 2x, 4x, and 8x.

-------------------------
ASUS Sabertooth 990FX/Gen3 R2, FX-8350 w/ Corsair H60, 8 GiB G.SKILL RipjawsX DDR3-2133, XFX HD 7970 Ghz, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, HP ZR2440w, Win 7 Ultimate x64
 02/26/2013 10:20 AM
zipsi
Case Modder

Posts: 922
Joined: 11/22/2010

The Witcher 2: 50-60 fps with everything maxed, including ubersampling (a solid 60 fps without it). BF3 on ultra with 4xMSAA: a solid 60 fps. And then comes Crysis 3, with 35 fps on low, 30 on medium, 25 on high and 10-15 on very high. I have no idea what in C3 requires that much hardware; it doesn't even look good, just an over-bloomed console port with such an overused amount of shaders that it looks like a house of mirrors. I'm happy I didn't buy that p-o-s and only tested it with the help of a cryshiz fanboy. It is not normal that one would need to trifire 7970s to max something out at 1080p.



-------------------------
Intel I7 3770K @ 4.5ghz, Cooler Master Hyper 212 Evo, Gigabyte Z77X-UD3H, Gigabyte HD7970 ghz edition, 4x4gb Kingston HyperX Beast 2133mhz, Seasonic Platinum 860, OCZ Vertex 4 128gb, WD Black 1TB, WD Green 3TB+1.5TB, ASUS Xonar Essence ST, ASUS VE278Q, Windows 3.11
 02/26/2013 04:09 PM
black_zion
Nanotechnology Guru

Posts: 11585
Joined: 04/17/2008

One reason why the "free" Crysis 3 code I got with my 7970 (a card which XFX has now had "in testing" for longer than I was actually able to use it) sits on my shelf collecting dust... We do have one hope, and that is that consoles are moving closer to computers, and game programmers are finally starting to learn that the community at large isn't going to pay good money for a game that takes massive amounts of money to get playable frame rates just because they're too lazy.

-------------------------
ASUS Sabertooth 990FX/Gen3 R2, FX-8350 w/ Corsair H60, 8 GiB G.SKILL RipjawsX DDR3-2133, XFX HD 7970 Ghz, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, HP ZR2440w, Win 7 Ultimate x64
 02/26/2013 04:18 PM
stumped
Stack Smasher

Posts: 6103
Joined: 11/13/2009

That is strange... I am getting a 48 fps average in Crysis 3, with everything on high except shaders and game effects, which are set to very high. I found that using D3DOverrider to force triple buffering increased the framerate by 9 fps... this is with vertical sync enabled.



-------------------------

Intel I7 960 @ 3.87ghz *Intel DX58SO *HIS HD6970 2gb *Corsair TX650M *2x4gb Corsair XMS3 *WD Black 1TB *Windows 7 Home Premium 64bit *Gateway HD2401 24" 2ms 1920x1200  ***Asus N71Jq laptop *Intel I7 720QM Processor *Mobility Radeon HD5730 1gb *8gb Ram *Windows 7 64bit  



* A clear conscience is usually a sign of bad memory *

 02/27/2013 05:46 AM
Offler
Grinding Levels

Posts: 115
Joined: 06/25/2010

Triple buffering + Vsync = Best possible configuration.

People on gaming forums usually don't understand why, while on technical and hardware forums some people agree with it.

Vsync locks frame A on screen for the duration of one monitor refresh cycle (at 60 Hz that's 16.7 milliseconds, which is plenty of time). Frame B is already written to memory, and the GPU can work on frame C if there is a free (third) buffer. If you don't have one, your performance drops.

Therefore those two features should be enabled together. Vsync with double buffering causes a dramatic performance reduction. That's the reason people go looking for "Adaptive Vsync" and similar useless features.
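
To make that mechanism concrete, here is a rough sketch (Python, assuming a fixed per-frame render time, which is a simplification real games never match) of how double buffering with vsync rounds every frame up to a whole number of refresh intervals, while a third buffer lets the GPU keep working:

  import math

  # Rough model of vsync'd presentation at 60 Hz; numbers are illustrative only.
  REFRESH_MS = 1000.0 / 60.0   # one refresh interval, ~16.67 ms

  def fps_double_buffered(render_ms):
      # With only two buffers the GPU must wait for the next vsync after
      # finishing, so each frame occupies a whole number of refreshes.
      refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
      return 1000.0 / (refreshes * REFRESH_MS)

  def fps_triple_buffered(render_ms):
      # A third buffer lets the GPU start the next frame immediately, so the
      # display gets the newest finished frame each refresh (capped at 60).
      return min(60.0, 1000.0 / render_ms)

  for ms in (10.0, 17.0, 25.0):
      print(ms, round(fps_double_buffered(ms), 1), round(fps_triple_buffered(ms), 1))
  # At a 17 ms render time, double buffering drops to 30 fps while triple buffering stays near 59.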

A secondary positive effect is that the CPU and GPU are not generating more frames than your monitor is able to display. This reduces CPU and GPU utilisation and thus power consumption.

The argument about longer input lag is not valid if you are able to generate enough frames (typically 60). In that situation input lag is extended by exactly 16.7 ms compared to disabled vsync with double buffering.

If you have low framerates there is no need to enable vsync at all. Your PC is unable to run the game at a constant speed and the performance fluctuates. In that situation the instability is a far bigger problem than vsync input lag, since unstable performance generates much longer delays.

That's why I believe Adaptive Vsync is only a placebo for those who are unable to fine-tune game performance to 60 fps. It just turns vsync on when the FPS reaches 60 and leaves it disabled otherwise, but for that feature to have any meaning you need triple buffering enabled at all times.

Triple buffering is quite memory consuming, though... especially at high resolutions.
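
For a rough sense of that memory cost, a quick back-of-the-envelope calculation, assuming 32-bit colour buffers and ignoring MSAA, depth buffers and any extra render targets, which would all raise the numbers:

  # Swap-chain colour buffer cost only; MSAA, HDR formats and depth buffers
  # are ignored and would add more on top of this.
  def framebuffer_mb(width, height, buffers, bytes_per_pixel=4):
      return width * height * bytes_per_pixel * buffers / (1024.0 * 1024.0)

  print(round(framebuffer_mb(1920, 1080, 3), 1))   # ~23.7 MB for three buffers at 1080p
  print(round(framebuffer_mb(2560, 1600, 3), 1))   # ~46.9 MB at 2560x1600

The figures scale linearly with resolution and with the number of buffers.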



-------------------------

Xeon X3360, Asus Rampage, Gigabyte HD 7970OC, Corsair HX-650



Edited: 02/27/2013 at 05:53 AM by Offler
 02/27/2013 08:39 PM
Thanny
Elite

Posts: 1207
Joined: 07/13/2009

Largely correct, but some misconceptions in there as well.  Vsync is always useful for making sure the frame buffer isn't updated in the middle of a screen refresh, regardless of the frame rate.

Triple buffering only needs to be used when the frame time sometimes goes above the refresh interval (i.e. 16.67ms for a 60Hz display), to avoid dropping from 60fps to 30fps (or 120fps to 60fps for a 120Hz display).

The display lag caused by vsync will never reach 16.67ms.  If the frame time is lower than the refresh interval, the display lag due to vsync will be the refresh interval minus the frame time.  So a 6ms frame time means that the image displayed at the next refresh will be 10.67ms old.  If the frame time is 16ms, then the image would be less than a millisecond old when it was displayed.  If the frame time were half a millisecond (possible for very, very old games), then the display lag due to vsync would be over 16ms. 

If the frame time goes above the refresh interval, then the display lag due to vsync will vary, but never reach 16.67ms.  With triple buffering, of course.  Without it, the lag would be 33.33ms minus the frame time.
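
The arithmetic from the previous two paragraphs, written out as a tiny sketch for the simple case where the frame finishes within one refresh interval at 60 Hz:

  # Age of the image at the moment it is scanned out, for a frame that
  # finishes before the refresh it is shown on (frame_time_ms <= REFRESH_MS).
  REFRESH_MS = 1000.0 / 60.0   # 16.67 ms per refresh at 60 Hz

  def vsync_image_age_ms(frame_time_ms):
      return REFRESH_MS - frame_time_ms

  for ft in (0.5, 6.0, 16.0):
      print(ft, round(vsync_image_age_ms(ft), 2))
  # 0.5 ms -> 16.17 ms old, 6 ms -> 10.67 ms old, 16 ms -> 0.67 ms old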

The only game I've ever seen which had pathological display lag with vsync was Dead Space. Absolutely atrocious with the built-in vsync, which locks the rate to 30fps. Much better, but still noticeable, if vsync is forced externally with D3DOverrider at the native rate (60fps for most people).

No other game I've played (not a small list) had noticeable display lag at all, much less due to vsync.  People using PVA or Overdrive panels will have a lot of display lag created by the monitor itself, so it's possible the little bit added by vsync will push it past the point of detection.

I still think screen tearing is much more obnoxious, which is why I always use vsync.  I cannot comprehend how some people can claim to not notice it (or, worse, claim it doesn't happen on their screen, despite the statistical impossibility of that being true).

 

 02/27/2013 10:07 PM
black_zion
Nanotechnology Guru

Posts: 11585
Joined: 04/17/2008

Fluidity is paramount. Eyes evolved to see changes, which is why you can stare at a field of 100 red squares and, as soon as one flips to maroon or burgundy, changes to another shape, or even disappears altogether, you notice it in fractions of a second. Movies run at 24fps, yet in a theater or on an HDTV using the Blu-ray standard 1080p24 they appear fluid, while the same movie using 3:2 pulldown on non-24Hz displays can show judder. Games can have frame rates in excess of your refresh rate, but if the timings between frames (something AMD has been hammered for lately) aren't consistent, you notice that as well, which is why the same game at the same frame rate can subjectively perform worse on AMD hardware than on nVidia, even if the frame rate is higher on AMD hardware.
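
To put rough numbers on the consistency point, a small sketch comparing two made-up frame-time traces (not measurements from any real game) that have the same average frame rate but very different pacing:

  # Two hypothetical frame-time traces in milliseconds: same average fps,
  # but the second alternates fast and slow frames and will feel far worse.
  smooth = [16.7] * 6
  spiky = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]

  def avg_fps(frame_times_ms):
      return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

  for trace in (smooth, spiky):
      print(round(avg_fps(trace), 1), "fps average, worst frame", max(trace), "ms")
  # Both report ~59.9 fps, but the spiky trace has 25.4 ms frames mixed in.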

The only games I can think of which really don't like vsync are the TES games, especially I-IV: bad mouse lag and increased wait/rest times if vsync was used.

-------------------------
ASUS Sabertooth 990FX/Gen3 R2, FX-8350 w/ Corsair H60, 8 GiB G.SKILL RipjawsX DDR3-2133, XFX HD 7970 Ghz, 512GB Vertex 4, 256GB Vector, 240GB Agility 3, Creative X-Fi Titanium w/ Creative Gigaworks S750, SeaSonic X750, HP ZR2440w, Win 7 Ultimate x64