Topic Title: Lower than expected FPS: CS:GO
Created On: 02/20/2014 09:16 AM
 02/20/2014 09:16 AM

Posts: 3
Joined: 02/20/2014

I'm sorry if this is too long, but I tried to be thorough. 


Graphics Card

AMD R9 290X (Gigabyte WindForce GV-R929XOC) at Gigabyte's factory settings

AMD Catalyst Driver Version, and Driver History
Catalyst 13.11 (previously 13.1)

Operating System

Windows 8 Pro (x64)

Issue Details

CS:GO maxes out at 230 FPS when absolutely idle (Source engine FPS) and averages 150 FPS (min 100) in ALL game modes (30-man server or 1v1 against a bot). This is significantly lower than expected.

To give a range: 150 FPS is barely playable (100 is NOT), while anything over 200 is perfectly acceptable. Recently I turned the in-game graphics settings DOWN and noticed a loss in FPS (180 average to 150 average), which could not be recovered by reverting to the previous settings, OR the defaults, OR the lowest possible settings.

I know that CS:GO is a CPU-intensive game, but I would not expect my i5 to be bottlenecking below 200 FPS at low temps. Furthermore, boosting AMD settings seems to handicap the FPS even more (from what I understand, and I could be totally wrong, things like FXAA are handled entirely by the GPU and not the CPU). If turning FXAA on costs me 20 FPS on average, that seems to point at the GPU as the bottlenecking culprit (such a big decrease, if handled only by the GPU, means the GPU was already quite taxed), which makes no sense, because it's a 290X, and lowest settings + FXAA should not be so impactful.
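To put that 20 FPS drop in perspective, here is a quick back-of-the-envelope frame-time calculation (a sketch using my reported averages of roughly 150 FPS with FXAA off and 130 with it on):

```python
# Convert an FPS drop into the absolute per-frame cost in milliseconds.
# At high frame rates, a large-looking FPS drop can be a tiny per-frame cost.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds at a given frames-per-second rate."""
    return 1000.0 / fps

# Reported averages: ~150 fps with FXAA off, ~130 fps with FXAA on.
fxaa_cost_ms = frame_time_ms(130) - frame_time_ms(150)
print(f"FXAA costs roughly {fxaa_cost_ms:.2f} ms per frame")  # roughly 1.03 ms
```

So the "big" 20 FPS drop is only about 1 ms of extra work per frame, which is worth keeping in mind when comparing FPS deltas at high frame rates.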

Furthermore, others with inferior GPUs (particularly AMD Radeon cards) and the same i5 CPU have reported greater than 210 average FPS during gameplay.


The usual suspects: 

CPU usage hovers between 20% and 40% during gameplay.

No CPU cores are being parked.

Multicore rendering is enabled (Resource Monitor reports all 4 cores working during gameplay, and due to the game, not other apps).

Testing was done with -high (the launch option for high process priority) and without it (my usual); no difference was seen.

CPU temp is 43 °C across all cores, averaged over 20 minutes of gameplay.

GPU temp is between 46 and 51 °C, averaging 50 (it sits at 50 consistently) -- well within the 290X's comfortable temp range.

Tests were run at the absolute lowest game settings at 1920x1080 (I am not willing to change resolution), with NO vsync.

The Catalyst CS:GO profile is manually set to use application settings.

I play the game in windowed mode (-noborder -sw launch options) on one monitor (I have 3 in total running during the game).
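For reference, the average/min figures quoted above can be reproduced from a list of per-frame times; here is a minimal sketch (the frame times are made up for illustration, not something CS:GO logs by default):

```python
# Summarize a list of per-frame render times (milliseconds) into the
# average and minimum FPS figures quoted in this post.

def fps_summary(frame_times_ms):
    """Return (average_fps, minimum_fps) for per-frame times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Illustrative frames at 5 ms, 1000/150 ms and 10 ms (i.e. 200, 150, 100 fps):
avg_fps, min_fps = fps_summary([5.0, 1000.0 / 150, 10.0])
print(f"avg {avg_fps:.0f} fps, min {min_fps:.0f} fps")  # avg 150 fps, min 100 fps
```

Note that averaging the per-frame FPS values (as above) is not the same as dividing total frames by total time; I am only using it to illustrate how an average can sit well above the minimum.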

Motherboard or System Make & Model

MSI 7808

Power Supply

850W Corsair 

Display Device(s) and Connection(s) Used

Acer 60 Hz 21.5" via DVI

CPU Details

Intel Core i5-2310 @ 2.91 GHz, 4 cores

 02/20/2014 08:59 PM

Posts: 1050
Joined: 11/22/2010

150 FPS barely playable, 100 unplayable... on a 60 Hz monitor? I just fail to find the logic here. 60 FPS would be the PERFECT framerate for this monitor, as it can't display any more frames. Turn on vsync.

Intel I7 3770K @ 4.5ghz, Cooler Master Hyper 212 Evo, Gigabyte Z77X-UD3H, Gigabyte HD7970 ghz edition, 4x4gb Kingston HyperX Beast 2133mhz, Seasonic Platinum 860, Samsung 840 Pro 256gb, OCZ Vertex 4 128gb, WD Black 1TB, WD Green 3TB+1.5TB, ASUS Xonar Essence ST, ASUS VE278Q, Windows 3.11
 02/21/2014 06:10 AM

Posts: 3
Joined: 02/20/2014

I believe that right before I stated those figures, I explained it was Source engine FPS.

If you do not know the difference, then please read.


I am not complaining about the refresh rate of my monitor.

 02/21/2014 09:57 AM

Posts: 135
Joined: 11/11/2012

"FXAA are entirely handled by the gpu and not the cpu"

Does FXAA work on AMD GPUs? FXAA/TXAA = Nvidia. MLAA = AMD, no?

Did you link the correct article? And what exactly is "Source engine FPS"? Why would a game like CS not be playable at 100 FPS, and how do you measure the frame rate of the Source engine?

I skimmed the article about servers, but didn't pick up on why the game would be unplayable at 150 FPS. Are you running a server?

Plus, the article concludes (can't make the bold go away...):

It is not important if a server runs at 333, 500, 600 or even 1000 FPS. Any of these frame rates make a server fast enough. It is far more important that the server has a high quality internet connection and always reaches its pre-set FPS.

This has what to do with those playing the game?

Not sure if it applies, because you seem to be moving settings around, but I read this interesting observation about lowering settings in a CPU-bound game a while back:

"The biggest bottleneck that any piece of hardware will suffer from is software not utilizing its full capacity.

It's extremely easy to create an 'artificial' bottleneck where you match a slower CPU with a high-end GPU, then turn all the game settings down to low.

This will inevitably show that a faster CPU allows the high-end GPU to render more frames per second.
However, this isn't a realistic real-world usage scenario, as who buys a graphics card to run games at low settings? You're much more likely to have the game settings turned up as high as they will go, drastically increasing the graphics processing requirements of every single frame rendered and reducing the amount of work the CPU actually has to do.... "


 02/21/2014 08:45 PM

Posts: 3
Joined: 02/20/2014

If you skim an article and then say "I don't understand how this relates," then, because you only skimmed the article, I can't really help you any further.

CS servers run at FPS x. Your computer runs at FPS y. As y → x, the game reaches its highest playability; in other words, the version of the game you are running is closest to the version the server is running, before accounting for latency or loss.

If you read the article this would be incredibly obvious as to what it has to do with playing the game.

FPS should be a function of CPU and GPU power.

Since I didn't say I was running a server, you can safely assume I am not.

I have no idea why FXAA wouldn't work on AMD GPUs; it isn't proprietary to Nvidia, and it's mostly software handled anyway, right? Also, I said I could be wrong about the CPU/GPU handling thing.


The last thing you mention is interesting, but it wouldn't apply in general, since I have tested all the relevant permutations of settings. Although it does suggest a few things about why turning the graphics settings all the way up doesn't have as big an impact (assuming it applies here in any way).


 02/23/2014 06:24 AM

Posts: 5
Joined: 02/13/2014

I think we are experiencing the same problem; my issue is with the older CoD2 game. A 280X doesn't maintain 250 FPS, but a Radeon 6850 works like a charm.

And guys, please don't talk about how useless it is to run 250 FPS on a 60 Hz monitor. Try playing CoD2 at 60 FPS and then at 250, then report back. The difference is noticeable.
