I don't think you understand the nature of supersampling. 8xSSAA means the GPU renders and shades eight samples for every output pixel, so at 1920x1080 that's roughly 16.6 million samples per frame, the equivalent of rendering at about 5430x3055. That's eight times the work of native resolution, every single frame.
Why would you think any GPU can do that in a modern game with no problem?
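To put rough numbers on that, here's a quick back-of-the-envelope script. It assumes ideal NxSSAA, where shading work scales linearly with sample count; real driver implementations vary, but the scaling is the point:

```python
# Rough cost of NxSSAA: N samples shaded per output pixel.
width, height = 1920, 1080

for n in (1, 2, 4, 8):
    samples = width * height * n
    # Equivalent render resolution: each axis scales by sqrt(n).
    scale = n ** 0.5
    print(f"{n}xSSAA: {samples / 1e6:5.1f}M samples/frame "
          f"(~{round(width * scale)}x{round(height * scale)})")
```

At 8x that's ~16.6M shaded samples per frame versus ~2.1M at native, before the oversized framebuffers hit memory bandwidth.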
As a quick experiment, I ran Mass Effect (with frame rate smoothing disabled) using various AA settings. Resolution was 2560x1600, and I have two Radeon HD 7970s in CrossFire at 1200/1500 (core/memory).
I loaded a save at the galaxy map console on the Normandy bridge and had the character turn around to look at a particular spot on the wall (facing toward the front of the ship is CPU-bound at about 130fps, which would mask any GPU difference). Here are the results:
- No AA - 305fps
- 8xMSAA - 203fps
- 4xSSAA - 137fps
- 8xSSAA - 82fps
- 8xMLAA - 148fps
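Raw fps deltas can be hard to compare across modes, so here's the same data converted to per-frame render time. This is just arithmetic on the numbers above, nothing measured separately:

```python
# Convert measured fps to frame time (ms) and cost relative to no AA.
results = {
    "No AA": 305,
    "8xMSAA": 203,
    "4xSSAA": 137,
    "8xSSAA": 82,
    "8xMLAA": 148,
}

base_ms = 1000.0 / results["No AA"]
for mode, fps in results.items():
    ms = 1000.0 / fps
    print(f"{mode:>6}: {ms:5.2f} ms/frame ({ms / base_ms:.1f}x no-AA cost)")
```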
As you can see, there's a substantial drop in frame rate going from 8xMSAA to 8xSSAA: 203fps down to 82fps, roughly 2.5x the frame time. Moreover, because Mass Effect's Unreal Engine 3 renderer uses deferred rendering, none of the traditional hardware AA methods works on all surfaces. The only one that truly anti-aliases the entire scene is MLAA, a post-process method that operates on the completely rendered frame, and it has its own downside: it can render some in-game text unreadable.
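For context on why MLAA is the only thing that catches everything, and also why it chews up text: it never sees geometry at all, just the finished frame, so any high-contrast edge gets blended whether it belongs to a polygon or a letter. Here's a toy illustration of that idea in NumPy. This is not the actual MLAA algorithm (which classifies edge shapes and does weighted blends); it's a crude luminance edge blend, just to show the principle:

```python
import numpy as np

def toy_postprocess_aa(img, threshold=0.1):
    """Crude stand-in for post-process AA: blend neighboring pixels
    across high-contrast luminance edges in an already-rendered frame.

    img: float array of shape (H, W, 3), values in [0, 1].
    """
    # Per-pixel luminance (Rec. 709 weights).
    luma = img @ np.array([0.2126, 0.7152, 0.0722])

    out = img.copy()

    # Blend across vertical edges (left-right neighbor contrast).
    ex = np.abs(np.diff(luma, axis=1)) > threshold
    mix = 0.5 * (img[:, :-1] + img[:, 1:])
    out[:, :-1][ex] = mix[ex]
    out[:, 1:][ex] = mix[ex]

    # Blend across horizontal edges (up-down neighbor contrast).
    ey = np.abs(np.diff(luma, axis=0)) > threshold
    mix = 0.5 * (img[:-1, :] + img[1:, :])
    out[:-1, :][ey] = mix[ey]
    out[1:, :][ey] = mix[ey]

    return out

# Demo on a stand-in frame; in-game the driver does this on the GPU.
frame = np.random.rand(64, 64, 3)
smoothed = toy_postprocess_aa(frame)
```

A filter like this can't tell an aliased polygon edge from the edge of a font glyph, which is exactly why some of the in-game text ends up unreadable.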