Short answer: that's probably entirely normal for your setup.
For any 3D application, there are two frame generation rates. The first is how fast the CPU can simulate the world and prepare each frame's data to send to the graphics card. The second is how fast the graphics card can render each frame (geometry, textures, shading).
If the first is lower than the second, GPU usage will sit below its capacity. Say your CPU can prepare 30 frames per second to be rendered, while the GPU could render 45 of those prepared frames per second. The overall rate will be only 30fps, and the GPU will be idle about a third of the time.
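That arithmetic can be sketched in a few lines. This is just an illustration of the reasoning above (the function names are my own, not from any real profiling API): the pipeline runs at the slower of the two rates, and the faster component waits out the rest of each frame's time.

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Overall frame rate is limited by the slower stage."""
    return min(cpu_fps, gpu_fps)

def gpu_idle_fraction(cpu_fps: float, gpu_fps: float) -> float:
    """Fraction of time the GPU waits when the CPU is the bottleneck."""
    if cpu_fps >= gpu_fps:
        return 0.0  # GPU-bound: the GPU never waits
    return 1.0 - cpu_fps / gpu_fps

# The example from above: CPU prepares 30 fps, GPU could render 45 fps.
print(effective_fps(30, 45))                 # 30 fps overall
print(round(gpu_idle_fraction(30, 45), 2))   # 0.33 -- idle about a third of the time
```

The same functions also show the reverse case: if the GPU is the slower stage, it runs at 100% and the CPU is the one with headroom.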
The maximum CPU and GPU frame rates vary from game to game, and within each game with the detail settings and screen resolution (for the most part, as resolution goes up, the CPU load stays the same while the GPU load rises substantially). The game's threading matters a lot, too, in these days of ubiquitous multi-core processors. If the game doesn't use multiple threads, reported CPU usage will sit well below the processor's total capacity, because only one core is doing the work.
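That last point is why a CPU-bound game can still show low CPU usage in a system monitor. A rough sketch of the idea (my own illustration, not any real monitoring API): a monitor that averages utilization across all cores reports a small number when only one core is saturated.

```python
def reported_cpu_usage(busy_cores: int, total_cores: int) -> float:
    """Average utilization across all cores, as a percentage,
    assuming the busy cores are fully loaded and the rest are idle."""
    return 100.0 * min(busy_cores, total_cores) / total_cores

# A single-threaded game saturating one core of an 8-core CPU:
print(reported_cpu_usage(1, 8))  # 12.5 -- the game is CPU-bound, yet "CPU usage" looks low
```

So a reading like 12% CPU and 60% GPU doesn't mean nothing is the bottleneck; it can mean one core is pegged while the others sit idle.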