I replaced several identical XP PCs with dual-monitor setups with new identical models (old Dell OptiPlex 390s swapped for new OptiPlex 3020s). We kept one of the old XP machines in place for the time being, so I have a working XP setup just as it was, plus the three new Win7 units. All PCs connect to dual monitors of identical make and model: one Samsung 19" widescreen and one Dell 19" standard. All PCs have the EXACT SAME resolutions set for each monitor. XP is set to 96 DPI; Win7 is set to "Smaller", which is supposed to also be 96 DPI.
Problem is we have an app with hard-coded window sizes. On the XP box, when we push that app to the secondary screen (the Samsung LCD), the app displays perfectly within the screen boundaries. On Windows 7, when we do the same, the app appears to be about 1.5x too large and exceeds the screen dimensions. As we are already at the maximum resolution for these monitors, we cannot increase it further.
So what gives? 1360x768 is supposed to be 1360x768; it should not be OS dependent. 96 DPI should be 96 DPI; it shouldn't change with the OS. Yes, all monitors have their own settings set exactly the same (zoom, screen size, etc.). Yes, I get that different video cards will change things, but I come back to this: 1360x768 is supposed to be 1360x768. The video card shouldn't be adjusting that on its own based on the OS while Windows still shows us we are at a certain resolution.
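For what it's worth, the ~1.5x figure is suspicious: Windows 7 scales the windows of apps that don't declare themselves DPI-aware by effective DPI / 96, and 144 / 96 is exactly 1.5 (144 DPI corresponds to the "Larger - 150%" setting). A quick sketch of that arithmetic, using a hypothetical hard-coded window width just for illustration:

```python
# Sketch: how Windows DPI virtualization would scale a hard-coded
# window dimension. The 1280px width below is a made-up example,
# not our actual app's size.

def scaled_size(size_px: int, effective_dpi: int, base_dpi: int = 96) -> int:
    """Pixels Windows would render a hard-coded dimension at."""
    return round(size_px * effective_dpi / base_dpi)

hardcoded_width = 1280  # hypothetical hard-coded app width

print(scaled_size(hardcoded_width, 96))   # 1280 -> fits on a 1360x768 screen
print(scaled_size(hardcoded_width, 144))  # 1920 -> overflows; 144/96 = 1.5x
```

So if something (a driver panel setting, or a per-user display setting) is quietly putting the Win7 boxes at 144 DPI despite "Smaller" being selected, it would produce exactly the 1.5x blowup we're seeing.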
Again: same monitors on all stations, same video card on all stations. Windows 7 is clearly displaying at a different effective resolution than XP even though the resolution settings are identical.
AMD Radeon 5450 - Latest drivers