Any of you PC gamers know how to prevent Windows from using a certain refresh rate? As in disable it, or mark it as unusable?
I still use my old CAD-grade 16:10 24" Sony CRT for gaming because:
- Zero input lag at all resolutions
- No ghosting
- AMAZING color/image quality
Unfortunately it does introduce one problem. I run at 1920x1200@96Hz pretty much all the time. However, Windows 8.1 thinks 1920x1200@100Hz on my monitor is an acceptable refresh rate even though 96Hz is as high as it goes at 1920x1200.
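In case anyone wants to double-check what the driver is actually exposing, here's a rough Win32 sketch (just an illustration, assuming the primary display) that lists every 1920x1200 mode Windows advertises:

```c
/* Rough sketch: list every 1920x1200 mode the driver advertises for the
 * primary display. Read-only; it never changes the mode.
 * Build with e.g.  cl listmodes.c user32.lib  */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL = primary display; walk the full mode list the driver reports */
    while (EnumDisplaySettings(NULL, i++, &dm)) {
        if (dm.dmPelsWidth == 1920 && dm.dmPelsHeight == 1200)
            printf("1920x1200 @ %lu Hz, %lu bpp\n",
                   (unsigned long)dm.dmDisplayFrequency,
                   (unsigned long)dm.dmBitsPerPel);
    }
    return 0;
}
```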
The vast majority of games will either use what the system is already using (1920x1200@96Hz) or give me the option to select a resolution AND refresh rate. However, a small subset simply grab the maximum available refresh rate for a given resolution and run at that. That results in a blank screen until I Alt-Tab out and use Task Manager to kill the program. I've been wanting to play one of those games lately, and it's annoying the crap out of me.
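My rough guess at what those games are doing internally is something like this (just an illustrative sketch of naive mode selection, not any particular game's actual code):

```c
/* Illustrative sketch of the "grab the max" behaviour: at a fixed
 * resolution, take whichever advertised mode has the highest refresh
 * rate and switch to it. If 100 Hz is advertised but the monitor can't
 * sync it, running this reproduces the blank screen. */
#include <windows.h>

int main(void)
{
    DEVMODE dm, best;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    ZeroMemory(&best, sizeof(best));
    best.dmSize = sizeof(best);

    while (EnumDisplaySettings(NULL, i++, &dm)) {
        if (dm.dmPelsWidth == 1920 && dm.dmPelsHeight == 1200 &&
            dm.dmDisplayFrequency > best.dmDisplayFrequency)
            best = dm;                      /* keep the highest-Hz match */
    }

    if (best.dmDisplayFrequency) {
        best.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;
        ChangeDisplaySettings(&best, CDS_FULLSCREEN); /* jumps to e.g. 100 Hz */
    }
    return 0;
}
```

So as long as that bogus 100Hz entry stays in the advertised list, anything using that kind of logic is going to land on it every time.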
If it helps at all, the monitor model is an SGI GDM-FW9011 (basically a rebadged Sony GDM-FW900).
Kestrel (Falcon NW Tiki) – 4.0 GHz i7 4790K / 16GB RAM / 512GB Samsung 950 Pro M.2, 2x480GB Intel 730 (RAID0), 10TB STX BarraCuda Pro / GeForce GTX TITAN X 12GB
Iridium (MacBook Pro Mid-2012) – 2.7 GHz i7 3820QM / 16GB RAM / 2TB Samsung 850 Pro / GeForce GT 650M 1GB
Antimony (PowerBook G4 2001) – 1.0 GHz PPC 7455 / 1GB RAM / 512GB Micron M600 / Radeon 9000 64MB
When there's a multiplayer version, I'm going to be on Frost's team. Well, except he doesn't seem to actually need a team...I mean, what's the point? "Hey look, it's Frost and His Merry Gang of Useless Hangers-On!" Or something.