02-28-2002, 10:12 AM   #8
Detah
FYI: some stuff I found on the net.

Refresh Rate - The number of times per second a displayed image is regenerated, measured in hertz (Hz); e.g., 60 Hz means the screen is redrawn 60 times per second, or roughly once every 16.7 ms. The faster the rate, the steadier the image.

Slower video refresh rates can cause flicker at high resolutions. The video adapter and monitor may support a higher refresh rate for the resolution you are using.
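
If you know a little C, here is a quick Win32 sketch (my own illustration, not something from any readme) that asks the adapter for every mode it claims to support, including the refresh rate of each. EnumDisplaySettings is a standard Windows call; the list you get back depends entirely on your driver.

[code]
/* List every display mode the primary adapter reports.
   Link with user32.lib. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Walk the mode table of the primary display (NULL device)
       until EnumDisplaySettings runs out of modes. */
    while (EnumDisplaySettings(NULL, i, &dm)) {
        printf("%4lux%-4lu  %2lu-bit  %3lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
        i++;
    }
    return 0;
}
[/code]

Anything the driver does not list here is a mode you should not try to force.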

It is entirely possible that when you installed your video card (or its driver), it forced your refresh rate to a certain level, like 100 Hz. Higher refresh rates are recommended when you experience flickering or jerky gameplay in some games.
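
To confirm what the driver actually set you to, the same call takes ENUM_CURRENT_SETTINGS instead of a mode index. Again, just a small sketch of mine, not an official utility:

[code]
/* Print the display mode in use right now. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    if (EnumDisplaySettings(NULL, ENUM_CURRENT_SETTINGS, &dm)) {
        /* A frequency of 0 or 1 means "hardware default",
           not a real rate. */
        printf("Current mode: %lux%lu at %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency);
    }
    return 0;
}
[/code]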

I suggest you read the readme.txt files for your video card and your monitor and check what each one recommends for the refresh rate.

I found this on the Microsoft website:
“Configuring your hardware to use a refresh rate that your hardware does not support could damage your monitor, and is not recommended.” [hardware here means your video card and your monitor.]

I have never heard of a lower setting damaging a computer. I'm not sure why Windows is giving you that message. Maybe it's a built-in protection so people don't go changing the refresh rate willy-nilly. It is possible that setting your refresh rate to 110 Hz when the recommended rate is 60 Hz could harm your monitor, but not vice versa. To be safe, check your readme files and see what is recommended.
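
If you do decide to raise the rate, Windows can at least sanity-check the mode before you commit to it. ChangeDisplaySettings with the CDS_TEST flag validates a mode against the driver without actually switching to it. The resolution and rate below are made-up example values; put in your own. Keep in mind this only asks the driver - it cannot know what your monitor tube can physically take, so the readme advice above still stands.

[code]
/* Test a display mode without applying it. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Example values only - substitute your own resolution and rate. */
    dm.dmPelsWidth        = 1024;
    dm.dmPelsHeight       = 768;
    dm.dmDisplayFrequency = 85;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
        printf("Driver says this mode should work.\n");
    else
        printf("Driver rejected this mode - do not force it.\n");

    return 0;
}
[/code]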

PS: Once you have confirmed that your video card driver has been updated, you no longer need the installer .exe file. I recommend creating a drivers folder and placing it in there in case you need to reinstall the driver later. It is also a good idea to keep a little .txt file in there noting the file name of the driver and what device it belongs to.
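
If you want, that bookkeeping can even be scripted. A throwaway C sketch of what I mean - the folder path and the note text below are made-up examples, use whatever matches your setup:

[code]
/* Make a drivers folder and drop a note next to the installer. */
#include <direct.h>   /* _mkdir (Windows CRT) */
#include <stdio.h>

int main(void)
{
    FILE *note;

    _mkdir("C:\\drivers");   /* harmless if the folder already exists */

    note = fopen("C:\\drivers\\readme.txt", "a");
    if (note != NULL) {
        /* Hypothetical entry - record your real file name and device. */
        fprintf(note, "setup.exe - video card driver\n");
        fclose(note);
    }
    return 0;
}
[/code]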

Detah