# 32bit True color or 16bit high color?



## BigDaveinNJ (Jun 9, 2000)

I have a Dell PIII 500MHz PC running Win98SE, and I was wondering which of the display property settings ... 32-bit true color or 16-bit high color ... would be better as far as image quality and overall PC performance. I run my Trinitron monitor @ 800 x 600 resolution because my eyes cannot handle the smaller fonts that accompany the higher resolutions.

Thanks in advance

DAVID


----------



## jerricho (Dec 21, 2001)

This page will give you a good piece of info on the different color modes....

Color Mode


----------



## yohojones (May 7, 2003)

You'll experience banding (stripes running through the picture) in some pictures if you run in 16-bit color mode. If you have a video card with 8 megs of RAM or more, running in 32-bit should not be a problem.
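To see why 16-bit mode causes banding, here's a minimal sketch (my own illustration, not anything from the linked page): in 16-bit "high color" the red and blue channels typically get only 5 bits each (the RGB565 layout), so a smooth 256-step gradient collapses to 32 distinct shades, which your eye sees as stripes.

```python
# Sketch: why 16-bit color shows banding on smooth gradients.
# Assumes the common RGB565 layout, where red/blue get 5 bits per channel.

def quantize_5bit(value):
    """Round an 8-bit channel value (0-255) to 5 bits, then expand back."""
    level = value >> 3                   # keep top 5 bits: 0-31
    return (level << 3) | (level >> 2)   # standard 5-to-8 bit expansion

gradient = list(range(256))              # a smooth 8-bit ramp
banded = [quantize_5bit(v) for v in gradient]

print(len(set(banded)))                  # 32 distinct shades instead of 256
```

Runs of adjacent pixels that used to differ by 1 now share the same value, and the jumps between those runs are the visible "stripes."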

It all depends on what level of visual quality you prefer.

While we're on the subject: if the higher resolutions hurt your eyes, you could try raising your refresh rate. Right-click a blank area of your desktop and choose Properties to open Display Properties. Go to the Settings tab, click the Advanced button, then go to the Adapter tab, where there will be a pull-down menu of refresh rates. Switching to a higher number will reduce screen flicker. Make sure your monitor can handle the higher frequency first, though, because setting it to an unsupported rate can wreck your monitor.

Yoho


----------



## BigDaveinNJ (Jun 9, 2000)

Thanks for the info guys.

 D.


----------



## brushmaster1 (Jun 15, 2002)

16-bit color (NOT 16 colors ... there is a HUGE difference) uses far fewer system resources, and the human eye cannot tell the difference between 16-bit and 32-bit color.


----------



## yohojones (May 7, 2003)

The human eye cannot tell the difference between 24-bit and 32-bit... but it can definitely tell the difference between 16 and 32.

Why would video card manufacturers spend millions putting a 32-bit color mode into their video cards if humans couldn't even tell the difference?

Yoho


----------



## brushmaster1 (Jun 15, 2002)

I do a LOT of image processing, and I use 16-bit color. I have tried 24-bit and 32-bit, and there is NO visible difference! This applies regardless of video card or monitor, as I have used many of each.

Video card manufacturers started using 24-bit color, then 32-bit color, mainly as an advertising gimmick; 32-bit color SOUNDS more impressive than 16-bit. Why do you think all the high-end video cards still support the lower color depths? 4-bit (16 colors) and 8-bit (256 colors) are needed for some old EGA and VGA displays, but there would be no reason to keep supporting 16-bit color... except that it is just as good as the higher color depths while using fewer resources.


----------



## yohojones (May 7, 2003)

http://www.lenswork.com/calibrate.htm

If you don't want to read the article, it basically says that 16-bit color supports up to 65,536 colors, while 32-bit color supports up to 16,777,216 colors.

It's generally accepted that humans can resolve 7 to 10 million colors, so you can definitely tell the difference between the two. 24-bit and 32-bit are not discernible from each other.
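The arithmetic behind those counts is just powers of two per color bit. One wrinkle worth noting (my addition, not from the article): the 16,777,216 figure comes from 24 bits of actual color; in 32-bit mode the extra 8 bits are typically padding or an alpha channel, not more shades.

```python
# Color counts by bit depth: each bit doubles the number of representable colors.
print(2 ** 16)   # 65536     -> 16-bit "high color"
print(2 ** 24)   # 16777216  -> 24 color bits, as used in 24- and 32-bit modes
```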

Yoho


----------

