# show 8bit images on 16bit screen?

Help me, I need to know how to display 8-bit images in a 16-bit screen mode...


I'll try to help from what I know, but I can't give you a complete solution.
Going from 8-bit to 16-bit can be a problem because the two formats mean completely different things:
16-bit means you have one 16-bit integer per pixel, where the 5 most significant bits are red (0-31), the next 6 are green (0-63), and the last 5 are blue (0-31).
Bit/Color: F/R, E/R, D/R, C/R, B/R, A/G, 9/G, 8/G, 7/G, 6/G, 5/G, 4/B, 3/B, 2/B, 1/B, 0/B -- in other words RRRRRGGGGGGBBBBB

But 8-bit means you have one char (0-255) per pixel, which is an index into a palette (Color0, Color1, etc.).
The palette's location and size vary; entries often use 12 bits (4 bits for red, 4 for green, 4 for blue), and the colors in the palette are normally chosen per image so that 256 colors approximate the original image as well as possible. All this information should be in the image's header, or the image may rely on the Windows system palette.

What you have to do is figure out where the image's palette is and what size its entries are, then convert each palette entry into the 16-bit format.
Case 12-bit (each component is 4 bits, 0-15):
red_16 = red_12 << 1   (4 bits -> 5 bits; same as red_12 + red_12)
grn_16 = grn_12 << 2   (4 bits -> 6 bits)
blu_16 = blu_12 << 1   (4 bits -> 5 bits)