Video boards for the last few years have been 24 bit color (called True Color). Note that there is no 32 bit color. The confusion is that 24 bit color mode normally runs as 32 bit video mode today, named for the efficient 32 bit accelerator chips (their word size). The 24 bit color mode and so-called 32 bit video mode show the same 24 bit colors, the same 3 bytes of RGB per pixel. 32 bit mode simply leaves one of the four bytes unused (wasting 25% of video memory), because storing 3 bytes per pixel severely limits video acceleration functions. Processor chips copy data efficiently only in their natural word sizes (8, 16, 32, or 64 bits), and 24 bits is not one of them. A 24 bit copy done with a hardware video accelerator would require three 8-bit transfers per pixel instead of one 32-bit transfer. 32 bit video mode is for speed, and it shows 24 bit color.
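The padding scheme is easy to sketch in a few lines of code. This is only a hypothetical illustration of the idea (the function names are invented, not any driver's actual API): each pixel's three color bytes sit in one aligned 32 bit word, and the fourth byte is simply unused.

```python
def pack_xrgb8888(r, g, b):
    """Pack an RGB triple into one 32 bit word (XRGB8888 layout).

    The high byte stays zero -- unused padding. The color itself
    is still only 24 bits: one byte each of R, G, B."""
    return (r << 16) | (g << 8) | b

def unpack_xrgb8888(word):
    """Recover the RGB triple; the padding byte is discarded."""
    return ((word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF)

# Memory cost for a 1920x1080 frame, both ways:
pixels = 1920 * 1080
bytes_24 = pixels * 3   # packed 3 bytes/pixel: awkward for word-sized copies
bytes_32 = pixels * 4   # one aligned 32 bit word per pixel
overhead = (bytes_32 - bytes_24) / bytes_32   # the 25% "wasted" memory
```

The trade is exactly as the text describes: one aligned 32 bit transfer per pixel instead of three byte transfers, at the price of a quarter of the video memory.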
24 bit color is 8 bits each of RGB, allowing 256 shades of each primary color, and 256x256x256 = 16.7 million color combinations. Studies show that the human eye can detect about 100 intensity steps (at any one current brightness adaptation of the iris), so 256 tones of each primary is more than enough. We won't see any difference between RGB (90,200,90) and (90,201,90), but we can detect 1% steps like (90,202,90) (on a CRT tube; 18 bit LCD panels show 1.5% steps). So our video systems and printers simply don't need more than 24 bits.
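The arithmetic in this paragraph can be checked directly (plain Python, only restating the numbers above):

```python
shades = 256                 # 8 bits per primary color
combinations = shades ** 3   # every RGB combination: the "16.7 million" colors

# The example steps around tone 200:
step_200_to_201 = (201 - 200) / 200   # 0.5% -- below the ~1% visible threshold
step_200_to_202 = (202 - 200) / 200   # 1.0% -- about the smallest visible step
```

At mid and high tones each single level is well under a 1% change, finer than the roughly 100 steps the eye can resolve, which is the point of the paragraph.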
http://www.scantips.com/basics11.html

Moreover, no display system currently handles "alpha layers", Beryl / XGL included.