I used to make games in 15- and 16-bit color mode. The problem with them was driver support: drivers weren't always consistent, and could be very problematic. With 24-bit graphics you have 8 bits red, 8 bits green, and 8 bits blue. With 16 bits you have to split 16 three ways, and it just doesn't divide evenly, so you get 5-6-5. The problem is, different hardware put the extra bit in different locations, so 15-bit color was normally more compatible. Don't listen to Wikipedia on this: I've never seen a card that wasted 4 bits on an alpha channel, leaving you only 12 bits of color depth, and I have a lot of programming experience from that time span, on more than a few devices (the Ozzy_98 on some TG-16\PCE demos was me)
If you're wondering about 32-bit color, it's 8 bits per channel plus an 8-bit "alpha" channel, meaning they pretty much waste 8 bits. But each pixel is a naturally aligned 32-bit word, so most hardware can process 32-bit pixels faster than 3-byte 24-bit ones, and that's the advantage there.
And don't forget the joys of DOS extenders!