iPhone Color-Blindness Correction
It has always amazed me how poorly served the color-blind population is when it comes to automated, seamless solutions and adaptations to their condition in the digital and print worlds. I first realized that color blindness was nowhere near as rare as I had thought while working as a scientist and making very heavy use of color encoding in my data visualizations; that was an effective way to learn that a full 2% of the population are dichromats, and around 5% have some form of color blindness.
The details of the visualizations below aren’t terribly important; suffice it to say that what the viewer needs to “get” is that different colors (in this case, types of cloud/storm vertical profiles) live coherently in different areas of the data space (here’s the full writeup). Deuteranopes are significantly disadvantaged when I show them this chart.
What is particularly perplexing is that automated solutions should be fairly trivial. Mathematically, this is just a problem of finding an optimal transformation (mapping) from a three-dimensional color space to a two-dimensional one (for dichromacy), with some adjustable parameters for a particular individual's type and degree of color blindness. Not hard, and easily achievable in real time given current computing power.
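To make the "3D to 2D" point concrete, here is a minimal sketch of simulating deuteranopia: convert RGB to an LMS (cone-response) space, zero out the independent M-cone contribution by replacing it with a combination of L and S, and convert back. The matrix constants are the approximate values from the widely circulated Fidaner et al. "daltonize" code, used here purely for illustration, not a calibrated model.

```python
import numpy as np

# Approximate RGB -> LMS (cone response) conversion; illustrative values
# from the commonly cited Fidaner et al. daltonize code.
RGB2LMS = np.array([[17.8824,   43.5161,  4.11935],
                    [3.45565,   27.1554,  3.86714],
                    [0.0299566, 0.184309, 1.46709]])
LMS2RGB = np.linalg.inv(RGB2LMS)

# Deuteranope projection: the M (green-sensitive) cone response is
# replaced by a linear combination of L and S, collapsing the 3D color
# space onto a 2D plane. Constants chosen so neutrals map to themselves.
DEUTER = np.array([[1.0,      0.0, 0.0],
                   [0.494207, 0.0, 1.24827],
                   [0.0,      0.0, 1.0]])

def simulate_deuteranopia(rgb):
    """Project an RGB color onto the subspace a deuteranope perceives."""
    lms = RGB2LMS @ np.asarray(rgb, dtype=float)
    return LMS2RGB @ (DEUTER @ lms)

red_as_seen = simulate_deuteranopia([255, 0, 0])
gray_as_seen = simulate_deuteranopia([128, 128, 128])
```

Note two sanity properties of any such projection: grays are (approximately) unchanged, and applying the simulation twice gives the same result as applying it once, since it is a projection onto a plane.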
These folks have some rudimentary “Daltonize” algorithms assembled on a web page, as well as a Photoshop filter, which illustrate what the results could be (there’s also a Firefox extension to test/simulate). Interestingly, the simple algorithm doesn’t do terribly well on my test cases above; I think this is because the visualizations, by design, arrange “related” colors next to each other, so the Daltonize algorithm’s ability to boost contrast is overridden by the brain’s desire to overlay coherence.
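The basic Daltonize idea is simple: simulate what the dichromat sees, take the difference from the original (the information they are losing), and shift that error into channels they can still distinguish. A self-contained sketch, reusing the same illustrative LMS-projection constants as above (again from the Fidaner et al. code, and only an assumption about tuning):

```python
import numpy as np

# Illustrative RGB -> LMS matrices and deuteranope projection
# (approximate constants from the Fidaner et al. daltonize code).
RGB2LMS = np.array([[17.8824,   43.5161,  4.11935],
                    [3.45565,   27.1554,  3.86714],
                    [0.0299566, 0.184309, 1.46709]])
LMS2RGB = np.linalg.inv(RGB2LMS)
DEUTER = np.array([[1.0,      0.0, 0.0],
                   [0.494207, 0.0, 1.24827],
                   [0.0,      0.0, 1.0]])

# Error-redistribution matrix: push the invisible red/green error
# into the green and blue channels the viewer can still separate.
SHIFT = np.array([[0.0, 0.0, 0.0],
                  [0.7, 1.0, 0.0],
                  [0.7, 0.0, 1.0]])

def daltonize(rgb):
    """Boost a color's distinguishability for a deuteranope."""
    rgb = np.asarray(rgb, dtype=float)
    simulated = LMS2RGB @ (DEUTER @ (RGB2LMS @ rgb))
    error = rgb - simulated              # information the viewer loses
    return np.clip(rgb + SHIFT @ error, 0, 255)

corrected_gray = daltonize([128, 128, 128])
corrected_red = daltonize([255, 0, 0])
```

Because grays carry no red/green error, they pass through nearly unchanged; saturated reds and greens are the colors that get shifted. This per-pixel linear algebra is exactly the kind of workload a phone GPU handles trivially in real time.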
Great opportunity for:
- A real-time, iPhone-based color-correction “color loupe”: simply a tap into the iPhone’s onboard camera viewfinder, with some back-end real-time color correction. At minimum, this would help color-blind individuals in what (I assume) are all those frustrating instances (not just scientific data visualization) where data are encoded as colors. Based on near-term iPhone usage (17m sold through Dec 08) and the stats above, this is something like a 250,000–1,000,000 person market. Nice return at a $0.99–$1.99 price point.
- Apple to get in the game, and gain an edge with the color-blind 5% of the Windows user base, via native, real-time Mac color correction (including support for print correction via ColorSync). Actually, given how forward-looking they have been with Universal Access, it’s just plain mystifying that they haven’t hit this niche by now. (Note: there appears to be a Linux/GNOME applet which provides some type of filtering.)
What am I missing? Has someone snapped up all the patents and squirreled them away, blocking any kind of systematic solutions? This is way overdue…