The correction isn’t realtime using the viewfinder … commenters have noted that it would likely overtax even the iPhone 4’s processing capability. As an academic point, I’m curious to hear from color-blind folks whether the correction approach actually works.
Actually, it seems that a plethora of apps has emerged to fill this niche, although many seem to fall into the “touch a color, see its name” variety…
Harking back to my March post on iPhone-enabled severe weather spotting/reporting, here’s a similar concept applied to birding.
A recent conversation about upgrading some of our NASA public exhibits to “self-narrated tour” capabilities has sent me down a speculative rabbit trail of thinking more broadly about virtual interactivity with physical-world objects. Museums have, in some ways, paved the way here: from self-guided Walkman tours to self-guided iPod Shuffle tours to museum cell phone tours, they have long offered portable end-user interactivity. But that interactivity is typically decoupled (absent user intervention) from real-world objects. I think (and I suspect many others have thought) that it’s not too far downstream that we’ll have sufficient standards, automation, and capabilities in place to allow key (if not many) real-world objects to have accessible, virtual-world analogues (yep – full circle to the Coffee Pot Webcam). Read more
It has always amazed me how poorly served the color-blind population is when it comes to automated, seamless solutions and adaptations to their condition in the digital and print worlds. I first realized that color blindness was nowhere near as rare as I had thought while working as a scientist and making very heavy use of color encoding in my data visualizations; this was an effective way to learn that a full 2% of the population are dichromats, and around 5% have some form of color blindness. Read more
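To make the visualization problem concrete, here’s a minimal sketch of why red/green color encoding fails dichromats. It applies a commonly cited linear matrix approximation of deuteranopia (the exact matrix values and the simple per-channel treatment are simplifying assumptions, not a faithful physiological model) and compares how far apart two palette colors are before and after simulation:

```python
# Sketch: simulate deuteranopia on RGB colors and measure how much
# the perceptual separation between two palette colors collapses.
import math

# Approximate deuteranopia transform (assumed values from a commonly
# used linear simulation; real models like Brettel/Vienot or Machado
# et al. work in linearized RGB and are more accurate).
DEUTERANOPIA = [
    [0.625, 0.375, 0.0],
    [0.700, 0.300, 0.0],
    [0.000, 0.300, 0.7],
]

def simulate_deuteranopia(rgb):
    """Apply the 3x3 simulation matrix to an (r, g, b) tuple."""
    return tuple(
        int(sum(row[i] * rgb[i] for i in range(3)))
        for row in DEUTERANOPIA
    )

def distance(c1, c2):
    """Euclidean distance between two RGB colors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

red, green = (255, 0, 0), (0, 255, 0)
before = distance(red, green)
after = distance(simulate_deuteranopia(red), simulate_deuteranopia(green))
print(f"separation before: {before:.0f}, after: {after:.0f}")
```

Running this shows the red/green separation shrinking dramatically under simulation, which is exactly why colormaps that lean on red-vs-green contrast are a poor choice for roughly 1 in 20 viewers.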
OK, so my first entry is a riff off of an existing idea, but I think it has real legs.
The inspiration comes from a recent NPR Science Friday story about the National Phenology Network, which gets citizen volunteers to share their observations of plant phenology (first bloom, etc.) to help monitor climate data.
Neat, but I’m thinking it’s a fairly “niche market,” and a web interface, while it will get the job done, may not be the best way to encourage participation (during the interview, iPhone, Twitter, and other future interfaces were briefly mentioned).
A much, much larger “market” could easily exist with severe weather spotting, and I’m zeroing in, VFR-direct, on the iPhone app concept. Read more