Carmakers Prepare For Augmented Reality Driving
An anonymous reader writes "Car manufacturers at CES are showing off their future integration of mobile computing technologies and automobiles. Quoting CNN: 'As digital tech — and our expectations for it — becomes more mobile, carmakers are taking notice. Many automotive designers here seem to have taken inspiration from smartphones, with their promise of being always connected and their vast menu of apps for every purpose. ... Simply point your hand at them, and the icons open to show real-time information: when that bridge over there was built, what band is playing at that nightclub on the left, whether that new café up the street has any tables available. Wave your hand again, and you've made a restaurant reservation. ... All these advancements may make driving more interesting. Or they may spoil one of modern society's last refuges from the hyper-connected digital world. Either way, they are coming soon.'"
Not in North Carolina (Score:1, Informative)
Operating such an interface would be against the law here. Thankfully.
If industry lobbyists manage to get this legalized, expect this to directly increase car wrecks and fatalities.
Head-Up Displays are double edged swords (Score:5, Informative)
In reality, the auto companies and their partners in university labs have been doing research on HUDs for a while. The augmented reality approach has been tried in research studies following successes in the aviation community. However, there are huge differences between augmented reality for cars and augmented reality for planes or pedestrians. The point of this post is not that HUDs are bad or unlikely to succeed, but rather that the designers of trade show concepts are ignoring much of the existing research. The concepts in TFA are unlikely to become actual products due to safety issues. Expect simpler HUDs focused on safety-oriented problems. Here are some of the safety problems:
First, cars tend to hit things sooner. This is a crude point about recovery time, but a major one: at road distances a driver has far less time to notice and recover from an error than a pilot does.
Second, there is considerably more variation in scene brightness due to driving speeds and local factors like buildings and trees. This makes the HUD imagery hard to perceive. Demos on trade show floors and in labs usually gloss over this factor.
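To make the brightness problem concrete, here is a deliberately simplified sketch of how a HUD might scale its output against measured background luminance. The function name, parameter values, and contrast ratio are all illustrative assumptions, not taken from any real automotive spec:

```python
def hud_brightness(ambient_nits, min_nits=50.0, max_nits=15000.0, contrast_ratio=1.3):
    """Scale HUD luminance so symbols stay readable against the scene.

    ambient_nits: measured background luminance behind the HUD image.
    contrast_ratio: desired symbol-to-background luminance ratio.
    (Names and numbers here are illustrative, not a real spec.)
    """
    target = ambient_nits * contrast_ratio
    # Clamp to the display's physical output range.
    return max(min_nits, min(max_nits, target))
```

The catch the demos hide: driving under trees or past buildings makes ambient_nits swing by orders of magnitude within a second or two, and a display that lags behind is momentarily washed out or blinding.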
Perhaps the biggest concern is that humans make perception errors due to the way our brains integrate augmented reality with the real world. First is the issue of cognitive capture. This is when you ignore the real world and rely on the HUD for your information. For example, the collision warning system may highlight all the moving vehicles, so you learn to look only for the highlighting. Unlike a video game where every object is known, automotive sensing doesn't work 100% of the time and objects will be missed. Cognitive capture is when you fail to perceive the kid running into the middle of the street because he wasn't highlighted. This can be demonstrated easily in the lab, and many studies have concrete evidence of it.
The second perception problem is that HUDs can lead to misperception of distance. A HUD can only have one focal length, while the real world has an infinite range of focal depths. Mismatches can lead to the driver misjudging the distance of an object. This isn't a problem when flying (everything is at optical infinity) or walking (you're moving too slowly), but it can cause problems when driving.
The third perception problem is masking. This is when the information about the new cafe covers the pedestrian crossing the street.
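A minimal way to reason about masking is as a screen-space occlusion test: never draw an informational overlay on top of a detected hazard. This is purely an illustrative sketch (the function names and tuple layout are assumptions); a real system would reposition overlays in 3D gaze coordinates rather than simply dropping them:

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test; rects are (x, y, width, height)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def visible_overlays(overlays, hazards):
    """Return only the informational overlays that do not cover a hazard.

    overlays, hazards: lists of screen-space rectangles (x, y, w, h).
    Deliberately simplified: drop-on-conflict instead of relayout.
    """
    return [o for o in overlays if not any(rects_overlap(o, h) for h in hazards)]
```

Of course this inherits the cognitive-capture problem from above: the cafe info box only yields to the pedestrian if the sensors detected the pedestrian in the first place.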
If your organization has access to this paper [nih.gov], it is an excellent primer on the issues. And yes, it was written in 1997.