BY: Ashley Yost, Communications Manager for Arizona State University
Arizona State University (ASU) is taking the competition to the next level with its team’s approved innovation topic. The team has chosen to tackle driver distraction by allowing the driver to interact with the entire car, not just the entertainment system. This multi-modal approach will let drivers communicate with the vehicle through visual, touch, and speech-based methods.
ASU chose this topic because speech recognition has become common in today’s vehicles. This type of interaction was ground-breaking when first introduced, but speech recognition alone does not appear to be the ultimate solution for driver-to-vehicle interaction.
Humans mainly communicate in three ways: verbal, para-verbal, and non-verbal. Verbal communication is speech itself, para-verbal is the tone of speech, and non-verbal includes body language, hand movements, and lip movements. By extending the vehicle’s ability to recognize all three, the team aims to reduce driver distraction.
Cars have evolved into computers on wheels, creating a need to bring these technologies into the cockpit. Arizona State is looking forward to the integration process and to bringing its research to reality.