February 1, 2018
My daughter bought a used car to get back and forth to college. Her favorite thing about it is that it has buttons and knobs, as opposed to the touchscreens and voice commands that dominate the user interfaces (UIs) in the cars my wife and I drive. Is the younger generation yearning for an old-school tactile experience? Probably not.
The physical button vs. touchscreen interface battle was already fought by Apple and BlackBerry. Buttons lost. My daughter, like most of us, has become accustomed to the relatively intuitive and responsive touchscreen UI on her smartphone. That is not the experience she had in my 2013 car. It was called “frustrating to use” in one of the more polite automotive reviews. The touchscreen in my wife’s 2015 car from a different automaker fared slightly better, being described by a reviewer as merely “difficult to use.”
When shopping for our current vehicles, we paid attention to factors like acceleration, fuel economy and price. We did not focus on how easy it would be to accidentally turn up the heater instead of the radio, how wearing gloves would prevent interaction with many of the cars’ features, or how good they were at recognizing our voices. Rookie mistakes. They were our first touchscreen- and voice-enabled cars, and we just assumed they would work like our touchscreen phones.
Learning from Experience
My wife and I, like most consumers, know better now. I have no doubt that the test drives for our next cars will be dominated not by driving, but by interacting with infotainment systems, environmental controls and navigation systems. Nor do I doubt that the experience will be much improved. Automakers have long known how important user experiences are: from the weight of the doors to the sound of the engine, the way a car feels differentiates it. Now that cars are essentially computers on wheels, and will soon be self-driving ones, the UI is the most important differentiator for many consumers, and automakers know it just has to work.
It’s not just the automotive industry that has learned from its UI mistakes. Many industries are still responding to the opportunities created by integrating electronics and software into their designs, even as artificial intelligence opens up new possibilities for interactive user experiences.
Take so-called smart speakers, for example. Along with connected, self-driving cars, they were the talk of CES, the annual technology show that took place last month. We received an Amazon Echo smart speaker for Christmas. The voice recognition technology is impressive. The speaker can usually pick up commands even if we’re in another room and even through background noise. The “smarts” used by the smart speaker are pretty impressive as well.
In time, the novelty of hardware being able to detect our voices and software being able to correctly identify what we want will wear off. We fickle consumers will quickly go from “I didn’t think it would be able to do that” to “Why can’t it always do that?” Already, my kids are disappointed that the Alexa-controlled Echo can’t add milk and eggs to the grocery list with one voice command instead of two.
The Bleeding Edge
Software is hard, and great UI is even harder. The latest generation of smartphones, cars and speakers has set the UI bar high for products in other industries. When it comes to user experience, it’s advisable to look before you leap into the latest interfaces.
For example, does the world need a smarter toilet? “Voice services and connected devices have become integrated into every facet of the home—with the notable exception of the bathroom,” Kohler president and CEO David Kohler said in a statement announcing the Numi, its Alexa-enabled toilet that made its debut at CES 2018. “Until now.”
He has a point, and hopefully the research to back it up. It takes considerable engineering and consumer behavior knowledge to get the user experience right. Our cover story in this issue explains the complex and rapidly advancing technologies that support voice-enabled products. We also take a look at how algorithms may soon be able to sense consumers’ emotions.
Someday your smart toilet may start playing “Splish Splash” when you tell your shower to begin running water, just because it senses your amusement. In the meantime, designers, engineers and marketers need to collaborate to decide when a button, knob or lever is still the best solution for a given application.