Honda and SoftBank Team Up to Make Cars That Can Feel Emotion
Japanese telecoms and Internet giant SoftBank has announced that it will team up with Honda to create cars that can communicate with their human pilots on an emotional level. The proposed touchy-feely machines would utilize advanced artificial intelligence capable of reading a human’s emotional state and computing context-based awareness, effectively turning an everyday conveyance into a four-wheeled digital buddy.
According to a recent report, SoftBank sees myriad applications for the technology, including both practical uses (offering help with parking) and psychological benefits (keeping the driver company on a long trip).
The proposal comes on the heels of SoftBank’s $32 billion acquisition of ARM Holdings, a U.K.-based computer chip design company that provides hardware for a slew of smart devices, including phones, tablets, and TVs, as well as automotive applications (ABS sensors, etc.).
SoftBank has already made significant strides in the areas of robotics and AI technology. The company’s most famous advance is Pepper, a humanoid robot introduced in 2014 that uses an array of sensors, cameras, and microphones to “perceive” human emotion and provide “genuine day-to-day [companionship].”
Of course, Honda brings more to the table than just its expertise in building cars. The automaker also has a robot of its own – ASIMO (Advanced Step in Innovative Mobility), which is capable of recognizing movement, expressions, posture, and gestures, as well as moving in a bipedal, human-like manner.
Masayoshi Son, the billionaire CEO of SoftBank, had this to say: “Imagine if robots, with their super intelligence, devoted themselves to humans. And imagine that cars themselves became supercomputers or robots one day. Honda will be the first to adopt this technology.”
Why Does It Matter?
The last few months have been pretty damn interesting when it comes to automotive future tech. Most of the news has pertained to fully autonomous vehicles, a.k.a. driverless cars. Controversy over the readiness of autonomous vehicles was sparked when three separate high-profile crashes involving Tesla’s Autopilot system came to light, leading to a string of governmental investigations and a good amount of consumer anxiety.
But the debate over whether or not autonomous vehicles are ready for the road hasn’t stopped several of the major manufacturers from pushing the technology in the press. Most recently, Hyundai outlined various applications for autonomy in a report on 12 future “megatrends” that show promise in revolutionizing the industry, while Mercedes framed the E-Class as “taking the next step on the road to autonomous driving.” Tesla CEO Elon Musk even defended Autopilot in a 1,500-word blog post, writing, “As the technology matures, all Tesla vehicles will have the hardware necessary to be fully self-driving.”
Long story short – the race to full autonomy is on, and it appears as though we are on the cusp of an industry-wide watershed moment.
Naturally, this raises the question – what comes next?
According to SoftBank and Honda, the answer is a smart car that can perceive emotion. But here’s the thing: it’s not a new idea.
We’ve seen it pop up from time to time in various forms over the years. In 2015, the Swiss automaker Rinspeed brought the Budii concept to the Geneva Motor Show, an all-electric “friend on wheels” capable of learning and adapting to the preferences of its driver.
Before that, Toyota offered up the FV2 concept at the 2013 Tokyo Motor Show, a three-wheeled stand-up motorcycle that could use facial and voice recognition AI to discern the driver’s mood, even making suggestions about possible destinations.
With the rapid progression of modern technology and competition amongst manufacturers to offer the bleeding edge in consumer features, it now looks like an emotional car could very well become a reality in the not-so-distant future.
How Does It Work?
Making stuff like autonomous vehicles and emotional robots actually work requires the combination of several different technologies. Here’s the basic rundown:
An artificial intelligence program would discern an individual’s emotional state through a variety of different inputs. Microphones would pick up speech patterns (volume, word choice, speed, etc.), while cameras would observe the individual’s facial expression and posture. Infrared sensors would register the individual’s body temperature, while further sensors would take into account perspiration levels, respiration rates, and similar indicators. From the data collected, the AI would then have a general idea of how the individual felt.
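To make the sensing step concrete, here’s a deliberately simple sketch of how those inputs might be fused into a mood estimate. The sensor fields, thresholds, and three-way mood labels are all hypothetical illustrations, not anything SoftBank or Honda has described; a production system would presumably use trained models rather than hand-picked cutoffs.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    speech_volume_db: float   # from cabin microphones
    speech_rate_wpm: float    # words per minute, from speech recognition
    body_temp_c: float        # from an infrared sensor
    respiration_bpm: float    # breaths per minute

def infer_mood(reading: SensorReading) -> str:
    """Rough rule-based fusion of the inputs described above (illustrative only)."""
    stress_signals = 0
    if reading.speech_volume_db > 70:   # louder than normal conversation
        stress_signals += 1
    if reading.speech_rate_wpm > 180:   # unusually rapid speech
        stress_signals += 1
    if reading.body_temp_c > 37.2:      # elevated skin temperature
        stress_signals += 1
    if reading.respiration_bpm > 20:    # fast breathing
        stress_signals += 1
    if stress_signals >= 3:
        return "stressed"
    if stress_signals == 0:
        return "calm"
    return "neutral"
```

Four independent signals voting toward one label is the simplest possible version of the multi-sensor fusion the paragraph describes; the real value of combining inputs is that no single noisy reading (a loud radio, a warm day) can flip the estimate on its own.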
From there, the program could take a multitude of suitable actions completely autonomously. A few examples include selecting appropriate in-car entertainment (music, movies, etc.), adjusting the ambient lighting, or fine-tuning the seating position. Or, if the AI were advanced enough, it could simply engage the individual in conversation.
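The action side can be sketched just as simply: a lookup from inferred mood to cabin adjustments. The mood labels and action strings here are hypothetical placeholders for whatever climate, audio, and seating subsystems the car actually exposes.

```python
def respond_to_mood(mood: str) -> list[str]:
    """Map an inferred mood to cabin adjustments (illustrative only)."""
    actions = {
        "stressed": ["play soothing music", "dim ambient lighting", "recline seat slightly"],
        "calm":     ["resume driver's usual playlist"],
        "neutral":  ["no change"],
    }
    # Fall back to doing nothing for any mood the table doesn't cover.
    return actions.get(mood, ["no change"])
```

Keeping the mapping data-driven like this, rather than burying it in conditionals, is what would let an onboard system learn and refine a driver’s preferences over time.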
So, What’s The Point Of An Emotional Car?
Let’s say you’ve had a rough day at work, and all you want to do is go home and relax. As soon as you step into your car, it recognizes your tense posture, elevated body temperature, and scowling face. Presuming it has self-driving capabilities, it’ll set the destination for your apartment, rather than the store to run that errand you’ve been meaning to get to. The seat is reclined a bit, the lights dimmed, and a little soothing music is played. And, presuming it has the proper psychological aptitude, it’ll ask about your day, letting you unload a little before walking in the front door.
Is That A Good Idea? What If My Car And I Don’t Get Along?
We’ve all seen the movies. Rogue AI is a danger that’s beginning to manifest in the real world, so of course, the idea of an emotional car must be tempered with caution. That said, we’re only now starting to understand the full potential of this kind of technology, and although it’s still on the horizon, it’s definitely worth your consideration.
Personally, I think it would be awesome to have a car buddy. As systems are automated and controls once relegated to human hands are entrusted to computers, the link between man and machine is starting to get lost. This kind of technology could completely reverse that trend, and to me, that’s exciting.
What do you think? Let us know in the comments.