
Advancing AI Emotion Sensors In The Transportation And Logistics Industry


In the first blog of this AI transportation and logistics series, I featured the AI transformation innovations at Purolator, and the second blog focused on the acceleration of smarter AI telematics infrastructure in fleet management. This third blog explores AI emotion sensors and the impact that the affective computing market will have on the transportation and logistics industry.

We are now entering a space in which smarter things are embedded in everything.

We can already monitor a driver’s vehicle movements to assess safety risks from sharp turns, speeding, and braking intensity. Sensors are moving into automobile steering wheels so we will know the tension a driver may be feeling from his or her grip (too tight a grip may be classified as tense rather than relaxed, and the system can monitor whether that tension continues all day or eases off later). Taking this a step further, research on gait and body posture, and on how humans walk to and from their vehicles, can also determine whether a person’s posture is upright with chin forward (which may be classified as confident) or head lowered (which may be classified as deep in thought, or perhaps sad).
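For illustration only, here is a minimal sketch of how that kind of grip-tension classification might work, assuming a hypothetical stream of normalized grip-pressure readings from steering wheel sensors and a made-up threshold; a production system would calibrate both per driver and per sensor.

```python
from statistics import mean

# Hypothetical threshold on normalized grip pressure (0.0 = no grip, 1.0 = maximum).
TENSE_GRIP_THRESHOLD = 0.75

def classify_grip(pressure_samples):
    """Label a window of normalized grip-pressure samples as 'tense' or 'relaxed'."""
    return "tense" if mean(pressure_samples) >= TENSE_GRIP_THRESHOLD else "relaxed"

def tension_over_day(windows):
    """Track whether tension persists all day or eases off later.

    `windows` is a chronological list of (hour, pressure_samples) tuples.
    """
    timeline = [(hour, classify_grip(samples)) for hour, samples in windows]
    tense_hours = [hour for hour, label in timeline if label == "tense"]
    eased_off = bool(tense_hours) and timeline[-1][1] == "relaxed"
    return timeline, eased_off

# Example: a driver who is tense in the morning but relaxed by late afternoon.
day = [(9, [0.82, 0.79, 0.85]), (13, [0.71, 0.68, 0.74]), (17, [0.40, 0.35, 0.38])]
timeline, eased = tension_over_day(day)
print(timeline)                 # [(9, 'tense'), (13, 'relaxed'), (17, 'relaxed')]
print("eased off later:", eased)  # True
```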

It seems everyone in the automotive industry is thinking outside the box to secure a complete view of human behavior 24×7 and bring man and machine closer together.

How far these innovations will go, how they will be integrated into societal norms, and what ethical and privacy implications they will raise is still uncharted territory. What is clear is that the changes are well underway.

Already you can see how fast Samsung is bringing more vehicles into its branded SmartThings ecosystem. Although currently focused on car innovations, such as starting or stopping the engine of compatible vehicles from the SmartThings mobile app and activating the heater or AC before you step inside, these early projects offer only a glimpse of what is to come. Samsung has already teamed up with Google to bring a SmartThings dashboard to Android Auto, so you will be able to control your smart home products or open your garage door from your dashboard after connecting it to a Samsung phone.

In October, Samsung announced that SmartThings support was coming to Mercedes-Benz’s MBUX Voice Assistant for hands-free smart home device control.

Understanding the Affective Computing (Emotional AI) Segment

The Affective Computing or Emotional AI segment is a market projected to grow from USD 28.6 billion in 2020 to USD 140.0 billion by 2025, at a CAGR of 37.4% during the forecast period. Emotional AI, or emotional artificial intelligence, enables computer systems and algorithms to recognize and interpret human emotions by tracking facial expressions, body language, or voice and speech.
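As a quick sanity check on those figures, the implied compound annual growth rate over the five-year forecast window can be worked out directly; this is a back-of-the-envelope calculation, not a figure taken from the market report itself.

```python
# Implied CAGR from USD 28.6B (2020) to USD 140.0B (2025): (end / start) ** (1 / years) - 1
start, end, years = 28.6, 140.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~37.4%, matching the projected growth rate
```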

Emotion AI is striving to bring man and machine closer together.

Facial recognition in vehicles will enable cars or trucks to know who the authorized driver is, automatically recognize you, and follow your voice commands. Computer vision algorithms are now very accurate: they can identify an individual’s face, decompose it into landmarks (eyes, tip of the nose, eyebrows, corners of the mouth, and so on), and then track a person’s movements to also identify his or her emotions.

This is currently done by comparing observed expressions against large databases of facial expressions that map facial gestures to emotion types (joy, sadness, anger, contempt, disgust, fear, and surprise). Additional software can augment the emotion classification with facial identification and verification, age and gender detection, ethnicity and multi-face detection, and much more.
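A heavily simplified sketch of that comparison step is shown below. The landmark vectors and reference templates here are placeholders standing in for the output of a real face-landmark detector and a large labeled expression database; production systems typically use trained deep classifiers rather than nearest-template matching, but the intuition of matching observed facial geometry against known expression patterns is the same.

```python
import numpy as np

# The seven emotion types named above.
EMOTIONS = ["joy", "sadness", "anger", "contempt", "disgust", "fear", "surprise"]

# Placeholder reference templates: one averaged landmark vector per emotion,
# standing in for a large labeled database of facial expressions.
rng = np.random.default_rng(0)
REFERENCE_TEMPLATES = {emotion: rng.random(136) for emotion in EMOTIONS}  # 68 (x, y) landmarks

def classify_emotion(landmarks: np.ndarray) -> str:
    """Return the emotion whose reference template is closest to the observed landmarks."""
    distances = {
        emotion: np.linalg.norm(landmarks - template)
        for emotion, template in REFERENCE_TEMPLATES.items()
    }
    return min(distances, key=distances.get)

# In a real pipeline the landmark vector would come from a face detector and
# landmark model run on each video frame; here we fabricate one observation.
observed = REFERENCE_TEMPLATES["surprise"] + rng.normal(0, 0.01, 136)
print(classify_emotion(observed))  # "surprise"
```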

Then we add into this mix voice recognition software to complement and correlate with the facial recognition software to recognize…



