Emotion sensors that change the context based on the user’s mood.
AutoEmotive is a project of MIT’s affective computing group that explores the potential of emotional connections between people and machines. The goal of the project is to implement mood recognition technology in cars, in order to create a car that can empathise with the driver and improve the experience behind the wheel.
By complementing the infrastructure already present in the large majority of vehicles with MIT’s emotional sensors, the car could automatically adjust the position of the seat, lower the music volume and the brightness of the dashboard, or extend the range of the headlights to compensate for reduced visibility. All of this relies on face recognition technology and on the electrical signals of the skin recorded by sensors integrated into the steering wheel or door handles.
In addition to adapting the car to the driving context and the driver’s mood, this system would also help prevent many of the situations responsible for most traffic accidents. Drowsiness on long trips, emotional stress caused by traffic, or simply a minor distraction at the wheel could be detected by the vehicle, triggering a series of safety mechanisms to help bring the situation under control.
Face recognition is one of the most widely used methods for detecting moods. In fact, data from more than a billion facial expressions have been used to train algorithms that can recognise and classify basic emotions such as anger or happiness with up to 90% precision.
Companies like Affectiva or Emotient, which specialise in face recognition, provide businesses and advertisers with software for detecting facial movements, in order to monitor the emotional reactions of consumers to advertising campaigns or to products being prepared for launch on the market.
With the development of increasingly sophisticated emotional sensors and a much more complete record of mood metrics, this technology is also sparking new developments in the mobile app sector.
One of the best known is MoodScope, a mobile app developed by Microsoft that measures the user’s mood based on the activity recorded on the device and shares it through the user’s network of contacts. Mappiness is another application that, like MoodScope, uses the device’s internal sensors to determine the user’s mood and reaction to external stimuli in real time.
Medicine is another area interested in the use of biosensors trained with face recognition techniques to diagnose illnesses faster and more accurately. In fact, emotion sensors have spurred new psychiatric techniques for determining a patient’s mood based on the recognition of basic emotions like fear or anger.
Connecting these algorithms with other types of applications could complement so-called personal assistants, offering personalised information based on your mood. It could also help you know the best time to talk to a particular contact and set the proper tone of the conversation, depending on whether you are happy or displeased.
However, some experts point out that, the potential benefits of emotional connections with machines aside, there are still unresolved questions that could affect user privacy.
Images | Pixabay