How to detect emotions remotely with wireless signals

MIT researchers have developed “EQ-Radio,” a device that can detect a person’s emotions using wireless signals. By measuring subtle changes in breathing and heart rhythms, EQ-Radio is 87 percent accurate at detecting whether a person is excited, happy, angry, or sad, and can do so without on-body sensors, the researchers say.
 
Project lead Dina Katabi, a professor at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), envisions the system being used in health care and in testing viewers’ reactions to ads or movies in real time.
 
Using wireless signals reflected off people’s bodies, the device measures heartbeats as accurately as an ECG monitor, with a margin of error of approximately 0.3 percent, according to the researchers. It then studies the waveforms within each heartbeat to match a person’s behavior to how they previously acted in one of the four emotion states.
 
The team will present the work next month at the Association for Computing Machinery’s International Conference on Mobile Computing and Networking (MobiCom).
 
EQ-Radio sends wireless signals that reflect off of a person’s body and back to the device. To detect emotions, its beat-extraction algorithms break the reflections into individual heartbeats and analyze the small variations in heartbeat intervals to determine their levels of arousal and positive affect.
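The interval analysis described above can be pictured as computing heart-rate-variability-style statistics over the extracted beats. A minimal Python sketch, where the function name and feature set are illustrative assumptions rather than EQ-Radio’s actual algorithm:

```python
import numpy as np

def heartbeat_features(beat_times):
    """Summarize beat-to-beat timing variation from beat timestamps
    (in seconds) segmented out of the reflected signal.
    Illustrative features only; EQ-Radio also analyzes the full
    waveform shape within each beat."""
    ibi = np.diff(beat_times)  # inter-beat intervals
    return {
        "mean_ibi": float(ibi.mean()),  # average heart period
        "sdnn": float(ibi.std()),       # overall variability
        "rmssd": float(np.sqrt(np.mean(np.diff(ibi) ** 2))),  # short-term variability
    }
```

Variability measures like these are classic physiological proxies for arousal: a perfectly steady heartbeat yields zero variability, while emotional arousal changes the beat-to-beat spacing.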
 
These measurements are what allow EQ-Radio to detect emotion. For example, a person whose signals correlate to low arousal and negative affect is more likely to be tagged as sad, while someone whose signals correlate to high arousal and positive affect would likely be tagged as excited.
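The mapping from arousal and affect to an emotion label amounts to picking a quadrant in a two-dimensional space. A hedged sketch, assuming zero-centered scores; the simple thresholding here is an illustrative stand-in for EQ-Radio’s learned classifier:

```python
def classify_emotion(arousal, valence):
    """Map an (arousal, valence) pair to one of the four emotion
    labels by quadrant. Zero-centered scores are an assumption
    made for illustration."""
    if arousal >= 0:
        return "excited" if valence >= 0 else "angry"
    return "happy" if valence >= 0 else "sad"
```

For instance, `classify_emotion(-0.5, -0.5)` returns `"sad"`, matching the low-arousal, negative-affect example above.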
 
The exact correlations vary from person to person, but are consistent enough that EQ-Radio could detect emotions with 70 percent accuracy even when it hadn’t previously measured the target person’s heartbeat. In the future, it could be used in non-invasive health monitoring and diagnostic settings.
 
For the experiments, subjects used videos or music to recall a series of memories that each evoked one of the four emotions, as well as a no-emotion baseline. Trained just on those five sets of two-minute videos, EQ-Radio could then accurately classify the person’s behavior among the four emotions 87 percent of the time.
 
One of the challenges was to tune out irrelevant data. To get individual heartbeats, for example, the team had to dampen the breathing, since the distance that a person’s chest moves from breathing is much greater than the distance that their heart moves to beat.
 
To do so, the team focused on the acceleration of the reflected signal rather than the distance traveled: the rise and fall of the chest with each breath is much smoother and more consistent, and therefore has a lower acceleration, than the motion of the heartbeat.
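The effect can be sketched with a second finite difference: differentiating a signal twice scales each frequency component by the square of its frequency, so the slow, smooth breathing motion is attenuated relative to the faster heartbeat motion. A toy Python illustration; the amplitudes and frequencies are made-up stand-ins, not measured values:

```python
import numpy as np

def acceleration(displacement, fs):
    """Second finite difference of a displacement signal sampled
    at fs Hz, approximating its acceleration."""
    dt = 1.0 / fs
    return np.diff(displacement, n=2) / dt ** 2

fs = 100  # sampling rate (Hz), toy value
t = np.arange(0.0, 10.0, 1.0 / fs)
breathing = 5.0 * np.sin(2 * np.pi * 0.25 * t)  # large, slow chest motion
heartbeat = 0.1 * np.sin(2 * np.pi * 1.2 * t)   # small, faster heart motion

# In displacement, breathing dwarfs the heartbeat; in acceleration the gap shrinks.
ratio_disp = np.abs(breathing).max() / np.abs(heartbeat).max()
ratio_acc = (np.abs(acceleration(breathing, fs)).max()
             / np.abs(acceleration(heartbeat, fs)).max())
```

With these toy numbers, the breathing-to-heartbeat amplitude ratio drops from about 50:1 in displacement to roughly 2:1 in acceleration, which is why working in acceleration makes the heartbeat far easier to isolate.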