The purpose of this project is to create a system that sits unobtrusively on the user's person (similar to Google Glass) and monitors his or her emotional health using facial and pupil analysis.

This project is solving the Space Wearables: Fashion Designer to Astronauts challenge.


**Project Overview**

"Emotional Health" (EH) is the result of an international collaborative effort amongst team members from Boston, Tokyo, and India.

The system monitors a user's psychological well-being through facial analysis, using a mounting system similar to that of Google Glass. The headset itself is a lightweight, unobtrusive design with an inward-facing camera (equipped with a polarized, wide-angle lens) that feeds video of the user's facial expressions to analytics software. The software sends the resulting data to both a real-time graph and an Evernote-based diary that summarizes each day's emotional trends in a single sentence. The end result is a product whose short- and long-term tracking helps safeguard the mental stability of its users.
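The diary's one-sentence daily summary could be produced along these lines. This is a minimal sketch: the function name and the simple dominant-emotion heuristic are assumptions for illustration, not the team's actual implementation.

```python
def summarize_day(frames):
    """Reduce per-frame emotion scores to a one-sentence diary entry.

    `frames` is a list of dicts mapping emotion names ("happy", "sad",
    "angry", "surprised") to scores in [0, 1], one dict per video frame.
    """
    emotions = ("happy", "sad", "angry", "surprised")
    # Average each emotion's score over the whole day.
    totals = {e: 0.0 for e in emotions}
    for frame in frames:
        for e in emotions:
            totals[e] += frame.get(e, 0.0)
    n = max(len(frames), 1)
    averages = {e: totals[e] / n for e in emotions}
    # The dominant average emotion drives the summary sentence.
    dominant = max(averages, key=averages.get)
    return (f"Today you mostly appeared {dominant} "
            f"(average score {averages[dominant]:.2f}).")
```

For example, two frames scoring `happy` at 0.8 and 0.6 average to 0.70, so the entry reads "Today you mostly appeared happy (average score 0.70)."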

This setup would be especially helpful for monitoring astronauts, who spend much of their time in potentially high-stress environments. Furthermore, EH's accessibility encourages its users to be more cognizant of their own attitudes and of the psychological effects of their surroundings. This propensity toward self-awareness could prove vital for space colonies, which would, at least initially, be isolated from the mediation of mission control. At the same time, EH would allow mission control to monitor the well-being of a habitat's inhabitants, automatically keeping control apprised of at least the broad strokes of a situation.

**Predicting Emotional State of Astronauts**

Using the data output from the device, we are building a statistical model to predict the overall emotional state of NASA astronauts.

Statistical results:

Theoretical prediction model:

    Y(emotional state) = β_0 − β_angry·x_angry − β_sad·x_sad − β_surprised·x_surprised + β_happy·x_happy + ε

Fitted prediction model:

    Ŷ(emotional state) = 1.58245 − 0.405465·x_angry − 21.4318·x_sad − 4.17484·x_surprised + 19.7326·x_happy

Each coefficient (β) quantifies the relationship between the corresponding emotion detected by the device and the odds of a positive emotional state for the astronaut.
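As a worked illustration, the fitted model above can be applied to a set of per-emotion scores. Because the description speaks of odds, the sketch below assumes a logistic link and converts the linear predictor into a probability; the function name and example inputs are assumptions, only the coefficients come from the model above.

```python
import math

# Fitted coefficients from the prediction model above.
INTERCEPT = 1.58245
COEFFS = {"angry": -0.405465, "sad": -21.4318,
          "surprised": -4.17484, "happy": 19.7326}

def predict_positive_state(scores):
    """Return (linear predictor, probability of a positive emotional state).

    `scores` maps emotion names to values in [0, 1], as output by the
    facial-analysis software. The logistic transform is an assumption
    based on the text's mention of odds.
    """
    y = INTERCEPT + sum(COEFFS[e] * scores.get(e, 0.0) for e in COEFFS)
    prob = 1.0 / (1.0 + math.exp(-y))
    return y, prob

# A mostly-happy reading yields a high probability of a positive state.
y, p = predict_positive_state({"happy": 0.6, "sad": 0.05,
                               "surprised": 0.1, "angry": 0.1})
```

With these inputs the linear predictor works out to about 11.89, corresponding to a probability very close to 1; a strongly sad reading drives the probability toward 0.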

**Additional Information**

Deliverables:

  • functional emotion analysis software
  • functional UI for results / data display (either website or smartphone app)
  • 3D design and model of the camera / mount rig
  • Evernote diary-entry-based log page

Implementation:

  • facial expression analysis
  • pupil analysis / heat map (future iterations)

Software:

  • clmtrackr
  • additional software for display / user interface
  • Evernote Developer for diary log (https://sandbox.evernote.com/)
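Writing the diary entry to Evernote requires wrapping the text in ENML, Evernote's XML note-content format. The sketch below only builds the note body; the helper name is an assumption, and the actual upload through the Evernote SDK is omitted.

```python
def make_enml_entry(summary):
    """Wrap a one-sentence daily summary in a minimal ENML document.

    ENML is Evernote's required note-content format: an XML prolog,
    the en-note DOCTYPE, and the content inside an <en-note> element.
    """
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<!DOCTYPE en-note SYSTEM "http://xml.evernote.com/pub/enml2.dtd">'
        f"<en-note><div>{summary}</div></en-note>"
    )
```

The resulting string would be set as the `content` of a note created through the Evernote API's note store.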

Hardware Plan:

  • small camera (6.3mm minimum focal length, wide-angle lens, polarized against glasses reflection)
  • compact mounting rig to place the camera on glasses

**Project Information**

License: MIT license (MIT)

Source Code/Project URL: https://github.com/ISAC2014-TEAM4


clmtrackr - https://github.com/auduno/clmtrackr
Evernote - http://dev.evernote.com/


  • Sushmitha Choudhary
  • Kela Roberts
  • Everton Yoshitani
  • Jun Kawasaki
  • Yuichi Yazaki
  • Viplav Valluri
  • Zalika Corbett