Systems using the Mind for Latent Expression (SMILE) is developing BCI-controlled wearable technology for expressing emotion. SMILE aims to fill a gap in amyotrophic lateral sclerosis (ALS)-specific augmentative and alternative communication (AAC): visual emotive communication, which has largely gone unresearched in favor of advances in computerized text-to-speech.
SMILE is currently developing MoodLens, a pair of emotional expression AAC glasses. The effort is divided into three projects:
- BCI Control: a control paradigm for MoodLens using EEG signals
- BCI Affect Detection: studying emotional information in EEG data to improve the user experience
- Display: improving the design (ergonomics, form factor, feedback) of MoodLens
If you are interested in working on these projects or have other inquiries, contact Angela Vujic, project lead.
Vujic, A., Starner, T., & Jackson, M. (2017). Wearable, visual emotional expression glasses for ALS-specified AAC. Proceedings of the 32nd Annual CSUN Assistive Technology Conference.
Vujic, A., Starner, T., & Jackson, M. (2016). MoodLens: Towards improving nonverbal emotional expression with an in-lens fiber optic display. Proceedings of the 2016 ACM International Symposium on Wearable Computers. ACM.
Front row, left to right: Kaela Frazier, Angela Vujic (project lead), Kantwon Rogers
Back row, left to right: Chris Waites, David Kim, Jatin Nanda, Albert Shaw, Beatrice Domingo
Not pictured: Mike Winters