dc.contributor.author | Li, Richard | |
dc.date.accessioned | 2017-11-30T20:37:28Z | |
dc.date.available | 2017-11-30T20:37:28Z | |
dc.date.issued | 2017-11-13 | |
dc.identifier.uri | http://hdl.handle.net/1853/59037 | |
dc.description | Presented on November 13, 2017 as part of the Three Minute Thesis Finals in the Student Center Ballroom. | en_US |
dc.description | Richard Li is a master's student finalist from the School of Interactive Computing at Georgia Tech. He was runner-up and received a $750 research travel grant. | en_US |
dc.description | Runtime: 03:42 minutes | en_US |
dc.description.abstract | Food journaling is the primary recommendation of physicians for many health concerns, including weight loss and a variety of diseases. The state-of-the-art approach to food journaling is to ask patients to record and self-report their own dietary activities for the day. However, adherence to this practice and the accuracy of the resulting data have been shown to be very low, yielding little benefit for the patient. As a result, determining when someone is eating has been a point of interest to the ubiquitous computing community for several years. While many wearable approaches have been proposed and assessed in lab settings, no practical solution has yet been evaluated in the real world. In our work, we present a hearing aid form factor device that tracks the motion of the jaw from the ear. We highlight three main contributions to its effectiveness and practicality:
1) We assess how well three sensing modalities perform and how their corresponding form factors are perceived.
2) We implement a novel approach to collecting training data for a generalizable machine learning model, using a semi-controlled home environment.
3) We evaluate the system in unconstrained environments, obtaining state-of-the-art results validated against video footage from a wearable camera. | en_US |
dc.format.extent | 03:42 minutes | |
dc.language.iso | en_US | en_US |
dc.publisher | Georgia Institute of Technology | en_US |
dc.relation.ispartofseries | Three Minute Thesis (3MT™) at Georgia Tech | |
dc.subject | Eating detection | en_US |
dc.subject | Wearable computing | en_US |
dc.title | EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments | en_US |
dc.type | Moving Image | |
dc.contributor.corporatename | Georgia Institute of Technology. Office of Graduate Studies | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. Center for Teaching and Learning | en_US |
dc.contributor.corporatename | Georgia Institute of Technology. School of Interactive Computing | en_US |
dc.type.genre | Presentation | |