Show simple item record

dc.contributor.author	Li, Richard
dc.date.accessioned	2017-11-30T20:37:28Z
dc.date.available	2017-11-30T20:37:28Z
dc.date.issued	2017-11-13
dc.identifier.uri	http://hdl.handle.net/1853/59037
dc.description	Presented on November 13, 2017 as part of the Three Minute Thesis Finals in the Student Center Ballroom.	en_US
dc.description	Richard Li is a master's student finalist from the School of Interactive Computing at Georgia Tech. He was runner-up and received a $750 research travel grant.	en_US
dc.description	Runtime: 03:42 minutes	en_US
dc.description.abstract	Food journalling is the primary recommendation of physicians for many health concerns, including weight loss and a variety of diseases. The state-of-the-art approach to food journalling is to ask patients to record and self-report their dietary activities for the day. However, it has been shown that the adherence and accuracy of such data are very low, resulting in little benefit to the patient. As a result, determining when someone is eating has been of interest to the ubiquitous computing community for several years. While many wearable approaches have been proposed and assessed in lab settings, no practical solution has yet been evaluated in the real world. In our work, we present a device in a hearing aid form factor that tracks the motion of the jaw from the ear. We highlight three contributions to its effectiveness and practicality: 1) an assessment of how well three sensing modalities perform and how their corresponding form factors are perceived; 2) a novel approach to collecting training data for a generalizable machine learning model using a semi-controlled home environment; 3) an evaluation of the system in unconstrained environments, obtaining state-of-the-art results validated against video footage from a wearable camera.	en_US
dc.format.extent	03:42 minutes
dc.language.iso	en_US	en_US
dc.publisher	Georgia Institute of Technology	en_US
dc.relation.ispartofseries	Three Minute Thesis (3MT™) at Georgia Tech
dc.subject	Eating detection	en_US
dc.subject	Wearable computing	en_US
dc.title	EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments	en_US
dc.type	Moving Image
dc.contributor.corporatename	Georgia Institute of Technology. Office of Graduate Studies	en_US
dc.contributor.corporatename	Georgia Institute of Technology. Center for Teaching and Learning	en_US
dc.contributor.corporatename	Georgia Institute of Technology. School of Interactive Computing	en_US
dc.type.genre	Presentation


