Automatic Food-Intake Monitoring Based on IoT Embedded System for Alzheimer’s Patients

Date of Award

12-15-2020

Degree Name

Doctor of Philosophy

Department

Electrical and Computer Engineering

First Advisor

Dr. Bradley Bazuin

Second Advisor

Dr. Janos Grantner

Third Advisor

Dr. Maureen Mickus

Keywords

Internet of Things (IoT), deep learning, computer vision, embedded systems, Alzheimer's disease, real-time monitoring systems

Abstract

In 2020, around 5.8 million people in the United States and around 50 million people worldwide were living with Alzheimer's disease or other dementias. These patients suffer from a decline in cognitive abilities and cannot function safely without help. More than 80% of their care is provided at home by more than 16 million family members, friends, or other unpaid caregivers. Alzheimer's patients in the middle and late stages begin to lose the ability to recognize the need to eat and drink, even when hungry. It is essential to use technology to enhance these patients' daily lives while reducing the significant time demands on their caregivers. A successful Food-Intake Monitoring System (FIMS) that monitors and encourages patients to eat their meals will reduce the cost of individual caregiving, ease the burden on loved ones caring for their elders, and allow patients to feed themselves. In this dissertation, FIMS implementations based on three different realizations have been defined, implemented on a Raspberry Pi 3 Plus embedded IoT system, and tested at the Digital Signal and Image Processing Laboratory (DISPLAY) at Western Michigan University. The FIMS monitors eating activity, sends alerts or emails to the caregiver, and, when the patient has not eaten for a period of time, plays a video encouraging them to eat or reminding them how.
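As an illustration of the monitoring-and-prompting behavior described above, the following Python sketch shows one way such a loop could be structured on a Raspberry Pi. The detection stub, timeout value, email addresses, video path, and video player are illustrative assumptions, not the dissertation's actual implementation.

```python
# Minimal sketch of a food-intake monitoring loop with caregiver alerts and a
# video prompt. All constants below are placeholders, not values from the source.
import smtplib
import subprocess
import time
from email.message import EmailMessage

NO_EATING_TIMEOUT_S = 15 * 60                      # assumed threshold before prompting
CAREGIVER_EMAIL = "caregiver@example.com"          # placeholder address
PROMPT_VIDEO = "/home/pi/videos/eat_reminder.mp4"  # placeholder path


def eating_detected() -> bool:
    """Placeholder for the vision-based eating-activity detector."""
    return False  # a real system would analyze the current camera frame here


def send_alert(message: str) -> None:
    """Email a plain-text alert to the caregiver (local SMTP server assumed)."""
    msg = EmailMessage()
    msg["Subject"] = "FIMS alert"
    msg["From"] = "fims@example.com"
    msg["To"] = CAREGIVER_EMAIL
    msg.set_content(message)
    with smtplib.SMTP("localhost") as server:
        server.send_message(msg)


def play_prompt_video() -> None:
    """Play an encouraging/reminder video on the attached display."""
    subprocess.run(["omxplayer", PROMPT_VIDEO], check=False)  # player is an assumption


def monitor_loop() -> None:
    last_eating_time = time.monotonic()
    while True:
        if eating_detected():
            last_eating_time = time.monotonic()
        elif time.monotonic() - last_eating_time > NO_EATING_TIMEOUT_S:
            send_alert("No eating activity detected during the meal window.")
            play_prompt_video()
            last_eating_time = time.monotonic()  # avoid back-to-back prompts
        time.sleep(1.0)
```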

The first embodiment of FIMS focused on small, self-contained IoT-based processing and imaging hardware, with video hand detection based on skin color; a minimal sketch of this technique follows below. After some success, a second system focused on advancing the video image processing by using a convolutional neural network model, Faster R-CNN, to detect hand position and movements. Finally, a Food Intake Activities Recognition (FIAR) algorithm based on the Inception V3 model, focused on identifying eating activity, was defined. The proposed systems demonstrate alternative and improved effectiveness for food-intake monitoring and prompting for Alzheimer's patients.
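Skin-color-based hand detection, as used in the first embodiment, is a standard computer-vision technique; the sketch below shows a minimal OpenCV version of it. The YCrCb thresholds, minimum contour area, and camera index are illustrative assumptions rather than the dissertation's tuned values.

```python
# Minimal sketch of skin-color-based hand detection with OpenCV.
import cv2
import numpy as np

# Commonly cited skin range in the YCrCb color space (assumed, not from the source).
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)
MIN_HAND_AREA = 3000  # pixels; filters out small skin-colored regions


def detect_hand(frame_bgr: np.ndarray):
    """Return the bounding box (x, y, w, h) of the largest skin-colored region, or None."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # [-2] keeps this compatible with both OpenCV 3.x and 4.x return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < MIN_HAND_AREA:
        return None
    return cv2.boundingRect(largest)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # Raspberry Pi camera exposed as /dev/video0 (assumed)
    ok, frame = cap.read()
    if ok:
        print("Hand bounding box:", detect_hand(frame))
    cap.release()
```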

Access Setting

Dissertation-Abstract Only

Restricted to Campus until

12-15-2030
