Browsing by Author "Hossain, Delwar"
Now showing 1 - 8 of 8
Item: Automatic Count of Bites and Chews From Videos of Eating Episodes (IEEE, 2020)
Hossain, Delwar; Ghosh, Tonmoy; Sazonov, Edward; University of Alabama Tuscaloosa
Methods for measuring eating behavior (known as meal microstructure) often rely on manual annotation of bites, chews, and swallows in meal videos or wearable sensor signals. Manual annotation can be time consuming and error-prone, while wearable sensors may not capture every aspect of eating (e.g., chews only). The aim of this study was to develop a method to detect and count bites and chews automatically from meal videos. The method was developed on a dataset of 28 volunteers consuming unrestricted meals in the laboratory under video observation. First, the faces in the video (regions of interest, ROI) were detected using Faster R-CNN. Second, a pre-trained AlexNet was trained on the detected faces to classify images as bite/no-bite. Third, affine optical flow was applied to consecutively detected faces to find the rotational movement of the pixels in the ROIs. The number of chews in a meal video was counted by converting the 2-D images to a 1-D optical flow parameter and finding peaks. The developed bite and chew count algorithm was applied to 84 meal videos collected from 28 volunteers. A mean accuracy (+/- STD) of 85.4% (+/- 6.3%) with respect to manual annotation was obtained for the number of bites and 88.9% (+/- 7.4%) for the number of chews. The proposed method for automatic bite and chew counting shows promising results and may serve as an alternative to manual annotation.

Item: Automatic Ingestion Monitor Version 2 - A Novel Wearable Device for Automatic Food Intake Detection and Passive Capture of Food Images (IEEE, 2021)
Doulah, Abul; Ghosh, Tonmoy; Hossain, Delwar; Imtiaz, Masudul H.; Sazonov, Edward; University of Alabama Tuscaloosa
Use of food image capture and/or wearable sensors for dietary assessment has grown in popularity.
"Active" methods rely on the user to take an image of each eating episode. "Passive" methods use wearable cameras that continuously capture images. Most passively captured images are not related to food consumption and may present privacy concerns. In this paper, we propose a novel wearable sensor (Automatic Ingestion Monitor, AIM-2) designed to capture images only during automatically detected eating episodes. The capture method was validated on a dataset collected from 30 volunteers in the community, each wearing the AIM-2 for 24 h in pseudo-free-living and 24 h in a free-living environment. The AIM-2 was able to detect food intake over 10-second epochs with a (mean +/- standard deviation) F1-score of 81.8 +/- 10.1%. The accuracy of eating episode detection was 82.7%. Out of a total of 180,570 images captured, 8,929 (4.9%) belonged to detected eating episodes. Privacy concerns were assessed by a questionnaire on a scale of 1-7. Continuous capture had a concern value of 5.0 +/- 1.6 (concerned), while image capture only during food intake had a concern value of 1.9 +/- 1.7 (not concerned). Results suggest that the AIM-2 can provide accurate detection of food intake, reduce the number of images for analysis, and alleviate the privacy concerns of users.

Item: Body mass index and variability in meal duration and association with rate of eating (Frontiers, 2022)
Simon, Stacey L.; Pan, Zhaoxing; Marden, Tyson; Zhou, Wenru; Ghosh, Tonmoy; Hossain, Delwar; Thomas, J. Graham; McCrory, Megan A.; Sazonov, Edward; Higgins, Janine; University of Colorado Anschutz Medical Campus; University of Alabama Tuscaloosa; Brown University; Boston University
Background: A fast rate of eating is associated with a higher risk for obesity, but existing studies are limited by reliance on self-report, and the consistency of eating rate has not been examined across all meals in a day.
The goal of the current analysis was to examine associations between meal duration, rate of eating, and body mass index (BMI), and to assess the variance of meal duration and eating rate across different meals during the day.
Methods: Using an observational cross-sectional study design, non-smoking participants aged 18-45 years (N = 29) consumed all meals (breakfast, lunch, and dinner) on a single day in a pseudo-free-living environment. Participants were allowed to choose any food and beverages from a university food court and consume their desired amount with no time restrictions. Weighed food records and a log of meal start and end times, used to calculate duration, were obtained by a trained research assistant. Spearman's correlations and multiple linear regressions examined associations between BMI and meal duration and rate of eating.
Results: Participants were 65% male and 48% white. A shorter meal duration was associated with a higher BMI at breakfast, but not lunch or dinner, after adjusting for age and sex (p = 0.03). A faster rate of eating was associated with higher BMI across all meals (p = 0.04) and higher energy intake for all meals (p < 0.001). Intra-individual rates of eating were not significantly different across breakfast, lunch, and dinner (p = 0.96).
Conclusion: A shorter breakfast duration and a faster rate of eating across all meals were associated with higher BMI in a pseudo-free-living environment. An individual's rate of eating is constant over all meals in a day.
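The rate-of-eating analysis described above can be sketched in a few lines. This is a minimal illustration on hypothetical data (the study's actual records are not reproduced here), using a Spearman correlation as the abstract reports:

```python
# Sketch of the eating-rate vs. BMI association on hypothetical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant meal records: grams eaten and meal duration (min).
grams = rng.uniform(200, 600, size=29)
duration_min = rng.uniform(5, 30, size=29)
rate = grams / duration_min  # eating rate in g/min

# Hypothetical BMI values constructed to rise with eating rate,
# mimicking the positive association reported in the study.
bmi = 22 + 0.5 * rate + rng.normal(0, 1, size=29)

# Spearman correlation between eating rate and BMI.
rho, p = stats.spearmanr(rate, bmi)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```

The synthetic BMI values are an assumption for illustration only; the study additionally adjusted for age and sex with multiple linear regression, which is omitted here.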
These data support weight-reduction interventions focusing on the rate of eating at all meals throughout the day and provide evidence for specifically directing attention to breakfast eating behaviors.

Item: Comparison of Wearable Sensors for Estimation of Chewing Strength (IEEE, 2020)
Hossain, Delwar; Imtiaz, Masudul Haider; Sazonov, Edward; University of Alabama Tuscaloosa
This paper presents wearable sensors for detecting differences in chewing strength while eating foods of different hardness (carrot as a hard, apple as a moderate, and banana as a soft food). Four wearable sensor systems were evaluated: (1) a gas pressure sensor measuring changes in ear pressure proportional to ear canal deformation during chewing; (2) a flexible, curved bend sensor attached to the right temple of eyeglasses, measuring the contraction of the temporalis muscle; (3) a piezoelectric strain sensor placed on the temporalis muscle; and (4) an electromyography sensor with electrodes placed on the temporalis muscle. Data were collected from 15 participants wearing all four sensors at once. Each participant took and consumed 10 bites of carrot, apple, and banana. The hardness of the foods was measured with a food penetrometer. Single-factor ANOVA found a significant effect of food hardness on the standard deviation of the signals for all four sensors (p < .001). Tukey's multiple comparison test at the 5% significance level confirmed that the means of the standard deviations were significantly different across the test foods for all four sensors.
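The single-factor ANOVA step described above can be sketched as follows. The signal values are synthetic stand-ins (the real sensor recordings are not reproduced here), constructed so that harder foods yield larger per-bite signal deviations:

```python
# Sketch of the hardness analysis: one-way ANOVA on per-bite signal
# standard deviations for three foods (synthetic stand-in values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical standard deviations of a chewing-sensor signal per bite,
# one group per food; harder food -> larger signal deviation.
carrot = rng.normal(1.0, 0.1, size=10)   # hard
apple = rng.normal(0.7, 0.1, size=10)    # moderate
banana = rng.normal(0.4, 0.1, size=10)   # soft

# Single-factor ANOVA testing the effect of food hardness.
f_stat, p_value = stats.f_oneway(carrot, apple, banana)
print(f"F = {f_stat:.1f}, p = {p_value:.3g}")
```

The paper followed the ANOVA with Tukey's multiple comparison test; that post-hoc step is omitted from this sketch.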
Results of this study indicate that wearable sensors may potentially be used for measuring chewing strength and assessing food hardness.

Item: Detection of Food Intake Sensor's Wear Compliance in Free-Living (IEEE, 2021)
Ghosh, Tonmoy; Hossain, Delwar; Sazonov, Edward; University of Alabama Tuscaloosa
Objective detection of periods of wear and non-wear is critical for human studies that rely on information from wearable sensors, such as food intake sensors. In this paper, we present a novel method of compliance detection, using the example of the Automatic Ingestion Monitor v2 (AIM-2) sensor, which contains a tri-axial accelerometer, a still camera, and a chewing sensor. The method was developed and validated using data from a study of 30 participants aged 18-39, each wearing the AIM-2 for two days (a day in pseudo-free-living and a day in free-living). Four types of wear compliance were analyzed: 'normal-wear', 'non-compliant-wear', 'non-wear-carried', and 'non-wear-stationary'. The ground truth for these four types of compliance was obtained by reviewing the images from the egocentric camera. The features for compliance detection were the standard deviation of acceleration, average pitch and roll angles, and the mean square error between two consecutive images. These were used to train three random forest classifiers: (1) accelerometer-based, (2) image-based, and (3) combined accelerometer- and image-based. Satisfactory wear-compliance measurement accuracy was obtained with the combined classifier (89.24%) under leave-one-subject-out cross-validation. The average duration of compliant wear in the study was 9 h, with a standard deviation of 2 h, or 70.96% of total on-time. This method can be used to calculate the wear and non-wear time of the AIM-2 and can potentially be extended to other devices. The study also included assessments of sensor burden and privacy concerns.
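The combined (accelerometer- and image-based) compliance classifier described above can be sketched with scikit-learn. The feature values below are synthetic stand-ins, not real AIM-2 recordings, and the two-class setup (wear vs. non-wear) simplifies the paper's four compliance categories:

```python
# Sketch of a random forest wear-compliance classifier over the three
# feature types: acceleration std, pitch/roll angles, inter-frame MSE.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n = 400

# Hypothetical epoch features: wear epochs show body motion (higher accel
# std) and scene change (higher image MSE); stationary non-wear shows neither.
labels = rng.integers(0, 2, size=n)  # 1 = wear, 0 = non-wear
accel_std = np.where(labels == 1, 0.5, 0.05) + rng.normal(0, 0.02, n)
pitch = rng.normal(0, 10, n)  # degrees; uninformative in this toy setup
image_mse = np.where(labels == 1, 30.0, 2.0) + rng.normal(0, 1.0, n)
X = np.column_stack([accel_std, pitch, image_mse])

# Train on the first 300 epochs, evaluate on the held-out 100.
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:300], labels[:300])
accuracy = clf.score(X[300:], labels[300:])
print(f"held-out accuracy = {accuracy:.2f}")
```

The paper evaluated with leave-one-subject-out cross-validation rather than the simple hold-out split used here.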
The survey results suggest recommendations that may be used to increase wear compliance.

Item: Improvement of Methodology for Manual Energy Intake Estimation From Passive Capture Devices (Frontiers, 2022)
Pan, Zhaoxing; Forjan, Dan; Marden, Tyson; Padia, Jonathan; Ghosh, Tonmoy; Hossain, Delwar; Thomas, J. Graham; McCrory, Megan A.; Sazonov, Edward; Higgins, Janine A.; University of Colorado Anschutz Medical Campus; University of Alabama Tuscaloosa; Brown University; Boston University
Objective: To describe best practices for manual nutritional analyses of data from passive capture wearable devices in free-living conditions.
Method: 18 participants (10 female) with a mean age of 45 +/- 10 years and a mean BMI of 34.2 +/- 4.6 kg/m(2) consumed their usual diet for 3 days in a free-living environment while wearing an automated passive capture device, which captures images without manual input from the user. Data from the first nine participants were used by two trained nutritionists to identify sources contributing to inter-nutritionist variance in nutritional analyses. The nutritionists then implemented best practices to mitigate these sources of variance for the next nine participants. The three best practices for reducing variance in energy intake (EI) estimation were: (1) a priori standardized food selection, (2) standardized nutrient database selection, and (3) an increased number of images captured around eating episodes.
Results: Inter-rater repeatability for EI, measured by the intraclass correlation coefficient (ICC), improved by 0.39 from pre-best practices to post-best practices (0.14 vs. 0.85, respectively). Bland-Altman analysis indicated strongly improved agreement between nutritionists in the limits of agreement (LOA) post-best practices.
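The Bland-Altman limits-of-agreement calculation mentioned above is straightforward to sketch. The EI estimates below are hypothetical values for two raters, not the study's data:

```python
# Sketch of a Bland-Altman agreement check between two raters' energy
# intake (EI) estimates, in kcal/day (hypothetical values).
import numpy as np

rng = np.random.default_rng(3)

true_ei = rng.uniform(1500, 3000, size=9)      # 9 hypothetical participants
rater_a = true_ei + rng.normal(0, 50, size=9)  # small residual disagreement
rater_b = true_ei + rng.normal(0, 50, size=9)

# Bland-Altman: bias is the mean inter-rater difference; the 95% limits
# of agreement are bias +/- 1.96 * SD of the differences.
diff = rater_a - rater_b
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.1f} kcal, LOA = [{loa_low:.1f}, {loa_high:.1f}] kcal")
```

Narrower limits of agreement post-best-practices correspond to a smaller SD of the inter-rater differences in this calculation.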
Conclusion: Significant improvement in ICC and LOA for estimation of EI following implementation of best practices demonstrates that these practices improve the reproducibility of dietary analysis from passive capture device images in free-living environments.

Item: Monitoring of Eating Behavior Using Sensor-Based Methods (University of Alabama Libraries, 2024)
Hossain, Delwar; Sazonov, Edward
The essential physiological functions of the human body, including respiration, circulation, physical exertion, and protein synthesis, rely on energy from dietary constituents. Understanding eating behavior is crucial for overall health, as deviations in energy intake can lead to malnutrition-induced weight loss or obesity-related weight gain. Traditionally, dietary intake assessment has relied on self-reporting methods such as dietary records, 24-hour recalls, and food frequency questionnaires. While these methods help in understanding relationships between eating behavior and dietary intake, they lack the granularity needed to explore detailed food consumption processes. Therefore, there is a need for innovative solutions that enable objective, precise, and automated monitoring of eating behavior, especially in free-living conditions. This dissertation investigates the application of wearable sensor systems for the automatic monitoring of eating behavior with minimal effort from subjects. First, a systematic review was conducted to identify available technology-driven methods for monitoring eating behavior. Next, a novel contactless method for detecting and measuring eating behaviors such as chews and bites from eating videos was developed. An algorithm was then devised to evaluate and compare different sensor modalities for identifying eating behavior, specifically focusing on chewing and chewing-strength measurement. Four sensor modalities -- an ear canal pressure sensor, a piezoresistive bend sensor, a piezoelectric strain sensor, and an EMG sensor -- were assessed.
Results indicated comparable efficacy across all four systems in identifying chewing and chewing strength. Next, a novel ear canal pressure sensor system was explored for monitoring eating behavior, particularly chewing, in free-living environments. The findings demonstrated accurate detection and estimation of chewing in both controlled and free-living settings. Finally, a machine learning model to estimate energy intake (EI) from sensor-captured eating behavior features was developed and evaluated in free-living settings. The results highlight the efficacy of the sensor-based EI model and the potential for improved accuracy by leveraging image assistance and automatic food item detection. In conclusion, this research advances eating behavior monitoring using wearable sensor technologies. The findings hold promise for personalized nutrition interventions and mark a significant step forward in the objective assessment of eating habits.

Item: Wearable Egocentric Camera as a Monitoring Tool of Free-Living Cigarette Smoking: A Feasibility Study (Oxford University Press, 2020)
Imtiaz, Masudul H.; Hossain, Delwar; Senyurek, Volkan Y.; Belsare, Prajakta; Tiffany, Stephen; Sazonov, Edward; University of Alabama Tuscaloosa; State University of New York (SUNY) Buffalo
Introduction: Wearable sensors may be used for the assessment of behavioral manifestations of cigarette smoking under natural conditions. This paper introduces a new camera-based sensor system to monitor smoking behavior. The goals of this study were (1) identification of the best position for sensor placement on the body and (2) evaluation of the feasibility of the sensor as a free-living smoking-monitoring tool.
Methods: A sensor system was developed with a 5 MP camera that continuously captured images every second for up to 26 hours. Five on-body locations were tested for the selection of sensor placement.
A feasibility study was then performed with 10 smokers to monitor full-day smoking under free-living conditions. Captured images were manually annotated to obtain behavioral metrics of smoking, including smoking frequency, smoking environment, and puffs per cigarette. The smoking environment and puff counts captured by the camera were compared with self-reported smoking.
Results: A camera located on the eyeglass temple produced the maximum number of smoking images and the minimal numbers of blurry or overexposed images (53.9%, 4.19%, and 0.93% of the total captured, respectively). Under free-living conditions, 286,245 images were captured, with a mean (+/- standard deviation) duration of sensor wear of 647 (+/- 74) minutes/participant. Image annotation identified consumption of 5 (+/- 2.3) cigarettes/participant, 3.1 (+/- 1.1) cigarettes/participant indoors, 1.9 (+/- 0.9) cigarettes/participant outdoors, and 9.02 (+/- 2.5) puffs/cigarette. Statistical tests found significant differences between manual annotations and self-reported smoking environment or puff counts.
Conclusions: A wearable camera-based sensor may facilitate objective monitoring of cigarette smoking, categorization of smoking environments, and identification of behavioral metrics of smoking in free-living conditions.
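The comparison of camera-annotated versus self-reported measures described in the last item can be sketched with a paired nonparametric test. The abstract does not name the specific statistical test used, so a Wilcoxon signed-rank test is assumed here, and the puff counts are hypothetical:

```python
# Sketch of a paired comparison between camera-annotated and self-reported
# puff counts per participant (hypothetical values for 10 smokers).
import numpy as np
from scipy import stats

annotated = np.array([45, 52, 38, 60, 41, 55, 47, 39, 62, 50])
self_report = np.array([37, 43, 28, 49, 29, 42, 33, 24, 46, 33])  # under-reported

# Wilcoxon signed-rank test for a systematic difference between measures.
stat, p = stats.wilcoxon(annotated, self_report)
print(f"Wilcoxon W = {stat}, p = {p:.4f}")
```

A low p-value here, as in the study, would indicate that self-report systematically disagrees with objective annotation.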