Obesity and overweight increase the risk of several major life-threatening diseases. According to the WHO, a considerable proportion of the population suffers from these diseases, and poor nutrition plays an important role in this context. Traditional food activity monitoring systems such as food diaries allow manual record keeping of eating activities over time and support nutrition analysis. However, these systems are prone to the problems of manual record keeping and biased reporting. Therefore, over the last decade the research community has focused on designing automatic food monitoring systems consisting of one or multiple wearable sensors. These systems aim to detect macro- and micro-level activities such as chewing, swallowing, eating episodes, and food types, as well as to provide estimates such as food mass and eating duration. Researchers have emphasized high detection accuracy, low estimation errors, non-intrusive design, low cost, and real-life implementation while designing these systems; however, a comprehensive automatic food monitoring system has not yet been developed. Moreover, to the best of our knowledge, there is no comprehensive survey in this field that delineates the automatic food monitoring paradigm, covers a substantial number of research studies, analyses these studies against food intake monitoring tasks using various parameters, lists their limitations, and sets out future directions. In this work, we delineate the automatic food intake monitoring paradigm and present a survey of research studies. With a special focus on studies using wearable sensors, we analyze these studies against food activity monitoring tasks. We provide a brief comparison of these studies, along with their shortcomings, based on the experimental results reported in them. Finally, we set out future directions to facilitate researchers working in this domain.
Since stress contributes to a broad range of mental and physical health problems, the objective assessment of stress is essential for behavioral and physiological studies. Although several studies have evaluated stress levels in controlled settings, objective stress assessment in everyday settings is still largely under-explored due to challenges arising from confounding contextual factors and limited adherence to self-reports. In this paper, we explore the objective prediction of stress levels in everyday settings based on heart rate (HR) and heart rate variability (HRV) captured via low-cost and easy-to-wear photoplethysmography (PPG) sensors that are widely available on newer smart wearable devices. We present a layered system architecture for personalized stress monitoring that supports a tunable collection of data samples for labeling, and present a method for selecting informative samples from the stream of real-time data for labeling. We captured the stress levels of fourteen volunteers through self-reported questionnaires over periods of between 1 and 3 months, and explored binary stress detection based on HR and HRV using machine learning methods. We observe promising preliminary results given that the dataset was collected in the challenging environments of everyday settings. The binary stress detector is fairly accurate and can detect stressful vs. non-stressful samples with a macro-F1 score of up to 76%. Our study lays the groundwork for more sophisticated labeling strategies that generate context-aware, personalized models that will empower health professionals to provide personalized interventions.
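To make the kind of pipeline described above concrete, the following is a minimal sketch of binary stress detection from pre-extracted HR/HRV features, evaluated with macro-F1. The feature set (mean HR, SDNN, RMSSD, pNN50), the random-forest classifier, and the synthetic data are illustrative assumptions, not the paper's actual method or dataset.

# Minimal sketch: binary stress detection from HR/HRV features (assumed feature set, synthetic data).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(75, 10, n),   # mean HR (bpm)
    rng.normal(50, 15, n),   # SDNN (ms)
    rng.normal(40, 12, n),   # RMSSD (ms)
    rng.uniform(0, 40, n),   # pNN50 (%)
])
y = rng.integers(0, 2, n)    # self-reported label: 1 = stressed, 0 = not stressed

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(),
                    RandomForestClassifier(class_weight="balanced", random_state=0))
clf.fit(X_tr, y_tr)
print("macro-F1:", f1_score(y_te, clf.predict(X_te), average="macro"))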
Over the past several years, the electrocardiogram (ECG) has been investigated for its uniqueness and potential to discriminate between individuals. This paper discusses how this discriminatory information can help in continuous user authentication by a wearable chest strap which uses dry electrodes to obtain a single-lead ECG signal. To the best of the authors' knowledge, this is the first such work dealing with continuous authentication using a genuine wearable device, as most prior works have either used medical equipment employing gel electrodes to obtain an ECG signal or have obtained an ECG signal through electrode positions that would not be feasible on a wearable device. Prior works have also mainly used the ECG signal for identification rather than verification, or for discrete rather than continuous authentication. This paper presents a novel algorithm which uses QRS detection, weighted averaging, the Discrete Cosine Transform (DCT), and a Support Vector Machine (SVM) classifier to determine whether the wearer of the device should be positively verified or not. Zero intrusion attempts were successful when tested on a database consisting of 33 subjects.
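The stages named in this abstract (QRS detection, weighted beat averaging, DCT features, SVM verification) can be sketched roughly as below. The peak detector, sampling rate, weighting rule, and the synthetic signals are stand-in assumptions for illustration, not the authors' algorithm.

# Rough sketch of an ECG verification pipeline: R-peak detection -> weighted beat average -> DCT -> SVM.
import numpy as np
from scipy.signal import find_peaks
from scipy.fft import dct
from sklearn.svm import SVC

FS = 250  # assumed sampling rate (Hz)

def dct_template(ecg, n_coeffs=30, half_window=0.3):
    """Average beats around detected R peaks and keep the leading DCT coefficients."""
    peaks, _ = find_peaks(ecg, distance=int(0.4 * FS), prominence=np.std(ecg))
    w = int(half_window * FS)
    beats = np.array([ecg[p - w:p + w] for p in peaks if p - w >= 0 and p + w <= len(ecg)])
    mean_beat = beats.mean(axis=0)
    # Weight each beat by its correlation with the mean beat (simple stand-in weighting).
    weights = np.clip([np.corrcoef(b, mean_beat)[0, 1] for b in beats], 0, None) + 1e-6
    template = np.average(beats, axis=0, weights=weights)
    return dct(template, norm="ortho")[:n_coeffs]

# Illustrative training on synthetic signals: genuine-user templates (1) vs. impostor templates (0).
rng = np.random.default_rng(1)
def fake_ecg(shift):
    t = np.arange(0, 30, 1 / FS)
    return np.sin(2 * np.pi * 1.2 * t + shift) + 0.05 * rng.normal(size=t.size)

X = np.array([dct_template(fake_ecg(s)) for s in rng.uniform(0, 2, 40)])
y = np.array([1] * 20 + [0] * 20)
verifier = SVC(kernel="rbf").fit(X, y)
print("accepted:", verifier.predict(X[:1])[0] == 1)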
Malnutrition is a major public health concern in low- and middle-income countries (LMICs). Understanding food and nutrient intake across communities, households and individuals is critical to the development of health policies and interventions. To ease the conduct of large-scale dietary assessments, we propose an intelligent passive food intake assessment system based on egocentric cameras, targeted in particular at households in Ghana and Uganda. Algorithms are first designed to remove redundant images to minimise storage requirements. At run time, deep learning-based semantic segmentation is applied to recognise multiple food types, and newly designed handcrafted features are extracted to monitor the weight of consumed food. Comprehensive experiments are conducted to validate our methods on an in-the-wild dataset captured under settings that simulate the unique LMIC conditions, with participants of Ghanaian and Kenyan origin eating common Ghanaian/Kenyan dishes. To demonstrate the efficacy, experienced dietitians were involved in this research to perform visual portion size estimation, and their predictions were compared to our proposed method. The promising results show that our method can reliably monitor food intake and give feedback on users' eating behaviour, which provides guidance for dietitians in regular dietary assessment.
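As a rough illustration of how a semantic-segmentation output can feed simple handcrafted portion features, the sketch below counts per-class pixels and frame coverage from a label mask. The class IDs and food names are hypothetical; the paper's actual features and weight-estimation model are not reproduced here.

# Sketch: derive per-food pixel area and coverage features from a segmentation mask (hypothetical classes).
import numpy as np

CLASS_NAMES = {0: "background", 1: "rice", 2: "beans", 3: "stew"}  # hypothetical label map

def portion_features(mask: np.ndarray) -> dict:
    """Return per-class pixel counts and the fraction of the frame covered by each food class."""
    total = mask.size
    feats = {}
    for cid, name in CLASS_NAMES.items():
        if cid == 0:
            continue  # skip background
        px = int((mask == cid).sum())
        feats[name] = {"pixels": px, "coverage": px / total}
    return feats

# Toy example: half the frame is rice, a quarter is stew.
mask = np.array([[1, 1, 1, 1],
                 [1, 1, 1, 1],
                 [3, 3, 0, 0],
                 [3, 3, 0, 0]])
print(portion_features(mask))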
Over the past several decades, naturally occurring and man-made mass casualty incidents (MCIs) have increased in frequency and number worldwide. To test the impact of such events on medical resources, simulations can provide a safe, controlled setting while replicating the chaotic environment typical of an actual disaster. A standardised method to collect and analyse data from mass casualty exercises is needed in order to assess the preparedness and performance of the healthcare staff involved. We report on the use of wearable proximity sensors to measure proximity events during an MCI simulation. We investigated the interactions between medical staff and patients to evaluate the time the medical staff dedicated to victims with respect to the severity of their injuries and the staff roles. We also estimated the presence of patients in the different spaces of the field hospital in order to study patient flow. Data were collected through the deployment of wearable proximity sensors during a mass casualty incident functional exercise. The scenario included two areas, the accident site and the Advanced Medical Post (AMP), and the exercise lasted 3 hours. A total of 238 participants simulating medical staff and victims were involved. Each participant wore a proximity sensor, and 30 fixed devices were placed in the field hospital. The contact networks show a heterogeneous distribution of the cumulative time spent in proximity by participants. We obtained contact matrices based on the cumulative time spent in proximity between victims and rescuers. Our results show that the time spent in proximity by the healthcare teams with the victims is related to the severity of the patients' injuries. The analysis of patient flow showed that the presence of patients in the rooms of the hospital is consistent with triage code and diagnosis, and no obvious bottlenecks were found.
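The contact matrices mentioned above can be illustrated with a small sketch that aggregates cumulative proximity time between participant roles. The event schema (participant pair plus start/end times) and the role assignments are assumptions for illustration, not the actual sensor data format used in the exercise.

# Sketch: cumulative role-to-role contact matrix from proximity events (assumed event schema).
import numpy as np

ROLES = ["doctor", "nurse", "victim"]
role_of = {"p1": "doctor", "p2": "nurse", "p3": "victim", "p4": "victim"}  # hypothetical roles

# Each event: (participant_a, participant_b, start_seconds, end_seconds)
events = [("p1", "p3", 0, 120), ("p2", "p3", 30, 90), ("p1", "p4", 200, 260)]

idx = {r: i for i, r in enumerate(ROLES)}
contact = np.zeros((len(ROLES), len(ROLES)))
for a, b, start, end in events:
    duration = end - start
    i, j = idx[role_of[a]], idx[role_of[b]]
    contact[i, j] += duration
    contact[j, i] += duration  # keep the matrix symmetric

print(ROLES)
print(contact)  # cumulative seconds of proximity between roles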
One major challenge in the medication of Parkinson's disease is that the severity of the disease, reflected in the patient's motor state, cannot be measured using accessible biomarkers. Therefore, we develop and examine a variety of statistical models to detect the motor state of such patients based on sensor data from a wearable device. We find that deep learning models consistently outperform a classical machine learning model applied to hand-crafted features in this time series classification task. Furthermore, our results suggest that treating this problem as a regression rather than an ordinal regression or a classification task is most appropriate. For consistent model evaluation and training, we adapt the leave-one-subject-out validation scheme to the training of deep learning models. We also employ a class-weighting scheme to successfully mitigate the problem of severe multi-class imbalance in this domain. In addition, we propose a customized performance measure that reflects the requirements the involved medical staff place on the model. To address the limited availability of high-quality training data, we propose a transfer learning technique which helps to improve model performance substantially. Our results suggest that deep learning techniques offer high potential to autonomously detect motor states of patients with Parkinson's disease.
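The evaluation setup described here, leave-one-subject-out validation combined with class weighting and a regression formulation, can be sketched as follows. The synthetic data, the ridge regressor, and the "balanced" weighting rule are illustrative assumptions standing in for the deep learning models and customized performance measure of the study.

# Sketch: leave-one-subject-out evaluation with balanced sample weights for motor-state regression.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.utils.class_weight import compute_sample_weight

rng = np.random.default_rng(0)
n_subjects, windows_per_subject, n_feats = 8, 50, 16
X = rng.normal(size=(n_subjects * windows_per_subject, n_feats))
y = rng.integers(0, 3, size=len(X))                 # ordinal motor state: 0, 1, 2
groups = np.repeat(np.arange(n_subjects), windows_per_subject)

maes = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    # Re-weight training windows so under-represented motor states count more.
    sw = compute_sample_weight("balanced", y[train_idx])
    model = Ridge().fit(X[train_idx], y[train_idx], sample_weight=sw)
    maes.append(mean_absolute_error(y[test_idx], model.predict(X[test_idx])))

print("LOSO mean absolute error:", np.mean(maes))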