Abstract
Background:
Off-the-shelf mobile devices have several onboard sensors that may be used for the recognition of Activities of Daily Living (ADL) and the environments in which they are performed. This research focuses on the development of Ambient Assisted Living (AAL) systems, using mobile devices to acquire different types of data related to the physical and physiological conditions of the subjects and their environments. Mobile devices running the Android Operating System are the least expensive and hold the largest market share, while providing a variety of models and onboard sensors.
Objective:
This paper describes the implementation considerations, challenges, and solutions for a framework for the recognition of ADL and environments, provided as an Android library. The framework adapts to the number of sensors available on different mobile devices and uses a variety of activity recognition algorithms to provide rapid feedback to the user.
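As a rough illustration of how such a framework might adapt to the sensors present on a given device, the following Kotlin sketch probes sensor availability through the standard Android SensorManager API. The class and method names are hypothetical and are not the library's actual interface.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorManager

// Hypothetical helper: checks which of the motion sensors the framework can
// use are actually present on this device, so the recognition pipeline can be
// configured accordingly (names here are illustrative, not the library's API).
class SensorAvailability(context: Context) {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    val hasAccelerometer = has(Sensor.TYPE_ACCELEROMETER)
    val hasGyroscope = has(Sensor.TYPE_GYROSCOPE)
    val hasMagnetometer = has(Sensor.TYPE_MAGNETIC_FIELD)

    private fun has(type: Int): Boolean =
        sensorManager.getDefaultSensor(type) != null

    // The sensors found determine the feature set the pipeline can compute;
    // fewer sensors generally means a smaller feature set and lower accuracy.
    fun availableMotionSensors(): List<Int> = buildList {
        if (hasAccelerometer) add(Sensor.TYPE_ACCELEROMETER)
        if (hasGyroscope) add(Sensor.TYPE_GYROSCOPE)
        if (hasMagnetometer) add(Sensor.TYPE_MAGNETIC_FIELD)
    }
}
```

GPS and microphone availability are checked through separate Android services and runtime permissions, which are omitted here for brevity.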
Methods:
The Android library includes data fusion, data processing, feature engineering, and classification methods. The sensors that may be used are the accelerometer, the gyroscope, the magnetometer, the Global Positioning System (GPS) receiver, and the microphone. The data processing includes the application of data cleaning methods and the extraction of features, which are used with Deep Neural Networks (DNN) for the classification of ADL and environments. Throughout this work, the limitations of the mobile devices were explored and their effects minimized.
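The minimal sketch below, also in Kotlin, illustrates the kind of data cleaning and feature extraction stage described above for a single window of accelerometer magnitudes. The moving-average smoothing and the statistical features shown are assumptions chosen for illustration, not the cleaning methods or feature set actually used by the library.

```kotlin
import kotlin.math.sqrt

// Simple moving-average smoothing as a stand-in for the data cleaning step.
fun smooth(samples: DoubleArray, window: Int = 5): DoubleArray =
    DoubleArray(samples.size) { i ->
        val from = maxOf(0, i - window + 1)
        samples.slice(from..i).average()
    }

// Example statistical features computed over one fixed-length window.
data class WindowFeatures(val mean: Double, val stdDev: Double, val max: Double)

fun extractFeatures(samples: DoubleArray): WindowFeatures {
    val mean = samples.average()
    val variance = samples.map { (it - mean) * (it - mean) }.average()
    return WindowFeatures(mean, sqrt(variance), samples.maxOrNull() ?: 0.0)
}

fun main() {
    // Illustrative window of accelerometer magnitudes (m/s^2).
    val raw = doubleArrayOf(9.7, 9.9, 12.3, 10.1, 9.8, 11.4, 9.6, 10.0)
    val features = extractFeatures(smooth(raw))
    // The resulting feature vector would then be fed to the DNN classifier.
    println(features)
}
```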
Results:
The implementation of the Android library achieved an overall accuracy between 58.02% and 89.15%, depending on the number of sensors used and the number of ADL and environments recognized. Compared with results available in the literature, the library's performance showed a mean improvement of 2.93% and, based on Student's t-test, does not differ significantly from the maximum reported in prior work.
Conclusion:
This study shows that ADL such as walking, going upstairs and downstairs, running, watching TV, driving, sleeping, and standing, as well as the bedroom, cooking/kitchen, gym, classroom, hall, living room, bar, library, and street environments, may be recognized with the sensors available in off-the-shelf mobile devices. Finally, these results may serve as preliminary research towards the development of a personal digital life coach based on a multi-sensor mobile device in everyday use.
Keywords
Activities of daily living; Sensors; Mobile devices; Pattern recognition; Data fusion; Android library; Artificial neural networks; Recognition