Developing an advanced human and IoT interaction platform based on wireless sensor networks and embedded IMU systems

Bibliographic record details
Main author: Παπαδόπουλος, Αλέξανδρος Ιωάννης
Other authors: Κουμπιάς, Σταύρος
Format: Thesis
Language: English
Published: 2018
Subjects:
Available online: http://hdl.handle.net/10889/11846
Description
Abstract: The aim of this study is the development and implementation of an advanced human and IoT interaction platform based on Wireless Sensor Networks and Embedded IMU Systems. The application is able to recognize hand gestures and, depending on their characteristics, perform specific actions in an Ambient Assisted Living environment. This provides a service, in a non-invasive way, for people who are not fully self-sufficient in their place of living. The topic has recently received considerable attention because it supports patients and elderly people like few other domains and is of great technological interest. During the development of this study we used a single IMU sensor that transmits its data over BLE (Bluetooth Low Energy) and, by taking advantage of the MQTT IoT communication protocol, we established a fully remote and autonomous application. Using the Python programming language and implementing Madgwick's filter for quaternion-based rotational representation of the tracked object, we achieved accurate and responsive three-dimensional orientation tracking. Furthermore, we drew on neural networks and their various classification techniques, applying a Multilayer Perceptron to recognize the specified gestures. This thesis provides a detailed analysis of the related scientific areas and aims to give the reader the tools to fully comprehend the suggested solution. In the process we found that it is possible to use a low-power, low-cost sensor to achieve state-of-the-art, accurate results. We also combated overfitting while creating our own dataset and applying data augmentation techniques. Finally, we created a platform that classifies gestures in real time and provides the foundation to act upon them. We believe that this application can serve as the foundation for similar ones in the future and aid in the development of more complex Internet of Things and Assisted Living platforms.
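
As a minimal sketch of the orientation-filtering and messaging stage described in the abstract, the snippet below fuses one gyroscope/accelerometer sample with a Madgwick filter and publishes the resulting quaternion over MQTT. The library choices (ahrs for the Madgwick filter, paho-mqtt for MQTT), the broker address, and the topic name are assumptions made for illustration and are not taken from the thesis.

    # Hypothetical sketch: Madgwick orientation filtering + MQTT publishing.
    # ahrs and paho-mqtt are assumed libraries; broker/topic names are made up.
    import json
    import numpy as np
    import paho.mqtt.client as mqtt
    from ahrs.filters import Madgwick

    BROKER = "localhost"             # hypothetical broker address
    TOPIC = "aal/imu/quaternion"     # hypothetical topic name

    madgwick = Madgwick()            # default sampling rate and filter gain
    q = np.array([1.0, 0.0, 0.0, 0.0])   # initial orientation quaternion

    client = mqtt.Client()           # paho-mqtt 1.x style constructor
    client.connect(BROKER, 1883)
    client.loop_start()

    def on_imu_sample(gyr_rad_s, acc_m_s2):
        """Fuse one IMU sample and publish the updated orientation."""
        global q
        q = madgwick.updateIMU(q, gyr=np.array(gyr_rad_s), acc=np.array(acc_m_s2))
        client.publish(TOPIC, json.dumps({"q": q.tolist()}))

    # Example: a single stationary sample (gravity along the z axis).
    on_imu_sample([0.0, 0.0, 0.0], [0.0, 0.0, 9.81])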
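
In the same spirit, the following is a short sketch of the gesture-classification stage, assuming a scikit-learn Multilayer Perceptron trained on flattened IMU feature windows with simple noise-based augmentation; the network size, feature layout, gesture classes, and augmentation parameters are purely illustrative and not drawn from the thesis.

    # Hypothetical sketch: MLP gesture classifier with jitter-based augmentation.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Placeholder dataset: 100 gesture windows, each flattened to 200 features.
    X = rng.normal(size=(100, 200))
    y = rng.integers(0, 4, size=100)      # four made-up gesture classes

    # Data augmentation by jittering: add small Gaussian noise to each window.
    X_aug = np.vstack([X, X + rng.normal(scale=0.05, size=X.shape)])
    y_aug = np.concatenate([y, y])

    clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500)
    clf.fit(X_aug, y_aug)

    # Real-time use: classify the most recent window of fused IMU features.
    print(clf.predict(X[:1]))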