An experimental multimodal system designed for the polysensory diagnosis and stimulation of persons with impaired communication skills, including non-communicative subjects, is presented. The user interface combines an eye-tracking device with EEG monitoring of the subject. The system also comprises a device for objective hearing testing and an autostereoscopic projection system that stimulates subjects by immersing them in a virtual environment. Data analysis methods are described, and experiments on the classification of mental states during listening exercises and audio-visual stimulation are presented and discussed. Feature extraction was based on the discrete wavelet transform, and clustering was performed with the k-means algorithm. All algorithms were implemented in the Python programming language using open-source libraries. Tests of the proposed system were carried out in a Special School and Educational Center in Kościerzyna, Poland. The results, together with a comparison against data gathered from a control group of healthy people, are presented and discussed.
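The abstract names the processing pipeline only at a high level: wavelet-based feature extraction followed by k-means clustering, implemented in Python with open-source libraries. A minimal sketch of such a pipeline is shown below; the choice of PyWavelets and scikit-learn, the `db4` wavelet, the decomposition level, and the use of per-sub-band energies as features are all assumptions for illustration, not details taken from the paper.

```python
import numpy as np
import pywt                      # assumed DWT library (PyWavelets)
from sklearn.cluster import KMeans  # assumed clustering library

def dwt_features(signal, wavelet="db4", level=4):
    """Decompose a 1-D signal with the discrete wavelet transform and
    return the energy of each sub-band as a feature vector.
    Wavelet family and level are illustrative choices."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])

# Synthetic stand-in data: two groups of signals with different
# dominant frequencies (the paper used EEG recordings instead).
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
signals = [np.sin(2 * np.pi * f * t) + 0.1 * rng.standard_normal(t.size)
           for f in (5,) * 10 + (40,) * 10]

# Feature matrix: one row of sub-band energies per signal.
features = np.vstack([dwt_features(s) for s in signals])

# Unsupervised grouping of the feature vectors with k-means.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
```

Because the two synthetic frequency groups concentrate their energy in different wavelet sub-bands, the resulting feature vectors separate cleanly into two clusters; with real EEG data, preprocessing (artifact removal, epoching) would precede this step.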
Authors
- dr inż. Adam Kurowski,
- dr inż. Piotr Odya,
- dr hab. inż. Piotr Szczuko,
- dr inż. Michał Lech,
- mgr inż. Paweł Spaleniak,
- prof. dr hab. inż. Bożena Kostek,
- prof. dr hab. inż. Andrzej Czyżewski
Additional information
- DOI
- 10.1007/978-3-319-60438-1_5
- Category
- Conference activity
- Type
- Conference proceedings indexed in Web of Science
- Language
- English
- Publication year
- 2017