This is my primary project at the NeuroLoops lab at Georgia Tech. The study aims to quantitatively model the sensorimotor control processes underlying hand movements and to use real-time machine learning models for behavioral prediction. The ultimate goal is to predict performance outcomes before a dart-throwing action is completed, so that appropriate corrective stimuli can be applied to support user adaptation during learning. The research consists of three components: the design of multimodal hardware devices, the development of real-time predictive models, and the design of a virtual reality closed-loop interaction system.
1. Design of Multimodal Hardware Devices
We use the Senso.me glove in combination with a Valve tracker to determine the world coordinates of the finger joints. Computer-vision methods were not adopted because of unresolved issues such as prediction jitter during rapid movements and the inability to capture motion outside the camera's field of view. In addition, we independently developed and fabricated a 16-channel EMG wristband to record electromyographic signals from the wrist flexor muscles. In parallel, we recorded users' gaze positions, pupil diameters, and head positions within the VR environment to construct behavioral models.
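A minimal sketch of how such multimodal acquisition could be synchronized, assuming each device (EMG wristband, eye tracker, glove/tracker pose) publishes a Lab Streaming Layer (LSL) stream; the stream types below are illustrative assumptions, not the devices' actual identifiers.

```python
# Sketch: timestamp-aligned multimodal recording via LSL (assumed setup).
from pylsl import StreamInlet, resolve_byprop

def open_inlet(stream_type, timeout=5.0):
    streams = resolve_byprop('type', stream_type, timeout=timeout)
    if not streams:
        raise RuntimeError(f'no LSL stream of type {stream_type!r} found')
    return StreamInlet(streams[0])

emg = open_inlet('EMG')    # 16-channel wristband (assumed stream type)
gaze = open_inlet('Gaze')  # eye position + pupil diameter (assumed)
pose = open_inlet('Pose')  # head / hand world coordinates (assumed)

records = []
while len(records) < 1000:
    # LSL timestamps share a common clock, so samples from different
    # devices can be aligned offline by timestamp.
    for name, inlet in (('emg', emg), ('gaze', gaze), ('pose', pose)):
        sample, ts = inlet.pull_sample(timeout=0.0)
        if sample is not None:
            records.append((name, ts, sample))
```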
2. Development of Real-Time Predictive Models
We collected 30 minutes of motor performance data from 125 participants across six variables. Principal Component Analysis (PCA) was applied for dimensionality reduction, followed by a convolutional neural network (CNN) for temporal feature extraction. Finally, a three-layer Long Short-Term Memory (LSTM) model was used to predict dart release timing.
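A minimal sketch of this pipeline: PCA for dimensionality reduction, a 1-D CNN for local temporal features, and a three-layer LSTM regressing the release time. The window length, channel counts, and layer sizes are illustrative assumptions, not the study's actual hyperparameters.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.decomposition import PCA

WINDOW, CHANNELS, N_COMPONENTS = 200, 64, 16  # assumed shapes

class ReleaseTimePredictor(nn.Module):
    def __init__(self, in_dim=N_COMPONENTS, hidden=128):
        super().__init__()
        # Conv1d expects (batch, channels, time): extract local temporal features.
        self.cnn = nn.Sequential(
            nn.Conv1d(in_dim, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2), nn.ReLU(),
        )
        # Three stacked LSTM layers model longer-range dynamics.
        self.lstm = nn.LSTM(64, hidden, num_layers=3, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # scalar release-time estimate

    def forward(self, x):                 # x: (batch, time, features)
        z = self.cnn(x.transpose(1, 2))   # -> (batch, 64, time)
        out, _ = self.lstm(z.transpose(1, 2))
        return self.head(out[:, -1])      # predict from the last time step

# Fit PCA on flattened training frames, then apply it per time step.
frames = np.random.randn(10_000, CHANNELS)             # placeholder data
pca = PCA(n_components=N_COMPONENTS).fit(frames)
window = pca.transform(np.random.randn(WINDOW, CHANNELS))
pred = ReleaseTimePredictor()(torch.tensor(window[None], dtype=torch.float32))
```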
3. Design of a Virtual Reality Closed-Loop Interaction System
We implemented corrective feedback for eye-movement behavior by displaying a correction vector derived from the previous throw's outcome, facilitating user learning. In addition, we visualized the predicted trajectory curve in real time to further support the learning process.
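A minimal sketch of the corrective-feedback step, assuming the system knows the target position and the previous throw's landing point in world coordinates. The proportional gain and clipping threshold are illustrative assumptions, not the system's tuned values.

```python
import numpy as np

def correction_vector(target, prev_hit, gain=0.6, max_norm=0.15):
    """Vector (in metres) to overlay in the headset, pointing from the
    previous landing point toward the target, scaled and clipped so the
    cue suggests a small adjustment rather than the full error."""
    error = np.asarray(target, float) - np.asarray(prev_hit, float)
    vec = gain * error
    norm = np.linalg.norm(vec)
    return vec if norm <= max_norm else vec * (max_norm / norm)

# Example: the last dart landed 8 cm left of and 5 cm below the bullseye.
print(correction_vector(target=[0.0, 1.73, 2.37], prev_hit=[-0.08, 1.68, 2.37]))
```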
Keywords: Brain-Computer Interaction, Virtual Reality, Human-Computer Interaction, Ubiquitous Computing
Project Type: Research project led by the NeuroLoops Laboratory, Georgia Institute of Technology
Time: 2025.1-Present
Instructor: Dr. Tansu Celikel
Main Contributions:
1. Independently developed a multimodal VR dart-throwing training system in Python and C#, integrating Senso.me gloves (IMU), a 16-channel OpenBCI EMG, an 8-channel EEG, and a Varjo headset for motion-data collection
2. Independently designed and manufactured a TPU 3D-printed wrist stabilizer to reduce motion artifacts between the multimodal hardware modules; implemented a damping system and a hysteresis-error recalibration method to effectively mitigate IMU sensor drift (see the sketch below)
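A minimal sketch of one common drift-mitigation pattern consistent with the recalibration step above: when the wrist is detected to be at rest, re-estimate the gyroscope bias and subtract it from subsequent readings. The thresholds and window sizes are illustrative assumptions, not the implemented values.

```python
import numpy as np

REST_GYRO_THRESH = 0.02   # rad/s, assumed stillness threshold
REST_WINDOW = 100         # samples that must stay below threshold

class GyroBiasRecalibrator:
    def __init__(self):
        self.bias = np.zeros(3)
        self._rest_buf = []

    def update(self, gyro_sample):
        """Return the bias-corrected gyro sample; re-zero the bias whenever
        a full window of near-still samples has been observed."""
        corrected = np.asarray(gyro_sample, float) - self.bias
        if np.linalg.norm(corrected) < REST_GYRO_THRESH:
            self._rest_buf.append(np.asarray(gyro_sample, float))
            if len(self._rest_buf) >= REST_WINDOW:
                # At rest the true angular rate is ~0, so the mean reading
                # is a fresh estimate of the gyro bias.
                self.bias = np.mean(self._rest_buf, axis=0)
                self._rest_buf.clear()
        else:
            self._rest_buf.clear()
        return corrected
```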
Skills: C#, Python, Eye Tracking, EEG, EMG, Signal Processing, Wearable Design