How can we leverage biosensor data and AI to create personalized and adaptive VR training experiences?

Role: Designer & Developer

Team: Jordan Clark, CJ Connett, Cyril Medabalimi, Elan Grossman

Duration: 3 Days

Tools: Unity, Rhino 3D, Figma

Problem

Traditional Standard Operating Procedure (SOP) training in biopharma relies on reading lengthy PDFs, a method that is unengaging, difficult to retain, and prone to errors. Employees are expected to learn complex workflows and operate machinery with little interactive or adaptive reinforcement, leading to mistakes, inefficiencies, and compliance risks.

My team and I created Optl over the course of 3 days at the MIT Reality Hack and placed 1st in our category for OpenBCI’s Pioneering a Neuroadaptive Future track.

Static training is insufficient in high-stakes industries

Optl is a neuroadaptive VR training platform that personalizes learning using real-time EEG analysis and machine learning. It dynamically adjusts training scenarios based on user stress levels, cognitive engagement, and physiological responses, making learning more effective, immersive, and data-driven.

Solution

Leveraging bio data for SOP and training optimization

We leveraged OpenBCI’s Galea headset and its biosensor data to create a real-time adaptive training experience, using EEG brainwave activity, heart rate variability (HRV), and other biometric signals to assess a user's cognitive state. As trainees progressed through the VR simulation, the system continuously monitored their stress levels, focus, and cognitive load, adjusting the training scenario accordingly.
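To make the idea concrete, here is a minimal sketch of how biometric signals can be distilled into a single cognitive-state score. This is an illustrative simplification, not our production code: the beta/alpha engagement proxy, the RMSSD baseline, and the blend weights are all placeholder assumptions, and a real pipeline would calibrate them per user against Galea's actual signal streams.

```python
def rmssd(rr_intervals_ms):
    """Heart rate variability via RMSSD: root mean square of successive
    differences between consecutive R-R intervals (in milliseconds)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

def cognitive_load_index(beta_power, alpha_power, rr_intervals_ms,
                         baseline_rmssd=45.0):
    """Blend an EEG engagement proxy (beta/alpha band-power ratio) with an
    HRV stress proxy (RMSSD relative to a resting baseline) into a 0-1
    score. Weights and baseline here are illustrative, not calibrated."""
    engagement = beta_power / max(alpha_power, 1e-9)
    hrv = rmssd(rr_intervals_ms)
    stress = max(0.0, 1.0 - hrv / baseline_rmssd)  # lower HRV -> more stress
    # Weighted blend, clamped to [0, 1]
    score = 0.6 * min(engagement / 2.0, 1.0) + 0.4 * stress
    return min(max(score, 0.0), 1.0)
```

In practice this score would be recomputed over a sliding window as each new EEG epoch and heartbeat arrives, smoothing out momentary spikes before the simulation reacts.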

If a user showed signs of overload or disengagement, the system could slow down, provide additional guidance, or introduce adaptive cues to reinforce learning. Conversely, if a user demonstrated high confidence and efficiency, the simulation could increase complexity or introduce new challenges to keep engagement high. This dynamic feedback loop ensured that training was personalized, responsive, and optimized for knowledge retention and real-world performance.
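The decision side of that feedback loop can be sketched as a simple policy that maps the trainee's current state to a scenario adjustment. The threshold values and enum names below are hypothetical placeholders for illustration; a deployed system would calibrate them per user from baseline sessions.

```python
from enum import Enum

class Adjustment(Enum):
    SLOW_DOWN = "slow pacing and add guidance cues"
    HOLD = "keep current pacing"
    SPEED_UP = "raise complexity or introduce new challenges"

def next_adjustment(load, engagement,
                    overload_threshold=0.75,
                    disengaged_threshold=0.30):
    """Map a 0-1 cognitive load score and 0-1 engagement score to a
    training adjustment. Thresholds are illustrative placeholders."""
    if load > overload_threshold or engagement < disengaged_threshold:
        return Adjustment.SLOW_DOWN   # overloaded or drifting: reinforce
    if load < 0.35 and engagement > 0.7:
        return Adjustment.SPEED_UP    # confident and focused: challenge
    return Adjustment.HOLD
```

The point of keeping the policy this legible is that trainers and compliance reviewers can audit exactly why the simulation changed pace at any moment.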

User Testing

We tested the VR training simulation with a diverse group of users, gathering feedback and iteratively refining the experience. Pairing spatial analytics from Cognitive3D’s dashboard with the EEG data gave us additional insight into how users moved through and engaged with the training scene.

Takeaways

  • I took on the roles of designer, developer, and project manager, leading the team to deliver a fully functional solution under tight time constraints. I narrowed the scope to something feasible and impactful while ensuring the project stayed on track.

  • From asset creation in 3D modeling software to prototyping in Unity and integrating biosensor data, I was responsible for the entire development pipeline, ensuring every detail aligned with our vision.

  • I ensured that our solution wasn’t just technically sound, but also aligned with business objectives and scalability, demonstrating my ability to balance strategy and execution seamlessly.

Business Pitch/Teaser Video

Full VR Training Walkthrough