At BrainCo, I developed a series of STEM learning projects that integrate the company's core products—the NeuroMaker Hand and the BCI Brainband—with a range of sensors (EMG, flex, color, sound, button, rotary) and external platforms such as Micro:bit and a robot dog. The result is a set of cross-disciplinary experiences that link neuroscience, engineering, and computer science. These integrations help students see how brainwave data and engineering tools can interact with both digital and physical systems, deepening their understanding of emerging technologies.
Each project is designed as a student-initiated, inquiry-driven experience, typically lasting 90–120 minutes. This structure allows learners to explore, test, and iterate, while also building confidence in their problem-solving skills. The projects connect across disciplines—biology, computer science, engineering, and neuroscience—and align with Career and Technical Education (CTE) pathways in health science, information technology, and STEM. Activities are mapped to state-level CTE standards in Florida, Ohio, and California, ensuring both creativity and curricular alignment.
The design process focused on several priorities:
Integration: Connecting sensors, coding, and brainwave data with real-world biomedical and engineering applications.
Continuous Improvement: Refining activities through classroom feedback to enhance clarity, interactivity, and overall impact.
Student Agency: Encouraging ownership through hands-on building, coding, experimentation, and collaboration.
Together, these projects show how emerging technologies can deliver both knowledge and emotional engagement. They demonstrate how STEM can be experienced not only as subjects to study, but also as tools to create, connect, and imagine future careers—preparing students with the skills, curiosity, and confidence needed for the paths ahead.
NeuroMaker BCI + Micro:bit: Turn Your Attention into Light, Text, and Sound is a dynamic and beginner-friendly project that introduces students to brain-computer interface (BCI) technology through creative, real-time applications. Using the NeuroMaker BCI headband and Micro:bit, students translate their attention data into interactive outputs like blinking lights, scrolling text, and playful sound effects. This course not only demystifies neuroscience and coding but also empowers learners to see how their own brain activity can be harnessed to control digital devices—bridging the gap between thought and technology in a hands-on, exciting way.
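The sketch below outlines the attention-threshold logic behind these outputs. It is written in Arduino-style C++ for consistency with the later projects (the classroom version is built with MakeCode blocks on the micro:bit), and the serial input format, pin choices, and threshold values are illustrative assumptions rather than the official project code.

    // Attention-to-output sketch (assumed wiring: LED on pin 13, piezo buzzer on
    // pin 8; the attention score 0-100 arrives over USB serial, one value per line).
    const int LED_PIN = 13;
    const int BUZZER_PIN = 8;

    void setup() {
      pinMode(LED_PIN, OUTPUT);
      Serial.begin(9600);
    }

    void loop() {
      if (Serial.available() > 0) {
        String line = Serial.readStringUntil('\n');
        int attention = constrain(line.toInt(), 0, 100);

        if (attention > 70) {                 // high focus: light, sound, and text
          digitalWrite(LED_PIN, HIGH);
          tone(BUZZER_PIN, 880, 200);         // short celebratory beep
          Serial.println("Great focus!");
        } else if (attention > 40) {          // medium focus: light only
          digitalWrite(LED_PIN, HIGH);
        } else {                              // low focus: everything off
          digitalWrite(LED_PIN, LOW);
        }
      }
    }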
Brain Breeze Kick-off: Design Your Brain-Powered Fan is an intermediate BCI project where students use the NeuroMaker brainband and Arduino components to build a real fan that spins faster as their attention increases. By connecting and coding a circuit with a motor driver and a DC fan, learners translate their focus levels into physical airflow and visualize it through a fun bubble-blowing display. This project introduces more complex wiring and coding skills while encouraging creativity through add-ons like flags, pinwheels, or paper boats—all driven by the power of the mind.
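A minimal sketch of the control loop is shown below, assuming the attention score (0-100) is forwarded to the Arduino over USB serial and the DC fan is wired through a motor driver whose speed/PWM input sits on pin 9; the wiring and serial format are assumptions, not the official kit configuration.

    // Fan-speed sketch: focus level in, airflow out.
    const int FAN_PWM_PIN = 9;   // PWM pin into the motor driver's speed input

    void setup() {
      pinMode(FAN_PWM_PIN, OUTPUT);
      Serial.begin(9600);
    }

    void loop() {
      if (Serial.available() > 0) {
        String line = Serial.readStringUntil('\n');
        int attention = constrain(line.toInt(), 0, 100);

        // Map the 0-100 attention score onto the 0-255 PWM range: the harder
        // the student focuses, the faster the fan spins.
        int speed = map(attention, 0, 100, 0, 255);
        analogWrite(FAN_PWM_PIN, speed);
      }
    }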
Focus & Spin: Design Your Own Brainwave Display is an interactive STEM project that empowers students to transform real-time brainwave data into physical motion. Using the NeuroMaker BCI headset and Arduino technology, learners build and code a servo motor circuit that reflects their attention levels through a custom-designed moving dial. Blending neuroscience, engineering, and creativity, this beginner-friendly course helps students explore brain-computer interfaces (BCIs) while designing their very first brain-powered gadget.
Design a Brain-Powered Prosthetic Hand is an advanced hands-on project where students step into the role of biomedical engineers to control a prosthetic hand using their brainwaves. Inspired by real BCI research at Harvard iLab, this experience combines neuroscience, coding, and hardware engineering as learners use a NeuroMaker BCI headband to manipulate all five fingers of a NeuroMaker Hand. Through this project, students deepen their understanding of servo motor control, explore the intersection of brain-computer interfaces and assistive technology, and gain insight into real-world applications of prosthetic innovation.
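The sketch below illustrates the core control loop under simplifying assumptions: each finger is driven by its own servo on pins 3-7, the attention score streams in over serial, and 150 degrees stands in for a fully curled finger. None of these details are taken from the official project files.

    #include <Servo.h>

    const int FINGER_PINS[5] = {3, 4, 5, 6, 7};   // thumb through pinky (assumed)
    Servo fingers[5];

    void setup() {
      Serial.begin(9600);
      for (int i = 0; i < 5; i++) {
        fingers[i].attach(FINGER_PINS[i]);
        fingers[i].write(0);                      // start with the hand open
      }
    }

    void loop() {
      if (Serial.available() > 0) {
        String line = Serial.readStringUntil('\n');
        int attention = constrain(line.toInt(), 0, 100);

        // Higher attention curls all five fingers toward a fist; relaxing
        // lets the hand reopen.
        int angle = map(attention, 0, 100, 0, 150);
        for (int i = 0; i < 5; i++) {
          fingers[i].write(angle);
        }
      }
    }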
This project explores the integration of electromyography (EMG) sensor data with programmable prosthetic devices. I designed an interactive learning experience where users collect and interpret real-time muscle activity to control the NeuroMaker Hand. The module introduces core bioengineering concepts through applied coding and hardware interfacing, emphasizing the real-world relevance of neural signals in assistive technology. The goal was to create an engaging, STEM-rich environment that connects scientific understanding with hands-on experimentation.
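A minimal sketch of the muscle-to-motion loop is shown below, assuming the EMG sensor outputs a smoothed envelope on analog pin A0 and a single grip servo sits on pin 9; the 400-count threshold is a placeholder that students would calibrate against their own readings.

    #include <Servo.h>

    const int EMG_PIN = A0;
    const int SERVO_PIN = 9;
    const int THRESHOLD = 400;   // flex above this ADC reading closes the hand

    Servo grip;

    void setup() {
      grip.attach(SERVO_PIN);
      Serial.begin(9600);
    }

    void loop() {
      int emg = analogRead(EMG_PIN);   // 0-1023 muscle-activity reading
      Serial.println(emg);             // stream values so students can plot them

      if (emg > THRESHOLD) {
        grip.write(150);               // contraction detected: close the hand
      } else {
        grip.write(0);                 // relaxed: open the hand
      }
      delay(50);                       // roughly 20 readings per second
    }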
An interactive gesture-matching game powered by flex sensors, LED indicators, and a button interface, designed to control and compare hand movements using the NeuroMaker Hand. The project combines sensor-based input with block-based programming to evaluate gesture accuracy in real time. It demonstrates how wearable inputs and robotics can be integrated to create responsive, educational STEM challenges.
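The classroom version is built with block-based code; the Arduino-style sketch below captures the same matching logic under assumed wiring (two flex sensors on A0/A1, a button on pin 2, green/red LEDs on pins 12/13) and an assumed tolerance of 80 ADC counts.

    const int FLEX_PINS[2] = {A0, A1};
    const int BUTTON_PIN = 2;
    const int GREEN_LED = 12;
    const int RED_LED = 13;
    const int TOLERANCE = 80;    // how far a reading may drift and still "match"

    int target[2];               // flex readings captured as the gesture to copy

    void setup() {
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      pinMode(GREEN_LED, OUTPUT);
      pinMode(RED_LED, OUTPUT);
    }

    void loop() {
      // Pressing the button records the current hand pose as the target gesture.
      if (digitalRead(BUTTON_PIN) == LOW) {
        for (int i = 0; i < 2; i++) target[i] = analogRead(FLEX_PINS[i]);
        delay(300);              // crude debounce
      }

      // Compare the live pose against the target and light the matching LED.
      bool match = true;
      for (int i = 0; i < 2; i++) {
        int reading = analogRead(FLEX_PINS[i]);
        if (abs(reading - target[i]) > TOLERANCE) match = false;
      }
      digitalWrite(GREEN_LED, match ? HIGH : LOW);
      digitalWrite(RED_LED, match ? LOW : HIGH);
      delay(50);
    }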
An interactive Red-Light-Green-Light game utilizing ultrasonic sensors for motion detection and LED indicators to signal game states. The system is programmed through block-based coding to control the NeuroMaker Hand, simulating a real-time stop-and-go challenge. The project demonstrates the integration of sensor data, robotic actuation, and logic-based game mechanics in a STEM learning context.
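The project itself is coded in blocks; the sketch below shows an equivalent game loop in Arduino-style C++, assuming an HC-SR04-style sensor on pins 7 (trig) and 6 (echo), green/red LEDs on pins 12/13, and a 5 cm movement allowance during red light (all assumptions). The command to the NeuroMaker Hand is left as a comment, since its interface is product-specific.

    const int TRIG_PIN = 7, ECHO_PIN = 6;
    const int GREEN_LED = 12, RED_LED = 13;

    long readDistanceCm() {
      // Standard ultrasonic ping: 10 us trigger pulse, then time the echo.
      digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      long duration = pulseIn(ECHO_PIN, HIGH, 30000);   // microseconds (0 = timeout)
      return duration / 58;                             // convert to centimeters
    }

    void setup() {
      pinMode(TRIG_PIN, OUTPUT); pinMode(ECHO_PIN, INPUT);
      pinMode(GREEN_LED, OUTPUT); pinMode(RED_LED, OUTPUT);
    }

    void loop() {
      // Green light: players may move freely for three seconds.
      digitalWrite(GREEN_LED, HIGH); digitalWrite(RED_LED, LOW);
      delay(3000);

      // Red light: take a reference reading, then watch for movement.
      digitalWrite(GREEN_LED, LOW); digitalWrite(RED_LED, HIGH);
      long reference = readDistanceCm();
      unsigned long start = millis();
      while (millis() - start < 3000) {
        long current = readDistanceCm();
        if (abs(current - reference) > 5) {
          // Movement caught: this is where the NeuroMaker Hand would be
          // commanded (e.g. to point), and the red LED flashes as the signal.
          for (int i = 0; i < 6; i++) {
            digitalWrite(RED_LED, i % 2 == 0 ? LOW : HIGH);
            delay(200);
          }
          break;
        }
        delay(100);
      }
    }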
A sensor-based control system using a button, rotary encoder, and ultrasonic distance sensor to enable automated object detection and grasping with the NeuroMaker Hand. The project integrates real-time distance measurements with programmed responses, allowing the prosthetic hand to detect nearby objects and close automatically when they enter a defined range. Highlights include sensor coordination, actuator control, and practical applications of automation in assistive technology.
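A sketch of the auto-grasp firmware under assumed wiring: ultrasonic sensor on pins 7/6, grip servo on pin 9, arm/disarm button on pin 2, and a rotary encoder (CLK on 3, DT on 4) that tunes the grasp distance. Pin numbers, angles, and the simple polled encoder handling are illustrative choices rather than the kit's official code.

    #include <Servo.h>

    const int TRIG_PIN = 7, ECHO_PIN = 6;
    const int SERVO_PIN = 9, BUTTON_PIN = 2;
    const int ENC_CLK = 3, ENC_DT = 4;

    Servo grip;
    bool armed = false;
    int graspDistance = 10;      // centimeters; adjustable with the encoder
    int lastClk = HIGH;

    long readDistanceCm() {
      digitalWrite(TRIG_PIN, LOW);  delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH); delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      return pulseIn(ECHO_PIN, HIGH, 30000) / 58;   // 0 means no echo received
    }

    void setup() {
      pinMode(TRIG_PIN, OUTPUT); pinMode(ECHO_PIN, INPUT);
      pinMode(BUTTON_PIN, INPUT_PULLUP);
      pinMode(ENC_CLK, INPUT_PULLUP); pinMode(ENC_DT, INPUT_PULLUP);
      grip.attach(SERVO_PIN);
      grip.write(0);             // start with the hand open
    }

    void loop() {
      // The button toggles automatic grasping on and off.
      if (digitalRead(BUTTON_PIN) == LOW) { armed = !armed; delay(300); }

      // Turning the encoder nudges the grasp distance up or down (simple polling).
      int clk = digitalRead(ENC_CLK);
      if (clk != lastClk && clk == LOW) {
        graspDistance += (digitalRead(ENC_DT) != clk) ? 1 : -1;
        graspDistance = constrain(graspDistance, 3, 30);
      }
      lastClk = clk;

      // When armed, close the hand as soon as an object enters range.
      long d = readDistanceCm();
      if (armed && d > 0 && d < graspDistance) {
        grip.write(150);         // assumed "closed" angle
      } else {
        grip.write(0);
      }
      delay(20);
    }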
A storytelling project integrating the NeuroMaker Hand 2.0 and BioSensor Kit to animate glove puppets through gesture-controlled movement. By combining sensor input and programmable motion, the project bridges robotics and literature, transforming classic narratives into interactive performances. Highlights the creative application of engineering concepts, wearable sensors, and motion control in a performative context.
The NeuroMaker Hand 2.0 curriculum offers six comprehensive modules, each designed to teach students hands-on skills in mechanical and electrical engineering, biomedical exploration, programming, and artificial intelligence. Through interactive lessons, students engage in problem-solving and critical thinking while exploring real-world applications in fields such as manufacturing, bioengineering, and AI development.