Muse for Cursor Control (Motor Cortex? Visual Focus?)


Hi guys,

I am trying to control a cursor with the Muse. So far, I've made an Android app with a turtle that moves as follows.

Blink: Move ahead one step
Jaw Clench: Turn 90 degrees clockwise
Close Eyes: Bring turtle back to center.
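The three mappings above could be sketched as a tiny event handler. This is just an illustrative sketch, not my actual app code; the event names and step size are made up:

```python
# Hypothetical blink/clench/eyes-closed turtle mapping (not the Muse SDK API).
import math

class Turtle:
    def __init__(self, step=10.0):
        self.x = self.y = 0.0
        self.heading = 0.0   # degrees, 0 = up, clockwise positive
        self.step = step

    def on_event(self, event):
        if event == "blink":             # move ahead one step
            rad = math.radians(self.heading)
            self.x += self.step * math.sin(rad)
            self.y += self.step * math.cos(rad)
        elif event == "jaw_clench":      # turn 90 degrees clockwise
            self.heading = (self.heading + 90) % 360
        elif event == "eyes_closed":     # bring turtle back to center
            self.x = self.y = 0.0
            self.heading = 0.0

t = Turtle()
for e in ["blink", "jaw_clench", "blink"]:
    t.on_event(e)
print(t.x, t.y, t.heading)  # one step up, turn right, one step right
```

The detector callbacks (however you implement them) would just feed `on_event` with the classified label.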

I detect closed eyes just by putting a simple threshold on absolute alpha from TP9 (EEG1). When you close your eyes, alpha power increases, and it can easily be detected with a threshold (or a more robust algorithm).
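For anyone who wants to replicate this, here is a minimal sketch of the alpha-threshold idea, assuming a 220 Hz sampling rate and NumPy. The threshold value is arbitrary and would need per-user calibration:

```python
# Eyes-closed detection via alpha-band (8-12 Hz) power on one EEG channel.
# FS and THRESHOLD are assumptions; tune them against your own recordings.
import numpy as np

FS = 220             # Muse EEG sampling rate, Hz (assumed)
ALPHA = (8.0, 12.0)  # alpha band, Hz
THRESHOLD = 25.0     # empirical; calibrate per user

def alpha_power(window):
    """Mean power in the alpha band of a 1-D EEG window."""
    window = window - np.mean(window)             # remove DC offset
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    band = (freqs >= ALPHA[0]) & (freqs <= ALPHA[1])
    return spectrum[band].mean() / len(window)

def eyes_closed(window):
    return alpha_power(window) > THRESHOLD

# Synthetic demo: a strong 10 Hz sine should trip the detector,
# low-amplitude noise should not.
t = np.arange(FS) / FS
alpha_burst = 50 * np.sin(2 * np.pi * 10 * t)
noise = np.random.default_rng(0).standard_normal(FS)
print(eyes_closed(alpha_burst), eyes_closed(noise))  # prints "True False"
```

In practice you would run this on a sliding one-second window of the raw channel, or simply threshold the absolute-alpha value the Muse already streams.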

I need a few more brain-related events (not artifacts) to control, let's say, a cursor, a toy car, etc. with any triggerable signal. People have used EEG to detect the intention to move one's hand, or visual focus; has anybody tried to develop any such algorithms for the Muse?

Even if you can only share your unsuccessful attempts, it would really help me and the other people following this forum.


I thought about making such an app to control the mouse and keyboard. I started collecting Muse signals while thinking about a letter and analysed the fft0-3 data, but found no correlation… The clearest signals are eye blinks and movements in eeg0-3…
My opinion: to control mouse movement, the brain would need to use areas involved in motor function. When we use a keyboard, we use our fingers, which is motor function. But there is no such area in our brain for this yet; it would have to be trained until it appears, the same way we learn keyboard typing.


I agree the app would need to be calibrated and trained, but the training would go faster if more people had the Muse and shared their data, along with the real-life events that caused the random noise during training.

Does anyone know who coded Stephen Hawking's apparatus to communicate the way he does? I'd imagine it's doing the same kind of thing. If that's the case, it may be much faster to just… maybe… use our fingers?


I researched Stephen Hawking's apparatus. It is a motion sensor that measures the movement of his cheek muscle and drives a UI that autocompletes, learns frequently used words, and so on.

Dimitriy, there are areas in our brain where alpha-band waves correspond to motor function; over those areas they are called mu waves. Though I have not tried measuring at different positions on the scalp with the Muse, and I'm not sure how that would be possible.
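If anyone does manage to get electrodes near the sensorimotor strip, the standard trick is mu-rhythm event-related desynchronization (ERD): 8-13 Hz power over motor cortex drops during real or imagined movement relative to a rest baseline. A hedged sketch of that comparison, assuming a 220 Hz sampling rate (the Muse has no central C3/C4 electrodes, so this is illustrative only):

```python
# Mu-rhythm ERD sketch: compare 8-13 Hz power between rest and movement.
# FS is an assumption; real use needs central electrodes and calibration.
import numpy as np

FS = 220  # assumed sampling rate, Hz

def mu_power(window, lo=8.0, hi=13.0):
    """Mean mu-band power of a 1-D EEG window."""
    window = window - np.mean(window)
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    band = (freqs >= lo) & (freqs <= hi)
    return spectrum[band].mean()

def erd_percent(rest, active):
    """Percent drop in mu power from rest to active; positive = ERD."""
    p_rest, p_act = mu_power(rest), mu_power(active)
    return 100.0 * (p_rest - p_act) / p_rest

# Synthetic demo: strong 10 Hz rhythm at rest, attenuated during movement.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
rest = 10 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(FS)
active = 3 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(FS)
print(erd_percent(rest, active))  # large positive value = clear ERD
```

A cursor trigger would then fire when ERD stays above some calibrated percentage for a few consecutive windows.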

A blink-and-clench cursor is really possible (Blink: forward, Clench: rotate heading), but it's not good for completely disabled people, and how would you click?

I came across a technique called SSVEP, steady-state visually evoked potentials (please take a moment to search YouTube if you don't know what it is). Has anybody implemented it with the Muse?
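For context, SSVEP works by showing targets that flicker at distinct frequencies; looking at one produces an EEG peak at that frequency (and harmonics) over visual cortex. The Muse's electrodes sit frontally and temporally rather than occipitally, so SNR may be poor. A rough sketch of the simplest FFT-peak classifier, with assumed frequencies and sampling rate (real implementations often use fancier methods like canonical correlation analysis):

```python
# Naive SSVEP classifier: pick the target whose flicker frequency has the
# strongest spectral peak. FS and TARGETS are assumptions for illustration.
import numpy as np

FS = 220                                 # assumed sampling rate, Hz
TARGETS = {"left": 7.0, "right": 13.0}   # flicker frequencies, Hz

def ssvep_classify(window):
    """Return the target name with the strongest peak at its frequency."""
    window = window - np.mean(window)
    spectrum = np.abs(np.fft.rfft(window))
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)

    def peak(f):
        # strongest bin within +/- 0.5 Hz of the flicker frequency
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)
        return spectrum[band].max()

    return max(TARGETS, key=lambda name: peak(TARGETS[name]))

# Demo: simulate 2 s of looking at the 13 Hz target.
t = np.arange(2 * FS) / FS
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * 13 * t) + 0.3 * rng.standard_normal(len(t))
print(ssvep_classify(signal))  # prints "right"
```

With two or more flicker targets on screen, this gives exactly the missing "click" or direction-select event discussed above, if the signal is detectable at the Muse's electrode positions at all.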