I am trying to control a cursor with the Muse headband. So far, I've made an Android app with a turtle that moves as follows:
Blink: Move ahead one step
Jaw Clench: Turn 90 degrees clockwise
Close Eyes: Bring the turtle back to the center.
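For anyone curious, the event-to-action mapping above boils down to a tiny state machine. Here is a minimal sketch in Python (the event names and step size are my own labels, not from the Muse SDK; the real app is on Android):

```python
import math

class Turtle:
    """Toy turtle driven by detected Muse events (hypothetical event labels)."""

    def __init__(self):
        self.x = self.y = 0.0
        self.heading = 0.0  # degrees; 0 = straight up

    def handle(self, event):
        if event == "blink":
            # Move one step in the current heading direction
            rad = math.radians(self.heading)
            self.x += math.sin(rad)
            self.y += math.cos(rad)
        elif event == "jaw_clench":
            # Turn 90 degrees clockwise
            self.heading = (self.heading + 90) % 360
        elif event == "eyes_closed":
            # Reset to center
            self.x = self.y = 0.0
```

Wiring the detectors to `handle()` is then just a matter of firing the right event string when each signal crosses its threshold.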
I detect closed eyes simply by putting a threshold on Absolute Alpha from TP9 (EEG1). When you close your eyes, the alpha power increases, and that can easily be caught with a threshold (or a more robust algorithm).
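In case it helps anyone reproduce this, here is a rough sketch of the threshold approach, with a short moving average to reduce flicker. The threshold value and window size are made up for illustration; you would tune them against your own alpha readings:

```python
from collections import deque

def detect_eyes_closed(alpha_samples, threshold=0.7, window=10):
    """Given a sequence of absolute alpha power samples from one channel,
    return a boolean per sample: True once the moving average of the last
    `window` samples exceeds `threshold` (hypothetical tuning values)."""
    buf = deque(maxlen=window)
    states = []
    for sample in alpha_samples:
        buf.append(sample)
        avg = sum(buf) / len(buf)
        states.append(avg > threshold)
    return states
```

The averaging window trades latency for robustness: a longer window rejects brief alpha spikes but makes the "eyes closed" event fire later.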
I need a few more brain-related events (not artifacts) to control, let's say, a cursor, a toy car, etc. with any triggerable signal. People have used EEG to detect the intention to move one's hand, or visual focus; has anybody tried to develop such algorithms for the Muse?
Even sharing your unsuccessful attempts would really help me and other people following this forum.