Has Muse been used for controlling?


I’ve browsed and searched the forums but haven’t found any reference to whether (or where) Muse has been used for controlling something, e.g. thinking “left” would move the cursor left, right, up, down, etc.
I’m aware that Netflix has somehow used Muse as a remote control, so it seems to be possible to some degree.

Any references?

I’m contemplating purchasing a Muse, not for the ‘meditation’ functionality, but to use it for control. I’m not sure it’s really possible with only 4 channels, though. OpenBCI would perhaps be a better option, but it’s too expensive for me.


I think your best bet at doing this would be to use the eye-movement interference patterns as a control signal. Technically this would be EMG rather than EEG control, but it’s something you can do right now without funding a ton of neuroscience researchers :wink:

If you open Muse Monitor on the RAW EEG screen, you can see very distinct patterns when you look up, down, left and right. Then maybe you could use a blink as a “click”?
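For anyone wanting to prototype the blink-as-click idea, a plain amplitude threshold on the raw frontal-channel signal is usually enough to start with, since blinks show up as spikes far larger than normal EEG. This is just a sketch, assuming you’ve already pulled raw samples (in microvolts) off the headset into an array; the function name, threshold and refractory period are my own guesses, not anything official:

```python
import numpy as np

def detect_blinks(samples, fs=256, threshold_uv=150.0, refractory_s=0.5):
    """Return indices where the signal spikes past a large-amplitude
    threshold, typical of eye-blink artifacts on Muse's frontal
    channels (AF7/AF8). A refractory period stops one blink from
    being counted several times."""
    centered = samples - np.mean(samples)  # remove DC offset
    refractory = int(refractory_s * fs)
    blinks = []
    last = -refractory
    for i, v in enumerate(centered):
        if abs(v) > threshold_uv and i - last >= refractory:
            blinks.append(i)
            last = i
    return blinks

# Quick sanity check on synthetic data: background noise plus two "blinks"
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 10.0, 512)  # 2 s of noise at 256 Hz
sig[100] += 400.0
sig[400] += 400.0
print(detect_blinks(sig))  # the two spikes should be flagged
```

You’d obviously want to tune the threshold per person (and per headband fit), but it gives you a “click” event to hang a control loop on.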


Thanks! I’ve ordered a Muse and will experiment a bit. I’ve read a lot about EEG, Muse and similar products over the last few days, so I have at least a high-level understanding of what’s possible and what isn’t necessarily possible, or at least not feasible.
Since I have experience with deep learning, I’ll try throwing some data at TensorFlow (www.tensorflow.org) to see if anything is revealed.
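If anyone else wants to try the same thing: before any model sees the data, the raw 4-channel stream has to be cut into fixed-length examples. A minimal NumPy sketch, assuming you’ve already logged the raw samples into an `(n_samples, 4)` array (the one-second window and half-second step are arbitrary choices, not a recommendation):

```python
import numpy as np

def make_windows(raw, fs=256, win_s=1.0, step_s=0.5):
    """Slice an (n_samples, 4) raw-EEG array into overlapping windows
    shaped (n_windows, win_len, 4), the layout a 1-D conv net expects.
    Each window is standardised per channel, which tends to help small
    models train on EEG."""
    win = int(win_s * fs)
    step = int(step_s * fs)
    windows = []
    for start in range(0, raw.shape[0] - win + 1, step):
        w = raw[start:start + win]
        w = (w - w.mean(axis=0)) / (w.std(axis=0) + 1e-8)
        windows.append(w)
    return np.stack(windows)

# Example: 4 s of fake data at 256 Hz -> 7 overlapping 1 s windows
rng = np.random.default_rng(1)
raw = rng.normal(size=(1024, 4))
print(make_windows(raw).shape)  # (7, 256, 4)
```

From there you can label windows by what you were doing when they were recorded (look left, look right, blink, rest) and feed them to whatever classifier you like.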


Hi Thomas
I’m about to embark on a master’s in deep learning and thought I could do a project with Muse. I was wondering if you (or anyone else) had made any inroads in this direction. Any hints would be most welcome. I’m not sure, at this stage, what Muse is capable of, given that it only has 4 sensors and is somewhat less accurate than a medical-grade device (and cheaper, of course!).
Thanks in advance


Hi teticio,
We LOVE academia and research. Muse has been involved in several high profile studies already, and we have a brilliant research team on staff here. I’m sure they would love to talk turkey with you some time.


Thanks Daniel.

I have been reading through pretty much everything I can get my hands on. It seems like the most successful projects have been one that used your database to detect trends in EEG with age, another that authenticated people based on their EEG (although it doesn’t seem to work so well after some time has passed), and yet another that controlled a light switch by voluntarily blinking. The self-calibrating protocol idea looks interesting, but I’m not sure what it can be applied to at this stage. However, many of the other papers I’ve read report somewhat mixed results, perhaps because the raw information they are looking for either isn’t there or isn’t reliable enough.

I’d be very grateful for any pointers, with an emphasis on the chance of some success as opposed to the “coolness” of the idea. Happy to be contacted by your research team via my email (I have the same user name on gmail).

Thanks in advance


Hi teticio,
Sorry, unfortunately I lost interest in the concept, as it was almost impossible to get Muse to connect to my Surface Pro 4 with MuseDirect. The only way I got it to work at all was through an external Bluetooth dongle, and even then it dropped the connection regularly. The final blow was that the Intel processor doesn’t support whatever OpenGL graphics driver is being used to visualize the data. I have an 11-year-old laptop that would probably work better, since it has its own graphics card, but I don’t want to start all over again.
Still, I’m interested in this topic, so I hope that you, or anyone else embarking on this journey, will keep us updated.

