I’m building a web application where the user wears a Muse headband to drive a real-time visualisation in the browser. I already have it working, and I have access to the full data exposed by the SDK.
I’m trying to understand which brainwave bands to look at to determine whether the user is relaxed or stressed, so I can make my visualisation move more or less (exactly like the breathing exercise in the Muse app). Which brainwaves are you listening to? What’s the logic behind playing the stormy sounds versus the calm sounds?
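In case it helps frame the question: a common heuristic (not necessarily what the Muse app actually does) is to compare alpha band power (associated with relaxation) against beta band power (associated with active concentration or stress), then smooth the result so the visualisation doesn’t jitter. Below is a minimal sketch of that idea; the function names, the alpha/beta ratio, and the smoothing factor are all my own illustrative assumptions, not part of the Muse SDK.

```typescript
// Hypothetical sketch: map alpha/beta band power to a 0..1 "calm score".
// Assumes you already compute average band powers from the SDK's EEG data;
// this is NOT Muse's actual algorithm, just one common heuristic.

/** Higher alpha relative to beta suggests a more relaxed state. */
function calmScore(alphaPower: number, betaPower: number): number {
  const ratio = alphaPower / (betaPower + 1e-6); // guard against divide-by-zero
  return ratio / (1 + ratio); // squash the ratio into the open interval (0, 1)
}

/** Exponential moving average so the visualisation responds smoothly. */
function smooth(prev: number, next: number, factor = 0.1): number {
  return prev + factor * (next - prev);
}

// Example: feed each new score through smooth() and use the smoothed value
// to scale animation speed (calm -> slow/gentle, stressed -> fast/stormy).
let state = 0.5;
state = smooth(state, calmScore(4, 1)); // strong alpha nudges state upward
```

With something like this, the “stormy vs. calm sounds” decision could just be a threshold or crossfade on the smoothed score, but I’d still like to know what the app really measures.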
Thank you in advance; any guidance would be greatly appreciated.