I noticed the lag almost from the very beginning. I think I was prepared for it because some of the developer and research documentation mentions that the app runs calculations over multiple samples of data before presenting its adjustments as a change in sound.
If you look at the raw data output from Muse, you’ll understand. With the 2016 headband, the only way I’ve seen to get that view is with the Muse Monitor app for mobile devices ($15 on the Apple App Store). Muse is capturing alpha, beta, delta, theta, and gamma waves from each of the sensors touching different parts of your head. On screen, every time you blink your eyes, the alpha wave line shoots up and back down. If you clench your jaw, the gamma line moves. Muse has to take all of that, aggregate it, process it through whatever magic calculation it uses to decide your brain is doing something worth reporting, like having a loud thought, and present it back to you.
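Interaxon hasn’t published the actual pipeline, so this is only a guess at why the lag exists, but averaging over a window of samples like that inherently delays the feedback. A toy sketch in Python (the window size and the numbers are made up for illustration):

```python
from collections import deque

def smooth(samples, window=5):
    """Sliding-window average over band-power samples.

    A larger window gives steadier feedback but reacts
    more slowly -- the trade-off that feels like lag.
    """
    buf = deque(maxlen=window)
    out = []
    for s in samples:
        buf.append(s)
        out.append(sum(buf) / len(buf))
    return out

# A sudden spike (say, an eye blink hitting the alpha line)
# gets diluted and smeared across several output samples
# instead of appearing instantly.
raw = [0, 0, 0, 10, 0, 0, 0, 0]
print(smooth(raw, window=4))
# -> [0.0, 0.0, 0.0, 2.5, 2.5, 2.5, 2.5, 0.0]
```

Whatever Muse really does is surely fancier than this, but any scheme that waits for several samples before deciding your brain “did something” will feel a beat behind your own awareness.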
For experienced, attentive meditators, it seems the brain is faster at recognizing that. I experience the lag and dissonance regularly, and I also find it distracting at times, to the point that I think my meditation is “less productive”. On the other hand, the dips and valleys in sound from the app still happen often enough in situations where I do not think I am distracted that I find it interesting and challenging to figure out what Muse is “seeing” that I may be missing.
Thanks for posting your comment. It’s cool to hear similar experiences.
BTW, if it seems I may be affiliated with Interaxon in any capacity, I’m only an end user.