Measuring the latency of sending data from Muse / Muse Monitor


Hi Guys

I’m using Muse Monitor, and I’m trying to measure the latency from the time the data is created (sampled in the Muse) until it gets processed.
I’m using python-osc to process the OSC messages.
I tried to do so by looking at the difference between message.time and the current time.
For example, if message is a parsed OSC message, then:
delta_millis = (time.time() - message.time) * 1000
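In case it helps: OSC timetags are in NTP format (seconds since 1900 plus a 32-bit fraction), so if you ever end up parsing raw datagrams yourself rather than relying on the library, the conversion to a unix timestamp looks roughly like this. A minimal sketch; `timetag_to_unix` is just an illustrative helper name, not part of python-osc:

```python
import struct
import time

NTP_UNIX_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01

def timetag_to_unix(timetag_bytes):
    """Convert an 8-byte OSC/NTP timetag to a unix timestamp in seconds."""
    seconds, fraction = struct.unpack(">II", timetag_bytes)
    return (seconds - NTP_UNIX_OFFSET) + fraction / 2**32

# illustrative timetag for unix time 1_700_000_000.5
tag = struct.pack(">II", 1_700_000_000 + NTP_UNIX_OFFSET, 2**31)
delta_millis = (time.time() - timetag_to_unix(tag)) * 1000
```

The delta is only meaningful if the sender and receiver clocks are synced, which is exactly the hard part being discussed below.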

I tested on 3 devices (2 iPhones and an iPad) and found a latency of about 500 milliseconds.

Does that latency sound correct? Is there a better way to measure it?


Here’s a rough outline of the data path.

Internally in the headband you have:
Electrical pulse on head -> [Muse hardware processing delay] -> Bluetooth Packet (no timestamp).

From the band to the device you have:
Bluetooth packet -> [Device Bluetooth stack buffering delay] -> data packet -> [Interaxon SDK processing delay] -> processed data (with timestamp) -> [Muse Monitor processing delay] -> UDP OSC packet

Network delay:
UDP OSC Packet -> [Network Transmission delay] -> UDP OSC Packet received by server.

So it really depends on what you want to measure.
Regarding the path from electrical signal to Interaxon timestamp creation, in my experience most of the delay comes from Bluetooth packet buffering. You can see this just by looking at the timestamp deltas between raw EEG samples: there is clear grouping and buffering going on, with one sample taking ages and the rest arriving very quickly.
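To make the buffering concrete: if you collect the per-sample timestamps into a list, the deltas between consecutive samples show the burst pattern directly. A sketch with hypothetical arrival times (the real values would come from message.time):

```python
# hypothetical arrival timestamps in seconds; in practice, append each
# message.time value as messages come in
arrivals = [0.000, 0.002, 0.003, 0.004, 0.121, 0.123, 0.124, 0.125]

# deltas between consecutive samples
deltas = [b - a for a, b in zip(arrivals, arrivals[1:])]

# bursty delivery shows up as one large delta followed by several tiny ones,
# even though the headband itself samples at a steady rate
print(min(deltas), max(deltas))
```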

That aside, I think the best way to measure the whole chain would be to sync your device time with a hardware pulse generator, then send a timed 3 V pulse through pin 4 of the AUX port. You will then have a start time referenced to the device clock and can measure the delay.

Alternatively, if you have a high-speed camera, you could film Muse Monitor’s raw EEG view and tap your finger on the AUX USB ground to generate a pulse, then watch the delay before the graph changes. (Enable AUX RAW view in settings.)


Thanks for the details!
I did consider the Bluetooth delay, but hadn’t considered the buffering, which makes the delay vary.
So if I wish to create a simple P300 experiment, just assuming there’s a fixed delay X to add to the 300 ms is probably not good enough, because the variance of the delay is large.
But your description doesn’t seem to leave me an alternative either, because the timestamp is only assigned after some internal Bluetooth delay has already occurred. If I take the OSC message timetag (instead of the current timestamp when I process the message on my server), it could be a little better, but there will still be a large variance from the internal delay.


You just have to make the assumption that the device itself is sending samples at a constant rate (which Interaxon have told me it is) and then average out the timestamp deltas.
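One way to apply that idea, as a sketch: with an assumed constant sample rate (Muse’s raw EEG is nominally 256 Hz; adjust for your device/preset), you can replace the jittery per-sample timestamps with evenly spaced times anchored at the average offset, which averages the buffering jitter away. `smooth_timestamps` is a hypothetical helper, not part of any SDK:

```python
def smooth_timestamps(arrival_times, rate_hz=256.0):
    """Re-timestamp samples assuming a constant sample rate.

    Keeps the average clock offset of the arrival times, but removes the
    per-packet buffering jitter by spacing samples exactly 1/rate apart.
    """
    n = len(arrival_times)
    # ideal times relative to the first sample, at the assumed rate
    ideal = [i / rate_hz for i in range(n)]
    # for a fixed slope, the least-squares offset is just the mean residual
    offset = sum(t - u for t, u in zip(arrival_times, ideal)) / n
    return [offset + u for u in ideal]
```

Note this only removes jitter; the smoothed times still contain the fixed pipeline delay, so absolute latency still needs a hardware reference like the AUX pulse above.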


Hi Shiran,
Please, I want to ask: can you detect the P300 evoked response with the Muse?


Yes, I was able to reproduce it. You may consult this excellent repo by Alexandre.


Dear Shiran,
Thank you very much.