EEG Channel location


“A1 FP1 FP2 A2” or “TP9 FP1 FP2 TP10”? A related question: when an eye blink occurs, the first and fourth channels change a lot, but the other two channels (FP1, FP2) do not change much. That confuses me, because from what I know the frontal areas are more involved in eye blinks, i.e. FP1 and FP2 should differ more during a blink. Is there anything wrong with the channel locations?


TP9 FP1 FP2 TP10
That is correct: the ear sensors show eye blinks, and the forehead sensors show your eyes moving left to right.


Thank you.


Hi, what montage do you use?


I’m viewing the output of the OSC data in my PureData script, and I never see a value for:

[B]/muse/dsp/jaw_clench i
/muse/dsp/touching_forehead i[/B]

I do see data for:

[B]/muse/dsp/blink i[/B]

Have these OSC paths changed? I’m using the OSC 3.4 paths. Here is the muse-io command line that I am using:
muse-io --preset 12 --dsp --osc osc.udp://localhost:5000

Is there any documentation on the specifics of the presets? The documentation describes their technical parameters, but there is no information on why I would choose one over another.



Hi Tom,

This is because, as I have said, there are still a lot of mistakes in the documentation and on the Developer site.
At this point I’m getting quite curious about what they are doing over there.
Why are they so distant from the forum and their customers?
I know … they are all researchers and developers … I know this because I was one of them many years ago, and I remember: when we are deeply involved in a difficult and challenging task, we forget everything else :slight_smile:

It’s time to hire someone to take care of us.

Well, there was no change to the OSC paths in 3.4 … “blink” is the only data point at the [B]/muse/dsp/[/B] path; all the others are under [B]/muse/dsp/elements/[/B]:
[B]/muse/dsp/elements/jaw_clench i
/muse/dsp/elements/touching_forehead i
/muse/dsp/elements/is_good iiii
/muse/dsp/elements/horseshoe ffff [/B]

About the “presets”: I have noticed that if you add the “--dsp” parameter to the muse-io command line, it forces “preset 14” to be passed to the Muse.

I have not found any way to filter data or reduce the amount of data streamed from the muse-io command line.
I believe we will have to wait until the API is released and a real Muse driver is written.
The only workaround I found was to take input from the Muse OSC stream and rebroadcast it to another port, applying filter parameters (which let you pass through only the paths you need).
But since the command line becomes very long if you want to filter many paths, I made a small change so it can now read all its parameters from a text file.
(I had never developed anything in Python before, but it works.)
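The rebroadcast-and-filter idea described above can be sketched in plain Python with nothing but the standard library. This is a minimal illustration, not the poster's actual script: the port numbers and the path list are assumptions, and it relies only on the fact that an OSC message over UDP begins with its null-terminated address pattern.

```python
import socket

# Assumption: these are the only paths you want to pass through.
ALLOWED_PATHS = {
    b"/muse/dsp/blink",
    b"/muse/dsp/elements/jaw_clench",
    b"/muse/dsp/elements/touching_forehead",
}

def osc_address(packet: bytes) -> bytes:
    """Return the OSC address pattern at the start of a raw OSC packet.

    An OSC message begins with its address as a null-terminated string,
    so everything before the first null byte is the path.
    """
    end = packet.find(b"\x00")
    return packet[:end] if end != -1 else packet

def forward(listen_port: int = 5000, out_port: int = 5001) -> None:
    """Listen for OSC-over-UDP packets and rebroadcast only allowed paths."""
    sock_in = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock_in.bind(("127.0.0.1", listen_port))
    sock_out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        packet, _ = sock_in.recvfrom(4096)
        if osc_address(packet) in ALLOWED_PATHS:
            sock_out.sendto(packet, ("127.0.0.1", out_port))
```

You would point muse-io's `--osc` output at the listen port and subscribe your PureData patch to the forward port. OSC bundles (which start with `#bundle`) would need extra handling; this sketch ignores them.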

It’s so tedious to go to the command prompt every time, or even to create desktop shortcuts, that I’m finishing a Windows GUI (I only develop for the Windows OS) from which you can start muse-io with the parameters you want, start muse-player instances, and do whatever else you need.

But this approach will only work for you if you have installed the whole macOS SDK … pyliblo and the rest.

Good luck,



Thanks for the clarification on the paths. That was my mistake. It is a little odd that the “blink” path is separate from the others.

I agree that the Muse SDK developers don’t seem to be paying much attention to the forum. I even sent them the exact corrections for the Mac SDK install. It is also not possible to send private messages between forum members (which I have posted about on the forum), which makes communication and collaboration awkward.

You would think that they would leverage our efforts to help make the documentation better.

As far as the SDK goes, I’ve never gotten past the “pyliblo” step, and no one is answering questions; my SDK questions were redirected back to the forum.
Someone mentioned that they were not getting notifications when their threads received replies, so you have to look all over the boards to find things. Clunky.

I also keep getting this random server error popup.

The only real problem I’m having with the data now is that the string “nan” keeps appearing in place of floats on the /muse/dsp/elements/alpha (etc.) paths.
When my scripts encounter “nan”, they lock up.
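One way to keep a script from locking up on these values is to sanitize every incoming float before using it. A minimal Python sketch (the function name and the default value are mine, not part of any Muse API):

```python
import math

def safe_floats(values, default=0.0):
    """Replace NaN (or anything that won't parse as a number) with a default.

    Band-power paths such as /muse/dsp/elements/alpha can carry NaN during
    noisy segments, so downstream code should never see a raw NaN.
    """
    cleaned = []
    for v in values:
        try:
            f = float(v)  # also converts the string "nan" to a float NaN
        except (TypeError, ValueError):
            f = default
        if math.isnan(f):
            f = default
        cleaned.append(f)
    return cleaned
```

For example, `safe_floats([0.5, "nan", 0.25])` yields `[0.5, 0.0, 0.25]`. Substituting a neutral value is one policy; depending on your application you might instead skip the whole message.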

While I wish I could control the amount of data being sent over OSC, it hasn’t caused me any problems up to now. It’s just not clear how the different command line options interact.

Is there any way to combine or average the 4 channels of data for each wave type? As it is, I’m just grabbing the first one and ignoring the others.



Concerning NaN:

You said you’re using PureData? I’m assuming the objects are the same as in Max, but maybe they’re not. In Max, averaging the 4 channels is as simple as sending the list through a [zl sum] followed by [/ 4.]. To get the largest value, use [maximum].
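The same sum-then-divide idea, written out in Python for anyone handling the stream in a script instead of a patch. This is a sketch under one extra assumption of mine: NaN channels are skipped so a single bad channel doesn't poison the mean.

```python
import math

def average_channels(channels):
    """Average the 4 per-channel values of one wave type, ignoring NaNs.

    Returns float('nan') only if every channel is NaN.
    """
    good = [c for c in channels if not math.isnan(c)]
    return sum(good) / len(good) if good else float("nan")
```

So `average_channels([1.0, 2.0, 3.0, 4.0])` gives `2.5`, and a message with two NaN channels still yields the mean of the two good ones.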


Hi there,

I just wanted to make a couple quick comments about the discussion here.

A nice, quick, and simple way to reduce OSC traffic is to use Muse-Lab to forward messages. It’s great because you can not only see a live stream of each signal, but also select each path you wish to forward individually, which is very useful during development. We use this internally all the time to test new systems. It’s a really simple mechanism and it can forward to multiple destinations.

You can do this in the outputs tab of Muse-Lab’s OSC menu. Just type in an IP (127.0.0.1 is local) and check each path you wish to forward. Hope that helps.

Regarding the NaNs on alpha: you’re going to see NaN when our DSP encounters noisy data, and that includes eye blinks, because it becomes difficult to estimate band powers properly in the presence of large muscle artifacts. Unfortunately, this means you need to make your software robust against NaN entries.