Novice Unity/C# developer here. I’ve started building an Android app aimed at emotional-regulation training, which includes the Muse headband as a neurofeedback component. I do not intend to use the Muse presets, as my design is based on emotional valence estimated through prefrontal hemispheric asymmetry.
My question is: in your view, what is the most efficient way to measure the AF7/AF8 asymmetry?
Reading the muselib documentation, I saw that the libmuse.MuseDataPacket class has a getEegChannelValue method. Being a beginner, I’m not sure whether calling it would give me raw data that I would need to FFT myself, or whether I could directly calculate the relative difference between the AF7 and AF8 outputs from the values it returns.
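To make concrete what I mean by the asymmetry calculation, here is a rough Python sketch of what I have in mind (Python just for clarity; I’d port it to C#). Everything here is my own assumption, not libmuse API: the sample rate, the alpha band limits, and the log-ratio index are choices I picked up from reading about frontal alpha asymmetry.

```python
# Hypothetical sketch of the asymmetry index I have in mind: log alpha-band
# power on AF8 minus log alpha-band power on AF7. None of these names or
# parameters come from libmuse; fs=256 Hz and the 8-13 Hz band are assumptions.
import numpy as np

def band_power(samples, fs, lo=8.0, hi=13.0):
    """Mean spectral power in [lo, hi] Hz via a simple rFFT periodogram."""
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()              # remove DC offset
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(samples)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def alpha_asymmetry(af7, af8, fs=256.0):
    """ln(right alpha) - ln(left alpha); positive means more right alpha."""
    return np.log(band_power(af8, fs)) - np.log(band_power(af7, fs))
```

So my underlying question is whether getEegChannelValue gives me the raw samples this kind of windowed FFT would need, or something already band-processed.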
Also, can I take input from only those two channels, or should I reference the others in some way?
If I do not use the presets, will I need to implement my own noise filtering?
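For reference, this is roughly the kind of filtering I imagine I would have to write myself if the presets don’t do it for me: a band-pass to keep the EEG range plus a notch for mains hum. Again a Python sketch under my own assumptions (cutoffs, fs=256 Hz, 50 Hz mains), using SciPy, nothing Muse-specific:

```python
# Assumed minimal cleaning chain for raw EEG: 1-40 Hz band-pass plus a
# 50 Hz mains notch, both applied zero-phase. All parameters are my guesses.
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

def clean_eeg(samples, fs=256.0, lo=1.0, hi=40.0, mains=50.0):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    out = filtfilt(b, a, samples)              # zero-phase band-pass
    bn, an = iirnotch(mains, Q=30.0, fs=fs)    # mains-hum notch
    return filtfilt(bn, an, out)
```

Is something along these lines necessary with raw libmuse data, or is some of this already handled on the headband?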
Thank you in advance,