Need Help Translating RAW EEG / Frequency Data Into Relaxation / Concentration Values


Does anyone have any experience translating the raw EEG / frequency data from the Muse 2016 headband into “relaxation” / “concentration”?

I currently have the sample code working in Xcode 10.1 and Swift 4. The device successfully connects to the application and starts streaming the raw EEG / frequency data.

I am having trouble translating these values into “concentration” and “relaxation”.

Any ideas on the best practice for doing this would be greatly appreciated.


Interaxon deprecated Concentration and Mellow from their API many years ago, and so far no one from their team has released the algorithms.

From what I can tell, Mellow was a fairly basic algorithm that simply tracked increases in relative alpha band power, and Concentration did the same thing with relative delta.
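If that guess is right, a minimal sketch of such a tracker is just a smoothed 0–1 score over one relative band. Everything here is illustrative (the type and parameter names are mine, not the SDK's); the only assumption about the data is that relative band-power samples arrive in the 0…1 range, as Muse's relative packets do:

```swift
import Foundation

// Hypothetical "Mellow"-style scorer: an exponential moving average of
// relative alpha band power, clamped to 0...1. Feed it one relative
// band-power sample at a time; the same struct works for any band.
struct BandScore {
    private var smoothed: Double?
    private let smoothing: Double  // EMA factor in (0, 1]; higher reacts faster

    init(smoothing: Double = 0.1) {
        self.smoothing = smoothing
    }

    // Returns the updated smoothed score after incorporating one sample.
    mutating func update(_ sample: Double) -> Double {
        let clamped = min(max(sample, 0.0), 1.0)
        if let previous = smoothed {
            smoothed = previous + smoothing * (clamped - previous)
        } else {
            smoothed = clamped  // seed with the first sample
        }
        return smoothed!
    }
}
```

You would feed this with whichever relative band values you pull out of the SDK's data packets (alpha for a mellow-style score, delta for a concentration-style one, per the guess above).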


Thanks for the information! Very helpful.

I went back and used the relative functions and they seem to work well in both cases.

I will create my own simple algorithm for this and build the app around it.
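For anyone curious, a simple homemade algorithm along these lines could compare a short recent window of relative band power against a longer-running baseline, so the score reflects "elevated relative to this user's recent history" rather than an absolute threshold. This is only a sketch under that assumption; window sizes and the mapping are arbitrary choices, not Interaxon's algorithm:

```swift
import Foundation

// Hypothetical tracker: keeps a short "recent" window and a longer
// "baseline" window of relative band-power samples. The score maps the
// difference of their means into 0...1, where 0.5 means "at baseline"
// and values above 0.5 mean the band is currently elevated.
struct RelativeBandTracker {
    private var recent: [Double] = []
    private var baseline: [Double] = []
    private let recentSize: Int
    private let baselineSize: Int

    init(recentSize: Int = 32, baselineSize: Int = 256) {
        self.recentSize = recentSize
        self.baselineSize = baselineSize
    }

    mutating func add(_ sample: Double) {
        recent.append(sample)
        if recent.count > recentSize { recent.removeFirst() }
        baseline.append(sample)
        if baseline.count > baselineSize { baseline.removeFirst() }
    }

    var score: Double {
        guard !recent.isEmpty, !baseline.isEmpty else { return 0.5 }
        let recentMean = recent.reduce(0, +) / Double(recent.count)
        let baselineMean = baseline.reduce(0, +) / Double(baseline.count)
        return min(max(0.5 + (recentMean - baselineMean), 0.0), 1.0)
    }
}
```

Running one tracker on relative alpha and another on relative delta would give a crude relaxation / concentration pair to build the app around.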


Curious – how did you manage to get the Muse SDK working with Swift? This isn’t my first rodeo importing an Objective-C framework into Swift, but this one doesn’t seem to be working.
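For reference, the standard Objective-C-to-Swift setup I’m attempting is the usual bridging-header route. The umbrella header name below is an assumption on my part – check what the framework actually ships:

```objc
// MyApp-Bridging-Header.h
// Assumed umbrella header name; substitute whatever the Muse framework provides.
#import <Muse/Muse.h>
```

with the header path set in the target’s build settings under “Objective-C Bridging Header” (`SWIFT_OBJC_BRIDGING_HEADER`), and the framework added to “Link Binary With Libraries” plus its directory in “Framework Search Paths”. After that, the SDK’s classes should be visible from Swift without any `import` statement.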

Any recommendations / code samples would be greatly appreciated.

It’s disappointing that Muse doesn’t maintain a CocoaPod, support bitcode, etc. Swift is now a fairly mature language; Muse should put in the effort to make development in Swift as easy as possible.