Hey, I’ve had a little time to play around with my Muse, and it’s pretty evident that the calibration routine that runs in your own app makes a world of difference for reading data from the user’s brain.
A question for the Muse team, which I haven’t found an answer to yet: do you have any plans to release the calibration function so we can all use it in our own apps? I don’t see the point of keeping it constrained to the apps bundled with the Muse when it could be a much greater platform if everyone could use it. It would be a great incentive to develop apps. Right now there’s a huge hurdle to overcome, namely getting any comprehensible and usable signal out of the Muse, and that will probably keep most potential developers from even attempting to build anything.
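To illustrate what I mean by calibration: without access to the official routine, developers end up hand-rolling something like a per-user baseline step. This is purely a hypothetical sketch (not the Muse SDK, and the function names are my own invention): record a short resting-state window, fit baseline statistics for that user, then normalize live samples against them.

```python
import statistics

def fit_baseline(resting_samples):
    """Fit per-user baseline stats from a resting-state recording
    (e.g. a few seconds of band-power values while the user relaxes)."""
    mean = statistics.fmean(resting_samples)
    stdev = statistics.pstdev(resting_samples) or 1.0  # guard against zero spread
    return (mean, stdev)

def normalize(sample, baseline):
    """Z-score a live sample against the user's own baseline, so that
    'high' and 'low' are relative to that user rather than absolute."""
    mean, stdev = baseline
    return (sample - mean) / stdev

# Hypothetical usage with made-up numbers standing in for EEG-derived values:
baseline = fit_baseline([10.0, 12.0, 11.0, 9.0, 13.0])
print(normalize(11.0, baseline))  # at the resting mean -> 0.0
```

Something this naive is exactly why an official, tuned calibration routine would be so valuable: the hard part is everything Interaxon has presumably already solved.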
Or have I missed some relevant piece of information?