I am an artist creating interactive sound installations with sonic augmented reality, using the iPhone and dedicated apps I have developed for it.
I am interested in using the Muse headband to capture EEG data for additional control over the sound space.
I have the following questions and would appreciate precise answers if possible:
Using the SDK, can I perform data mining on the EEG data directly, through high-level functions already provided in your framework?
The iPhone used in the installation space communicates with iBeacons (Bluetooth signal transmitters) to calculate the visitor's position in the space. However, the Muse device also uses a Bluetooth connection. Can the Muse and the iBeacons be used together at the same time? This is crucial for me to know. The beacons I am using are at this link;
Thank you for any information you can provide.