Creating a Native iOS Application for the Muse Headband


#1

Hello,

I am attempting to write a custom application for the Muse Headband. I understand that the headband currently does not support Bluetooth Low Energy (although that would be a nice feature in the future), so I am using the iOS “External Accessory” framework to connect to the device.

As you know, your view controller should declare conformance to the following protocols when connecting to an external accessory:
@interface CIViewController () <EAAccessoryDelegate, NSStreamDelegate>

I register for the EAAccessoryDidConnectNotification and EAAccessoryDidDisconnectNotification notifications, and these handlers are being called as expected when I pair and disconnect the device.
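For reference, the registration code looks roughly like this (a sketch; accessoryDidConnect: and accessoryDidDisconnect: are just my own handler selectors):

  // registerForLocalNotifications must be called once before the
  // connect/disconnect notifications are delivered to observers.
  [[EAAccessoryManager sharedAccessoryManager] registerForLocalNotifications];

  [[NSNotificationCenter defaultCenter] addObserver:self
                                           selector:@selector(accessoryDidConnect:)
                                               name:EAAccessoryDidConnectNotification
                                             object:nil];
  [[NSNotificationCenter defaultCenter] addObserver:self
                                           selector:@selector(accessoryDidDisconnect:)
                                               name:EAAccessoryDidDisconnectNotification
                                             object:nil];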

I have also added a com.interaxon.muse entry under the “Supported External Accessory Protocols” key in my application’s Info.plist.
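In the raw Info.plist XML, that entry (key UISupportedExternalAccessoryProtocols) looks like this:

  <key>UISupportedExternalAccessoryProtocols</key>
  <array>
      <string>com.interaxon.muse</string>
  </array>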
I am able to create a session as follows:
  // If found, this pointer will point to the Muse headset
  EAAccessory *museHeadset = nil;

  NSString *museProtocol = @"com.interaxon.muse";

  // Get the list of accessories
  NSArray *accessories = [[EAAccessoryManager sharedAccessoryManager] connectedAccessories];

  // Find out if one of them supports our protocol
  for (EAAccessory *accessory in accessories)
  {
      NSLog(@"Found accessory named: %@, Protocols: %@", accessory.name, accessory.protocolStrings);
      if ([accessory.protocolStrings containsObject:museProtocol])
      {
          museHeadset = accessory;
          break;
      }
  }

  // Did we find the headset?
  if (museHeadset)
  {
      _session = [[EASession alloc] initWithAccessory:museHeadset forProtocol:museProtocol];
      if (_session)
      {
          NSLog(@"We have a session!");

          // Open the output stream so we can send data to the device
          [[_session outputStream] setDelegate:self];
          [[_session outputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
          [[_session outputStream] open];

          // Open the input stream so we can receive data from the device
          [[_session inputStream] setDelegate:self];
          [[_session inputStream] scheduleInRunLoop:[NSRunLoop currentRunLoop] forMode:NSDefaultRunLoopMode];
          [[_session inputStream] open];
      }
  }

So, here, I am looking for a device that advertises the right protocol, and when I find the com.interaxon.muse protocol, I create the session. That call succeeds.
Next, I hook up both the input and output streams, so that I will be able to both receive data from and send data to the device.

My stream handler is shown below:

  - (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
  {
      switch (eventCode) {
          case NSStreamEventNone:
              break;
          case NSStreamEventOpenCompleted:
              break;
          case NSStreamEventHasBytesAvailable:
              [self _readData];
              break;
          case NSStreamEventHasSpaceAvailable:
              [self _writeData];
              break;
          case NSStreamEventErrorOccurred:
              break;
          case NSStreamEventEndEncountered:
              break;
          default:
              break;
      }
  }
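For completeness, my _readData helper looks roughly like this (a sketch; the _readBuffer NSMutableData ivar is just where I accumulate incoming bytes):

  - (void)_readData
  {
      uint8_t buffer[1024];
      while ([[_session inputStream] hasBytesAvailable])
      {
          NSInteger bytesRead = [[_session inputStream] read:buffer maxLength:sizeof(buffer)];
          if (bytesRead <= 0)
              break;
          // Accumulate the raw bytes; how to frame and decode them is the
          // open question, since the headband's protocol is undocumented.
          [_readBuffer appendBytes:buffer length:bytesRead];
      }
  }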

Once I have a session, the following eventCodes are being sent:

NSStreamEventOpenCompleted
NSStreamEventOpenCompleted
NSStreamEventHasSpaceAvailable (multiple times)

So, I get the “Open Completed” event twice (once for each stream), and then a whole stream of “HasSpaceAvailable” calls.
I NEVER get any data through “NSStreamEventHasBytesAvailable”; that event is never delivered at all.

I am thinking that this may be because I am not sending the right data in response to the “HasSpaceAvailable” event, but I am not sure what I should send. I simply want to receive the data that the muse-io application receives: config, battery data, EEG (that is really what I want the most), accelerometer data, etc.; simply the data made available by the device.

I feel like I am really close to making this work, so I would really appreciate any help on this. There are some great things I feel I can do with the device once I get this working.

Thanks for your help,

Bennie Haelen


#2

Hi Bennie,

We appreciate your enthusiasm for building native applications on iOS, and this is something we hope to support in the near future. We plan to release a simple API for developers to build applications for iOS and Android natively. It will provide all the data you’re looking to get from the headband in real time, and we will be happy to help with support for that API once it is released. Unfortunately, I can’t provide any specific time frame for this release, nor can I provide a simple explanation of how to communicate with the headband, as the communication protocol between the headband and the receiver is very specific.

If you would rather not wait for this to be officially released, you are welcome to take an alternative approach to getting the data from the headband into an application. As you mentioned above, you already have a tool that is capable of this communication: Muse-IO.

Now, Muse-IO provides a live OSC stream of the data, which is suitable for network transmission. If you run Muse-IO and point its OSC output at your mobile device, you can receive the data on your iOS device over Wi-Fi via OSC. When you run Muse-IO, you specify the destination with the --osc parameter (e.g. --osc osc.udp://192.168.1.127:5000). You then set up your iOS device to listen on its IP address and the port you have specified.
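On the receiving side, all you need to begin with is a UDP socket bound to that port; each datagram that arrives is one complete OSC packet, which you can hand to whatever OSC parser you choose. Here is a minimal sketch using plain BSD sockets (the port 5000 and the method name are just illustrations matching the example above):

  #include <string.h>
  #include <unistd.h>
  #include <sys/socket.h>
  #include <netinet/in.h>

  - (void)listenForOSCPackets
  {
      int sock = socket(AF_INET, SOCK_DGRAM, 0);

      struct sockaddr_in addr;
      memset(&addr, 0, sizeof(addr));
      addr.sin_family = AF_INET;
      addr.sin_port = htons(5000);              // must match the --osc port
      addr.sin_addr.s_addr = htonl(INADDR_ANY);
      bind(sock, (struct sockaddr *)&addr, sizeof(addr));

      uint8_t packet[1536];
      for (;;)
      {
          ssize_t n = recv(sock, packet, sizeof(packet), 0);
          if (n <= 0)
              break;
          // Each UDP datagram is a complete OSC packet; pass it to an OSC
          // parser (e.g. oscpack) here. Run this loop off the main thread.
          NSLog(@"Received %zd-byte OSC packet", n);
      }
      close(sock);
  }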

The library we have used in-house is oscpack, located here: http://www.rossbencina.com/code

There may be acceptable alternatives to this library as OSC is a fairly common data transfer method.

We have used this system in the past to test input into iOS applications, and it is quite effective. You can use it as a baseline to get live data into any application you wish to develop, and once you have the official API you will be able to switch over to a standalone iOS application.

I apologize that the path you were developing will not work. Our API will likely use a very similar approach, but it will take care of all the communication aspects of the Bluetooth connection, allowing you to simply receive your raw data and DSP metrics.

I hope you find this helpful for your development needs. There should be a lot of resources out there on using OSC with mobile applications, but let us know if we can help further.


#3

Hello Farough,

First of all, thanks for the reply. While I agree that what you are proposing will work, it would mean that I would now need a PC or a Mac to run the muse-io command, correct? That would introduce another, otherwise unnecessary device into the chain. Because, as far as I know, I cannot run muse-io on iOS, so I have to run it on Windows or Mac, send the stream to a place where the iOS device can pick it up, and go from there.

That is kind of messy, isn’t it? Ultimately, I really just want to do the same thing your “Muse Calm” application is doing, and I know that app can talk directly to the headset. Another solution would be to just have the source code of muse-io, because it is clearly executing the right steps to talk to the device.

I am looking for a strategy that would make applications for the headset practical, without having to involve additional devices.

Thanks,

Bennie


#4

Hi Bennie,

You’re correct. To use Muse-IO you will need a computer running OS X, Unix, or Windows. I’m aware that this is an inconvenient way to get the data into an application, as it ties you to a network and a computer. I suggested this solution because it allows you to start your development with a real input source without having to wait for the API to be released.

Obviously a practical application is much more meaningful to you, but a proof of concept is important in the preliminary stages of development, and using this method will allow you to complete that part of your design.

At this time, source code for Calm or Muse-IO is unfortunately not available.


#5

Hopefully the SDK will support iOS 8’s HealthKit interfaces (and/or Swift).


#6

I’m new to OSC but familiar with C++ and iOS development. Can you provide some guidance on how to connect with oscpack?

First I tried MetatoneOSC, which is based on F53OSC: https://github.com/cpmpercussion/MetatoneOSC
Using that library I was able to connect to muse-io running locally and receive a version packet from the Muse headband, but otherwise I got a lot of “Unrecognized OSC message” errors.

Next I grabbed https://github.com/heisters/iOS-oscpack; however, I am not yet able to connect it to the muse-io server running locally.

Can you provide some advice on which version of oscpack to use, and maybe some example code showing how to connect?


#7

We’ve provided a couple of example OSC receivers so the developer community can bring the data directly into iOS and Android.

They are public git repositories; you can download them through git or as a zip file.

You can now find their links here:
https://sites.google.com/a/interaxon.ca/muse-developer-site/developer-getting-started-guide?pageReverted=3

Let me know if you have any questions.


#8

When can we expect the iOS library? It is very disappointing that we have to wait this long for real iOS support, when it is clear that you are already supporting iOS internally through the Muse Calm app. You have to take care of your development community…


#9

Hi there,

We’ve recently published our Muse Communication Protocol, in case you would like to try writing your own receiver. It’s not as pretty as an SDK and API, but if you’re interested, you can begin communicating with the headband directly.

http://forum.choosemuse.com/forum/developer-forum/1896-muse-communication-protocol-published

Hope that helps.