I’m developing a VR app where users are put in a situation where they must confront a common phobia. We would like to give the user feedback when they are becoming increasingly anxious or nervous. I feel like this is a pretty different emotion to track compared to the existing “mellow” and “concentration” emotions in the experimental section. Does anyone have experience tracking this with Muse, or know of good resources to get me started? I’m hoping to abstract this to the point where I have a gradient number between 0 and 1, where 0 is a person’s default, average mental state, and 1 is stammering, um-ahh’ing anxiety.
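To make the idea concrete, here is a minimal sketch (in Python for brevity; the same logic would port to C# in Unity) of how a raw metric could be rescaled onto that 0-to-1 gradient using a per-user baseline. The class name, window size, and the default ceiling of 2× baseline are all placeholder assumptions, not anything Muse provides:

```python
from collections import deque

class AnxietyGradient:
    """Map a raw anxiety metric onto [0, 1], where 0 is the user's
    calibrated resting baseline and 1 is a chosen high-anxiety ceiling.
    The ceiling default (2x baseline) is an illustrative placeholder."""

    def __init__(self, baseline_samples=256):
        self.baseline = deque(maxlen=baseline_samples)
        self.ceiling = None  # optionally set from a known stressed reading

    def calibrate(self, raw_value):
        # Feed samples collected while the user is relaxed.
        self.baseline.append(raw_value)

    def score(self, raw_value):
        if not self.baseline:
            return 0.0
        base = sum(self.baseline) / len(self.baseline)
        ceiling = self.ceiling if self.ceiling is not None else base * 2.0
        if ceiling <= base:
            return 0.0
        # Linear rescale, clamped to [0, 1].
        return min(1.0, max(0.0, (raw_value - base) / (ceiling - base)))
```

In practice you would run `calibrate()` during a calm intro scene, then call `score()` each frame to drive the feedback.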
In some of my research before posting I came across http://www.greatbrain.com/depression_anxiety.php. Here is how they characterize an anxiety reading in EEG:
- Excessive Coherence over the right hemisphere.
- Slow wave frequencies in the Theta (4-8 cycles per second) and slow Alpha (8-10 cycles per second) range over the right hemisphere, especially in the frontal and temporal regions (front and side).
- Excessive high frequency Beta in the center of the brain (central vertex).
- (Not sure how to get coherence out of Muse)
- Low relative theta and alpha on channels 2 (frontal) and 3 (temporal)
- High relative beta on channels 0 and 3?
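For discussion's sake, here is how those last two features might combine into a single raw score. The channel-to-role mapping and the way the terms are weighted are my guesses from the list above, not validated coefficients, so treat this as a starting point to tune against real sessions:

```python
def raw_anxiety(theta_rel, alpha_rel, beta_rel):
    """Combine relative band powers into one raw score.
    Each argument is a dict mapping Muse channel index (0-3)
    to that channel's relative band power (0.0-1.0).
    Channel roles and weights are unvalidated guesses."""
    # Low relative theta/alpha on frontal (2) and temporal (3):
    slow = (theta_rel[2] + alpha_rel[2] + theta_rel[3] + alpha_rel[3]) / 4.0
    # High relative beta on channels 0 and 3:
    fast = (beta_rel[0] + beta_rel[3]) / 2.0
    # More beta and less slow-wave power -> higher raw score.
    return fast * (1.0 - slow)
```

The output of this would be the raw value fed into whatever baseline normalization you use to get the final 0-to-1 gradient.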
By the way, I’m getting data over OSC into Unity for this project.
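Since others may be on the same OSC path, here is a sketch of the routing step (again in Python for brevity; the same logic applies to a C# OSC receiver in Unity). The `/muse/elements/*_relative` addresses match what the old MuseIO bridge emitted (four floats, one per channel), but verify the exact paths against your bridge's documentation:

```python
def handle_osc(address, args, state):
    """Route an incoming OSC message into a per-band state dict.
    Only relative-band-power messages are kept; everything else
    is ignored. Address pattern assumes MuseIO-style paths."""
    prefix = "/muse/elements/"
    suffix = "_relative"
    if address.startswith(prefix) and address.endswith(suffix):
        band = address[len(prefix):-len(suffix)]  # e.g. "alpha"
        state[band] = {i: v for i, v in enumerate(args)}
    return state

state = {}
handle_osc("/muse/elements/beta_relative", [0.1, 0.2, 0.3, 0.4], state)
# state["beta"] == {0: 0.1, 1: 0.2, 2: 0.3, 3: 0.4}
```

From there the per-band dicts can feed directly into whatever scoring function you settle on.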