Measuring Anxiety Level in my App



I’m developing a VR app where users are put in a situation where they must confront a common phobia. We would like to give the user feedback when they are becoming increasingly anxious or nervous. I feel like this is a pretty different emotion to track compared to the existing “mellow” and “concentration” emotions in the experimental section. Does anyone have experience tracking this with Muse, or know of good resources to get me started? I’m hoping to abstract this to the point where I have a gradient number between 0 and 1, where 0 is a person’s default, average mental state, and 1 is stammering, um-ahh’ing anxiety.
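One common way to get that kind of 0-to-1 gradient is to calibrate against the user's own resting baseline and clamp a z-score. Here's a minimal Python sketch of that idea; the function names, the choice of feature, and the 3-standard-deviation span are all my own illustrative assumptions, not anything Muse provides:

```python
# Sketch: map a running "anxiety feature" onto a 0-1 gradient relative to
# a per-user baseline recorded during a calm calibration period.
# All names and thresholds here are illustrative assumptions.
from statistics import mean, stdev

def calibrate(baseline_samples):
    """Summarize resting-state feature values recorded before the session."""
    return mean(baseline_samples), stdev(baseline_samples)

def anxiety_gradient(value, baseline_mean, baseline_std, span=3.0):
    """0.0 at the baseline mean, 1.0 at `span` standard deviations above it."""
    if baseline_std == 0:
        return 0.0
    z = (value - baseline_mean) / baseline_std
    return min(max(z / span, 0.0), 1.0)
```

You'd feed it whatever EEG-derived feature you settle on; the point is just that "0 = this person's average state" falls out of the calibration step rather than a hard-coded threshold.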

In some of my research before posting, I came across a description of how reading anxiety can be defined through EEG:

  • Excessive Coherence over the right hemisphere.
  • Slow wave frequencies in the Theta (4-8 cycles per second) and slow Alpha (8-10 cycles per second) range over the right hemisphere, especially in the frontal and temporal regions (front and side).
  • Excessive high frequency Beta in the center of the brain (central vertex).
I figured this might translate to:
  • (not sure how to get coherence out of Muse)
  • low relative theta and alpha on channels 2 (frontal) and 3 (temporal)
  • high relative beta on channels 0 and 3?
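The band-power part of that translation could be combined into a single feature like the sketch below. To be clear about the assumptions: I'm assuming the usual Muse channel order (0=TP9, 1=AF7, 2=AF8, 3=TP10), the equal weights are made up, and since Muse has no true vertex electrode the "central beta" term is approximated by averaging across all four channels:

```python
# Sketch: combine Muse relative band powers into one anxiety feature,
# following the translated criteria above. Channel index assumption:
# 0=TP9, 1=AF7, 2=AF8, 3=TP10. Weights are illustrative, not validated.
def anxiety_feature(theta_rel, alpha_rel, beta_rel):
    """Each argument is a 4-element list of relative band powers per channel."""
    # Slow-wave (theta + slow alpha) activity on right-side sites 2 and 3.
    slow_right = (theta_rel[2] + theta_rel[3] +
                  alpha_rel[2] + alpha_rel[3]) / 4.0
    # Muse has no central vertex electrode, so approximate "central beta"
    # with the average beta across all channels.
    beta_avg = sum(beta_rel) / len(beta_rel)
    return 0.5 * slow_right + 0.5 * beta_avg
```

The output of something like this is what you'd then normalize against the user's baseline to get your 0-to-1 gradient.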
I'm brand new to EEG and brain hacking, so I'd really appreciate it if someone with more experience could guide my muddling a bit.
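On the coherence point: Muse doesn't expose coherence as a built-in metric as far as I know, but it can be estimated from the raw EEG samples by averaging cross- and auto-spectra over segments (a Welch-style magnitude-squared coherence). This pure-Python sketch shows the math; a real app would use a DSP library with proper windowing and overlap:

```python
# Sketch: Welch-style magnitude-squared coherence between two raw EEG
# channels, Cxy(f) = |Sxy|^2 / (Sxx * Syy), averaged over segments.
# Pure-Python DFT for illustration only; use a DSP library in practice.
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def coherence(x, y, seg_len):
    """Coherence per frequency bin, averaged over non-overlapping segments."""
    segs = len(x) // seg_len
    sxx = [0.0] * seg_len
    syy = [0.0] * seg_len
    sxy = [0j] * seg_len
    for s in range(segs):
        fx = dft(x[s * seg_len:(s + 1) * seg_len])
        fy = dft(y[s * seg_len:(s + 1) * seg_len])
        for k in range(seg_len):
            sxx[k] += abs(fx[k]) ** 2
            syy[k] += abs(fy[k]) ** 2
            sxy[k] += fx[k] * fy[k].conjugate()
    return [abs(sxy[k]) ** 2 / (sxx[k] * syy[k]) if sxx[k] and syy[k] else 0.0
            for k in range(seg_len)]
```

Note that coherence is only meaningful when averaged over multiple segments; computed on a single window it is identically 1.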

By the way, I’m getting data over OSC into Unity for this project.
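For anyone else going the OSC route: the wire format is simple even if your OSC library hides it. Here's a stdlib-only Python sketch of parsing a basic OSC message (address, type tags, float/int args), e.g. what arrives on a path like /muse/elements/alpha_relative — in Unity you'd use a C# OSC library, but the bytes are the same:

```python
# Sketch: parse a basic OSC message (address string, type tag string,
# big-endian float/int arguments). Stdlib only, for illustration.
import struct

def _read_padded_string(data, offset):
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    offset = (end + 4) & ~3
    return s, offset

def parse_osc_message(data):
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        else:
            raise ValueError("unhandled OSC type tag: " + tag)
    return address, args
```

This only handles float and int32 arguments, which covers the band-power messages; blobs, strings, and bundles would need more cases.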


Hi James,

Have you checked out the Intro to Brains section? There are several references there to sites and books that cover EEG and neurofeedback principles extensively.


[B]Technical Foundations of Neurofeedback[/B]

[B]The Neurofeedback Book[/B]

[B]Introduction to Quantitative EEG and Neurofeedback[/B]

I was actually just looking through the third book (Introduction to Quantitative EEG and Neurofeedback) and it’s got some good stuff in there re: coherence, since you seem interested in the subject.

UCSD also has a really good series of videos on BCI:

The same UCSD group is responsible for BCILAB, an open-source BCI toolkit that you might find really helpful to check out. You could even see how they implement some of the analysis you're exploring, and use that in your project.