In exploring my Muse headband I ended up building a small application that lets you stream the data output from the headband to a live visualization, or directly to a CSV (text) file on your computer. There is a lot to unpack as to why and how I did this, so the following is a walkthrough and explanation in hopes of better sharing my thoughts about one of my favorite projects!

Github → ‣


What Is a Muse Headband?

A Muse headband is a wireless, wearable EEG (electroencephalogram) device that, paired with the Muse App (iOS, Android), creates a biofeedback loop: Muse reads and interprets your brainwaves and sends feedback that helps you with meditation and sleep. The potential applications are much wider, though, and Muse is used in clinical studies across a number of other domains to help identify and track other biomarkers.

Image from Muse’s website marketing material

Step 1: Making the Muse Headband Data Readable & Streamable

Connect the Muse headband to something that can produce an Open Sound Control (OSC) stream of data. Simply put, OSC is just a protocol designed to communicate high-frequency data over a network. In our case, we're using the Muse Direct application to produce the OSC stream. There is another app I'm aware of that does this as well, called Mind Monitor, which also works with the Muse headband.


The headband connects via Bluetooth to our computer, which is running the Muse Direct application, which in turn transmits our OSC stream on a local network. This just means we can access the stream over the Wi-Fi in our house without needing to use the "public" internet.

Step 2: Python Application Reads OSC Stream & Deploys App

Once the OSC stream is up and running, we can read it using an open-source Python library called python-osc. This lets us "read" the data into a common structure so we can then use it in our application. In this project we're using PyQt6 as our application-building framework for one simple reason: speed. Our data streams in at 256 hertz, or approximately 256 samples per second. Although our graph does not update quite that often, it's still pretty frequent. For this reason, PyQt6 and, subsequently, PyQtGraph are used to create this live brainwave app.


Step 3: Run the App!

  1. Select which OSC stream to visualize/collect:


  2. Visualize the data!


  3. Specify the time range to show on the graph (X axis). This is also the time range included in the export when you press the Save Data button.

  4. Save Data is a button that exports all the data currently shown on the screen to a CSV (text) file.

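To sketch what that export step looks like, the standard-library csv module is enough. The column names and sample layout below are assumptions for illustration, not the app's exact schema:

```python
import csv

def save_buffer_to_csv(samples, path):
    """Write buffered EEG samples to a CSV file.

    `samples` is a sequence of (timestamp, values) pairs, where `values`
    holds one float per EEG channel.
    """
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "ch1", "ch2", "ch3", "ch4"])
        for ts, values in samples:
            writer.writerow([ts, *values])

# Example: two fake samples, 1/256 s apart (matching the 256 Hz stream)
demo = [(0.0, (1.0, 2.0, 3.0, 4.0)), (1 / 256, (1.1, 2.1, 3.1, 4.1))]
save_buffer_to_csv(demo, "muse_export.csv")
```

In the app, `samples` would be whatever slice of the buffer falls inside the time range currently shown on the graph.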


Biggest Challenges, Lessons

Building this application was a fun experiment & deep dive into the various kinds of data available from the Muse headband. Some of the main things that I learned are:

Signal Processing Is Complex

Honestly, one of the biggest things I learned was how complex this data truly is. Looking at the EEG data, for example, one could almost assume that each channel was tuned to pick up a specific frequency band (like channel 1 = Alpha, channel 2 = Beta), but that is not the case. Extracting the various frequency bands from the EEG channels requires further preprocessing, feature extraction, and classification in order to produce any real, useful insights. This is the case with every data type, as each channel can contain noise, spikes, or even missing data.
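As a tiny illustration of why the bands have to be computed rather than read off per channel, here is one naive way to estimate band power for a single channel with an FFT periodogram. This is a sketch only: real pipelines filter, window, and reject artifacts before anything like this.

```python
import numpy as np

FS = 256  # Muse EEG sample rate (Hz)

def band_power(signal, low, high, fs=FS):
    """Naively estimate signal power within [low, high) Hz via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].sum()

# A synthetic 10 Hz oscillation should dominate the alpha (8-13 Hz) band
t = np.arange(FS * 2) / FS
fake_eeg = np.sin(2 * np.pi * 10 * t)
alpha = band_power(fake_eeg, 8, 13)
beta = band_power(fake_eeg, 13, 30)
```

On real recordings the picture is far messier: blinks, muscle activity, and electrode noise all leak into these bins, which is exactly the complexity described above.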

Application Building Is Hard, Even With the Right Tools

As a Data Analyst/Data Scientist, I first tried building this app with frameworks I was familiar with, like Matplotlib, Plotly, Streamlit, and Gradio, but struggled with the live-streaming part of the app. After some research, I found PyQt6 and PyQtGraph, and they worked really well.

On top of the technical challenges, building an application that ends up being useful to people needs to involve said people. My only goal was to visualize the data produced by the Muse headband, which we did! However, the resulting application fails to deliver any true value, as the data is not intuitive to understand and is limited in its functionality.

Final Thoughts

This was a fun project, and I learned a lot about EEG data, the methods required for preprocessing this kind of raw biometric data, and the complexities of EEG and application building. For developers wanting to explore biometric data and raw signals, the Muse headband combined with python-osc and PyQt6 provides tools that simplify the process of reading and interpreting human biometric data, which is pretty wild when you think about it.