Author(s): Xander Boit
Mentor(s): Nathalia Peixoto, Neuroscience Program
Abstract

My mentor is Dr. Peixoto and my department is Bioengineering. This project was completed in Fall 2022 as a URSP. Thank you to OSCAR for providing funding and support for this project.
In the existing literature, familiarity with music generally shows a positive correlation with how the brain responds.
There are a couple of sources I used.
The first one says that familiarity with music has been reported as an important factor in creating emotional and positive responses in the brain.
The second source adds that music can improve cognitive performance, especially when it is familiar.
Essentially, what I concluded from these and other sources is that previous research shows clear benefits of familiar music in relation to mental state.
What I wanted to explore is how this applies to an action, because much of the existing research involves stationary tasks. With VR equipment, we have the opportunity to record EEG data while participants are actually performing a task that involves full-body movement and music at the same time.
And so that is why I chose to do this project.
The technology for this project falls into three categories: virtual reality equipment, EEG recording equipment, and data analysis software.
The first part is the virtual reality equipment. The lab I am in has the HTC VIVE Pro 2 Headset.
The two pictures shown are ones I took in the lab: the top picture is the headset itself, and the bottom shows the two hand-held controllers you use with it.
I had a fair number of issues getting the VR headset to work properly throughout the semester, which delayed my project considerably.
The first was audio output. The headset has two speakers, which you can see off to the side of the top picture, currently tilted outward. They would not play audio, and when I did get audio to play, it was super scratchy and basically unbearable. Obviously, this is a huge issue for an audio-based project, so I spent several weeks troubleshooting. Eventually I found that unplugging the cords and replugging them until it works is, for some reason, consistently effective, and I haven't had many issues with it since, which honestly surprises me. But if it works, it works.
The other, probably bigger, issue is video resolution. The computer attached to this headset has a pretty good graphics card, but the headset's resolution is demanding enough to outpace even top-of-the-line cards. I decided to use my whole budget in one go and buy a new computer with an RTX 3060 graphics card and otherwise high-end components; it was a good deal, since graphics cards are cheaper right now. However, the order was delayed three or four times due to miscommunication, product delays, and so on, and it is now scheduled to arrive in mid-December. So I do not have it to use for this project yet, but I will have it next semester.
The video game I am using is Beat Saber, which is probably the most popular VR game; if you have seen any VR game, it's probably been Beat Saber. It's essentially Guitar Hero in VR: you hold the two controllers and slash blocks in time with the music.
I have a video, about thirty seconds long, showing a bit of what it looks like; it may be a little loud. I recorded it in the lab. You can see the two sabers I am holding: the red one in my left hand and the blue one in my right. I slash through the cubes in the direction of their arrows, in time with the beat of the music. This is how we turn the rhythm game into a recording session.
I think the really cool thing about this is the EEG part of the project. We have the Muse 2 Headband in the lab, and that is what I am using; there is a picture of it to the right there.
This Muse headband records at four locations: left ear, left forehead, right forehead, and right ear. So it sits essentially around the outside of the skull.
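As a quick illustration, those four sites correspond to standard 10-20 electrode names. Here is a small Python sketch of that mapping; the channel ordering in any particular data export is an assumption worth verifying against your recording software.

```python
# Mapping from Muse 2 electrode names (10-20 system) to the
# plain-language locations described above. The ordering here is an
# assumption; confirm it against the channel order in your exported data.
MUSE_CHANNELS = {
    "TP9":  "left ear",
    "AF7":  "left forehead",
    "AF8":  "right forehead",
    "TP10": "right ear",
}

for electrode, location in MUSE_CHANNELS.items():
    print(f"{electrode}: {location}")
```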
The cool thing is that when I initially designed the project, we planned on using a full-cap EEG, which could not be worn during the VR session. The Muse headband, however, fits underneath the VR headset, so we can record EEG while the VR is running and avoid any gap in time or the error that gap could introduce. This is very useful for capturing what is actually going on in the brain while the activity is happening.
The third part of the technology is MATLAB. You have probably heard of it if you know anything about programming. It's essentially software that lets me filter and analyze EEG data.
I went into it with two main goals. The first was to separate my data into the key frequency bands: delta, theta, alpha, beta, and gamma. There is known information about what these bands indicate, such as calm, stressed, or active states; I won't go through all of them here, since some are fairly specific.
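To make that first goal concrete, here is a rough Python sketch of splitting one channel into band powers (the real analysis is in MATLAB). The band edges are conventional values that vary slightly across the literature, so treat them as an illustrative assumption; the naive DFT is only there to keep the sketch dependency-free.

```python
import cmath
import math

# Conventional EEG band edges in Hz. Exact cutoffs vary across the
# literature, so these are illustrative assumptions, not fixed values.
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta":  (13.0, 30.0),
    "gamma": (30.0, 50.0),
}

def band_powers(samples, fs):
    """Return summed spectral power per EEG band for one channel.

    samples: list of voltage samples for one electrode.
    fs: sampling rate in Hz.
    Uses a naive O(n^2) DFT so this sketch needs no external libraries;
    real work would use MATLAB's fft/bandpower (or SciPy in Python).
    """
    n = len(samples)
    powers = {name: 0.0 for name in BANDS}
    # Positive frequencies only, skipping the DC component.
    for k in range(1, n // 2):
        freq = k * fs / n
        coef = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                   for t in range(n))
        p = abs(coef) ** 2 / n
        for name, (lo, hi) in BANDS.items():
            if lo <= freq < hi:
                powers[name] += p
    return powers
```

For example, a pure 10 Hz sine sampled at 256 Hz should show almost all of its power in the alpha band. In MATLAB, the equivalent step would typically use `fft` or `bandpower`; the pure-Python DFT here is just to make the idea visible.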
The second goal is to use these bands to compare the power level of each band between familiar and unfamiliar samples, because this is a matched-pair design: each participant provides both a familiar and an unfamiliar sample. We link the two samples together and look at the difference in power for each of the five frequency bands.
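As a sketch of what that matched-pair comparison could look like, here is a paired t statistic over hypothetical per-participant band powers. The numbers and the function are illustrative; the actual analysis will run on real MATLAB output, and significance would be judged against the t distribution with n-1 degrees of freedom.

```python
import math
import statistics

def paired_t(familiar, unfamiliar):
    """Paired t statistic for matched per-participant band powers.

    familiar/unfamiliar: equal-length lists where index i holds the
    same participant's band power under each condition (matched pairs).
    Returns (mean_difference, t). Illustrative only; the real analysis
    will be done on actual recordings.
    """
    diffs = [f - u for f, u in zip(familiar, unfamiliar)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)       # sample standard deviation of the differences
    t = mean_d / (sd_d / math.sqrt(n))   # paired t statistic
    return mean_d, t
```

The point of the matched-pair design is visible in the code: each difference is computed within one participant before averaging, so between-participant variability cancels out.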
Progress so far: we haven't made all the progress I hoped for, since I have not collected data yet; I generally got behind on this project, which is my own fault. I have data loaded into MATLAB and can separate the frequency bands. I just need to work out exactly how to filter it, which takes a fair amount of time going through the literature to see what previous researchers have done. That is where I currently am with MATLAB.
Here is a figure I generated with my MATLAB code. It shows the raw EEG data from the four locations I mentioned earlier: left ear, left forehead, right forehead, and right ear. As you can see, it's a mess, which is why we need to filter and separate it; right now, it doesn't mean anything.
This next figure is an example. It isn't exactly how I want my data to look, since it is filtered incorrectly for what I am trying to do, but you can see it separates the data into the five frequency bands and shows their levels. This is roughly what I want; we will then take the power level of each band and compare familiar against unfamiliar.
On to research logistics. I started the IRB approval process way too late; I got distracted troubleshooting the VR equipment and didn't realize how long it would take. I am now at the final stage of IRB approval, so hopefully I can finish that and start next semester as soon as possible.
I mentioned the computer purchasing delays before: after a lot of miscommunication, the arrival has been pushed back to mid-December.
Future plans. Get IRB approval, which I just mentioned. Gather the data, after I have the approval from IRB. Analyze and report the findings. And depending on how the results turn out, publish a research paper on the results.
This is my works cited. Thank you so much for watching this video, and let me know if you have any questions.
4 replies on “Analyzing how the EEG of a person changes when playing a rhythm game of a familiar and unfamiliar song in a VR environment”
This is such an interesting project! One question I would have is whether the experience might change across different types of VR headsets?
The main difference between VR headsets is the visual output. The HTC Vive Pro 2 that we are using is one of the highest-resolution VR sets on the market right now. Since this project focuses on audio, it's hard for me to say what I would expect to change. A lower resolution makes it seem less realistic, in my opinion, so maybe a different headset could lead to it feeling more game-like?
Good presentation. Nice persistence with technology delays. I look forward to your results.
Hey Xander,
Great job on an awesome research project! I enjoyed how thoroughly you discussed all the processes involved. Overall, this is a very intriguing project, and it was an amazing presentation; I wish you the best of luck.