OSCAR Celebration of Student Scholarship and Impact
Categories
Cells, Individuals, and Community · College of Engineering and Computing · OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR

Pain, Medication Use and Biomarker Associations in Individuals with Polycystic Ovary Syndrome (PCOS): Insights from the All of Us Research Program

Author(s): Jannatul Nayeem

Mentor(s): Jenny Phan, CASSBI


Background:
Polycystic Ovary Syndrome (PCOS) is a complex endocrine disorder often accompanied by chronic pain, yet the biological and social determinants of this pain remain underexplored. Understanding how stress-related biomarkers and healthcare access interact in shaping pain experiences may reveal mechanisms underlying health disparities in PCOS populations.

Objective:
This study examined associations between inflammatory and neuroendocrine biomarkers (C-reactive protein (CRP), cortisol, and body mass index (BMI)) and pain burden among individuals with PCOS, while exploring the moderating role of healthcare access and insurance coverage.

Methods:
Using data from the All of Us Research Program, 2,160 adults with PCOS (identified by ICD-9/10 codes) were analyzed. Pain burden was measured through pain-related diagnoses and pain medication dosage. Biomarker distributions were winsorized, log-transformed, and analyzed via multivariate regression models adjusting for age, race, socioeconomic status, and healthcare variables.
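As a rough sketch of this kind of pipeline (not the study's actual code; column names such as pain_burden, access_barrier, and ses_index are placeholders rather than All of Us field names), the winsorize, log-transform, and regression-with-moderation steps might look like:

```python
# Hedged sketch of the analysis described above: winsorize a biomarker, log-transform
# it, and regress it on pain burden with a pain x healthcare-access interaction,
# adjusting for age, race, and SES. Column names are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats.mstats import winsorize

def fit_biomarker_model(df: pd.DataFrame, biomarker: str):
    data = df.copy()
    # Cap extreme values at the 1st/99th percentiles, then log-transform
    data[biomarker] = np.asarray(winsorize(data[biomarker].to_numpy(), limits=(0.01, 0.01)))
    data[f"log_{biomarker}"] = np.log1p(data[biomarker])
    # The pain_burden:access_barrier interaction tests whether healthcare access
    # moderates the pain-biomarker association
    model = smf.ols(
        f"log_{biomarker} ~ pain_burden * access_barrier + age + C(race) + ses_index",
        data=data,
    ).fit()
    return model

# Example: model = fit_biomarker_model(pcos_df, "crp"); print(model.summary())
```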

Results:
Pain burden alone was not significantly associated with higher CRP, cortisol, or BMI levels. However, healthcare access moderated these relationships: participants with greater barriers to care exhibited elevated inflammation and BMI with increasing pain, whereas those with adequate access showed flatter or reduced biomarker trends.

Conclusions:
Findings suggest that chronic pain and stress responses in PCOS may be shaped more by social and contextual factors than biological burden alone. Enhancing healthcare accessibility and equity could mitigate stress-related physiological outcomes and improve pain management for individuals with PCOS.

0:01 Hello, my name is Jannatul Nayeem. I am a student researcher with the B&LAB at George Mason, working directly with Dr.
0:13 Jenny Phan on her allostatic load study, which examines the body’s biological stress response and how it relates to menstrual disorders and chronic pain.
0:23 From that study, I wanted to dive deeper into PCOS and look at pain medication use and biomarker associations in individuals with that disorder.
0:36 For methodology, I started with the All of Us database. It holds data on individuals with all sorts of diseases, drawn from doctor visits and patient records, along with survey questions that the program itself asks
1:06 those participants. Through that database, I was able to identify 2,160 individuals with PCOS and dive deeper into their biomarkers. Specifically, I am using three biomarkers as predictors of inflammation and stress:
1:30 BMI, C-reactive protein, and cortisol. For the outcome variable, I used pain diagnoses along with medication usage; for medication usage, I accounted for how many medications participants were taking
1:52 and what the dosage was for each medication. The covariates I used were age, race, and an SES index, and for moderators I looked at healthcare access, specifically insurance status and access to care.
2:19 Now I will zoom in on the results I found; hopefully the zoom comes through in the video.
2:33 For my results, I found that pain alone did not significantly predict inflammation or stress, but limited access to care did.
2:45 Individuals with more barriers, such as lack of insurance, showed higher inflammation and BMI with pain, suggesting that health equity plays a critical role in PCOS pain.
2:56 While my insurance data was limited, because not many people answered those questions, the results still offer some support for the idea that stress biology and pain in PCOS are influenced by the social environment and not just physiology.
3:22 Right now, I am continuing this study to deepen my understanding and to advocate for equitable pain care in PCOS populations.
3:36 One thing I want to focus on is improving how we phrase these questions, because the way the questions about access to care and insurance status are stated is pretty sensitive.
3:58 So how can we change the wording so that someone feels more comfortable answering those types of questions? And so, yeah, that was my study.
4:09 Thank you for listening.
Categories
College of Engineering and Computing · College of Humanities and Social Science · Honors College · Making and Creating · OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR · Winners

A Robotic Cat for Examining Camera Clarity and Privacy in Human–Robot Interaction

Author(s): Alexia De Costa

Mentor(s): Eileen Roesler, Department of Psychology

This project presents the Bioinspired Automated Robotic Cat (BARC), a functional companion robot designed to support research in human–robot interaction and privacy-aware design. BARC features camera-based facial detection, expressive gaze behaviors, audio responses, and various soft and rigid materials to mimic a household cat. Because camera systems can enhance interaction while raising privacy concerns, the ongoing study compares people’s responses under two conditions: a clear, high-quality camera filter and a blurred, low-clarity camera filter. Using surveys and observation of touch behavior, the study examines how camera clarity shapes engagement and perceived privacy, informing the design of social robots that are effective while respecting user comfort.
Have you ever wondered what a robot actually sees when it looks at you?
Today, social and service robots are becoming increasingly common, and many rely on cameras for facial recognition and user engagement. But as useful as cameras are, they also raise important questions: Do they make people feel watched? Can a robot feel friendly while still respecting privacy?

These questions sit at a key tension in human–robot interaction: robots need perception to understand us, yet high-resolution sensing can make people uncomfortable. So I wanted to explore a central challenge: can we reduce privacy concerns without making interactions less enjoyable? And does being transparent about what a robot sees change how people feel?

To investigate this, I designed and built a robot cat from scratch called BARC, the Bioinspired Automated Robotic Cat. BARC is part engineering platform and part research tool. It can switch between two controlled camera conditions: a clear, high-quality camera filter and a blurred, low-clarity filter that still allows for partial facial detection. These interchangeable physical filters let me directly compare how different levels of sensing clarity influence interaction.

BARC is also designed to feel expressive and lifelike. It uses camera-based facial detection for gaze behavior, animated OLED eyes, a speaker for cat-like sounds, and soft and rigid materials that mimic the look and feel of a household cat. Through surveys and observations of touch behavior, my ongoing study explores how these two camera conditions shape user engagement and perceived privacy.

To create BARC, I began with feline anatomical references, studying limb placement, joint spacing, and overall proportions to inspire the CAD model for the chassis. I laser-cut the acrylic components and assembled them using screws and tab-and-slot joints for a sturdy, lightweight frame.

At the heart of the robot is a Raspberry Pi 4, which handles perception and behavioral control.

A camera provides the main sensory input for facial detection.

Two OLED displays animate expressive eyes that track the user once a face is detected, giving the illusion of attention and social presence.

A speaker and amplifier generate a range of cat sounds, from meows to purrs to alarmed yowls.

An accelerometer-gyroscope detects movement, such as being picked up or shaken, so BARC can respond appropriately.

Servos, controlled by a PCA9685 driver, animate the limbs, jaw, head, and tail.
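As a minimal sketch of how the perception and actuation components above could fit together (this is not BARC’s actual code; the channel number, angle mapping, and the draw_eyes helper are illustrative assumptions):

```python
# Hedged sketch: OpenCV detects a face, the horizontal offset drives the OLED gaze,
# and a PCA9685-driven head servo turns toward the user.
import cv2
from adafruit_servokit import ServoKit

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
camera = cv2.VideoCapture(0)      # Raspberry Pi camera exposed as /dev/video0
kit = ServoKit(channels=16)       # PCA9685 provides 16 PWM channels over I2C
HEAD_CHANNEL = 0                  # illustrative channel assignment

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Map the face center to a -1..1 offset across the frame width
        offset = ((x + w / 2) / frame.shape[1]) * 2 - 1
        kit.servo[HEAD_CHANNEL].angle = 90 + offset * 45  # turn the head toward the face
        # draw_eyes(offset)  # hypothetical helper that shifts the OLED pupils
```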

All behaviors are programmed in Python and organized in a state machine with modes such as Idle, Seeking Attention, Interacting, and Startled. BARC transitions between these states based on sensory input and probability, helping interactions feel natural rather than scripted.
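A stripped-down sketch of such a state machine, using the state names from the text but with illustrative transition probabilities and sensor hooks, might look like:

```python
# Hedged sketch of the probabilistic behavior state machine; the probabilities and
# sensor flags are assumptions, not BARC's actual tuning.
import random

STATES = ("Idle", "SeekingAttention", "Interacting", "Startled")

def next_state(state: str, face_seen: bool, shaken: bool) -> str:
    if shaken:
        return "Startled"                     # accelerometer event always wins
    if state == "Idle":
        if face_seen:
            return "Interacting"
        return "SeekingAttention" if random.random() < 0.2 else "Idle"
    if state == "SeekingAttention":
        return "Interacting" if face_seen else "Idle"
    if state == "Interacting":
        return "Interacting" if face_seen else "Idle"
    if state == "Startled":
        return "Idle"                         # calm down after one cycle
    return "Idle"
```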

To examine how camera clarity influences engagement and privacy perceptions, BARC serves as a fully capable research platform. Seventy-two participants are currently part of a single-blind study with two groups:

Group 1: interacts with BARC using a clear camera filter

Group 2: interacts with BARC using a blurred, privacy-preserving filter

The physical filter is noticeable, so using filters in both groups keeps the robot visually consistent. That way, any differences we see are truly due to what the robot can or can’t perceive.

Participants interact with BARC, complete a survey measuring constructs such as Perceived Sociability and Perceived Enjoyment, and then are shown a live camera feed so they can see the actual resolution of the robot’s vision. Afterward, they complete a second survey measuring perceived privacy, perceived surveillance, disturbance, and attitudes about robots.

The hypotheses are:
1: No difference in sociability, enjoyment, or touch behavior.
2: The filtered-camera group will report higher perceived privacy.
3: The clear-camera group will report higher perceived surveillance.
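As a hedged illustration only (the study does not specify its analysis; the data frame and column names below are assumptions), a between-group comparison on one survey scale could be run along these lines:

```python
# Illustrative sketch of comparing the clear-filter and blurred-filter groups on a
# survey measure using Welch's t-test; column names are placeholders.
import pandas as pd
from scipy import stats

def compare_groups(df: pd.DataFrame, measure: str):
    clear = df.loc[df["condition"] == "clear", measure]
    blurred = df.loc[df["condition"] == "blurred", measure]
    # Welch's t-test avoids assuming equal variances between the two groups
    return stats.ttest_ind(clear, blurred, equal_var=False)

# Example: result = compare_groups(survey_df, "perceived_privacy")
```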

This interdisciplinary project connects mechanical engineering, psychology, and human-robot interaction to better understand how people perceive robotic sensing. BARC’s expressiveness, biological inspiration, and controlled camera conditions make it a powerful research platform.

By comparing clear versus filtered camera views, this research explores whether privacy concerns come from what the robot actually sees, or what users believe it sees. Ultimately, the goal is to guide the design of future social robots that remain engaging and respectful of users’ privacy.

Special thanks to Dr. Eileen Roesler (Psychology) and Dr. Daigo Shishika (Mechanical Engineering) for their invaluable mentorship. Thank you to Katya Schafer for assistance with data collection, and to Dr. Karen Lee and OSCAR for their support and funding, which made this project possible.

Thank you!


Categories
College of Engineering and Computing · Making and Creating · OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR

Laser-Induced Graphene–Nanoparticle Platforms for Plasmonic Enhanced Photosensing

Author(s): Ali Kabli

Mentor(s): Pilgyu Kang, Mechanical Engineering


This project explores the potential for enhancing the performance of Laser-Induced Graphene (LIG) using metallic nanoparticles (NPs) as a platform for fabricating photosensors with enhanced sensitivity. The main question being addressed is: can a Laser-Induced Graphene–Palladium nanoparticle (LIG-PdNP) nanocomposite enhance sensor sensitivity through plasmonic and interfacial effects? Past research has used LIG as the functional material in a photosensor, and the rationale for adding metallic NPs to form a nanocomposite is to improve the sensitivity of the sensor by improving its photoresponsivity. This improvement comes from the plasmonic effects introduced by the NPs, which allow the photocurrent to flow more efficiently. The main novelty of this project’s approach lies in the synthesis of the nanocomposite: conventional methods deposit the NPs on the LIG surface, creating point contacts. The synthesis technique explored here is a one-step synthesis via precursors and a polymer substrate, which creates a “seamless interface” between the components of the functional material. This interface allows electrons to flow freely between the LIG and the NPs, enhancing the photoresponsivity of the device. Two devices were compared, one with 0 wt% PdNPs and another with 30 wt% PdNPs, in order to observe any improvement in device performance when illuminated with a blue laser (448.2 nm wavelength). Future work includes using NPs with stronger plasmonic effects, such as gold or silver, as well as refining the geometric footprint and pattern of the sensor itself to further increase performance.
Hello everyone, my name is Ali Kabli and today I’m going to present to you my undergraduate research project, Laser Induced Graphene Nanoparticle Platforms for Plasmonic Enhanced Photosensing. This project was advised by Dr. Pilgyu Kang from the Department of Mechanical Engineering.

So to give a brief background and introduction, past research has been done by Dr. Kang and his group, utilizing laser-induced graphene, or LIG, as a sensing element in photosensors. Now, these sensors operate based on the premise of photosensitivity. You basically shine a laser of some known wavelength at the sensor, which will induce some photocurrent. The change in photocurrent can be observed and used for sensing purposes. We want to improve the sensitivity of these devices by introducing metallic nanoparticles, or NPs, to increase the plasmonic effects and photoresponsivity of these devices. Now, for the purposes of this project, the specific nanoparticles that were used were palladium. However, any metal that has known plasmonic effects can be used.

For the purposes of this presentation, or project, we proposed a novel nanocomposite synthesis technique, which resulted in a seamless interface between the LIG and the nanoparticles. Traditional methods would have you deposit these nanoparticles on the surface of the LIG, or whatever substrate you’re using, which results in a point contact between the particles and the bulk surface. The downside to this is the fact that that point contact doesn’t allow for the most efficient flow of electrons. However, through a one-step synthesis technique using precursors and polymer substrate, we are able to integrate these nanoparticles within the surface of the laser-induced graphene itself, allowing the electrons to flow seamlessly.

So, the main question that we were answering with this research project was, can a laser-induced graphene palladium nanoparticle nanocomposite enhance sensor sensitivity through plasmonic and interfacial effects? The plasmonic effects, once again, coming from the fact that we’re using these metallic nanoparticles, and the interfacial effects coming from the seamless interface through our unique synthesis technique.

The methods and procedure for this project involved the actual synthesis of our nanocomposite using the one-step technique. Then we would fabricate the photosensor device using the synthesized nanocomposite. It should be mentioned that the scale of this sensor was 500 millimeters by 500 millimeters, which is actually quite large given the nanoscale. It’s very, very large. So that may have resulted in the data being slightly skewed, which is an improvement that we will go over at the end of this presentation. Then we collected optical data regarding the photoresponsivity of the device by hooking it up to an optical testing apparatus where we would shine a laser on and off at known intervals. The laser’s wavelength was known for the purposes of this project. We were using a blue laser, 448.2 nanometers in wavelength, and we would plot the resulting photocurrent as a function of time. The long-term goals of this project are to one day harness these nanocomposites as a platform for plasmonically enhanced PEC, or photoelectrochemical, gas sensors.
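As an illustrative sketch of this kind of on/off photocurrent analysis (the file format, column layout, and assumed laser power are placeholders, not the actual experimental values):

```python
# Hedged sketch: load a photocurrent-vs-time trace recorded while the 448.2 nm laser
# toggles on and off, estimate the on/off current step, and report a responsivity
# as current step divided by an assumed incident optical power.
import numpy as np
import matplotlib.pyplot as plt

t, current = np.loadtxt("photocurrent.csv", delimiter=",", unpack=True)  # s, A

laser_power_w = 1e-3                      # assumed incident power on the device
i_on = np.percentile(current, 95)         # plateau while the laser is on
i_off = np.percentile(current, 5)         # dark current while the laser is off
responsivity = (i_on - i_off) / laser_power_w   # A/W

plt.plot(t, current * 1e6)
plt.xlabel("Time (s)")
plt.ylabel("Photocurrent (uA)")
plt.title(f"Estimated responsivity: {responsivity:.2e} A/W")
plt.show()
```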

Now here’s just a brief snapshot of the results. We see on the left side a comparison between the photocurrent resulting from a 30 weight percent nanoparticle nanocomposite and on the right side we have the photocurrent resulting from just pure LIG. As you can see the scale on the left side is in microamps, and the scale on the right side is in nanoamps, which means that we were able to show a drastic improvement, three orders of magnitude to be exact.

In conclusion, the experiment was a huge success in proving that plasmonic effects could enhance the sensitivity of these devices. However, more work is still needed in the future. We can refine the geometry and footprint of the sensor itself so that it’s a lot smaller than 500 by 500 millimeters. We can also test other nanoparticles with known greater plasmonic effects, such as gold or silver. And we can also play around with different laser parameters, focusing the laser’s beam more, increasing the wavelength, etc.

Some acknowledgements. Of course, my advisor, Dr. Pilgyu Kang, Graham Harper, who aided in data collection on this project, and Philip Acatrinei, for being an indispensable help in data collection and in setting up the experiment itself. He actually programmed the software that we were using to collect the data. So without him, this project would not have been possible. Thank you.

Categories
College of Engineering and Computing · Making and Creating

Controlled Syringe Pump Extrusion to Create Hydrogel Gradients

Author(s): Elizabeth Clark

Mentor(s): Remi Veneziano, Bioengineering

The primary objective of this project is to build upon my previous research, where I developed a method to create hydrogel gradients. Hydrogels consist of polymer(s) suspended in water. A gradient is the change from one concentration to another. I used a 10% weight ratio of gelatin to deionized water, mixed with dye. The gelatin was heated and stirred until dissolved, then split into two portions, dyed two different colors, and placed into syringes at around 45 °C. Using a specialized nozzle, I could feed two syringes into one nozzle with a static mixer at the tip to ensure the gelatin was evenly mixed. Gelatin is a liquid at higher temperatures (40-50 °C) and sets at lower temperatures, so the gradients were extruded onto a chilled metal plate, where the gelatin set almost immediately. Depending on how fast one syringe extruded relative to the other, I could change the color and even mix the two. After creating several gradients by hand, I used a syringe pump to achieve even extrusion rates. The pumps were operated by alternating which one was extruding, so a colleague had to move the plate. This has applications in bioprinting, where, rather than having somebody move the plate manually, the bioprinter moves the plate or the extruder. These results build on the potential of bioprinting gradients for use in regenerative medicine and other bioengineering applications.

Hello, my name is Elizabeth Clark. I’m a bioengineering student, and my research project builds on my previous project, which was creating hydrogel gradients, as many cellular functions and processes in the human body utilize gradients.

So for keywords and background: hydrogels consist of polymers in water, and a gradient is the change of concentration, in this case along a line, represented here by the changing color. Gelatin is the hydrogel I used, at a 10% concentration, so for 10 mL of water I would use 1 g of gelatin. A syringe pump is the tool I used to control the extrusion; it allows you to program different extrusion rates.
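As a rough illustration of how two programmable pump rates could be ramped against each other to keep total flow constant while shifting the color mix (the rates, step count, and helper function are assumptions, not the project’s actual pump settings):

```python
# Hedged sketch: complementary flow rates for two syringe pumps so that total
# extrusion stays constant while the fraction of dye A rises from 0 to 1 along
# the gradient. Values are illustrative.
def gradient_rates(total_rate_ml_min: float = 1.0, steps: int = 10):
    """Return (pump_A, pump_B) flow rates for each segment of the gradient."""
    schedule = []
    for i in range(steps):
        frac_a = i / (steps - 1)            # fraction of dye A, 0 -> 1 along the line
        rate_a = frac_a * total_rate_ml_min
        rate_b = total_rate_ml_min - rate_a # pump B makes up the remaining flow
        schedule.append((round(rate_a, 3), round(rate_b, 3)))
    return schedule

# Example: for rate_a, rate_b in gradient_rates(): set both pumps, extrude one segment
```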

So this slide just shows briefly the setup I used in my previous research project, which I modified slightly for the gelatin. Two syringes are fed into a custom static mixer and extruded by hand. I only did this a few times, just to ensure that a different hydrogel would work. Gelatin is a liquid at warmer temperatures, around 40 to 50 °C, and when placed on a cooler surface, in this case a chilled metal plate, it sets almost immediately.
I then did the same with the syringe pumps, altering the color by which syringe was extruding.

Here’s a video of that. When I wanted to change the color, I would just pause one syringe pump and start extruding on the other, and then flip it. As you can see, on the right is the gradient that was just created in the video.

Special thanks to OSCAR for funding this project, as well as my mentor Dr. Remi Veneziano and the other people listed. Thank you.