OSCAR Celebration of Student Scholarship and Impact
Categories
College of Engineering and Computing College of Humanities and Social Science Honors College Making and Creating OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR Winners

A Robotic Cat for Examining Camera Clarity and Privacy in Human–Robot Interaction

Author(s): Alexia De Costa

Mentor(s): Eileen Roesler, Department of Psychology


Abstract

This project presents the Bioinspired Automated Robotic Cat (BARC), a functional companion robot designed to support research in human–robot interaction and privacy-aware design. BARC features camera-based facial detection, expressive gaze behaviors, audio responses, and various soft and rigid materials to mimic a household cat. Because camera systems can enhance interaction while raising privacy concerns, the ongoing study compares people's responses under two conditions: a clear, high-quality camera filter and a blurred, low-clarity camera filter. Using surveys and observation of touch behavior, the study examines how camera clarity shapes engagement and perceived privacy, informing the design of social robots that are effective while respecting user comfort.

Audio Transcript

Have you ever wondered what a robot actually sees when it looks at you?
Today, social and service robots are becoming increasingly common, and many rely on cameras for facial recognition and user engagement. But as useful as cameras are, they also raise important questions: Do they make people feel watched? Can a robot feel friendly while still respecting privacy?

These questions lie at a key intersection in human–robot interaction: robots need perception to understand us, yet high-resolution sensing can make people uncomfortable. So I wanted to explore a central challenge: can we reduce privacy concerns without making interactions less enjoyable? And does being transparent about what a robot sees change how people feel?

To investigate this, I designed and built a robot cat from scratch called BARC, the Bioinspired Automated Robotic Cat. BARC is part engineering platform and part research tool. It can switch between two controlled camera conditions: a clear, high-quality camera filter and a blurred, low-clarity filter that still allows for partial facial detection. These interchangeable physical filters let me directly compare how different levels of sensing clarity influence interaction.

BARC is also designed to feel expressive and lifelike. It uses camera-based facial detection for gaze behavior, animated OLED eyes, a speaker for cat-like sounds, and soft and rigid materials that mimic the look and feel of a household cat. Through surveys and observations of touch behavior, my ongoing study explores how these two camera conditions shape user engagement and perceived privacy.

To create BARC, I began with feline anatomical references, studying limb placement, joint spacing, and overall proportions to inspire the CAD model for the chassis. I laser-cut the acrylic components and assembled them using screws and tab-and-slot joints for a sturdy, lightweight frame.

At the heart of the robot is a Raspberry Pi 4, which handles perception and behavioral control.

A camera provides the main sensory input for facial detection.

Two OLED displays animate expressive eyes that track the user once a face is detected, giving the illusion of attention and social presence.

A speaker and amplifier generate a range of cat sounds, from meows to purrs to alarmed yowls.

An accelerometer-gyroscope detects movement, such as being picked up or shaken, so BARC can respond appropriately.

Servos, controlled by a PCA9685 driver, animate the limbs, jaw, head, and tail.

All behaviors are programmed in Python and organized in a state machine with modes such as Idle, Seeking Attention, Interacting, and Startled. BARC transitions between these states based on sensory input and probability, helping interactions feel natural rather than scripted.
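The probabilistic state machine described above can be sketched in plain Python. The state names come from the presentation, but the events, the transition table, and every probability below are invented here purely for illustration; BARC's actual code is not shown in this presentation.

```python
import random

# Behavior states named in the presentation.
STATES = ["Idle", "SeekingAttention", "Interacting", "Startled"]

# Hypothetical transition table: for each (state, event) pair, a list of
# (next_state, probability) choices. The probabilities are made up.
TRANSITIONS = {
    ("Idle", "face_detected"): [("Interacting", 0.7), ("SeekingAttention", 0.3)],
    ("Idle", "timeout"): [("SeekingAttention", 0.5), ("Idle", 0.5)],
    ("SeekingAttention", "face_detected"): [("Interacting", 1.0)],
    ("Interacting", "shaken"): [("Startled", 1.0)],
    ("Startled", "timeout"): [("Idle", 1.0)],
}

def step(state, event):
    """Pick the next state for a sensory event, weighted by probability."""
    choices = TRANSITIONS.get((state, event))
    if choices is None:
        return state  # unrecognized event in this state: stay put
    next_states, weights = zip(*choices)
    return random.choices(next_states, weights=weights)[0]

# A short interaction: a face appears, then the robot is shaken.
state = "Idle"
state = step(state, "face_detected")  # Interacting or SeekingAttention
if state == "Interacting":
    state = step(state, "shaken")     # Startled
```

Mixing weighted random choices into the transition table is what keeps the behavior from feeling scripted: the same stimulus can lead to different reactions.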

To examine how camera clarity influences engagement and privacy perceptions, BARC serves as a fully capable research platform. Seventy-two participants are currently part of a single-blind study with two groups:

Group 1: interacts with BARC using a clear camera filter

Group 2: interacts with BARC using a blurred, privacy-preserving filter

The physical filter is noticeable, so using filters in both groups keeps the robot visually consistent. That way, any differences we see are truly due to what the robot can or can’t perceive.

Participants interact with BARC, complete a survey measuring constructs such as Perceived Sociability and Perceived Enjoyment, and then are shown a live camera feed so they can see the actual resolution of the robot’s vision. Afterward, they complete a second survey measuring perceived privacy, perceived surveillance, disturbance, and attitudes about robots.

The hypotheses are:
1: No difference in sociability, enjoyment, or touch behavior.
2: The filtered-camera group will report higher perceived privacy.
3: The clear-camera group will report higher perceived surveillance.

This interdisciplinary project connects mechanical engineering, psychology, and human-robot interaction to better understand how people perceive robotic sensing. BARC’s expressiveness, biological inspiration, and controlled camera conditions make it a powerful research platform.

By comparing clear versus filtered camera views, this research explores whether privacy concerns come from what the robot actually sees, or what users believe it sees. Ultimately, the goal is to guide the design of future social robots that remain engaging and respectful of users' privacy.

Special thanks to Dr. Eileen Roesler (Psychology) and Dr. Daigo Shishika (Mechanical Engineering) for their invaluable mentorship. Thank you to Katya Schafer for assistance with data collection, and to Dr. Karen Lee and OSCAR for their support and funding, which made this project possible.

Thank you!

Categories
Making and Creating OSCAR

Collaboration and Community: The Rise of Collective Creativity in Contemporary Christian Music since 2020

Author(s): Evan Sites

Mentor(s): Jesse Guessford, OSCAR

Abstract

For the research portion of my capstone project, I examined the noticeable increase in collaboration among Contemporary Christian Music (CCM) artists since 2020. The CCM industry has long thrived on collaboration, but in the years following 2020, the practice has grown in both frequency and visibility. Several factors, including the rise of digital distribution, the influence of streaming platforms, and changes in worship culture accelerated by the COVID-19 pandemic, have contributed to this growth. By studying these developments, I aim to gain a deeper understanding of how collaboration serves as both an artistic and strategic tool for CCM artists and its impact on the CCM industry as a whole.

Audio Transcript

Hi, my name is Evan Sites and I am a senior here at George Mason University. I am currently pursuing a degree in music technology as well as minoring in business.

For my capstone project this semester, the research portion specifically, I decided to look at the Christian contemporary music industry, the reason being that it is a field of music that I am very passionate about and that I enjoy making as well as listening to.

What I researched this semester was the noticeable increase in collaboration among CCM artists since 2020. Now, the CCM industry has long thrived on collaboration, but I believe that the practice has grown in both frequency and visibility. I believe that three things contributed to the increase in collaboration that we see today: number one, the rise of digital distribution; number two, increased use of streaming platforms; and number three, worship culture changes accelerated by the COVID-19 pandemic. I believe that I have gained a deeper understanding of how collaboration serves as an artistic and strategic tool for CCM artists as well as the CCM industry as a whole.

For this research project, I specifically looked at four different case studies. First, I looked at an album that Maverick City Music came out with a couple of years ago; this album represents inclusivity, innovation, and community within the worship industry. Another case study I looked at was a song called The Blessing, which is an example of what a researcher said in his book regarding how digital technology has reshaped music making in the Christian context. The third case study was an album by Elevation Worship and Maverick City Music called Old Church Basement. This album achieved massive commercial and spiritual success: according to the Gospel Music Association, Old Church Basement set a new worldwide record for the most first-day streams for a Christian and gospel album on Apple Music, and it also won multiple Grammys that year. The last case study was an album by Chris Tomlin entitled Chris Tomlin and Friends, a cross-genre collaboration album featuring multiple artists from the pop scene as well as some country artists. This album is an example of what Emma Madden highlights in her article published by NPR about recent collaboration between CCM artists and secular pop artists; she frames these partnerships as signs of the genre's evolving cultural position.

The broader implications of this study show that collaboration serves as both an artistic and strategic tool for CCM artists. Based on the research I read during this semester, I foresee that digital collaboration is going to be the new model for creative ministries.

In conclusion, since 2020, collaboration has become a defining force in the Christian contemporary music scene. It unites artists and audiences through shared faith and creativity. Digital platforms have defined worship as something both local and global, and the future of CCM lies in community-based creativity. Thank you.

Categories
Cells, Individuals, and Community College of Public Health Honors College OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR

Rest and Results: The Relationship Between Sleep, Stress, and Grade Point Average (GPA) in Undergraduates

Author(s): Michael Kaleem

Mentor(s): Ali Weinstein, College of Public Health


Abstract


Background
Sleep plays an important role in college students’ cognitive functioning and overall academic success, making it a crucial area of study. However, the specific relationship between parameters of sleep and academic performance has not been well studied. In addition, college students have also reported increasing levels of stress over the past few years, and stress can affect both sleep and academic success. Therefore, the current investigation examined the associations between duration of sleep, sleep quality, and stress with academic success.

Methods
Data were collected by surveys completed by undergraduate students at a large, public university. Sleep duration and sleep quality were assessed by the Pittsburgh Sleep Quality Index (PSQI), and stress was measured with the Perceived Stress Scale. Academic success was operationalized as self-reported GPA. Pearson correlations determined associations between the variables of interest, with p<0.05 set as the level of statistical significance.

Results
There were 196 undergraduate students that participated (70.1% female, 36% white/non-Hispanic, 27.6% Asian/Pacific Islander, age: 18.1±0.5). Both sleep duration and sleep quality were statistically significantly related to GPA (r=0.17, p=0.02 and r=-0.13, p=0.001, respectively). Therefore, as the number of hours of sleep increased and as sleep quality improved (a lower PSQI score is indicative of better sleep), GPA increased. Elevated stress levels were related to both sleep duration (r=-0.14; p=0.01) and sleep quality (r=0.40; p<0.001) but not significantly correlated with GPA (r=-0.01; p=0.92).

Conclusion
This study found that sleep duration and sleep quality were positively associated with academic success. Although stress was not directly related to academic success, it was associated with both sleep duration and quality, suggesting that stress may influence academic success indirectly through its effects on sleep. Future research should explore how demographic, socioeconomic, and environmental factors influence sleep patterns and academic success to better inform strategies that support student success.

Audio Transcript

How many hours of sleep did you get last night? And do you think it affects your GPA? Sleep is something most college students sacrifice, yet it’s essential for memory, learning, and mental functioning. My name is Michael Kaleem, and our research explored the relationship between sleep, stress, and academic performance in undergraduates. We wanted to know: Could better sleep actually lead to better grades—and how does stress fit into the picture?

Sleep is more than just rest. During sleep, the brain strengthens memories, organizes information, and supports attention and problem-solving. So, in theory, students who sleep longer and sleep better should perform better academically. But college life is complex—so real data is needed to understand what’s actually happening.

Stress is one of the biggest disruptors of sleep. High stress can shorten sleep duration, worsen sleep quality, and impact mood and focus. Because stress influences both sleep and academic functioning, we wanted to understand whether stress plays a direct role in GPA—or whether its effects occur indirectly through sleep.
We surveyed 196 undergraduate students at a large public university. Sleep duration and sleep quality were measured using the Pittsburgh Sleep Quality Index, stress was assessed using the Perceived Stress Scale, and students self-reported their GPA. We used Pearson correlations to examine how these variables were related, with significance set at p < 0.05.
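Pearson's r, the statistic used in this study, can be computed from its textbook definition in a few lines of Python. The sleep and GPA numbers below are toy values for illustration, not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy data: hours of sleep vs. GPA for five hypothetical students.
sleep = [5, 6, 7, 8, 9]
gpa = [2.8, 3.0, 3.2, 3.5, 3.6]
r = pearson_r(sleep, gpa)  # positive r: more sleep, higher GPA in this toy sample
```

A positive r, as reported for sleep duration and GPA, means the two variables tend to rise together; in practice the significance test at p < 0.05 would come from a statistics package rather than hand computation.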
We found that both sleep duration and sleep quality were significantly related to GPA. Students who slept more hours tended to have higher GPAs. And students with better sleep quality—which means fewer sleep problems—also had higher GPAs. So in this sample, sleep really did matter for academic success.
Stress told a different story. Stress levels were not directly related to GPA. However, stress was strongly connected to both sleep duration and sleep quality. Students with higher stress slept fewer hours and had worse sleep quality. This suggests that stress may influence academic performance indirectly—by affecting the amount and quality of sleep students get.
Our findings show that sleep duration and sleep quality are important predictors of academic success. Even though stress didn’t directly affect GPA, it played a major role in disrupting sleep. This highlights a powerful message: helping students improve sleep habits and manage stress can support academic performance, cognitive functioning, and overall well-being.
Future research should explore how demographic, socioeconomic, and environmental factors influence sleep and academic outcomes. Understanding these differences can help universities design more effective programs to support healthier sleep, reduce stress, and improve student success across diverse populations.

Categories
Making and Creating Undergraduate Research Scholars Program (URSP) - OSCAR

Design and characterization of a de novo adenine binding protein

Author(s): Amber Middleton

Mentor(s): Lee Solomon, Biochemistry

Abstract

This study explores the de novo design and characterization of a protein engineered to selectively bind adenine, a molecule critical to ATP function, nucleotide recognition, and a wide range of cellular processes [2]. Our objective is to determine whether targeted structural mutations can enhance adenine binding affinity beyond the levels achieved by the original computational model. The designed protein will be expressed via recombinant DNA techniques and purified using Ni-NTA affinity chromatography. Structural and functional characterization will be carried out using a variety of analytical techniques, including SDS-PAGE, circular dichroism (CD), fluorescence spectroscopy, surface plasmon resonance (SPR), and isothermal titration calorimetry (ITC). These methods will evaluate protein purity, secondary structure, and protein–ligand binding behavior. Through this approach we aim to identify key structural determinants that improve adenine specificity, offering new insights into rational protein design and the molecular basis of protein–ligand recognition.

Audio Transcript

My name is Amber Middleton, and my project is called the design and characterization of a de novo adenine-binding protein. Our research question is: can specific structural mutations in a de novo designed protein enhance binding affinity and specificity for adenine compared to the original computational design? So why adenine? All nucleotide bases are rigid and aromatic, but adenine has a hydrogen bond donor and acceptor arrangement that is unique and specific enough for selective binding to proteins, so it is well suited for binding when interacting with ATP. The mutations that we made are from alanine to isoleucine, alanine to valine, valine to threonine, and glycine to serine; all of these mutations were designed by a recent PhD student, Robert Spain. For our methods, we had to express the protein, purify it, check the purity, check the secondary structure, and do a series of binding assays. The first step is expression of the proteins using recombinant DNA techniques, TB media, LB media, and inoculation overnight. Then we move on to Ni-NTA chromatography to purify the protein: you load your protein onto the column, it runs through, you rinse it with binding buffer, then with mixtures of binding buffer and elution buffer, and you collect each fraction for separate analysis. You then check the purity via SDS-PAGE: you take the samples collected from the column, run them at 180 V, and they separate by mass; ideally, the thicker or darker the bands, the higher the protein concentration. You take the thickest, darkest ones and do dialysis to remove small salts, and then move on to circular dichroism to check the secondary structure. For alpha-helical proteins, you'll see two negative peaks, one at 208 nm and one at 222 nm, which we do see in both of the pictures to the right, the top being the wild type and the bottom being the mutant.

We then moved on to our first binding assay, surface plasmon resonance. We were only able to do a negative control with the wild type and adenine, showing that adenine does not bind to the wild type. We then moved to our second binding assay, fluorescence anisotropy. This measures molecular interactions by detecting changes in a fluorescent molecule's rotation: the faster the tumbling, the less binding is happening, and the slower the tumbling, the more binding is happening. These are some of our results from the first few anisotropy experiments. As you can see in yellow, these are somewhat strange values: they imply that the protein alone gives more of a signal than protein plus adenine does, which essentially means that adenine is quenching the protein signal, or that other things such as G-factor issues are going wrong. Our anisotropy values are expected to be between 0 and 0.4, but that is not what we see in the highlighted data. Because of this, we moved on to isothermal titration calorimetry, which measures heat released or absorbed, and these are the results that followed. On the left we have an ITC thermogram of Valerab into the wild-type AT&D, which is our positive control, and we were able to see decreasing exothermic peaks, which shows that there was a functional ligand interaction and binding with the wild-type protein. On the right we have an ITC thermogram of adenine with the AT&D mutant, which looks nothing like the one on the left, showing only small reproducible heat changes and no binding.

So, our conclusions and future directions: we were able to express and purify our proteins and get a four-alpha-helix bundle, but upon doing binding assays, we determined that adenine and our mutant do not bind to each other. So we have to go back to in silico design to re-engineer the binding pocket for adenine recognition using structural modeling and computational design, followed by validation with CD, fluorescence, SPR, and ITC. We aim to create a new mutant capable of selectively binding adenine, and this will help our understanding of how targeted mutations shape ligand specificity in novel protein scaffolds.

Categories
Making and Creating OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR

Plasmonic Metal-Infused Laser-Induced Graphene for Enhanced Photodetection

Author(s): Graham Harper

Mentor(s): Pilgyu Kang, GMU Mechanical Engineering

 

Abstract

Laser-Induced Graphene (LIG) is a promising platform for next-generation flexible photodetectors due to its high conductivity, scalability, and low-cost fabrication. However, its optical-to-electrical conversion efficiency remains limited by weak light–matter interaction. In this work, we enhance LIG photodetection performance through the in-situ infusion of plasmonic palladium nanoparticles into the polymer precursor prior to laser carbonization. During laser processing, the nanoparticles become embedded within the porous graphene microstructure, enabling localized electromagnetic field enhancement via surface plasmon resonance. Electrical characterization under UV illumination demonstrates improved resistance modulation and consistent ON/OFF cycling behavior in Pd-infused LIG compared to bare LIG samples. These initial results confirm plasmon-assisted photocarrier generation and highlight an effective, single-step approach to improving responsivity in flexible photodetectors. Future efforts will investigate wavelength-dependent response and additional plasmonic materials such as silver and gold nanoparticles.

Audio Transcript

Hello, my name is Graham Harper from the Mechanical Engineering Department at George Mason University. Today, I’ll be presenting my research about Plasmonic Metal-Infused Laser-Induced Graphene for Enhanced Photodetection.

Photodetectors are critical components in environmental and optical sensing systems. However, many conventional photodetectors are expensive to fabricate and lack flexibility.
Laser-Induced Graphene offers a more scalable and low-cost alternative due to its conductive porous structure and ability to be processed on flexible substrates.
The challenge is improving how efficiently it converts light into a measurable electrical signal.

One promising way to improve photodetection is by taking advantage of surface plasmon resonance.
Metal nanoparticles, such as palladium, can enhance local electromagnetic fields when illuminated, generating more charge carriers in the device.
By infusing metal nanoparticles directly into the polymer before laser conversion, the plasmonic functionality becomes embedded within the graphene structure.
Our hypothesis is that metal infused laser-induced graphene will perform better under illumination than bare laser-induced graphene.

Our objective is to fabricate laser induced graphene using a UV or CO₂ laser, characterize its structure and electrical properties, and measure photodetection performance under illumination.
The main goal is to determine whether palladium-embedded laser induced graphene produces enhanced optical-electrical response.

To create Palladium infused laser-induced graphene, a palladium-doped polymer solution is spin-coated for thickness uniformity. A laser induces carbonization to form conductive graphene that has palladium nanoparticles dispersed throughout.
Electrical contacts are added using silver paste and copper wires.
Samples are tested under a 62 mA UV laser while recording resistance changes as the light switches on and off.

Our results show a clear increase in resistance change under illumination for the Pd-infused samples.
The cycling data demonstrates consistent ON/OFF behavior with strong repeatability, confirming plasmon-assisted photocarrier generation and successful light response.
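One common way to summarize ON/OFF cycling data like this is a fractional resistance change per cycle. The metric below (|ΔR|/R_dark) and all resistance values are illustrative assumptions, not the actual measurements or analysis from this work.

```python
def photoresponse(r_dark, r_light):
    """Fractional resistance change when the light source switches on."""
    return abs(r_light - r_dark) / r_dark

# Hypothetical ON/OFF cycles: (dark resistance, illuminated resistance) in ohms.
cycle = [(1000.0, 940.0), (1005.0, 948.0), (998.0, 941.0)]
responses = [photoresponse(dark, light) for dark, light in cycle]

# Repeatability check: per-cycle responses should cluster around the mean.
mean_response = sum(responses) / len(responses)
```

A consistent per-cycle response with low spread is what "strong repeatability" refers to: the device returns to the same dark resistance and shows a similar change each time the light turns on.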

We successfully created plasmonically enhanced laser-induced graphene, palladium-infused laser-induced graphene showed stronger optical-electrical response, and the fabrication method remains low-cost and scalable.
This demonstrates that plasmonic nanoparticles provide an effective pathway to improve flexible photodetectors.

Future goals include testing silver and gold nanoparticles with stronger plasmonic response, expanding testing to more wavelengths beyond UV, conducting durability and reliability testing, and performing additional structural analysis (Raman, SEM).

Thanks to the Undergraduate Research Scholars Program, Dr. Pilgyu Kang, and the Nanomaterials Lab at GMU for their support.

Categories
Cells, Individuals, and Community College of Engineering and Computing OSCAR Undergraduate Research Scholars Program (URSP) - OSCAR

Pain, Medication Use and Biomarker Associations in Individuals with Polycystic Ovary Syndrome (PCOS): Insights from the All of Us Research Program

Author(s): Jannatul Nayeem

Mentor(s): Jenny Phan, CASSBI


Abstract

Background:
Polycystic Ovary Syndrome (PCOS) is a complex endocrine disorder often accompanied by chronic pain, yet the biological and social determinants of this pain remain underexplored. Understanding how stress-related biomarkers and healthcare access interact in shaping pain experiences may reveal mechanisms underlying health disparities in PCOS populations.

Objective:
This study examined associations between inflammatory and neuroendocrine biomarkers (C-reactive protein (CRP), cortisol, and body mass index (BMI)) and pain burden among individuals with PCOS, while exploring the moderating role of healthcare access and insurance coverage.

Methods:
Using data from the All of Us Research Program, 2,160 adults with PCOS (identified by ICD-9/10 codes) were analyzed. Pain burden was measured through pain-related diagnoses and pain medication dosage. Biomarker distributions were winsorized, log-transformed, and analyzed via multivariate regression models adjusting for age, race, socioeconomic status, and healthcare variables.
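The winsorizing and log-transform steps named in Methods can be sketched in plain Python. The 10% tail cutoff and the CRP values below are assumptions chosen for illustration, not the study's documented settings.

```python
import math

def winsorize(values, pct=10):
    """Clamp the lowest/highest pct% of values to the nearest kept value,
    reducing the influence of outliers without dropping observations."""
    s = sorted(values)
    k = int(len(s) * pct / 100)  # number of points clipped at each tail
    lo, hi = s[k], s[len(s) - 1 - k]
    return [min(max(v, lo), hi) for v in values]

def log_transform(values):
    """Natural-log transform, often applied to right-skewed biomarkers
    such as CRP before fitting regression models."""
    return [math.log(v) for v in values]

# Hypothetical CRP readings (mg/L) with two extreme outliers.
crp = [0.5, 1.2, 0.8, 15.0, 2.1, 0.9, 40.0, 1.1, 0.7, 1.6]
cleaned = log_transform(winsorize(crp))
```

After this preprocessing, the (here hypothetical) biomarker would be entered into the multivariate regression models along with the covariates listed above.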

Results:
Pain burden alone was not significantly associated with higher CRP, cortisol, or BMI levels. However, healthcare access moderated these relationships: participants with greater barriers to care exhibited elevated inflammation and BMI with increasing pain, whereas those with adequate access showed flatter or reduced biomarker trends.

Conclusions:
Findings suggest that chronic pain and stress responses in PCOS may be shaped more by social and contextual factors than biological burden alone. Enhancing healthcare accessibility and equity could mitigate stress-related physiological outcomes and improve pain management for individuals with PCOS.

Audio Transcript

Hello, my name is Jannatul Nayeem. I am a student researcher with the B&LAB at George Mason, working directly with Dr. Jenny Phan on her allostatic load study, which looks at the body's biological stress response and how it relates to menstrual disorders and chronic pain. From that study, I wanted to dive deeper into PCOS and look at pain, medication use, and biomarker associations in individuals with that disorder.

For methodology, I started by using the All of Us database. It has a ton of data on individuals with all sorts of diseases, including information from their doctor visits, patient records, and some survey questions that the program itself asks participants. Through that database, I was able to find 2,160 individuals with PCOS and dive deeper into their biomarkers. Specifically, I used three biomarkers as predictors of inflammation and stress: BMI, C-reactive protein, and cortisol. For the outcome variable, I used pain diagnoses along with medication usage; for medication usage, I accounted for how many medications participants were taking and what the dosage was. The covariates I used were age, race, and an SES index, and for moderators I looked at healthcare access, specifically insurance status and access to care.

Zooming into the results: I found that pain alone didn't predict inflammation or stress, but limited access to care did. Individuals with more barriers, such as lack of insurance, showed higher inflammation and BMI with pain, suggesting that health equity plays a critical role in PCOS pain. While my insurance data was limited, because not many people answered those questions, the results still show some support for the idea that stress biology and pain in PCOS are influenced by social environment and not just physiology.

Right now, I am continuing this study to deepen my understanding and advocate for equitable pain care in PCOS populations. One thing I want to focus on more is improving how we state these questions, because asking about access to care and insurance status is pretty sensitive, so how can we change the way someone feels about answering those types of questions? And so, yeah, that was my study. Thank you for listening.

Categories
Making and Creating OSCAR

Machine Learning aided Nanoindentation to Discover Material Properties

Author(s): Jake Samuel

Mentor(s): Ali Beheshti, Mechanical Engineering

 

Abstract

Machine learning has been explored as a method of identifying material properties from a material's indentation data, a process called inverse analysis. A paper by Lu Lu et al. examines machine learning techniques that could aid this process by adding a residual connection to a neural network (MFRN) [4]. This work examines how that technique improves inverse analysis for small samples of high-fidelity data. The MFRN was compared to Gaussian process regression, a well-established multi-fidelity model. It was found that adding a residual connection significantly lowered inverse-analysis error for smaller samples of data.

Audio Transcript

Hello, my name is Jake Samuel and I’m here to talk about using machine learning aided nano indentation to discover material properties. So first let me get some background. Material property testing can be a long and expensive process. It can be hard to test on materials that may be small, or if you’re using a destructive test on a thin film material, you might not want that. So another way to go about for testing material properties is using nano indentation. So nano indentation is the process of adding a small indent in a material and then you’re measuring the loading and the unloading, the force over depth. And then if we were to know the material’s microstructure and the relationship of that between its properties, we can, we can find the material properties. However, that is a, that can be quite a hard process to do rigorously. So instead, machine learning techniques have been applied to this kind of problem with pretty good success, as machine learning models are able to find patterns which can be, that might not be obvious for humans. So the machine learning model we’re going to be examining in this model is a neural network which is just composed of, as you can see here, some hidden layers which have some kind of activation function. So each of them do a little math, and then you have some number of inputs which all lead to some number of outputs and essentially functions as a black box. And you can train it over very large data sets. But the problem is that if you were to train it on indentation data, an indentation test is also, you know, it’s a test you have to run. It can take a long time to perform. And if you are in the business of data collection, you know that data can become corrupted or you can have bad data really easily, and indentation is no different. If you have a long period of testing for indentation, you are likely to have bad data if the test gets messed up. 
So there is a paper by Lu Lu that examines how certain machine learning techniques can be improved. An example of this is multi-fidelity learning: if you don’t have a lot of data points, you can supplement them with a lot of lower-quality data points. For indentation, if you don’t have a lot of experimental data points, you can supplement them with simulations. One of the innovations described in this paper is adding a residual connection to improve the neural network. You will see some math here, so I’ll try to explain it quickly. Multi-fidelity learning uses a linear function and a nonlinear function in combination to learn both the linear and nonlinear relationships between the high- and low-fidelity data. What Lu Lu describes here is a residual connection, alpha-L times y-L, which in theory should make it easier to learn from data that is already correlated. We’re going to compare this to Gaussian process regression, a popular machine learning algorithm that has been used since 2001; multi-fidelity Gaussian process regression models have existed for a long while, so we can call this the standard in multi-fidelity machine learning. In my work, I have tried to replicate the machine learning model proposed by Lu Lu and compared it with traditional Gaussian process regression. Here, using simple simulation data, where the low fidelity comes from Fem2D and the high fidelity from Fem3D, our multi-fidelity model with the residual, in orange, outperforms the multi-fidelity Gaussian process quite significantly. Now, for any of you familiar with machine learning, you might think the learning curve is a bit flat; usually you’ll see a downward slope like the one here. The reason is that since this is strictly simulation data, it is really easy for a machine learning model to learn.
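The residual idea can be sketched numerically. This is a deliberately simplified stand-in, assuming a linear correction fit by ordinary least squares rather than Lu Lu’s actual neural network; the point is only the composition y_H ≈ alpha · y_L + correction, where the low-fidelity prediction is passed straight through and only the discrepancy has to be learned. The two fidelity functions below are made-up examples, not FEM data.

```python
def fit_multifidelity(y_lo, y_hi):
    """Fit y_hi ~ alpha * y_lo + b by least squares: the residual
    connection (alpha * y_lo) carries the low-fidelity prediction,
    and b stands in for the learned correction term."""
    n = len(y_lo)
    mean_lo = sum(y_lo) / n
    mean_hi = sum(y_hi) / n
    cov = sum((l - mean_lo) * (h - mean_hi) for l, h in zip(y_lo, y_hi))
    var = sum((l - mean_lo) ** 2 for l in y_lo)
    alpha = cov / var
    b = mean_hi - alpha * mean_lo
    return alpha, b

# Cheap "low fidelity" model (plentiful but biased) vs. a few
# expensive "high fidelity" points (scarce ground truth):
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
y_lo = [0.9 * x + 0.1 for x in xs]   # e.g. a coarse simulation
y_hi = [1.0 * x + 0.3 for x in xs]   # e.g. experiment / fine simulation

alpha, b = fit_multifidelity(y_lo, y_hi)

def predict(x):
    """High-fidelity estimate from the cheap model plus the correction."""
    return alpha * (0.9 * x + 0.1) + b
```

Because the discrepancy between fidelities is simpler than either function alone, only a handful of high-fidelity points are needed to fit the correction, which is the practical advantage the talk describes.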
Think of it this way: data that a computer spits out would, intuitively, be easy for a computer to learn from. Measured with mean absolute percentage error, the residual model does significantly better, at almost 10% MAPE, while the MFGP gets around 90% MAPE. Now, replicating these results on actual experimental data gives us some interesting figures. Here, again in orange, is the multi-fidelity model with the residual connection, and as you can see, it does a lot better at lower data set sizes. Up until you get to a high-fidelity data set size of about 60 points, the residual connection does a lot better, but it then evens out to be about the same. The traditional Gaussian process regression does do a little bit better at 0.5% MAPE, while the residual-connection network plateaus at 1% MAPE. From this we can learn that using a residual connection, as proposed by Lu Lu, seems to hold promise for learning from small data sets and low-fidelity data sets. This is important because it could greatly reduce costs, and if you are running into problems with data collection, it is a useful thing to know. Future work will explore how physical data from indentations (here we have the mark of an indentation, from which we can measure the indentation depth and width) could potentially relate to material properties such as creep. This is relevant because such data gives you a lot less information than you would normally get from nanoindentation tests. I would like to give a big thank you to Dr. Ali Beheshti and Shaheen Mahmood for supporting me in the lab, to Dr. Karen Lee and the OSCAR program for giving me the opportunity to do my research, and thank you for watching.

Categories
Undergraduate Research Scholars Program (URSP) - OSCAR US, Global, and Beyond

Game-Theoretic Analysis of International Pollution Policy

Author(s): Andrew Dara Or

Mentor(s): Moon Joon Kim, Economics, Mason Korea


Abstract

International pollution policy consistently fails to achieve meaningful emissions reductions, not because states lack environmental concern, but because the strategic structure of global cooperation rewards defection. This project applies game-theoretic analysis to explain why voluntary climate agreements break down and identifies the mathematical conditions under which cooperation becomes rational. Using a formal payoff model, I show that a country will only cooperate when the sum of sanctions and its discounted share of global climate benefits exceeds the domestic costs of mitigation and the gains from cheating:

s + αPV(B) ≥ c + g.

A key finding is that the Social Discount Rate (SDR) plays a decisive role in shaping this inequality. Higher SDRs sharply reduce the present value of future climate benefits, making defection the dominant strategy. Real-world policy behavior in the United States aligns with this model: high SDRs corresponded with withdrawal from the Paris Agreement, while lower SDRs accompanied renewed cooperation.

Case studies of the Montreal Protocol and Paris Agreement demonstrate that successful treaties alter incentives through sanctions, financing, and monitoring. The analysis also incorporates recent evidence linking pollution to declining political participation, revealing a feedback loop that undermines enforcement. Collectively, these results highlight the need to redesign treaty incentives, lower discount rates, and reduce free-rider gains to enable durable climate cooperation.

Audio Transcript

Hello, my name is Andrew Or. I am a junior economics major at George Mason University, and for my URSP project I will be presenting a game-theoretic analysis of international pollution policy.
My research question is: why do global pollution agreements so often fail, and how can game theory help us design policies that may actually succeed? Although climate change is usually described as an environmental or scientific problem, at the international level it is fundamentally a strategic problem. Every country benefits when global emissions fall, but the costs of reducing those emissions are domestic, immediate, and politically painful.
This misalignment between national incentives and global welfare explains why international cooperation has been so fragile and why so many treaties fail to produce real emissions reductions. Game theory provides a powerful framework for modeling the incentive structure that drives state behavior and for identifying the policies needed to shift the equilibrium toward cooperation. Pollution mitigation is a global public good.
It is non-excludable and non-rival; because benefits are shared globally while costs are paid individually, states face a strong incentive to free ride. This is the core collective action failure in global climate politics.
Climate negotiations resemble a repeated prisoner’s dilemma. If both countries cooperate, the world achieves the best outcome. However, each country gains by defecting while the other cooperates, so mutual defection becomes a stable equilibrium.
This is why voluntary agreements without enforcement consistently underperform. The payoff matrix illustrates this problem: cooperation requires immediate national costs, while defection offers short-term economic gains, cheaper energy, and competitive advantage.
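The payoff structure just described can be written out as a small two-by-two game. The numbers below are assumptions chosen only to reproduce the ordering in the talk (mutual cooperation is best jointly, defecting against a cooperator pays most individually, mutual defection is stable but bad), not empirical payoffs.

```python
# Payoffs: (row country, column country); illustrative values only.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),  # best joint outcome
    ("cooperate", "defect"):    (0, 4),  # cooperator pays, defector free-rides
    ("defect",    "cooperate"): (4, 0),
    ("defect",    "defect"):    (1, 1),  # stable but inefficient equilibrium
}

def best_response(opponent_action):
    """Row country's best reply to a fixed opponent action."""
    return max(("cooperate", "defect"),
               key=lambda a: payoffs[(a, opponent_action)][0])
```

Since `best_response` picks defect no matter what the other country does, defection is a dominant strategy and (defect, defect) is the Nash equilibrium, even though both countries would prefer (cooperate, cooperate).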
Because the benefits of reduced pollution are long-term and global, individual states rarely find cooperation rational on their own. This structure creates a bias toward defection. The key variable for achieving cooperation is the social discount rate.
In the United States during Trump’s first term, the social discount rate ranged from 3 to 7%, while during the Biden administration it ranged from 2.5 to 5%. According to a 2023 article by BRG, the SDR recommended by economists for environmental decisions is 2 to 3%, because higher rates undervalue future climate stability.
Under the higher Trump-era SDR, the U.S. withdrew from the Paris Agreement, halted Green Climate Fund contributions, and scaled back multilateral climate commitments. This behavior is fully consistent with the cooperation condition: a high SDR lowers PV(B), and defection becomes rational. Under the Biden administration, the U.S. rejoined the Paris Agreement, resumed GCF contributions, and renewed climate diplomacy.
Cooperation became more rational once future benefits were valued appropriately. The two main case studies I chose to observe were the Montreal Protocol and the Paris Agreement. The Montreal Protocol is the clearest example of a successful treaty because it altered the payoff structure.
Severe trade sanctions raised the cost of defection, the Multilateral Fund reduced the cost of compliance for developing states, and strict verification reduced cheating gains, or free-rider problems. Countries including the US, China, and India had strong incentives to comply because the treaty satisfied these conditions. The Paris Agreement, on the other hand, lacked sanctions, binding targets, hard monitoring, and enforcement mechanisms, and major emitters have expanded coal use or failed to meet pledges as a result.
I generated a model to represent this in action, and these are the variables I will use for clarification. The payoff from cooperating, C, is the present value of long-term global climate benefits minus the domestic cost of mitigation. The payoff from defecting, D, is the cheating gains, or free-rider gains, minus the sanctions for defection.
A state only has an incentive to cooperate if C is greater than or equal to D. Based on the information from the US, I created a mock graph to help represent this model. Defection is the predicted equilibrium, not a political accident. Cooperation emerges only when treaties shift incentives: high SDRs mathematically erase long-term climate benefits, U.S. withdrawal and reentry match the model’s predictions, and successful treaties share sanctions, financing, and verification mechanisms.
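The cooperation condition (C ≥ D, equivalently s + αPV(B) ≥ c + g) can be sketched in a few lines of code to show the discount rate effect. This is a toy calculation: the parameter values for sanctions, benefit share, mitigation cost, and free-rider gains are assumptions chosen to illustrate the flip, not calibrated estimates from the project.

```python
def present_value(benefit_per_year, rate, years):
    """Discounted sum of an annual global climate benefit stream."""
    return sum(benefit_per_year / (1 + rate) ** t for t in range(1, years + 1))

def cooperates(s, alpha, benefit, rate, years, c, g):
    """Cooperation condition s + alpha * PV(B) >= c + g: sanctions plus the
    state's discounted share of global benefits must outweigh domestic
    mitigation costs plus the gains from cheating."""
    return s + alpha * present_value(benefit, rate, years) >= c + g

# Same country, same treaty: only the social discount rate changes.
# (All numbers are illustrative assumptions.)
params = dict(s=1.0, alpha=0.04, benefit=10.0, years=50, c=7.0, g=2.0)
low_sdr = cooperates(rate=0.025, **params)   # ~2.5%: cooperation is rational
high_sdr = cooperates(rate=0.07, **params)   # ~7%: defection dominates
```

With these numbers, cutting the rate from 7% to 2.5% roughly doubles PV(B) over a 50-year horizon, which is enough to flip the inequality; this mirrors the withdrawal-and-reentry pattern described above.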
Game theory shows that international pollution policy does not fail because states are irrational; it fails because the strategic architecture rewards defection. To make cooperation rational, we must reshape the payoff matrix. There are three clear solutions: one is to raise sanctions, increasing s; two is to reduce free-rider incentives, lowering g; and three is to reform discount rate policy, lowering r.
Evidently, this research is not finished, as there are many more variables to explore. I would like to express my sincere gratitude to the URSP for providing me with the opportunity to present and further my research, and I am also grateful to my mentor, Professor Moon Joon Kim, for his guidance and support throughout the project.