Author(s): Alexia De Costa
Mentor(s): Eileen Roesler, Department of Psychology
Abstract
This project presents the Bioinspired Automated Robotic Cat (BARC), a functional companion robot designed to support research in human–robot interaction and privacy-aware design. BARC features camera-based facial detection, expressive gaze behaviors, audio responses, and various soft and rigid materials to mimic a household cat. Because camera systems can enhance interaction while raising privacy concerns, the ongoing study compares people's responses under two conditions: a clear, high-quality camera filter and a blurred, low-clarity camera filter. Using surveys and observation of touch behavior, the study examines how camera clarity shapes engagement and perceived privacy, informing the design of social robots that are effective while respecting user comfort.
Audio Transcript
Have you ever wondered what a robot actually sees when it looks at you?
Today, social and service robots are becoming increasingly common, and many rely on cameras for facial recognition and user engagement. But as useful as cameras are, they also raise important questions: Do they make people feel watched? Can a robot feel friendly while still respecting privacy?
These questions lie at a key tension in human–robot interaction: robots need perception to understand us, yet high-resolution sensing can make people uncomfortable. So I wanted to explore a central challenge: can we reduce privacy concerns without making interactions less enjoyable? And does being transparent about what a robot sees change how people feel?
To investigate this, I designed and built a robot cat from scratch called BARC, the Bioinspired Automated Robotic Cat. BARC is part engineering platform and part research tool. It can switch between two controlled camera conditions: a clear, high-quality camera filter and a blurred, low-clarity filter that still allows for partial facial detection. These interchangeable physical filters let me directly compare how different levels of sensing clarity influence interaction.
BARC is also designed to feel expressive and lifelike. It uses camera-based facial detection for gaze behavior, animated OLED eyes, a speaker for cat-like sounds, and soft and rigid materials that mimic the look and feel of a household cat. Through surveys and observations of touch behavior, my ongoing study explores how these two camera conditions shape user engagement and perceived privacy.
To create BARC, I began with feline anatomical references, studying limb placement, joint spacing, and overall proportions, to inspire the CAD model for the chassis. I laser-cut the acrylic components and assembled them using screws and tab-and-slot joints for a sturdy, lightweight frame.
At the heart of the robot is a Raspberry Pi 4, which handles perception and behavioral control.
A camera provides the main sensory input for facial detection.
Two OLED displays animate expressive eyes that track the user once a face is detected, giving the illusion of attention and social presence.
A speaker and amplifier generate a range of cat sounds, from meows to purrs to alarmed yowls.
An accelerometer-gyroscope detects movement, such as being picked up or shaken, so BARC can respond appropriately.
Servos, controlled by a PCA9685 driver, animate the limbs, jaw, head, and tail.
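The camera-to-gaze pipeline can be sketched in a few lines. This is a simplified illustration, not BARC's actual code: it assumes OpenCV-style face bounding boxes (x, y, w, h) and a 640×480 frame, and the angle range is an arbitrary example value.

```python
# Sketch: map a detected face's position to gaze angles for the eyes.
# Assumes OpenCV-style bounding boxes; values are illustrative.

def face_to_gaze(face, frame_w=640, frame_h=480, max_angle=30.0):
    """Return (pan_deg, tilt_deg) in [-max_angle, max_angle].

    (0, 0) means the face is centered in the frame, so the eyes
    look straight ahead; positive pan looks right, positive tilt down.
    """
    x, y, w, h = face
    cx = x + w / 2.0                      # face center, pixels
    cy = y + h / 2.0
    nx = (cx - frame_w / 2.0) / (frame_w / 2.0)   # normalize to [-1, 1]
    ny = (cy - frame_h / 2.0) / (frame_h / 2.0)
    return nx * max_angle, ny * max_angle

# A face centered in the frame yields a neutral gaze:
pan, tilt = face_to_gaze((280, 200, 80, 80))  # → (0.0, 0.0)
```

In the real robot, these angles would drive the OLED eye animation (and head servos) each frame, producing the face-tracking "attention" behavior described above.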
All behaviors are programmed in Python and organized in a state machine with modes such as Idle, Seeking Attention, Interacting, and Startled. BARC transitions between these states based on sensory input and probability, helping interactions feel natural rather than scripted.
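A minimal sketch of that probabilistic state machine follows. The state names come from the transcript, but the transition probabilities and the `shaken` input are illustrative assumptions, not BARC's actual tuning.

```python
# Sketch of BARC's behavior state machine. State names are from the
# project description; transition weights here are made-up examples.
import random

TRANSITIONS = {
    "Idle":              [("Seeking Attention", 0.6), ("Idle", 0.4)],
    "Seeking Attention": [("Interacting", 0.7), ("Idle", 0.3)],
    "Interacting":       [("Interacting", 0.5), ("Idle", 0.5)],
    "Startled":          [("Idle", 1.0)],   # always calms back down
}

def next_state(state, shaken=False, rng=random):
    """Pick the next behavior state from sensory input and chance."""
    if shaken:  # accelerometer-gyroscope event overrides the dice roll
        return "Startled"
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

# Being picked up or shaken startles BARC from any state:
print(next_state("Interacting", shaken=True))  # → Startled
```

Because transitions are weighted rather than fixed, two identical interactions can unfold differently, which is what keeps the behavior from feeling scripted.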
To examine how camera clarity influences engagement and privacy perceptions, BARC serves as a fully capable research platform. Seventy-two participants are currently part of a single-blind study with two groups:
Group 1: interacts with BARC using a clear camera filter
Group 2: interacts with BARC using a blurred, privacy-preserving filter
The physical filter is noticeable, so using filters in both groups keeps the robot visually consistent. That way, any differences we see are truly due to what the robot can or can’t perceive.
Participants interact with BARC, complete a survey measuring constructs such as Perceived Sociability and Perceived Enjoyment, and then are shown a live camera feed so they can see the actual resolution of the robot’s vision. Afterward, they complete a second survey measuring perceived privacy, perceived surveillance, disturbance, and attitudes about robots.
The hypotheses are:
1: The two groups will show no difference in sociability, enjoyment, or touch behavior.
2: The filtered-camera group will report higher perceived privacy.
3: The clear-camera group will report higher perceived surveillance.
This interdisciplinary project connects mechanical engineering, psychology, and human-robot interaction to better understand how people perceive robotic sensing. BARC’s expressiveness, biological inspiration, and controlled camera conditions make it a powerful research platform.
By comparing clear versus filtered camera views, this research explores whether privacy concerns come from what the robot actually sees, or from what users believe it sees. Ultimately, the goal is to guide the design of future social robots that remain engaging while respecting users' privacy.
Special thanks to Dr. Eileen Roesler (Psychology) and Dr. Daigo Shishika (Mechanical Engineering) for their invaluable mentorship. Thank you to Katya Schafer for assistance with data collection, and to Dr. Karen Lee and OSCAR for their support and funding, which made this project possible.
Thank you!