Wreathies: Agency in Play

A game built to measure how choice shapes learning.

Role

Product Designer & Developer

Timeline

June - Sept 2025

Organization

Davis Media Lab

Tools

Procreate, Unity, Visual Studio Code

solution preview

📍 Currently in testing at the Museum of Science and Curiosity in Sacramento.

Start screen.

Gameplay screen.

Pilot data from ~25 participants indicates a meaningful autonomy gain for the High-Choice version (+0.8 on a 1–5 scale).

context

agency in games

Many digital learning games limit player agency, but offering choice can increase autonomy and support learning. This project, led by the Davis Media Lab, explores how agency and autonomy influence children’s gameplay experiences.

The goal

Create a game with varying levels of agency and measure players’ perceived autonomy.

my role

As the sole product designer and developer, I designed the gameplay and visuals, built the Unity prototypes, conducted testing with the research team, and iterated based on feedback.

secondary research

understanding the Landscape

To scope the space, I surveyed U.S. education game research, spoke with a Common Sense Media director, and played a range of learning games. Through my research, I discovered three recurring core principles for a successful game.

Research-Backed Design Principles

Guided by these principles, we started brainstorming concepts for an interactive experience that targeted one measurable skill.

ideation

To capture themes of growth and creativity, we explored nature-focused ideas such as gardening and flower cultivation. We ultimately chose a flower shop concept because it clearly mapped to our three principles:

Progressive Challenge:

Orders grow in difficulty.

Meaningful Agency:

Players arrange flowers.

Goals & Feedback:

Clear requests with validation.

initial Flower shop concept

initial Flower shop concept
player objective

To make wreaths matching customer orders.

potential skills measured

Counting flower colors, pattern recognition, and memorization.

Early paper sketch of the concept.

Now it was time to see if our concept landed, so we headed to a preschool for playtesting.

Testing Round 1 at a preschool

concept validation

At a local preschool, we tested a whiteboard-and-magnets prototype with seven children and asked what they liked, wished for, and wondered about the game.

✅ session goal Completed

Validate that the wreath concept is intuitive and keeps children engaged.

Our testing setup.

participant quote

“I like the colors of the flowers and putting them together.”

takeaways

satisfying clicks

The magnetic click of the flowers drew kids in.

Pride in Making

Children were proud of their work.

needed More instruction

A few stalled without clearer steps.

Knowing the core concept worked, it was time to define the game parameters.

Zeroing In on the details

flower fraction game

We chose fractions as the core concept since the wreath functions as an intuitive whole. The setup allows difficulty to scale cleanly across rounds and gives us flexibility in how we structure versions.

Early digital game rendition.

With the concept and measurable skill locked in, it was time to build the game in Unity.

Development

first time working in unity

Coding the game revealed several challenges that shaped the final design and taught me how to work within tight constraints.

three Challenges & How I Solved Them

Collider Bugs

I replaced segment-based hit areas with one larger wreath hit zone to reduce missed drops and improve usability.
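Outside Unity, the fix can be sketched as plain geometry: instead of testing a drop against many small per-segment hit areas (where gaps between segments swallow near-miss drops), test it against one ring covering the whole wreath. A minimal Python sketch; the circular geometry and all names here are illustrative assumptions, not the project’s actual code:

```python
import math

def in_annulus(x, y, cx, cy, r_inner, r_outer):
    """New approach: one wreath-wide hit zone. A drop counts if the
    point lands anywhere in the ring between the two radii."""
    d = math.hypot(x - cx, y - cy)
    return r_inner <= d <= r_outer

def in_any_segment(x, y, segments):
    """Old approach: many small rectangular hit areas, one per flower
    slot. Gaps between rectangles cause missed drops."""
    return any(sx <= x <= sx + w and sy <= y <= sy + h
               for sx, sy, w, h in segments)

# A drop that lands between two segment rectangles misses under the
# old scheme but still registers inside the wreath-wide ring.
segments = [(0, 40, 18, 18), (22, 40, 18, 18)]   # note the 4-unit gap
drop = (20, 49)                                   # lands in the gap
print(in_any_segment(*drop, segments))            # False: missed drop
print(in_annulus(*drop, cx=20, cy=20, r_inner=20, r_outer=35))  # True
```

The same trade-off applies in Unity: one forgiving collider over the whole target is harder to miss than a patchwork of small ones.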

script reference errors

I cleaned up prefabs and inspector references to reduce runtime errors and stabilize gameplay.

Scoping to Meet Time & Skill Constraints

I cut nonessential features and focused on a simple core loop so the experience would ship on time.

Exploring Agency

What Happens When Kids Choose?

To isolate the effect of agency, we created three game versions that differed only in the end-of-round step: choosing the next client and/or the bow.

three versions of the same game

most agency

Choose client and bow.

Medium agency

Choose bow only.

least agency

Choose neither client nor bow.

Bow selection screen present in two versions.

Now that we had a plan for testing agency, our next step was to see if kids were engaged.

Testing Round 2 at the museum

Are Players Engaged?

We tested the game with three children at the museum hosting our study, gathering feedback on pacing and enjoyment that guided our next edits.

We tested the game with three children at the museum hosting our study, gathering feedback on pacing and enjoyment that guided our next edits.

key changes

added hobby objects

An object now sits next to each customer, making client selection a more deliberate choice.

manual onboarding

Children go at their own pace with arrow navigation.

added reference pictures

Pictures of completed wreaths appear as a guide for the first five rounds.

participant quote

“The intro is too slow, I want to skip and start playing.”

Characters with hobby objects.

After this test confirmed kids were engaged, I turned my focus to how the game felt to play.

Testing Round 3 at the lab

polishing the Experience

Back at the lab, four older children played the game and answered targeted prompts (e.g., “What sound would make this step feel more satisfying?”) to collect specific UX feedback.

key changes

last two levels scratched to reduce fatigue

Because children were getting tired and the final two levels weren’t truly adding learning value, I removed them.

progress bar added

To help children see their progress, I added a progress bar showing how many levels were completed and how many remained.

more sound effects

To keep the experience engaging, I added small sound cues so every action felt noticed and the momentum continued.

final design

First round gameplay.

Intro sequence.

Early Results

Preliminary Data

The researchers logged autonomy scores from ~25 kids on a 1–5 scale (1 = game is in charge, 5 = I’m in charge). Kids who played the High-Choice (HC) version felt more autonomous than kids who played the No-Choice (NC) version across all three questions:

Most in charge: HC 3.83 vs. NC 3.00
Leader in the game: HC 3.42 vs. NC 2.64
Got to make choices: HC 4.08 vs. NC 3.17

Overall, the High-Choice version boosts perceived autonomy by ~0.84 points on average (on a 5-point scale). (Early read only; demographics and full data entry still in progress.)
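The ~0.84 figure follows directly from the three per-question gaps above; a quick arithmetic check (scores copied from the list, question labels shortened):

```python
# Mean autonomy scores per survey question, High-Choice vs. No-Choice.
hc = {"most in charge": 3.83, "leader in the game": 3.42, "got to make choices": 4.08}
nc = {"most in charge": 3.00, "leader in the game": 2.64, "got to make choices": 3.17}

gains = {q: round(hc[q] - nc[q], 2) for q in hc}
avg_gain = round(sum(gains.values()) / len(gains), 2)
print(gains)      # per-question gaps: 0.83, 0.78, 0.91
print(avg_gain)   # → 0.84
```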

looking back

Lessons Learned

Balancing Everyone's Needs

This project taught me how to balance competing needs. To succeed, the game had to produce meaningful data, keep kids eager to play, and stay scoped realistically for me as a beginner coder.

My strength lies in design

Although I am grateful for the experience of building the game from scratch, it confirmed that I do my best work in user experience design, and I plan to keep my focus there. Building the game deepened my respect for developers and made me a clearer communicator.

Playing At the Museum!

What the Kids said

One child exclaimed “He looks like me!” about one of the characters, which is exactly why I made the cast diverse: so kids could see themselves in the game. Since the kids were participating in a research study, we called them “junior scientists,” and one child asked if his stuffed animal could also be a junior scientist!

up next…


Girl Scouts Site Redesign