This was the third time I got to lead this CallisonRTKL project, which included securing a budget, making purchases, managing internal and external communications, and coordinating a multidisciplinary team of designers, engineers, construction partners, and festival organizers. I also provided creative direction, guiding the team to prioritize user needs, and took charge of prototyping and developing new forms of immersive interaction using motion capture technology with Unity 3D. This project helped me grow as a leader and expanded my experience in charting fun territories for interaction design.
Aparna Pillai, David Chamness, Diane Moore, Eduardo Silva, Espe Zivkovic, Garrett Lumens, Grace Nemeckay, KJ Burkland, Lucy Baraquio, Mark Hower, Nomi Cooper, Patrick Winston, Steven DeGiorge, Todd Lawson.
I introduced the team to the user-centered design process. This included paying close attention to our users' needs and engaging stakeholders and other third parties in participatory design, which secured buy-in and a healthy feedback loop.
Observation
User Survey
Secondary Research
User Personas
Design Principles
Ideation Workshop
Spatial Design
Interaction Design
Visual Design
Structural Design
Prototyping
Heuristic Evaluation
Usability Study
Agile
Unity 3D Dev.
Construction
Detailing
Recycling
More than six years of data from participating in the festival provided a great starting point for our research focus.
Aimed to identify factors that contribute to an enjoyable and memorable experience for both parents and their children.
Literature review on collaborative, co-located interactions; immersive experiences; and computer vision and body-tracking technologies.
We narrowed our focus down to kids aged 4 to 10 and their accompanying adults. Then, I led an ideation workshop across the office to gather a diverse range of perspectives and ideas.
Explore and interact with the installation on their own.
Enjoy a memorable and exciting experience
Develop new skills or interests through play
Fun and engaging activities
Visually appealing and colorful experiences
Interactive elements that stimulate curiosity
Age-appropriate and safe interactions
Clear and simple instructions
A sense of accomplishment and ownership
Foster shared bonding and connection with kids
Encourage children's learning through interaction
Discover new interests or experiences
Shared experiences with children or other adults
Visually appealing installations
Activities that encourage connection and collaboration
A safe environment for kids
Inclusive and accessible experiences
A balance between adult and child-centric elements
by catering to the curiosity and interests of all age groups.
by designing an open and approachable pavilion.
where interactions feel natural and intuitive, without the need for buttons or traditional interfaces.
encouraging unique outcomes shaped by each user's own choices and actions.
Through a collaborative brainstorming session, we narrowed the concepts down to four possible interaction types for the installation.
Users connect through collaborative dancing, producing sounds depending on their body positions at a given time.
Pulling strings, users can activate different melodies. When working together, they can generate new music.
Each ring in this enclosure provides a different interaction input; together, all elements enable users to connect through sound.
Users "wave" their arm through an opening in the installation, mimicking the interaction of playing a harp and producing music together.
Our spatial design process involved tackling multiple user requirements simultaneously. Dancing is a collaborative effort! We wanted groups in the game, while providing enough privacy for our users to feel comfortable in a more intimate setting.
The 30-foot installation accommodates up to 9 people, ensuring a safe environment for users to interact and enjoy.
Elevated sides provide a sense of comfort and privacy, enabling users to confidently dance and engage with the installation.
Ramps ensure inclusion and ease of access for all users, while sensors cater to both standing and seated positions.
Inspired by the shapes of sound, the façade enhances immersion and serves as an effective sound conductor.
I led the creation of two distinct scenarios: one in which the installation is used through a specific series of steps, and another in which users have the autonomy to make their own choices.
A series of pre-determined steps guide the users to trigger sounds, creating music as they navigate the installation.
Users can trigger any spot of the installation and work collectively to generate harmony with the music.
This was the most exciting part of the project for me: I had the opportunity to own this phase entirely, from researching technology possibilities to prototyping, coding, testing, and ultimately creating the Unity 3D build that drove the installation during the festival.
My first assumption was that players would not require visual affordances to interact in the dancing area. We wanted to embed invisible 3D virtual shapes that users would trigger as they collided with them, allowing users to "explore" the virtual space, build a mental map of the triggers, and create their own music through dancing.
Motion tracking with ZED 2i by StereoLabs.
Virtual objects and collision detection with Unity 3D.
Unity spawns a collider on the user's hand; when it collides with a floating sphere, it generates a particular sound (see the sketch after the findings below).
High cognitive load required to remember the position of virtual triggers in mid-air.
Hard to recognize where a trigger starts or ends based solely on auditory feedback (low sensory definition).
Potential solution: ground-based triggers.
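For readers curious about the mechanics, here is a minimal sketch of how this collision-to-sound logic could be wired up in Unity. It assumes the body-tracking SDK exposes the hand position as a world-space Transform; the class names, the "Hand" tag, and the component layout are illustrative, not the production build.

```csharp
using UnityEngine;

// Sketch: an invisible floating sphere that plays its sound when the
// collider following the user's hand passes through it.
[RequireComponent(typeof(SphereCollider), typeof(AudioSource))]
public class SoundSphere : MonoBehaviour
{
    AudioSource source;

    void Awake()
    {
        GetComponent<SphereCollider>().isTrigger = true; // trigger volume, no visible mesh
        source = GetComponent<AudioSource>();
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand")) // only react to the hand collider
            source.Play();
    }
}

// Sketch: follows the hand joint reported by the tracking SDK each frame.
// A kinematic Rigidbody is needed so trigger events fire against the spheres.
[RequireComponent(typeof(SphereCollider), typeof(Rigidbody))]
public class HandFollower : MonoBehaviour
{
    public Transform trackedHand; // driven by the sensor SDK (assumed wrapper)

    void Awake()
    {
        GetComponent<Rigidbody>().isKinematic = true;
        gameObject.tag = "Hand"; // "Hand" must be defined in the Tag Manager
    }

    void Update()
    {
        transform.position = trackedHand.position;
    }
}
```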
For the next phase, I created a 1:1 scale prototype of the installation platform. The objective was to test the ideal separation between triggers, the final number of people that would fit on it (spatial constraints), and whether music could be created easily.
Increased number of players to 9.
Triggers were placed on the ground for easier identification and lower cognitive load.
Each trigger was given a distinct melody.
Easier to identify edges of ground-based triggers.
However, it was hard to register and remember which trigger activated each sound.
Multiple triggers playing simultaneously led to cacophony; users did not feel they were creating music.
Designing the final interaction system required multiple rounds of fine-tuning. We opted for a combination of three interaction types: Chimes, Loops, and Triggers, each responding to a different user or system need (a code sketch of the loop behavior follows the three types below).
"Scaled" musical notes
in the form of bell chimes.
Inviting journey and simple music-making for young users.
Preserve harmony regardless of the number of active users.
Synchronized musical loops, unmuted when active.
Provide autonomy and moments of surprise.
Individual "fun" sounds,
triggered on-demand.
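As a rough illustration of how the Loops behave, here is a minimal Unity sketch: every loop starts playing muted at the same time so the channels stay in phase, and activating a trigger only toggles the mute flag. Class and field names are hypothetical, not the actual festival build.

```csharp
using UnityEngine;

// Sketch: one synchronized loop channel. All channels start playback
// together and stay muted until a user activates their trigger, so any
// combination of unmuted loops remains in phase and harmonic.
[RequireComponent(typeof(AudioSource))]
public class LoopChannel : MonoBehaviour
{
    AudioSource loopSource;

    void Start()
    {
        loopSource = GetComponent<AudioSource>();
        loopSource.loop = true;
        loopSource.mute = true;   // silent until someone steps on the trigger
        loopSource.Play();        // playback runs continuously in the background
    }

    // Called by the trigger logic when a user enters or leaves the spot.
    public void SetActive(bool userOnTrigger)
    {
        // Unmuting instead of restarting keeps this loop aligned with the others.
        loopSource.mute = !userOnTrigger;
    }
}
```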
I used projection mapping with Touchdesigner and Unity to create a high-fidelity prototype of the installation. We tested the final visual design for our triggers and iterated on the sounds produced, fine-tuning the installation ahead of the festival date.
Projection mapping through Touchdesigner allowed us to test the final installation design.
A board composed of pre-loaded, synchronized musical loops that unmute when activated by users.
Triggers have unique shapes for easy recognition and music-making.
The variety of looping vs. individual sounds proved successful in providing a sense of music-making while avoiding noise.
The different shapes and colors helped users learn the identity of each trigger instead of repeatedly verifying which trigger was which.
Here is a representation of how triggers would get activated when users stepped on them. The installation received more than 11,000 visits, not counting the many returning users, over the span of two days at the festival. It was also featured on the front page of the Seattle Times newspaper.
A brief overview of my experience and strategy as Project Manager in this project.
I created five sub-teams: Project Management, Spatial Design, Interaction Design and Prototyping, Structure Design, and Visual Design. The goal was to ensure people had ownership of the tasks they were working on, creating a horizontal hierarchy. I have found this to be the best way to divide work in cultural initiatives.
Created an extensive Gantt chart with several iterations considering delivery dates, constraints, material purchases, and our design delivery to our construction partners.
Weekly meetings to check progress, provide feedback, and set tasks for the next week.
Project duration: 6 months.
Prioritized findings and tended to our user needs from the get-go.
Applied Agile methodology for rapid iteration and prototyping of the different solutions, polishing our Unity 3D build on the go until it was proven and ready for the festival.
Discovering the challenges behind "walking" in virtual reality?
Having worked with feet tracking for the Festival Installation, we decided to take advantage of our lessons learned and develop a proof of concept for feet tracking in virtual reality.
Enabling leg tracking to activate elements or add new forms of interaction enriches the experience. However, capabilities need to be limited, the environment has to be carefully designed, and additional safety precautions need to be ensured.
The first iteration used an Azure Kinect to detect the player's joint positions. However, we learned that the lag between the player's action, the sensor signal, and the input into the game was too large.
I created a feet tracker using the Oculus Quest controllers as points of reference for the Avatar Controller. This proved to be a very reliable solution: there was no lag in the input, allowing us to test permissible interactions for the user.
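A minimal sketch of that approach, assuming a Quest controller is strapped to each ankle and its tracked Transform is copied onto the avatar's foot bone every frame; the class name, fields, and offsets are illustrative.

```csharp
using UnityEngine;

// Sketch: drives one avatar foot from a controller strapped to the ankle.
// Offsets compensate for where the controller sits relative to the foot.
public class FootTracker : MonoBehaviour
{
    public Transform controllerAnchor;   // tracked controller pose from the XR rig (assumption)
    public Transform footBone;           // avatar foot bone to drive
    public Vector3 positionOffset;       // controller-to-foot offset, tuned by hand
    public Vector3 rotationOffsetEuler;

    void LateUpdate()
    {
        // Applied in LateUpdate so the tracked pose overrides any animated foot placement.
        footBone.position = controllerAnchor.TransformPoint(positionOffset);
        footBone.rotation = controllerAnchor.rotation * Quaternion.Euler(rotationOffsetEuler);
    }
}
```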
Walking and stepping into activators in the ground feels natural and safe.
Some adjustments are needed in the avatar to ensure proper visibility of the mesh.
Jumping is possible, but it is important to land at the same level as the take-off point.
Jumping requires safety measures, such as predicting where the user could jump next and foreseeing any need to make a guardian visible.
Height changes presented a potential hazard of tripping or falling. Users need to know where the actual ground is to calculate the movement of their legs.
Falling from a height onto both legs is permissible, but more studies are needed.
Not included: XR Rig. This model showcases the current strategy to connect the project to any XR input device.