I researched, designed, and prototyped all physical and digital artifacts of this installation, including secondary research, interaction design, sensor and actuator integration, Arduino-based software development, and AR development with the Unity game engine and the Oculus Interaction SDK. This was individual research as part of my master's degree.
HCDE 539: Physical Computing and Prototyping | Spring 2023
Professor: Andrew Davidson
Master of Science in Human-Centered Design and Engineering
University of Washington
Develop an effective 3D interactive gamepad with novel affordances requiring minimal learning.
Focus the solution on fostering psychomotor skills, 3D thinking, and creativity.
Cater to the curiosity and interests of young kids, while ensuring flexibility and re-use.
Develop an interactive, sensor-driven augmented reality prototype as proof-of-concept.
Efficiently facilitate communication between sensors, actuators, microcontrollers, and mixed reality display.
The goal for this project was to deliver a working physical computing prototype in two weeks. The design phase was streamlined and expedited, giving priority to the prototyping process.
Secondary Research
Diagrammatic Circuitry
Hardware Integration
Interaction Prototyping
Interaction Design
3D Design + Modeling
Digital Fabrication
Arduino Development
Unity Development
AR Development
Debugging
Optimization
Documentation
System Diagramming
Schematics
While this project focused on the novelty of introducing tangible affordances to augmented reality for kids, I conducted a literature review to learn about the current challenges surrounding children and their use of digital devices.
● Longer exposure to digital devices is associated with lower language skills and diminished gestural skills in young children.
● Kids spend an average of 6 hours per day on digital devices, equivalent to roughly 40% of their waking time.
● Physical play leads to significantly more learning and enjoyment compared to screen-only interactions.
In summary, digital interactions are becoming an essential element of children's lives, but they seem to diminish the time children can devote to real physical play, suggesting that a medium between physical and digital play could bring some relief to this issue.
I created an initial mockup with a simple concept: providing an empty canvas with sensors and actuators that would come "alive" through augmented reality, allowing the user to touch, feel, and move objects that are both physical and digital in nature.
Flat colored board with fixtures and sensors, using 3D printing and laser cutting.
AR-overlaid game environment: as the user moves some of the outlined sensors, the game responds accordingly.
First, I had to demonstrate that a tangible, sensor-powered board with AR overlays was possible. I prototyped a simple system and tested it with the Meta Quest Pro.
For this prototype, I relied on a Windows PC as an intermediary between the microcontroller and the headset. A Bluetooth- or Wi-Fi-enabled Arduino would have been preferable, but wireless communication was beyond the scope of this work.
I created a proof of concept in which a ribbon potentiometer drove the movement of a small sphere. This relied on serial communication between the Arduino and the computer, with the values interpreted in Unity.
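The Arduino side of this link can be sketched as a simple normalization of the potentiometer's ADC reading, streamed over serial one value per line. The snippet below is a minimal sketch in plain C++ so the mapping is testable outside the Arduino toolchain; the pin assignment (A0) and the line-per-value protocol are assumptions, not the exact implementation.

```cpp
// Normalize a 10-bit ADC reading (0-1023) from the ribbon potentiometer
// to [0, 1], the range Unity scales onto the sphere's travel axis.
float normalizeReading(int raw) {
    if (raw < 0) raw = 0;       // clamp noise below the ADC floor
    if (raw > 1023) raw = 1023; // clamp above the 10-bit ceiling
    return raw / 1023.0f;
}

// In the actual Arduino sketch, loop() would stream this value:
//   int raw = analogRead(A0);               // ribbon pot on A0 (assumed)
//   Serial.println(normalizeReading(raw), 4);
// Unity then reads each line from the serial port and moves the sphere
// by the normalized amount.
```

Sending a single normalized float per line keeps the Unity-side parsing trivial: read a line, `float.Parse`, apply to the sphere's position.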
The next challenge was to achieve accurate tracking of the arcade board by the headset, considering it would change locations between uses.
I 3D printed a casing for the left controller (1) and used it as a positional anchor: when triggered (2), the headset records the controller's current position and overlays the game geometry in the correct place.
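The anchoring step reduces to a coordinate transform: every board-local point is placed relative to the pose recorded at the trigger press. The sketch below illustrates that transform with a yaw-only rotation, which is a simplification of the full pose the headset records; the struct and function names are illustrative, not from the Unity project.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Place a board-local point into world space using the anchor recorded
// when the user pulled the trigger. Yaw-only rotation is an assumption;
// the headset actually records a full orientation.
Vec3 anchorToWorld(Vec3 local, Vec3 anchorPos, float anchorYawRad) {
    float c = std::cos(anchorYawRad);
    float s = std::sin(anchorYawRad);
    // Rotate the local point about the vertical axis, then translate.
    Vec3 r{ c * local.x + s * local.z, local.y, -s * local.x + c * local.z };
    return { r.x + anchorPos.x, r.y + anchorPos.y, r.z + anchorPos.z };
}
```

Because the board can move between uses, re-triggering simply records a new anchor and the same transform re-places all the geometry.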
This video showcases the positioned board in augmented reality. I was able to effectively manipulate, in real time, geometry that responded to the sensors' locations.
For this, I calibrated all components by mapping their outputs, measuring their size and shape, and creating a "digital twin" of the system inside Unity.
After successfully driving AR interactions with the sensors and microcontrollers fixed to their location, I designed, modeled, and printed the components to drive each of the sensors.
Four different types of tactile input: Push, Turn, Hover, and Shake.
These would use potentiometers, buttons, and ultrasonic sensors.
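The Hover input reads distance from the ultrasonic sensor's echo pulse width. Assuming an HC-SR04-style sensor (the specific model is an assumption), the conversion uses the speed of sound and halves the round-trip time:

```cpp
// Convert an ultrasonic echo pulse duration (microseconds) to distance.
// Sound travels about 0.0343 cm/us at room temperature; the echo covers
// the round trip, so divide by two. Sensor model (HC-SR04) is assumed.
float pulseToCm(float pulseUs) {
    return pulseUs * 0.0343f / 2.0f;
}

// On the Arduino, the pulse width would come from pulseIn() on the echo
// pin; a ~583 us echo corresponds to roughly 10 cm of hover height.
```

The resulting distance, like the other sensor values, is normalized and streamed over serial for Unity to interpret.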
I pivoted the design from a farm-specific template to a generic one, with the goal of enabling multiple types of applications: learning, play, or entertainment.