Interaction Arts & Engineering
Prototyping a visionOS app that turns people into expert DIY builders
I took some time to get back into Apple and Swift to prototype a concept that uses the current best-in-class consumer spatial computing platform to deliver some simple, everyday value for a wide range of users. Building for Apple Vision Pro and visionOS was frustrating at the start, but it ended up being the most satisfying design and programming experience I've had so far.
Concept
Core Features
Process
Results & Next Steps

The Concept: A "Guided Work" Spatial Computing App For The Average DIY Assembler

Assembling furniture, appliances, structures, and other DIY projects from paper instructions and even demo videos is often a less-than-ideal process that is riddled with errors and frustration. Many of these problems could be solved by simply having a good way to simulate the actions you need to take in each step, allowing you to see at life-sized scale how all of the parts go together.

Another way to put it: you are far less likely to have a "the door I screwed onto that new Ikea storage cabinet is backwards and upside-down" situation after you've assembled one or two of the same cabinets. Or even after just watching someone else complete the steps of the assembly.

This app uses the powerful mixed reality capabilities of the Apple Vision Pro to give people true-to-scale simulations of each step of their DIY project, helping them clearly understand how parts are meant to fit together before taking action and avoiding costly mistakes and damage to materials.

From something as simple as a birdhouse to sophisticated high-tech machinery, the animated digital replicas and the reference materials the app places in your surroundings help you get things right the first time.

A Quick Note:
It's difficult to get high quality screen recordings from the Vision Pro for these kinds of presentations. The view is a bit shaky (it's basically a camera strapped to my head), and the blurred portions of augmented content outside of my eye gaze (due to foveated rendering, which is essential for optimizing graphics performance) are visible in video output.
Flow: Assembly Window UI Interactions, Detach Scene, Place On Floor, And Watch Demo Animation
A similar flow, but with an example of a user matching their parts to the demo scene's arrangement
More Examples of Comparing and Matching Parts

The Core Features

Animated 3D Scenes To Teach And Help You Get Things Right The First Time
Each step in a project displays an animated 3D scene to help assemblers clearly understand how to put pieces together correctly. Tap the detach button to get a true-to-life scaled version of the scene that you can place next to your work to match, compare, and guide your assembly.
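A minimal SwiftUI sketch of how this detach interaction might be wired up on visionOS. The identifiers here (`StepScene`, `AssemblyDemoSpace`) are placeholders for illustration, not the app's actual names:

```swift
import SwiftUI
import RealityKit

// Sketch of the "detach" interaction: the step's animated scene is
// previewed in the assembly window, and tapping Detach opens an
// immersive space where the same scene can be placed at true scale.
struct StepSceneView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        VStack {
            // In-window preview of the step's 3D scene.
            Model3D(named: "StepScene") { model in
                model.resizable().scaledToFit()
            } placeholder: {
                ProgressView()
            }

            Button("Detach") {
                Task {
                    // Hand off to an immersive space declared elsewhere in
                    // the App body as ImmersiveSpace(id: "AssemblyDemoSpace"),
                    // where the scene renders at life size.
                    await openImmersiveSpace(id: "AssemblyDemoSpace")
                }
            }
        }
    }
}
```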
The Guided Work User Interface
The UI for navigating project assembly helps you keep track of your progress with a step completion system. It also displays a variety of media (text, images, video, and audio) that users can place next to their workspace for easy access and guidance whenever they need to dig deeper into a step. No need to get up and search for misplaced paper instructions or unlock a tablet or phone.
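Behind a UI like this sits a simple progress model. A minimal sketch in plain Swift, with illustrative names rather than the app's actual types:

```swift
import Foundation

// One step in a project, as tracked by the step completion system.
struct AssemblyStep {
    let title: String
    var isComplete: Bool = false
}

// Holds the ordered steps and the user's position in them.
struct AssemblyProject {
    var steps: [AssemblyStep]
    var currentIndex: Int = 0

    // Mark the current step done and advance to the next one.
    mutating func completeCurrentStep() {
        guard currentIndex < steps.count else { return }
        steps[currentIndex].isComplete = true
        if currentIndex + 1 < steps.count { currentIndex += 1 }
    }

    // Fraction of steps finished, driving a progress indicator.
    var progress: Double {
        guard !steps.isEmpty else { return 0 }
        let done = steps.filter(\.isComplete).count
        return Double(done) / Double(steps.count)
    }
}
```

In the app, a model like this would be observed by the SwiftUI window so the step list and completion indicators update as the user works.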
Annotations

The Process

Like most spatial computing projects, the design-to-prototype process for this concept involved producing many parts, small and large. On top of that, I set out to simulate how the content creation and ingestion pipeline might work for this product, which would require many companies, small and large, to produce a significant number of 3D assets that comply with specifications for display and playback in the app.

The main takeaways I got from the 3D pipeline were:

  • Creators can use just about any 3D modeling app that can export USDZ to build their models, create materials, and sequence simple animations. Apple makes it easy to interface with that format. I used Blender for this project.
  • Reality Composer Pro is a good second stop in the pipeline, where geometry, transforms, and materials for the entire scene can be adjusted if needed.
  • The pipeline's final stop, the visionOS build, has plenty of size and positioning constraints and quirks that need to be considered in points one and two above.

The last stop in the process, user testing, is also more difficult than in the typical 2D app design process. Spatial computing headsets like the Vision Pro mean you will probably need in-person sessions with your participants, and you have to hope they can fit comfortably into the headset. The biggest obstacle for me in this project was the need to purchase multiple Bekvam kitchen carts from Ikea for realistic testing. So user testing will commence after I find a much smaller and cheaper product for testers to assemble.
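For reference, a rough sketch of what the pipeline's visionOS end might look like: loading a creator-exported USDZ in RealityKit and playing its baked animation. The asset name and the scale adjustment are illustrative assumptions, not the app's actual values:

```swift
import RealityKit

// Load a step's USDZ scene (modeled in Blender, adjusted in
// Reality Composer Pro) and start its demo animation.
// "Step1_Scene" is a placeholder asset name.
func loadStepScene() async throws -> Entity {
    let scene = try await Entity(named: "Step1_Scene")

    // Counter one of the sizing quirks mentioned above: normalize the
    // authored scale so the scene reads true-to-life in the user's space.
    scene.scale = SIMD3<Float>(repeating: 1.0)

    // Play the first animation baked into the USDZ, looping it
    // so the assembler can watch the step as many times as needed.
    if let animation = scene.availableAnimations.first {
        scene.playAnimation(animation.repeat())
    }
    return scene
}
```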
Concept Development And Design
Prototyping
3D Pipeline

Results And Next Steps

This project was mostly about putting together a solid and somewhat comprehensive visionOS prototype, and going by that metric, it was definitely a success. And I got much more familiar with visionOS and SwiftUI along the way.

The next step for this one is to come up with a plan for testing this prototype, which probably means finding another DIY project to create content for and setting research/test objectives.
In my opinion, spatial computing technology is still in the early phases of a long period of development before it reaches a consistent level of mainstream adoption. The potential is there and easy to generalize, but there are some significant hardware challenges to solve at this moment. So when I test product concepts in this category, the goal is usually to learn the specific reasons people might be attracted to this technology in the future, rather than to figure out whether they want to go out and buy it today.