Mid-Fi Prototyping for Augmented Reality & Virtual Reality

by Maximilian Speicher, December 24th, 2022
Too Long; Didn't Read

Prototyping in 3D isn’t the same as in just two dimensions. Compared to regular, flat websites, augmented and virtual reality have very different requirements when it comes to creating prototypes. To close a gap in the medium-fidelity range of AR/VR prototyping methods, we’ve created 360theater, which makes use of dioramas and 360° content for creating physical-digital prototypes. We’ve used our new method in workshops with design students and found that it comes close to the final experience while complementing other methods with different levels of fidelity.

This article describes research that has been conducted in collaboration with Katy Hagen and Michael Nebeling. The research paper was published at the 2021 ACM Conference on Engineering Interactive Computing Systems (EICS).


When I reflect on my time at the Michigan Information Interaction Lab, where I worked with Prof. Michael Nebeling, a lot of great projects (and more importantly: great people) come to mind. From GestureWiz to What is Mixed Reality? (internal codename: WTF-MR) to MRAT—we did a lot of very exciting and innovative research. However, if I had to choose my one favorite project from that time, a different one stands out. It’s none of the CHI papers and none of those that have been cited the most. Instead, it’s the one that took the longest to be published. Michael presented it at the 2021 EICS conference: 360theater.

What is 360theater?

First and foremost, 360theater is a new medium-fidelity method for rapid prototyping of augmented and virtual reality experiences. It’s medium-fidelity since, on the spectrum of prototyping methods suitable for AR/VR, it sits somewhere in the middle between plain pen & paper and programming everything directly in, e.g., Unity. In that middle range, there is a considerable gap compared to the number of lower- and higher-fidelity approaches. Yet pen & paper are pretty limited when it comes to prototyping 3D experiences, and Unity programming can constitute a significant barrier for many non-technical designers. That’s why we came up with 360theater—as a means for prototyping in 3D with an outcome that is as close as possible to the final experience, but without the need for programming skills.


However, 360theater is also the name of an app that implements the eponymous prototyping method. Its source code is available here.

Why “theater”?

The first core concept of 360theater is diorama prototyping. That is, instead of using plain 2D paper—as one would do for regular, flat interfaces like websites—we rely on cardboard boxes, Play-Doh, cardboard sticks to attach objects to, and so on. You can also grab some Lego bricks if you like. This gives you the chance to prototype the spatial relationships and interactions of an augmented or virtual reality scene directly in 3D instead of having to draw complex 2D scenographies. Now, diorama prototyping is by no means new, or anything we came up with, but common practice in architecture, film, and—you guessed it—theater.

Why “360”?

To complement the “theater” and form our new prototyping method, the second core concept of 360theater is digital 360° content—be it a video, live stream, or photo, as you would capture with, e.g., a 360° camera such as the Ricoh Theta. This allows us to transform the physical diorama into a digital representation that can be viewed on a smartphone or VR headset. Our accompanying 360theater app then allows you to add virtual objects and a Wizard of Oz to manipulate those. Because we give the otherwise flat 360° content (essentially just a large photo mapped to the inside of a sphere) a 3D geometry, the additional virtual objects can be placed at precise 3D locations in the captured diorama. Overall, this gives you a part physical, part digital prototype that is fully interactive and can be tested with users.
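To illustrate the idea of giving the flat 360° content a 3D geometry, here is a minimal sketch of the underlying math (not the 360theater app’s actual code): any viewing direction in the captured diorama maps to a pixel in the equirectangular 360° image, which is what lets virtual objects be anchored at precise spots in the capture. The coordinate convention (y up, −z forward) and image size are assumptions for the example.

```python
import math

def direction_to_equirect(x, y, z, width, height):
    """Map a 3D viewing direction (from the 360-degree camera at the
    origin) to (u, v) pixel coordinates in an equirectangular image.
    Convention assumed here: y points up, -z is 'straight ahead'."""
    # Longitude: angle around the vertical axis, in [-pi, pi]
    lon = math.atan2(x, -z)
    # Latitude: elevation above the horizon, in [-pi/2, pi/2]
    lat = math.asin(y / math.sqrt(x * x + y * y + z * z))
    # Equirectangular mapping: longitude -> u, latitude -> v
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Looking straight ahead lands in the image center;
# looking straight up lands on the top edge.
print(direction_to_equirect(0, 0, -1, 3840, 1920))  # center
print(direction_to_equirect(0, 1, 0, 3840, 1920))   # top edge
```

A 360° viewer effectively inverts this mapping when it textures the image onto the inside of a sphere, which is why a virtual object given a fixed direction stays glued to the same spot in the diorama.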


The Wizard of Oz manipulates objects that the AR user observes in real-time. Taken from Speicher, Lewis, & Nebeling (2021).


To go with the example we also use in our paper, imagine you want to prototype a scene from Pokémon GO. You create a diorama with grass, a little river, and a Pikachu made from Play-Doh. Next, you place a 360° camera in the middle of the diorama and start a live stream captured by the 360theater app. Because you couldn’t manage to create a Pokéball from Play-Doh, you just add it as a virtual 3D model. You give a VR headset showing the scene to a test user, and if they make the right gesture, your Wizard of Oz flings the virtual Pokéball towards the physical Pikachu, which is then caught and removed from the diorama.
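Conceptually, the Wizard of Oz setup boils down to relaying object transforms from the wizard’s device to every viewer in real time. The sketch below shows one way such a relay could look; the message format, field names, and helper functions are hypothetical, not the 360theater app’s actual protocol.

```python
import json

def make_update(object_id, position, visible=True):
    """Hypothetical message the wizard broadcasts when they move,
    show, or hide a virtual object in the scene."""
    return json.dumps({
        "type": "transform",
        "id": object_id,
        "position": position,  # [x, y, z] in diorama coordinates
        "visible": visible,
    })

def apply_update(scene, message):
    """Viewer side: apply an incoming update to the local scene,
    represented here as a plain dict of object states."""
    msg = json.loads(message)
    if msg["type"] == "transform":
        obj = scene.setdefault(msg["id"], {})
        obj["position"] = msg["position"]
        obj["visible"] = msg["visible"]
    return scene

# The Pokéball flies towards Pikachu, then Pikachu is "caught":
scene = {}
apply_update(scene, make_update("pokeball", [0.2, 0.1, -0.5]))
apply_update(scene, make_update("pokeball", [0.0, 0.05, -1.0]))
apply_update(scene, make_update("pikachu", [0.0, 0.0, -1.0],
                                visible=False))
```

Because the test user only sees the rendered 360° scene, the wizard’s manual updates are indistinguishable from programmed behavior—which is exactly the point of a Wizard of Oz prototype.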


Another variation of this could be to place the 360° camera in an actual room, place a virtual Pikachu and virtual Pokéball in the real environment via the 360theater app, and have the test user experience the scene as augmented reality through a smartphone.


Example of 360theater setups for AR and VR. Taken from Speicher, Lewis, & Nebeling (2021).


How did we do it?

We—that’s Katy, Michael, and I—started by holding design jams focused on paper and diorama prototyping with students from the University of Michigan School of Information. In those workshops, we observed how the students would go about prototyping AR or VR scenes using the given materials and elicited their requirements as designers. One was the need for physical-digital integration, in order to be able to show the prototyped experience on an actual end-device. Subsequently, we iteratively designed and implemented the 360theater app and held more design jams in which we investigated our method and also drew comparisons to paper prototyping, diorama prototyping, and another new approach Katy and Michael had been working on: 360proto.


Three findings from these workshops were:

  1. Prototypes made with 360theater were indeed considered medium-fidelity by the participants.
  2. Prototypes made with 360theater came closest to the final product, compared to the other investigated methods.
  3. 360theater does close a gap in the landscape of prototyping tools and complements other methods.

Why is this one my favorite?

That’s a very good question, and it’s not so easy to answer. All of the research projects we worked on in Michigan were exciting, and also very diverse, so it’s hard to compare. However, 360theater was somehow even more fun than the others. Thinking back, that’s most probably because it was the project where we worked most closely and directly with students. Rather than designing and implementing some system in silence and then having one distinct user study, we held the series of design jams mentioned above, where we simply worked with students to prototype AR/VR experiences and discussed the pros and cons of different methods. And that was just a hell of a lot of fun. On top, it is really cool to invent a whole new prototyping method, isn’t it‽

What else is out there?

Now, of course, 360theater is far from being the answer to everything or the only existing prototyping method for AR/VR. All methods have their pros and cons and make more or less sense at different stages of a design process. As our students discovered during the design jams, it’s always best to mix ’n’ match methods and transition between them as you progress with your design. Therefore, we’ve included an extensive assessment of the status quo and categorization of different prototyping methods in our paper, based on two dimensions—the level of fidelity as well as what kind of output is produced, from physical (e.g., paper prototyping) to physical-digital (e.g., 360theater) to purely digital (e.g., Unity).


☕ I love coffee, and if you want to support my work, you can always buy me one, or subscribe to my newsletter. 🗞️

Calls to Action

  1. Read our paper, and especially the section on related work (e.g., via ResearchGate).
  2. Check out the accompanying video here.

Acknowledgments

Thanks to Katy and Michael for working with me on this project and having the endurance to write and re-write (and re-write) the paper until we finally got it published. 🙂




Copyright © 2022 by Maximilian Speicher • Originally published by Geek Culture