Explore My Proof of Concept

Explore my proof-of-concept (Browser or Meta Quest):

Bezi Project - "Break Into Reality: AR"

Official Submission to the Bezi Break-Into-Reality Challenge:

Break Into Reality AR Challenge

My Submission: “Victory Park: Augmented Events”

https://www.youtube.com/watch?v=-lX2wzf19pE

Design Process

Screenshot 2024-03-19 144616.png

My design process was inspired by another project I was prototyping in Bezi at the time. It's a mixed reality app with the working title “Paper Plane XR,” designed to guide users in building paper airplanes.

Although that project is still a work in progress, I leveraged its paper-airplane theme as a foundation for this Bezi AR-focused design challenge. My aim was to explore AR's potential for enhancing the experience of discovering events in community parks.

As a proof of concept, I built the experience around a fictitious event at a local park, dubbed “Paper Airplane Month.” Upon arrival at the park, the AR experience informs users about the event and prompts them to interact. Specifically, I envisioned guiding users from the entrance through various points of interest within the park, letting them point at real structures and see AR interactions overlaid on those areas.

IMG_4815.jpeg

IMG_4814.jpeg

Screenshot 2024-03-20 114855.png

Screenshot 2024-03-20 115508.png

Screenshot 2024-03-20 115713.png

For this challenge and prototype, I focused on the first, introductory area of the experience: the “Welcome” area.

I used Figma to develop the 2D UI elements, including all navigation icons; Luma AI for Gaussian Splat capture; and Blender for 3D assets, mesh optimization, image rendering, and layout. Finally, Bezi tied everything together into a single AR experience anchored in the real world.

The most significant challenge was working around the absence of an image-targeting system for occluding and placing AR objects. Instead, I aligned the augmented elements with the real world in Bezi using a Gaussian Splat capture of the area. I also let users view a low-poly, stylized rendition of the Gaussian Splat inside the app, reinforcing the app's low-poly design style.

Screenshot 2024-03-20 120824.png

Screenshot 2024-03-20 122037.png

IMG_4938.jpg

IMG_4937.jpg

IMG_4936.jpg

https://lumalabs.ai/embed/8a48f7d2-a3bd-4681-9879-6a44a345260d?mode=sparkles&background=%23ffffff&color=%23000000&showTitle=true&loadBg=true&logoPosition=bottom-left&infoPosition=bottom-right&cinematicVideo=undefined&showMenu=false

I used generic icons to assist with navigation. These elements were parented to the 'head' layer of the Bezi Body Rig so they stayed fixed at the top of the user's view as the main static navigation screen. Since AR was the primary focus, the navigation needed to integrate seamlessly with the touchscreen experience without being distracting. Each navigation button was animated with the Bezi State Engine, and an animated paper airplane guided the user to the next area. Elements such as the arrows on the ground and the interactive signs that appeared as you walked further along the designated path were triggered by colliders in Bezi.

Conclusions: