Case Study
The core interaction was designed around a simple real-world scenario: a person walking through a city notices a construction site and wants to know what's being built. If they already have the app, pointing their camera at the QR code on the site triggers the experience immediately. If they don't, scanning the code first routes them to download the app before continuing. Either way, the entry point lives in the physical world.
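The branching described above can be sketched as a small routing function. This is a hypothetical illustration, not the app's actual implementation: the names (`ScanRoute`, `resolveScan`) and the assumption that the QR payload encodes a project ID are mine.

```typescript
// Hypothetical sketch of the QR entry flow: the scanned code carries a
// project ID, and the router either opens the AR experience directly or
// detours through the app store first, keeping the ID so the experience
// can resume after install.
type ScanRoute =
  | { kind: "ar-experience"; projectId: string }
  | { kind: "app-store"; resumeProjectId: string };

function resolveScan(qrPayload: string, appInstalled: boolean): ScanRoute {
  // Assumed payload shape, e.g. "project:riverside-tower".
  const projectId = qrPayload.replace(/^project:/, "");
  return appInstalled
    ? { kind: "ar-experience", projectId }
    : { kind: "app-store", resumeProjectId: projectId }; // continue after install
}
```

Either branch preserves the same project ID, which is what lets the physical QR code serve as a single entry point for both audiences.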
Once the code is recognized, a frame appears around it to confirm the app has picked up information, and a white card surfaces with basic project details such as the name and developer. Tapping the card selects it, turning it purple to indicate activation; the 3D rendering of the future building then appears overlaid on the physical site in real time. At the same time, an info card peeks up from the bottom of the screen, available for interaction without interrupting the AR view. Users who want to go deeper can tap into the rendering to transition to a traditional screen experience where they can swipe, zoom, and explore the model in detail.
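The scan-to-detail sequence above reads naturally as a small state machine. The sketch below is illustrative only; the state and event names are assumptions mapped onto the steps in the text, not the product's actual architecture.

```typescript
// Hypothetical state machine for the scan-to-detail flow.
type ArState = "scanning" | "recognized" | "activated" | "detail";

const transitions: Record<ArState, Partial<Record<string, ArState>>> = {
  scanning: { codeDetected: "recognized" },   // frame + white project card appear
  recognized: { tapCard: "activated" },       // card turns purple, 3D overlay renders
  activated: { tapRendering: "detail" },      // full-screen model: swipe, zoom, explore
  detail: { back: "activated" },              // return to the live AR view
};

function next(state: ArState, event: string): ArState {
  return transitions[state][event] ?? state; // unknown events leave the state unchanged
}
```

Modeling the flow this way makes the "no dead ends" property checkable: every event either advances the experience or leaves it where it was.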
One challenge identified within the sprint was text legibility in variable lighting conditions: white text that is readable in shade can disappear against a bright sky. A full solution would require testing different text colors across lighting scenarios, a task scoped as a next step beyond the two-week timeline.


The app was structured around four bottom navigation tabs: Explore, Scanner, Forum, and Account. Each served a distinct purpose while staying connected through shared project pages. A user could arrive at a project from any direction — scanning a QR code in the physical world, discovering it through the map on Explore, or following a link from a forum post. The entry points were different but the destination was the same.
The navigation was designed outward from the AR experience. Scanner was the core interaction the product was built around, with Explore, Forum, and Account filling in the gaps around it. Explore handled discoverability, giving users who never passed a job site a way to find and browse projects through a map view, alongside new projects and trending community posts. Forum was dedicated entirely to user-generated discussion. Account was scoped but not fully defined within the sprint.
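The "different entry points, same destination" idea above can be sketched as a route resolver: each tab produces its own kind of entry, but all of them resolve to the same project page. The entry shapes and the `/projects/:id` route are hypothetical, chosen only to illustrate the convergence.

```typescript
// Hypothetical resolver: a QR scan, a map pin, and a forum link all land
// on the same project page route.
type Entry =
  | { source: "scanner"; qrProjectId: string }
  | { source: "explore"; mapPinProjectId: string }
  | { source: "forum"; linkedProjectId: string };

function projectRoute(entry: Entry): string {
  const id =
    entry.source === "scanner" ? entry.qrProjectId :
    entry.source === "explore" ? entry.mapPinProjectId :
    entry.linkedProjectId;
  return `/projects/${id}`; // one shared destination, regardless of entry point
}
```

Keeping the destination shared is what lets each tab stay specialized without fragmenting the project content itself.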
One significant limitation worth acknowledging is that the AR visualization itself was designed without full clarity on what the technology could actually deliver. The interaction was designed around an idealized version of the experience, and a real implementation would require close collaboration with developers to understand what was technically feasible before committing to the flow.




