Exploring Collaboration Through Augmented Reality
Note: This piece was co-authored by Tim Bettridge.
One of the truly inspiring things about working at Connected is the opportunity to collaborate with smart, like-minded people from different disciplines. Whether it’s a product designer pairing with an iOS engineer, or a design researcher teaming up with a technical product manager, working alongside individuals with diverse backgrounds means that we’re building better products in a uniquely integrated way.
This cross-functional approach is not only used in our client-facing work but also in our bench research. Bench research occurs in between client engagements where practitioners can explore and dabble in the product spaces they are passionate about. Bench research can be tackled solo, but we’ve found that it’s more exciting when we can explore our interests within cross-disciplinary teams.
A short while back, two Product Designers and three Software Engineers joined forces to work on a bench research project called Channels. Channels is an augmented reality (AR) application for smartphones that explores collaboration in shared city spaces.
Key Technical Terminology
Augmented Reality (AR): AR experiences superimpose virtual objects over a person’s experience of the real world. These objects can be visual, auditory, haptic, or even somatosensory and olfactory.
Spatial Maps: Spatial maps help devices to understand their physical environment so that they can realistically place virtual objects in that environment.
Localization: Localization technology lets a device ground its position in an augmented reality scene by generating and comparing spatial maps, combining camera video with GPS sensor data.
Persistence: Persistence refers to when virtual objects can remain in the same physical location in which they were previously positioned — so that they can be revisited, viewed, and interacted with after closing and re-opening an AR application.
Cloud Anchors: Cloud Anchors are anchors hosted in the cloud, allowing augmented experiences to persist in the physical world and be shared by multiple users across space and time.
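In code terms, the interplay of persistence and Cloud Anchors boils down to: host an anchor once, get back a durable ID, and later resolve that ID from any device to recover the same pose. A minimal, purely illustrative sketch in Python (the class and method names are ours, not any vendor SDK’s):

```python
import uuid

class CloudAnchorHost:
    """Toy stand-in for a cloud-anchor service: it stores poses under durable IDs."""

    def __init__(self):
        self._anchors = {}  # anchor_id -> pose dict

    def host(self, pose):
        """First session: upload the anchor and receive an ID worth persisting."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = pose
        return anchor_id

    def resolve(self, anchor_id):
        """Any later session, on any device: recover the same pose, or None."""
        return self._anchors.get(anchor_id)

service = CloudAnchorHost()
pose = {"position": (1.0, 0.0, 2.5), "rotation": (0.0, 0.0, 0.0, 1.0)}
anchor_id = service.host(pose)
# The app is closed and reopened; only anchor_id was saved to the backend.
restored = service.resolve(anchor_id)
```

The key property is that the anchor ID, not the device, is what survives between sessions — which is exactly what makes persistence possible.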
At its beginning, Channels was an exploration into critical urban issues. Toronto is currently facing a tidal wave of dense residential development, and our access to green and open spaces is rapidly dwindling. As a result, parks — and less traditional public spaces like the Bentway — have become all the more special and sought out by city dwellers.
Urban community gardens, children drawing with chalk, duelling chess players, and unique artistic events are ways in which people play in these public spaces — outside of the confines of their 400-square-foot boxes in the sky. Our team saw an opportunity to make parks even more flexible, or malleable, to the different needs of all urban residents. As technologists, we saw an opportunity to create an even bigger sense of “space” for city dwellers through the flexibility offered by the virtual overlays of augmented reality.
Google’s Just a Line introduced a new way for people to play in AR. Just a Line is an app that lets users draw on the world with their friends. This creative spin got our team thinking: How might AR inspire people — and communities — to play in public spaces, like parks?
While we were thinking about communities, we started looking into the long history of digital community platforms: Internet Relay Chat (IRC) networks, internet forum-based communities like The Well, and more contemporary platforms like Slack. We felt that community and collaboration would be key to making an AR experience compelling for people.
After some time, our team landed on Channels: an AR application for smartphones that helps people who want to explore and contribute to their local communities by providing them with co-creative tools that encourage collaboration.
While Just a Line provides users the ability to create ephemeral drawings in AR, Channels manages something entirely new amongst the plethora of mobile AR apps hitting the market. It allows users to create, revisit, and interact with their creations — and the creations of others — even after closing and reopening the app. This is called persistence, and because it’s not formally supported by ARKit and ARCore, we were exploring brand-new territory. Additionally, rather than every user experiencing the same park, with the same AR creations, users can create communities through “channels.”
To illustrate how Channels works, it’s useful to imagine how it would work through the lens of a story:
Lauren enters her local park and opens up her Channels app. There, she sees different communities she can join: in one, people are playing tic-tac-toe by drawing virtual Xs and Os in the sky. In another, people are leaving virtual sticky notes to indicate budding plant types. In yet another, people have started to use virtual building blocks to design unique sculptural expressions. Lauren joins the tic-tac-toe community, and unfinished grids that she can add to are revealed in the space around her.
Once our team figured out what we wanted to create, we needed to work in a way that would allow the product designers to explore from an experiential and visual perspective, and the engineers to tackle the technical challenges involved in creating a new kind of mobile AR experience. We worked similarly to how we do most of our projects at Connected: an agile, dual-track process where design and technical learnings rapidly feed into one another. We set up daily and weekly cadences for our small team, with stand-ups, collaborative whiteboarding sessions, and prioritization meetings that allowed us to quickly come to a minimal viable product (MVP) vision to work towards.
With a common goal, we got to work. The product designers set off to do benchmarking on digital community experiences in order to better understand how groups in these applications form. This benchmarking work helped the designers articulate a baseline user experience for the app: creating a channel, browsing nearby channels, joining a channel, viewing virtual drawings, and actually starting to draw in AR.
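To make those flows concrete, here is a hypothetical in-memory model of the five operations (all names are illustrative; the real app talks to a backend rather than a Python dictionary):

```python
class ChannelApp:
    """Hypothetical in-memory model of the core Channels user flows."""

    def __init__(self):
        self._channels = {}  # channel name -> {"location": (lat, lng), "drawings": list}

    def create_channel(self, name, location):
        """Flow 1: create a channel anchored to a park location."""
        self._channels[name] = {"location": location, "drawings": []}

    def browse_nearby(self, location, radius_deg=0.01):
        """Flow 2: list channels within a rough lat/lng box around the user."""
        lat, lng = location
        return [name for name, ch in self._channels.items()
                if abs(ch["location"][0] - lat) <= radius_deg
                and abs(ch["location"][1] - lng) <= radius_deg]

    def join(self, name):
        """Flows 3 and 4: join a channel and view its drawings."""
        return self._channels[name]["drawings"]

    def draw(self, name, stroke):
        """Flow 5: contribute a new stroke to the channel."""
        self._channels[name]["drawings"].append(stroke)

app = ChannelApp()
app.create_channel("tic-tac-toe", (43.64, -79.40))
app.draw("tic-tac-toe", {"points": [(0, 0, 0), (1, 0, 0)]})
nearby = app.browse_nearby((43.641, -79.401))
```

The latitude/longitude box is a deliberate oversimplification; it stands in for whatever nearby-channel query the backend actually runs.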
After outlining these five flows, the designers began wireframing the interface. To start, they looked at mobile apps like Ikea Place, World Brush, and more to understand some of the current best practices in designing smartphone-based augmented reality experiences. There is a lot to consider, so our product designers focused on applying a few key learnings: avoiding interface elements that would obstruct users’ viewports or break the overall sense of immersion, and minimizing virtual clutter by leveraging UI components like a floating action button.
In the meantime, our engineers were also hard at work setting up the back-end architecture, which involved both a Firebase database and an AWS database: one to store channels and drawings, the other to store spatial maps and Cloud Anchors.
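One way to picture that split (all field names here are hypothetical, for illustration only): the document store holds channels and the drawings inside them, while the anchor store holds the spatial data each drawing is pinned to, with an anchor ID linking the two.

```python
# Document store (Firebase in our setup): channels and their drawings.
channel_doc = {
    "id": "tic-tac-toe",
    "drawings": [
        {"id": "d1", "strokes": [...], "anchor_id": "anchor-42"},
    ],
}

# Anchor store (AWS in our setup): spatial maps and Cloud Anchors.
anchor_doc = {
    "id": "anchor-42",
    "spatial_map": b"...",  # serialized feature map of the drawing's surroundings
    "gps_hint": (43.64, -79.40),  # coarse location used for discovery
}

# The anchor_id is the join key between the two stores.
linked = channel_doc["drawings"][0]["anchor_id"] == anchor_doc["id"]
```

Splitting the data this way keeps the bulky spatial maps out of the frequently-read channel documents.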
While they were working towards delivering the prioritized user stories, our engineers ran into a feasibility issue with localization. Although they were able to figure out how virtual drawings could remain in specific locations, they soon realized that for those drawings to be viewable by other users, those users would need to be in the precise location where the drawings had been created. Our engineers asked our designers: How can we encourage users to go to a specific spot so that they can actually view and add to the creations of others?
Our team worked together on a solution that leverages volumetric UI elements to encourage users to navigate to the right location, then ‘unlock’ a virtual drawing by sweeping their phone around in a scanning motion.* By working as a cross-functional team, we were able to figure out a way to help users discover virtual drawings in public spaces.
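Conceptually, this interaction is a two-step gate: a coarse GPS check decides which drawings are near enough to surface, and the visual scan then localizes the device precisely. A sketch of the coarse step in Python (the 50 m threshold is an arbitrary stand-in, not a value from the actual app):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lng) points."""
    lat1, lng1, lat2, lng2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))  # 6371000 m = mean Earth radius

def discoverable(user_pos, drawing_pos, threshold_m=50):
    """Coarse gate: only surface drawings the user is close enough to scan for."""
    return haversine_m(user_pos, drawing_pos) <= threshold_m
```

Only once a drawing passes this gate does the app prompt the scanning motion, which compares the live camera feed against the stored spatial map to place the drawing exactly.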
*Note: This interaction is a temporary stopgap, as several cloud AR platforms are now making localization a more seamless user experience.
There’s a lot that we are still hoping to tackle with Channels, from UX and UI audits to evolving with the technical landscape and exploring new ways to facilitate co-creation. Our team is excited for the next round of work — which we will get to once we are off client work, and back to doing bench research.
For many, being “on bench” can be difficult to navigate. We can feel daunted by trying to figure out how to spend our time effectively and can miss the regular momentum and milestones that come with client engagements. However, working on Channels has shown us what’s possible when a cross-functional team forms around a common interest and desire to learn.
Through a shared passion for urbanism and new technologies, our team explored the current realm of spatial computing and, in doing so, started to understand the potential futures it can unlock. As designers, technologists, and product developers, we believe it is imperative that we think more critically about the societal and ethical implications of the future we are helping to build. Our team has spent a lot of time discussing and debating the impacts of Channels and other mixed reality technologies. These discussions have led us to insights that go far beyond our initial MVP vision for Channels and into foresight work around spatial computing.
Thanks for reading! This blog post is part one of four. Stay tuned for other pieces on the future of spatial computing.