Designing Information Spaces for Augmented Reality

In the winter quarter of 2019, our team explored how Augmented Reality can benefit different types of journalism. Over the course of the class, we built four unique projects that ranged from translating a sports broadcast into mobile AR to building visual experiences for podcast listeners.

Translating Timelines into Augmented Reality

We began by breaking down how timelines function and what they can accomplish, and we settled on a problem statement: “How might we represent thematically related events over a period of time?” We used this as our guiding question when building our prototype, which we decided would represent Moore’s Law. Put simply, Moore’s Law states that the number of transistors in a computer chip doubles approximately every two years. We wanted to focus on what we believed to be the most important aspect of Moore’s Law: the increase in processing power over time. Through numerous iterations of sketches, we came up with a design to represent this concept.

Moore's Law prototype that we designed on Sketch

Each grey block represents 1.2 million transistors, which we chose as our unit of measurement because it marked the first time a chip held more than a million transistors. For each successive year, we used those unit blocks to represent the growing number of transistors and, consequently, the increase in processing power. After finalizing our design on paper, we set off to prototype our model.
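The arithmetic behind the design is simple enough to sketch in a few lines. This is a minimal illustration of the unit-block mapping and the doubling rule; the baseline figure and elapsed years below are placeholders for illustration, not data from our prototype:

```python
import math

UNIT = 1_200_000  # one grey block = 1.2 million transistors, our chosen unit


def blocks_for(transistors: int) -> int:
    """Number of unit blocks needed to represent a chip's transistor count."""
    return math.ceil(transistors / UNIT)


def projected_transistors(baseline: int, years_elapsed: float) -> int:
    """Moore's Law: the count doubles roughly every two years."""
    return round(baseline * 2 ** (years_elapsed / 2))


# Starting from one block's worth of transistors, eight years later:
count = projected_transistors(1_200_000, 8)
print(count, blocks_for(count))  # prints: 19200000 16
```

Eight years means four doublings, so a one-block chip grows to sixteen blocks, which is exactly the kind of rapid spatial growth the AR timeline had to accommodate.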

Using Google Blocks, we created the individual unit cubes and placed them on larger models for the computer chips. Then we imported these assets into Torch, where we were able to interact with them.

In the end, augmented reality proved an interesting tool for representing scale and size because of the space available, which is especially useful for timelines like Moore’s Law that show growth over time. But AR does have limitations. The space constraints of the real world raise the question of how best to represent events over time. Do we spread the events out along a long hallway? Or do we stack them on top of each other? These are the questions and challenges designers face when creating timelines in AR, which is why each project is context specific.

AR in Sports: Curling

AR in sports has existed for a while, used mainly as a supplement to live sports broadcasts (e.g. player rosters, down markers in American football, or recreations of plays). Sports has long been an area of journalism that pushes boundaries and pioneers the use of new technologies. This cycle, we sought to use AR as an informational and explanatory tool instead of a supplement to something that already exists. Curling is a sport that has drawn a lot of interest in recent years, spurred on by Olympic coverage, but not many people understand it. Given that, we took on the task of making an AR experience that could explain the basic rules and mechanics of curling.

To do this we created original assets in Google Blocks representing the curling rock and the house (the bullseye scoring area). Additionally, we downloaded and modified human-figure and broom assets from Google Blocks to represent the brooms used in curling as well as the players who use them. Using the Torch mobile AR app, we created an experience in which the user essentially takes on the role of a lead (the player who throws the first curling rocks) and pretends to send the curling rock down the ice. As the user moves the rock, the sweepers move forward as well until they reach the house, mimicking the motions a curling team goes through. Lastly, there is an audio voiceover guiding the user through the experience and explaining movement, scoring, and position names.

Rough sketch of curling demo prototype

This prototype makes effective use of interactions in Torch. It features user-object interactions (for example, the curling rock moves when the user gets close to it) as well as object-object interactions, such as a scene change when the curling rock hits the house. These interactions make the prototype immersive and fun, and the audio voiceover makes the experience informative in a casual, non-intimidating way. The prototype also has drawbacks, some specific to our execution and some that generalize to AR and sports. Ours doesn’t encompass all the rules and regulations of curling: after the experience, a user has an idea of how to play but probably couldn’t get up and immediately start a game. The prototype also relies on audio rather than text to explain the game, which may alienate users who are hearing impaired or who simply cannot listen while using the experience. More broadly, AR sports experiences that seek to teach the rules of a game require a lot of physical space if the user is to be fully engaged. The user cannot just sit at a desk; they have to get up and move. This means many users may simply not have a good location in which to experience the AR, and the approach is not suitable for users with physical disabilities.

AR to Illustrate Audio

For this experiment, our team devised a location-based Augmented Reality experience to accompany a conventional podcast. The idea is to guide users through a specific space where they receive complementary visual cues while listening to a traditional podcast. Initially we focused on developing AR visuals to augment podcasts from WBEZ, a nonprofit public radio station broadcasting from Chicago. These podcasts cover local architecture and the history behind what makes these structures significant. Although we found many fascinating stories in them, it wasn’t feasible for our team to visit each off-campus site to construct our location-based experience. So we decided to create our own podcast on the architecture and art at Northwestern that would emulate its WBEZ counterpart. By doing so, we were able to physically be in the space where our podcast takes place, beginning our path toward a location-based AR solution.

We continued by identifying architectural and artistic points of interest around campus, where we stumbled upon a trove of history and information in Deering Library. After this, we knew that the first episode of our Augmented Reality podcast had to revolve around this 86-year-old library. The journey begins outside the west-facing entrance, where a traditional Northwestern photo is typically taken. Users can look around the picturesque architecture while viewing cues about its history and its creators. We then guide users inside the library, where similar visual cues inform them about various pieces of art, such as El Bohemio. This storytelling continues until the user reaches the top floor of Deering, where they are greeted by beautiful stained-glass windows.

Overall, this was a very interesting medium in which to explore the potential of Augmented Reality because it completely changed the way we perceive podcasts. In our work, we found both advantages and limitations to complementing podcasts with AR. As a benefit, visual cues give users a more robust podcast experience, adding layers to the story and encouraging active participation and engagement by physically placing listeners in the space. The main limitation we faced is that our version is hard for the average podcast listener to experience: traditional podcasts are typically listened to on the go, such as during a commute, while this experience requires that the listener make a trip to gain the benefit of the AR component.

AR You Can Use: Cooking

While AR tools exist for measuring size, our aim in this last cycle was to develop a tool that helps users measure volume. We focused on the New York Times Cooking Newsletter, a type of media that brings recipes to life through narrative storytelling, and set out to create an AR tool that helps users replicate the recipes the NYT brings them. While many people have cups and spoons to measure volume when cooking, not many are familiar with the measurements for mixed drinks. Additionally, we wanted an aspect of customizability rather than making everyone follow the same recipe, since a recipe might yield too much or too little depending on the user’s container size: we wanted the recipe to adapt to the user. Our overall goal was to increase accessibility for someone who has a smartphone and wants to make a recipe but might not have the necessary tools to do so, facilitating the process and broadening availability to all users. Additionally, we wanted the user to take part in creating the narrative of the story. Although making a recipe is already interactive, allowing users to do it in AR makes the design much more participatory.

We began our process by selecting a recipe from the NYT Cooking Newsletter to explore, and found a Moscow Mule recipe we wanted to move forward with. We decided to make the design adaptable by allowing the user to measure their own container and then be prompted, via markers in Torch, to pour each ingredient up to a certain level.

Preliminary sketches for augmented reality tool to help pour drinks

Although the actual deployment of this sort of app was beyond our technical skill level, our goal was not necessarily to create something that works but to explore the potential space. Because of this, we used the iPhone’s Measure app to measure a cylindrical cup’s height and diameter and calculate its overall volume. Instead of telling the user to pour a specific number of fluid ounces for each ingredient, we used ratios that adapt to the volume of the specific cup. We marked the different ingredients using lines and prompted the user, with both a voiceover and text, to pour each ingredient up to the corresponding line. The combination of voice and text also helped us address accessibility, making our prototype functional and adaptable for the vast majority of potential users.
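The math behind the pour lines can be sketched roughly as follows. This assumes a cylindrical cup, and the ingredient proportions here are placeholder values for illustration, not the actual NYT recipe:

```python
import math


def cylinder_volume(height_cm: float, diameter_cm: float) -> float:
    """Volume of a cylindrical cup in cubic centimetres (1 cm^3 = 1 ml)."""
    radius = diameter_cm / 2
    return math.pi * radius ** 2 * height_cm


# Hypothetical ingredient ratios in "parts" for a Moscow Mule-style drink.
RATIOS = {"vodka": 2, "lime juice": 1, "ginger beer": 5}


def fill_lines(height_cm: float, fill_fraction: float = 0.9) -> dict:
    """Cumulative height (cm) at which to draw each ingredient's pour line."""
    usable = height_cm * fill_fraction  # leave some headroom at the top
    total_parts = sum(RATIOS.values())
    lines, level = {}, 0.0
    for name, parts in RATIOS.items():
        level += usable * parts / total_parts
        lines[name] = round(level, 1)
    return lines
```

Because a cylinder’s cross-section is constant, dividing the usable height in proportion to the recipe’s parts yields the pour lines directly; this is also why the simple approach breaks down for irregular container shapes, where equal heights no longer mean equal volumes.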

Overall, we felt this experience made the mixed-drink-making session much more engaging and intuitive for our users. While building it, we encountered some limitations of working with AR. First, cooking usually requires both hands, and holding a phone in one hand for measuring makes that difficult; because of this, we determined the prototype would work best on a head-mounted display rather than in an app like Torch. Furthermore, the simple measurement system we developed currently only works for common shapes such as cylinders. With other shapes, measuring accurately is harder and the proportions could be off. Finally, we found that if the container is not transparent, the user might have a harder time pouring the ingredients accurately. Despite all of these limitations, we found that measuring volume in AR to create mixed drinks increased user participation and accessibility.


We learned that for each project to be successful, a key question was required to drive everything forward. These key questions shaped the trajectory of every project.

We also learned that Augmented Reality allows stories to be more immersive, accessible, and easy to understand. It allows users to be active participants in a story, rather than passive consumers.

Augmented Reality can also be used as a tool to complement a story rather than replace it as a whole, making up for written journalism’s weaknesses. It allows users to visualize experiences, utilize their space, and explore distance and scale.

About the project

Information Spaces in AR/VR

An experimental design project that explores the emerging concept of information spaces. This concept is behind Microsoft’s pivot to create future Windows operating systems that exist in mixed-reality headsets. Students will explore the concept of a news/information app that exists in AR and/or VR. For example, a political news feed might exist on a wall in your living room, breaking news might appear on a coffee table, and Twitter reactions might surround the coffee table on the floor. Students will also explore the same concept in a virtual environment.

About the authors

Nia Adurogbola

Device Lab Fellow

Nia is a junior studying journalism from the Bronx, New York. She is currently a Device Lab Fellow who is trying to learn as much as she can about how AR/VR can be used to enhance journalism pieces. Outside of the Knight Lab she can be found working on photography and missing New York City's thin crust pizza.

Michela Ferrari

I’m a UX designer and researcher who believes design and communication go hand in hand. I am currently a senior at Northwestern University, majoring in Communication Studies with a minor in Business Institutions and completing the Segal Certificate in Design. I’m passionate about creating unique solutions that revolutionize experiences for users, whether generated through digital platforms, services or actual products.

Akhil Kambhammettu

Device Lab Fellow

Hey there! My name is Akhil Kambhammettu and I’m a sophomore studying journalism and computer science. I’m from Portland, OR, so I love all things Pacific Northwest (except coffee; I’m not huge on that)! Talk to me about music, psychological thrillers, virtual and augmented reality, Nietzsche, and your favorite podcasts!

Paige Shin