This team is looking at how to accommodate image formats from a variety of popular 360-degree cameras, including iPhone panoramas. The goal is to make a tool not only for journalists but for any storyteller who wants to work in VR.
Emmy-winning interactive producer & Associate Professor @NorthwesternU, @KnightLab. Formerly of The New York Times. Creator of TimelineJS & StoryMapJS
Based on the outcomes of our Exploring AR Visualizations project in the winter, this project will take the unique visualization forms discovered there and develop a tool that makes it easy for storytellers to build and embed augmented reality visualizations in their stories and projects.
Automated Photogrammetry. As AR and VR increasingly become the focus and we move away from smartphones, media organizations will have to find ways to produce content native to those mediums. These same organizations have a wealth of stories that continue to have value over time. Flat video and photos will become less desirable as we continue to move into these new spaces. Previous projects in the lab have surfaced an opportunity to take existing video and process it with photogrammetry software to produce a 3D model.
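A first step in any video-to-photogrammetry pipeline is sampling still frames at a spacing that keeps heavy visual overlap between consecutive shots, so the reconstruction software can match features across them. A minimal sketch of that sampling step (the function name, the 0.5-second default, and the overlap rule of thumb are illustrative assumptions, not the lab's actual pipeline):

```python
def sample_frame_indices(total_frames, fps, seconds_between_samples=0.5):
    """Pick evenly spaced frame indices from a video for photogrammetry.

    Consecutive samples should share most of their content (a common rule
    of thumb is roughly 60-80% overlap), so the sampling interval is kept
    short relative to camera motion.
    """
    step = max(1, round(fps * seconds_between_samples))
    return list(range(0, total_frames, step))

# A 100-frame clip at 30 fps, sampled every half second:
indices = sample_frame_indices(100, 30)
```

The selected frames would then be exported as stills and fed to the photogrammetry software of choice.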
An experimental design project that explores visualizing data in three dimensions for augmented reality. Visualizations that can be inspected up close or understood by walking around them open up exciting possibilities for communicating complex ideas and revealing hidden truths in data.
Editorial fact-checking is a mess at best and readers don't see the benefits. Typically they doubt it happens or don't appreciate the work it takes to make it happen. On the editing side, almost everyone who does it uses an antiquated process derived from print production habits even though most writers and editors are drafting in Google Docs. This can be better. Let's make it better for both editorial and readers!
An experimental design project that explores an emerging concept of information spaces. This concept is behind Microsoft’s pivot to create future Windows operating systems that exist in mixed reality headsets. Students will explore the concept of a news/information app that exists in AR and/or VR. For example, a political news feed might exist on a wall in your living room, breaking news might appear on a coffee table, and Twitter reactions might surround the table on the floor. Students will also explore the same concept in a virtual environment.
Immersive technology allows creators to engage users in new and novel ways, many of which can make the interactions users have with information easier or more meaningful. This project will look at four different storytelling formats that exist today (a cooking blog, a sports broadcast, a web interactive, and a podcast) and reimagine them for augmented reality using tools like Torch for iOS and Magic Leap’s Create tool.
Juxtapose helps storytellers compare two pieces of similar media, including photos and GIFs. It’s ideal for highlighting then/now stories that explain slow changes over time (growth of a city skyline, regrowth of a forest, etc.) or before/after stories that show the impact of single dramatic events (natural disasters, protests, wars, etc.). This popular tool could be more useful to storytellers and web-makers if it had a couple of key features that have come up in user feedback. Auto-aligning images and animated GIF social sharing are two features that would be great improvements.
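One way the auto-alignment feature could work, at its simplest, is to estimate the translation between the two images by brute-force search over small shifts, then crop both to the shared region. The toy below works on grayscale images represented as nested lists; a production version would more likely use feature-based registration in an image library, and the function name here is purely illustrative:

```python
def best_offset(ref, moving, max_shift=3):
    """Find the integer (dy, dx) shift that best aligns `moving` onto
    `ref` by minimizing the mean squared pixel difference over the
    overlapping region. Brute force, for illustration only."""
    h, w = len(ref), len(ref[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err, n = 0.0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        d = ref[y][x] - moving[yy][xx]
                        err += d * d
                        n += 1
            if n and err / n < best_err:
                best_err, best = err / n, (dy, dx)
    return best
```

For the real tool, a robust feature-matching approach (rather than exhaustive search) would also handle rotation and scale differences between the two photos.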
Though many in journalism are excited about VR, few are addressing real issues with making it attractive and interactive for their audience. This story team will explore the idea of making multiple three-dimensional VR photos around a scene and linking them together so that the user can navigate it. They’ll be exploring complex VR design challenges, such as how to move around space without disorienting the user and how to easily author interactive environments.
Advancements in neuroscience and immersive technologies offer mechanisms for engineering an entirely new mode of performance art, one that engages audiences to an unprecedented degree. Using the latest VR production techniques, students used motion capture and machine learning to teach a computer to improvise a performance, creating an engaging VR experience.
The podcasting landscape is overcrowded, with larger voices from legacy broadcast media sometimes drowning out new entrants. Browsing for new-to-you, quality podcasts is hard, with shows scattered across distribution platforms. This team will explore how we might provide users a better path to discovering new podcasts.
The average person with a smartphone walks around leaking information about themselves over radio signals. WiFi, Bluetooth, and NFC radiate personal information into the public airwaves. These signals can tell you a lot about a person without their knowledge. To raise awareness around privacy and security for digital devices, this project will seek to create a “mirror” that reflects back the information radiating from anyone who stands in front of it. Frequencies include RFID, cell phones, WiFi, Bluetooth, and miscellaneous RF at 900 MHz, 2.4 GHz, and 5 GHz.
Imagine that any wall, building floor, or doorway could come alive and tell stories. Using projection mapping, many artists, advertisers, and industries are already doing it. In this project, students will explore ways in which projection mapping is currently being used and adapt them for journalistic purposes. Students will build and prototype their adaptations.
One of the most common problems we see in data storytelling is how and when to introduce an editorial layer onto a visualization. Mobile devices afford us very little real estate to work with, and interactivity must be limited. But without a “story” layer, users are left without the context to understand what events might impact or inform a trend. They see something going up or down but don’t see why. “Storyline” will be a tool for creating stories around line graphs.
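One plausible way for a tool like Storyline to model its editorial layer is as a set of annotation cards bound to specific points on the line. A small sketch of that binding step (the data shapes and function name are assumptions for illustration, not Storyline's actual design):

```python
def attach_annotations(series, annotations):
    """Bind each annotation to the nearest data point on the line.

    `series` is a list of (x, y) pairs; `annotations` maps an x value
    (which need not fall exactly on a data point) to the card text that
    explains the event behind the trend.
    """
    attached = []
    for ax, text in annotations.items():
        x, y = min(series, key=lambda p: abs(p[0] - ax))
        attached.append({"x": x, "y": y, "text": text})
    return sorted(attached, key=lambda a: a["x"])

# Example: an event dated 2004 snaps to the nearest plotted year, 2005.
cards = attach_annotations(
    [(2000, 1.0), (2005, 2.0), (2010, 1.5)],
    {2004: "Policy change takes effect"},
)
```

Snapping annotations to real data points keeps each story card anchored to a value the reader can actually see on the graph, even on a small mobile screen.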