"Groundbreaking BDH Augmented Reality Story Simulates World Where Herald Relevant"


The Brown Daily Herald has been around since 1866. It is a legacy publication in every sense of the word, the second-oldest student newspaper among the country's college dailies. I showed up to the offices at 195 Angell Street with an ambitious idea to drag the paper into the 21st century. The Multimedia Editors gave me the green light to start and recruit for an Immersive section, which would be one of the first collegiate teams of its kind producing augmented and virtual reality journalism. The challenge facing the new team was best summarized by a satirical piece in The Brown Noser: "Groundbreaking BDH Augmented Reality Story Simulates World Where Herald Relevant." Even at the campus scale, the Herald was feeling the pressures on the press: waning interest in print, and no consistent video/audio/web team to appeal to students' multimedia inclinations. The paper scatters throughout Providence, in Starbucks and in classrooms, but not enough people open the fold.

At the outset, my hope was to create immersive supplements to print stories: attention-grabbing, innovative pieces meant to help BDH stories go viral. I attended weekly content planning meetings, listened to the Editors run through the publication plan, and chimed in when I thought we could contribute an immersive element. I would suggest making a QR code for an album review, and the room would oooh and aaah; I kept the secret of qr-code-generator.com close to my chest. I started learning what kinds of stories were best suited for immersive adaptation, developing personal criteria coupled with an understanding of the technology's capabilities. A previous summer internship researching toolkits had taught me the slate of authoring software at our disposal, much of which had only been released in 2019.
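The secret, for the record, is how little work a QR code actually takes. Here is a minimal sketch using the open-source qrcode npm package; the URL and filename are placeholders, not actual Herald assets:

```ts
// A minimal sketch: generate a print-ready QR code for an album review.
// Uses the open-source "qrcode" npm package; the URL and filename are
// placeholders, not actual Herald assets.
import QRCode from "qrcode";

async function makeStoryCode(storyUrl: string): Promise<void> {
  // Write a PNG sized to scan cleanly from newsprint.
  await QRCode.toFile("album-review-qr.png", storyUrl, { width: 600 });
}

makeStoryCode("https://example.com/album-review").catch(console.error);
```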

Our plan was to bootstrap the production with no funding, relying on a slate of free software trials and my personal Insta360 camera. The XR stories needed to be accessible to the lowest common technological denominator: we wanted Herald readers, young and old, to need nothing but a cell phone with an adequate software update. No VR headsets, Google Cardboards, or app downloads necessary. This goal limited our production even more: we could only reasonably make lightweight AR experiences for the web, and flat 360° videos for YouTube and Facebook. Moreover, I would never have more than three days to produce an immersive story, given the breakneck pace of the news treadmill.
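In practice, "no downloads necessary" means feature-detecting what a reader's phone browser can do and falling back to flat video otherwise. Below is a minimal sketch of that gate, assuming a WebXR-capable mobile browser; the function names are mine, not our actual code:

```ts
// A minimal sketch of the lowest-common-denominator gate: launch the
// web AR story only if the phone's browser supports WebXR's
// "immersive-ar" mode; otherwise fall back to the flat 360 video.
// The interface is inlined because WebXR typings are not part of the
// default TypeScript DOM library.
interface XRSystemLike {
  isSessionSupported(mode: "immersive-ar" | "immersive-vr"): Promise<boolean>;
}

async function supportsWebAR(): Promise<boolean> {
  const xr = (navigator as Navigator & { xr?: XRSystemLike }).xr;
  if (!xr) return false; // No WebXR at all: an older phone or browser.
  return xr.isSessionSupported("immersive-ar");
}

supportsWebAR().then((ok) => {
  console.log(ok ? "Launch the web AR story" : "Fall back to the 360 video");
});
```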

In this Appendix, I run through the evolution of the BDH Immersive team, reflecting on how each piece we created leveraged the distinct media logic of XR: presence, context, and spatial storytelling. For each piece in the gallery below, I'll explain three elements (Concept, Production, and Response) and stitch them together into a broader arc of the team's learning.

XR Story Gallery

(click on each square to read more)


Reflections


Originally, this chapter about the Herald was slated for the middle of the thesis, meant to demonstrate how I have tried to leverage the distinct media logic of AR and VR in my own journalistic practice. It quickly became clear that most of our Herald stories could not leverage the best of XR — the localized AR content, the 6DoF VR experiences — due to our very limited slate of equipment, funding, and collegiate bandwidth. In the end, the experiments at the Herald were less about flexing the power of AR and VR, and more about learning the production challenges, watching the public's reaction, and thinking critically about what stories on and around Brown's campus were suitable for immersive telling.

From the outset, it was clear that our projects would be many readers' first exposure to XR technologies. To me, this meant the stakes were high: a user's first few immersive experiences can make or break their enthusiasm for XR. The first AR piece, about EEE (Eastern equine encephalitis), set an encouraging standard for our work: users really engaged with the tap-to-continue augmented narrative. The EEE project also taught me what happens when you map the model of linear storytelling onto XR creation. The ability to tap through the narrative within seconds meant that users spent very little time with the experience, and were more drawn to the satisfying UX than to the substance of the article itself. I learned immediately that stories like this need to give the user agency while still holding their attention. Maybe the continue arrow could appear on a delay, keeping people on each slide before moving forward, as in the sketch below. Or better yet, an interactive spatial story could give users the power to tap individual towns on the augmented map to spawn information about EEE in each place.
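The delayed-reveal arrow is simple to sketch. Here is a minimal, hypothetical DOM version; the element ids and the five-second dwell are assumptions, not the EEE piece's actual implementation:

```ts
// A minimal sketch of the delayed "continue" affordance: each slide's
// advance arrow stays hidden for a dwell period, so readers sit with
// the content before tapping through. Element ids and the five-second
// dwell are hypothetical, not the EEE piece's actual implementation.
const arrow = document.getElementById("continue-arrow") as HTMLElement;

function showSlide(slide: HTMLElement, dwellMs = 5000): void {
  slide.hidden = false;
  arrow.hidden = true; // Hide the arrow while the reader dwells.
  window.setTimeout(() => {
    arrow.hidden = false; // Reveal the path forward after the delay.
  }, dwellMs);
}
```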

We followed up the EEE story with three short 360° video stories. I think the most promising piece in our early work is the RI Climate Strike video, which preserves an important event with enhanced fidelity of scale, sound, and space. The video captures the energy of the protests in a way only spherical imagery can: you can look around at the signs, banners, activists, or speakers at your discretion. The crowd's chants filling the space around you foster a real sense of presence in the scene (more so if you experience it in a headset).

But I quickly started to understand why The New York Times shelved its Daily 360 series and why the hype has faded for this flat 360 journalism. These videos are valuable for remote locations and crazy adventures, not so much for dark shuttle ride-alongs. The camera is most powerful when subjects engage with it, looking into the lens and, in effect, looking the viewer in the eyes. Our stories did not leverage the best of the VRNF affordances: moments of eye contact were few and far between, movement was limited, and we were missing the requisite microphones for spatial audio. I also started to grapple with the challenge Gabo Arora described when making Clouds Over Sidra: as a 360 journalist, how can you avoid staging shots? We tried to avoid staging any frames, only carrying the 360 cam around and seeing what we picked up. I acknowledge now that these videos might have been improved had we followed specific characters or interviewed individuals.

In content planning meetings, this learning started to inform my instincts for story selection. We raised the bar for what coverage qualified for 360 adaptation, tilting our focus toward augmented reality work. The next two pieces were AR activations. The Fall Poll data visualization story was a fun day-of challenge and an exercise in transforming 2D assets into 3D interactables. Compared to our first print-tracked AR piece about EEE, this piece did not fall victim to linear storytelling; instead, it gave users complete agency to dictate what augmented content they would see. The four poll questions were presented as a menu, and the only UI mechanics were choosing one of them and returning to the main menu, as sketched below. Indeed, the project encourages users to spend more time with the data, rather than glossing over a series of printed graphs in the newspaper.
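That interaction model is worth spelling out, because it is the opposite of the EEE piece's forced tap-through: a flat, two-level structure with no imposed order. A minimal sketch, with hypothetical names and abbreviated question text:

```ts
// A minimal sketch of the Fall Poll interaction model: a flat menu of
// four questions, each spawning its own augmented visualization, with
// "back" as the only other mechanic. Names and question text are
// hypothetical; no linear ordering is imposed on the reader.
type View = { kind: "menu" } | { kind: "question"; index: number };

const POLL_QUESTIONS = ["Q1", "Q2", "Q3", "Q4"]; // abbreviated stand-ins

let view: View = { kind: "menu" };

function selectQuestion(index: number): void {
  view = { kind: "question", index };
  // ...spawn the 3D visualization for POLL_QUESTIONS[index] here.
}

function backToMenu(): void {
  view = { kind: "menu" };
}
```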