The Weather Channel is at it again. They’ve produced another weather-in-mixed-reality episode to convey the power of the winter storm that blanketed much of the U.S. this weekend (all except New York City, where we’ve had nothing but rain). We looked at their work back in the fall and how they’re using mixed reality to cover hurricanes, tornadoes, and forest fires. It’s an innovative, studio-based approach that requires no post-production. It’s not an immersive experience in your headset – not yet, at least – but it’s still fascinating.
Here’s the video of the effects of a winter storm in mixed reality. It gets good after the first minute.
Conveying the impact of weather in Mixed Reality
There’s an excellent interview in The Verge with Michael Chesterfield of The Weather Channel on the production process behind the episodes. We’ll highlight a few portions of the conversation here, but it’s worth reading in full.
How did you make the graphics we’re seeing in the ice storm segment The Weather Channel just released?
The ice storm experience is an Immersive Mixed Reality segment, where we are actually able to immerse the talent within an environment. We do that through a complete large green screen studio and special technology that allows us to take these experiences to air. It starts with the Unreal Engine, which is a high-end video gaming graphics engine. It allows us to build and adjust these graphics in real time . . .
What are the challenges to making these?
I like to compare this to producing a small movie. We start off with a concept and an idea of what’s going to be topical for this time of year, and then we’ll get together and write up a script, create a storyboard. Our goal is to be as realistic as possible in order to allow the audience to picture themselves in these experiences . . .
These graphics look realistic, but there’s also this video game or movie quality to them — what do you hope this accomplishes that a traditional weather forecast or actual footage of a natural hazard won’t?
What this allows us to do is to play God, if you will, and to make these hyper-realistic videos that show what the storm is going to look like before it actually strikes. This way we’re actually able to marry a forecast with what one can expect in these situations . . .
. . . The realism portion of it is extremely important, because it allows the viewer to put themselves into these situations, much like when we do put somebody out there in the field and we’re reporting live in rain or snow. The idea there is to give people a real sense for what’s going on.
Will immersive experiences desensitize us?
The interview ends with a question we’ve been wrestling with. Could doing the weather in mixed reality or other immersive technologies ultimately desensitize us to its power? If the graphics are made that intense, will we stop taking them seriously? Will we need even more violent scenes in the future to grasp how dangerous storms can be?
Michael Chesterfield doesn’t see that as an issue:
. . . as long as we keep in mind that there is a goal for each one of these experiences and that we’re mixing them up, and that we’re mixing in the information, I think we’re going to be okay on that front.
On that front, we’re not so sure. Though as he notes, there’s more than graphics here. They’re mixing information into the experience. Like The Weather Channel, we want to think that the intensity of the experience will help us understand the dangers of real weather. If we experience the weather in mixed reality, we won’t need to step outside into the real ice storm.
But could it develop in a different direction? Will we experience intense weather events in our VR and mixed reality headsets and underestimate the power of reality? Is the relationship between the virtual and the real like a seesaw, where one side rises only as the other falls?
There’s no simple answer here as we’ve never had a media form that puts us as an actor in the event. Or as we like to say, the character in the story. Right now, The Weather Channel’s amazing immersive productions remain studio-based. And ironically, we watch them on a nice, traditional, flat screen.
But soon enough, storm events will be rendered in our headsets. And we’ll feel like we were actually there. And then the ethical questions will pile up like the snow outside many people’s windows this weekend.
Emory Craig is a writer, speaker, and VR consultant with extensive experience in art, new media, and higher education. He speaks at global conferences on innovation, education, and ethical technology in the future. He has published widely and worked with the US Agency for International Development, the United Nations, and the Organization for Economic Co-operation and Development (OECD). Living at the intersection of learning, games, and immersive storytelling, he is fascinated by AI-based avatars, digital twins, and the ethical implications of blurring the boundaries between the real and the virtual.