A Live MultiCam VR Broadcast Experience
By the fall of 2020, we had produced dozens of edited pieces in VR but had never tackled a live VR game experience. There are real challenges in placing cameras at a regular-season game in locations that work for VR, but we knew that live games in VR were a top user request.
We presented the idea for a live broadcast in VR called “5G Batting Practice” to our technology partner T-Mobile. We wanted to improve our existing process, uncover the risks of this new type of live VR production, and offer fans of the sport exclusive on-field access during a crucial batting practice. We partnered with our friends at MLB Network to develop the technique for live 360 and to produce an hour-long Postseason preshow that streamed live to viewers in VR.
CREDITS:
Producer
Michael Furno - MLB Network
Alexander Reyna - MLB
Creative Director
Alexander Reyna
Role
Creative/Art Direction, UX, Prototype
Collaborators
Nick Nolan, Streaming Media
AJ Sinker, VR Technical Direction
Geoff Erickson, Engineering
Te Liu, Engineering
We had several technical and design hurdles to overcome.
Our initial production in 2020 was a single-camera experience with the UI overlay and 2D broadcast baked into a single 4K 360 composite video. The overall quality was decent, but broadcast graphics tended to suffer in the compositing process. That 2020 offering pushed the limits of streaming live 4K video over HLS into an Oculus Quest headset, and we had to find a balance between data rate, latency, and image quality.
Live compositing was done at MLB Network using Grabyo. This synced video and HUD content into a single 4K HLS stream but lost some image quality.
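Our production player was native to the headset, but the tuning tradeoff is easy to sketch against the open-source hls.js player. The option values below are illustrative assumptions, not our production settings, and the manifest URL is a placeholder:

```typescript
import Hls from "hls.js";

// Illustrative values only. The core tension: a deeper buffer smooths 4K
// playback but drifts further behind the live edge; a shallower buffer
// cuts latency but risks stalls on weak connections.
const hls = new Hls({
  liveSyncDurationCount: 3,          // target ~3 segments behind the live edge
  maxBufferLength: 12,               // seconds of forward buffer to hold
  abrEwmaDefaultEstimate: 8_000_000, // assume ~8 Mbps until measured
  capLevelToPlayerSize: false,       // 360 video: always take the full 4K rendition
});

const video = document.createElement("video");
hls.attachMedia(video);
hls.loadSource("https://example.com/360/master.m3u8"); // placeholder URL

// The decoded frames then feed a texture mapped onto a 360 sphere.
```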
We doubled down for Home Run Derby 2021.
We went from a single-camera experience to four 360 cameras, added a fifth wearable 2D hat cam, and built a minimap of the field of play that showed ball-trail statistics. In addition, we used our StatsAPI to track batter and ball-trail information and display it overlaid on top of all camera feeds.
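As a rough sketch of that data flow, the public StatsAPI exposes Statcast hit data on the live game feed. Our internal pipeline differed; the gamePk and the render hook below are placeholders:

```typescript
// Polls the public StatsAPI live feed for the latest batted-ball data.
interface HitData {
  launchSpeed?: number;   // exit velocity, mph
  launchAngle?: number;   // degrees
  totalDistance?: number; // projected distance, feet
}

async function latestHit(gamePk: number): Promise<HitData | undefined> {
  const res = await fetch(
    `https://statsapi.mlb.com/api/v1.1/game/${gamePk}/feed/live`
  );
  const feed = await res.json();
  const events = feed?.liveData?.plays?.currentPlay?.playEvents ?? [];
  // Walk backwards to find the most recent event carrying Statcast hit data.
  for (let i = events.length - 1; i >= 0; i--) {
    if (events[i].hitData) return events[i].hitData as HitData;
  }
  return undefined;
}

// Stand-in for the real minimap / overlay update.
function updateOverlays(hit: HitData): void {
  console.log(hit);
}

// Poll a few times a second and hand results to the overlay layer.
setInterval(async () => {
  const hit = await latestHit(0 /* placeholder gamePk */);
  if (hit) updateOverlays(hit);
}, 500);
```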
Pulling this off required a ton of prototyping, testing, and product innovation to solve some very difficult problems, including a system that packed a 2D score bug, powered by Singular, and the 2D broadcast into the same HLS feed to improve performance and keep live data better synced to video. We also experimented with letting the user control the placement of 2D elements such as the game feed and the score bug.
Prototype for live keying of MLB Network HUD on top of 4K video. This is two HLS streams playing at the same time.
Prototype for a user-controlled HUD. Both elements share a single HLS feed but are separated into distinct pieces of geometry.
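The geometry trick in that second prototype can be sketched with Three.js: one decoded video frame backs both the 360 sphere and a free-floating HUD quad, with each mesh's texture cropped to its region of the packed frame. The frame layout and `videoElement` here are assumptions for illustration:

```typescript
import * as THREE from "three";

// Assumed: `videoElement` is the <video> playing the packed HLS feed,
// and the frame layout puts the equirectangular 360 image in the top 90%
// of the frame with the 2D HUD strip in the bottom 10%. Because both
// meshes sample the same decoded frame, video and HUD can never drift.
declare const videoElement: HTMLVideoElement;

// 360 sphere: offset/repeat crop the texture to the 360 region.
const sphereTex = new THREE.VideoTexture(videoElement);
sphereTex.repeat.set(1, 0.9);
sphereTex.offset.set(0, 0.1);
const sphere = new THREE.Mesh(
  new THREE.SphereGeometry(50, 64, 32),
  new THREE.MeshBasicMaterial({ map: sphereTex, side: THREE.BackSide })
);

// HUD quad: samples only the bottom strip. Because it is its own mesh,
// the user can grab it and reposition it anywhere in the scene.
const hudTex = new THREE.VideoTexture(videoElement);
hudTex.repeat.set(1, 0.1);
hudTex.offset.set(0, 0);
const hud = new THREE.Mesh(
  new THREE.PlaneGeometry(1.6, 0.16),
  new THREE.MeshBasicMaterial({ map: hudTex })
);
hud.position.set(0, -0.5, -1.5); // initial placement in front of the viewer
```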
Using GPS for Augmented Reality overlays
We prototyped an approach that used GPS to find the approximate real-world position of each 360 camera, and built tooling that let our team camera-fit a 3D stadium model to the real-world position when GPS failed. This allowed us to accurately place contextual data like ball trails and Statcast statistics on top of the 360 broadcast feed.
Prototype using a stadium digital double matched to the camera position derived from GPS. The base positions are a good match.
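At its core, this is a projection from real-world positions into the camera's equirectangular frame. A minimal sketch, assuming a flat-earth local approximation and a north-aligned frame; a real rig needs per-camera yaw calibration, which is exactly where the manual fit came in:

```typescript
// Given the camera's GPS fix and a point to annotate (e.g., a ball-trail
// sample), find where that point lands on the equirectangular 360 frame.
interface GeoPoint { lat: number; lon: number; altM: number; }

const M_PER_DEG_LAT = 111_320; // approximate meters per degree of latitude

// East/North/Up offset of `p` relative to `cam`, in meters.
function enuOffset(cam: GeoPoint, p: GeoPoint): [number, number, number] {
  const east =
    (p.lon - cam.lon) * M_PER_DEG_LAT * Math.cos((cam.lat * Math.PI) / 180);
  const north = (p.lat - cam.lat) * M_PER_DEG_LAT;
  return [east, north, p.altM - cam.altM];
}

// Project the direction to `p` onto equirectangular texture coordinates
// in [0,1]. Assumes the frame's center column faces due north.
function toEquirectUV(cam: GeoPoint, p: GeoPoint): [number, number] {
  const [e, n, u] = enuOffset(cam, p);
  const r = Math.hypot(e, n, u);
  const yaw = Math.atan2(e, n);   // azimuth from north, radians
  const pitch = Math.asin(u / r); // elevation angle, radians
  return [yaw / (2 * Math.PI) + 0.5, 0.5 - pitch / Math.PI];
}
```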
We gave fans unprecedented behind-the-scenes access to Jewel events.
We launched on Home Run Derby day 2021 with the 5G Batting Practice preshow, immediately followed by an industry-first broadcast presentation of the entire Derby. This activation let fans get up close and personal with the stars of the sport and follow the story in totally new ways. Seeing Pete Alonso hit his final home runs from multiple locations on the field transformed what a live broadcast can feel like.