Hyper-Low-Latency Augmented Reality
This T-Mobile-sponsored activation enhanced the immersive experience, allowed instant access to statistics and replays, and gave spectators in the venue a deeper connection to the 2023 All-Star Game and Home Run Derby. We built a hyper-real-time, AR-powered broadcast experience that kept fans connected to the action with instant context and nuance, from anywhere in the stadium.
CREDITS:
Vice President, Design and Innovation
Alexander Reyna
Role
EP, Product Design
Collaborators
David Santana, Product Design, Art Direction, UX/UI
Geoff Erickston, Lead Engineering
Nathan Tompkins, Senior Producer
Joel Feinberg, Senior Product Manager
Pioneering Augmented Reality
MLB Next, presented by T-Mobile, introduced real-time augmented reality to Major League Baseball for the very first time. Launched during the 2023 All-Star Week at T-Mobile Park, the app empowered fans with an unprecedented ability to explore pitch trajectories, follow ball flight, analyze defensive positioning, and access advanced stats — all overlaid directly onto the live field in roughly 150 milliseconds (less than a fifth of a second!).
What had historically required a broadcast van, multiple analysts, and a high-end TV package now lived inside a fan’s phone, tightly synchronized to the moment they were watching. Fans could tilt, pinch, rotate, and inspect each play from angles that simply didn’t exist in a stadium environment before.
Challenge: The Stadium Experience Hadn't Caught Up to the Broadcast
At the ballpark, fans are limited to a scoreboard and what they can see from their seats.
MLB Next set out to close that gap. To do this, we built a system that could move data from the ballpark to the cloud and back down to client devices almost instantly. That let us deliver real-time data on top of live plays with moving players, and create a shared experience that works at stadium scale.
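As a rough illustration of the fan-side half of that pipeline, here is a minimal TypeScript sketch: a WebSocket subscriber that measures end-to-end latency against a capture timestamp stamped at the ballpark. The endpoint, message shape, and field names are hypothetical, not the production MLB Next API.

```typescript
// Hypothetical fan-side subscriber. The URL and PlayEvent shape are
// illustrative assumptions, not MLB Next's actual feed.
interface PlayEvent {
  playId: string;
  type: "pitch" | "hit" | "out";
  capturedAtMs: number; // stadium-side capture time, epoch milliseconds
  payload: unknown;     // pitch trajectory, fielder positions, stats, etc.
}

const socket = new WebSocket("wss://example.com/live/feed"); // placeholder URL

socket.onmessage = (msg: MessageEvent<string>) => {
  const event: PlayEvent = JSON.parse(msg.data);
  // End-to-end latency: ballpark capture -> cloud -> this device.
  const latencyMs = Date.now() - event.capturedAtMs;
  renderOverlay(event, latencyMs);
};

function renderOverlay(event: PlayEvent, latencyMs: number): void {
  console.log(`${event.type} ${event.playId} arrived ${latencyMs} ms after capture`);
}
```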
Younger audiences in particular expect participation, context, and agency. MLB Next gave them tools to explore the game in ways traditional viewing simply couldn’t support.
Building an AR Experience for Tens of Thousands of People
Variable lighting conditions, wireless congestion, unpredictable vantage points, and the performance constraints of mobile devices required us to rethink how AR should behave at scale.
We built MLB Next around lightweight, responsive 3D scenes that could be recalibrated instantly based on a fan’s location. The app updated play-by-play in real time, ingesting data directly from MLB’s live game feed with timecodes aligned to the on-field action.
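To give a feel for that alignment, here is a hedged sketch: each event carries the wall-clock moment it happened on the field, and the client holds it just long enough to absorb network jitter before drawing, so overlays land on the play rather than ahead of it. The field names and the fixed delay are illustrative assumptions.

```typescript
// Sketch: release each timecoded event at its on-field moment plus a small
// fixed delay that absorbs network jitter. Names and values are assumed.
interface TimecodedEvent {
  timecodeMs: number; // when this happened on the field (epoch ms)
  data: unknown;
}

const RENDER_DELAY_MS = 150; // assumed jitter budget, matching the latency above

function scheduleEvent(event: TimecodedEvent, draw: (e: TimecodedEvent) => void): void {
  const dueInMs = event.timecodeMs + RENDER_DELAY_MS - Date.now();
  if (dueInMs <= 0) {
    draw(event); // already due (or late): draw immediately
  } else {
    setTimeout(() => draw(event), dueInMs); // otherwise hold until its moment
  }
}
```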
Visual Positioning prototype and stadium digital-twin test. Visual Positioning Systems fit baseball data to each person's unique view of the field, and allowed MLB to geolocate fans and limit the experience to those attending the events.
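The geolocation gate can be sketched in a few lines. The coordinates below are approximate for T-Mobile Park and the radius is an assumed value; the production check sat inside the Visual Positioning System described above, so treat this as the simplest possible stand-in.

```typescript
// Venue gate sketch: only fans physically near the ballpark unlock the AR
// experience. Coordinates are approximate; the 400 m radius is an assumption.
const VENUE = { lat: 47.5914, lon: -122.3325, radiusM: 400 };

// Haversine great-circle distance between two lat/lon points, in meters.
function distanceMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius, meters
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

navigator.geolocation.getCurrentPosition((pos) => {
  const d = distanceMeters(pos.coords.latitude, pos.coords.longitude, VENUE.lat, VENUE.lon);
  console.log(d <= VENUE.radiusM ? "AR unlocked" : "Attend the game to unlock AR");
});
```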
The data had to feel instant (on the order of 50 milliseconds!).
The result was an AR system that felt stable, intuitive, and responsive — even in a packed stadium operating at full network load. From the start, MLB Next was designed in collaboration with T-Mobile as a demonstration of what 5G could enable for live sports.
The secret of our success was a blend of WebSockets, ActiveMQ, and on-prem audio and video feeds, allowing near-instant transmission of real-time data. This solution pushed data and audio to 50,000 concurrent fans inside the park as the action unfolded in front of their eyes.
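ActiveMQ exposes a STOMP-over-WebSocket transport, so the fan-side subscription can be sketched with raw STOMP frames. The broker host, port, subprotocol string, and topic name below are placeholders, and a real client would use a STOMP library rather than hand-built frames.

```typescript
// STOMP-over-WebSocket subscriber sketch. Host and topic are assumed;
// 61614 is ActiveMQ's default WebSocket connector port.
const NUL = "\u0000"; // STOMP frames are null-terminated

const ws = new WebSocket("wss://example.com:61614", "stomp"); // placeholder broker

ws.onopen = () => {
  ws.send(`CONNECT\naccept-version:1.2\nhost:example.com\n\n${NUL}`);
};

ws.onmessage = (msg: MessageEvent<string>) => {
  if (msg.data.startsWith("CONNECTED")) {
    // Broker accepted the connection: subscribe to a hypothetical play topic.
    ws.send(`SUBSCRIBE\nid:sub-0\ndestination:/topic/live.plays\n\n${NUL}`);
  } else if (msg.data.startsWith("MESSAGE")) {
    // The body follows the blank line that separates headers from payload.
    const body = msg.data.split("\n\n")[1]?.replace(NUL, "");
    if (body) handlePlay(JSON.parse(body));
  }
};

function handlePlay(play: unknown): void {
  console.log("play event", play);
}
```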
An ultra-low-latency strike zone delivered pitch data the moment the ball entered the catcher's mitt. Fans heard the call and saw the data at the same time.
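For a sense of how a pitch event becomes a strike-zone overlay, here is an illustrative mapping from plate-crossing coordinates to a 2-D canvas. The Statcast-style field names (in feet) and the view bounds are assumptions, not the app's actual schema.

```typescript
// Sketch: draw this batter's zone and mark where the pitch crossed the plate.
// Field names follow Statcast-style conventions (feet); mapping is illustrative.
interface PitchEvent {
  px: number;    // horizontal plate-crossing position, feet (0 = plate center)
  pz: number;    // vertical plate-crossing position, feet above the ground
  szTop: number; // top of this batter's strike zone, feet
  szBot: number; // bottom of this batter's strike zone, feet
  mph: number;   // pitch speed
}

const PLATE_HALF_WIDTH_FT = 17 / 24; // home plate is 17 inches wide
const VIEW_X_FT = 2; // view spans +/- 2 ft around plate center
const VIEW_Z_FT = 5; // view spans 0 to 5 ft above the ground

// Map field coordinates (feet) to canvas pixels.
function toPx(ctx: CanvasRenderingContext2D, xFt: number, zFt: number): [number, number] {
  const { width, height } = ctx.canvas;
  return [(xFt / (2 * VIEW_X_FT) + 0.5) * width, height - (zFt / VIEW_Z_FT) * height];
}

function drawPitch(ctx: CanvasRenderingContext2D, p: PitchEvent): void {
  // Outline this batter's zone.
  const [zx0, zy0] = toPx(ctx, -PLATE_HALF_WIDTH_FT, p.szTop);
  const [zx1, zy1] = toPx(ctx, PLATE_HALF_WIDTH_FT, p.szBot);
  ctx.strokeRect(zx0, zy0, zx1 - zx0, zy1 - zy0);

  // Mark the plate-crossing point and label the speed.
  const [x, y] = toPx(ctx, p.px, p.pz);
  ctx.beginPath();
  ctx.arc(x, y, 8, 0, 2 * Math.PI);
  ctx.fill();
  ctx.fillText(`${p.mph} mph`, x + 12, y);
}
```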
Seamless Instant Broadcast, Inside the Ballpark
Fans experience a disconnect between the broadcast and what they see from their seats.
MLB Next closed that gap.
With Mixhalo, fans received instant ESPN and FOX audio that stayed perfectly aligned with the action. And in collaboration with Red5Pro, we built a WebRTC prototype that captured four MLB "alt" angles, streamed them to the cloud, and delivered them back to fans in 50–150 milliseconds, fast enough to match the crack of the bat with instant visual context.
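Red5Pro ships its own SDK, so the sketch below uses the bare browser RTCPeerConnection API just to show the shape of a sub-second subscribe: send an offer, receive an answer, attach the incoming tracks to a player. The signaling URL and the WHEP-style exchange are placeholder assumptions, not the prototype's actual wiring.

```typescript
// Generic WebRTC subscriber sketch; a placeholder fetch() stands in for
// signaling. One call per "alt" camera angle.
async function watchAltAngle(video: HTMLVideoElement, signalUrl: string): Promise<void> {
  const pc = new RTCPeerConnection();
  pc.addTransceiver("video", { direction: "recvonly" });
  pc.addTransceiver("audio", { direction: "recvonly" });

  // Attach the incoming stream to the player as soon as tracks arrive.
  pc.ontrack = (e) => { video.srcObject = e.streams[0]; };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Placeholder signaling: POST our SDP offer, read the SDP answer back.
  const res = await fetch(signalUrl, {
    method: "POST",
    headers: { "Content-Type": "application/sdp" },
    body: offer.sdp,
  });
  await pc.setRemoteDescription({ type: "answer", sdp: await res.text() });
}

// e.g. watchAltAngle(document.querySelector("video")!, "https://example.com/whep/alt-cam-1");
```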