Software, Sauce and Sidemen
Evaluating tech at trade shows isn’t what it used to be. In the software era, every conversation has to unpack not just what a product does, but how it’s changed since last time. And if it’s a brand-new release? Even more talking is required to understand the ‘special sauce’ – and whether it can truly transform a workflow or business model.
Gone are the days of simply assessing hardware and giving it a thumbs up or down. It makes today’s trade shows more complex – but also more intriguing.
I didn’t attend the NAB Conference programme this year, but I noticed they were leaning into younger content creators and even launched a ‘Creator Council’ to help bridge old and new ways of thinking. That echoed what we heard at last year’s DTG Summit, where Jordan Schwarzenberger (co-founder of Arcade Media and Sidemen manager) pointed out that younger viewers have already “left the party” of traditional TV – and they’re throwing their own elsewhere.
No shockers on the big topics: AI (obviously), cloud virtualisation, sports-tech convergence, and monetisation in the streaming era. Globally, it’s clear that traditional broadcasters are increasingly feeling the pressure to evolve.
Fast Cars, Fake Faces, and F1 Holograms
Amazon, AWS and Nvidia nailed it when it came to engaging the NAB crowd. Their virtual F1 race setup – complete with e-sports simulators – had attendees queuing up all week. Smartly, they used the experience to generate live content across their stand, showcasing everything from cloud-based news editing to time-addressable media (TAM) and AI-driven storytelling, all hosted on Amazon Bedrock.
And yes, there was a hologram. Specifically, an AI-powered likeness of F1 strategist Ruth Buscombe, who chatted with visitors in multiple languages – thanks to a mash-up of Proto’s Conversational AI tech, Claude via Amazon Bedrock, HeyGen facial animation, and a stack of AWS services including SageMaker, Rekognition and Translate.
Techy interlude alert! You asked, I delivered.
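To make the hologram plumbing concrete, here is a minimal sketch of how a multilingual conversational pipeline like the Buscombe avatar might be wired together: translate the visitor's question to English, ask the LLM, translate the answer back. The function names and the stub translate/LLM stages are hypothetical illustrations, not the actual Proto/HeyGen/AWS integration – in the real demo those stages would call Amazon Translate and Claude on Bedrock.

```python
# Hypothetical sketch of a multilingual conversational-avatar pipeline.
# The translate/LLM stages are pluggable callables so the flow can be
# shown without cloud credentials; in production they would wrap
# Amazon Translate and a Bedrock-hosted model.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AvatarTurn:
    user_text: str
    reply_text: str
    language: str

def converse(user_text: str,
             language: str,
             translate: Callable[[str, str, str], str],
             llm: Callable[[str], str]) -> AvatarTurn:
    """Translate the visitor's question to English, query the LLM,
    then translate the answer back into the visitor's language."""
    english_in = translate(user_text, language, "en")
    english_out = llm(english_in)
    reply = translate(english_out, "en", language)
    return AvatarTurn(user_text, reply, language)

# Stand-ins for the real cloud services (illustrative only).
def stub_translate(text: str, src: str, dst: str) -> str:
    return text if src == dst else f"[{src}->{dst}] {text}"

def stub_llm(prompt: str) -> str:
    return "Pit on lap 23 for softs."

if __name__ == "__main__":
    turn = converse("¿Cuándo paramos en boxes?", "es", stub_translate, stub_llm)
    print(turn.reply_text)
```

The interesting design point is the latency budget: every hop (speech-to-text, translation, LLM, facial animation) adds delay, so keeping the chain short is what makes a live, in-person conversation feel natural.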
From the Moon to MXL
Saturday morning brought an update from the UHD Forum and UHD Alliance over bagels (talks ongoing), and a lucky seatmate: Rebecca Sirmons, GM and Head of NASA+. She’s gearing up to live-stream the Artemis moon landing in 2027. Not your average breakfast chat – but one streaming engineers will definitely want to follow.
The UHD Forum’s service tracker now counts 339 UHDTV services globally – though most, particularly in the UK, are delivered over IP rather than broadcast. France is the exception, planning to shut down HD DTT from 2029, with UHD becoming the default for channels like France 2.
Over at the ATSC 3.0 booth, Next Gen TV was being demoed in all its glory: now in 76 US markets, with Sinclair broadcasting 63 channels using Dolby Atmos. There was a nice demo of 2160p 4K HDR delivered over-the-air at less than 10 Mbps. Still, there’s a way to go politically in the US to get full alignment on an ATSC 3.0 switchover. Meanwhile, some low-power broadcasters are excited by the possibilities of datacasting.
More on that here: watchnextgentv.com
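A quick back-of-envelope calculation shows why 4K HDR in under 10 Mbps is impressive. Assuming 2160p60, 10-bit, 4:2:0 chroma subsampling (typical for broadcast HEVC – these are rough figures, not the ATSC demo's actual encode settings), the uncompressed signal runs to several gigabits per second:

```python
# Back-of-envelope: raw 2160p60 10-bit 4:2:0 versus a 10 Mbps channel.
# Assumed parameters, not the ATSC demo's actual settings.
width, height, fps = 3840, 2160, 60
bit_depth = 10
chroma_samples_per_pixel = 1.5   # 4:2:0 -> 1 luma + 0.5 chroma per pixel

raw_bps = width * height * fps * bit_depth * chroma_samples_per_pixel
delivered_bps = 10e6             # the "<10 Mbps" over-the-air figure

print(f"raw:   {raw_bps / 1e9:.2f} Gbps")      # ~7.46 Gbps
print(f"ratio: {raw_bps / delivered_bps:.0f}:1 compression")
```

Roughly a 750:1 compression ratio while staying broadcast-quality – a useful sense of how hard the encoder is working.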
Deepfakes and Disruption
Synamedia had one of the most talked-about demos: their Senza platform – a cloud-based delivery system that renders the UI for a tiny in-home streaming device, powered from the edge. It’s a real shift in the economics of IP media delivery.
Also on show: Quortex Switch, the first SaaS-based multi-CDN management tool using new content steering standards. As a bonus, Synamedia added a spooky twist – AI-generated deepfake adverts featuring your own face. I’m still not sure everyone wants to star in their own commercial, but never say never…
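For the curious, the content steering standards referenced here (Apple’s HLS Content Steering and the DASH-IF equivalent) work by having the player periodically fetch a small JSON “steering manifest” that ranks CDN pathways, re-checking after a TTL so the service can shift traffic between CDNs mid-session. A minimal illustrative manifest – the pathway names and URI below are made up:

```json
{
  "VERSION": 1,
  "TTL": 300,
  "RELOAD-URI": "https://steering.example.com/manifest",
  "PATHWAY-PRIORITY": ["cdn-a", "cdn-b"]
}
```

A tool like Quortex Switch would sit behind that URI, deciding the pathway ordering from cost, capacity and quality signals.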
Standards, Sensors and Spatial Everything
A catch-up with our HDMI partner Brad Bramy confirmed that the rollout of HDMI 2.2 (96Gbps) continues apace – though aligning the many moving parts of the CE industry remains a challenge. Meanwhile, China’s pushing its own high-bandwidth “GPMI” standard. The old rule still applies: any standard is fine – as long as it’s mine.
Virtual production was everywhere. Blackmagic showed off the Ursa Cine Immersive, a dual 8K camera built for Apple Vision Pro, priced at a mere $29,995 (plus tariffs, naturally). Sony demoed its A9 III with XYN spatial capture, offering photorealistic 3D assets for games, film and the metaverse – all handheld.
At the affordable end, the EVO cinebot ($10,000) brought robotic camera motion to the masses.
Meanwhile, Emergent Vision had 36 GigE cameras capturing mixed reality stages for MetaQuest headsets – giving visitors the chance to step into virtual space, fly around with a joystick, or both.
Ateme also gave us a glimpse of the future with an Apple Vision Pro basketball demo. Let’s just say – when a ball comes flying at your face, your reaction is a lot more real than virtual. Not for the faint-hearted!
Standards Watch
We checked in with colleagues from MovieLabs, SMPTE, CTA, SVTA, DVB, NAB and more. Highlights included:
- MovieLabs’ Ontology for Media Creation – aiming to standardise how the industry describes, exchanges and automates production data.
- NAB’s terrestrial backup plans for GPS positioning, plus a live SRT broadcast demo from the ISS.
- The EBU’s work on HLG and the Dynamic Media Facility (DMF) – an open-source, software-defined production model built with the Linux Foundation and NABA. The DMF’s Media Exchange Layer (MXL) promises a cloud-fit foundation for real-time media workflows. Grass Valley is among the early adopters.
Read more:
And finally…
Of course, no Vegas trip is complete without a little R&R. We were lucky enough to catch the virtual relay of U2’s Sphere concert – a dazzling blend of immersive sound and pixel-perfect visuals. It almost felt like they were really there. Almost.
Now, how do I squeeze 167,000 speakers and 268 million pixels into my front room?
Answers on a postcard…