BBC producers preview the future

Since the 1930s, BBC Research and Development (R&D) has been at the forefront of developing new broadcasting technology. Now, the sheer pace of innovation means it’s hard for most programme producers to keep up with all the latest developments.

And how can you start brainstorming ideas for cutting-edge content if you’re not aware of all the platforms and technical possibilities?

Through Connected Studio, R&D works specifically on developing digital and online pilots. I joined a recent workshop where producers from around the BBC were introduced to some fascinating projects to help them start thinking about what’s next in content production.

Innovations ranged from a smarter way of searching to mixing audio for immersive experiences. The focus was on four very different initiatives:

  • Nearly Live Production
  • Editorial Algorithms
  • Venue Explorer
  • Spatial Audio for Broadcast 

Below are my notes on each. For more details, each heading links to the relevant page on the BBC R&D site.

Nearly Live Production

Here, static UHD cameras at the Edinburgh Fringe Festival were used to expand BBC event coverage with only a short delay behind a true live broadcast. R&D’s Ian Wagdin demonstrated how it worked with three small cameras hooked up via Ethernet to a live feed – of the inside of the room and looking down from a balcony onto a building site outside.

With each camera, you could also add a cropped, more zoomed-in frame, giving the editor six streams to choose from. Users can vision-mix as they go, and go back to swap or clean up shots before the content goes live.

With no camera operators and the streams going to a cloud-based server, this means producers can edit for almost-live broadcast on a laptop – perfect for something like the Fringe, with its tiny venues.
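The six-stream idea above can be sketched in a few lines. This is a hypothetical illustration (the function names and the fixed 2x zoom are my own assumptions, not BBC R&D's implementation): each static UHD camera contributes its full frame plus one centred crop, so three cameras give the editor six selectable streams.

```python
# Hypothetical sketch: each static UHD camera yields its full frame plus one
# centred, zoomed-in crop, so three cameras give six selectable streams.

def centre_crop(width, height, zoom):
    """Return (x, y, w, h) of a centred crop at the given zoom factor."""
    w, h = int(width / zoom), int(height / zoom)
    x, y = (width - w) // 2, (height - h) // 2
    return (x, y, w, h)

def streams_for(cameras, zoom=2.0):
    """One full view and one cropped view per camera, as a flat list of streams."""
    streams = []
    for cam in cameras:
        streams.append({"camera": cam, "region": (0, 0, 3840, 2160)})        # full UHD frame
        streams.append({"camera": cam, "region": centre_crop(3840, 2160, zoom)})
    return streams

mix_options = streams_for(["room", "balcony", "building site"])
```

Because the crops are computed rather than shot, a vision mixer can cut between all six views from a single laptop without any camera operators.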

The producers present were certainly excited about the potential: “You’d be able to cover so much stuff and generate such a huge amount of content… For short-form content it’s a godsend,” was one observation.

Another colleague remarked: “We do Facebook Live and we’re always wondering if we can do more. For drama, we could shoot live behind-the-scenes and get interviews from actors and directors. It’s interesting to get the conversation going about how we can do things differently.”

Editorial Algorithms

Exec producer Olivier Thereaux and colleague David Man have been trying to understand how curators work. Can a computer program ‘understand’ what someone was searching for in the same way a human does – by knowing more about the context of a particular search term when combing through a database of news articles?

Take ‘New Zealand’. What would someone mean by that? News pieces might mention cities or political parties, but not the country, so they wouldn’t necessarily show up in a regular search.

BBC Monitoring advised the team that there are people, places, organisations and other ‘free text’ possibilities (such as acronyms or locally-known people who aren’t otherwise tagged in the system) that are related to the concept ‘New Zealand’.

“Once you’ve got that, you’ve got a universe of things that are relevant to the topic you’re interested in,” said Thereaux. In this case, the algorithm could cluster information like ‘New Zealand politics’ or ‘New Zealand culture’.

The group tested building a search around the concept of ‘cubism’. Cubism and cubist sculpture came up, and they added other related terms like Picasso, Picasso museums, Guernica, etc. Then they could filter results by relevant news sources such as Guardian Culture or NPR Arts.

They could also manually go through search results deleting articles that weren’t relevant, thereby ‘teaching’ the algorithm what they mean when they look for ‘cubism’ in the future.
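The workflow described above – expand a seed term into a bag of related terms, rank articles against it, then ‘teach’ the system by deleting bad results – can be sketched simply. This is my own minimal illustration, not the team's actual algorithm; the weighting scheme and the 0.5 penalty are assumptions for the sake of the example.

```python
# Hypothetical sketch of concept-based search with simple relevance feedback:
# a 'concept' is a weighted bag of related terms, and deleting a result
# down-weights the terms that matched it.

def build_concept(seed, related_terms):
    """A concept maps each term to a weight; all start equally relevant."""
    return {term.lower(): 1.0 for term in [seed] + related_terms}

def score(article, concept):
    """Sum the weights of concept terms appearing in the article text."""
    text = article.lower()
    return sum(w for term, w in concept.items() if term in text)

def mark_irrelevant(article, concept, penalty=0.5):
    """Relevance feedback: down-weight the terms that matched a deleted result."""
    text = article.lower()
    for term in concept:
        if term in text:
            concept[term] *= penalty

cubism = build_concept("cubism", ["cubist sculpture", "picasso", "guernica"])
articles = [
    "Picasso retrospective opens at the Tate",
    "Guernica restoration completed in Madrid",
    "Local football club wins derby",
]
ranked = sorted(articles, key=lambda a: score(a, cubism), reverse=True)
```

The football story scores zero and drops to the bottom, and each deletion nudges the concept towards what the researcher actually means by ‘cubism’.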

Again, the producers were excited by the technology and wished they could implement it immediately. “This would make it a hell of a lot quicker for research,” said one enthusiast.

Venue Explorer

Senior R&D engineer Paul Debenham showed how interactive content can be overlaid onto online livestreams. This can be as simple as being able to zoom and pan around a venue during a show, or as complex as having a musical score and performance notes available at the appropriate time during the stream.

It’s done by having the content prepared and ready beforehand, and then ‘triggered’ by an editor during the show.

Aside from a test with the BBC Philharmonic, the same technology was used during the Commonwealth Games, Debenham explained. The team had the starting line-ups and schedule for the day, so when a user panned around a livestream of the stadium, they could zoom into the track and get a schedule for the next event and switch to the appropriate audio commentary and video stream.
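The trigger mechanism described above lends itself to a very small data model. The sketch below is purely illustrative (the item names and region labels are invented): overlay content is authored before the show, and an editor makes items live in a region, where viewers who zoom into that region pick them up.

```python
# Hypothetical sketch: overlay items are prepared before the show, then an
# editor 'triggers' them live; viewers panning into a region see what's active.

prepared = {
    "next_event": {"region": "track", "text": "Next event: Women's 800m final"},
    "score_note": {"region": "stage", "text": "Movement II: Allegro, bar 64"},
}

active = {}  # region -> currently triggered overlay text

def trigger(item_id):
    """Editor action: make a prepared item live in its region."""
    item = prepared[item_id]
    active[item["region"]] = item["text"]

def overlay_for(region):
    """What a viewer sees when zoomed into this region (None if nothing is live)."""
    return active.get(region)

trigger("next_event")
```

Because everything is pre-authored, the live editor's job reduces to pressing the right trigger at the right moment – the same pattern whether the overlay is a starting line-up or a bar of musical score.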

One producer immediately saw the potential for GCSE study, particularly of set texts – say, Julius Caesar. That spun off lots of other ideas within the group, including teaching people how to do things like follow recipes on cookery shows or learn dance moves. “Imagine if it was on Strictly – you’d get the country on their feet!” was one prediction.

Another producer suggested that having news broadcasts supplemented with extra information could be really helpful to the audience. What about a “Trump-checking” feature, for instance? The consensus was that some extra ‘what is the president talking about, what does he mean?’ context would be in high demand.

Spatial Audio for Broadcast

Audio expert Tom Parnell invited producers to immerse themselves in The Turning Forest, a virtual reality experience with 360 audio, where the sound moves all around you and changes volume as you turn around.

Although this sounds like technology you might only get with high-end VR hardware, radio fans listening to BBC iPlayer on headphones have already been treated to binaural audio at the BBC Proms and on Radio 4’s Fright Night, which Parnell also demonstrated.

Additionally, YouTube’s spatial audio spec and Facebook’s Spatial Workstation support 360 audio in some web browsers, making it easy to reach a wide audience.

One producer wanted to know if you could navigate through a virtual world, hearing a different story depending on where you were. Parnell confirmed that simple panning could send different audio to different areas of the environment – something that could certainly have interesting implications for new types of storytelling.
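The ‘simple panning’ Parnell mentioned can be sketched with a constant-power pan law: as the listener turns, a source's angle relative to their heading changes, and the stereo gains follow. This is a deliberately simplified illustration of the idea – real binaural rendering of the kind used in The Turning Forest relies on HRTFs, not plain panning.

```python
# Hypothetical sketch of head-relative panning: stereo gains follow the angle
# between the listener's heading and the sound source, using an equal-power
# pan law. Real binaural audio uses HRTFs; this shows only the panning idea.
import math

def pan_gains(source_deg, heading_deg):
    """(left, right) gains for a source, given the listener's heading in degrees."""
    rel = math.radians((source_deg - heading_deg) % 360)
    pan = math.sin(rel)                         # -1 = full left, 0 = centre, +1 = full right
    theta = (pan + 1) * math.pi / 4             # map to 0..pi/2
    return (math.cos(theta), math.sin(theta))   # equal power: L^2 + R^2 == 1

# Facing the source: sound is centred with equal gain in both ears.
left, right = pan_gains(0, 0)
```

Turning the listener's heading away from the source shifts the gains smoothly from ear to ear while keeping total power constant – which is why the sound appears to ‘move around you’ as you turn.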

All in all, it was a fascinating look at how quickly technology is moving and how the BBC is working to stay on top of latest developments – trying and refining what works best in the real broadcast and digital world.

And it’s only April. Let’s see what else the year brings in experimental content production.