Bringing AI-Generated Worlds to the LED Volume

“Be Like Me” by NACZZOS at Veles Productions

Overview

“Be Like Me” is a cyberpunk-inspired music video by artist NACZZOS, exploring the tension between human creativity and artificial intelligence. Produced at Veles Productions’ XR Studio, the project blended live performance with AI-generated virtual worlds displayed in real time on an LED volume.

To achieve this ambitious vision within schedule and budget constraints, the production team relied on Volinga’s 3D Gaussian Splat solution, combined with World Labs’ MarbleAI, Google Cloud Veo3, and a real-time virtual production pipeline.

Creative Vision

At the heart of the project was a philosophical question: Can AI truly replace human creativity?

The visual language reflected this tension: cold, futuristic cyberpunk environments contrasted with the raw physicality of a human performer. Neon-lit cityscapes, clone factories, and digital laboratories surrounded the artist, forming a dialogue between the organic and the synthetic.

Virtual production made it possible to visualize this concept live on set, allowing performers and directors to react to the world as it existed, not as something imagined for post-production.

“Our users really enjoy Volinga as an all-in-one package for taking Marble exports directly into Unreal Engine and turning early creative exploration into fully realized worlds.”

– Ian Curtis, Design at World Labs

Technical Challenge

The Veles Productions team faced several constraints:

  • Tight timelines that ruled out traditional environment modeling pipelines
  • Limited budget for large-scale physical builds
  • A need for rapid iteration across multiple environments
  • Seamless integration with LED volume, camera tracking, and set extension workflows
Traditional 3D asset creation would have required weeks or months of preparation. The production needed a faster path from concept to screen.

Why Volinga?

Volinga was selected as the real-time display solution for the AI-generated Gaussian Splat environments created in World Labs’ MarbleAI. By enabling fast ingestion and playback of those environments, Volinga allowed the team to focus on creative decisions rather than technical bottlenecks.

“Volinga was essential for the success of this shoot. It reduced the time required to generate and display environments on the LED wall from months to roughly an hour. Volinga made it possible to iterate extremely quickly. Switching environments and aligning virtual spaces with the physical set became a practical, on-set process rather than a pre- or post-production task.”

– Martynian Rozwadowski, Head of Technology at Veles Productions

Workflow Overview

  • AI World Creation (World Labs MarbleAI): AI-generated stills and environments defined the cyberpunk cityscapes, clone factories, and laboratories central to the narrative.
  • Real-Time Deployment (Volinga): Volinga enabled these AI-generated environments to be displayed instantly as Gaussian Splats inside the LED volume.
  • Cinematic Extensions (Veo3): The same AI-generated source imagery was used to create Veo3 establishing shots, ensuring visual continuity.
  • On-Set Iteration: Volinga allowed fast switching between environments and real-time adjustment of virtual set transforms to align with practical elements on stage.

Virtual Production & Set Extension

The project used a stYpe camera tracking and set extension workflow, enabling shots that extended beyond the physical limits of the LED wall.

This approach allowed:

  • Wider framing than the LED wall alone would permit
  • Consistent parallax and perspective for the main camera
  • Greater perceived scale without additional physical infrastructure

Performers & Creative Impact

One of the most significant outcomes was the impact on performers and the creative team. Instead of acting against a green screen, they were immersed in fully realized AI-generated worlds. This immediacy translated into more authentic performances and stronger creative alignment across departments.

“Seeing the environments live on the LED wall changed everything. Performers reacted naturally because the world was actually there. It felt more like stepping into a sci-fi film than shooting a traditional music video.”

– Martynian Rozwadowski, Head of Technology at Veles Productions

Key Outcomes

The production achieved a dramatically accelerated approach to environment creation and deployment, allowing the team to move from concept to LED volume in record time. Real-time iteration across multiple virtual worlds enabled creative decisions to happen on set, while set extension techniques expanded the perceived scale of the environments beyond the physical limits of the LED wall.

By surrounding performers with fully realized XR environments rather than relying on green screen, the project enhanced performer engagement and helped deliver more natural, authentic performances. Together, these elements resulted in a seamless fusion of AI-generated worlds and live-action performance, supporting both the creative vision and the practical realities of the production.

Looking Ahead

The “Be Like Me” production demonstrated how AI-generated environments and real-time display tools like Volinga can unlock new creative possibilities — particularly for fast-moving, visually ambitious projects such as music videos.

The combination of World Labs’ AI-driven world generation and Volinga’s real-time deployment created a feedback loop where environments could be generated, reviewed, and deployed in rapid succession, enabling creative decisions to happen on set rather than weeks in advance.

Ready to bring AI-generated worlds onto your LED volume? Click HERE to explore what Volinga can unlock for your next production.

Credits

XR Studio Setup: Veles Productions XR Studio
Gaussian Splat Solution: Volinga 
AI Worldbuilding: World Labs MarbleAI
AI Shot-Creation: Google Cloud Veo3
AI Visuals: KlingAI
LED Volume: ROE Visual Ruby 1.9 Videowall
LED Processing: Megapixel Helios Processor
Camera: RED Digital Cinema Komodo-X
Camera Tracking: stYpe RedSpy 3.0
Set Extension Workflow: stYpe
Motion Capture: Internal MoCap System (Veles Productions)