Flux:: sound and picture development was founded in the 1990s, during the early days of digital audio workstation software, collaborating with Merging Technologies on the creation of Merging’s now well-renowned products.
We are pleased to invite you to a two-hour live online training session on the FLUX:: IRCAM Spat Revolution software.
In this training session we will explore the use of the software, learn how to design various types of systems, study use cases, and look at different ways to integrate with third-party elements.
The training includes a 90-day license of Spat Revolution, so you can follow the training steps hands-on during the session and then take a deeper dive at your own pace.
Our qualified FLUX:: Immersive Consulting Group team members will be online to assist and to answer your questions live.
This training session will be divided into four main modules of roughly 30 minutes each. As we are delivering this session live, feel free to ask your questions in the chat window; my colleagues will do their best to respond. At the end of each module we will review some of the important questions received.
The post Introduction to Spat Revolution – Online Training Session appeared first on Flux::.
Tokyo, Japan – April 2020…
Shinjuku has a long history as one of Tokyo’s most vibrant areas, and Waseda is arguably the district’s cultural and creative nexus. The area is home to the renowned Waseda University, as well as numerous museums, theaters, nightclubs, and performance spaces.
Waseda’s newest venue, the Artware Hub, opened its doors in late 2019. Commissioned by the Kakehashi Arts and Cultural Foundation, the space is the realization of a lifelong dream of Roland Corporation founder and visionary Ikutaro Kakehashi.
What appears at first glance to be a fairly typical live performance venue is, in fact, much more. Designed as an experimental acoustic space, the facility incorporates an unparalleled 36.8 multi-channel immersive audio system based around FLUX::Immersive’s SPAT Revolution software engine.
The concept for the space was brought to life by Ikuo Kakehashi, the Kakehashi Foundation’s Director, and son of the famed inventor, working with producer/engineer Keiichi Ito. As Ikuo explains, his father was always cognizant of the relationship between music and technology, and the impact of an environment on creative collaboration. “My father used to say that no matter how good the instrument maker made the instrument, it would be meaningless without a proper place to experience it,” Ikuo observes.
The venue takes its name from the term “Artware,” coined by Ikutaro Kakehashi. The term refers to not just hardware and software, but also to the human experience of an artistic performance. “We wanted to create a space that was more than merely a concert hall, that was a place where content of all kinds could be realized, and an environment where multiple people could gather and share in that performance,” Ikuo explains.
The immersive listening experience was an important part of the performance environment for Ikutaro Kakehashi. As early as 1991, he had developed a 3D audio processor dubbed RSS – Roland Sound Space – one of a number of early attempts to create an immersive experience using regular stereo speaker configuration. “In designing the space, one of the most important areas of content creation for us was immersive audio,” Ikuo confirms.
At the heart of the immersive audio system is FLUX::Immersive’s SPAT Revolution, providing acoustic simulation that allows the user to freely arrange input sources in a virtual space, with loudspeakers placed to create virtually any configuration of sonic landscape.
The system is configured using a single purpose-built computer. An RME HDSPe MADI FX interface provides up to 64 channels of MADI I/O at 96 kHz across the 36-channel system. An Avid S6L console offers tight integration with SPAT Revolution for controlling input sources. Since the actual FOH mix position is program dependent, the S6L is purposely housed in a road case so it can be moved to the appropriate sweet spot.
Audio sources from the stage, as well as Pro Tools outputs, are connected to the S6L via AVB, and signal is routed to SPAT via MADI. An Avid MTRX is at the core of the Artware Hub system, providing the routing matrix along with system monitoring and control. Outputs are converted from MADI to Dante for system distribution.
The multi-channel speaker arrangement comprises 24 in-wall speakers at two elevations, 2.5 meters and 5 meters, each elevation holding 12 speakers arranged in a 360-degree circle, 30 degrees apart. Nine more speakers – eight in a circle and one at zenith – are attached to a central grid mounted to the ceiling. Three more speakers occupy temporary locations on the front floor for special effects. Eight subwoofers in four cardioid pairs complete the loudspeaker configuration.
Since its opening in the fall of 2019, the Artware Hub has hosted a wide range of performances and garnered high praise from visitors and performers alike. Thanks to the efforts of Ikuo Kakehashi and Keiichi Ito, and innovative technologies like SPAT Revolution, Ikutaro Kakehashi’s vision can finally be realized.
The post Waseda’s Artware Hub Brings Immersive to Life with SPAT Revolution appeared first on Flux::.
Los Angeles, CA – March 2020…
Like many audio professionals, Braeger Moore started out as a performing musician before turning his attention to recording and audio engineering. A native of Texas, he graduated from Texas Tech, then attended the Conservatory of Recording Arts and Sciences in Phoenix, where he met future business and creative partner Nate Greene. The two migrated to Los Angeles in 2013 and opened a small recording studio, mostly producing, mixing, and mastering records for hip hop and indie artists.
Although the studio proved profitable, Moore and Greene sought new challenges. When they met Clay Schmitt in 2017, the pieces fell into place, and the three partners formed Canopy LA, a sound studio focused on developing immersive audio for interactive, story-driven experiences. It wasn’t long before Moore’s position as Technical Director and Audio Technologist led him to FLUX::Immersive’s Spat Revolution real-time 3D audio mixing software.
“We were interested in immersive audio for sound design, film, games, and installs,” Moore recalls, “and installs caught our attention first. We wanted to work with director Adam Amaral of Master of Shapes but he was working with game engines, and we didn’t have experience with that.” To learn the tools, Moore designed a game, in collaboration with his two partners. They showed the game to Amaral, who agreed to work with them. That was Canopy LA’s entrée into installation work.
By late 2018, Moore was looking for the best way to tap into complex speaker arrays, including nontraditional ones. “I picked up Spat Revolution and learned my way around it, with help from FLUX::Immersive,” he relates. “Then a project came along for Facebook/Oculus where we had the opportunity to use Spat to achieve our vision.”
The project was a temporary install for an event at the historic Los Angeles Theatre. “People showed up to watch a movie and had no idea what was about to happen,” recounts Moore. “Previews start to roll, it looks like a normal commercial for a game, and then all of a sudden — boom!— the commercial explodes out from the front screen and lights up the entire roof, the sides, and all the way behind people. We showed immersive commercials for four different games, which required a combination of projection mapping of the entire theater in 360° in conjunction with holograms at the very front.”
The three-level venue seats more than 2,000 people. “We’d never done anything that huge before,” Moore discloses. “We knew we would use a lot of speakers and high channel counts, and we wanted to address each of the speakers discretely. Spat Revolution is the best and most intelligent choice for that. Spat is great to work with. You can design the layout specifically and modify it creatively. You have a bunch of different algorithms to choose from, and certain objects and motions suit different use cases. There are no boundaries to Spat except computing power, and it runs on a Mac or Windows PC. I shopped around for spatial audio solutions, and one person wanted $40,000 for a standalone hardware unit; that will never work for us.”
The Facebook/Oculus project didn’t use conventional commercials with voiceover. “The idea was to put you in the game,” clarifies Moore. “A good example of using Spat, and one of the most impressive for the people experiencing it, was the commercial for a game called Moss. This serpent starts in the front of the screen and crawls all the way around the back of the theater, crawling through the walls and being hidden and then exposed, so you know he’s around you. Then he shows up back at the front of the screen and roars. We had 25 discrete addressable channels of audio, so as the serpent crawled around behind people, we tied the sound to the motion, so you hear him moving through rocks and making creepy noises and that sort of stuff. The sound varies as he’s moving through the different surfaces and areas. It was super fun.”
In another commercial, a discus flies around the room and bounces off the walls. “That was one where I wish we had more speakers but Spat definitely helped that sound of bouncing around the room,” Moore states. “We did different effects for each area.”
Moore can’t imagine accomplishing this type of project without Spat. “Doing a smaller array that sums a lot of speakers into left and right wouldn’t be nearly as effective,” he notes. “And rendering each channel individually would be impossible. With Spat, each speaker was accounted for in the model and carried the correct part of the program. We’re usually working on really tight timelines, and the hugest optimization we implemented was using Spat to deal with the entire mix all at once.”
Canopy LA also used Spat at a recent Verizon off-site Super Bowl celebration. “The part involving Spat was unplanned until the day before launch,” muses Moore. “The main experience was a 7.1 presentation in a dome, which we did. The day before it went live, the client asked us to design sound for the hallway leading into the dome, using ten speakers down the hallway. So I created a Spat session at home, modeled the hallway, and designed effects emulating going out into a stadium. At the front of the hallway was a loud crowd, and as you got toward the back of the hallway it became a reverberating, less present crowd so it felt like you were getting further away but with the echoes of the hallway. Spat was perfect for that. And we did it in one night. Anybody else probably would have turned that down but with Spat, we could do it. My team and I love Spat because it makes high-channel-count, custom projects possible, and we can do them quickly.”
Orléans, France – February 2020…
It has been a banner year for FLUX::Immersive, a pioneer in audio plug-in design and immersive technologies. The company enjoyed tremendous growth in 2019, and enters 2020 primed for even more expansion. Unified under the FLUX:: brand, the company now encompasses multiple divisions focused on engineering and business development.
The company’s home office in Orléans, France is now home to Flux:: Software Engineering (SE). Led by company founder and President Gaël Martinet, Flux SE is focused on software research and development, including the company’s highly acclaimed SPAT Revolution immersive software. Flux SE’s third-party development projects include prototyping and development of audio plug-ins, workstation platforms, and software processing libraries and algorithms, working with such high-profile collaborators as IRCAM, Merging Technologies, and Jünger Audio.
The recent opening of Montreal, Canada-based Flux Technologies, home of the Flux:: Immersive Consulting Group, provides the company with a strong presence in North America. Technology veteran and long-time key partner Hugo Larin joins the company as a shareholder and Director of Business Development, and will spearhead global expansion of the company’s sales and marketing network, oversee integration with hardware manufacturers, and expand the company’s growing presence in the live entertainment sector.
“We have come a long way since I founded Flux 20 years ago, and it’s been an amazing journey,” remarked Martinet. “And yet it’s clear that this was just the beginning. Our growth now has dramatically accelerated. We have some exciting new products and projects in the works, and the future is very bright for Flux.”
“Flux products speak for themselves, and our collaborations with technology leaders have given us a unique perspective on the needs and demands of industry professionals worldwide,” added Larin. “From our humble beginnings as a single-owner software engineering startup, Flux has grown to become a customer-focused leader and innovator. Today Flux technology can be found in the world’s leading performance venues and studios. We’re immensely proud of the team we’ve built, and the best is yet to come.”
For more information on Flux:: visit www.flux.audio or connect with #fluximmersive on social media.
Orlando, FL—February 2020… Acclaimed audio engineer and sound designer Dan Scott has helped create Emmy award-winning television programs, as well as films, rides, and Broadway-caliber theater shows in venues around the world. A member of the Orlando-based Ty Fy Studios team since 2007, Scott has worked for such elite clients as LucasFilm, Cartoon Network, Fox, Walt Disney Entertainment, and Walt Disney Imagineering. Since 2016, he has relied on FLUX:: Immersive’s Spat Revolution real-time 3D audio mixing software to create realistic immersive virtual audio environments.
“The first show that I used Spat Revolution on was for a cruise line,” recalls Scott. “The ship had a new, large-scale theater show with a lot of immersive technologies. They were putting time and money into making the show look brilliant, with a ton of new video technology. They used small set pieces and a giant 6K-resolution curved LED video wall with shading and mapping that tracked the set, so the video reacted to what was happening onstage. There were projections on the floor and on the walls, and the scenes were dynamic and changing. They did a great job of changing the feel of the room for every scene. It was really beautiful.”
Scott wanted to scale up the show’s audio technology to complement the high-end video, but the theater had a conventional audio infrastructure. “With Spat I was able to use every individual speaker to create a virtual room instead of just panning between matrixed speaker clusters. I used Spat to spatialize dialog, effects, and music without any additional hardware.”
In theme park and cruise line productions, setting the mood is crucial to the experience, Scott observes. “From the moment they walk into the venue, you need to take the guests away from the thousands of people they’ve been surrounded by all day and transport them into the story you’re telling. The preshow walk-in for this specific show was an ambient soundscape to put the audience inside a craftsman’s workshop, including torches projected on the walls along the aisles, conveniently right on top of the speakers,” he describes. “Those were my favorite ambient elements of the scene. With Spat I was able to put different subtle crackling on every torch, and as you walked by each one you heard the torches’ sound go by. Spat gave us the ability to make it truly immersive and different for each person in the audience.”
During the walk-in, Scott was able to give audience members the effect of some elements coming from outside the walls of the workshop, in addition to the sounds coming from within, before the actor arrived onstage. “You hear him get up and walk, his chair scrapes, and it sounds as if he were backstage, even though we’re just using normal line array speakers up front,” Scott observes. “Spat Revolution placed everything so well that it sounded like he walked across the stage, wound a music box, and listened to a song play for a few moments; then he’d walk the other way and try one of his other music boxes. When the show started, we placed the sound right where the actor came up on a lift holding one of the music boxes we had just heard.”
Panning audio in a multi-speaker system is not new, Scott points out, “but Spat glues everything together so it doesn’t feel like you’re just panning from speaker to speaker. The virtual room adds reflections, and you’re localizing sources in the room by using every individual speaker to help create the environment, instead of just changing the volume percentage across the speakers.”
“I love that Spat automation is controlled by, and saved directly within, our Pro Tools sessions,” Scott continues. “We’ve used it in shows set outdoors in the countryside, dungeons, race tracks, you name it. It’s awesome to create what we want each space to sound like and grab automation snapshots to recall through the timeline of the show. Then anything we throw into that room feels like it’s in the right space for the scene. And it’s not just a sound coming from a speaker. You’re virtualizing the entire environment and utilizing every speaker in the system.”
Spat Revolution also enables the Ty Fy team to develop abnormal room layouts. “That’s big in theme parks,” Scott adds. “I worked on a Halloween attraction in Hong Kong where there were segments with multiple connected rooms, and I was able to easily move sounds realistically between them. The main sound source was placed in one room but you would hear the ambience in the next room over, so you get a sense of what’s coming. This was all set in a giant castle that was made of styrofoam and wire, so it didn’t have a natural acoustic ambience that matched the visual—but we could create an ambience that made the space feel much bigger than it actually was. It’s a lot more realistic with Spat than with previous methods, and it’s a lot more achievable because it’s faster and less expensive.”
Scott just finished a 360-degree film that was mixed in Dolby Atmos. “We were using the Dolby setup but I knew I could achieve some elements better in Spat because I could virtualize the entire space,” he notes. “Atmos is great and gives you access to a large number of speakers but it’s only discrete panning; it’s not filtering or adding ambience. It’s so great to be able to take my sources into Spat, quickly design a room layout using either a CAD drawing or some quick laser measurements for accurate speaker locations, and source the outputs through the Atmos system. With Spat it was easier and faster to get more believable sounding movements.”
Scott has used Spat so successfully that clients specifically ask for it. “The reaction has been unanimously positive,” he reports. “Spat gives us more options and frees us. It opens up the creative process.”
The post Ty Fy Studios’ Dan Scott Creates Immersive Experiences with Spat Revolution appeared first on Flux::.
Merging Technologies and FLUX:: Immersive (Booth 7-P182)
We are pleased to join our longtime manufacturing partner Merging Technologies in previewing the integration of Spat Revolution with the Ovation Media Server and Sequencer, a standard of excellence in show control.
This integration will show how both applications can be used for immersive audio content creation and, most importantly, provide the option to create cues based on OSC recording and editing. In addition, this integration makes it possible for Ovation mixer parameters and automation to communicate with Spat Revolution source objects.
FLUX:: Immersive and BlackTrax Real Time Tracking (Booth 3-C125)
This year, Spat Revolution will be the spatial audio engine of choice for the Cast Group of Companies’ exhibit of their real-time tracking solution BlackTrax. Together with Adamson loudspeakers, the audio demo will consist of a Spat Revolution audio engine interconnected with the world’s first Milan-ready (AVB) network-redundant loudspeakers.
The post ISE 2020 Amsterdam February 11-14 at the RAI Amsterdam Convention Centre appeared first on Flux::.
Spat Revolution is a software engine running on standard computer hardware, dedicated to offline show content creation workflows and real-time live applications.

Computer Hardware
Running on generic hardware means that a vast pool of audio interfaces (e.g., MADI, network AVB, or Dante/AES67 virtual audio entities) and a wide range of sample rate options (from 44.1 kHz to 384 kHz) are available for the system device setup. Low latency can be achieved with small block-size options (starting at a block size of 16 samples), the appropriate choice of audio interface, and most importantly, a computer sufficiently resourced and optimized for real-time audio. The actual Spat setup (number of sources, rooms, and such) has no impact on latency, which is predictable, fixed, and can be easily defined in accordance with the user’s hardware setup.
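As a back-of-the-envelope check on these figures, the latency contributed by one processing block is simply block size divided by sample rate (real round-trip latency also includes interface buffers and converter stages, so measured figures are a small multiple of this). A minimal sketch:

```python
def block_latency_ms(block_size: int, sample_rate_hz: int) -> float:
    """Latency contributed by a single audio buffer, in milliseconds."""
    return block_size / sample_rate_hz * 1000.0

# A 16-sample block at 96 kHz adds well under a millisecond per buffer stage,
# while a typical 512-sample block at 44.1 kHz adds more than 11 ms:
print(round(block_latency_ms(16, 96000), 3))   # 0.167
print(round(block_latency_ms(512, 44100), 1))  # 11.6
```

This is why both a small block size and a high sample rate matter for live use, and why the fixed, setup-independent latency noted above is straightforward to budget for.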
Simultaneous output systems (virtual spaces called rooms) with more than 32 channels of networked 48 kHz audio, or hardware dedicated as a real-time audio engine, typically require specific validation.
In most application scenarios, a dedicated computer is preferred. While memory (RAM) resources are not critical, multi-core CPU processing power is, as parallel processing is used to handle each source in the software. Because the software offers 3D control areas, an efficient graphics card with at least 4–8 GB of memory is also required. Appropriate optimization of the computer will then achieve optimal system performance.
For those not inclined to DIY system design, hardware can be specified (and provided) by integrators or via the FLUX:: Immersive Consulting Group, offering a range of services for system deployment and integration. Channel count, sample rate, audio distribution method, and any required third-party control integration are the basic parameters when defining proper system design elements.

Live production, immersive technical services, and consultancy
To support live production deployments, technical services are available from FLUX:: Immersive Consulting Group, ranging from fully configured hardware/software system packages based on specific project requirements, to pre-design, guidelines, deployments, system tuning, commissioning, and training.
To support live production, new software options will be available in 2020, offering live application-specific feature sets.
With content creation workflow a critical part of the live production process, Spat Revolution’s ability to run on generic hardware offers the sound designer an opportunity to start an offline conception on a local computer (without the need for specific hardware). Spat Revolution seamlessly integrates with a variety of DAWs and playback systems, allowing for local inter-application audio transport and automation. Compatible third-party systems include Ableton Live, Nuendo, Pro Tools, Reaper, QLab, and many others. Spat Revolution’s production suite includes three plugins (AU, AAX, and VST): Spat Send, Spat Return, and Spat Room. The user can build automation network cues, write automation of an immersive mix to a timeline or to cues, preview the results (binaurally or on a smaller-scale speaker arrangement), and build a show that is ready to move to dedicated live playback systems and a Spat Revolution live computer engine.
From creation to delivery!
The creation phase can be managed on a single computer, and easily migrated to a network of show controllers, audio playback, mixing desks, and immersive audio engines.

Real Time Software Engine
Working in a live, real-time engine mode requires control by a variety of third-party systems over the network. This integration is accomplished via the Spat OSC (Open Sound Control) dictionary, or via the presence of Spat plugins on the network from the DAW or playback system. Live console integration provides encoder control for each source in Spat Revolution, and takes advantage of mixing desk snapshot capabilities to create automation with timing interpolation parameters. This integration is currently offered with the Avid S6L live sound console via the Spat Send plugin, or using generic OSC commands with mixing consoles offering OSC control, such as the DiGiCo SD series. The power of OSC means that any third-party system with this capability can potentially be part of the immersive audio control. Also on the subject of real time, Spat Revolution offers support for real-time tracking systems such as BlackTrax (RTTrPM protocol support), Zactrack, and Stagetracker, which can be integrated into this real-time immersive audio package.

Redundancy
Redundancy and the need for complete failover being critical in live production, the Spat Revolution immersive system can operate in redundant mode, with the secondary system simultaneously outputting discrete audio channels as back-up output, routed to a loudspeaker management system or back to the audio mixing system. For this, two Spat Revolution systems (a Spat license includes two activations) can be deployed to simultaneously receive audio feed and automation. While the Spat Send plugin allows for dual OSC output, each targeting the systems in parallel, other controllers (Lemur iPad controllers, OSC network cues, etc.) can target both systems as well. Audio is output from both the primary and secondary systems, providing redundant sources to the diffusion loudspeaker system.
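As a sketch of the dual-target control pattern described above, the stdlib-only Python below hand-encodes a minimal OSC message and sends the same position cue to a primary and a secondary engine in parallel. The address pattern, hosts, ports, and trajectory are illustrative assumptions; consult the Spat Revolution OSC dictionary for the authoritative paths.

```python
import math
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to 4-byte boundaries.
        return b + b"\x00" * (4 - len(b) % 4)
    type_tags = "," + "f" * len(args)
    data = pad(address.encode()) + pad(type_tags.encode())
    for value in args:
        data += struct.pack(">f", value)  # big-endian float32
    return data

def circle(steps: int, radius: float = 2.0):
    """Yield (x, y, z) points on a horizontal circle around the listener."""
    for i in range(steps):
        angle = 2.0 * math.pi * i / steps
        yield radius * math.cos(angle), radius * math.sin(angle), 0.0

# Primary and secondary engines (hosts and ports are assumptions).
TARGETS = [("127.0.0.1", 8000), ("127.0.0.1", 9000)]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for x, y, z in circle(36):
    msg = osc_message("/source/1/xyz", x, y, z)  # assumed address pattern
    for target in TARGETS:
        sock.sendto(msg, target)  # the same cue reaches both engines
```

Because UDP is connectionless, sending the identical datagram to both engines keeps them in lockstep without the controller needing to know which one is currently feeding the loudspeaker system.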
To this end, Spat Revolution can be deployed strictly as a remote control (no audio processing), targeting OSC commands to two Spat computer engines dedicated to audio processing.

Immersive Audio Spatialization Techniques
Spat Revolution addresses panning over multi-channel immersive audio systems without requiring a closed framework, adapting easily to the speaker arrangements of various productions, with spatial audio techniques and panning methods to suit different applications: binaural, Higher Order Ambisonics (HOA), and 2D/3D channel-based methods including WFS, VBAP, DBAP, KNN, SPCAP, and more. Multiple virtual spaces in the software engine, called “Rooms,” allow you to deliver to multiple diffusion systems (virtual spaces using all or part of a speaker arrangement) and offer extensive flexibility in creating custom 2D or 3D speaker arrangements. This allows addressing unconventional stage setups where five, seven, or more speaker hangs are spread across the stage with roughly equal separation, or arrangements of multiple loudspeakers in arbitrary locations. It is important to note that these virtual spaces each include acoustic simulations (reverb engines) to generate early reflections localized with each source, along with a reverb tail diffused at the outputs, creating a sense of depth and reality!
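To make one of these panning families concrete, the sketch below implements a simplified distance-based amplitude panning (DBAP-style) gain computation: speaker gains fall off inversely with distance to the virtual source and are power-normalized. This illustrates the general technique only; Spat Revolution’s actual DBAP implementation, parameters, and defaults may differ, and the layout and blur value here are invented.

```python
import math

def dbap_gains(source, speakers, spatial_blur=0.1):
    """Simplified distance-based amplitude panning: each speaker's gain is
    inversely proportional to its distance from the virtual source (about
    6 dB of rolloff per doubling of distance), normalized so the gains
    carry constant total power."""
    def distance(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    # The blur term keeps gains finite when the source sits on a speaker.
    d = [math.hypot(distance(source, s), spatial_blur) for s in speakers]
    weights = [1.0 / di for di in d]
    norm = math.sqrt(sum(w * w for w in weights))
    return [w / norm for w in weights]

# Four speakers in a square, source close to the front-left speaker:
layout = [(-1, 1, 0), (1, 1, 0), (-1, -1, 0), (1, -1, 0)]
gains = dbap_gains((-0.8, 0.8, 0), layout)
# The nearest speaker dominates, but every speaker contributes a little.
```

Unlike VBAP, which drives only the loudspeakers nearest the source direction, DBAP weights every speaker, which is why it suits arbitrary layouts and audiences distributed with no single sweet spot.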
On the subject of multiple virtual rooms in Spat Revolution, it is important to mention that in the creation phase or a studio-style scenario, simultaneous output streams are delivered by recording multiple stream formats of the same immersive mix in the DAW. For example, a sound designer can create a binaural output preview on headphones while on a plane, and a surround speaker arrangement for studio work, while simultaneously creating content for the multi-speaker arrangement of the actual show itself, which may have speakers by the dozen!
Spat Revolution also offers advanced virtual source parameters, from the simple (basic radiation control of azimuth, effective 360-degree pan, and basic source distance) to very complex situations where multiple perceptual factors are controlled.

Software engineering and technology
FLUX:: has been a software development partner of the French research institute IRCAM (www.ircam.fr) since 2008, and Spat Revolution is the result of decades of research and achievements. Many of these technologies have been successfully deployed in live sound installations, with products including Spat for Max/MSP, Panoramix, the legacy Spat audio plugin, and most recently Spat Revolution.
The FLUX:: and IRCAM cooperation offers a variety of spatial audio techniques to users and designers, sharing a vision of open development. Behind these various spatialization and audio panning techniques is the desire to offer creativity, flexibility, and the ability to adapt to each application and creative challenge, whether sweet-spot-centric, live performance, or installation-based, and regardless of where the audience may be distributed.
Other resources:
An Introduction to Spat
Spat Revolution Real Time Live Engine
Custom Speaker Configuration
Orléans, France—December 2019… FLUX:: and LS Media are pleased to announce that LS Media has joined the FLUX:: Immersive Consulting Group, a coalition of companies and individuals that provide services and support for the deployment of FLUX:: Spat Revolution and other FLUX:: immersive audio products and technologies.
Based in Montreal, Canada, LS Media provides a variety of services for the professional audio market, including design assistance, system commissioning, predesign, operator and system training, and hardware testing, qualification, and support. LS Media is especially well known for its services for AVID customers.
Founded in 2006, Flux:: creates intuitive and technically innovative audio software tools, used by sound engineers and producers in the professional audio, broadcast, post-production, and mastering industries.
The French company is at the forefront of the immersive audio revolution, teaming with Paris-based research institute IRCAM to provide groundbreaking reverberation, acoustics, and spatialization technology and research.
“LS Media will still be an Avid Learning partner and will continue to provide Avid certification and commissioning,” notes LS Media Business Director Hugo Larin. “The big news is that as part of the FLUX Immersive Consulting Group, we’re now providing our services for the deployment of FLUX:: immersive technologies, including Spat Revolution. We can help with design assistance and predesign support, and we will provide consulting services when people are not quite sure where to start when putting together a FLUX:: immersive system. We’ll go all the way to hardware qualification, hardware support, and lists of what should be deployed.”
“LS Media is an essential partner in the FLUX Immersive Consulting Group,” emphasizes FLUX:: founder Gaël Martinet. “LS Media brings in-depth technical expertise and extensive experience providing high-level design, consulting, technical support, and other services for audio professionals. In particular, Hugo Larin has been a key collaborator in the FLUX:: Spat Revolution audio engine project, and he has deep roots in audio mixing, design, and operation, as well as in networked control and data distribution. With the addition of LS Media, the FLUX Immersive Consulting Group is far stronger. Customers who want to deploy Spat Revolution and our other immersive audio technologies can now access a level of services that was not previously available.”
The post LS Media Joins the FLUX:: Immersive Consulting Group appeared first on Flux::.
Saskatoon, Canada—December 2019… When indie-rock trio Close Talker launched their new LP, How Do We Stay Here?, they didn’t want to introduce it with a typical live show. The band wanted people to truly listen to the music, with no distractions, and they sought a fresh approach that would set them apart. The multi-instrumentalist trio of Matt Kopperud, Will Quiring, and Christopher Morien and front-of-house engineer Kellan Thackeray considered a listening party, with the band playing live and the audience wearing headphones. It was a good start, but they wanted to deliver something more—say, an immersive, 3D experience. But how?
While they were mulling the possibilities, Thackeray met LS Media’s Hugo Larin and Benoit Favreau, who were offering demos of the AVID S6L live mixing console. The LS Media team also showed FLUX:: Spat Revolution real-time 3D audio mixing software—and Thackeray realized he might have found the solution Close Talker was looking for.
The result of a partnership between French software developer FLUX:: and the French research institute IRCAM, Spat Revolution lets you assign incoming audio to a practically unlimited number of source objects in a virtual space. Each source object has an extensive set of parameters controlling all aspects of the source in the space, including real-time changes in position and much more, enabling the creation of complex, real-time, immersive audio experiences.
Spat Revolution source objects appear in a virtual room and can be assigned to multiple virtual rooms with different acoustic simulations and different output destinations. The results can be delivered in several formats, including as a binaural mix. Spat Revolution can also integrate, locally and over a network, with DAWs, control devices, and real-time tracking systems.
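Spat Revolution’s network integration with DAWs and control devices is typically done over OSC (Open Sound Control). As a rough illustration of the idea, the sketch below encodes minimal OSC messages that sweep a source object in a circle around the listener. The address pattern `/source/1/xyz` and the orbit parameters are illustrative assumptions, not the software’s documented namespace; consult the Spat Revolution manual for the real one.

```python
import math
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying float32 arguments."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type-tag string
    for f in floats:
        msg += struct.pack(">f", f)  # arguments are big-endian float32
    return msg

def orbit_positions(steps: int, radius: float = 2.0):
    """Yield (x, y, z) points on a horizontal circle around the listener."""
    for i in range(steps):
        a = 2.0 * math.pi * i / steps
        yield radius * math.cos(a), radius * math.sin(a), 0.0

# One packet per step; a real controller would push these over UDP at a
# steady rate (e.g. 50 Hz) to move the source smoothly in real time.
packets = [osc_message("/source/1/xyz", *p) for p in orbit_positions(8)]
```

A tracking system or DAW automation lane would drive the same messages, which is what makes the live, real-time movement described above possible.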
After consulting with Larin, Thackeray and the band realized this meant they could deliver the concert they dreamt of, in which audience members listened on headphones to an immersive binaural mix of the band’s real-time performance. “An ‘immersive binaural mix’ is a nerdy way of saying you’re in the middle of the band, and the instruments are around you, as opposed to shooting at you,” translates Close Talker’s Kopperud. “We learned that it really does add value and depth to the experience. What’s unique about it is our ability to put instruments around your head and not just in fixed positions, and we can move the instruments around your head live, in real time. What’s impressive on the tech end is that we’re able to process 3D movements around the subject in real time without latency. So it’s putting the audience in the center of the music and allowing the music to really engulf them.”
What began as one listening party turned into Close Talker’s trans-Canada “Immersion” tour. Each venue accommodates up to 64 audience members listening on Audio-Technica PRO5X headphones, which are fed by four 16-channel headphone distribution amps. Since delivery is on headphones, “Immersion” concerts can be held in venues that normally would not accommodate a live show.
“The AVID S6L console is processing all of the inputs, and we have two MADI cards in the S6L engine, so the direct outs are patched out to two MADI interfaces that are connected to two Mac mini computers running Spat,” Thackeray reveals. “It’s a fully redundant system: The minis are running in parallel, so if one fails, I can very easily switch to my backup system, which is always following all of my snapshots. The integration of the AVID with Spat Revolution is fantastic.”
In Spat Revolution, Thackeray and Close Talker spatialize the audio streams in two virtual rooms, one binaural and one stereo. Instruments they want to move around and play with are assigned to the binaural room. Kick drum, bass synth, and bass guitar are usually in the stereo room to provide a more traditional impact, although they could be processed in the binaural room for effect. Depending on placement and movement within the virtual rooms, the software’s acoustic simulation generates the appropriate early reflections, clusters, and reverb tail, creating a sense of depth and reality. The outputs of the two virtual rooms are then returned to the console, summed, and sent via the left/right main bus to the headphone distribution system. An AVID Pro Tools system handles the recording of every show and enables virtual soundchecks for refining the mixes.
“The artistic intent of making these shows a communal experience, and knowing I had the technology to do this, got me hooked right away on the project,” Thackeray declares. “Like anything new, the question was how it would integrate into the creation-to-delivery workflow. The band was given the possibility to start creating in their DAW environment, experiment, and prepare some rough spatial mix concepts, positioning and setting properties of sources as objects in the virtual space. This was done as they were monitoring in various formats or with different binaural HRTFs.”
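The binaural rendering the band was monitoring with relies on HRTF (head-related transfer function) convolution. As a very crude illustration of the underlying cues only, the sketch below places a mono signal left or right using an interaural time difference (a short per-ear delay) and an interaural level difference (a per-ear gain). The constants and the Woodworth-style delay formula are textbook approximations, not anything Spat Revolution exposes; real HRTF rendering convolves the source with measured head-related impulse responses.

```python
import math

SAMPLE_RATE = 48000
HEAD_RADIUS = 0.0875     # meters, average adult head
SPEED_OF_SOUND = 343.0   # m/s

def crude_binaural_pan(mono, azimuth_deg):
    """Pan a mono signal using only ITD + ILD cues (illustrative sketch)."""
    az = math.radians(azimuth_deg)
    # Woodworth approximation of the interaural time difference
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (abs(az) + abs(math.sin(az)))
    delay = int(round(itd * SAMPLE_RATE))  # delay for the far ear, in samples
    far_gain = 10 ** (-abs(math.sin(az)) * 6 / 20)  # up to ~6 dB level difference
    near = [s * 1.0 for s in mono] + [0.0] * delay
    far = [0.0] * delay + [s * far_gain for s in mono]
    # positive azimuth = source to the right, so the right ear is the near ear
    return (far, near) if azimuth_deg > 0 else (near, far)
```

With a source hard right (azimuth 90°), the right-ear signal arrives first and louder, while the left-ear copy is delayed by roughly 30 samples and attenuated, which is enough to create a strong lateral impression on headphones.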
The result is a unique and intimate concert experience that achieves everything the band hoped for. “‘Immersion’ is us trying to create an experience for the audience that’s different from anything they’ve experienced before,” Quiring observes. “This is the most intimate setting we could create to showcase our album exactly as we intended,” agrees Kopperud. “The philosophy of our new album and the philosophy of this project have merged at the perfect time. The whole album is an effort to describe something indescribable. It’s us wrestling with this idea of trying to define what makes something special, special. It would have been easier to just have left this as an idea that we dreamt up in the van, like, ‘some band should do this someday.’ It’s pretty cool to look back and reflect that we’re that band.”
The post Close Talker Creates Unique Immersive Concerts with Spat Revolution appeared first on Flux::.
Close Talker, an indie-rock band from Saskatchewan, Canada, virtually places their audience in the center of the stage with their silent binaural show Immersion.
The pre-production, live performance, and recording of the show were created with the AVID S6L, Pro Tools, and the Spat Revolution Immersive Audio Engine, using binaural audio and headphones for the audience.
An AVID S6L console with two stage remote I/Os handles the inputs and channel processing, while each input channel is sent post-fader direct out to two computers (main and backup) running the Spat Revolution Immersive Audio Engine.
With Spat Revolution, the mixing engineer creates the spatial scene (the 3D location of the sources, reverberation, and effects) and generates a binaural spatial audio mix, which is then returned to the mixing desk and distributed to the audience’s headphones.
The S6L integrates with Spat Revolution using a plugin that allows all the source parameters (for each individual source/object going into the software) to be accessed on the console’s encoders and used in snapshots (show automation).
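Conceptually, a console snapshot captures the current value of every exposed source parameter so it can be recalled later in the show. The sketch below is a hypothetical illustration of that capture/recall pattern; the class and parameter names are invented for clarity and are not the real S6L plugin API.

```python
# Hypothetical sketch of snapshot-style capture and recall of per-source
# spatial parameters (names are illustrative, not the actual plugin API).
class SourceSnapshotStore:
    def __init__(self):
        self.snapshots = {}

    def capture(self, name, sources):
        # deep-ish copy so later live tweaks don't mutate the stored snapshot
        self.snapshots[name] = {sid: dict(params) for sid, params in sources.items()}

    def recall(self, name):
        # return a copy so recalled values can be edited without side effects
        return {sid: dict(params) for sid, params in self.snapshots[name].items()}
```

The key design point is copying on capture: a snapshot must freeze the scene as it was programmed, even while the engineer keeps moving sources live on the encoders.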
Kellan Thackeray, FOH engineer for Close Talker, had a vision of where he wanted to take this, and had been experimenting with some technologies to get there.
“Think silent disco but the audience headphone mix is processed with Spat Revolution to give the mix that extra push – this is a seated hardwired show to minimize latency and noise. The artistic intent of making these shows a communal experience, and knowing I had the technology and a way to do this, got me hooked right away on the project.”
The audience’s headphone mix is created with cutting edge binaural mixing technology, live in real time. The sound fully immerses the audience with instrumentation and voices orbiting and swirling around them on all axes, increasing the impact and depth of the band’s already acclaimed live show.
“Like anything new, the question of how it would integrate into the creation-to-delivery workflow became key. The band was given the possibility to start the creation in their DAW environment, experiment, and prepare some rough spatial mix concepts, positioning and setting properties of sources/objects in the space. This was done as they were monitoring in various formats or with different binaural HRTFs.”
When the time arrived for the actual pre-production and band rehearsals, the source/object parameters were captured with the snapshots of the S6L system as a starting point, and the show snapshots were then programmed. A Pro Tools system handles the recording of every show and provides the ability to do virtual production (virtual soundcheck) for refining the mixes.
The post Close Talker – Immersion: A silent concert in binaural audio appeared first on Flux::.
In 2007, Flux:: started releasing its own exquisite audio software product line tailored for demanding sound engineers, and has since focused on creating intuitive and technically innovative audio software tools, used by sound engineers and producers in the professional audio, broadcast, post-production, and mastering industries all over the world.