• Green / blue screen keying & garbage masking
• Camera tracking: Mo-Sys, NCAM, Intel, HTC & more
• Volume Grading
• Live Compositing with 2D and 3D Environments
• Support for 3D Notch Blocks and USD (Universal Scene Description)
• LED wall control
• DMX lighting control based on LED wall image content
• Recording of video & metadata
• Metadata & Comp prep for VFX post
• Live Link to Unreal Engine
• SDI/NDI capture & output
• Direct GPU texture sharing with other apps
Assimilate Live FX is the first one-stop software solution designed for live compositing in multiple virtual production workflows, from quick comps for previsualization during location scouting and pre-production to final-pixel in-camera VFX in green-screen- and LED-wall-based workflows. And best of all: you don't need to be a programmer to operate it! Live FX is easy and straightforward to use, so you get stunning results, fast!
Live FX ships with all the tools needed for green screen keying, volume grading, and camera and object tracking, as well as for creating advanced composites that merge the live camera signal with CG elements, be it simple 2D textures, equirectangular footage or complete 3D environments. In LED-wall-based workflows, Live FX can auto-control the lighting in real time based on the projected image content via the DMX protocol and sync all incoming and outgoing SDI feeds.
Live FX also features a direct integration with Unreal Engine and can communicate with other applications such as Unity via the Open Sound Control protocol or direct GPU texture sharing. Camera feeds, alpha channels, full composites, as well as all dynamic metadata from the tracking system or camera SDI feed, can be recorded along the way and auto-prepped for post. Check out the various workflow models below!
• Check our user guide for Live FX
• Install the recommended video I/O drivers for AJA, Blackmagic and Bluefish444
• Watch our Live FX Hands-on Video Tutorials
• Join our Facebook Community
• Get in touch with our Support Team
Scroll down and check out various workflow examples for Live FX!
|   | Live FX | Live FX Studio |
|---|:---:|:---:|
| Green Screen Tools | ✓ | ✓ |
| Syncing of multiple Live Feeds | ✓ | ✓ |
| Software Object Tracker | ✓ | ✓ |
| NDI Capture & Output | ✓ | ✓ |
| Open Sound Control (OSC) In & Out | ✓ | ✓ |
| Source Recording & Offline Comp | ✓ | ✓ |
| Live Link to Unreal Engine | ✓ | ✓ |
| OFX Plugin Support | ✓ | ✓ |
| Matchbox Shader Support | ✓ | ✓ |
| Basic Camera Tracking | ✓ | ✓ |
| FreeD Tracking Protocol Support | ✓ | ✓ |
| Mo-Sys Camera Tracker | · | ✓ |
| Ncam Camera Tracker | · | ✓ |
| Antilatency Camera Tracker | · | ✓ |
| SCRATCH VR Included | · | ✓ |
| DMX / Art-Net Control | · | ✓ |
| Notch Block Support | · | ✓ |
| Video Router Control | · | ✓ |
| LED Wall Control | · | ✓ |
| Universal Scene Description (USD) Support | · | ✓ |
| Offline/Online Comp Conform | · | ✓ |
For pricing, see the promo pricing on the online store.
Shooting green screen starts with being able to key the green and replace it with a background of your choice. Live FX ships with HSV, RGB, vector, chroma and luma qualifiers that can be combined to generate the ultimate alpha channel. If that is all you need, great! You can output the full composite and the generated alpha channel separately via SDI, NDI or directly from the GPU via HDMI or DisplayPort, for example to Unreal Engine. If you plan on bigger things, read on...
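To illustrate the principle behind a chroma qualifier generating an alpha channel, here is a minimal sketch in Python/NumPy. This is a toy green-difference key for illustration only; it is not Live FX's actual keyer or any of its shipped qualifiers.

```python
import numpy as np

def green_screen_alpha(rgb, strength=2.0):
    """Toy chroma qualifier: alpha approaches 0 where green dominates,
    1 elsewhere. `rgb` is a float array of shape (..., 3) in [0, 1].
    Illustrative only; not Live FX's actual keying algorithm."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # How much the green channel exceeds the stronger of red and blue.
    spill = g - np.maximum(r, b)
    return 1.0 - np.clip(spill * strength, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Standard 'over' operation: foreground over background via alpha."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])
```

A pure green pixel keys fully transparent (alpha 0), so the background shows through, while neutral pixels keep alpha 1. Real keyers add spill suppression, softness and garbage masks on top of this basic idea.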
The next logical step is to replace the keyed green screen with another image. Live FX can load virtually any format, including all popular camera RAW formats, and even 360° content or complete 3D Notch Blocks, to fill the background. The background can then be scaled and positioned as needed. The camera live feed and the digital background can be color graded separately, as well as together, to create the perfect illusion. Live FX even allows the use of OFX plugins and Matchbox GLSL shaders directly inside the composite and applies them in real time. What's more, per-frame metadata from the camera SDI feed, such as focal distance, can be linked directly to any parameter in Live FX, for example to blur a texture based on the camera lens's focus position.
Time to take it up a notch: Live FX features various tracking techniques to offset the digital background based on camera movement. In many scenarios, the live object tracker is sufficient to create a parallax effect on the background image, which can be linked to the tracker. But Live FX also features a virtual camera that can be linked directly to the physical camera on set. Whether using the camera's own gyro SDI metadata or a dedicated tracking device such as Mo-Sys StarTracker, Ncam, StereoLabs, Intel RealSense or HTC Vive trackers, Live FX offers maximum flexibility to capture camera tracking information and create a virtual 3D scene using 2D and 3D elements.
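The parallax effect mentioned above follows from simple pinhole geometry: when the camera translates laterally, a layer at depth d appears to shift by f·t/d pixels, so distant layers move less than near ones. The sketch below is a generic textbook model, not Live FX's internal math, and the function name is hypothetical.

```python
def background_shift_px(cam_translation_mm, layer_depth_mm, focal_px):
    """Pinhole parallax model: the apparent on-screen shift of a layer
    at depth d when the camera translates by t, with focal length f
    expressed in pixels, is f * t / d. (Generic model, not Live FX's.)"""
    return focal_px * cam_translation_mm / layer_depth_mm
```

For example, with a 1000 px focal length, a 50 mm camera move shifts a layer 5 m away by 10 px; a layer 50 m away shifts only 1 px, which is what sells the depth illusion.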
On virtual production shoots, metadata is a precious commodity. That's why Live FX records any and all metadata from the tracking device, the camera SDI feed, or even user input such as scene and take info, annotations and more into sidecar files that are ready to use in VFX post. Users can record the live camera feed untouched, or the complete composite, to ProRes, DNx MXF or H.264. If the recording format allows for an alpha channel (such as ProRes 4444), the generated alpha channel is recorded as well!
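To make the sidecar idea concrete, here is a sketch of serializing per-frame tracking metadata alongside scene and take info. The JSON schema shown is entirely hypothetical; Live FX writes its own sidecar format, which this does not attempt to reproduce.

```python
import json

def write_sidecar(path, clip_name, scene, take, frames):
    """Write per-frame metadata to a JSON sidecar next to a recorded clip.

    `frames` is a list of per-frame metadata dicts (e.g. pan/tilt/roll).
    Hypothetical schema for illustration; not Live FX's actual format."""
    doc = {
        "clip": clip_name,
        "scene": scene,
        "take": take,
        "frames": [{"frame": i, **meta} for i, meta in enumerate(frames)],
    }
    with open(path, "w") as f:
        json.dump(doc, f, indent=2)
```

The key property any such sidecar needs is the frame index on every record, so that post can line tracking samples back up with the recorded picture.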
Live FX allows for a unique and highly efficient workflow towards post: Start with the live composite and record the raw feeds and metadata separately. As soon as the recording stops, Live FX automatically creates an offline composite that is ready for instant playback and review. The offline composite is a duplicate of the live composite, using the just recorded files instead of the live signals. It also includes all other elements that were used in the live composite and provides the recorded animation channels for further manipulation. Once the high resolution media is offloaded from the camera, Live FX automatically assembles the online composite and replaces the on-set recorded clips with the high quality camera raw media. These online composites can be loaded into Assimilate’s SCRATCH® to create metadata-rich dailies or VFX plates in the OpenEXR format, including all the recorded frame-based metadata.
Assimilate Live FX can load footage at any resolution and any frame rate and play it out to an LED wall. That includes 2D elements such as a simple PNG file, a QuickTime file or any suitable camera RAW material, as well as 360° material, for which Live FX can even animate the field of view based on camera tracking data. But it doesn't stop there: Live FX lets you load complete 3D environments in the form of Notch Blocks and tie the virtual camera to the physical camera on set!
Whether you're using Live FX to send content to the LED wall or just piping through content from another source such as Unreal Engine, Live FX ships with a complete finishing toolset, allowing you to live-grade not only the volume itself but also the live camera signal, in order to merge it perfectly with the digital background. Through its intuitive layer stack, Live FX lets you add an unlimited number of layers to bring other 2D and 3D elements into the scene with just a couple of clicks.
With LED-wall-based workflows, camera tracking becomes paramount. Live FX offers a number of methods to track your camera, depending on budget and technology. Whether you're using Mo-Sys StarTracker, Ncam, StereoLabs, HTC Vive trackers, Intel RealSense or simply your smartphone with its internal gyro sensor or an ARKit app, Live FX is ready for any budget and on-set situation. But even without a dedicated tracking device, Live FX can perform accurate camera tracking by reading the pan, tilt and roll information carried in the camera's live SDI signal.
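Many of these tracking sources speak the FreeD protocol listed in the feature table, which streams compact binary camera-pose packets. As a sketch of what such a packet carries, here is a parser for the common 29-byte D1 message, based on the published FreeD layout (signed 24-bit big-endian fields; angles scaled by 32768, positions by 64). Treat field details as an assumption to verify against your tracking vendor's documentation.

```python
def _s24(b):
    """Decode a signed 24-bit big-endian integer from 3 bytes."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & (1 << 23) else v

def parse_freed_d1(pkt):
    """Parse a 29-byte FreeD type-D1 camera pose packet.

    Assumed layout (per the FreeD spec): 0xD1 marker, camera ID,
    pan/tilt/roll as degrees * 32768, x/y/z as millimetres * 64,
    zoom/focus/user bytes, then a checksum byte."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        "pan_deg":  _s24(pkt[2:5])  / 32768.0,
        "tilt_deg": _s24(pkt[5:8])  / 32768.0,
        "roll_deg": _s24(pkt[8:11]) / 32768.0,
        "x_mm": _s24(pkt[11:14]) / 64.0,
        "y_mm": _s24(pkt[14:17]) / 64.0,
        "z_mm": _s24(pkt[17:20]) / 64.0,
    }
```

In practice such packets arrive over UDP or serial at the camera frame rate, and the decoded pose drives the virtual camera.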
The main job in a virtual production setup is to merge live imagery with CG elements. Live FX ships with all the tools professional artists are familiar with to create stunning composites. Besides the color tools, such as color wheels, curves, a vector grid and the ability to load lookup tables, Live FX features powerful keyers and allows different keys to be combined through layers and a node tree. Unparalleled format support lets you import virtually any kind of footage, with or without alpha channels, at any resolution, any frame rate and even in 360° equirectangular. The hardware underneath is the only limit to what Live FX can do!
While Live FX has been designed as a one-stop application, it provides a number of hooks for integration into existing virtual production workflows. Live FX can be fully controlled through Open Sound Control (OSC) and can also send metadata, such as playback controls, camera positional data and more, via OSC to other apps. Through a dedicated live link, Live FX integrates directly with Unreal Engine, and if the two run on the same machine, Live FX allows direct texture sharing on the GPU for zero-latency image exchange!
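OSC itself is a small binary format: a null-padded address string, a type-tag string, then big-endian arguments. The stdlib-only encoder below sketches that wire format; the `/livefx/...` address in the usage line is a made-up example, since Live FX's actual OSC address space is documented in its user guide.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes (OSC string rule)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message with int32, float32 and string args."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("booleans not handled in this sketch")
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Usage sketch (hypothetical address path, not Live FX's real one):
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(osc_message("/livefx/play", 1), ("127.0.0.1", 8000))
```

Because OSC rides on plain UDP, any application that can emit packets like these can drive or listen to an OSC-enabled app.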
Not only does the content of the LED wall have to be controlled; the lighting inside the studio also has to play along with the projected content in order to create the perfect illusion. Live FX supports a multitude of light panels through the DMX protocol and can control them based on image content. The LED wall shows a flickering fire? No problem: Live FX will let your ARRI SkyPanels flicker along at the same frequency and hue. The car chase scene enters a tunnel and the lights need to go dark at the same time? Not a problem either.
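DMX levels are commonly carried over the network via Art-Net, whose ArtDmx packet is simple enough to sketch with the stdlib. The builder below follows the published Art-Net framing; the image-to-levels mapping function is a toy stand-in, since Live FX's internal mapping from wall content to fixture levels is not public.

```python
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an ArtDmx packet (Art-Net opcode 0x5000) carrying DMX levels."""
    if not 2 <= len(channels) <= 512 or len(channels) % 2:
        raise ValueError("DMX payload must be an even length, 2-512 bytes")
    return (b"Art-Net\x00"
            + struct.pack("<H", 0x5000)        # OpCode, little-endian
            + struct.pack(">H", 14)            # protocol version, big-endian
            + bytes([sequence, 0])             # sequence, physical port
            + struct.pack("<H", universe)      # universe, little-endian
            + struct.pack(">H", len(channels)) # data length, big-endian
            + channels)

def warm_flicker_levels(mean_r, mean_g, mean_b):
    """Toy mapping from the wall's mean color to a 4-channel fixture
    (illustrative only; not Live FX's image-to-DMX mapping)."""
    return bytes([int(mean_r), int(mean_g), int(mean_b), 0])
```

Sampling the wall image each frame, mapping it to levels, and sending the packet over UDP (typically port 6454) is the essence of image-driven lighting: the fixtures inherit the wall's frequency and hue automatically.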
Capturing metadata has always been Assimilate's strong suit. In virtual production workflows, capturing metadata is a must, and Live FX makes it as easy as can be. Whether it's live SDI metadata from the camera itself, tracking data from Intel RealSense, or user input in the form of on-screen annotations or scene and take info, every piece of metadata is captured along the way, ready to use in post!
With Notch Builder, you can create amazing motion graphics and interactive VFX in real time. These 3D environments can be exported as a Notch Block and imported as a virtual background into Live FX Studio*. Inside the Live FX Studio project, you can link the camera tracking information from the physical camera to the virtual camera inside the Notch Block. The output from the Notch Block camera is then used to replace the green screen or feed the LED wall. Live FX Studio can animate more than just the camera position and FOV: any object or property of the scene is adjustable, or can be linked to an incoming live parameter. Notch Blocks can also have any number of inputs that Live FX Studio can feed textures or plugins into, in order to create the perfect live composite.
*requires a Notch Playback license, which can be obtained from the Notch website.
OS: Windows 7 / 10. macOS 10.9 and up.
CPU: Any modern Apple, Intel or AMD processor.
GFX: Any modern graphics card; for high-end work, an NVIDIA Quadro or AMD Radeon PRO is preferred. Note that on systems with standard Intel graphics, not all features may be supported.
RAM: Min. 8 GB; 12 GB or more preferred.
SDI (optional): AJA, Blackmagic, Bluefish444.
See detailed system requirements here.
Each license comes with full access to the latest version and all updates of the software, as well as access to our technical support team.
Note that a permanent license comes with one year of support. After that, you can continue to use the software, but to remain eligible for further software updates or access to our technical support team, you need to extend your support contract.
A site license offers you an unlimited number of licenses to be used within your facility. Please contact sales for more info.