Welcome to the free Open Beta of Live FX!

Assimilate Live FX is the first one-stop software solution designed for live compositing across multiple virtual production workflows: from previsualization in location scouting and quick pre-production comps to final-pixel in-camera VFX in green-screen- and LED-wall-based workflows. And best of all: you don't need to be a programmer to operate it! Live FX is easy and straightforward to use, so you'll get stunning results, really fast!



Get Live FX

In a nutshell - what does it do?

• Green / blue screen keying & garbage masking
• Camera tracking: Mo-Sys, Intel, HTC & more
• Volume Grading
• Live Compositing with 2D and 3D Environments
• Support for 3D Notch Blocks
• LED wall control
• DMX lighting control based on LED wall image content
• Recording of video & metadata
• Metadata & Comp prep for VFX post
• Live Link to Unreal Engine
• SDI/NDI capture & output
• Direct GPU texture sharing with other apps

Wanna know more?

Scroll down and check out various workflow examples for Live FX!


Yes – we want your feedback!

Have you put Live FX through its paces and have suggestions for making your workflow better? Missing a feature? Ran into an issue? Don't hesitate to let us know using the form below!

Live compositing - green screen based or LED wall driven



What does it do?

Live FX ships with all the tools needed for green screen keying, volume grading, and camera and object tracking, and lets you create advanced composites merging the live camera signal with CG elements, be they simple 2D textures, equirectangular footage or complete 3D environments. In LED-wall-based workflows, Live FX can automatically control the stage lighting in real time via the DMX protocol, based on the projected image content, and sync all incoming and outgoing SDI feeds.

Connect to Unreal Engine - and more!

Live FX also features a direct integration with Unreal Engine and can communicate with other applications such as Unity via the Open Sound Control protocol or direct GPU texture sharing. Camera feeds, alpha channels, full composites, and all dynamic metadata from the tracking system or camera SDI feed can be recorded along the way and automatically prepped for post. Check out the various workflow models below!


Start easy - advance fast

Shooting green screen starts with being able to key out the green and replace it with a background of your choice. Live FX ships with HSV, RGB, vector, chroma and luma qualifiers that can be combined to generate the ultimate alpha channel. If that is all you need, great! You can output the full composite and the generated alpha channel separately via SDI, NDI or directly from the GPU via HDMI or DisplayPort to, for example, Unreal Engine. If you plan on bigger things, read on...
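Live FX's qualifiers are combined internally, but the basic idea behind a chroma key is easy to illustrate. The sketch below is a toy matte, not Live FX's actual keyer: it derives an alpha value from how strongly green dominates a pixel, with a soft transition band (threshold and softness values are made up for the example).

```python
def chroma_key_alpha(r, g, b, threshold=0.1, softness=0.1):
    """Toy green-screen matte: alpha from how much green exceeds
    the stronger of red/blue, with a soft transition band."""
    spill = g - max(r, b)          # green dominance of this pixel
    if spill <= threshold:
        return 1.0                 # foreground: fully opaque
    if spill >= threshold + softness:
        return 0.0                 # screen pixel: fully transparent
    # linear ramp inside the softness band
    return 1.0 - (spill - threshold) / softness

# A saturated green pixel keys out; a skin-tone pixel stays opaque.
print(chroma_key_alpha(0.1, 0.9, 0.1))   # -> 0.0
print(chroma_key_alpha(0.8, 0.6, 0.5))   # -> 1.0
```

Real qualifiers add spill suppression, edge refinement and garbage masks on top of this core test, which is why combining several of them per shot pays off.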


Green screen replacement & Grading

The next logical step is to replace the keyed green screen with another image. Live FX can load virtually any format, including all popular camera RAW formats and even 360° content or complete 3D Notch Blocks, to fill the background. The background can then be scaled and positioned as needed. The live camera feed and the digital background can be color graded separately, as well as together, to create the perfect illusion. Live FX even lets you use OFX plugins and Matchbox GLSL shaders directly inside the composite and applies them in real time. What's more, per-frame metadata from the camera SDI feed, such as focal distance, can be linked directly to any parameter in Live FX, for example to blur a texture based on the lens' focus position.
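Conceptually, that metadata link is a per-frame remapping of one value range onto another. The sketch below shows the idea with a made-up example (the ranges and the focus-to-blur pairing are hypothetical, not Live FX's actual link configuration):

```python
def map_metadata(value, src_range, dst_range, clamp=True):
    """Linearly remap a per-frame metadata value (e.g. focal distance
    in meters) onto a parameter range (e.g. blur radius in pixels)."""
    (s0, s1), (d0, d1) = src_range, dst_range
    t = (value - s0) / (s1 - s0)     # normalize into 0..1
    if clamp:
        t = min(max(t, 0.0), 1.0)    # keep the parameter in its legal range
    return d0 + t * (d1 - d0)

# Hypothetical link: focus racked to 2 m on a 1-10 m scale
# drives a background blur between 0 and 30 px.
blur = map_metadata(2.0, (1.0, 10.0), (0.0, 30.0))
```

Because the mapping is evaluated every frame, racking focus on the physical lens continuously animates the linked parameter.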

Object- and camera tracking

Time to take it up a notch: Live FX features several tracking techniques to offset the digital background based on camera movement. In many scenarios, the live object tracker is sufficient to create a parallax effect on the background image, which can be linked to the tracker. But Live FX also features a virtual camera that can be linked directly to the physical camera on set. Whether using the camera's own gyro SDI metadata or a dedicated tracking device such as Mo-Sys StarTracker, Intel RealSense or HTC Vive trackers, Live FX offers maximum flexibility to capture camera tracking information and create a virtual 3D scene using 2D and 3D elements.
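The parallax cue itself comes down to simple geometry: when the camera translates, layers that are farther away shift less on screen. This toy calculation (the focal length in pixels and the layer depths are invented for the example) shows why linking a background layer's position to the tracker sells the depth illusion:

```python
def background_offset(cam_x, layer_depth, focal_px):
    """Horizontal screen-space shift (in pixels) of a background
    layer when the camera translates sideways by cam_x meters.
    Farther layers move less -- the classic parallax cue."""
    return -cam_x * focal_px / layer_depth

# A 0.5 m dolly move with a hypothetical 1000 px focal length:
near = background_offset(0.5, 5.0, 1000.0)    # layer 5 m away shifts 100 px
far  = background_offset(0.5, 50.0, 1000.0)   # layer 50 m away shifts 10 px
```

A full virtual camera generalizes this to rotation and all three translation axes, which is what the dedicated tracking devices feed into Live FX.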

Record everything - incl. metadata

On virtual production shoots, metadata is a precious commodity. That's why Live FX records any and all metadata from the tracking device, camera SDI or even user input such as scene and take info and annotations into sidecar files that are ready to use in VFX post. Users can record the live camera feed untouched, or the complete composite, to ProRes, DNx MXF or H.264. If the recording format allows for an alpha channel (such as ProRes 4444), the generated alpha channel is recorded as well!
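The sidecar pattern is simple: per-frame metadata is written next to the recorded clip under the same base name. The sketch below uses an illustrative JSON layout and made-up field names, not Live FX's actual sidecar schema:

```python
import json
import os
import tempfile

def write_sidecar(clip_path, frames):
    """Write per-frame metadata next to a recorded clip as a JSON
    sidecar (illustrative layout, not Live FX's actual schema)."""
    sidecar = os.path.splitext(clip_path)[0] + ".json"
    with open(sidecar, "w") as fh:
        json.dump({"clip": os.path.basename(clip_path),
                   "frames": frames}, fh, indent=2)
    return sidecar

# One frame of hypothetical tracking + slate metadata:
frames = [{"frame": 1, "scene": "12A", "take": 3,
           "cam_pan_deg": 1.8, "focal_distance_m": 2.4}]
path = write_sidecar(os.path.join(tempfile.gettempdir(), "shot_001.mov"),
                     frames)
```

Keeping the metadata in a plain, frame-indexed file alongside the media is what lets post tools re-associate tracking and slate data with the footage automatically.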

Prep for VFX & Post Production

Live FX allows for a unique and highly efficient workflow towards post: Start with the live composite and record the raw feeds and metadata separately. As soon as the recording stops, Live FX automatically creates an offline composite that is ready for instant playback and review. The offline composite is a duplicate of the live composite, using the just recorded files instead of the live signals. It also includes all other elements that were used in the live composite and provides the recorded animation channels for further manipulation. Once the high resolution media is offloaded from the camera, Live FX automatically assembles the online composite and replaces the on-set recorded clips with the high quality camera raw media. These online composites can be loaded into Assimilate’s SCRATCH® to create metadata-rich dailies or VFX plates in the OpenEXR format, including all the recorded frame-based metadata.

Creating real time in-camera VFX

Assimilate Live FX can load any footage at any resolution and frame rate and play it out to an LED wall: 2D elements such as a simple PNG, a QuickTime file or any suitable camera RAW material. This includes 360° material, for which Live FX can even animate the field of view based on camera tracking data. But it doesn't stop there: Live FX can load complete 3D environments in the form of Notch Blocks and tie the virtual camera to the physical camera on set!

Livegrading the Volume

Whether you're using Live FX to send content to the LED wall or just piping through content from another source such as Unreal Engine, Live FX ships with a complete finishing toolset, letting you live-grade not only the volume itself but also the live camera signal in order to merge it perfectly with the digital background. Through its intuitive layer stack, Live FX lets you add an unlimited number of layers to bring other 2D and 3D elements into the scene with just a couple of clicks.

Camera Tracking for everyone

In LED-wall-based workflows, camera tracking becomes paramount. Live FX supports a number of methods to track your camera, depending on budget and technology. Whether you're using Mo-Sys StarTracker, HTC Vive trackers, Intel RealSense or simply your smartphone with its internal gyro sensor or an ARKit app, Live FX is ready for any budget and on-set situation. And even without a dedicated tracking device, Live FX can perform accurate camera tracking by reading the pan, tilt and roll information embedded in the camera's live SDI signal.

Previz and beyond

The main job in a virtual production setup is to merge live imagery with CG elements. Live FX ships with all the tools that professional artists are familiar with to create stunning composites. Besides color tools such as color wheels, curves, a vector grid and the ability to load lookup tables, Live FX features powerful keyers and lets you combine different keys through layers and a node tree. Unparalleled format support lets you import virtually any kind of footage, with or without alpha channels, at any resolution and frame rate, and even in 360° equirectangular. The hardware underneath is quite literally the limit of what Live FX can do!

Live FX is not an island!

While Live FX has been designed as a one-stop application, it provides a number of hooks for integration into existing virtual production workflows. Live FX can be fully controlled through Open Sound Control (OSC) and can also send metadata such as playback controls and camera positional data via OSC to other apps. Through a dedicated live link, Live FX integrates directly with Unreal Engine, and if the two run on the same machine, Live FX allows for direct texture sharing on the GPU for zero-latency image exchange!
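OSC itself is a tiny UDP-friendly wire format: a null-padded address string, a type-tag string, then big-endian arguments. The sketch below encodes a minimal OSC message from scratch using only the standard library; the `/livefx/camera/pan` address is hypothetical, so consult the Live FX OSC reference for the real address space.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC message carrying float32 arguments."""
    msg = osc_pad(address.encode())               # padded address pattern
    msg += osc_pad(b"," + b"f" * len(floats))     # type tags, e.g. ",f"
    for f in floats:
        msg += struct.pack(">f", f)               # big-endian float32
    return msg

# Hypothetical address -- not taken from the Live FX documentation.
packet = osc_message("/livefx/camera/pan", 12.5)
# Sending is one UDP datagram, e.g.:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, port))
```

In practice a library such as python-osc handles this encoding for you; the point is that any app that can emit UDP datagrams in this shape can drive or listen to Live FX.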

Did someone say DMX control?

It's not only the content of the LED wall that has to be controlled: the lighting inside the studio has to play along with the projected content to create the perfect illusion. Live FX supports a multitude of light panels through the DMX protocol and can control them based on image content. The LED wall shows a flickering fire? No problem: Live FX will let your ARRI SkyPanels flicker along at the same frequency and hue. The car chase enters a tunnel and the lights need to go dark at the same moment? Not a problem either.
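A common way to put DMX channel values on the network is Art-Net, one of the standard DMX-over-IP transports. The sketch below builds an ArtDmx packet following the public Art-Net 4 layout and drives a dimmer channel with a simple sine-based flicker; it is an illustration of the technique, and whether Live FX uses Art-Net, sACN or a direct interface for a given fixture depends on your setup.

```python
import math
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build an Art-Net ArtDmx packet carrying up to 512 DMX channels
    (sketch of the public Art-Net 4 layout)."""
    # DMX payload must have an even length per the spec
    data = channels + (b"\x00" if len(channels) % 2 else b"")
    return (b"Art-Net\x00"                 # packet ID
            + struct.pack("<H", 0x5000)    # OpDmx opcode, little-endian
            + struct.pack(">H", 14)        # protocol revision, big-endian
            + bytes([sequence, 0])         # sequence, physical port
            + struct.pack("<H", universe)  # 15-bit port address
            + struct.pack(">H", len(data)) # channel count
            + data)

def fire_flicker(base: int, phase: float) -> int:
    """Map a 0..1 'flame' phase onto a DMX dimmer value around a base level."""
    return max(0, min(255, int(base + 60 * math.sin(phase * 2 * math.pi))))

# Channel 1 flickers; channels 2-3 hold hypothetical hue values.
pkt = artdmx_packet(0, bytes([fire_flicker(180, 0.25), 255, 40]))
# The packet would then go out as a UDP datagram, typically to port 6454.
```

Live FX's image-based control effectively recomputes values like these every frame from the wall content, so the fixtures stay in lockstep with the picture.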

Metadata, metadata and nothing but metadata

Capturing metadata has always been Assimilate's strong suit. In virtual production workflows, capturing metadata is a must, and Live FX makes it as easy as can be. Whether it's live SDI metadata from the camera itself, tracking data from Intel RealSense, or user input in the form of on-screen annotations or scene and take info, anything and everything metadata is captured along the way, ready to use in post!

What you need to run Live FX

OS: Windows 7 / 10, macOS 10.9 and up.
CPU: Any modern Intel or AMD processor; an Intel i7 quad-core or better is preferred.
GFX: Any modern graphics card; high-end graphics (NVIDIA Quadro / AMD Radeon PRO) preferred. Note that on systems with standard Intel graphics, not all features may be supported.
RAM: Minimum 8 GB; 12 GB or more preferred.
SDI (optional): AJA, Blackmagic.

See detailed system requirements here.

Learn to Use Live FX to the Fullest

Learn more about Live FX by watching one of the many tutorials available here or visit our Vimeo page.

Browse and search the user manual on our support site here or contact support at support@assimilateinc.com.

Pick My Plan

Live FX will be available as:

  • a monthly automatic recurring subscription
  • a 1 year rental
  • a permanent license (incl. 1 year of support & updates)
  • support renewal with a permanent license
  • a site license *

Each license comes with full access to the latest version and all updates of the software, as well as access to our technical support team.

Note that a permanent license comes with 1 year of support. After that, you can continue to use the software, but to remain eligible for further software updates and access to our technical support team, you need to extend your support contract.

* A site license offers you an unlimited number of licenses to be used within your facility. Please contact sales for more info.

