LightAct Announces Support for AJA KONA Cards

Ljubljana, Slovenia and Grass Valley, CA (August 25, 2020)

Today LightAct announced that LightAct media server software for video, light and sound is now compatible with AJA’s complete line of KONA I/O cards. With the new integration, LightAct customers can use AJA KONA cards to ingest live video feeds via SDI or HDMI into LightAct’s media server software and combine them with real-time or pre-rendered graphics for display via projectors or LED screens.

Using KONA cards with LightAct software, AV professionals can visualize, program and run projection mappings and immersive installations that fuse live video with graphics. LightAct features an intuitive web-based UI that makes it easy to modify content, and can be purchased as a stand-alone license, starting at $199 USD, for use with supported hardware, or alongside a LightAct Solo or Sync rack media server. The media server software is Windows-compatible and integrates with the Unreal Engine, Unity and Notch real-time render engines. It has been used to create a number of high-profile projects, such as a 3D projection-mapped model of the city of Tampa and the interactive Funan Atrium installation.

“LightAct customers are dedicated to delivering the highest caliber content, whether for projection mapping or interactive installations, and the integration with AJA KONA cards makes it much easier. The device provides flexible connectivity for high quality capture at an accessible price,” shared Mitja Prelovšek, LightAct. “We’re thrilled with the outcome of our work with AJA, and impressed by their guided approach from beginning to end.”

“Alongside the growing demand for video, projection mapping and immersive installations have gained traction in the last few years, and LightAct is a technology innovator in the space. We’re always looking at how we can extend product compatibility so that our customers can use the solutions they want, and are excited to expand that support to LightAct for KONA customers,” shared Nick Rashby, president, AJA Video. “It’s been a pleasure working with LightAct on the integration, and we look forward to seeing how customers harness the technology in the future.”

About LightAct

LightAct® has been a leading developer of media server technology since 2010. As the first media server to fully embrace real-time and interactive technology, it has built an impressive track record of eye-popping and jaw-dropping multi-media installations around the world. While consistently pushing the boundaries of what is possible, LightAct’s mission includes the democratization of advanced projection mapping, virtual production and interactive installations, which is reflected in the flexibility and affordability of its software and hardware.

About AJA Video Systems, Inc.

Since 1993, AJA Video has been a leading manufacturer of video interface technologies,
converters, digital video recording solutions and professional cameras, bringing high-quality,
cost-effective products to the professional broadcast, video and post production markets. AJA
products are designed and manufactured at our facilities in Grass Valley, California, and sold
through an extensive sales channel of resellers and systems integrators around the world. For
further information, please see our website at

Epic Games supports LightAct with Epic MegaGrant

Epic Games, the company behind Fortnite and Unreal Engine, has awarded LightAct an Epic MegaGrant!

Epic Games awarded LightAct an Epic MegaGrant to further the success of LightAct’s close integration with Unreal Engine, Epic’s real-time render engine.

This grant will allow LightAct to improve existing tools and introduce several new features enabling the use of Unreal Engine in advanced multi-media events & fixed installs.

Mitja Prelovšek, the CEO of LightAct, said: “We are extremely excited and grateful to be a recipient of an Epic MegaGrant. As shown by a number of projects combining LightAct with Unreal Engine, we have been pushing the adoption of Unreal Engine in fixed installations & live events for several years now, and receiving this MegaGrant will allow us to accelerate the development of tools bringing these two platforms together. We cannot wait to show our users all the exciting new features we have planned!”

About LightAct

LightAct® is a media server solution, created by VISIBLE d.o.o., a creative technology company based in Ljubljana, Slovenia.

LightAct is behind some of the most advanced multi-media projects around the world, combining projection mapping with complex video and lighting installations.

About Unreal Engine

Epic Games’ Unreal Engine technology brings high-quality games to PC, console, mobile, AR and VR platforms. Creators also use Unreal for photorealistic visualization, live events, multi-media installations, interactive product design, film, virtual production, mixed reality TV broadcast and animated entertainment. 

About Epic Games

Founded in 1991, Epic Games is the creator of Fortnite, Unreal, Gears of War, Shadow Complex, and the Infinity Blade series of games. Epic’s Unreal Engine technology, which brings high-fidelity, interactive experiences to PC, console, mobile, AR, VR and the Web, is freely available at . The Epic Games store offers a handpicked library of games, available at

Projection of Unreal Engine content with LightAct 3.6.0

When we introduced Unreal Engine integration back in 2018, we were blown away by the endless possibilities this combination offers: the beauty and performance of Unreal Engine real-time rendering combined with LightAct’s media server functionality.

A large majority of projects using Unreal Engine & LightAct are projection mapping projects (you can have a look at some of them in our showcase).

In all of these projects, the content flow is this:

  1. LightAct takes all the incoming information and passes the data to Unreal Engine.
  2. Unreal Engine takes the data and renders the content, then sends it back to LightAct using Spout.
  3. LightAct grabs the Unreal content, applies all the required warping & blending and pushes it to the projectors.
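The three steps above can be sketched as a plain loop. Every function name here is a hypothetical stand-in, not LightAct's or Unreal's actual API, and Spout (a GPU texture-sharing framework) is modelled as a simple dictionary:

```python
def gather_inputs():
    """Step 1 (stand-in): LightAct collects incoming data (sensors, OSC, timelines...)."""
    return {"timecode": 1.25, "tracking": [(0.4, 0.7)]}

def unreal_render(inputs):
    """Step 2 (stand-in): Unreal Engine renders a frame from the data and
    shares the texture back, here represented as a plain dict instead of Spout."""
    return {"texture": "frame_0001", "inputs": inputs}

def warp_blend_output(frame, projectors=2):
    """Step 3 (stand-in): LightAct warps and blends the texture per projector."""
    return [f"{frame['texture']} -> projector {i}" for i in range(projectors)]

outputs = warp_blend_output(unreal_render(gather_inputs()))
print(outputs)  # one warped/blended feed per projector
```

The round trip in step 2 is exactly the part that UnrealLink, described below in this post, lets you skip.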

While this pipeline works well, we found it has one drawback: sending the content from Unreal back to LightAct consumes a significant amount of computer resources and adds a few frames of latency.

So we thought: “Wouldn’t it be great if our users could push the content to projectors straight from Unreal Engine if they wanted?”.

That’s why we are very happy to introduce UnrealLink – a new feature in LightAct 3.6.0.

UnrealLink allows you to create nDisplay configuration files from LightAct. The files include information about all the projectors (their intrinsic and extrinsic properties, their blend masks, and their mapping to physical outputs) as well as the IP addresses of the servers.
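For readers unfamiliar with nDisplay, its UE 4.x configuration is a plain-text file of sections describing cluster nodes, windows, viewports and screens. The fragment below is an illustrative hand-written sketch of that format, not actual UnrealLink output; every id, address, resolution and transform is made up:

```
[info] version=23
[cluster_node] id=node_1 addr=192.168.0.101 window=wnd_1 master=true
[window] id=wnd_1 viewports=vp_1 fullscreen=true WinX=0 WinY=0 ResX=1920 ResY=1080
[viewport] id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_1
[projection] id=proj_1 type=simple screen=scr_1
[screen] id=scr_1 loc="X=1.5,Y=0,Z=0" rot="P=0,Y=0,R=0" size="X=1.77,Y=1"
[camera] id=camera_static loc="X=0,Y=0,Z=1.7"
```

UnrealLink's generated files fill in these kinds of entries from the projector calibration data already stored in LightAct.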

The UnrealLink workflow is very simple:

  1. Import the 3D model of the projection object into LightAct. Its location, rotation and scale should match those in Unreal Engine.
  2. Calibrate and blend the projectors in LightAct using 3DCal or CamCal.
  3. Create nDisplay configuration files with UnrealLink.
  4. Use nDisplay Launcher to launch your packaged game using these configuration files.

UnrealLink does not replace the traditional workflow where you stream the content back to LightAct. In fact, in some cases you will still want to stream the content back to LightAct.

LightAct 3.5.0 is out with CamCal and many other features

LightAct 3.5.0 is out! It includes so many new features that it is hard to choose the ones we are most excited about. But if we absolutely had to pick one, we’d probably go with the CamCal camera calibration feature, closely followed by new integrations such as AJA, Blackmagic, Deltacast and Stype.


CamCal

CamCal is a camera calibration toolkit. It allows you to quickly and automatically calibrate projectors.

In the last few projects we worked on, projector calibration was definitely the most tedious task, despite the 3DCal feature. What’s more, we knew that every time the projectors were moved, the process would have to be repeated.

So, after the Tampa Water Street project, we immediately started working on a camera calibration feature. Now it is finally here, and you can see how it works in the video below.

The only thing you need to do manually is calibrate the camera(s), and this has to be done only at the beginning of the project (or if you move the camera).

When the projectors have been moved, all you need to do is run the CamCal algorithm again and the projectors will be recalibrated automatically.

New integrations (video capture and camera tracking)

We are very excited to have added support for AJA, Blackmagic Design and Deltacast video capture cards, as well as Stype camera tracking.

Custom events

We added the ability to create Custom events. They act similarly to variables, but instead of a value, they save lifeline state. They are especially useful for reducing the number of connections between nodes and for transferring events from one layer to another.


QuickUI

QuickUI is a window where you can add quick, accessible controls for your variables and events. It’s a completely customizable tool to control all the essential parameters of your show.

You can easily add a variable or an event by ticking the “Expose in QuickUI” checkbox in its properties.

Perspective Thrower

We added the Perspective Thrower, a tool that throws content onto video screens just as a projector would, complete with lens shifts and field of view. It works perfectly with the Stype camera tracking node.

50% Off until the End of April

We hope you are staying healthy in these challenging times.

The epidemic and the lockdowns have had a profound effect on everyone working in the AV industry, and the live-events sector is among the most affected. It’s not much of an event if it’s limited to 5 people (or whatever the limit is in your country), is it?

When the epidemic started to shut down one country after another, we asked ourselves what we could do, as a company, to help.

Besides signing up all of our spare GPU time to Folding@home (a distributed computing project searching for a cure for the virus) and, obviously, switching to working from home, we thought of cutting our prices.

We imagine that a lot of you might have more time on your hands than you would like, and some of you might consider using it to polish your skills in the tools of your choice. So, until the end of April, the prices of all Creator licences are cut by 50%. To apply the discount, just enter the ‘timetolearn’ coupon code on our website.

Stay healthy,
everyone at LightAct

Make Magic with Light

LightAct has a new Logo

LightAct has a new logo. Gone is the green ‘LA’, and in comes a new logo which, we believe, tells the story of LightAct in a much bolder way. At the same time, it mitigates all of the problems the old logo had.

We’ve also slightly modified the name from Lightact to LightAct (it was our initial spelling, actually), which, we believe, makes it visually much better balanced.

The Background

LightAct was launched back in 2010. That’s when we introduced the green ‘LA’ sign. As with any young company, the roads we would take were, to say the least, quite open.

Through the years, though, as it became clearer what LightAct is and especially what it isn’t, we felt we needed a logo that would better express the uniqueness of LightAct.

We feel the new logo does just that.

It represents the ever-evolving nature of most of LightAct’s projects, mimics the look of nodes and their connections, and has a much bolder look than our previous logo.

We sincerely hope you like our new look!

Making of Funan Interactive Games

Funan Interactive Games is a project that challenged us in many ways. It pushed us to introduce some new products and caused some of Lightact’s features to mature. In this article, we’ll explain the process that led to the successful completion of the project in May 2019.

But first, let’s go through what this project is all about. To do that, you can either head to the project page or just watch the project video below.

The task that our Client, Hexogon Solution, gave us was to provide tracking, content and the media server for an 11-by-12-meter interactive projection area on the floor of Funan, an iconic shopping mall in Singapore that was rebuilt from the ground up.

When you need to track people on a large floor surface, you usually use infrared scanners. But in our case this wasn’t possible, because we couldn’t install anything on the floor. The only option we had was to install some kind of tracking camera above the floor.

So the first task we had to do was to choose the right camera technology.


It was clear from the beginning that, as we are projecting dynamic content onto the very area we need to track, we couldn’t use any kind of standard video camera, because the projected light would disturb the tracking. So the next technology we investigated was infrared time-of-flight, of which Microsoft Kinect and Intel RealSense are the most prominent examples.

The ceiling above the projection area is quite high. One section is 9 m above the area, but it’s offset from the center quite a lot, which means the camera would look at the tracking area from quite a steep angle, reducing the precision of the tracking.

There’s another section of the ceiling that’s positioned more centrally above the projected area, but it is 13 m high. As the range of Kinect and RealSense is much shorter than that, we had to look for another solution.

Next, we turned to thermal cameras. Their price, though, was prohibitively high, and besides that, we couldn’t find a camera with a wide enough view angle. As the installation height is around 13 m and the projection area is 11 by 12 m, you need a pretty wide view angle to effectively cover the whole area.

So that didn’t work either.

Stereo Vision

Then, we looked at stereo vision technology. It works the same way as our eyes do: by comparing the difference between images taken from 2 parallel cameras that are slightly offset. The advantage of this technology is that there is a myriad of lens options, so you can be sure you’ll be able to find one for your project.

An additional benefit of this technology is that by changing the distance between the 2 cameras, you can adjust the depth precision and the range: the farther apart they are, the longer the range. In this project, for example, where the installation height is 13 m above the floor, we placed them 1 m apart. This gave us a depth precision of around 6 cm at distances of around 12 meters. That’s pretty good.

Of course, we had to test the whole thing so we rented a lift and one frigid morning went 13 m high up in the air.

Everything went as expected and we were able to get depth maps good enough for tracking.

In the video above, you can see the raw StereoView footage, the white-on-black blob image you get after initial processing in Lightact, the results of finding minimum enclosing circles for these blobs, and the Unreal Engine output.

Creating the Content in Unreal Engine

The project brief said that the content should switch between branded interactive visual effects (motion graphics) and a multi-player game that visitors could play by walking on the projection area. Before choosing Unreal Engine, we looked at several other options, such as coding everything ourselves using the Cinder framework or using a real-time VFX engine such as Notch. However, due to the speed of prototyping we were able to achieve with Unreal’s Blueprint system, the ability to create actual game logic (think score-keeping, game-level management and so on) and the huge pool of Unreal Engine experts we could work with, we chose Unreal Engine very early in the process.

So how does it all work?

We were able to quickly integrate Unreal Engine with Lightact so that we could pass textures and variables from one application to the other.

Unreal Engine receives the information about the location and radius (x,y,r) of all the circles, uses this information for gameplay logic and outputs the final render via Spout to Lightact. Lightact then warps and blends this texture and sends it out to 4 projectors.

We also upgraded Lightact Manager so that it monitors the Unreal Engine game and makes sure it always runs as it should.

The end-client can control the installation with Lightact WebUIs.

How did we create the content?

The Client forwarded us their branding guidelines, based on which we created all of the content from scratch in Unreal Engine.

In the final project, everything is rendered in real time and there is no pre-rendered content at all. Altogether, there are 6 interactive visual effects and 4 interactive games, which adds up to 10 separate Unreal levels.


There are hundreds of things we learned while combining technologies that, to our knowledge at least, hadn’t been used together before: things like what to do when tracking results disappear for a split second due to various hardware factors, or how to integrate all the components into a reliable and user-friendly system.

All in all, the Client, our partners and ourselves are very pleased with the result.

On to the next one!

Lightact Webinar


Due to popular demand, we are organizing our first interactive online webinar! It will be hosted by Lightact’s founder, Mitja. In around 30 minutes, he will walk you through Lightact’s content pipeline, Layer Layouts and the basics of Unreal Engine integration.

To make this as accessible as possible we are doing this on YouTube (

If you don’t want to listen to the whole thing but just have a particular question that’s bugging you, drop by and ask away in the chat window. To keep things flowing, most questions will be answered at the end of the webinar.

Quick Info

  • What: Online interactive webinar
  • Date: 30th May 2019
  • Hour: 4 PM CEST (equals 10 AM in New York and 10 PM in Hong Kong)
  • Where: YouTube (

Come, tune in and learn about your new tool for creating great installations!