
Netflix’s AI-Powered Green Screen Bathes the Cast in Eye-Burning Purple

by admin

Compositing, the craft of placing actors in a context that doesn’t exist in reality, is as old as cinema itself and still difficult. Netflix has new technology that relies on machine learning to do some of the hard work, but it requires lighting the actors in vivid purple.

For decades, the easiest compositing method has been chroma keying: actors stand in front of a brightly colored background (originally blue, later green) that can be easily identified and replaced with anything, from a weather map to a battle with Thanos. The foreground is “matted out,” and the background becomes a transparent “alpha” channel handled alongside the red, green, and blue channels.
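The classic technique can be sketched in a few lines. This is a minimal illustration, not Netflix’s pipeline, assuming 8-bit RGB numpy arrays; the function names and the distance-based matte are my own simplification:

```python
import numpy as np

def chroma_key_matte(frame, key=(0, 255, 0), threshold=100.0):
    # Distance of every pixel from the key color; pixels at or near
    # the key color become transparent (alpha 0), everything else opaque.
    diff = frame.astype(np.float32) - np.asarray(key, dtype=np.float32)
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return np.clip(dist / threshold, 0.0, 1.0)

def composite(fg, alpha, bg):
    # Standard "over" composite: fg * alpha + bg * (1 - alpha).
    a = alpha[..., None]
    return (fg.astype(np.float32) * a +
            bg.astype(np.float32) * (1.0 - a)).astype(np.uint8)
```

The downsides the article lists all follow from that distance test: anything green-ish in the foreground keys out by mistake, and hair or glass gets a half-right alpha.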

It’s easy and cheap, but it has a few downsides, including trouble with transparent objects, fine details like hair, and of course anything similar in color to the background. It’s usually good enough, however, that more complicated and expensive methods (like a light field camera) have failed to displace it.

Netflix researchers are giving it a go, though, with a mix of old and new techniques that keeps the compositing simple and clean, at the cost of a hellish lighting setup.

As a recently published paper shows, this “purple green screen” produces dazzling results by putting the actors in a lighting sandwich: behind them, bright green (actively lit, not just a backdrop); in front, a mix of red and blue, which adds up to purple.

Actors lit in purple in front of the green screen. Image credits: Netflix

The resulting look on set would likely make even the most seasoned post-production artist cringe. Normally you want your actors lit in fairly natural light, so that while their footage might need a little tweaking here and there, their in-camera look is relatively natural. Lighting them exclusively with red and blue light completely distorts their appearance, because, of course, natural light doesn’t have a big chunk of its spectrum cut out.

But the technique is clever here too: by making the foreground only red and blue and the background only green, it simplifies the process of separating the two. A camera that would normally capture red, green, and blue effectively captures red, blue, and alpha instead. This makes the resulting mattes very accurate, free of the artifacts that come from separating a full-spectrum foreground from a limited-spectrum background.
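In other words, the green channel itself becomes the matte, with no color-distance keying needed. A minimal sketch of that idea, under the same 8-bit numpy assumptions as above (the function name is hypothetical):

```python
import numpy as np

def purple_screen_matte(frame):
    # Under purple (red + blue) foreground lighting with a green-lit
    # background, the camera's green channel is effectively the matte:
    # strong green = background, no green = foreground.
    g = frame[..., 1].astype(np.float32) / 255.0
    alpha = 1.0 - g               # foreground opacity
    fg_rb = frame[..., [0, 2]]    # red and blue carry the foreground image
    return alpha, fg_rb
```

A purple-lit pixel like (200, 0, 180) comes out fully opaque, while a green background pixel keys out entirely, with no threshold to tune.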

Of course, they seem to have swapped one difficulty for another: keying is now easy, but restoring the green channel to the purple-lit subjects is hard.

This has to be done intelligently and adaptively, since subjects and compositions vary, and a “naive” linear approach to injecting green produces a dull, yellowish look. How can it be automated? AI to the rescue!
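One plausible reading of such a “naive” linear approach (an assumption on my part; the paper’s exact baseline isn’t described here) is to synthesize the missing green as a fixed mix of red and blue, with no knowledge of the subject:

```python
import numpy as np

def naive_green_restore(frame_rb):
    # frame_rb: H x W x 2 array of (red, blue) foreground values.
    # Guess the missing green as a fixed average of red and blue --
    # one global linear rule for every pixel, which is why it tends
    # toward the dull, yellow-shifted look described in the article.
    r = frame_rb[..., 0].astype(np.float32)
    b = frame_rb[..., 1].astype(np.float32)
    g = 0.5 * (r + b)
    return np.stack([r, g, b], axis=-1).astype(np.uint8)
```

The rule can’t know that skin, cloth, and foliage each reflect green differently, so everything gets the same average guess.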

The team trained a machine learning model on custom training data: essentially “rehearsal” footage of similar scenes lit naturally. The convolutional neural network is given full-spectrum images to compare against purple-lit ones, and learns a process for quickly restoring the missing green channel far more intelligently than a simple algorithm can.
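The paper’s actual network isn’t reproduced here, but the core advantage of a convolutional model over the per-pixel linear rule can be sketched: each pixel’s green is predicted from a *neighborhood* of red and blue values, so local context can shift the estimate. The kernels below are placeholders, not trained weights:

```python
import numpy as np

def conv2d(x, kernel):
    # Minimal "valid" 2-D convolution over a single channel.
    kh, kw = kernel.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1), dtype=np.float32)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = float((x[i:i + kh, j:j + kw] * kernel).sum())
    return out

def predict_green(r, b, k_r, k_b, bias=0.0):
    # One convolutional layer with a ReLU: green is estimated from a
    # window of red and blue values rather than a single pixel, unlike
    # the fixed linear rule. A real model stacks many such layers.
    pre = conv2d(r, k_r) + conv2d(b, k_b) + bias
    return np.maximum(pre, 0.0)
```

Training then consists of adjusting `k_r`, `k_b`, and `bias` so the predicted green matches the naturally lit “rehearsal” footage.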

A simple algorithm gives mediocre results (top), while a more complex ML model produces colors very close to the underlying ground truth. Image credits: Netflix

So the color can be restored surprisingly well in post (“virtually indistinguishable” from the in-camera ground truth), but there remains the problem of the actors and set having to be lit in such a gruesome way. Many actors already complain that working in front of a green screen feels unnatural; imagine doing it while bathed in harsh, inhuman light.

The paper addresses this, however, by time-slicing the lighting: essentially switching the purple and green lights on and off many times per second. Doing that 24 times per second (the frame rate most movies and TV are shot at) would be distracting, even dangerous, but if the lights switch faster, 144 times per second, the result looks “almost constant.”
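The timing arithmetic behind that “almost constant” claim is simple; here it is as a sketch (the function name and the assumption that the switch rate divides the frame rate evenly are mine):

```python
def flicker_schedule(frame_rate=24, switch_rate=144):
    # How many lighting switches fit inside one film frame, and how
    # long each lighting state lasts. 144 switches/sec against 24
    # frames/sec gives 6 switches per frame of about 6.9 ms each.
    assert switch_rate % frame_rate == 0, "switch rate must divide evenly"
    switches_per_frame = switch_rate // frame_rate
    state_ms = 1000.0 / switch_rate
    return switches_per_frame, state_ms
```

At 144 Hz each state lasts under 7 ms, faster than most people can consciously perceive as flicker, which is presumably why the lighting reads as steady on set.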

This requires tight synchronization with the camera, however, which must capture light only during the brief moments when the scene is lit purple. And they also have to account for the skipped intervals when handling motion…

As you can see, it’s still very experimental. But it’s an interesting way to tackle an old problem in media production with a new, high-tech approach. It wouldn’t have been possible five years ago, and while it may or may not catch on, it’s clearly worth a try.
