Conquering banding and artifacts with rear projection

When we began our quest to set up a virtual production studio using rear projection, we thought it would be as simple as setting up a projector and screen, piping out some Unreal Engine graphics to the projector, putting actors in front of the screen, lighting the actors to match the projection, and shooting them with a decent video camera. We were right on some counts, but not all.

One of the issues we battled right away was artifacts when shooting the screen: the recordings showed rolling bands. The bands appeared only on the projected background, not on the actors, and they weren’t visible to the naked eye, only to the camera.

Banding when shooting projected image

The artifacts resembled the type of banding one can get with live shoots when the frequency of the lights used is out of sync with the camera’s shutter speed. This led us to suspect that there was some kind of synchronization issue between the camera and the projector. This ended up being true, but not for the reason we originally thought.

Key takeaway: When shooting a rear projection screen, you need a camera that shoots at true 24 fps. 

Banding in live shoots

To get a better handle on why we got these artifacts, let’s look at a parallel problem: banding in recorded video of live actors and sets.

Physical lights don’t emit light continuously–they actually “flicker” at a very fast rate tied to the frequency of the mains power. In the USA and other countries whose alternating current (AC) runs at 60Hz, lights flicker at a rate of 60Hz, meaning they turn on and off 60 times per second. In countries with 50Hz AC, the rate is 50Hz.

When you shoot a scene lit by such lights, you need to be aware of your camera’s shutter speed, the rate at which its shutter opens and closes to gather light and color information for recording. If your camera’s shutter speed is at odds with the light’s flicker rate, this will cause banding in your recording.

In the USA, for example, you’ll want to shoot with a shutter speed whose denominator is a multiple of 60 to jibe with the flickering lights. This means that at a shutter speed of 1/60, 1/120, or 1/180, you won’t get any banding. The video below shows and explains this phenomenon very well.

If you’re shooting at 24fps, the ideal shutter speed for a cinematic look has a denominator of double the frame rate, or 1/48 (the classic 180-degree shutter rule). But since 48 isn’t a multiple of 60, you might get banding at that shutter speed. Bumping the shutter speed up to 1/60 stays close to the cinematic 1/48 while also eliminating banding in your shots.
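The rule of thumb above can be sketched in a few lines. This is purely illustrative–the helper name and the sample shutter speeds are our own, not settings from any particular camera:

```python
# Rule of thumb: a shutter speed avoids mains-flicker banding when
# its denominator is a multiple of the local mains frequency
# (60 in the USA, 50 in 50Hz countries).

def flicker_safe(shutter_denominator: int, mains_hz: int = 60) -> bool:
    """True if a shutter speed of 1/shutter_denominator follows the rule."""
    return shutter_denominator % mains_hz == 0

for denom in (48, 60, 120, 180):
    status = "safe" if flicker_safe(denom) else "banding risk"
    print(f"1/{denom}: {status}")
# 1/48 (the cinematic default at 24 fps) is the odd one out:
# 48 is not a multiple of 60, hence the bump to 1/60.
```

Running it flags 1/48 as the only banding risk in that list, which is exactly why the advice is to nudge up to 1/60.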

This advice works great for live shoots, but shooting a projector isn’t exactly the same thing. Still, we were able to apply that lesson to our problem.

Banding in projector shoots

In the same way a light flickers many times per second, a projector refreshes its projected image at a particular rate. A common refresh rate for projectors is 120Hz, meaning the image refreshes 120 times per second.

Theoretically speaking, if the projector has a rate of 120Hz, one should be able to set the shutter speed to 1/60 or 1/120 and get a solid image. However, we didn’t find that to be the case.

For the past couple of years we’ve been using a Sony a6500 camera for our live film shoots, and we’ve been able to get some great quality with it. For example, we shot this award-winning comedy short with the Sony a6500.

But when we set up our virtual production studio and attempted to shoot the projection screen with the Sony a6500, we got rolling bands at every shutter speed. At 1/40 the bands were least visible. Displaying a very bright or very dark Unreal Engine scene reduced the rolling lines, but they were always there.

Scene from “A Common Thread”, with an actor against a projected background, shot on the Sony a6500 at a shutter speed of 1/40. The rolling bands are minimized, but still very visible at high resolutions. The gray bars point out a couple of the more visible rolling bands in this scene.

To remove the lines, we tried everything: switching to 30 fps and 60 fps, stepping through every available ISO value, lighting the actors with lights of different frequencies (in case light spillage was the issue), and so on. We tried several different projectors, switching among LCD, DLP, and laser models, thinking that maybe the method of generating light and color was responsible for frequency issues. It didn’t matter what type of projector we used–the lines were still there.

(DLP, LCD, and laser are different projector imaging and light-source technologies. This link explains some of them. In our case, it didn’t matter–they all produced banding with the Sony a6500.)

On the Unreal Engine side, we tried pumping out imagery at different frame rates. We even piped live footage out to the screen, and recorded that. No matter what we tried, the rolling bands remained.

It’s the camera, not the projector!

Then we had a brainstorm: we invited a couple of cinematographer friends over to try out the virtual production setup with their cameras. One friend has a Lumix S5, and the other a Blackmagic Design Pocket Cinema Camera 6K. And…

Both their cameras worked! No banding when shooting the projection screen at a specific shutter speed. Just a crystal-clear image.

But why? When I say the Lumix and Blackmagic cameras worked at a particular shutter speed, I’m not being precise in my terminology. Both the Lumix S5 and Blackmagic DPCC 6K have settings for the shutter angle, not speed.

Shutter angle, shutter speed, and frame rate are closely tied together–change any one of the three and at least one of the others changes with it. Some cameras have a setting for shutter angle, while others have a setting for shutter speed, and the two are generally thought of as the same thing, just expressed a different way.

When each of the friends’ cameras was set to a shutter angle of 144 degrees at a frame rate of 24 fps, which works out to a shutter speed of 1/60, the rolling bands completely disappeared.
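That conversion is easy to verify. The formula relating the three values is exposure = (angle / 360) / fps; this is a quick sketch with our own helper name, not any camera’s API:

```python
from fractions import Fraction

# Shutter angle, frame rate, and exposure time are linked by:
#   exposure = (angle / 360) / fps
# A 144-degree shutter at a true 24 fps lands exactly on 1/60 s.

def exposure_time(shutter_angle: int, fps) -> Fraction:
    """Exposure time in seconds, as an exact fraction."""
    return Fraction(shutter_angle, 360) / Fraction(fps)

print(exposure_time(144, 24))  # 1/60
print(exposure_time(180, 24))  # 1/48, the classic cinematic look
```

Using exact fractions rather than floats makes it obvious when an exposure lands precisely on 1/60 and when it doesn’t.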

So why didn’t this work with the Sony a6500? After much digging, I discovered that the setting of 24 fps on the Sony actually means 23.976 fps. The technical reasons for forcing cameras to 23.976 fps rather than 24 are beyond the scope of this blog post, but suffice it to say that the reason is historical and has to do with the early days of broadcast video, and that Sony apparently likes to substitute 23.976 for 24 fps in all their cameras. For live shoots, it’s usually not a big deal–23.976 and 24 fps look very nearly the same. But when you’re shooting a projected image, you’re going to get artifacts no matter what you do.

With the Sony recording at 23.976 fps, something is thrown off when the camera tries to shoot the 120Hz screen. Either the supposed shutter speed of 1/60 is not a true 1/60, or reaching a true 1/60 would require a shutter angle that isn’t a whole number like 144, and the camera can’t get there. All I know is that with all other things being equal, the true 24 fps cameras did the job perfectly, and the Sony failed.
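We can put numbers on that hunch. The NTSC-derived “24 fps” is exactly 24000/1001 ≈ 23.976 fps, and working the shutter-angle math in both directions shows the mismatch. This is a sketch of the arithmetic, not a statement about the Sony’s internals:

```python
from fractions import Fraction

NTSC_24 = Fraction(24000, 1001)  # ~23.976 fps, the NTSC-derived "24"
TRUE_24 = Fraction(24)

def exposure_time(shutter_angle, fps):
    """Exposure in seconds for a given shutter angle and frame rate."""
    return Fraction(shutter_angle) / 360 / fps

# A 144-degree shutter at true 24 fps hits 1/60 exactly...
print(exposure_time(144, TRUE_24))  # 1/60
# ...but at 23.976 fps it runs slightly long:
print(exposure_time(144, NTSC_24))  # 1001/60000, not 1/60

# The angle that WOULD give a true 1/60 at 23.976 fps:
angle_needed = Fraction(1, 60) * 360 * NTSC_24
print(float(angle_needed))  # ~143.856 degrees, not a whole number
```

Either way you slice it, a camera locked to 23.976 fps can’t line its exposure up cleanly with a 120Hz refresh, which matches what we saw on set.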

Our new camera

This means we’re going to need a new camera. We’re seriously considering the Blackmagic DPCC 6K, but we’re currently weighing it against Blackmagic’s 4K version for portability, battery life, and price. We’ll let you know when we make a decision.

In the meantime, my camera-rich friends are coming over to help us shoot some videos for our portfolio, so we can start doing gigs in the coming months. Stay tuned!