Commercials for local businesses

Many Worlds Productions is fortunate enough to be part of a thriving community in the historic Bywater neighborhood of New Orleans. We love being in the midst of all the locally owned restaurants, coffee houses, bars, and theaters, and within walking distance of some of the most amazing art installations the city has to offer.

One of these local gems is Frady’s, an old-fashioned grill and convenience store run by brother-and-sister team Kirk and Kerry Frady. Poboys and breakfast are cooked fresh and assembled in their tiny kitchen while you wait. The hot bar offers a different New Orleans classic every day of the week (red beans and rice on Mondays, y’all). All food is to-go, but many patrons elect to sit at one of the outside tables and chat with friends and neighbors while they enjoy their meal.

We’re big fans of their food, so we were more than happy to make this commercial for Frady’s social media. The commercial package includes several photos and videos for posting over a period of weeks, to keep Frady’s fans engaged and coming back for more.

Watching the commercial makes our mouths water, and we hope it does the same for you. If you’re ever in the area, stop in for a delicious roast beef poboy or the classic Grumpy Old Man breakfast, and be sure to say Hi to Kirk and Kerry!

It’s a wrap! Storyboards with Unreal Engine and MetaHumans

We just did a shoot for a short horror film using our virtual production setup, shooting about 15 minutes of script over 15 hours of production. We’re making this film for the Louisiana Film Prize, a short film competition taking place in October 2023.

One of the things that made the shoot go so smoothly was the fact that we made a storyboard ahead of time using MetaHumans. Most of the cast and crew were new to virtual production, and the storyboard gave them insight into what we’d be shooting and how it would work. We were also able to reuse the same camera setups from Unreal Engine to get the backgrounds up in seconds, and to pose the actors according to the storyboard.

Check out our video about the process:

One of the best aspects of doing storyboards and production with Unreal Engine and MetaHumans is the way the cast and crew responded. Everyone was excited to be part of this new technology, and many came up with creative ideas on the fly. Because it’s so easy to adjust the background, lights, and camera angle, we were able to try different things and get a lot of footage for the editor to play with.

We’ll have more as we get into post and complete the film.

Table read!

One of the best ways to get cast and crew excited about a film is to host a table read, where we read through the script together. It’s great for learning what works in the script (and what doesn’t), and for getting everyone on board with what you’re looking to accomplish.

Left to right: Kay (lighting), Yvonne (PA), Jarrod (actor), Lizzy (Script Supervisor)

Last week we held a table read at our home for our upcoming short film “The Last Influencer Standing,” a comedic horror story of influencers battling to the death for one million followers. About half the cast and crew were able to attend, and we assigned some crew to read for the cast that wasn’t there.

Terese (lead actor), Michele (director), Jen (costumes)

It also gave us an opportunity to do a quick costume fitting for the lead actor, and for the director to model the T-shirts the “influencers” would wear during the shoot.

It was wonderful to see everyone have such a great time, and to hear the laughs come at all the expected spots.

Principal photography is scheduled for next month. We can’t wait!

Conquering banding and artifacts with rear projection

Rolling bands in projected image

When we began our quest to set up a virtual production studio using rear projection, we thought it would be as simple as setting up a projector and screen, piping out some Unreal Engine graphics to the projector, putting actors in front of the screen, lighting the actors to match the projection, and shooting them with a decent video camera. We were right on some counts, but not all.

One of the issues we battled right away was artifacts when we shot the screen: rolling bands on the recording. The bands appeared only on the projected background, not on the actors, and they weren’t visible to the naked eye, just to the camera.

Banding when shooting projected image

The artifacts resembled the type of banding one can get with live shoots when the frequency of the lights used is out of sync with the camera’s shutter speed. This led us to suspect that there was some kind of synchronization issue between the camera and the projector. This ended up being true, but not for the reason we originally thought.

Key takeaway: When shooting a rear projection screen, you need a camera that shoots at true 24 fps. 

Banding in live shoots

To get a better handle on why we got these artifacts, let’s look at a parallel problem: banding in recorded video of live actors and sets.

Physical lights don’t emit light continuously–they actually “flicker” at a very fast rate tied to the alternating current (AC) powering them. In the USA and other countries with 60Hz power, lights flicker at a rate of 60Hz, meaning they turn on and off 60 times per second. In countries with 50Hz power, the rate is 50Hz.

When you shoot a scene lit by such lights, you need to be aware of your camera’s shutter speed, the rate at which its shutter opens and closes to gather light and color information for recording. If your camera’s shutter speed is at odds with the light’s flicker rate, this will cause banding in your recording.

In the USA, for example, you’ll want to shoot with a shutter speed whose denominator is a multiple of 60 (such as 1/60, which opens and closes 60 times per second) to jibe with the flickering lights. This means that with a shutter speed of 1/60, 1/120, or 1/180, you won’t get any banding. The video below shows and explains this phenomenon very well.

If you’re shooting at 24 fps, the ideal shutter speed for a cinematic look has a denominator of double the frame rate, or 1/48. But since 48 isn’t a multiple of 60, you might get banding at that shutter speed. Bumping the shutter speed up to 1/60 keeps you close to the “cinematic look” of 1/48 while also eliminating banding in your shots.
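
To make the rule of thumb concrete, here’s a tiny Python sketch (our own, purely illustrative) that flags which shutter speeds line up with 60Hz lighting:

```python
# Rule of thumb from above: under 60Hz mains lighting, pick a shutter speed
# whose denominator is a multiple of the line frequency.
LINE_FREQUENCY_HZ = 60  # use 50 in 50Hz countries

def is_flicker_safe(shutter_denominator, line_hz=LINE_FREQUENCY_HZ):
    """True if an exposure of 1/shutter_denominator seconds lines up with
    the light's flicker (e.g. 1/60, 1/120, 1/180 under 60Hz power)."""
    return shutter_denominator % line_hz == 0

for denom in (48, 60, 120, 180):
    status = "safe" if is_flicker_safe(denom) else "may band"
    print(f"1/{denom}: {status}")
# 1/48: may band   <- the "cinematic" shutter at 24 fps
# 1/60: safe
# 1/120: safe
# 1/180: safe
```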

This advice works great for live shoots, but shooting a projector isn’t exactly the same thing. Still, we were able to apply that lesson to our problem.

Banding in projector shoots

In the same way a light flickers multiple times per second, a projector projects light and color at a particular rate. A common rate for projectors is 120Hz, meaning the screen refreshes 120 times per second.

Theoretically speaking, if the projector has a rate of 120Hz, one should be able to set the shutter speed to 1/60 or 1/120 and get a solid image. However, we didn’t find that to be the case.
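
That theory boils down to simple arithmetic. Here’s a quick back-of-the-envelope check (our own numbers, assuming a 120Hz refresh) counting how many refresh cycles fit into one exposure:

```python
# If an exposure covers a whole number of projector refresh cycles, every
# frame should receive the same amount of projected light (in theory).
PROJECTOR_REFRESH_HZ = 120

for shutter_denominator in (40, 60, 120):
    cycles = PROJECTOR_REFRESH_HZ / shutter_denominator
    print(f"1/{shutter_denominator} s exposure covers {cycles:g} refresh cycle(s)")
# 1/40 s exposure covers 3 refresh cycle(s)
# 1/60 s exposure covers 2 refresh cycle(s)
# 1/120 s exposure covers 1 refresh cycle(s)
```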

For the past couple of years we’ve been using a Sony a6500 camera for our live film shoots, and we’ve been able to get some great quality with it. For example, we shot this award-winning comedy short with the Sony a6500.

But when we set up our virtual production studio and attempted to shoot the projection screen with the Sony a6500, we got rolling bands at every shutter speed. The bands were least visible at 1/40. We were able to reduce the rolling lines by displaying an Unreal Engine scene that was very bright or very dark, but they were always there.

Scene from “A Common Thread”, with actor against a projected background, shot on the Sony a6500 at a shutter speed of 1/40. The rolling bands are minimized, but still very visible at high resolutions. The gray bars point out a couple of the more visible rolling bands in this scene.

To remove the lines, we tried everything–switching to 30 fps and 60 fps, setting the ISO to every possible value, using lights with different frequencies to light the actors (in case light spillage was the issue), and so on. We tried several different projectors, switching between LCD, DLP, and laser, thinking that maybe the method of generating light and color was responsible for frequency issues. It didn’t matter what type of projector we used; the lines were still there.

(DLP, LCD, and laser are all different projector light-source technologies. This link explains some of them. In our case, it didn’t matter–they all produced banding with the Sony a6500.)

On the Unreal Engine side, we tried pumping out imagery at different frame rates. We even piped live footage out to the screen, and recorded that. No matter what we tried, the rolling bands remained.

It’s the camera, not the projector!

Then we had a brainstorm: we invited a couple of cinematographer friends over to try out the virtual production setup with their cameras. One friend has a Lumix S5, and the other a Blackmagic Design Pocket Cinema Camera 6K. And…

Both their cameras worked! No banding when shooting the projection screen at a specific shutter speed. Just a crystal-clear image.

But why? When I say the Lumix and Blackmagic cameras worked at a particular shutter speed, I’m not being precise in my terminology. Both the Lumix S5 and Blackmagic DPCC 6K have settings for the shutter angle, not speed.

The term shutter angle is closely tied to the shutter speed for a particular frame rate–when you change any one of the three, at least one of the others changes. Some cameras have a setting for shutter angle, while others have a setting for shutter speed, and they’re generally thought of as being the same thing, just expressed a different way.

When each of our friends’ cameras was set to a shutter angle of 144 degrees at a frame rate of 24 fps, which works out to a shutter speed of 1/60, the rolling bands completely disappeared.
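
For reference, the conversion between shutter angle and shutter speed is simple arithmetic; here’s a small sketch of it (our own helper, not any camera’s menu):

```python
def exposure_seconds(shutter_angle_degrees, fps):
    """Exposure time for a given shutter angle and frame rate:
    (angle / 360) * (1 / fps)."""
    return (shutter_angle_degrees / 360.0) / fps

# A 144-degree shutter at 24 fps -> 1/60 s, the setting that killed the banding
print(1.0 / exposure_seconds(144, 24))   # 60.0
# The classic 180-degree "cinematic" shutter at 24 fps -> 1/48 s
print(1.0 / exposure_seconds(180, 24))   # 48.0
```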

So why didn’t this work with the Sony a6500? After much digging, I discovered that the setting of 24 fps on the Sony actually means 23.976 fps. The technical reasons for running cameras at 23.976 fps rather than a true 24 are beyond the scope of this blog post, but suffice it to say that the reason is historical and dates back to the early days of broadcast television, and that Sony apparently likes to substitute 23.976 for 24 fps in its cameras. For live shoots, it’s usually not a big deal–23.976 and 24 fps look very nearly the same. But when you’re shooting a projected image, you’re going to get artifacts no matter what you do.

With the Sony recording at 23.976 fps, something is thrown off when the camera tries to shoot the 120Hz screen. Either the supposed shutter speed of 1/60 isn’t a true 1/60, or the camera would need a non-integer shutter angle (rather than 144) to get there and can’t. All I know is that with all other things being equal, the true 24 fps cameras did the job perfectly, and the Sony failed.
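
One rough way to see the mismatch is to count refresh cycles at the two frame rates. This is back-of-the-envelope arithmetic on our part, not a definitive explanation of what the Sony is doing internally:

```python
# At a 144-degree shutter angle, compare true 24 fps with the Sony's 23.976:
# how many 120Hz refresh cycles fit into one exposure, and into one full frame?
PROJECTOR_REFRESH_HZ = 120
SHUTTER_ANGLE_DEGREES = 144

def cycles(fps):
    exposure = (SHUTTER_ANGLE_DEGREES / 360.0) / fps  # seconds the shutter is open
    frame_interval = 1.0 / fps                        # seconds between frames
    return exposure * PROJECTOR_REFRESH_HZ, frame_interval * PROJECTOR_REFRESH_HZ

print(cycles(24.0))     # (2.0, 5.0)        -> exposure and frame both lock to the refresh
print(cycles(23.976))   # (~2.002, ~5.005)  -> both drift slightly, frame after frame
```

Neither number is a whole value at 23.976 fps, so each successive frame catches the projector’s refresh at a slightly different phase. That’s at least consistent with a band that slowly rolls through the image instead of sitting still.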

Our new camera

This means we’re going to need a new camera. We’re seriously considering the Blackmagic DPCC 6K, but we’re currently weighing it against Blackmagic’s 4K version for portability, battery life, and price. We’ll let you know when we make a decision.

In the meantime, my camera-rich friends are coming over to help us shoot some videos for our portfolio, so we can start doing gigs in the coming months. Stay tuned!

Virtual production on a budget

Earlier this year, Many Worlds Productions embarked on a new adventure called virtual production. Virtual production is any filmmaking technique that seamlessly combines virtual and physical worlds in real time. You can think of it as any technique that cuts out post-production steps by mixing the virtual world with the physical world during production, right there at the shoot. The production techniques used on the Disney+ series The Mandalorian are a classic example of virtual production.

There have been lots of filmmaking techniques over the years that combine the virtual and physical, but virtual production takes things a step further by eliminating that post-production step. And while budgets for The Mandalorian reportedly run to millions of dollars per episode, there are much less expensive ways to do virtual production.

We decided to start with a relatively inexpensive option: in-camera VFX (ICVFX) with rear projection. This technique is essentially a replacement for green screen. Instead of shooting an actor against a green screen and compositing the background in during post-production, the actor is shot against a screen showing the background. The background comes out of Unreal Engine live, meaning we can reposition it at any time, and compose our shots on the fly.

Video shot against projected background

Doing rear projection live in-camera also eliminates many of the challenges of compositing green screen footage–color spill, rough or mushy-looking edges around the live talent, green reflections on shiny objects, not to mention the basic challenge of lighting the green screen itself for an evenly distributed tone.

In the image above, for example, you can see the fine hairs on the actors’ heads against the bright window background. With green screen, these hairs often get lost in the keying process, or disappear and reappear from frame to frame. ICVFX eliminates these issues.

Improving on historical rear projection

The grandfather of ICVFX is rear projection with pre-filmed footage, where a video or film is projected behind the actors during the action. One of the most famous uses of this technique was in the classic film The Wizard of Oz (1939), where the crew filmed a fake cyclone ahead of time and projected it behind the actors during the shoot.

This method has strict limitations, the most severe being that any movement of the camera upsets the perspective and breaks the realism. For example, if you look closely at the video above at around 1:40, you can see the ground immediately behind the fence sliding along in a strange manner as the camera pans to the right. It looks like the crew moved the projector to the right to move along with the camera and keep the distant tornado at the correct perspective, at the expense of realism for items in the foreground.

In 1939 these visuals blew everyone’s minds. Even today, the effect holds up pretty well–I watched this film at least a dozen times before noticing this problem–but the film was originally shown in theaters and then on TV, where viewers couldn’t pause or rewind. Today’s viewers are a lot more savvy, and a shot like that would never fly on today’s streaming services.

Another limitation of traditional rear projection is that the projected film is what it is, and can’t be altered during the shoot. If the perspective of the projected image is a bit off, you just have to deal with it.

When you use a background coming live out of Unreal Engine rather than a pre-filmed or pre-rendered background, there are two distinct advantages:

  • The background can be moved around or updated on the fly before shooting, so you can get the background you want. You can also blur it, lighten or darken it, change the sun’s position for different times of day, add fog, or make any manner of adjustments in real time, just before hitting the Record button on the physical camera (see the sketch after this list).
  • You can connect up the physical camera’s position with the position of the Unreal Engine camera so that any movement of the physical camera during the shot will update the background to reflect the correct perspective. This technique, called camera tracking, opens up a whole world of possibilities. We haven’t explored this option yet, but we plan to in the near future.
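
As an example of the first point, here’s a minimal sketch of the kind of tweak you can make from Unreal’s Editor Python console just before a take. It assumes the Python scripting plugin is enabled and that the level’s sun is a DirectionalLight actor; the function name and angles are our own placeholders, not part of any actual project:

```python
import unreal

def set_sun_angle(pitch_degrees, yaw_degrees):
    """Rotate the first DirectionalLight found in the level, e.g. to fake a
    different time of day before hitting Record on the physical camera."""
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if isinstance(actor, unreal.DirectionalLight):
            rotation = unreal.Rotator(roll=0.0, pitch=pitch_degrees, yaw=yaw_degrees)
            actor.set_actor_rotation(rotation, False)
            return actor
    return None

# Late-afternoon sun for the next setup (placeholder angles)
set_sun_angle(-15.0, 210.0)
```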

Below is an account of our journey with rear projection out of Unreal Engine, when we used it to produce the short film A Common Thread for a three-day film competition.

For future projects, we’re doing a lot of test shots with backgrounds we found in the Unreal Marketplace, giving us the ability to put our actors in an office, a hospital, a snowy landscape, a spaceship, the Moon, or anywhere we like, all without leaving the comfort of our studio.

Stay tuned for more virtual production adventures!