5 VFX Tricks Used By Filmmakers

INTRODUCTION

There’s a certain prestige attached to practical filmmaking. Real locations. Real effects. Cameras capturing real things.

But the truth is, for certain shots and ideas practical filmmaking can actually be, well, impractical.

There’s a common misconception that CGI is only used for big, impossible, spectacle-heavy moments: explosions, creatures, or entire cities that feel obviously artificial. In reality, much of the most widely used visual effects work is completely invisible.

As someone who largely works on independent or locally funded productions, I still use small doses of VFX regularly. Not because the films are trying to look bigger than they are, but because, in many cases, VFX is simply the smarter, cheaper, and more practical option.

What follows are five cases where digital tools aren’t a compromise, but a more efficient and effective solution than going fully practical.

1 - CHROMA KEY

One of the biggest hidden costs in filmmaking is securing the spaces in which the story takes place.

This involves paying for privately owned locations or securing permits from the local government for filming in public spaces. 

This adds up quickly.

In some cases, it’s cheaper or more practical to shoot a controlled foreground element and extend or replace the background environment digitally. A basic and relatively low-cost example is shooting on a soundstage, or even a real-world location, and placing green or blue screens outside each window.

These can later be chroma keyed, which removes that specific shade of green or blue from the shot, creating a transparent gap that can be replaced with a new background by a visual effects artist.
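To make the idea concrete, the per-pixel logic of a keyer can be sketched in a few lines of Python. Everything here is an illustrative assumption: the key colour, the tolerance value, and the tiny four-pixel “frame”. Real keyers work on full images with far more sophisticated edge and spill handling.

```python
# Minimal chroma key sketch (illustrative, not production code).
# Pixels are (R, G, B) tuples in 0-255. A pixel is "keyed out" if it is
# close enough to the key colour, and replaced with the background pixel.

def distance(a, b):
    """Euclidean distance between two RGB colours."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def chroma_key(foreground, background, key=(0, 255, 0), tolerance=100):
    """Replace every foreground pixel near the key colour with the background."""
    return [
        bg if distance(fg, key) < tolerance else fg
        for fg, bg in zip(foreground, background)
    ]

# A four-pixel "frame": two green-screen pixels, one actor pixel, one prop pixel.
fg = [(0, 255, 0), (10, 240, 20), (200, 150, 120), (30, 30, 30)]
bg = [(80, 120, 200)] * 4  # the new sky-blue background

print(chroma_key(fg, bg))
# The two green pixels become sky blue; the actor and prop pixels survive.
```

The tolerance is the whole trick: too tight and green edges linger, too loose and parts of the subject disappear, which is why uneven green-screen lighting makes a keyer’s job harder.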

This technique has been used by many filmmakers who desire complete control, such as Michael Haneke and David Fincher.

Chroma keying can also be used to replace backgrounds for safety or practicality reasons: shooting an actor against a green screen and replacing the background to make it look like they are riding a motorbike, for example, or replacing the view through an airplane window on a soundstage.

At the high end, this technique can be used to invisibly construct entire locations from scratch. After The Hunt, for example, is set at Yale University, but the production was unable to film at the real-world location. Instead, sets were shot on a different continent, at Shepperton Studios in England.

They used the same idea: foreground sets that could be shot with the actors, with large chroma screens placed in the background, which could later be keyed out and replaced with a visual effects background based on Yale.

2 - SCREEN REPLACEMENT

Screen replacements are a relatively cheap, quick and common form of visual effects - and a clear example of VFX being more practical than practical filmmaking. If you ever see a TV, phone, or computer monitor on screen nowadays, there’s a pretty good chance that what you’re viewing has been digitally manipulated.

That’s because if you shoot screens for real, you lock yourself into a single version of the content. The timing of what plays on screen, the design, the interface, the animation: all of it needs to be finalised before the camera ever rolls. This is often not possible, as the on-screen content may need to be filmed or designed at a later stage.

If filming practically, you also have to make sure the screen’s brightness is correctly exposed and balanced with your lighting - which can become an issue when shooting a bright exterior where the sun’s rays overpower the screen.

With VFX, none of that pressure exists. 

Often the same chroma key technique we just discussed is used: saving a green or blue photo and making it fullscreen on the device, then keying that colour out in post and replacing it with the desired video, photo, or on-screen content.

The downside is that the screen might cast a green or blue light onto surrounding elements, like an actor’s hand, which can be difficult to correct in post.

An alternative is to use another solid colour, like a white background, which gives a more realistic light source.

If the screen or camera will move, it can be useful to add tracking markers to the screen, either in the displayed image or with little bits of tape, that tracking software can latch onto to follow how the screen moves.

Another good idea is to get a plate shot with the screen turned off. That way, VFX artists can take the natural reflections and textures from the screen and overlay them on top of the replacement content to make the screen feel more realistic.

3 - CLEAN UPS

Clean-ups, or painting objects out, are among the quiet workhorses of modern filmmaking. This process involves digitally removing or fixing unwanted elements in the frame - dolly tracks on the floor, a camera reflection in a mirror, or wall details like light switches and power sockets that draw too much attention.

Smaller objects can often be removed with simple brush tools that sample the pixels surrounding the object, not unlike how objects are painted out in Photoshop.

Another fast and cost-effective approach is filming a plate shot. This involves locking the camera off in a fixed position, capturing the main action first, and then clearing all moving elements from the frame.

Once the set is clear, ideally including anything you intend to remove later, additional footage of the empty space is recorded. These “clean plates” give VFX artists the ability to seamlessly erase unwanted details in post by replacing the part of the shot that you want to remove in the main take with the same part of the frame in the clean plate.
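The core of the clean-plate idea is a per-pixel choice between two takes, which can be sketched very simply. The frame values and mask below are illustrative placeholders; a real paint-out blends and tracks edges rather than swapping pixels wholesale.

```python
# Clean-plate patching sketch (illustrative). Frames are flat lists of pixel
# values from a locked-off camera; the mask marks the unwanted object.

def patch_with_clean_plate(main_take, clean_plate, mask):
    """Replace masked pixels in the main take with pixels from the clean plate."""
    return [
        plate if masked else take
        for take, plate, masked in zip(main_take, clean_plate, mask)
    ]

main_take   = ["wall", "wall", "dolly_track", "dolly_track", "actor"]
clean_plate = ["wall", "wall", "floor", "floor", "floor"]
mask        = [False, False, True, True, False]  # True = paint this pixel out

print(patch_with_clean_plate(main_take, clean_plate, mask))
# The dolly track is replaced by empty floor; the actor is untouched.
```

This only works because the camera was locked off: the two takes line up pixel for pixel, so the swap is invisible.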

Plates can also be used to control lighting. For example, I’ve used this technique when shooting wide shots during the final moments of dusk: capturing a clean plate early, while the sky was still a rich blue, then compositing it into the best performance take, filmed once the dusk sky had disappeared.

The final shot combines the ideal sky with the strongest action in the lower part of the frame.

Plates can also be used to control window exposure and preserve highlights. By filming the main action inside, then capturing a separate plate with reduced exposure so the windows aren’t blown out, both elements can later be combined in post to create a balanced final image.

4 - WEATHER EFFECTS

Weather is an expensive and difficult element to control practically, requiring specialist SFX crews, extensive setups, and constant adjustment throughout the shooting day. 

Rain towers, fog or haze tubes laid along the floor, and snow machines can be effective in controlled environments, but in wide exterior shots these effects can be disrupted by wind, and maintaining the effect evenly across each frame becomes a continuity challenge.

Digital weather effects offer a more flexible and economical solution. These can be from stock video assets, or created in software tools such as Blender or Houdini. Rain, snow, mist, and atmospheric haze can be added in post-production with precise control over density, direction, and intensity, allowing the effect to be matched across shots and adjusted to support continuity and mood. This approach allows filmmakers to shape the atmosphere deliberately.

David Fincher frequently uses this technique to create specific weather at specific moments in the story entirely digitally. Instead of chasing weather during production, the environment is refined calmly and precisely in post.

5 - SKY REPLACEMENT 

Ever notice how films seem to have an endless supply of perfectly timed sunsets and dramatic cloud formations? Sometimes those skies are captured for real. But often, they’re subtly enhanced or entirely replaced in post-production.

Sky replacement is exactly what it sounds like: removing the sky from a shot and replacing it with a different one. This might be to add more texture and depth with clouds, push a scene toward a warmer golden hour look, or maintain continuity when weather conditions change between takes or shooting days. A flat, overcast sky might be technically accurate, but swapping it for something more expressive can better support the mood and visual rhythm of a scene.

The process is usually done using high-resolution still photographs or footage of skies, either shot by the filmmakers themselves or sourced from stock libraries. In locked-off shots, the sky can be isolated and replaced relatively easily. 

If the camera is moving, tracking data can be applied so the new sky follows the camera’s motion naturally. A small adjustment that hopefully nudges an image closer to the version that best serves the story, without the audience ever noticing the hand that shaped it.

Another trick, which is more often done by a colourist with grading software than VFX, is adding a gradient. Rather than having an entirely consistent colour, this layer creates a transition from a deeper, more saturated colour at the top to a lighter, less saturated colour at the horizon. 

It simulates the natural light falloff in the sky and mimics the effect of a hard or soft graduated neutral density filter, which would be placed in front of the lens and was used more frequently in the days before this digital post-production tool existed.
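That gradient can be sketched as simple linear interpolation between two colours, one per row of the image. The two colours and the row count here are illustrative assumptions; a colourist’s grading tools offer much finer control over the curve and where the transition sits.

```python
# Sky gradient sketch (illustrative): blend from a deep, saturated colour at
# the top of the frame to a lighter, desaturated colour at the horizon.

def lerp(a, b, t):
    """Linear interpolation between two RGB colours, t in [0, 1]."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

def sky_gradient(top, horizon, rows):
    """One colour per row, blending from 'top' at row 0 to 'horizon' at the bottom."""
    return [lerp(top, horizon, r / (rows - 1)) for r in range(rows)]

# Deep saturated blue at the top, pale desaturated blue at the horizon.
for colour in sky_gradient((20, 60, 180), (170, 200, 230), rows=5):
    print(colour)
```

A hard graduated filter would correspond to a sudden jump in that curve at the horizon line; a soft one is closer to the smooth ramp shown here.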

CONCLUSION

Practical filmmaking isn’t going away. Real locations, light, and performances remain the foundation of cinema and the reference point for effective visual effects.

But practical doesn’t always mean cheaper or better. In many cases, VFX doesn’t replace reality - it quietly supports it, solving logistical problems, and perfecting images for directors, while remaining completely hidden.
