Curious about how the Outlander team turns landlocked ships into ocean-going vessels? Learn about CGI and VFX in this exclusive interview.
I interviewed the boundlessly enthusiastic Dries du Preez, a South African VFX artist based in Cape Town who specialises in 3D and motion tracking. He’s a red-headed charmer who bubbles with excitement about his chosen field. Here’s what he told me about how computer generated imagery (CGI) and visual effects (VFX) magic happens.
Dries and I clicked immediately as we discovered a mutual love of fantasy novels. Keeping up with my notes while staying focused on the interview was tricky at times; Dries speaks rapid-fire and leaps across the conversational landscape. Fortunately, I’m well trained by grasshopper minds to follow a chat into the woods and back.
Before we start, let’s clarify: VFX, aka Visual Effects or Special Visual Effects, are visuals created in the post-production phase of filmmaking. CGI stands for Computer Generated Imagery: the creation of animated or still visuals using imaging software on a computer.
Back to Dries who, although he did not work on Outlander, was happy to help me understand how this works for shows like it. He has worked on various international films and TV series, including Roots, Tutankhamun and Eye in the Sky (which starred Helen Mirren), among others. He is currently one of the backroom miracle-workers behind the entertainment we so readily absorb, which makes him a really good guy to chat to about the mysteries of post-production.
Dries became fascinated by movies as a young boy, in particular by Star Wars: Episode I: The Phantom Menace. He wasn’t just interested in the story and visuals he watched; he wanted to know, “How do they do that?” So, during his teen years, Dries happily devoted some time to learning the computer skills that would enable him to “visualise and manifest what others can’t see—to engage in world construction.” How cool is that?
Getting into VFX and the South African Film Industry
By the end of high school, Dries knew he wanted to be a filmmaker and decided to study at one of the film schools in Cape Town. When he started his bachelor’s degree, he had no plans to specialise in VFX. As someone with a great interest in escapism, he found the framework and curriculum of the institution limiting: stories had to be relevant to South Africa, rather than letting him tell the stories and explore the worlds he wanted. So in his spare time he started tinkering with visual effects and animation and writing his own stories, as he enjoyed the freedom those fields gave him to create everything himself. He won the best animation award in his first year, and by his third year he had shifted his focus entirely to VFX, winning the best visual effects award with his team. The winning piece was a space-based adventure that challenged him to use green screens and strategically half-built sets, and to reconcile reality with virtual environments. He found the work came naturally, and it remains very fulfilling for him.
The increasing number of film productions shot by international crews in South Africa has meant pretty steady post-production work for Dries. He started out working freelance and part-time, then branched out as a contract freelancer, eventually and happily joining the ranks of one of his old lecturers, who is active in the industry. He now works primarily with a small company, Inspired Minority Pictures, that tackles productions big in both scale and name.
Visual effects are very much part of the movie-making business today, and there are very few productions that don’t make use of them at some point. Two well-known movies that used an enormous amount of VFX were James Cameron’s Avatar and David Fincher’s The Curious Case of Benjamin Button. In the latter, Brad Pitt’s face was manipulated in various ways, practically and digitally. At some points the changes were made superficially with prosthetics and subtle makeup; at others his face was entirely digital, composited onto an older body double.
The international film District 9, a sci-fi movie set in an alternate-reality South Africa, is a great example of a co-production between the United States, New Zealand and South Africa. It was filmed locally and made use of extensive CGI. The establishment of Cape Town Film Studios, with its amazing facilities, has meant that increasing numbers of international productions are filmed locally. This has translated into regular work and income for hundreds of skilled locals such as Dries. Black Sails filmed for four years in Cape Town, Outlander was there for months, and the new Tomb Raider movie finished filming at the studios a month or so after Outlander started.
While VFX is a growing field, with increasingly sophisticated computers and software allowing for more incredible results year after year, the human eye has not yet been replaced. People can do things and pick up on nuances and subtleties that computers can’t. This is where someone like Dries comes into the picture: he creates, quite literally, the realities of movies from computer bits.
https://www.youtube.com/watch?time_continue=5&v=HtvjTJoiF4o
Technical Genius (and an eye for detail)
Now to the technicalities. Essentially, what Dries does is bridge the real world and the virtual world. This requires him to match the virtual camera’s movement with the actual camera’s movement, using photogrammetry, which combines imagery with the mathematical measurement of vectors. Still confused? So was I. So I asked a few dumb questions to help me understand better.
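To give a rough feel for the maths (this is my own simplified Python illustration, not the software Dries actually uses), here is how a single 3D point on set projects through an idealised pinhole camera into a 2D pixel. Camera tracking, or matchmoving, solves this in reverse: given where the markers appear in each frame, it recovers the camera that must have produced them. All the numbers below are hypothetical.

```python
import numpy as np

# A minimal pinhole-camera sketch: project a 3D point on set into 2D
# pixel coordinates. Matchmoving solves the inverse problem: given the
# 2D positions of markers in each frame, recover the camera's position
# and rotation.

# Hypothetical intrinsics: focal length (in pixels) and image centre.
f, cx, cy = 1200.0, 960.0, 540.0
K = np.array([[f, 0, cx],
              [0, f, cy],
              [0, 0,  1]])

# Hypothetical camera pose: no rotation, pulled back 5 units.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

def project(point_3d):
    """Project a 3D world point to 2D pixel coordinates."""
    cam = R @ point_3d + t   # world -> camera coordinates
    uvw = K @ cam            # camera -> homogeneous pixel coordinates
    return uvw[:2] / uvw[2]  # perspective divide

# A tracking marker 1 unit up and 2 units into the scene.
print(project(np.array([0.0, 1.0, 2.0])))  # -> pixel position of the marker
```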
The process starts (after he’s been commissioned) with the production company sending him short clips from the film footage they have shot that need VFX work. The clips arrive in what Dries describes as ‘raw’ form; in other words, they look faded and desaturated, or “vanilla”: not much to get excited about if you’re not in the trade. He then looks at what he has, which can be pretty messy in terms of clutter that you would otherwise not want in the shot. When I interviewed Diana Gabaldon a few days after my interview with Dries, she described the number and range of people on set, and a lot of that activity is captured in the raw footage. Dries says it can include cables, additional cameras, camera operators, tracking markers and odd bits of tackle used on set, along with the big green screen reflecting in the eyes of the stars and on their skin.
The workflow for a shot like this differs from VFX house to VFX house, but essentially the steps are the same. Several VFX artists work on the plate footage in parallel, each using it for their particular task. The tasks break down into data extraction, clean-up and asset creation, which all finally lead into the assembly phase, or compositing. As a VFX generalist, Dries can work on several of these tasks, but when his specialised field is required, that’s where he steps in.
The clean-up artist’s job is to clean the footage of all the extraneous bits that don’t belong in the final version. I thought of it like using stain remover before you do the laundry! It’s a rather more meticulous process than that, as it means looking at the geometry of the shot to ensure that items cleaned out of one frame are also cleaned out of the next. A lot of it is patchwork. For example, if there is a logo or the name of a company on an item, the artist has to remove it in every single frame, or else the production company will find itself having to pay royalties to those companies.
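As a loose illustration of the idea (my own sketch using the open-source OpenCV library, not any studio tool), here is how one unwanted region might be painted out of a single frame. The hard part in real clean-up is keeping the patch consistent across hundreds of frames; the frame and the “logo” below are entirely made up.

```python
import cv2
import numpy as np

# A minimal clean-up sketch: paint an unwanted region (say, a logo) out
# of a frame by filling it in from the surrounding pixels.

# Stand-in plate: a grey frame with a bright "logo" stamped on it.
frame = np.full((480, 640, 3), 110, dtype=np.uint8)
cv2.rectangle(frame, (400, 100), (520, 160), (255, 255, 255), thickness=-1)

# Mask marking the pixels to remove: white where the logo sits.
mask = np.zeros(frame.shape[:2], dtype=np.uint8)
mask[95:165, 395:525] = 255

# Fill the masked area from its neighbourhood and save the result.
clean = cv2.inpaint(frame, mask, inpaintRadius=3, flags=cv2.INPAINT_TELEA)
cv2.imwrite("plate_frame_clean.png", clean)
```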
Data extraction is where Dries comes in. He uses tracking markers, high-contrast points that stand out in an image, and (this is the mind-boggling part to me) he goes frame by frame to track how they change. This has become Dries’ specialisation: he primarily does camera tracking and 2D and 3D work, extracting the camera’s motion from the footage via tracking so that the virtual environment’s camera matches reality. There are also roto-artists, who trace and cut out shapes for the assembly stage, or, as it’s called, ‘pulling a matte.’
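Here is a tiny sketch of the 2D side of that tracking, again my own illustration with OpenCV’s standard Lucas-Kanade tracker standing in for whatever professional matchmoving software Dries uses. It follows high-contrast points from one synthetic frame to the next; a full 3D camera solve aggregates thousands of these little tracks across a whole clip.

```python
import cv2
import numpy as np

# A tiny 2D tracking sketch: follow high-contrast points from one frame
# to the next with Lucas-Kanade optical flow.

# Stand-in frames: a dark image with bright "markers", then the same
# image shifted 3 pixels right and 2 down, as if the camera had panned.
prev = np.zeros((240, 320), dtype=np.uint8)
for x, y in [(60, 80), (150, 40), (220, 160), (100, 200)]:
    cv2.circle(prev, (x, y), 4, 255, thickness=-1)
shift = np.float32([[1, 0, 3], [0, 1, 2]])
curr = cv2.warpAffine(prev, shift, (320, 240))

# Pick strong corners to track (tracking markers are designed to be
# exactly this kind of high-contrast feature).
points = cv2.goodFeaturesToTrack(prev, maxCorners=50,
                                 qualityLevel=0.01, minDistance=10)

# Find where each point moved to in the next frame.
new_points, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, points, None)

for p0, p1 in zip(points, new_points):
    print("marker moved from", p0.ravel(), "to", p1.ravel())
```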
Asset creation is the part where computer generated images are made with the intent of being combined with the plate footage. This step involves creating the ship and its sails, the water and the waves, but more on that in a bit. It’s not uncommon for productions to skimp on the expenses of burning stuff up for real or splashing tons of water everywhere. They’d rather pay someone like Dries to digitally create the effects they want.
The compositing process is about assembling the final shot: sorting out the colour balance and making sure the virtual objects mesh with the real ones, so that everything looks as natural and true to life as possible.
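At the heart of compositing sits one deceptively simple operation, usually called “over”: blend the foreground into the background according to a matte. Here is a minimal sketch of the principle in Python with NumPy (my own toy example with made-up images, not production code; real compositing layers grading, grain and lens effects on top).

```python
import numpy as np

# A minimal compositing sketch: the classic "over" operation, layering a
# foreground element onto a background using a matte (alpha channel).

def over(fg, bg, alpha):
    """Composite foreground over background; alpha is 1.0 where fg is solid."""
    return fg * alpha + bg * (1.0 - alpha)

# Hypothetical 4x4 float images: grey foreground, dark background.
fg = np.full((4, 4, 3), 0.8)
bg = np.full((4, 4, 3), 0.1)

# Matte: solid in the top half, transparent in the bottom half.
alpha = np.zeros((4, 4, 1))
alpha[:2] = 1.0

print(over(fg, bg, alpha)[:, :, 0])  # top rows 0.8, bottom rows 0.1
```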
VFX & the Ocean Waves
Getting those ships to sail the wide blue/green/turquoise/teal/steel/grey ocean waves, sails billowing in the breeze, takes a lot more than just VFX software magic. Dries says you start by making a model of the ship with accurate data (measurements) from the set. You then need to decide what you want to put in: for example, the ship, the sea, reflections in the eyes, wind, etc. Once that is done, you do something counter-intuitive: you strip out all the green and blue chroma-key colours, leaving you with the cleaned-up plate, minus all the green. Even with the green removed, there may be some details that couldn’t be keyed out. This is where rotoscoped mattes come in. Rotoscoping means drawing lines and cutting shapes out by hand, which in some cases is much more accurate than using a computer program. This eventually gives you a floating image of the deck and the cast.
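For a rough sense of what “stripping out the green” means, here is a toy chroma-key in Python with NumPy (my own simplified illustration; production keyers are far more sophisticated, and their ragged leftovers are exactly where those hand-drawn roto mattes come to the rescue).

```python
import numpy as np

# A toy chroma-key: estimate how "green" each pixel is and use that as
# a matte, 0 where the pixel is screen green and 1 where it is real.

def green_key(rgb):
    """Return alpha in [0, 1]: 0 for pure screen green, 1 elsewhere."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)  # how much green dominates the pixel
    return np.clip(1.0 - greenness * 4.0, 0.0, 1.0)

# Hypothetical 2x2 float image: one green-screen pixel, three "real" ones.
frame = np.array([[[0.1, 0.9, 0.1], [0.6, 0.5, 0.4]],
                  [[0.8, 0.7, 0.6], [0.2, 0.3, 0.7]]])

print(green_key(frame))  # ~0 for the screen pixel, ~1 for the rest
```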
At this point, compositing begins. The ship and the waves need to move accurately, and to make sure that they do in the final product, you take the proxy model of the ship and run simulations on dozens of PCs to find what works best. That means you have to have some idea about fluid dynamics and how the density and viscosity of sea water cause it to behave in particular ways. By this stage, my eyes were starting to goggle and my mind was completely boggled.
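Real water simulations solve full fluid dynamics and can occupy whole render farms, but the basic idea of a surface evolving over time can be sketched very simply. Here is a toy “ocean” as a sum of travelling sine waves in Python (entirely my own illustration, with made-up wave numbers and speeds, nothing like a production solver).

```python
import numpy as np

# A toy ocean-surface sketch: the height of the water as a sum of
# travelling sine waves, sampled over a grid of positions and over time.

def wave_height(x, y, t):
    """Height of a hypothetical sea surface at position (x, y), time t."""
    h  = 0.6  * np.sin(0.30 * x + 1.1 * t)         # long swell
    h += 0.25 * np.sin(0.80 * y - 1.7 * t)         # cross chop
    h += 0.10 * np.sin(1.50 * (x + y) + 2.3 * t)   # small ripples
    return h

# Sample a 5x5 patch of ocean at t = 0 and t = 1 second.
xs, ys = np.meshgrid(np.linspace(0, 10, 5), np.linspace(0, 10, 5))
print(wave_height(xs, ys, 0.0))
print(wave_height(xs, ys, 1.0))
```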
Once you have a simulation running that looks good and the art director and producer are happy, Dries says, you then have to do the rest. You have to match the film grain with the digitally created images so that, when the shot is rendered, you should not be able to see the difference between the VFX and the actual film. In other words, they must match so seamlessly that the viewer has no idea what was real and what was digitally created. It is the ultimate magician’s sleight of hand.
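Grain matching sounds esoteric, but the core move is simple: a pristine CG render looks suspiciously clean next to scanned film, so matched noise is layered on before the final merge. Here is a minimal sketch in Python (my own illustration; real grain varies per colour channel and with brightness, and is measured from the plate rather than guessed).

```python
import numpy as np

# A minimal grain-matching sketch: overlay gaussian noise so a clean CG
# element sits convincingly next to grainy film footage.

rng = np.random.default_rng(seed=42)

def add_grain(image, strength=0.02):
    """Add noise so the CG element roughly matches the plate's grain."""
    grain = rng.normal(loc=0.0, scale=strength, size=image.shape)
    return np.clip(image + grain, 0.0, 1.0)

cg_render = np.full((4, 4, 3), 0.5)  # hypothetical clean CG element
print(add_grain(cg_render))          # the same image with film-like grain
```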
CGI and VFX Post-Production Glitches
Now, this doesn’t always work perfectly. Ideally, the VFX team should be part of the pre-production, production and post-production teams and their planning. The belief that you can just fix something in post-production is a wry joke; it’s possible, but it shouldn’t be necessary. In other words, there are limits to what the VFX magicians can do. “Planning beforehand reflects in the final product,” says Dries.
Dries worked on Roots, particularly on the ocean sequences. They had filmed at the Salt River Studio on land-locked ships. The camera had to move to imply wave movement, and they used a lot of water cannons to simulate the wash of water over the deck. Great, huh? Well, not really, because although they measured the ship before they broke down the set, additional photography and more accurate measurements were required. This meant a huge headache for post-production.
Remember I mentioned vectors? Extracting the camera movement for those implied waves from the footage, so as to capture the scale of the ship accurately enough for the simulation, was proving difficult. The inaccurate measurements meant that the digital hull of the ship would not match reality, not to mention that the tracking of the camera would be off. This could make the ship the size of a toy or the waves the size of tsunamis. Not a small problem at all, but it had to be fixed, so that’s what he did. The result speaks for itself.
Roots also starred Forest Whitaker. In one version of the edit, his character is whipped in the face, leaving him with a bloody wound on his cheek. This was ultimately cut, which meant all the subsequent shots of his face with the bloody wound had to be—you guessed it—fixed in post. Dries said he got to know the actor’s face better than his own, especially when the powers that be wanted to enhance the snow in one scene. For this, every practical imitation snowflake present in the shots had to be painted out in each frame, only to be revealed again as digital snow was animated to land where the practical snow had been. Additional snowfall had to be inserted behind the actors. Once the snowflakes were inserted, they were checked, and some had to be individually animated to dance naturally without distracting from the performance!
The Implications of VFX in Post-Production Scheduling
So how long does something like this take in real time? Dries reckons that a few seconds of ocean footage could take one to two weeks. Compositing alone can take a week on really challenging shots, and that is after you have spent an equivalent amount of time, in parallel, completing all the preparation: clean-up, generating assets, rotoscoping and redoing things if something went wrong or data was missing.
I think this answers the question of why the Outlander Season 3 premiere took until September this year; there was so much work to be done. All those ocean scenes! Someone had to do the post-production work of making seas from nothing but pixels ‘n bits, removing the reflections of the screens from the ship’s sails and the actors’ eyes, as well as the greenish cast from their skin. Weeks and months of work are required to get to the final product.
Many thanks to Dries for giving us the behind-the-scenes look at post-production and how CGI and VFX work.
Did you have any idea what is involved in post-production? Are you fascinated by the behind-the-scenes work on Outlander? Do you have any experience in these fields, and if so, what jobs have you worked on?