How Artificial Intelligence And Unreal Engine Are Polarizing The Business – Deadline

The Greek philosopher Heraclitus once said, “Change is the only constant in life.” The entertainment sector is no exception to this adage. It’s an industry that has long benefited from advancements in technology that have enriched the quality of films and programs and streamlined the filmmaking process across the entire pipeline, from pre-production to distribution and exhibition.

But while it’s easy to look back and see how digital tech has improved the sector, assessing the future of the business as it stands on the threshold of two hot button topics — Artificial Intelligence (AI) and Epic Games’ Unreal Engine — is much less clear.

AI has been increasingly polarizing, with Twitter owner and Tesla CEO Elon Musk and Apple co-founder Steve Wozniak among the more than 1,100 signatories of an open letter calling for a six-month moratorium on the development of advanced AI systems, citing “risks to society”. Meanwhile, Epic’s free real-time 3D engine, Unreal Engine, is popularizing the concept of the metaverse; it has been used on projects such as Rogue One: A Star Wars Story, The Batman and Ford v Ferrari. Significantly, Lucasfilm and Jon Favreau used Unreal Engine to build entire 3D environments for The Mandalorian, a real turning point for the technology.

But what does it all mean? Here, a number of experts in both fields help break down how these two tech advancements are disrupting the production sphere and what the future of the business looks like when they become a part of the everyday process of filmmaking.

Unreal Engine

According to Framestore’s Chief Creative Officer Tim Webber, “Things are moving so quickly and unpredictably at the moment that it’s really hard to know what’s going to be happening even in five years’ time.”

Webber, who won an Oscar for his VFX work on Gravity, says he and his team have been using Unreal Engine for a decade or more, both in pre-production for virtual production and in the company’s immersive division for virtual reality experiences and experiential setups. “Most of Unreal’s lifespan and history has been dedicated to the games industry, and that’s what has shaped it up until this time,” he says. “But more people, such as the film industry, are tailoring it to other needs, so it’s getting better and better all of the time.”

The tech has, he says, been impactful in planning and experimenting in a virtual space. “You can have people being motion captured in real time and see them as evolving creatures, and you can get an immediate response to it in real time.”

Framestore VFX supervisor Theo Jones adds, “It’s so useful in preproduction because it’s very good for rapid iteration. Obviously, in preproduction you’re making a lot of changes very quickly, and that interactivity is really key.”

Most notably, Unreal Engine has been popularized for use in in-camera effects, or what many refer to as ‘virtual production’, where, as on The Mandalorian, one can film in a space where the background is rendered in real time and displayed on an LED wall behind the actors.
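
For the technically curious, the core loop behind such an in-camera setup is conceptually simple: track the physical camera, render the virtual background from the matching perspective within the frame budget, and display the result on the wall in sync. The sketch below (in Python, with entirely hypothetical stand-in functions; Unreal’s actual interface is its C++/Blueprint API) illustrates the idea, not any real implementation:

```python
import time

FPS = 24
FRAME_BUDGET = 1.0 / FPS  # ~41.7 ms per frame at 24 fps (16.7 ms at 60 fps)

def get_tracked_camera_pose():
    # Hypothetical stand-in for a stage camera-tracking feed.
    return {"position": (0.0, 1.8, -4.0), "rotation": (0.0, 0.0, 0.0)}

def render_background(pose):
    # Hypothetical stand-in for the real-time renderer: draws the
    # virtual set from the physical camera's point of view.
    return f"background as seen from {pose['position']}"

def display_on_led_wall(frame):
    # Hypothetical stand-in for the LED wall output.
    pass

for _ in range(FPS * 10):  # ten seconds of footage
    start = time.perf_counter()
    pose = get_tracked_camera_pose()   # 1. where is the real camera?
    frame = render_background(pose)    # 2. render the set from that view
    display_on_led_wall(frame)         # 3. put it on the wall behind the actors
    elapsed = time.perf_counter() - start
    time.sleep(max(0.0, FRAME_BUDGET - elapsed))  # hold a steady 24 fps cadence
```

The rendered parallax is only correct from the tracked camera’s position, which is why the LED wall is always paired with precise camera tracking.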

“There are things that we do that are fundamentally different than any sort of DCC [digital content creation] renderer that’s out there, such as how it approaches things in the real world, with real physics and real spatial depth,” says Miles Perkins, Industry Manager of Film & Television at Epic Games. “The big thing with Unreal Engine is that it can render in real time at least 24 frames per second, if not 60. And you interface with it in much the same way you would interface with a real set. That, to me, is what’s incredibly exciting.”

Perkins, who spent 23 years at Lucasfilm before moving to Epic Games, recalls first using Unreal with Steven Spielberg on the 2001 title A.I. Artificial Intelligence. The forward-thinking director used the Unreal Tournament game engine to plan out his scenes for the sci-fi project.

“We were able to immediately visualize what it was going to be so that he could block out all of these shots,” he recalls.

The exec compares the change in technology at Epic to the first time he saw a test screening of Jurassic Park. “When I first saw the test and walked into the theater, I just knew that the industry was going to change because of computer-generated imagery. The exact same thing happened to me in 2019 when I joined Epic and I saw what we were doing with The Mandalorian.”

Indeed, since The Mandalorian, the proliferation of the tech on these LED walls has been, says Perkins, “almost hard to keep up with”. In 2019, there were just three of these stages, but now, according to the exec, there are more than 300.

Unreal has been particularly useful in the animation space, with companies such as Reel FX using the engine to animate the Netflix series Super Giant Robot Brothers! The company was the first to build an entire animation pipeline on the game engine, visualizing and rendering every aspect of the show with it. Perkins recalls the show’s director, Mark Andrews (Brave), turning around 140 shots per week instead of the normal 12-20. “Imagine what it means for the filmmaker,” Perkins says. “It frees them up to engage with the visuals. We’re really only on the precipice of what this is going to mean long-term.”
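
Taken at face value, those throughput figures represent roughly an order-of-magnitude jump. A quick back-of-the-envelope check (the 12-20 and 140 figures are Perkins’; the arithmetic is ours):

```python
# Shots-per-week figures quoted by Perkins for Super Giant Robot Brothers!
conventional_low, conventional_high = 12, 20  # typical animation pipeline
unreal = 140                                  # Unreal-based pipeline

print(f"{unreal / conventional_high:.1f}x to {unreal / conventional_low:.1f}x more shots per week")
# -> 7.0x to 11.7x more shots per week
```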

The exec thinks that real-time technology will soon start to attract directors from outside the animation space. “Directors are now engaging with the animation in the same way they would with live-action, or near to that, so all of a sudden you’re going to start to see animation open up to people who are more traditionally live-action filmmakers.”

Webber and Jones recently finished shooting Flite, a short live-action/animation hybrid made entirely in Unreal Engine. Specifically, Framestore used its in-house VFX pipeline FUSE (Framestore Unreal Shot Engine) to make the innovative short. The aim, with Unreal Engine at the heart of the pipeline, was to enable the company to work at the scale of a feature film, drawing on all of its different specialist artists for one project.

“Previously this wasn’t possible, and it had to be small teams working together,” Webber says. “But with Flite we wanted to build it so you could have your crew of 250 people with all the different expertise they’ve built over 20 years, all working together on something with hundreds of shots.”

It’s a great example, says Jones, of how the real-time technology can be rolled out into a pipeline that will “create a much more efficient way of working.”

Germany-based Telescope Animation is another outfit benefiting from this technology. The company was recently a recipient of one of Epic Games’ MegaGrants and a grant from the EU’s Creative Europe MEDIA programme, which will see it use Unreal Engine 5 for the multi-platform project The Last Whale Singer. In addition to a feature film, there are plans for a prequel video game, an episodic series and AR/VR projects that will use the same assets across all platforms.

For Telescope Animation founder Reza Memari, use of the game engine is revolutionizing the kind of animation coming out of Germany. While the European nation is a powerhouse when it comes to exporting animation, these projects are typically done at much smaller budgets (€8 million-€12 million) than their U.S. counterparts. The speed of rendering that Unreal can offer, plus the ability to create a larger pipeline for other platforms, makes it more efficient for smaller teams.

“Unreal has really accelerated the development toward the kind of animation that we are doing,” Memari says. “For small indie studios, it always meant there was a quality reduction, because we could never achieve the kind of polish that American studios can achieve, since they have the money and the teams to keep reworking it and revising it until it’s perfect. We just don’t have that kind of capacity, but real time allows us to work on it non-stop.”

‘The Mandalorian’ (Disney+/Lucasfilm/Everett Collection)

Artificial Intelligence

While there is a sense that Unreal Engine is propelling the industry forward in the virtual production space, the effects that AI will have on the business are still much less defined. AI, as we know it now, has far broader uses across the business, from screenwriting to VFX and distribution. With the recent eruption of publicly available AI systems such as GPT-4 and ChatGPT, the film industry is, understandably, weighing the potential benefits and pitfalls of the technology.

To start, there’s the efficiency side of it. For some in the business, particularly on the postproduction side, tools are emerging from the bottom up that are, Jones says, “taking some of the repetitive drudgery out of some of the tasks that artists have to do.”

Jones is currently using it on Disney’s live-action Peter Pan & Wendy, and he says it’s been useful for “giving our artists better feedback”.

“For one of the key characters that we are bringing to life, we are able to use machine learning to give our animators much higher resolution characters to work with,” he says. “Normally, because our animators need to work interactively, they need to make very quick decisions. What we normally have to do is create a very, very high-resolution rig for the final images, which includes skeleton, muscles, skin, fascia, all of these things and how they interact. That’s computationally expensive, so we can’t normally get even close to that.”

With the use of AI, Jones has been able to give that representation to animators so they can review their work in real time and make creative decisions quicker. “A rig that was taking 20 to 30 seconds to move one frame forward is now taking between two and six milliseconds to do the same computation.”
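
Those numbers are worth pausing on. A quick sanity check of the arithmetic (the timings are Jones’; the calculation is ours) shows why the machine-learning rig feels instantaneous: it lands comfortably inside a 24 fps interaction budget of roughly 41.7 ms per frame.

```python
# Per-frame rig evaluation times quoted by Jones
old_s = (20.0, 30.0)      # 20-30 seconds per frame, full-resolution rig
new_s = (0.002, 0.006)    # 2-6 milliseconds per frame, ML approximation

worst_case = old_s[0] / new_s[1]   # smallest plausible speedup: 20 s vs 6 ms
best_case = old_s[1] / new_s[0]    # largest plausible speedup: 30 s vs 2 ms
print(f"Roughly {worst_case:,.0f}x to {best_case:,.0f}x faster")
# -> Roughly 3,333x to 15,000x faster

frame_budget_ms = 1000 / 24        # ~41.7 ms: the real-time threshold at 24 fps
print(f"New rig uses at most {new_s[1] * 1000:.0f} ms of a {frame_budget_ms:.1f} ms budget")
# -> New rig uses at most 6 ms of a 41.7 ms budget
```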

Indeed, it’s a handy tool that lets artists make these decisions faster, so they can spend more time on the creative process and less time waiting for timelines to update. But Webber cautions that they would never use AI to create any kind of final image, as the copyright issues are still too murky.

“At the moment, AI is a bit of a black box,” he says. “This is the problem with it in all areas, not just in the visual arts. You input something and you get an image or an essay, but you’ve got no idea what’s going on in between and you can’t tweak it very easily. That’s where it is now, but the big change will come where what you get out is something you can then manipulate yourself a little and adjust it.”

Last month, the Writers Guild of America clarified its stance on the use of AI in the writing process, saying it plans to “regulate use of material produced using artificial intelligence or similar technologies”. It also warned that AI cannot be used as source material to “create MBA-covered writing or rewrite MBA-covered work, and AI-generated text cannot be considered in determining writing credits.”

It’s an interesting conversation that continues to evolve, says producer and analyst Stephen Follows, who last year teamed up with a particle physicist to secure a professional script development deal for a feature film script written entirely by AI.

“The WGA step is encouraging,” he says. “They’re taking a step where they’ve said, ‘OK, we obviously don’t want writers to be replaced.’”

Follows says his experiment showed that the quality of a text-generating AI’s output is defined above all by the quality of its input. With this in mind, he feels that writers are the ones likely to benefit most from using AI. “They are the most likely to get the most out of it, to be able to protect their area of speciality and then also to be able to avoid things that are inherent.”

While he says the tech was half-decent at coming up with ideas, it was even better at giving good feedback. “Some of the best things we found were setting up important questions and getting it to answer them,” he says. “That’s its biggest strength. It comes up with a lot of ideas — and it comes up with a version of Alice in Wonderland quite a lot — but it’s shooting for the generic middle. However, if you’re coming up with so many ideas, you only need one nugget.”

Additionally, Follows found that the AI was good at problem-solving. “When you get a problem that we can’t think of an answer to, it can give a very concise answer that can be very convincing.”

He says, “Ultimately, with this technology, our concept of IP control and copyright is going to completely break down. It will also be a detriment to people who have very formulaic jobs — people whose main job has been gatekeeping.”

More recently, Miramax tapped AI tech developer Metaphysic to help de-age actors such as Tom Hanks and Robin Wright in Robert Zemeckis’ upcoming movie Here. Meanwhile, neural network lab Flawless has developed software called TrueSync, which utilizes AI to change the movement of an actor’s mouth for dubbing in different languages.

But the real questions on everyone’s mind are: will AI really steal all the creative work from humans in the film business, and will we all be living in The Matrix in the coming years? Jones thinks not. “AI will definitely be used in our industry increasingly going forward. But it will be used as a tool to help the creative process in the hands of artists, as opposed to replacing them, which people kind of get worried about. In fact, it will be giving them much better tools, and that’s what we’re seeing at the moment.”
