How videogame graphics and movie VFX are converging

Convergence has been a key buzzword across the 3D industry over the last few years. The worlds of game graphics and movie VFX are advancing, and with this change a natural crossover is happening: the production methods used to create digital art for movies and games are now very similar.

This has been partly triggered by the rise of new technologies such as VR, and by both areas using the same tools, such as Maya and ZBrush. Yet it wasn't that long ago that a single plant in the film Avatar had more polygons than an entire game environment.

Motion capture in games

Motion capture is one big area where we're seeing more and more convergence. Shading is another: I still get excited teaching PBR texturing to my students, knowing that the Disney-style GGX shading model used in film is now implemented in real-time engines.
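To make that a little more concrete, here is a minimal sketch (in Python, purely as a teaching aid) of the GGX normal distribution term that sits at the heart of those physically based shaders. Real engines implement this in shader code and combine it with Fresnel and geometry terms, so treat it as an illustration of the maths rather than any engine's actual implementation.

```python
import math

def ggx_ndf(n_dot_h: float, roughness: float) -> float:
    """GGX / Trowbridge-Reitz normal distribution term, as used in most
    PBR pipelines (a simplified sketch, not any engine's exact code).

    n_dot_h   -- cosine of the angle between the surface normal and the half vector
    roughness -- perceptual roughness in [0, 1]; alpha = roughness^2 follows
                 the common Disney/real-time convention
    """
    alpha = roughness * roughness
    alpha2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (alpha2 - 1.0) + 1.0
    return alpha2 / (math.pi * denom * denom)

# Example: a fairly smooth surface evaluated near the specular peak
print(ggx_ndf(n_dot_h=0.98, roughness=0.3))
```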

Motion capture in games has allowed a more complex world of storytelling to evolve, with technology allowing game makers to create characters who are relatable and human in their facial expressions and physical mannerisms. 

Just look at the work of Imaginarium on Squadron 42, Naughty Dog's The Last of Us and Guerrilla Games' Horizon Zero Dawn. Developing game characters with motion capture opens up the exploration of wider human themes.

Real-time rendering technology

Game developer DICE recreated the Star Wars universe with great effect for Battlefront

One of the other big developments that's occurred in this space – which we've seen most recently at events such as the Game Developers Conference (GDC) – is the power of real-time rendering.

Epic's Unreal Engine has really stolen a march on this, and Epic has recently teamed up with The Mill and Chevrolet to demonstrate the engine's potential with the short film The Human Race. Merging live-action storytelling with real-time visual effects, the film shows how far real-time rendering in a game engine can now be pushed.

The fact that a similar approach was used for some of the scenes in Star Wars: Rogue One by ILM, using a tweaked version of Unreal Engine 4, just adds to its credentials. The team used this technology to bring the droid K-2SO to life in real time. 

While at the moment this technique is predominantly being used for hard surfaces, it's surely only a matter of time before it supports more diverse objects. 

We're also seeing companies such as performance capture studio Imaginarium expanding to adapt to this change, with Andy Serkis recently unveiling Imaginati Studios, a game development studio focused on real-time solutions using Unreal Engine 4.

Photogrammetry in games

Another area where there is a real convergence and crossover of talent is in the use of photogrammetry, which involves taking photographic data of an object from many angles and converting it into stunningly realistic fully textured digital models. 
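Under the hood, those tools are solving a structure-from-motion problem: matching features across photographs, recovering camera poses and triangulating 3D points before dense reconstruction, meshing and texturing. As a rough illustration of just that first, sparse step (not how PhotoScan or any commercial package works internally), here is a hedged two-view sketch using OpenCV; the file names and camera intrinsics are placeholders.

```python
import cv2
import numpy as np

# Placeholder inputs: two photos of the same object and an assumed-known
# camera intrinsic matrix K. Real pipelines calibrate or self-calibrate this.
img1 = cv2.imread("photo_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("photo_02.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[2000.0, 0.0, 960.0],
              [0.0, 2000.0, 540.0],
              [0.0, 0.0, 1.0]])

# 1. Detect and match features between the two views (Lowe's ratio test).
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
good = [m[0] for m in matches if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]

pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

# 2. Recover the relative camera pose from the essential matrix.
#    The translation is only known up to scale at this stage.
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate a sparse 3D point cloud from the two views.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
points3d = (pts4d[:3] / pts4d[3]).T

print(f"Triangulated {len(points3d)} sparse points from {len(good)} matches")
```

A full photogrammetry pipeline repeats this across many views, refines everything with bundle adjustment, then builds a dense point cloud, mesh and texture, but the core of the technique is exactly this camera-and-geometry recovery from overlapping photos.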

Creating game assets from photographs may not be new, but the process has now reached the kind of standards we are used to seeing in film production. 

From DICE's incredibly realistic recreation of the Star Wars universe in Star Wars Battlefront to Crytek's Ryse: Son of Rome, the bar for video game graphics keeps rising. It might be hyperbolic to suggest the visuals of Ryse stand alongside the classic film Gladiator, but it is nevertheless a stunning realisation of ancient Rome. 

The Vanishing of Ethan Carter is another game that uses photogrammetry to great effect. Epic Games' Paragon, where photoshoots were used to capture HDR lighting on hair and skin, is another fantastic example. 

Some of the most compelling-looking graphics in games were created with photographic processes, and photogrammetry has played a huge part in driving game graphics forward.

Photogrammetry was used heavily in horror adventure game The Vanishing of Ethan Carter

Epic Games used Agisoft PhotoScan to turn its photographs into models, but there are many issues native to the photogrammetry process that need to be solved. Reflections on objects and poor lighting in the source photos can be a real challenge, and can reduce the realism of the final output. 

But this is where technology and artistry come together. In the fantastic blog post Imperfection for Perfection, technical artist Min Oh outlines Epic Games' process, detailing the use of colour checkers and the capture of lighting conditions with VFX-standard grey and chrome balls. 

Other inspiration comes from the team at DICE, who overcame lighting issues when capturing Darth Vader's helmet by removing light information from source images. 
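In very simplified terms, that kind of de-lighting can be thought of as dividing an estimate of the baked-in lighting out of the captured texture to approximate the underlying albedo. The snippet below is a heavily simplified, hypothetical sketch of that idea, not DICE's or Epic's actual tooling; the file names are placeholders and both images are assumed to be the same size.

```python
import numpy as np
from PIL import Image

# Hypothetical inputs: the photogrammetry-derived colour texture and a map of
# the lighting estimated to be baked into it (derived, for example, from the
# grey-ball and colour-checker references described above).
captured = np.asarray(Image.open("captured_texture.png").convert("RGB"), dtype=np.float32) / 255.0
lighting = np.asarray(Image.open("estimated_lighting.png").convert("RGB"), dtype=np.float32) / 255.0

# Divide the estimated lighting out of the captured colours to approximate the
# surface albedo. Clamp the denominator so deep shadows don't blow up.
albedo = captured / np.clip(lighting, 0.05, None)
albedo = np.clip(albedo, 0.0, 1.0)

Image.fromarray((albedo * 255).astype(np.uint8)).save("approx_albedo.png")
```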

Famed VFX supervisor Kim Libreri, now CTO at Epic Games, predicted back in 2012 that graphics would be indistinguishable from reality within a decade. A few years on, it seems we're well on our way.

Simon Fenton is head of games at Escape Studios, which runs courses including game art and VFX.

This article originally appeared in 3D World issue 223.
