The future of game graphics

At its first public presentation, Epic Games gave new insights into the global illumination, particle effects and editor of Unreal Engine 4 – and, in an interview, talked about the graphics of the future and high-end mobile gaming.

The camera glides calmly through a snow-covered ruin. Dust dances in the air, the sun glitters on crystals, long shadows fade into the darkness. And then all hell breaks loose – almost literally: a huge monster with a flaming gaze bursts out of the rock and marches through the castle gates, a huge volcano visible in the background. With this scene, Epic Games opened the first 40-minute worldwide presentation of Unreal Engine 4 at GDC Europe 2012. The technology demo ran on an unspecified PC with an Nvidia GeForce GTX 680; the presentation was given by Epic's Alan Willard.

One of the most important new features, according to Epic Games, is the fully dynamic global illumination system. “All objects can be changed dynamically, and you can see that in the engine without any preprocessing,” Willard explains. He shows how the outer shell of statues can be altered and how a red-hot hammer is reflected in the world. Also impressive: with a virtual hand, the presenter grabs a ball of water and drags a wet trail through the test level, in which the environment is immediately and very convincingly reflected. All of this works without baked-in lightmaps – the current version of the engine doesn’t support them at all, although that may change before the final release.

A carpet reflects differently

The reflections depend on the material – a carpet throws light back in a completely different way than a stone wall. The new particle effects are even more impressive: Willard shows a room containing several clouds of more than a million particles, which can be colored and lit in different ways as they swirl wildly around one another. At one point, a fireball sets half the room ablaze – the flames themselves animated with many tiny particles – and an ice ball then extinguishes them, again with countless ice-crystal particles.
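At its core, an effect like this means updating an enormous number of independent particles every frame. As a minimal, illustrative sketch – plain Python on the CPU with an invented swirl field, whereas an engine like this runs the same idea massively in parallel on the GPU for millions of particles – the update loop might look like this:

```python
import math
import random

def step_particles(particles, dt, swirl=2.0):
    """Advance each particle one timestep: its own velocity plus
    a simple swirl field around the origin (illustrative only)."""
    out = []
    for (x, y, vx, vy) in particles:
        # Swirl: a velocity component perpendicular to the radius vector.
        r = math.hypot(x, y) or 1e-9
        sx, sy = -y / r * swirl, x / r * swirl
        out.append((x + (vx + sx) * dt, y + (vy + sy) * dt, vx, vy))
    return out

random.seed(1)
cloud = [(random.uniform(-1, 1), random.uniform(-1, 1), 0.0, 0.0)
         for _ in range(1000)]
for _ in range(10):
    cloud = step_particles(cloud, dt=0.016)
print(len(cloud))  # particle count is preserved: 1000
```

Coloring and lighting each cloud differently, as in the demo, would just attach more per-particle attributes to the same loop.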

The editor has also been significantly revised. There is a new interface called Slate, which game developers can adapt to their personal preferences and needs. To edit the properties of levels and objects (which Epic will call Blueprints in the future), there is a tool called Kismet, which completely replaces UnrealScript. In Kismet, developers can select each object individually and write program logic for it, whose execution can then be followed in detail in a kind of flowchart. “The system is extremely debuggable,” notes Willard. It is also possible to call C++ code directly from the engine, change it, compile it and execute it immediately – without long waits.
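The idea of per-object logic that can be followed like a flowchart is easy to sketch. The following is not Epic's implementation – just a toy flow-graph interpreter in Python, with node names invented for illustration – but it shows why such a system is so debuggable: every run leaves a trace of exactly which nodes fired, in order.

```python
class Node:
    """One node in a minimal flow graph. 'fn' does the work,
    'next' names the node to run afterwards (None = stop)."""
    def __init__(self, name, fn, next=None):
        self.name, self.fn, self.next = name, fn, next

def run(graph, start, state, trace):
    """Execute nodes one after another, recording each visited
    node so the run can be inspected like a flowchart."""
    node = graph[start]
    while node is not None:
        trace.append(node.name)      # the 'debuggable' part
        node.fn(state)
        node = graph.get(node.next)  # None ends the run
    return state

# A toy 'open door' sequence, loosely in the spirit of a
# Kismet/Blueprint graph (all names are hypothetical).
graph = {
    "OnTriggerEnter": Node("OnTriggerEnter",
                           lambda s: s.update(triggered=True),
                           next="PlaySound"),
    "PlaySound": Node("PlaySound",
                      lambda s: s.update(sound="creak"),
                      next="OpenDoor"),
    "OpenDoor": Node("OpenDoor",
                     lambda s: s.update(door="open")),
}
trace = []
state = run(graph, "OnTriggerEnter", {}, trace)
print(trace)  # ['OnTriggerEnter', 'PlaySound', 'OpenDoor']
```

A visual editor then only has to draw the nodes and highlight the entries of `trace` as they occur.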

Unreal Engine 4 isn’t quite finished yet – Epic Games is still working on the system for displaying large outdoor landscapes. Nevertheless, it is already possible to negotiate a licensing agreement with the company on an individual basis and to work with the technology. A free Unreal Development Kit 4 will also follow at a date not yet announced. Officially, Epic Games says Unreal Engine 4 is currently available only for PCs. The company declined to comment on next-generation consoles, citing numerous non-disclosure agreements – which suggests that something is already in the works.

Top graphics on mobile devices

In another presentation, Niklas Smedberg – Senior Engine Programmer in the platform team at Epic Games – talked about graphics on smartphones and tablets. He mainly works on titles like Infinity Blade Dungeons, which Epic is currently producing for iOS. At GDC Europe 2012, he focused on graphics programming for Imagination Technologies’ PowerVR SGX. Smedberg is very satisfied with the current hardware and particularly praises its support for shaders, render-to-texture, depth textures and MSAA. He described the basic structure of the chip, which works through the screen tile by tile in fast on-chip memory rather than through a conventional external frame buffer – as practically all relevant mobile GPUs do, apart from Nvidia’s Tegra.
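A tile-based GPU like the SGX first sorts ("bins") the geometry into small screen tiles and then shades one tile at a time on-chip. As a rough sketch of that binning pass – with axis-aligned bounding boxes standing in for real triangles, and the tile size chosen arbitrarily – it might look like this:

```python
def bin_into_tiles(prims, screen_w, screen_h, tile=32):
    """Assign each primitive's bounding box (x0, y0, x1, y1) to every
    screen tile it overlaps -- the binning pass of a tile-based GPU.
    The hardware can then shade each tile in fast on-chip memory
    instead of streaming a full frame buffer to and from RAM."""
    tiles = {}
    for i, (x0, y0, x1, y1) in enumerate(prims):
        for ty in range(max(0, y0 // tile), min(screen_h - 1, y1) // tile + 1):
            for tx in range(max(0, x0 // tile), min(screen_w - 1, x1) // tile + 1):
                tiles.setdefault((tx, ty), []).append(i)
    return tiles

# One small quad inside the top-left 32x32 tile, one spanning two tiles.
prims = [(0, 0, 10, 10), (20, 5, 40, 10)]
tiles = bin_into_tiles(prims, 64, 64, tile=32)
print(sorted(tiles))   # [(0, 0), (1, 0)]
print(tiles[(0, 0)])   # [0, 1] -- both primitives touch this tile
```

Tegra, by contrast, is an immediate-mode renderer and skips this binning step, which is why Smedberg treats it as the exception.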

“Mobile devices are the new PC,” Smedberg said, meaning that some work steps that used to be important on the PC – and partly still are – are needed again on smartphones and tablets. For example, adapting games to each individual device, including thorough testing: “scalable graphics are back.” This is particularly difficult and time-consuming on Android, because there are many different devices with very diverse hardware. But there are a few problems on Apple devices too, such as the high resolution of the Retina displays, which forces the renderer to fill a great many pixels and therefore to be correspondingly fast.
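In practice, this kind of per-device scaling usually comes down to mapping a device profile to a graphics preset. A hypothetical sketch – tier names, thresholds and settings are all invented for illustration, not taken from Epic – of how such a lookup might work:

```python
# Hypothetical quality presets for a scalable mobile renderer.
PRESETS = {
    "low":  {"resolution_scale": 0.5,  "msaa": 0, "particles": 1_000},
    "mid":  {"resolution_scale": 0.75, "msaa": 2, "particles": 10_000},
    "high": {"resolution_scale": 1.0,  "msaa": 4, "particles": 100_000},
}

def pick_preset(gpu_score, pixel_count):
    """Crude heuristic: even a fast GPU driving a Retina-class pixel
    count drops one tier, since fill rate becomes the bottleneck."""
    tier = "high" if gpu_score >= 2 else ("mid" if gpu_score == 1 else "low")
    if pixel_count > 2_000_000 and tier != "low":
        tier = "mid" if tier == "high" else "low"
    return PRESETS[tier]

# Fast GPU, but a 2048x1536-class display: dropped from "high" to "mid".
print(pick_preset(2, 2048 * 1536)["msaa"])  # 2
```

The thorough per-device testing Smedberg mentions is then about validating (and hand-tuning) exactly these mappings for each phone and tablet.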

Smedberg advises programmers to allow enough time to optimize their software for mobile devices. They should also check whether it makes sense to offload as many mathematical calculations as possible to the vertex shaders. He also reveals a little secret from Epic’s bag of tricks: “To calculate the depth of field, the team does not use the current image but always the previous, finished one – this saves time, and you have to look very closely to spot artifacts in the game.”
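The previous-frame trick can be modeled abstractly: the depth-of-field pass of frame N samples the already finished image of frame N−1, so it never has to wait for the current frame to complete – at the cost of one frame of latency in the blur, which is what makes the artifacts so hard to spot. A toy model (the `dof` function here just tags its input; a real pass would blur it):

```python
def render_frames(scene_frames, dof):
    """Sketch of the trick described above: frame N's depth-of-field
    pass reads the finished image of frame N-1. Frame 0 has no
    predecessor, so it is shown without DoF."""
    previous = None
    out = []
    for frame in scene_frames:
        blurred = dof(previous) if previous is not None else frame
        out.append((frame, blurred))
        previous = frame  # keep the finished image for the next pass
    return out

frames = ["frame0", "frame1", "frame2"]
result = render_frames(frames, dof=lambda img: f"blur({img})")
print(result[1])  # ('frame1', 'blur(frame0)')
```

Since consecutive frames are nearly identical at 30–60 fps, the one-frame-old blur only diverges visibly during very fast camera cuts.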