From the blocky pixels of early machines to the photorealistic, high-definition landscapes we enjoy today, the history of gaming graphics is, in many ways, a quest for ever greater realism. Where players once had to connect the visual dots with their imagination, we now have hyper-detailed characters, effects, and environments that edge ever closer to the uncanny. What follows is an exploration of those advances and a glimpse of what may lie ahead for this ever-changing art.
The Foundation: From Pixels to Polygons
"Realism" began from a tiny seed. The earliest computer games had no graphics to speak of: worlds were drawn with the limited symbols of the ASCII character set, and the player's imagination filled in the rest. That could be fun and artistic in its own right, but the visuals were sparse and abstract, leaning heavily on what the player brought to them.
The real revolution came in the mid-90s with 3D graphics and polygons. Consoles like the PlayStation and the Nintendo 64 ushered in an era in which game worlds were built from three-dimensional shapes, with genuine sightlines, depth, and new ways to get lost. Polygon counts on early 3D models were severely limited, so only so much detail could appear on screen at once, but they sowed the seeds for everything that followed.
Higher and Higher: Texturing, Shading, and HD
As the technology improved, games grew more complex. Texture mapping was crucial: it allowed developers to wrap detailed imagery around a polygonal object, making it far more realistic. Instead of a plain, monochrome wall, we could have a textured wall showing individual bricks, weathering, or dirt.
The next major milestone was the arrival of shaders: small programs that run on the GPU and define how light interacts with surfaces. They enabled a leap forward in visual effects, bringing dynamic lighting, reflections, refractions, and more physically accurate materials. For the first time in games, the sheen of a wet surface, the glint of metal, and soft cast shadows had real depth, and therefore far greater realism.
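The kind of per-pixel computation a shader performs can be sketched in plain Python. The snippet below is an illustrative sketch of the classic Blinn-Phong lighting model (a simple shading technique, not any particular engine's code); the vectors and constants are invented for the example.

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, light_dir, view_dir, base_color, shininess=32):
    """A minimal Blinn-Phong 'pixel shader' in Python.

    Combines ambient, diffuse (Lambert), and specular terms into an RGB tuple.
    """
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))  # half vector

    ambient = 0.1
    diffuse = max(dot(n, l), 0.0)                 # Lambert term
    specular = max(dot(n, h), 0.0) ** shininess   # Blinn-Phong highlight

    return tuple(min(c * (ambient + diffuse) + specular, 1.0)
                 for c in base_color)

# Light shining straight down on an upward-facing red surface,
# viewed from a slight angle so the highlight does not saturate:
color = shade((0, 1, 0), (0, 1, 0), (0, 1, 1), (0.8, 0.2, 0.2))
```

A real shader runs this kind of arithmetic once per pixel, per frame, on the GPU; higher `shininess` values produce a tighter, more mirror-like highlight.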
The leap to HD resolutions (720p, 1080p) was arguably the largest single jump in console visual fidelity since the move to 3D: textures that had looked soft and fuzzy now looked razor sharp, enabling more complex environmental storytelling and more detailed characters.
The Game-Changers: Physically Based Rendering (PBR) and Ray Tracing
Early rendering consisted largely of ever-better approximations of light and surfaces; the idea behind Physically Based Rendering is that materials look better because they are lit and rendered the way real materials are. PBR pipelines simulate the interaction of light with surfaces as it happens in reality, so metal, wood, skin, or cloth responds believably to light regardless of the lighting in the scene. PBR gives artists a more natural workflow and produces richer, more plausible results under all kinds of lighting and with all types of materials. The Witcher 3: Wild Hunt and Red Dead Redemption 2 are prime examples of how much PBR has contributed to the visual fidelity of modern games.
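One building block common to PBR pipelines is the Fresnel effect: every surface reflects more light at grazing angles. A widely used approximation is Schlick's, sketched below; the reflectivity values are typical textbook figures, not taken from any particular engine.

```python
def fresnel_schlick(cos_theta, f0):
    """Schlick's approximation of Fresnel reflectance.

    cos_theta: cosine of the angle between the view direction and the
               surface normal (1.0 = looking straight at the surface).
    f0:        reflectance at normal incidence.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

# Typical base reflectivity: dielectrics like plastic sit near 0.04,
# while metals are much higher (gold is roughly 0.9 and up).
plastic_head_on = fresnel_schlick(1.0, 0.04)   # viewed straight on
plastic_grazing = fresnel_schlick(0.05, 0.04)  # near-grazing angle
```

Head on, the plastic reflects about 4% of incoming light; near grazing it jumps toward 80%, which is why even matte floors turn mirror-like at shallow angles in PBR-era games.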
But one of the biggest graphical leaps of recent years came with Ray Tracing. Rather than drawing objects and then shining approximate lighting on them (as rasterization, the traditional rendering technique, does), ray tracing follows the path light would take in the real world as it interacts with objects in a scene. Rays are cast from the camera into the scene and bounce in all directions, producing:
- True Reflections: Ray tracing mimics how light behaves in the real world, producing accurate reflections on glass, water, and even rough materials. This gives a breathtaking sense of depth and presence, since the environment is reflected in objects, including parts of the scene outside the camera's view.
- Realistic Shadows: Ray-traced shadows are far more realistic than the hard shadows of traditional shadow maps, with soft edges that change with light and object proximity, eliminating the need for shadow hacks.
- Global Illumination (GI): Arguably the most significant dose of realism ray tracing brings to the table. GI approximates how light bounces off objects and brightens other parts of the scene (indirect lighting). This nuanced interplay of light makes scenes feel natural and grounded, banishing the "pre-baked" look of more primitive lighting methods. Imagine light bouncing off a red wall and casting a faint red hue on a nearby white wall; that's global illumination.
- Refraction and Translucency: Ray tracing can also accurately simulate how light passes through and bends inside transparent and translucent materials such as water, glass, and ice, further raising overall image quality.
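At its core, a ray tracer asks one question for every pixel: what does this ray hit first? The geometric heart of that question is an intersection test. A ray-sphere test is sketched below in Python; the scene and values are invented for illustration.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which expands to a quadratic a*t^2 + b*t + c = 0.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / (2 * a)
    return t if t > 0 else None  # nearest hit in front of the origin

# A camera at the origin looking down -z at a unit sphere 5 units away:
t_hit = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
t_miss = ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, -5), 1.0)
```

A full ray tracer runs millions of such tests per frame, then spawns new rays from each hit point for reflections, shadows, and GI, which is exactly why the technique is so expensive.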
The computational demands of real-time ray tracing are severe enough to require specialized hardware, such as NVIDIA's RTX and AMD's ray-tracing-capable Radeon RX graphics cards. To claw back performance, techniques like DLSS (Deep Learning Super Sampling) and FSR (FidelityFX Super Resolution) upscale a lower-resolution image to near-native visual quality (usually a touch below true native rendering) at a large frame-rate gain; DLSS does this with machine learning, while FSR has historically relied on hand-tuned spatial and temporal algorithms. Either way, the savings offset much of the cost of enabling ray tracing.
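The economics behind these upscalers can be shown with a toy example: shade fewer pixels, then reconstruct the full-resolution image. Real DLSS and FSR use far more sophisticated temporal and neural reconstruction; the nearest-neighbor stand-in below only illustrates where the savings come from.

```python
def upscale_nearest(image, factor):
    """Nearest-neighbor upscale of a 2D 'image' (a list of rows) by an
    integer factor. A deliberately crude stand-in for DLSS/FSR-style
    reconstruction, used here to show the pixel-count saving."""
    return [
        [image[y // factor][x // factor]
         for x in range(len(image[0]) * factor)]
        for y in range(len(image) * factor)
    ]

low_res = [[1, 2],
           [3, 4]]                       # 4 pixels actually shaded
high_res = upscale_nearest(low_res, 2)   # 16 pixels shown on screen
```

Shading cost scales with the number of rendered pixels, so rendering at half resolution in each axis and upscaling by 2x cuts the expensive per-pixel work (including ray tracing) roughly fourfold; the hard part, which the real techniques solve, is making the reconstructed image look native.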
Beyond Visuals: The Pursuit of Embodied Realism
Realism in gaming graphics isn't just about static pictures; it is dynamic. It is about believable worlds and believable characters.
- Advanced Motion Capture and Animation: Motion capture records the performances of real actors and maps them onto digital characters, giving movement a weight and subtlety that hand-keyed animation struggles to match. Facial capture in particular has advanced in leaps and bounds, allowing characters to emote more faithfully than ever before.
- Destructible Environments and Realistic Physics Engines: Modern games feature believable physical interaction and destructible environments. Environmental demolition, realistic debris scattering, and dynamic fluid simulation all help worlds feel more reactive and physical to players.
- Procedural Generation and AI: Not strictly a graphical enhancement, procedural generation (often alongside AI) is used to build expansive, detailed worlds with minimal manual labor, creating sprawling, diverse game worlds while maintaining high visual quality. AI is also harnessed for graphics optimization and for generating realistic character movements and animations.
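Procedural generation in a nutshell: derive content from a seed and a deterministic function instead of storing it. The toy heightmap below is built from a hashed-grid value function invented for this example; real engines layer smoother gradient noise (Perlin, simplex) across many octaves.

```python
def hash_noise(x, y, seed=42):
    """Deterministic pseudo-random value in [0, 1) for a grid cell.

    A stand-in for proper gradient noise: the same (x, y, seed)
    always yields the same value, so the 'world' needs no storage."""
    n = x * 374761393 + y * 668265263 + seed * 2147483647
    n = (n ^ (n >> 13)) * 1274126177
    return ((n ^ (n >> 16)) % 100000) / 100000.0

def heightmap(width, height, seed=42):
    """Generate a reproducible terrain heightmap on demand."""
    return [[hash_noise(x, y, seed) for x in range(width)]
            for y in range(height)]

a = heightmap(4, 4)
b = heightmap(4, 4)
# Same seed, same world: both maps are identical without saving either.
```

This is why procedurally generated games can ship continent-sized worlds in a few gigabytes: terrain is recomputed from the seed wherever the player happens to look.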
The Horizon: What’s Next for Gaming Graphics?
The inexorable march toward hyperrealism in video game graphics continues, and the future promises even more immersive and convincing experiences:
- Path Tracing: Essentially a more thorough form of ray tracing, path tracing simulates light transport more completely, with photorealistic rendering as the goal. Its computational cost is still extremely high, but as hardware and optimizations improve, expect to see it in more games sooner rather than later.
- Neural Rendering and AI-Generated Content: AI is starting to create incredibly detailed textures, models, and even whole 3D scenes. Neural rendering in particular could change how graphics are produced and displayed, for example by enabling real-time rendering of photorealistic scenes on a wide range of hardware.
- Volumetric Rendering: Instead of cheating with flat surfaces to approximate an effect, volumetric rendering will make clouds, fog, fire, and similar phenomena far more convincing, giving them real depth and a natural response to light.
- Haptic Feedback and Multi-Sensory Immersion: Realism isn't only visual. Haptics (like Sony's DualSense controller) and 3D audio (like Sony's Tempest 3D AudioTech) make a scene feel more real overall, adding a layer of presence that builds on the visuals and engages the other senses.
- Continued Hardware Innovation: And of course, ever more powerful GPU and CPU technology will keep driving these advances in visual fidelity, powering even more elaborate, detailed, and immersive scenes and effects in real time.
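The technique behind path tracing, the first item above, is Monte Carlo integration: fire many random rays and average their contributions instead of solving lighting in closed form. The sketch below estimates a known lighting integral (cosine over the hemisphere, whose true value is pi) the same way a path tracer averages random light paths; all numbers are illustrative.

```python
import math
import random

def estimate_hemisphere_integral(samples, seed=1):
    """Monte Carlo estimate of the integral of cos(theta) over a hemisphere.

    For directions sampled uniformly over the hemisphere, cos(theta) is
    uniform in [0, 1); the estimate is the sample mean times the
    hemisphere's solid angle (2*pi). The exact answer is pi."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        cos_theta = rng.random()   # uniform hemisphere sample
        total += cos_theta         # integrand: cos(theta)
    return (total / samples) * 2.0 * math.pi

estimate = estimate_hemisphere_integral(100_000)
# Converges toward pi as the sample count grows; with few samples the
# result is noisy, which is exactly the grain visible in low-sample
# path-traced images before denoising.
```

This is why path tracing is so costly and why denoisers matter: halving the noise requires roughly four times as many samples per pixel.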
From simplistic sprites to hyper-detailed characters and photorealistic landscapes, gaming graphics have come a very long way. Driven by relentless evolution in rendering technology, ever more powerful hardware, and rising artistic ambition, the pursuit of realism continues to test the limits of digital art. As the technology keeps advancing, the line between the virtual world and the real one will blur further, leading toward an even more spectacular horizon of immersive entertainment.