Games Look Bad: HDR and Tone Mapping (2017)

(ventspace.wordpress.com)

Comments

pflenker 21 hours ago
I skipped the text and looked at the images, and I was unable to tell whether they were supposed to be bad or good examples. I liked them. Then I read through the text and learned that they are supposed to be bad examples.

But why though? I suspect that either I am not good at this kind of thing, or this is a purist thing, like "don't put pineapples on pizza because they don't do that in Italy".

I don't want games to look realistic. A rainy day outside looks gray and drab; there is nothing wrong with rainy days in games not looking like the real thing, but instead looking awesome and full of contrast.

refactor_master 25 July 2025
This is [...] a series examining techniques used in game graphics and how those techniques fail to deliver a visually appealing end result

All I see is opinions though. And the internet is full of them. You just have to Google "why does this game look so ...". At least if the author had compared the search stats of "good/bad/beautiful/washed out" it would've carried some weight.

The GTA 5 screenshot is a terrible example. It looks like a cheap, dead video game environment, reminding me how far we've come.

markus_zhang 25 July 2025
One big issue I never understood is why we need photorealism in games at all. It seems to benefit card manufacturers and graphics programmers, but beyond that I feel it adds nothing, and may in fact have a negative impact on game quality.
dahart 22 hours ago
I truly don’t understand the author’s opinions about contrast here. The RE7 image is the only one here that looks ‘realistic’, and at a glance could be mistaken for a photograph, and he says it’s got way too much contrast.

No other image here comes anywhere even close, definitely not Zelda nor GTA5.

Personally I think the whole problem with the first 5 images is that they don't have enough contrast, and they have too much detail. The color handling isn't the only reason they don't look realistic, but making sure every single pixel is nicely exposed, so that nothing gets too dark or too bright, lets all the CG fakeness show through. One of the reasons the RE7 image looks better is that you can't clearly see every single thing in the image.

If you take photographs outside and the sun is in the shot, you will absolutely get some blown-out whites and some foreground blacks, and that's realism. The CG here is trying too hard to squeeze all the color into the visible range. To my eyes, it's too flat and too low contrast, not too high contrast.

uncircle 25 July 2025
I found this video, which visualises what tone mapping is trying to achieve and why "photorealism" is so hard to achieve in computer graphics: https://www.youtube.com/watch?v=m9AT7H4GGrA

And it indirectly taught me how to use the exposure feature in my iPhone camera (when you tap a point in the picture). The tap lets you choose the "middle gray" point of the picture for the tone mapping process, using your eyes, which have a much greater dynamic range than a CCD sensor. TIL.
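
A minimal sketch of that idea (the function name and the Rec. 709 luma weights are my own illustration, not Apple's actual pipeline): pick an exposure scale so the tapped pixel lands on middle gray before tone mapping.

    import numpy as np

    def exposure_for_tap(rgb, tap_xy, middle_gray=0.18):
        # rgb: (H, W, 3) array of linear scene-referred values
        # tap_xy: the (x, y) pixel the user tapped
        x, y = tap_xy
        # luminance via Rec. 709 weights on linear light
        lum = float(rgb[y, x] @ np.array([0.2126, 0.7152, 0.0722]))
        # scale factor that maps the tapped luminance to middle gray
        return middle_gray / max(lum, 1e-6)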

paulluuk 25 July 2025
I feel like this is very much a personal preference thing. They even called out Horizon Zero Dawn for looking very bad and Zelda for looking very good, while in my opinion the exact opposite is true.
os2warpman 23 hours ago
Why does everything (in big-budget video games) look shiny and wet?

If it is an attempt at realism, reality is not constantly shiny and wet.

If it is a subjective artistic choice, it is objectively wrong and ugly.

Is there an expectation that everything look shiny and wet to make it seem more "dynamic"?

Is it an artists' meme, like the Wilhelm Scream in cinematic sound design?

jillesvangurp 25 July 2025
One game that actually puts a lot of effort into this is X-plane. They use physics-based rendering, and with recent updates they have done quite a bit of work on this (clouds, atmosphere, natural-looking colors and shadows, HDR, etc.).

There's a stark contrast here with MS Flight Simulator which looks great but maybe a bit too pretty. It's certainly very pleasing to look at but not necessarily realistic.

One thing with flying is that visibility isn't necessarily that good and a big part of using flight simulators professionally is actually learning to fly when the visibility is absolutely terrible. What's the relevance of scenery if visibility is at the legal minimums? You see the ground shortly before you land, a few feet in front of you.

And even under better conditions, things are hazy and flat (both in color and depth). A crisp, high contrast, saturated view is pretty but not what a pilot deals with. A real problem for pilots is actually spotting where the airport is. Which is surprisingly hard even when the weather is nice and sunny.

An interesting HDR challenge with cockpits is that the light levels inside and outside are miles apart. When flying in the real world, your eyes compensate for this as you focus on the instruments or look outside. Technically, any screenshot that features a bright outside and clearly legible instruments at the same time is not very realistic, but it is also kind of necessary; you need to do some HDR trickery to make that work. Poor readability of instruments is something X-plane addressed in one of their recent updates. It was technically correct but not that readable.
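
One way to picture that trickery (a hypothetical sketch, not necessarily X-plane's actual method): tone map the frame twice at different exposures and blend by a cockpit mask.

    import numpy as np

    def cockpit_composite(hdr, interior_mask, ev_inside=4.0, ev_outside=0.0):
        # simple Reinhard tone map at a given exposure (in stops)
        def tm(x, ev):
            y = x * (2.0 ** ev)
            return y / (1.0 + y)
        m = interior_mask[..., None]  # 1.0 over the panel, 0.0 out the windows
        return m * tm(hdr, ev_inside) + (1.0 - m) * tm(hdr, ev_outside)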

X-plane rendering has made some big improvements with all this during the v12 release over the last three years.

int0x29 20 hours ago
I suspect contrast in a lot of the games he's skewering is high because they are shootery-type games where players need to see things, understand them, and react to them quickly.

Also I don't necessarily see a need to make everything look like physical film.

SirMaster 21 hours ago
This seems pretty irrelevant now. This article is from 2017, which is before we had proper real HDR support in Windows 10 and the much better HDR support now in Windows 11.

And before we had OLED gaming monitors which can actually now display good HDR at 1000+ nits.

This was definitely during a transitional phase with mostly fake HDR techniques that needed tone-mapping. Now we have real HDR that doesn't need tone-mapping, or only a small amount of tone-mapping above the display peak nits point.
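
That "small amount of tone mapping above the display peak" is typically a soft roll-off; a sketch of one plausible shape (function and parameter names are made up):

    import numpy as np

    def soft_clip(nits, peak=1000.0, knee_frac=0.8):
        # pass values below the knee through untouched, then compress
        # smoothly so output never exceeds the panel's peak brightness
        knee = knee_frac * peak
        over = np.maximum(nits - knee, 0.0)
        span = peak - knee
        return np.minimum(nits, knee) + span * (1.0 - np.exp(-over / span))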

craxmerax 25 July 2025
When HDR is implemented properly, and you have a proper HDR display, it's such a transformative experience! Most games, however, don't have good HDR implementations. And for whatever reason HDR on Windows is still awful in 2025.
can16358p 25 July 2025
HDR is GREAT! Everyone implementing HDR + tone mapping excessively just for the sake of it and exaggerating it to show off (just like those oversaturated Samsung phone screens) is not.
serd 25 July 2025
From an interview with legendary Nintendo designer Gunpei Yokoi and Yukihito Morikawa of MuuMuu:

"Do these playworlds really need to be that photorealistic, I wonder? I actually consider it more of a minus if the graphics are too realistic."

https://shmuplations.com/yokoi/

fabian2k 25 July 2025
For Horizon Zero Dawn I'd argue that the colors are clearly an artistic choice. They're not going for realistic colors at all. And the original game and its sequel do look very, very good.

There do seem to be plenty of issues around HDR for sure, in some games I had to intentionally disable HDR on my PS5 because it just looked bad on my setup.

perching_aix 25 July 2025
Note that this post is of course about high internal dynamic range specifically, and the tonemapping that necessarily follows to present an SDR image; it is not about how modern games do actual HDR output (though at a high level that should be pretty similar, to the extent I understand it).
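
For concreteness, that internal-HDR-to-SDR step can be as small as a one-line curve; a generic Reinhard sketch (mine, not anything from the article):

    import numpy as np

    def tonemap_sdr(hdr, exposure=1.0):
        # hdr: linear scene-referred radiance, values may far exceed 1.0
        x = hdr * exposure
        sdr = x / (1.0 + x)          # Reinhard: maps [0, inf) into [0, 1)
        return sdr ** (1.0 / 2.2)    # approximate sRGB gamma for display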

> In the real world, the total contrast ratio between the brightest highlights and darkest shadows during a sunny day is on the order of 1,000,000:1.

And this is of course silly. In the real world you can have complete darkness, at which point dynamic range shoots up to infinity.

> A typical screen can show 8 (curved to 600:1 or so).

Not entirely sure about this either; monitors have been pulling 1000:1 and 2000:1 contrast ratios since forever, even back in 2017 when this article was written. But maybe I just never looked too deeply into it.
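
For what it's worth, converting the quoted contrast ratios into stops:

    import math

    print(math.log2(1_000_000))  # ~19.9 stops of scene contrast on a sunny day
    print(math.log2(600))        # ~9.2 stops for the article's 600:1 screen
    print(math.log2(1000))       # ~10.0 stops for a typical 1000:1 panel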

PaulHoule 25 July 2025
I was excited when I first heard about HDR, but when I saw the implementation I thought: gee, they're going to screw up both the SDR and the HDR, and that seems to be the case quite often. Going from SD to HD your picture got better (although it often got stretched out), but it's not so clear the HDR version of a movie is really going to be an improvement.
the__alchemist 22 hours ago
Hmm. I like the author's main point that many video games do this unrealistically, but there are a few sticking points that are relevant from the past few years:

  - The omission of discussing HDR monitors, and how you can't really capture that on a screenshot. This is a game changer, especially with new games and monitors.
  - The omission of discussing Unreal 5 games that have come out in the past few years (e.g. Talos Principle 2, Hellblade 2, Stalker 2)
  - Not enough examples of games that do it well, with A/B comparisons of similar settings
  - The Nintendo screenshot as an example of doing things right isn't working for me.
Another interesting example of lighting done well is Kingdom Come: Deliverance 2. The details don't look nearly as nice as, e.g., a UE5 game, and it unfortunately doesn't support monitor HDR, but it has very realistic-looking lighting and scenes.
ChoGGi 25 July 2025
I don't want realistic looking games, I want pretty looking games.

Look at movies that go all in on realism: you can't see anything, you can't hear anything. That's terrible.

viktorcode 25 July 2025
Please watch this for yet another take on the issue: https://www.youtube.com/watch?v=j68UW21Nx6g

The points are: game graphics are indeed suffering, but the problem is not that they are unlike films and photos; it's the opposite. Games should stop using tone mapping curves produced by the film industry and instead create their own, making a clean break.

Personally, I agree with the video.

gampleman 25 July 2025
[2017]
jpadkins 21 hours ago
Path of Exile 2 is a good recent example of a game that does a pretty good job of contrast and tone (staying true to the dark, gritty theme it is going for). I think it was smart of the devs to reserve all the high contrast for effects and lighting.
mg 25 July 2025
I really don't know what to think of HDR.

I have yet to get any benefit out of it.

I disable it everywhere I can. In Instagram for example. When it is turned on (the default) every now and then I get some crazy glaring image in my feed that hurts.

Maybe it is because I don't play games? Is HDR useful anywhere outside of games?

ooterness 15 hours ago
The opposite extreme: physics-based rendering that models the entire optical chain of a film camera, including the emulsion layers.

https://youtu.be/YE9rEQAGpLw?feature=shared

esafak 22 hours ago
The screenshot of "Zelda: Breath of the Wild" the author holds up as an exemplar looks unrealistically tone mapped to me. The "bad" screenshots in the lede look more natural and pleasing. Not realistic (they're too stylized for my taste), but the Zelda screenshot is simply unrealistic in a different direction.
sgarland 25 July 2025
> The exposure level is also noticeably lower, which actually leaves room for better mid-tone saturation.

Decades ago, when I shot film, I remember discovering that I really liked how photos looked when underexposed by half a stop or so. I never knew why (and I wasn’t developing my own film, so I’ve no idea what the processor may have been doing), but I wonder if this was a contributing factor.

sgarland 25 July 2025
This is apparently an unpopular opinion, but in many games (fantasy RPGs come to mind), I like the fake look. It helps it look other-worldly, IMO. I think for something like Flight Sim, I’d prefer photorealism, but otherwise I’m fine with it looking like, well, a video game.

It might be a generational thing, too; I was born in the late 80s, and my formative years were spent playing cartoonish games like Commander Keen, Command & Conquer, etc.

pyrale 25 July 2025
> But all of them feel videogamey and none of them would pass for a film or a photograph. Or even a reasonably good offline render. Or a painting. They are instantly recognizable as video games, because only video games try to pass off these trashy contrast curves as aesthetically pleasing.

The author is fumbling the difference between aesthetics and realism. Videogames feeling videogamey? What a travesty.

gloosx 20 hours ago
All these massive studios with their grand budgets can't make a game that doesn't look like a cartoon.

A real masterpiece of modern graphics is "Bodycam", a game made by two brothers.

altairprime 25 July 2025
It apparently took Mozilla a couple decades to allow displays to present #ff0000 as sRGB red correctly mapped into the display’s LUT, rather than as (100%, 0%, 0%) in the display’s native LUT, which is why for several years anyone using Firefox on a ProPhoto or Adobe RGB or, later, DCI-P3 or BT.2020 display would get eye-searing colors from the web that made you flinch and develop a migraine. It was, I assume, decided that the improper tone mapping curve gave their version of the web more lifelike color saturation than other browsers — at least on their majority platform Windows, which lacked simple and reasonable color management for non-professional users until Windows 11. So Firefox looked brighter, flashier on every shitty Windows display in the world, and since displays were barely capable of better than sRGB, that was good.

Unfortunately, this also meant that Firefox gave eyestrain headaches to every design professional in the world, because our pro color displays had so much more eye-stabbing color and brightness capability than everyone else’s. It sucked, we looked up the hidden preference that could have been flipped to render color correctly at any time, and it was tolerable.

Then Apple standardized DCI-P3 displays on their phones and tablets, where WebKit did the right thing, and on laptops and desktops, where Firefox did not. Safari wasn't yet good enough back then to earn conversions (though certainly it is now), and when people tried to switch from Firefox, the colors looked washed out and bland next to that native display punch. So everyone who surfed the web through that bad LUT experience thought that Apple's displays were too bright (literally, Firefox was jamming 100% phosphor brightness into monitors well in excess of sRGB's specified luminosity range), and they responded by dimming their displays and complaining about Apple.

And one day, Chrome showed up; faster, lighter, and most critically, not migraine inducing. The first two advantages drew people in; the third made them feel better physically.

Designers, professionals, everyone who already had wide-color monitors, and then also students would eventually have discovered (perhaps without ever realizing it!) that with Chrome (and with Safari, if they'd put up with it) they didn't have to dim their monitors, because color wasn't forcibly oversaturated on phosphors that could, at minimum, emit 50% higher nits than the old sRGB-era displays. The web didn't cause eye strain and headaches anymore.

Firefox must have lost an entire generation of students in a year flat, along with everyone in web design, photography, and marketing who could possibly switch. Sure, Chrome was slightly better at the time; but once people got used to normal sRGB colors again, they couldn't switch back to Firefox without everything being garish and bright, and so if they wished to leave Chrome they'd exit to Safari or Opera instead.

I assume that the only reason Firefox finally fixed this was that, a few years ago, the CSS Color v3 specification forcibly engraved that, unless otherwise hinted, #ff0000 is in the sRGB color space and must be rendered as such. That left them no room to argue, and so Firefox finally, far too late to regain its lost web-designer proponents, switched the default.

As the article describes, Nintendo understands this lesson fully, and chose to ship Zelda with artistic color that renders beautifully assuming any crap TV display, rather than going for the contrast- and saturation-maximizing overtones of the paired combination of brighter- and more-saturated- than sRGB that TV manufacturers call HDR. One need only look to a Best Buy TV wall to understand: every TV is blowing out the maximum saturation and brightness possible, all peacocks with their plumage flashing as brightly as possible, in the hopes of attracting another purchase. Nintendo’s behaviors suck in a lot of ways, but their artistic output understands perfectly how to be beautiful and compelling without resorting to the Firefox approach.

(Incidentally, this is also why any site using #rrggbb looks last-century when embedded in, or shown next to, one designed using CSS color(..) clauses. It isn't anything obvious, but once you know how to see it, it's like the difference between 256-color ANSI and 24-bit truecolor ANSI. They're not RGB hex codes; they're sRGB hex codes.)
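
A sketch of the mapping at issue, using the standard D65 matrices from the sRGB and Display P3 specs: mapped correctly, sRGB red lands well inside the P3 gamut rather than on the display's more saturated native primary.

    import numpy as np

    # RGB-to-XYZ matrices (D65 white point) from the sRGB and Display P3 specs
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])
    P3_TO_XYZ   = np.array([[0.4866, 0.2657, 0.1982],
                            [0.2290, 0.6917, 0.0793],
                            [0.0000, 0.0451, 1.0439]])

    def encode(c):  # sRGB-style transfer function, also used by Display P3
        return np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)

    srgb_red = np.array([1.0, 0.0, 0.0])                  # linear #ff0000
    p3_linear = np.linalg.inv(P3_TO_XYZ) @ SRGB_TO_XYZ @ srgb_red
    print(encode(p3_linear))  # ~[0.917, 0.200, 0.139], not the native (1, 0, 0)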

qoez 20 hours ago
Am I taking crazy pills, or are the blacks actually crushed in Zelda but not crushed in Zero Dawn (the opposite of what's stated)?
otikik 21 hours ago
The article lacks examples of what "done right" means. It points to some video games that "do it terribly" (they look OK to me? not photorealistic, but not every video game has to be like that?) but it does not show what "a correct" version of each image would look like; it just says "it looks obviously bad". Sorry, but I don't see it. I'm fine with video games looking videogamey.
freilanzer 25 July 2025
I cannot be the only one who barely notices this in games.
Jyaif 25 July 2025
After reading this article I feel like I learned nothing about what makes HDR good or bad.
lawlessone 19 hours ago
Slightly related, but a lot of newer games now have weird colour filtering and post-processing effects that drive me nuts.
turnsout 21 hours ago
This article is just misinformed. Source: I’ve been working with color space conversion, HDR tone mapping, gamut mapping and “film look” for 20 years.

It’s clear from their critique of the first screenshots that their problem is not with HDR, but contrast levels. Contrast is a color grading decision totally separate from HDR tonemapping.

There’s then a digression about RED and Arri that is incorrect. Even their earliest cameras shot RAW and could be color matched against each other.

Then they assert that tone mapping is hampered by being a 1D curve, but this is more or less exactly how film works. AAA games often come up with their own curves rather than using stock curves like Hable or ACES, and I would assume that they’re often combined with 3D LUTs for “look” in order to reduce lookups.
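
For reference, the Hable (Uncharted 2) curve mentioned here is public and really is just a 1D function applied per channel; a minimal sketch:

    def hable(x):
        # constants published by John Hable for Uncharted 2
        A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
        return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F

    def tonemap(linear, exposure=2.0, white=11.2):
        # normalize so the linear 'white' point maps to display 1.0
        return hable(linear * exposure) / hable(white)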

The author is right about digital still cameras doing a very good job mapping the HDR sensor data to SDR images like JPEGs. The big camera companies have to balance “accuracy” and making the image “pleasing,” and that’s what photographers commonly call their “color science.” Really good gamut mapping is part of that secret sauce. However, part of what looks pleasing is that these are high contrast transforms, which is exactly what the author seems to not like.

They say “we don’t have the technical capability to run real film industry LUTs in the correct color spaces,” which is just factually incorrect. Color grading software and AAA games use the same GPUs and shader languages. A full ACES workflow would be overkill (not too heavy, just unnecessarily flexible) for a game, because you can do full-on cinema color grading on your game and then bake it into a 3D LUT that very accurately captures the look.
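
The bake-and-apply step described there can be sketched as follows (nearest-neighbor lookup for brevity where real pipelines interpolate; full_color_grade is a hypothetical stand-in for the offline grade):

    import numpy as np

    # bake: run the full grade once over an identity lattice
    N = 33
    grid = np.stack(np.meshgrid(*[np.linspace(0, 1, N)] * 3, indexing="ij"), -1)
    # lut = full_color_grade(grid)  # hypothetical; result has shape (N, N, N, 3)

    def apply_lut3d(rgb, lut):
        # rgb in [0, 1]; index the baked lattice per pixel
        n = lut.shape[0]
        idx = np.clip(np.rint(rgb * (n - 1)).astype(int), 0, n - 1)
        return lut[idx[..., 0], idx[..., 1], idx[..., 2]]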

The author then shows a screenshot of Breath of the Wild, which I’m nearly positive uses a global tonemap—it just might not do a lot of dynamic exposure adjustment.

Then they evaluate a few more images before praising a Forza image for being low contrast, which again, has nothing to do with HDR and everything to do with color grading.

Ultimately, the author is right that this is about aesthetics. Unfortunately, there’s no accounting for taste. But a game’s “look” is far more involved than just the use of HDR or tone mapping.

dartharva 25 July 2025
It's not just games, it's regular day-to-day UI too. I'm using an Acer 185Hz VRR HDR10 gaming monitor... on Eco mode with HDR disabled. Everything just looks better with HDR turned off, for some reason I can't explain.
atoav 23 hours ago
As someone who worked a lot in realistic VFX, I concur with the observation that nearly no game is doing tone mapping right, and my guess as to why has always been that doing it right is just very complex.

There are many, many things artists need to do correctly, and many of them have no idea of the whole pipeline. Let's say someone creates a scene with a tree in it. What is the correct brightness, saturation, and gamma of that tree's texture? And if that isn't correct, how could the lighting artist correctly set the light? And if the texture and the light are wrong, the correct tone map will look like shit.

My experience is that you need to do everything right for a good tonemap to look realistic, and that means working like a scientist and having an idea of the underlying physical formulae and the way they have been implemented digitally. That is sadly something not many productions appear to pull off. But if you do pull it off, everything pops into place.
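
A concrete instance of the texture question above is skipping the sRGB decode before lighting; the standard sRGB-to-linear transfer function, as a sketch:

    import numpy as np

    def srgb_to_linear(c):
        # decode an sRGB-encoded albedo texture into linear light;
        # lighting gamma-encoded values is one way the tree's texture
        # ends up with the "wrong" brightness and gamma
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)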

The added complication with games is of course that you can't just optimize the light for one money shot; it needs to look good from all directions. That makes it hard to match a film shot, because optimizing for one angle risks making it look like crap from the others, which studios aren't willing to accept.

The dragon in The Hobbit isn't just about the tonemapping, it is at least as much (if not more so) a lighting issue. But the two can influence each other in a bad way.

fidotron 23 hours ago
All of the "bad" examples look like they're playing on a PC with poorly set gamma curves. Play on a TV where the curves are set up properly, because TV people actually care about color reproduction.