Why even push for more realistic graphics anymore? - eviltoast

I am probably unqualified to speak about this, as I am using a low-profile RX 550 and a 768p monitor and almost never play newer titles, but I want to kickstart a discussion, so hear me out.

The push for more realistic graphics has been going on for longer than most of us can remember, and it made sense for most of that time, as anyone who has looked at an older game can confirm - I am someone who has fun making fun of weird-looking 3D people.

But I feel games’ graphics have reached the point of diminishing returns. Today’s AAA studios spend millions of dollars just to match the graphical level of their previous titles - often sacrificing other, more important things along the way - and people are unnecessarily spending lots of money on power-hungry, heat-generating GPUs.

I understand getting an expensive GPU for high-resolution, high-refresh-rate gaming, but for 1080p? You shouldn’t need anything more powerful than a 1080 Ti for years. I think game studios should slow down their graphical improvements, as they are unnecessary - in my opinion - and only prevent people with lower-end systems from enjoying games. And who knows, maybe we will start seeing 50-watt gaming GPUs that are viable, capable of running games at medium/high settings, and going for cheap - even iGPUs render good graphics now.

TLDR: why pay more and hurt the environment with higher power consumption when what we have is enough - and possibly overkill?

Note: it would be insane of me to claim that there is not a big difference between the two pictures - Tomb Raider (2013) vs. Shadow of the Tomb Raider (2018) - but can you really call either of them bad, especially the right one (5 years old)?

Note 2: this is not much more than a discussion starter, and it is unlikely to evolve into something larger.

  • LanAkou@lemm.ee · 88 points · 1 year ago

    Is it diminishing returns? Yes, of course.

    Is it taxing on your GPU? Absolutely.

    But, consider Control.

    Control is a game made by the people who made Alan Wake. It’s a fun AAA title that is better than it has any right to be. Packed with content. No microtransactions. It has it all. The only reason it’s as good as it is? Nvidia paid them a shitload of money to put raytracing in their game to advertise the new (at the time) 20 series cards. Control made money before it even released thanks to GPU manufacturers.

    Would the game be as good if it didn’t have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn’t be as good if Nvidia hadn’t paid them, and that means raytracing has to be included.

    A lot of these big-budget AAA “photorealism” games for PC are funded, at least partially, by Nvidia or AMD. They’re the games you’ll get for free if you buy their new GPU that month. Consoles are the same way. Did Bloodborne need to have shiny blood effects? Did Spider-Man need to look better than real-life New York? No, but these games are made to sell hardware, and the tradeoff is that the games don’t have to make piles of money (even if some choose to include mtx anyway).

    Until GPU manufacturers can find something else to strive for, I think we’ll be seeing these incremental increases in graphical fidelity, to our benefit.

    • Kir@feddit.it · 45 points · 1 year ago

      This is part of the problem, not a justification. You are saying that companies such as Nvidia have so much power/money that the whole industry must waste effort making more demanding games just to keep their products relevant.

      • RiikkaTheIcePrincess@kbin.social · 19 points · 1 year ago

        Right? “Vast wealth built on various forms of harm is good actually because sometimes rich people fund neat things that I like!” Yeah sure, tell that to somebody who just lost their house to one of the many climate-related disasters lately.

        I’m actually disgusted that “But look, a shiny! The rich are good actually! Some stupid ‘environment’ isn’t shiny cool like a videogame!” has over fifty upvotes to my one downvote. I can’t even scrape together enough sarcasm at the moment to bite at them with. Just… gross. Depressing. Ugh.

      • LanAkou@lemm.ee · 2 points · 1 year ago

        Yeah, it’s too bad that that’s always true for every game and not just ~30 AAA titles a year.

        Too bad Dave the Diver, Dredge, and Silksong will never get made 😔

    • jmcs@discuss.tchncs.de · 27 points · 1 year ago

      So the advantage is that it helps create more planned obsolescence and make sure there will be no one to play the games in 100 years?

      • LanAkou@lemm.ee · 23 points · 1 year ago

        Is that a real question? Like, what are we even doing here?

        The advantage is that game companies are paid by hardware companies to push the boundaries of gamemaking, an art form that many creators enjoy working in and many humans enjoy consuming.

        “It’s ultimately creating more junk so it’s bad” what an absolutely braindead observation. You’re gonna log on to a website that’s bad for the environment from your phone or tablet or computer that’s bad for the environment and talk about how computer hardware is bad for the environment? Are you using gray water to flush your toilet? Are you keeping your showers to 2 minutes, unheated, and using egg whites instead of shampoo? Are you eating only locally grown foods because the real Earth killer is our trillion dollar shopping industry? Hope you don’t watch TV or go to movies or have any fun at all while Taylor Swift rents her jet to Elon Musk’s 8th kid.

        Hey, buddy, Earth is probably over unless we start making some violent changes 30 years ago. Why would you come to a discussion on graphical fidelity to peddle doomer garbage? Get a grip.

    • coyotino [he/him]@beehaw.org · 12 points · 1 year ago · edited

      Would the game be as good if it didn’t have raytracing? Well, technically yes. You can play it without raytracing and it plays the same. But it wouldn’t be as good if Nvidia hadn’t paid them, and that means raytracing has to be included.

      To place another point on this: Control got added interest because its graphics were so good. Part of this was Nvidia providing marketing money that Remedy didn’t have before, but I think the graphics themselves helped this game break through to the mainstream in a way that their previous games did not. Trailers came out with these incredible graphics, and critics and laygamers alike said “okay, I have to check this game out when it releases.” Now, that added interest would mean nothing if the game wasn’t also a great game beyond the initial impressions, but that was never a problem for Remedy.

      For a more recent example, see Baldur’s Gate 3. Larian plugged away at the Divinity: OS series for years, and they were well-regarded, but I wouldn’t say that they quite hit “mainstream”. Cue* BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics. The actual gameplay is not dramatically different from the Divinity games, but the added graphics made people go “this is a Mass Effect” and suddenly this is the biggest game in the world.

      We are definitely at a point of diminishing returns with graphics, but it cannot be denied that high-end, expensive graphics drive interest in new game releases, even if those graphics are not cutting-edge.

      • entropicdrift@lemmy.sdf.org · 7 points · 1 year ago

        This comment is 100% on-point, I’m just here to address a pet peeve:

        Queue = a line that you wait in, like waiting to check out at the store

        Cue = something that sets off something else, e.g. “when Jimmy says his line, that’s your cue to enter stage left”

        So when you said:

        Queue BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics.

        What you meant was:

        Cue BG3, where Larian got added money from Wizards of the Coast that they could invest into the graphics.

      • dmrzl@programming.dev · 4 points · 1 year ago

        “We are definitely at a point of diminishing returns with graphics,”

        WTF. How can you look at current NeRF developments and honestly say that? This has never been further from the truth in the history of real-time graphics. When have we ever been so close to unsupervised, ACTUALLY photo-realistic (in the strict sense) graphics?

    • MBM@lemmings.world · 6 points · 1 year ago

      I played Control on a 10-year-old PC. I’m starting to think I missed out on something.

      • Blackmist@feddit.uk · 4 points · 1 year ago

        I played it on PS5 and immediately went for the higher frame rate option instead.

        I think Ghostwire Tokyo was a much better use of RT than Control.

        Also found Ratchet and Clank to be surprisingly good for 30fps. I can’t put my finger on exactly what was missing from the 60fps non-RT version, but it definitely felt lesser somehow.

    • HTTP_404_NotFound@lemmyonline.com · 2 points · 1 year ago

      Until GPU manufacturers can find something else to strive for

      Machine Learning / AI has been a MASSIVE driver for Nvidia, for quite some time now. And bitcoin…

  • MentalEdge@sopuli.xyz · 52 points · 1 year ago · edited

    Shadow can definitely look a lot better than this picture suggests.

    The biggest advancements in game graphics have not occurred in characters, except for perhaps in terms of animation and subsurface scattering tech.

    The main character always gets a disproportionate graphical resource allocation, and we achieved “really damn good” in that category a while ago.

    Adam Jensen didn’t look that much better in Mankind Divided than he did in Human Revolution, but Prague IS SO MUCH MORE DETAILED than Detroit was.

    Then there are efficiency improvements in rendering brought by systems like Nanite, material shader improvements, more detailed lighting systems and more efficient ambient occlusion.

    Improvements in inverse kinematics are something I’m really excited about as well.

      • MentalEdge@sopuli.xyz · 6 points · 1 year ago

        It is. Adam works for the secret underground Interpol base in the middle of the city. There are abusive secret societies to dismantle, murder cases to solve, drug rings to bust, corrupt cops to beat up. Mankind Divided is a prime example of a medium-sized but super-detailed hub world being just as good as, if not better than, one that’s huge and full of nothing.

    • Doods@infosec.pub (OP) · 2 points · 1 year ago · edited

      I cannot elaborate on that, as I am unqualified - remember, I have never played newer titles.

      • MentalEdge@sopuli.xyz · 15 points · 1 year ago · edited

        My main point is that a headshot of the main character is not a good yardstick. The mc is always going to be rendered with enough oomph to look good, no matter the settings or game generation.

        The difference in recent years has been in environment detail and material shading, lighting - things you maybe can’t even enable due to playing on older hardware.

        While I agree ray tracing is a total energy hog, that’s not the only area seeing advancement. Rendering pipelines like Nanite enable more graphics AND less power consumption.

      • MrZee@lemm.ee · 7 points · 1 year ago

        Three thoughts:

        1. I wonder if you would still have this take if you played a newer, high quality AAA game on a high end setup. I don’t mean to imply that your mind will definitely be blown — really don’t know — but it would be interesting to see what doing so would do to your opinion.

        2. Gaming is about entertainment. There is no denying that better/bigger/smoother/more immersive tends to add to the entertainment. So devs push those boundaries both for marketing reasons and because they want to push the limits. I have a hard time seeing a world in which gaming development as a whole says “hey, we could keep pushing the limits, but it would be more environmentally friendly and cheaper for our customers if we all just stopped advancing game performance.”

        3. There are SO MANY smaller studios and indie devs making amazing games that can run smoothly on 5/10/15 year old hardware. And there is a huge number of older games that are still a blast to play.

      • MentalEdge@sopuli.xyz · 6 points · 1 year ago

        Another point in favour of new graphics tech: you mentioned you’re worried about artists needing to do more work. As someone who has done 3D work, I can tell you that it’s actually easier to make something photo-real. The hard part is making it look good within the limitations of a game engine - getting something that looks just as good with simpler material shaders and fewer polygons.

        Tech like Nanite actually eliminates the need for all that work. You can give the game engine the full-quality asset, and it handles all the difficult stuff needed to render it efficiently. This is why we are now seeing games that look as good as Unrecord coming from tiny new studios like DRAMA.
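
        To make that concrete, here’s a rough sketch (plain Python, not engine code) of the kind of hand-tuned LOD bookkeeping that traditionally has to be authored per asset - the mesh, triangle counts and distance thresholds are all made up for illustration - which is exactly the busywork a virtualized-geometry system like Nanite is meant to take off your hands:

        ```python
        from dataclasses import dataclass

        @dataclass
        class LodLevel:
            triangles: int       # polygon budget of this hand-authored LOD
            max_distance: float  # farthest camera distance at which it is used

        # Traditionally an artist authors and tunes several versions of every asset.
        STATUE_LODS = [
            LodLevel(triangles=500_000, max_distance=5.0),    # hero / close-up mesh
            LodLevel(triangles=50_000,  max_distance=25.0),   # mid-range mesh
            LodLevel(triangles=5_000,   max_distance=1000.0), # background mesh
        ]

        def pick_lod(lods, camera_distance):
            """Classic distance-based LOD selection: first level whose range covers us."""
            for lod in lods:
                if camera_distance <= lod.max_distance:
                    return lod
            return lods[-1]  # past the last threshold, keep the cheapest mesh

        for d in (2.0, 30.0, 250.0):
            print(f"{d:6.1f} m -> {pick_lod(STATUE_LODS, d).triangles:,} triangles")
        ```

        With a Nanite-style pipeline you skip authoring that table entirely and just hand the engine the 500k-triangle mesh.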

    • brennesel@feddit.de · 14 points · 1 year ago

      I totally agree. And I would add some of my favorite games like Outer Wilds, Satisfactory, or The Witness to the list that look great but don’t try to be realistic. Their art style only serves the purpose of their respective core gameplay.

    • Send_me_nude_girls@feddit.de · 4 points · 1 year ago

      I see where you’re coming from, but I don’t agree - well, at least not anymore. I used to. Thing is, we’ve reached a point, for me personally, where an old game doesn’t necessarily look bad anymore, as my brain will fill in the gaps, even as an adult. I can hardly do that with any of the really old games, as they lack polygons and details, but anything more modern is good enough. Art style is more important than graphics quality. And art style doesn’t exclude realism - the same way a real-life thing can look good or bad, even though both have “real-world graphics”, if you will.

      I also don’t think it’s as easy as making every game comic-style or pixel art; for some genres it simply doesn’t work, as part of the gameplay is immersion in the world. Thankfully, though, we’ve reached a point where even indie developers can use something like Unreal Engine or Unity to create high-quality games. In the future this will advance further thanks to AI animation, voiceover, textures and even models.

  • mcforest@kbin.social · 43 points · 1 year ago

    The thought that today’s state of technology is enough and we should stop improving sounds pretty Amish to me.

      • verbalbotanics@beehaw.org · 22 points · 1 year ago

        Luddites, the original ones, were pretty rad. They were anti-tech for anti-capitalist reasons.

        I agree that Luddite is the more correct term since it’s more general now, but I hate that the term got warped over time to mean anyone who hates any new tech.

    • Doods@infosec.pub (OP) · 15 points · 1 year ago

      I didn’t mean we should stop improving; what I meant is we should focus more on efficiency and less on raw power.

      • mcforest@kbin.social · 11 points · 1 year ago · edited

        I think game and engine developers should do both. If it’s possible to improve efficiency and performance it should be done. But at the same time hardware is improving as well and that performance gain should be used.

        I’m a bit worried about the recent developments in hardware, though. At the moment GPU power mostly increases with energy consumption and only a little with improved architecture. That was different some years ago. But in my eyes that’s a problem the hardware manufacturers have, not the game developers.

        • icesentry@lemmy.ca · 8 points · 1 year ago · edited

          Performance is always about doing as much as possible with as little as possible. Making a game run faster automatically makes it more efficient, because the only way it can run faster is by doing less work. It’s just that when it can run faster, the game has more room for other things.

        • Square Singer@feddit.de · 2 points · 1 year ago

          Its resource consumption and graphics output are directly linked. If you gain more efficiency, that gives you more headroom to either reduce resource consumption or increase the graphics output. But you can’t maximize both. You have to decide what to do with the resources you have: use them, or not use them. You can’t do both at the same time.

    • Wahots@pawb.social · 2 points · 1 year ago

      To kinda reiterate the point in a different light, I’d like more games with cel shading, which can look somewhat timeless if done right. I think Risk of Rain 2 is cel-shaded, and it looks fantastic while also not being the centerpiece of the game.

  • usrtrv@lemmy.ml · 26 points · 1 year ago

    I understand the sentiment, but it seems like you’re drawing arbitrary lines in the sand for what the “correct” amount of power for gaming is. Why waste 50 watts of GPU (or more like 150 total system watts) on a game when something like a Steam Deck will draw 15 watts to do almost the same thing? Ten times less power for definitely not ten times less fidelity. We could go all the way back to the original Game Boy at 0.7 watts; the fidelity drops, but so does the power. What is the “correct” wattage?
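
    Just to put the arithmetic on the table (the wattages are the ones quoted above; the “fidelity” scores are completely made-up placeholders, which is kind of the point - there is no objective scale to divide by):

    ```python
    # Rough figures from the comment above; "fidelity" is an invented 0-100 score.
    systems = {
        "desktop PC": {"watts": 150.0, "fidelity": 100},
        "Steam Deck": {"watts": 15.0,  "fidelity": 80},
        "Game Boy":   {"watts": 0.7,   "fidelity": 5},
    }

    baseline = systems["desktop PC"]
    for name, s in systems.items():
        power_ratio = baseline["watts"] / s["watts"]
        fidelity_ratio = baseline["fidelity"] / s["fidelity"]
        print(f"{name:10s}: {power_ratio:6.1f}x less power, "
              f"{fidelity_ratio:4.1f}x less fidelity")
    ```

    Wherever you put the cut-off, it’s a line somebody drew, not one the hardware drew.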

    I agree that the top-end GPUs are shit at efficiency and we could cut back. But I don’t agree that fidelity and realism should stop advancing. Some type of efficiency requirement would be nice, but every year games should get more advanced and every year GPUs should get better (and hopefully stay efficient).

    • iminahurry@discuss.tchncs.de · 2 points · 1 year ago

      I agree that the top-end GPUs are shit at efficiency and we could cut back.

      According to Steam survey, 4090, 3090, 6900XT, and 7900 XTX combined are being used by about 1.7% of gamers.

      This number is, of course, inflated (at least slightly) because people who have money to buy these cards are also more likely to buy games and people owning older/cheaper cards are more likely to be playing pirated copies.

      The top-tier cards are a showcase of technological advancement. They are not really used by a large number of people, so there’s not much point in cutting them back. It would only lower the baseline for the next generation, leading to less advancement.

      • usrtrv@lemmy.ml · 2 points · 1 year ago

        That’s a very good point, but a little misleading. A better number would add up all the top-tier cards from every generation, not just the past two. Just because they’re old doesn’t mean they weren’t relatively inefficient for their generation.

        If we kept the generations exactly the same but got rid of the top one or two cards, technological advancement would be happening just as fast. Because really, the top-tier cards are about the silicon lottery and putting in as much power as possible while keeping stable clocks. They aren’t different from an architecture perspective within the same generation. It’s about being able to sell the best silicon and more VRAM at a premium.

        But as you said, it’s still a drop in the bucket compared to the overall market.

    • Doods@infosec.pub (OP) · 2 points · 1 year ago

      I agree that I shouldn’t have set the arbitrary 50-watt figure; I just looked at my GPU and bigger ones and came up with that number.

    • Doods@infosec.pub (OP) · 7 points · 1 year ago · edited

      I don’t think higher graphics requirements hurt creativity - you can have an unrealistic-looking game that is very GPU-intensive. I was mainly concerned about the costs and wasted money/effort.

      But lowering the graphics budget - and the budget in general - can make creativity/risk-taking a more appealing option for AAA studios.

      Edit: I just noticed the two sentences kind of contradict each other, but you get the point.

  • Phen@lemmy.eco.br · 22 points · 1 year ago

    Because it impresses people and so it sells. If they didn’t do that, all those EAs and Ubisofts would have to find a new selling point like making their games good or something.

  • Renacles@discuss.tchncs.de · 21 points · 1 year ago

    I’ve been honestly blown away with how newer games look since I upgraded my graphics card.

    Remnant 2 is not even a AAA game but does such a good job with light and reflections that it looks better than anything released 5+ years ago.

    Then you have games like Kena: Bridge of Spirits, which have a very nice art style but take advantage of current hardware to add particles everywhere.

  • SenorBolsa@beehaw.org · 20 points · 1 year ago · edited

    I think in some cases there’s a lot of merit to it - take Red Dead Redemption, for example: both games are pretty graphically intensive (if not cutting edge), but it’s used to further the immersion of the game in a meaningful way. Red Dead Redemption 2 really sells this rich natural environment for you to explore and interact with, and it wouldn’t quite be the same game without it.

    Also that example of Tomb Raider is really disingenuous, the level of fidelity in the environments is night and day between the two as well as the quality of animation. In your example the only real thing you can tell is the skin shaders, which are not even close between the two, SotTR really sells that you are looking at real people, something the 2013 game approached but never really achieved IMO.

    If you don’t care, then good for you! My wallet wishes I didn’t, but it’s a fun hobby nonetheless to try and push things to their limits, and I am personally fascinated by the technology. I always have some of the fastest hardware every other generation, and I enjoy playing with it and doing stuff to make it all work as well as possible.

    You are probably correct in thinking that for the average person we are approaching a point where they just really don’t care. I just wish they would push for more clarity in image presentation at this point; modern games are a bit of a muddy mess sometimes, especially with FSR/DLSS.

    It mattered a lot more early on, because doubling the polygon count on screen meant you could do a lot more gameplay-wise: larger environments, more stuff on screen, etc. These days you can pretty much do what you want if you are happy to drop a little fidelity in individual objects.

    • Thrashy@beehaw.org · 6 points · 1 year ago · edited

      Also that example of Tomb Raider is really disingenuous, the level of fidelity in the environments is night and day between the two as well as the quality of animation. In your example the only real thing you can tell is the skin shaders, which are not even close between the two, SotTR really sells that you are looking at real people, something the 2013 game approached but never really achieved IMO.

      I’ve noticed this a lot in comparisons claiming to show that graphics quality has regressed (either over time, or from an earlier demo reel of the same game), where the person trying to make the point cherry-picks drastically different lighting or atmospheric scenarios that put the later image in a bad light. Like, no crap Lara looks better in the 2013 image, she’s lit from an angle that highlights her facial features and inexplicably wearing makeup while in the midst of a jungle adventure. The Shadow of the Tomb Raider image, by comparison, is of a dirty-faced Lara pulling a face while being lit from an unflattering angle by campfire. Compositionally, of course the first image is prettier – but as you point out, the lack of effective subsurface scattering in the Tomb Raider 2013 skin shader is painfully apparent versus SofTR. The newer image is more realistic, even if it’s not as flattering.
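
      For anyone wondering what “effective subsurface scattering” actually buys a skin shader, here’s a toy sketch of one of the cheapest tricks in that family, diffuse wrap lighting - a generic illustration, not the shader either game actually uses:

      ```python
      import math

      def lambert(n_dot_l):
          """Plain Lambertian diffuse: light cuts off hard at the shadow terminator."""
          return max(n_dot_l, 0.0)

      def wrapped(n_dot_l, wrap=0.5):
          """Diffuse wrap lighting, a cheap stand-in for subsurface scattering:
          light bleeds a bit past the terminator, softening the shadow edge on skin."""
          return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

      # Sweep the angle between the surface normal and the light direction.
      for angle in (0, 60, 90, 110, 130):
          n_dot_l = math.cos(math.radians(angle))
          print(f"{angle:3d} deg: lambert={lambert(n_dot_l):.2f}  wrapped={wrapped(n_dot_l):.2f}")
      ```

      That bit of light leaking past 90 degrees is the soft, slightly translucent look the newer skin shaders are selling.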

    • iminahurry@discuss.tchncs.de · 2 points · 1 year ago

      I’m someone who doesn’t care about graphics a whole lot. I play most modern games at 1080p Mid/high on my RTX 3060.

      And yet, I totally agree with your points. Many times, older games had rich-looking environments from a distance, but if you went close or tried to interact with them, it just broke the illusion. Like, leaves can’t move independently, or plants just don’t react to your trampling them, etc.

      A lot of graphical improvements are also accompanied by improvements in how elements interact with other elements in the game. And that definitely adds to the immersion, when you can feel like you’re a part of the environment.

  • GreenMario@lemm.ee · 20 points · 1 year ago

    I like seeing advances in graphics technology, but if the cost is a 10-year dev cycle and the game still comes out s-s-s-stuttering on high-end PCs and current-gen consoles, then scale back some.

    I think we’ve hit a point where it’s just not feasible to do it anymore.

    • DaSaw@midwest.social · 8 points · 1 year ago

      I agree with everything he said. But I’ve also been saying things like that for thirty years. I remember complaining, when Morrowind came out, about companies using extra processing power for shitty 3D graphics instead of sticking with high-quality 2D that worked perfectly fine, and putting that extra processing power to work on better AI or something.

      I think the problem is that better graphics are the one thing they can do that will please a mass audience. Sure, there are plenty of other things they could be doing, but I would bet that each of them has a niche appeal with fewer fans to spread the cost among. Thus producers of “AAA” titles pretty much by definition have to pursue that mass audience. The question is when they reach that point of diminishing returns and it becomes more profitable to produce lower-cost niche titles for smaller audiences. And we also have to factor in that part of that “profit” is pleasing the assumption our society has that anything with niche appeal is necessarily “lower” in status than mass-appeal stuff.

      I think we are approaching that point, if we haven’t already reached it. Indie stuff is becoming more and more popular, and more prevalent. It’s just hard to tell because indie stuff tends to target a smaller but more passionate audience. For example, while I am looking forward to trying Starfield out, I may be too busy playing yet more Stardew Valley to buy it right away, and end up grabbing it in a sale. (I haven’t even really checked if it’ll run on my current gaming laptop.)

  • magnetosphere @beehaw.org · 18 points · 1 year ago · edited

    Games don’t need better, more complex graphics. They need adequate time and resources during the development process. They need to actually be completed by their release date, not just barely playable. They need to be held to a higher standard of quality when publishers judge if they’re ready to sell.

  • Stuka@lemmy.ml · 18 points · 1 year ago

    I’m with you. I barely notice the changes in graphics, just the increase in my GPU fan speeds over the years.

    I’m more interested in games with graphics that look good enough, but which do more interesting things with the extra horsepower we have these days.

  • sculd@beehaw.org · 17 points · 1 year ago

    Pushing for even more realistic graphics will make the cost of making games even higher, with no significant change in how much players enjoy them.

    Players enjoyed games when we had Super Nintendos and DOS games. They actually gave players more room for imagination.

    • drewdevorcula@beehaw.org · 7 points · 1 year ago

      So… this is only partially correct IMHO.

      Yes, it will continue to be expensive for the studios that push the envelope. However, as those studios continue to invest large amounts of cash, smaller studios are continually getting access to better and better tools because of it. That means a small studio can create something that is not quite as good as what the major studios make, but still very competitive, and for significantly less money.

      As technology progresses, last-year’s tech will always fall in price.

      As to the point of enjoying Super Nintendo and DOS games, sure. Much of that style has returned in the form of pixel art games and what have you. But the conservative viewpoint of ‘8-bit was good enough in my day, why improve on it’ is just short-sighted in my opinion. Why keep pumping out Atari-grade stuff when so much more is possible? Why not advance and improve?

  • Kir@feddit.it · 17 points · 1 year ago

    Everything is ruined by marketing and its capitalist roots, and game development is no exception.

    They push for fidelity just because it sells well; the fact that this creates the need for much more powerful hardware is not a drawback for them. It’s actually good, since it’s something they can profit on.

    Games need artistic direction and vision, much more than they need photorealism (which is great for some kind of games, but not a universal standard).

    • Micromot@lemmycook.de · 6 points · 1 year ago

      Photorealism really only fits a few genres; almost all indie games have some kind of interesting art style, like The Long Dark, for example.