The 100% Objective Review

But what is an objective measure of game quality? You’ll often see things cited like the total hours needed to complete a game, but those are not measures of quality. Enjoyment per hour would be, but then we’re back to the subjective. There isn’t an objective measure of a game being good. You can look at things like framerate, but that still doesn’t measure quality: you can make your game very simplistic and get high FPS, and graphical quality is mostly subjective.

Viewing games outside of their context as a product for entertainment, which is inherently subjective, is always flawed.
Those are in fact all objective measures of a game’s quality. FPS on specific hardware, game length, frequency of crashes, the presence of microstuttering, lists of features: these are all things that can be quantified, and by being quantified they are made objective. You can take this information and compare games against each other to make purchasing decisions, critique them, and so on. Those decisions are subjective, yet they are based on objective data.
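To make that concrete, here’s a toy sketch in Python (the games, numbers, and weights are all made up purely for illustration): the measurements are objective, but the ranking you get out of them depends entirely on how you subjectively weight each one.

```python
# Hypothetical objective measurements for two made-up games.
games = {
    "Game A": {"avg_fps": 120, "crashes_per_hour": 0.5, "hours_to_finish": 10},
    "Game B": {"avg_fps": 60,  "crashes_per_hour": 0.1, "hours_to_finish": 40},
}

def score(metrics, weights):
    """Collapse objective metrics into one number using subjective weights."""
    return (weights["fps"] * metrics["avg_fps"]
            - weights["stability"] * metrics["crashes_per_hour"] * 100
            + weights["length"] * metrics["hours_to_finish"])

# Two reviewers looking at the same data with different (subjective) priorities.
reviewers = {
    "performance fan": {"fps": 1.0, "stability": 0.5, "length": 0.1},
    "long-game fan":   {"fps": 0.1, "stability": 0.5, "length": 3.0},
}

for name, weights in reviewers.items():
    ranked = sorted(games, key=lambda g: score(games[g], weights), reverse=True)
    print(f"{name} ranks them: {ranked}")
```

Same objective data in both cases; the disagreement lives entirely in the weights.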
But I didn’t say that we should only use objective measures to evaluate games, nor do I agree that we can only evaluate games subjectively. We need both, gaming media should give us both, and we all need to be able to distinguish between them.
Yes, those are objective, but if we run a PS2 game on modern hardware it’ll have high FPS. What does that mean for its quality?
There are objective measures, but they’re useless without context, and that context requires subjectivity. Do you like retro aesthetics? Then you may like the PS2-looking game with high FPS. If you don’t, you might not.
Bugs existing is, I guess, a useful objective-ish measure. But it depends on what happens, how often, and when, not just on the number of bugs or the fact that they exist.
I agree we need media looking at both, but purely objective reporting shouldn’t be used to give a game an overall quality rating.
(I’m not arguing with you. I’m pretty much agreeing. I just wanted to clarify what I meant.)