On Wed, 4 Sep 2024 10:13:42 +0100, JAB <noway@nochance.com> wrote:
On 02/09/2024 18:33, Spalls Hurgenson wrote:
On Mon, 2 Sep 2024 09:49:29 +0100, JAB <noway@nochance.com> wrote:
So first up, 'we' in this context refers to gamers in general and not
this group. With that out of the way, this comes from Spalls'
'Favourite Era of Gaming' thread and something I watched (don't worry
about the video as most of it is irrelevant). Something that was
talked about was whether the current gaming industry is really that
bad, or whether our perception of games is skewed by the information
available to us.
My two cents:
I don't think that the gaming public is too negative. Rather, I think
it's a negative reaction to some awful trends in the industry. There
are lots of examples of gamers being extremely positive about games,
after all. Gamers WANT to love their games, but they're too often
being disappointed by the people selling those games.
<snip>
>
I do agree that there are real problems, especially with the big budget
game segment, but this was more focused on whether 'our' perception of
how bad the games industry is has been biased by the likes of social
media, and I'd also add the hype that publishers, and indeed games
journalists*, push, which then meets the reality of the actual game.
I've still not got over just how disappointed I was when my pre-order
of BioShock arrived and I realised, oh, this is just a shooter in an
underwater city. At least the metal case was nice!
>
If I look at my YouTube feed for games, a lot of it is quite negative,
even if that's because the area where a lot of the problems occur, big
budget games, also happens to be the one that generates clicks. Another
way of looking at it is: if there was less focus on negativity for
clicks, would the overall perception be better? Or, to put it simply,
is the gaming industry as a whole really that bad?
>
*I still get irritated by the steady upward march of the score an
average game can get. 70% is a game that's kinda OK, and 80% is good
but nothing to write home about?
Blame the American school system.
Once upon a time, it was decided that an "average" student should be
able to answer 70% of the problems on a test. If they answered fewer
correctly, that meant the student was struggling; more meant the
student was 'above average'. This decision was rapidly codified into
the grading system every American child becomes familiar with. A grade
of 60% or less was a sign of failure.
Unfortunately, this grading scheme was later translated to OTHER
things, where the '70%' score didn't really make sense. Is an average
game one that got '70%' of things right? How can you objectively
measure this? You can't, of course. There have been games which did
99% of everything right, but failed so spectacularly in one thing that
they were terrible games overall.
Still, the scoring system stuck. An average game was given a 70%
score. Bad games got 50s or 60s. Good games got 90s or 100s. Anything
below 50 became essentially meaningless, because there was no real
difference between a game graded '50%' and '20%'.
The end result was grade inflation, not helped by the smaller 'grading
space' available. Rather than 100 points, reviewers were effectively
limited to a 35-point scale, since scores rarely dropped below the
mid-60s.
[It also doesn't help that a lot of reviewers have been
playing games for decades now, and games are, as a general
rule, just BETTER nowadays than they were in the past.
They've got much better production values, better level
design, better mechanics, and all sorts of quality-of-life
stuff we take for granted. In comparison to 'average'
games from the '90s and 2000s, the average game today _is_
a better experience. So if you gave "Croc 2" a 70% score
in 1999, how can you _not_ give "Yooka-Laylee" an 85%
score in 2017? ;-) ]
Myself, if forced to assign a numerical value to a game, I assume an
'average' game gets a score of 50%; that is, roughly half the games
I've played are better, and half are worse. But honestly, a single
numerical grade for subjectively reviewed material is pretty worthless.