Well not a scam, but now that I'm spending a lot more time writing reviews for gamedev than I have before, there are a couple of conclusions I can reach about the profession.
Namely, that it's 90% a scam that depends more upon the prevailing winds of rumor and innuendo than actual quality. Probably less so in the book and product reviews that I do, because there's a more objective (and easier to follow) standard. If you send me a program and it crashes after 18 minutes, I won't rate it highly.
Movies, though, are a better "petri dish" for observation because they're almost entirely subjective. And here's a bit of statistical analysis already done for me by my pals at Rotten Tomatoes. For the uninitiated, RT gathers up movie reviews, determines if they're mostly-positive or mostly-negative, and gives you a movie's score based on that. While there is some subjectivity in determining whether a review's mostly-positive or mostly-negative, reviews tend to be so polarized that it's not an issue.
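The aggregation scheme described above is simple enough to sketch. This is my own illustration of the general idea, not Rotten Tomatoes' actual code or exact methodology; the function name and inputs are hypothetical:

```python
def tomatometer(reviews):
    """Score a movie RT-style: the percentage of reviews judged
    mostly-positive. `reviews` is a list of booleans, one per review,
    True meaning the review was classified as mostly-positive."""
    if not reviews:
        raise ValueError("need at least one review")
    positive = sum(1 for r in reviews if r)
    return round(100 * positive / len(reviews))

# 87 positive reviews out of 100 yields a score of 87
print(tomatometer([True] * 87 + [False] * 13))  # 87
```

Note that the hard (and subjective) part isn't the arithmetic, it's the classification step that produces those booleans in the first place.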
Now then, case in point. The original "Matrix" is accepted by most to be a good movie, and it correspondingly received a score of 87%. This is to be expected. The movie received good buzz and didn't disappoint.
Now then, the sequel proved problematic for reviewers. The buzz for the movie was still positive, and there was no reason to think that a sequel would be anything but an equal to the original (as Peter Jackson showed). Following a big first weekend, the controversy began. While some liked the sequel, most thought it far inferior to the original. Despite the problems with the sequel, reviewers gave the movie a still-respectable 75%, based more on rumor and expectation than the movie itself.
The third movie, though, was a much easier proposition. Since the second movie had disappointed so many fans, and since the third was filmed at the same time as the second, it was a foregone conclusion that the third movie would be as disappointing as the second. Reviewers weren't caught off-guard this time, and the third movie expectedly received a dismal 36%, despite being so similar to the second that it should've done about as well.
What's my point? It's that movie (or book or product or anything else) reviewers by and large aren't any more an objective source on the subject than you are. Their reviews are as much a reflection of public perception and rumor as of the product itself. If you read a movie review, even one written by a well-respected reviewer, it's likely colored by what he's heard others say about the movie, the perception of other movies of its type, the quality of the stars in the movie, and even the quality of the poster.
It's the same for books. Gamedev gets plenty of book reviews that we don't approve because it's obvious that the reviewer reached his conclusion without reading the book ("Visual Basic SuXxOrS, so go get a C++ book"). My point is that "respected" reviewers are little better than those.
My wife once met a guy who was actually majoring in movie reviewing in college. While talking with her, he kept going on about how much he loved Quentin Tarantino and hated the Coen Brothers. My first thought on hearing this was, "Well then, he's already failed movie-reviewing 101, as his reviews will be forever tainted by his love and/or hate of the moviemakers. A really awful Tarantino film or a really great Coen film won't be reviewed as objectively watchable or unwatchable because of his opinion of the moviemakers."
Fact is, sometimes a great director makes utter crap, and sometimes a hack director makes something really good. Reviewing 101 should be "review the product, not the surroundings," but this kid missed that lesson. To be a reviewer is to try to be one of Heinlein's Fair Witnesses. And while that's a tough and sometimes frustrating standard, it's the standard.
So read your reviews with a grain of salt, but try to write 'em objectively.
And the house is blue on this side.