John Gruber, in a parenthetical comment in an article about the horribleness of the Q10:
Why weren’t the gadget site reviews of the Z10 and Q10 more scathing?
Weren’t they? I don’t know, I didn’t read them, but let’s take a look:
- The Verge: “But for the faithful, make no mistake: the Q10 is the ultimate BlackBerry. All paths in BlackBerry’s nearly three-decade history lead to this phone. Happy Bold users will, and should, upgrade in droves. Unfortunately, BlackBerry needs much more than a stop-loss product.” Those are the last sentences of the 7.3 review The Verge gave the Q10 — sounds like they like it a lot. I mean, I didn’t read the rest of the post, just the summary, like every other reader…
- Engadget: Their summary is so all over the board that they don’t even bother to reach a conclusion on the device. Instead they offer up sentences that allow you to draw whatever conclusion you want to draw about the Q10. One paragraph says they would choose the Z10; the next says the Q10 might be better for a certain niche. You’d think it would have been easier to write a definitive answer than to stumble over this many words…
- Trusted Reviews: (I don’t read this site, but the domain forced me to include them.) In their 7.0 review of the Q10 they conclude: “The BlackBerry Q10 will not challenge the Samsung Galaxy S4 or HTC One, but this is not really what it has been designed to do. For business users, the Q10 will be a welcome and giant step up from an aging BlackBerry Bold 9900. It is unlikely to turn around BlackBerry’s fortunes just yet, though.”
All of these left me with this (if I were interested in this phone): “Seven out of ten isn’t bad, but I don’t understand why I should buy the phone. Wait, maybe I shouldn’t. Crap, I don’t know.”
And that’s the point of reviews on major sites like the ones above: they don’t want to tank a rating, and they don’t want to be too opinionated. Doing either would hurt their business.
Yes, it would hurt their business to be honest.
Why? Because if they piss off the company that makes the devices they review, then they may not get press invites or review units — and that may in turn seriously hurt their revenue, since they depend on ad sales. ((Though Trusted Reviews doesn’t look to have many (any?) ads on the site.))
But are the sites wrong to give these devices 7 out of 10s? Amazon lists the international version of the Q10 with 41 user ratings, which average out to 3.5 stars out of 5. Now, I find that once Amazon has 200+ reviews on any item, the reviews are pretty accurate. ((That’s why I only buy things with four or more stars.)) Three and a half out of five, by the way, is the same as seven out of ten.
Now, it’s only 41 reviews, but there are a few other versions of the Q10 that we can add in (I didn’t do the math), and all of those versions have 5 stars. So Amazon reviewers seem to like the device more than the above-referenced “review” sites do.
There’s reason to think that the Q10 is shit and should have been called out on that, but that doesn’t seem to be a universal feeling. As much as I like to give The Verge shit, their rating seems to be tracking with the user ratings on Amazon thus far.
That’s not to excuse their lack of opinion and objective advice — that’s still shitty reviewing — but their ridiculous point ratings seem acceptable in the case of the Q10 when compared with the user ratings on Amazon.