So what's the solution then?
You might think that, given I work for TrustedReviews, a site whose focus, funnily enough, is reviewing products, I'm about to say something to the effect that, actually, there is no way for user-generated content to be considered trustworthy. That such work should be left to professionals like myself and my colleagues here and, I may begrudgingly concede, at other publications.
Well, I'm not, but I need to qualify that. My personal view is that no amount of user reviews can be a substitute for a decent piece of professional editorial content. A reviewer testing in a controlled environment, with no preconceptions and no reason to like or dislike a product, is, to my mind, the best person to make an informed decision about it. Especially since that individual, as reviewers universally do, is looking at many similar products as well, and can thus give that opinion in the context of the rest of the market.
But that doesn't mean I think user commentary is useless, far from it. The trick is using it correctly. The best balance, as I see it, is to attach user commentary to the end of a review proper, supplementing the reviewer's own summation of the product: user remarks which back up or, if appropriate, contradict the reviewer's.
That solution throws up issues all of its own, though. There's a subconscious tendency to assume that anything appearing on a website has been approved by those running it, a sort of unspoken endorsement, if you will, and one that no amount of disclaimers will counteract in some people's minds.
If I'm the average Joe reading, purely as an example, TrustedReviews, and I see user reviews attached to the end of normal product reviews, I get the impression that TR has vetted those comments and that the opinions expressed therein are right. If those comments are factually incorrect, unhelpful or otherwise a negative influence on the site, suddenly you can lose the integrity that lends your reviewers their authority in the first place.
One way to work around the issue is to do what is perceived as being done anyway and prune the undesirable comments. But who decides what gets removed and what gets left? People working for the site? A group of moderators selected from the community? How, then, can readers know that any consensus seen in user remarks isn't being artificially constructed? All tricky questions, and none of them easy to answer comprehensively.
Perhaps you take Digg's route and allow readers to decide which of their peers' opinions they find valuable. But as Digg and YouTube have proved time and time again, there's no consistency with this method: a useful comment might be marked down, while a pointless one is marked up.
A balance between the two would seem optimum, then, with users able to flag to the moderators that another user has made an offensive or pointless contribution that needs removing. Ultimately, now that the industry and web populace have got the idea into their collective heads that user-generated content is the flavour of the month, there will inevitably be a growing trend to include it. Heck, even TrustedReviews has a forum, which is only one step removed from what I've been talking about anyway, and we're in the process of adding a commenting system to our reviews too.
The crux of the issue: just because everyone has an opinion they're willing and able to share doesn't mean you have to accept it - wherever you might read or hear it. Take everything with a pinch of salt. Actually, make that two.