HotBlonde
Joined: Feb 8, 2011
  • Threads: 96
  • Posts: 1573
July 21st, 2013 at 7:09:55 PM permalink
So, I've been wondering about this for a long time and was hoping someone on here could help.

Before making a purchase that is important to me, I like to look at the online reviews and ratings given by people who purchased and used the product before me. The product will have an average rating (usually from 1 to 5 stars) along with the number of people who submitted ratings. Obviously an average 5-star rating is better than an average 1-star rating, but how can I calculate the STRENGTH of the rating, taking into consideration both the rating itself and the number of people who submitted it?

Just an example: A product with an average 4.5-star rating may look better than a product with an average 4.0-star rating, but if the 4.5-star rating came from only 2 people (say, one 4-star rating and one 5-star rating) and the 4.0-star rating came from 300 people, couldn't the 4.0-star product actually be the higher-rated product? If the 4.5-star product got up to the same number of ratings (300), it could well end up with a lower average than the 4.0-star product.

I'm looking for an actual formula that I can use to calculate the strength of the rating, using (average rating) and (number of people/submissions) in the formula. I'm not looking for general subjective answers such as "Well, just kinda use your intuition"; I'm looking for a concrete mathematical formula. I can see why people might say something like that, but consider another example: if one product has an average 3.75-star rating from 435 people and another has the same 3.75-star average from only 85 people, it is apparent that the first product's rating is stronger. Hence my seeking an OBJECTIVE answer in the form of a mathematical formula.

If anyone knows the formula to calculate this please let me know. Thanks in advance.
OFFICIALLY and justifiably reclaimed my title as SuperHotBlonde!
onenickelmiracle
Joined: Jan 26, 2012
  • Threads: 181
  • Posts: 6998
July 21st, 2013 at 7:29:29 PM permalink
You can never be sure the reviewers are even real and not plants, so the whole exercise seems pointless.
Winter's hope for us, a sunny day fools us, dreams die with a lie. They tried to kill us, Jimmy. They did. They're dirty f**king cops! *photo is not of an AP.
rdw4potus
Joined: Mar 11, 2010
  • Threads: 80
  • Posts: 6809
July 21st, 2013 at 7:41:47 PM permalink
Quote: onenickelmiracle

You can never be sure the reviewers are even real and not plants, so the whole exercise seems pointless.



Yeah, there's that, and even if they're real, they don't have equal value. Some dude is going to give a perfect score because a crap product has a good brand, and somebody else is going to give a 0 rating with a comment that says "I love everything about this product, but it only comes in blue!"

But, if I were going to undertake this exercise, I'd borrow a method from political polling and look at the variance of each product's rating. A lower variance indicates a higher rating stability. So you can tell the difference between a product where almost everyone gave a 4 star review and a product where the reviews are all over the place and happen to average 4 stars. I'm not sure if I would, but a person could theoretically assign a higher level of confidence in the product with more consistent reviews.
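That variance idea is easy to try if a site shows the full star breakdown. A minimal sketch in Python, using two made-up rating lists (the numbers are invented for illustration):

```python
from statistics import mean, pvariance

# Two hypothetical products that both average exactly 4.0 stars.
consistent = [4] * 8 + [3, 5]        # almost everyone gave 4 stars
polarized = [5] * 6 + [2, 2, 3, 3]   # same mean, reviews all over the place

for name, ratings in (("consistent", consistent), ("polarized", polarized)):
    print(f"{name}: mean {mean(ratings):.2f}, variance {pvariance(ratings):.2f}")
```

Both products average 4.0 stars, but the variance (0.20 vs. 1.60) flags the second one as far less stable, which is exactly the distinction being described.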
"So as the clock ticked and the day passed, opportunity met preparation, and luck happened." - Maurice Clarett
rxwine
Joined: Feb 28, 2010
  • Threads: 156
  • Posts: 8206
July 21st, 2013 at 9:51:11 PM permalink
Quote: onenickelmiracle

You can never be sure the reviewers are even real and not plants, so the whole exercise seems pointless.



I read the reviews with this in mind. Generally, I look for usage details. For instance: "When I snapped the lid on, it still kept coming off," or something like that. It could be faked, but at least it sounds like the fake reviewer is working hard to make it sound true.
prisoner of gravity
FleaStiff
Joined: Oct 19, 2009
  • Threads: 259
  • Posts: 13101
July 22nd, 2013 at 3:59:27 AM permalink
You need only look at hotel reviews to see how some people will carp about minor things while others will overlook major points. And many sites run active management programs to monitor cyberspace comments and create positive posters.

Eons ago a sporting goods store made certain its salesmen all used the products they sold, yet that store fell by the wayside in the early days of cyber marketing where actual knowledge of a product is less important than apparent and recent knowledge of it.

Those NY photography sites are examples of stores closely run by the vendors: if you try to order just the sale item with no extras, they deem your transaction suspicious. So they advertise bargains but refuse to sell them.

The most rigorous comparison shopping takes place in travel medicine where people trek to exotic locales for a vacation and have some elective surgical procedure performed, but often the price tag starts at twelve grand.

Look at evaluation of vacation resorts... totally arbitrary.
JIMMYFOCKER
Joined: Jan 24, 2011
  • Threads: 17
  • Posts: 540
July 22nd, 2013 at 6:11:52 AM permalink
Go to the Fat Wallet website and ask Joey
dwheatley
Joined: Nov 16, 2009
  • Threads: 25
  • Posts: 1246
July 22nd, 2013 at 8:54:08 AM permalink
I think I can adapt a formula for comparing sample means from a Statistics text book I have. Unfortunately, I won't be in the same room as this textbook until, let's say, Wednesday. Hopefully I remember to check then.

It also may be a bust without the sample variance, but that would be hard to calculate from an online rating site.
Wisdom is the quality that keeps you out of situations where you would otherwise need it
Nareed
Joined: Nov 11, 2009
  • Threads: 373
  • Posts: 11413
July 22nd, 2013 at 11:45:26 AM permalink
Quote: rxwine

I read the reviews with this in mind. Generally, I look for usage details.



That's what I do, too. It's not so important that someone disliked a product, but why they disliked it. Generally I focus on negative reviews just like that.
You can visit my blog Kathy's Cooking Corner at kathyscookingcorner.blogspot.mx ... .... When someone offers you friendship with one hand and stabs you in the back with the other, you tend to notice the knife a little bit more.
P90
Joined: Jan 8, 2011
  • Threads: 12
  • Posts: 1703
July 24th, 2013 at 2:33:13 AM permalink
Quote: HotBlonde

Before making a purchase that is important to me I like to look at the reviews and ratings online that people who have purchased and used the product before me have given. The product will give an average rating (usually between 1 to 5 stars) and will show the number of people who had submitted these ratings. Obviously an average 5-star rating is better than an average 1-star rating, but how can I calculate the STRENGTH of the rating taking both the rating itself into consideration along with the number of people who submitted these ratings?


For practical purposes it's very simple.
If the product averages a 1- to 3-star rating, there's probably something fundamentally wrong with it. Read the reviews to find out.
If the product averages a 4- to 5-star rating, it's probably a working product with its ups and downs. Read the reviews to see what they are.

User ratings are meaningless whether you have 10 or 1,000. Many people even rate a product before actually using it, because if they're using it and it works, they have no reason to go back to that web page. Following user ratings is a sure way to buy middle-of-the-road stuff at worst, and at very best to overpay for the third best instead of the best.

If the kind of product you're looking for has any website seriously reviewing it, look there. This includes hand tools, power tools, electronics, sporting goods, vehicles...
Anything that has enthusiasts about it, anything that serves a production, safety or entertainment purpose, and generally anything that you normally notice when using it, will have independent websites reviewing it. Comparing the ratings even from a mediocre review website will be far more informative than comparing user ratings.


Quote: HotBlonde

I can see why people could say something like that, but it is apparent that if one product (another example) has an average 3.75-star rating and another has the same 3.75-star average rating but the first one is rated by 435 people and the second one is rated by only 85 people that the first product's rating is stronger than the second.


Do you want to translate that into a "true rating"? I'm afraid that isn't possible. For instance, if the average rating for forks is 3.0 and for knives is 4.5, then for knives a 3.75 rating by 400 people is worse than a 3.75 by 80 people (the larger sample makes the below-average rating more certain), and vice versa for forks.

You can find the reliability of a rating, i.e. how well it reflects the 'public opinion'. You can also find the significance of its deviation from the average rating for this type of product, but you have to know that average rating.
It's pretty obvious, for instance, that a rating of 1.5 by 100 people is much worse than a rating of 1 by 1 person. But it's "obvious" to us - there is no mathematical constant separating good and bad ratings. If you want to put strict math to it, you have to feed that math with actual numbers.
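To put rough numbers on that deviation-from-the-category-average idea, here is a sketch using HotBlonde's 3.75-star example. The category average of 3.0 and the per-rating SD of 1.0 are assumptions invented for illustration; in practice you'd compute both from real data:

```python
import math

category_avg = 3.0   # assumed average rating for this product category (made up)
sd = 1.0             # assumed spread of individual ratings (made up)
product_avg = 3.75   # both example products average 3.75 stars

for n in (435, 85):
    se = sd / math.sqrt(n)                 # standard error of the mean rating
    z = (product_avg - category_avg) / se  # deviation in standard errors
    print(f"n={n}: z = {z:.1f}")
```

Both deviations are large, but the 435-rating product sits about 15.6 standard errors above the category average versus about 6.9 for the 85-rating one, so its edge over the category is far better established.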


Quote: HotBlonde

I'm looking for an actual formula that I can use to calculate the strength of the rating, using (average rating) and (number of people/submissions) in the formula.


You will also need to find the (standard deviation)=SD. That and (number of people/submissions)=SS (sample size) will do.

Then the confidence interval for the rating will be:
(average rating) ± t * SD / SS^0.5, i.e. from (average rating) - t * SD / SS^0.5 to (average rating) + t * SD / SS^0.5.
t depends on the confidence level. Simplistically, for a confidence of 2/3, which means that 2/3 of the time the true mean is within this interval, you can approximate t=1.

If you simply want to compare the relative reliability of a rating, it's SD/SS^0.5, i.e. standard deviation divided by square root of sample size. In all cases you have to have the SD; since reliability scales with SD/SS^0.5, an even mix of 3 and 4 ratings (SD = 0.5) is equally reliable at only ~1/9 the sample size required for an even mix of 2 and 5 ratings (SD = 1.5).
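This interval is easy to compute when a site shows the star-by-star breakdown. A sketch with invented histograms (the function name and the counts are mine, not from any real site):

```python
import math

def rating_interval(star_counts, t=1.0):
    """Return (avg, avg - t*SD/sqrt(n), avg + t*SD/sqrt(n)) from a
    histogram like {5: 120, 4: 75, ...} mapping stars to vote counts."""
    n = sum(star_counts.values())
    avg = sum(s * c for s, c in star_counts.items()) / n
    var = sum(c * (s - avg) ** 2 for s, c in star_counts.items()) / n
    half = t * math.sqrt(var) / math.sqrt(n)
    return avg, avg - half, avg + half

# 300 ratings averaging 4.0 vs. only 2 ratings averaging 4.5
big_sample = rating_interval({5: 120, 4: 75, 3: 90, 2: 15})
tiny_sample = rating_interval({5: 1, 4: 1})
print(big_sample, tiny_sample)
```

The 300-rating product lands in roughly 4.0 ± 0.05. For the 2-rating product the t=1 approximation breaks down entirely (with 2 data points the computed interval is far too optimistic), which is the formal version of HotBlonde's intuition that tiny samples shouldn't be trusted.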
Resist ANFO Boston PRISM Stormfront IRA Freedom CIA Obama
RaleighCraps
Joined: Feb 20, 2010
  • Threads: 79
  • Posts: 2501
July 24th, 2013 at 7:35:27 AM permalink
You also have the issue of quality.

It is very easy to have a product that works very, very well and gets a 5-star rating from 18 users.
However, 2 users got a defective unit, were very unhappy, and rated it a 1.

How do you use this information?
Looks like 10% of the units shipped are worthless, but if you get one of the other 90%, you will be very happy.
Are you willing to take the chance for a $30 item? How about for a $900 item?

Finally, you also have to worry about the qualifications of the reviewers. Are they knowledgeable enough to be writing a worthwhile review to begin with?

I recently wanted to purchase a spray gun for my air compressor to spray latex paint. Airless sprayers for latex are everywhere, but an air sprayer is a bit trickier. After reading a bunch of technical articles on what you need to make latex work in an air sprayer, I settled on a certain spray gun. When I read the reviews on the site, I almost changed my mind. I think the ratio was one happy reviewer to four disgusted ones.
I decided to go ahead with my purchase, and the gun worked great. But, you have to know what you are doing. If you don't set all the knobs just right, it won't give you a good spray. So, in this case, a good product is getting a bad rating because too many people are not educated enough to use the gun properly.
Always borrow money from a pessimist; They don't expect to get paid back ! Be yourself and speak your thoughts. Those who matter won't mind, and those that mind, don't matter!
