Thursday 22 October 2009

See Statistics? Scrutinise the source.

If you see a set of statistics coming from a researcher, be it in an
advertisement, a newspaper, or even a Social Studies/History
Source-Based Question, always question its source.

In SS classes, we are 'trained' to look at the provenance of a source
to see where it came from, and from there, work out certain things -
what the sample size was, who was included in the sample, what the
data is supposed to represent, and how the data was collected. The
same should apply in real life - when you see that shampoo ad claiming
"7 out of 10 people experienced stronger hair", investigate how the
research was done. If the footnote says "in a trial of 117 people", I
would still be suspicious. Why?

1. Because the 117 people could include 100 employees who, having
received a financial incentive from the company in the form of a job
and a salary, feel pressured to say "it's brilliant". Or, more simply,
if they love the company and its products - however good or bad those
products may actually be - and that is why they chose to work for the
company and take the free samples, then the sample is biased.

2. Because out of the 117 people, the company could have picked and
interviewed 90 people who had strong hair to begin with, so the
shampoo didn't have to do much to help. Or the opposite could be true
- they started from a low base (i.e. very weak hair), so any gain in
strength looks impressive, even though the hair was still breaking,
albeit at a "60% lower rate of breakage".

3. Because the shampoo works best on certain types of hair (e.g. for
blondes, or for oily hair), and out of the 117 people, 80 could have
had the advantageous trait.

4. Because out of 200 people who initially tested the product, 83
didn't like it or didn't report back, so the company just worked with
what it had (see the rough arithmetic sketched after this list).

5. Because the company included, with the free sample of shampoo,
some extra hair tonic and vouchers for a special hair treatment salon
to "examine the progress of the improvement in your hair" - so any
improvement may not have come from the shampoo alone.


See how a single footnote might seem to establish the validity of a
statistic, when the study behind it could have so many flaws?

By the way, the above scenario was completely made up. But I hope it
keeps your eyes open and wary of the claptrap that many researchers,
scientists, and advertisers put out.
