Evaluation Tips

The big picture

If you were doing primary research (a survey, questionnaire, etc.), you could design your study so that you were measuring exactly what you wanted, when you wanted. Theoretically, you could then get perfectly reliable, relevant, and recent information. (In practice, of course, no study is perfect, so you'd still have to evaluate your results using the three Rs!)

However, primary research isn't always possible because of limits on money, time, and expertise. Moreover, in some cases the information simply can't be gathered through primary research. For example, if you were looking for past court cases on the legal issues involved in your topic, or for lists of companies in an industry, you wouldn't be doing a survey.

In such cases, you are left doing secondary research -- looking for information researched and published by others and deciding whether (and to what extent) it applies to your situation.

Secondary research is inherently imperfect -- you are trying to use information that others collected for their own reasons, and you need to make big decisions based on it. Therefore, you need to evaluate that information very closely... especially since a million dollars (in sales, costs, liabilities, etc.) may well be riding on your decision or recommendation.

Your audience

Research and evaluation of the information you gather don't happen in a vacuum. You need to think about your audience:

  • Who will be reading the report? You probably have a primary audience, but will they be passing it on to others? Or will they at least be using the arguments you prepare as they discuss the topics with others? They are all your audience.
  • What does your (entire!) audience likely know about this topic already? What do they believe or expect? Can you make any assumptions?
  • How does your audience feel about the topic? Are they stressed? Is this a major decision involving large amounts of money and/or risk? Will that affect the information you provide and how you discuss it with them? Will you, perhaps, spend more time justifying the quality of the information you're citing if the person is stressed?

Facts & opinions, not containers

Don't forget that you are evaluating the individual facts and opinions you are considering using in your arguments, not the articles, magazines, or web sites in which you found them. (That is, not the containers that held those facts and opinions.)

For example, a recently updated Government of Canada webpage might be quoting old environmental information produced by an outside research firm with strong ties to the oil industry. The age and authority of the webpage aren't the main focus of your evaluation -- you would be looking at the organisation or person who originally produced the report containing the stats, the age and method of their study, and so on.

(Although it certainly would be interesting to find out why the government was quoting a potentially biased, old report when better info should be available! Think about the idea of purpose, discussed in the Reliability section below.)

The 3 Rs

Reliability

This is essentially your judgment of the quality of the information. Can you trust it? Will your manager understand that it is trustworthy?

-- Was it produced by a person or organization that you think has the expertise and experience to do this sort of research? (authority)

-- Why was it gathered and published? (purpose)

  • Is there a bias -- and does that bias make the information unusable, or at least likely to present an incomplete picture of the situation? (Note: Be careful on this one! Bias is common -- in many cases the only people who will publish information on a topic are those who have some sort of interest in it. That doesn't mean the information is incorrect. Don't throw it out too fast. Instead, look for corroborating evidence elsewhere, as well as for conflicting evidence.)
  • Is the container (the article, web site, book, etc.) meant to entertain or persuade you, or was it intended to educate you? When you see that the author is trying to persuade or entertain you, you need to expect that facts may have been pulled out of context and that facts from other perspectives have been left out. That doesn't mean that the facts in the article are necessarily false -- just that you probably only have a small part of the whole story.

-- How was the information gathered? (documentation) For example, do you have the details on the sample size and nature, where the study was conducted, etc.? If the information was only quoted in your initial source, can you track down the original report to get as many of these details as possible?

-- Two common problems with documentation are:

(1) you can't find very much methodology, so you need to decide if you can still judge the information, and if so, how you would justify including it for your manager;
(2) you find methodology details, but the study seems to be measuring things differently than you would have for your specific problem (different geography, industry, company size, question, etc.), in which case you are moving into judging relevance...

Relevance

Relevance could also be described as applicability: how well does the info fit your problem?

No matter how high quality the information is, you need to make sure it is relevant for your problem. For instance, if the information you've found is based on another industry in another country, then you have to decide if that really matters. Can you still use it as the basis for a decision?

-- Three common areas where the facts you found may not match your needs:

(1) Geography: Does the info you found apply to your geographic area? Does it apply to an area that is close enough to yours? Does it apply broadly to many areas while still being accurate for yours?
(2) Industry: Does the info (fact/claim) apply to just one industry? Does it match the industry you are working with in the case? Is it close enough? Or perhaps it seems to apply to all industries, in which case you may want to think about how similar your industry is to all others for the specific topic you are researching.
(3) Scale: Does the information you found apply only to large organisations? Or maybe only to small businesses? Do you believe it can apply to the scale of the organisation you are working with in your case study?

In addition, even if you do think it's relevant information, you need to figure out how to get your audience to understand the same thing. That is, once you've judged relevance, you still need to do some business communicating to make sure your audience doesn't have the same doubts you did when you first looked at the info.

Recency

Recency is about the age of the information.

  • Is there a chance it is outdated? Could newer findings have superseded the information you found?
  • Has the environment (technology, laws, human behaviour, societal priorities...) changed since the information was first gathered?
  • What will your manager think of the age of the information if you don't explain it to her?

Again, if you had done the primary research yourself, you'd probably have the most recent (and relevant) information possible, but instead you are relying on secondary information. There is no magic date before which all information is outdated and after which it is good -- it depends on the specific topic.

For example (hypothetically), do you think the eight-year-old information you found on your sub-question about employee reactions to a new personnel policy has been superseded since it was studied? If not, then do you think your manager will have concerns? Can you justify (and explain) including the information in your report if it was the best you could find?

One tricky bit to watch out for with recency: There are often several dates attached to a fact, so you have to pick the right one when you are deciding how recent the information is.

For example, I may conduct a study in 2007, write up the results in early 2008, try to get it published for a year, then finally get it accepted in a journal in 2009. Then a journalist might read about the study and quote from it in 2012. The date that you want to pay attention to here is 2007 -- that's when the information was initially gathered/created.