I recently read an article, suggested by my web browser, on the rising number of rent-burdened consumers. I learned that “rent-burdened” is a commonly used term for a renter whose monthly housing costs are more than 30% of their income.
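The 30% threshold is simple arithmetic, and it's easy to check for yourself. Here's a minimal sketch in Python; the dollar figures are hypothetical, chosen just to illustrate the definition:

```python
def is_rent_burdened(monthly_rent, monthly_income, threshold=0.30):
    """Return True if housing costs exceed 30% of income
    (the common definition of "rent-burdened" cited above)."""
    return monthly_rent > threshold * monthly_income

# Hypothetical household: $1,200 rent on $3,500/month income
# is about 34% of income, so it counts as rent-burdened.
print(is_rent_burdened(1200, 3500))  # True
```

Different reports sometimes draw the line at "30% or more" rather than "more than 30%," so it's worth checking which convention a given article uses.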
Yeah, I know, not very fascinating; economics for most people is about as exciting as watching paint dry. But bear with me; this is not about economics, but about research.
Anyway, I was so interested in the problem of being “rent-burdened” that I searched for another article—just a glutton for punishment, I guess. And there, at the top of the search results, was another article, which claimed that the number of rent-burdened households was falling, not rising! How could this be?
The other author was using the same definition of “rent-burdened,” so that wasn’t the problem. The two articles were written six months apart, and it’s doubtful the figures could change much in that short a time. After a close analysis of both articles, I found that the author reporting the falling number of rent-burdened households was using current figures, while the “rising” author was relying on figures that were three to seven years old. Was this author just incompetent, or pushing an agenda? Well, I’m going to name names: the article was published by Pew Charitable Trusts, an organization that ought to know what it’s doing.
My point is the first rule of internet research: Use multiple sources when you research a topic. That’s a good rule for any research, but especially for the internet.
The second rule of internet research applies here as well: Check the dates. In this case, the article was less than two months old, but the statistics it used were from three to seven years earlier.
The third rule: Check the authorship. Is the article from a respected information source, or just some joker with a blog? We should have been able to trust this article, but in this case, Pew Charitable Trusts let us down.
The fourth rule, unfortunately, appears to apply here as well: Check for obvious bias. Are multiple viewpoints presented, or is the information one-sided? The Pew article was dated April 2018, but it gave the results of a survey conducted between 2011 and 2015 by (surprise, surprise) Pew Charitable Trusts! One wonders why Pew would wait three years to issue these survey results. If they had issued them in 2016, it would have highlighted the bad economic conditions in the middle of the 2016 presidential race. Was the delay in reporting these figures meant to avoid giving more ammunition to one of the presidential candidates? Curious.
Finally, the fifth rule: Check the quality of the website and its content. Granted, this is somewhat subjective, and it does not apply to the Pew article: their website is polished, and the article was well written, with no typos that I could see. Usually, though, if a website has bad information and unreliable research, the quality of the writing and the site design are likely to be pretty bad as well. This rule is not absolute—a good-looking site with well-written material can still carry bad information, and a slapdash, poorly written website might have accurate information—but more often than not, quality and reliability go together.
Next time you see an article online by Pew Charitable Trusts, give them another chance; nobody’s perfect. But when doing research on the web, use these five rules to evaluate any material you find, and you’ll rarely go wrong.