Information Technology Dark Side

Struggles of a Self-Taught Coder


In Defense of Rigor (Part 3 of 3)

June 17th, 2007 · 1 Comment

DJ1.0, Contributing Editor. We don’t know much about DJ1.0, since he participates in the dark side anonymously. We suspect DJ1.0 is a “he” since he refers to a wife in an early post, but then again, maybe they’re from Massachusetts… Either way, you can reach DJ1.0 at

In my last two posts, I have been asserting (successfully?) that the rigor enforced in academic research and engineering disciplines could rescue IT from some of its self-destructive behavior. So far, I have received many dissenting comments from people in just about every section of IT: data warehousing, database, usability, marketing, management, and infrastructure.

The comments are all the same: “Our data is not ambiguous. We have lots of defined measurements. We are rigorous and we do know what our data means.” That is precisely my point. YOU have YOUR measurements. YOU know what YOUR data means, but does anyone else? The knowledge your group has internalized doesn’t help generalized understanding.

For example, consider an extremely valuable group like infrastructure. Infrastructure monitors availability in increments of 1/10 of 1%, every 15 seconds, with very precise graphs and charts, massive amounts of data, and historical trends. That’s rigorous, right?

So I asked what availability means. What is 95% availability? Well, as it turned out, it didn’t include weekly scheduled maintenance outages. It didn’t measure when an application module was faulty but the application still ran. It didn’t measure when users were blocked from logging in due to a firewall issue. This is NOT to say they were wrong; they weren’t. But decisions are being made based on data that people don’t understand. If I tell my wife, “I’m 100% faithful,” she is not expecting me to leave out weekly maintenance windows.
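The gap between a reported availability number and what users actually experience can be sketched in a few lines. All the outage figures below are hypothetical, invented for illustration, not taken from the team described above:

```python
# Hypothetical outage categories for one week (all figures assumed).
WEEK_HOURS = 24 * 7  # 168-hour week

outages = {
    "measured crashes": 2.0,             # what the monitor actually counts
    "scheduled maintenance": 4.0,        # excluded from the official number
    "faulty module, app still up": 1.5,  # never registered as an outage
    "firewall blocked logins": 0.5,      # users locked out, app "up"
}

counted = outages["measured crashes"]
total = sum(outages.values())  # 8.0 hours of real user impact

reported = 1 - counted / WEEK_HOURS     # ~98.8%
experienced = 1 - total / WEEK_HOURS    # ~95.2%

print(f"Reported availability:    {reported:.1%}")
print(f"Experienced availability: {experienced:.1%}")
```

Both numbers are "correct" for their own definitions; the problem is that only one definition matches what the people reading the report think it means.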

That’s the kind of thing that just doesn’t fly in research. I’m NOT going to present a paper where I say “Our failure rate was only 1%” and leave it at that. What was your methodology? Is it commonly accepted? What were your assumptions? What is your expected error? Can you repeat it? That is how you make good decisions: not with more data, but with a good understanding of, and trust in, what that data means, where it comes from, and how applicable it is.

Getting back to availability, because the horse might still be alive, what good is 95% availability? Quick back-of-the-napkin calculations tell me that a 95% target allows about 8.4 hours of downtime in a 168-hour week, and there are 14 hours a week between 1:00 and 3:00 AM alone. Are you really doing business during that time? Sure, in this global 24×7 economy, it sounds good, but does it make business sense? Are there too many opportunity costs in keeping a system up when you don’t need to? Who knows, but without real meaning in your data, you really can’t ask the question.
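The back-of-the-napkin math works out like this (the 1:00–3:00 AM nightly window is the example above; everything else follows from the 95% target):

```python
# Downtime budget implied by a 95% availability target.
WEEK_HOURS = 24 * 7                 # 168-hour week
target = 0.95

budget = (1 - target) * WEEK_HOURS  # 8.4 hours of allowed downtime
overnight = 2 * 7                   # 1:00-3:00 AM nightly = 14 hours/week

print(f"Allowed downtime per week: {budget:.1f} hours")
print(f"Overnight 1-3 AM window:   {overnight} hours")

# The entire 95% downtime budget fits inside hours when, for many
# businesses, nobody is using the system at all.
assert budget < overnight
```

In other words, a system could meet 95% availability without a single user ever noticing an outage, which is exactly why the raw number, without meaning attached, can’t answer the business question.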

I’m not arguing for more data. I’m arguing for more meaning.

The great irony of IT is that even as we live and breathe in massive seas of data, we simply don’t understand data itself. We have 50 million rows of transaction data and we don’t know a thing about our customers.

One of the things that I would encourage IT to borrow from academia is peer review. In academic culture there is openness to sharing and review. In fact, without peer review, your research goes nowhere.

In IT, the opposite is often true. The more people that know about what you are doing, the less likely it is to succeed. Academia expects you to publicly publish your work, expand on other studies, and subject your research to rigorous peer review. It also has an entire culture built around that premise. IT has a real aversion to this kind of thing.

Thankfully, this is changing. The emergence of wikis and agile ideas is starting to move IT in the right direction. However, without a cultural shift, it is bound to have problems. Having a wiki doesn’t make it any safer for your employees to express new ideas, and it doesn’t make teams collaborate or share. But having the tools around to support a cultural change can make all the difference.


Tags: DJ1.0

1 response so far ↓

  • 1 A_Raybould // Jul 28, 2009 at 1:37 am

    I'm a couple of years late in replying, but there's been no discernible progress over that time! There has, however, been an event that should make it plain, to those who are interested in looking, that this matters: the financial crisis. Between the abuse of value-at-risk models, often by people who didn't understand them, and the corruption of information resulting from the ratings agencies selling out, there's plenty of evidence that lack of rigor hurts.

    Your article puts me in mind of Richard Feynman's fairly well-known talk on the nature of scientific inquiry, in which he compared pseudo-sciences to Polynesian cargo cults. In my opinion, economics has been perilously close to the latter, though I do believe it is getting better.
