A staple of the information technology industry is the notion of a “best practice”. I’m not very fond of it.
Best practices are supposed to be the golden standards by which technological solutions are implemented and problems resolved in the surest way possible. They’re often researched and developed by the vendor who produced the technology, and so they come with a certain weightiness. For example, if Microsoft says that an Exchange environment needs to be built a certain way for a certain number of users to work well, the conventional wisdom asks, “who am I — a lowly user of their technology — to disagree?”
There’s an irony in this. For years, I have heard technology experts complain that Cisco certification exams reflect perfect-world environments that don’t exist. They’ve said that they were only able to pass their exams by answering questions the way Cisco says you should do things, not the way they would actually solve networking problems in real life. This is just one example with one vendor, but it brings to the surface the familiar conflict between book learning and hands-on experience. People in the business know that the two solution paths are often different.
Therein lies the paradox. When it comes to certification exams, our experience tells us that the textbook solution, the “best practice”, may not genuinely be the best way to address a problem. It bugs us when these disconnects happen, but we play along with the vendor in order to get the certification and their stamp of approval. Yet, when we face a real-world problem in search of a solution, we tend to seek out the industry whitepaper on the best practice and give it special reverence.
Whitepapers serve their purpose, but they’re crippled from the start as guides to perfection. First of all, who gets the special privilege of defining the criteria for what counts as “best”? Does the determination of what is best leave any room for a customer’s hard budget constraints, deployment timelines, and need for flexibility? Or is an industry best practice limited from the very beginning because it starts with a rigid problem specification that doesn’t match a real technical challenge, and assumes unlimited resources, unlimited time, and a pristine lab environment in which to operate?
Experience has taught me that best practices are merely templates to start from and nothing more. They are tools that give us a benchmark to work from, and maybe they establish some realistic performance expectations. However, a person’s real-world experience deploying and understanding technology is always infinitely more valuable to me. Over-reliance on vendor best practices amounts to leaning forever on a technological crutch. Saying that “vendor so-and-so’s best practice in this situation is to…” may appear to lend credibility to a course of action, but it can also stifle experimentation, innovation, and independent thought during problem solving. We need more than the ability to read and regurgitate our Google search results.
My advice? Read the whitepaper, read the best practice. Bask in the information presented, and then put it aside and be a critical thinker. Technology professionals are fantastically innovative, and they need to trust their own experience and imagination to solve their own unique problems, perhaps even in new ways that make sense to them, their customers, and their employers. No new thinking ever came out of blindly following a vendor best practice.