We had a fishbowl session at Prairie DevCon recently, titled “When and When Not to use Best Practices”. The idea for the session came from a Twitter conversation between myself, D’Arcy Lussier and Shane Shouldice. That exchange essentially expressed the sentiment that there is no such thing as a best practice.
“Best practices” is not a term used solely in software development, nor one coined by the software development community. Wikipedia defines a best practice as “…a technique, method, process, activity, incentive, or reward that is believed to be more effective at delivering a particular outcome than any other technique, method, process, etc. when applied to a particular condition or circumstance.” and goes on to note that “[a] given best practice is only applicable to particular condition or circumstance…”
There were some great participants in the session, and the overall sentiment echoed that of the Twitter exchange: people are frustrated with how “best practices” have been adopted by other members of the software development community. The major issue is that developers implement best practices wholesale, without putting any thought into whether the practice is appropriate in their context. This, I think, stems partly from the source of the best practice and partly from the adeptness of the person consuming it.
While I don’t think we can realistically expect the term “best practices” to disappear from our industry, I think we can go a long way toward improving people’s understanding of what a best practice actually is. When documenting best practices, two things matter: make clear the context in which they are appropriate, and make clear that they are not fixed procedures. When consuming best practices: be sure you understand the context in which the practice is intended to be used. And don’t be afraid to put some thought into how you apply it; if it doesn’t quite fit your circumstance, change it or move on to another practice.