What IT can learn from the study of the “normalization of deviance” phenomenon
Peter Waterhouse, Senior Strategist, CA Technologies
Although vendor-written, this contributed piece does not promote a product or service and has been edited and approved by Network World editors.
How many times have you witnessed a sub-optimal IT practice that everyone else thinks is OK, then over time accepted the behavior yourself as just fine and dandy?
Regardless of whether you lead a startup or work in an established business, we all have a tendency to accept dodgy behaviors. Even if outsiders see them as wrong, our IT teams become so accustomed to them (without any adverse consequences) that they are quickly accepted as “normal.”
Studies into what’s commonly referred to as the “normalization of deviance” have been conducted in fields ranging from healthcare to aerospace, with evidence showing that many serious errors and disasters occur because established standards have been bypassed and bad practices “normalized.”
While examining this phenomenon is critical in the context of safety, it’s equally applicable to how we develop, secure and operate software applications. With the boundaries blurred between the digital and physical worlds, any adverse behavior leading to security and reliability issues could have dire consequences for customers. And when software becomes infused into long-lasting products (from light bulbs to limousines), it’s not so easy to exit markets.
As businesses look to software innovation for growth, time-to-market and quality become essential differentiators. Unfortunately, both can be compromised if pre-existing change aversion or newer “speed at all costs” mandates lead to a normalization of deviance. More critically, if a head-in-the-sand IT culture persists, systemic business failures may follow – think massive security breaches or major application outages.
The DevOps movement, with its focus on collaboration across development and other IT functions, is now regarded as the best way of establishing the culture and environment needed to support fast and reliable software delivery. So maybe the secret to helping IT identify and eliminate poor practices is to combine the benefits of DevOps with guidance from other fields fighting the normalization of deviance.
In healthcare, for example, studies illustrate seven factors that lead to a normalization of deviance, all of which are relevant to IT:
The rules are stupid and inefficient – in healthcare, accidents occur when practitioners disable equipment warning systems because the alarms are seen as distracting. This happens in IT all the time, as in operations, where staff filter out alerts they regard as irrelevant noise. It also surfaces when testing is skipped because of manual processing and set-up delays.
Knowledge is imperfect and uneven – employees might not know a rule exists, or they might be taught a practice without realizing that it’s sub-optimal. In IT this persists because many new employees feel uncomfortable asking for help, or because the application of new technologies distorts logical thinking.
The work itself, along with new technology, can disrupt work behaviors – to support goals of more continuous software delivery, organizations are introducing many new technologies and methods, like microservices and containers. New work practices and learning demands may lead staff to implement technology poorly or use it for functions it was never designed to perform.
We’re breaking rules for the good of the business – staff may bypass rules and good practice when they’re incentivized on faster delivery times or on shipping new functional software enhancements. For example, repeatedly procuring additional (but unnecessary) hardware to rush through an update, rather than addressing the root cause of performance problems.
The rules don’t apply to us…trust us – autonomous agile teams are beneficial, but empowering them to select their own one-off tools or to bypass compliance policies can compromise program objectives or lead to security breaches. Unfortunately, in today’s fast-paced digital business, talented professionals often feel completely justified in playing the trust card.
Employees are afraid to speak up – violations become normal when employees stay silent. How many times have poor software code, costly projects (and bad managers) been tolerated because people are afraid to speak up? Even in IT organizations with a strong blameless culture, people will stay quiet for fear of appearing “mean.”
Leaders withhold or dilute findings on application problems – whether you work in healthcare or IT, no one wants to look bad to managers. Rather than present ugly news, many will distort the truth, passing diluted or misleading information up the command chain. In IT this behavior is easily normalized, especially if teams get away with reporting technical vanity metrics instead of business outcomes.
No sudden cultural reawakening in IT, or liberal sprinkling of collaboration fairy dust, will eliminate ingrained bad practices, but DevOps and Lean thinking can help identify warning signals. This starts with leaders visualizing the flow of value delivered by software applications and pinpointing the bottlenecks and constraints impeding delivery.
Like stepping stones along a pathway, these are the value interrupts which, when lifted, reveal the process and technology issues causing good people to do the wrong things. Immediate candidates are software release and testing processes, but don’t restrict analysis to the development side of the software factory. Every stone – be it enterprise architecture, stakeholder engagement, vendor management, operations or customer support – can hide ugly behaviors that over time become normalized.
Of course, identification is just the start. Next comes the hard part: leaders using evidence to show how these behaviors impact current performance and business outcomes. This might involve using new tools, but that again courts disaster when advanced technology becomes a vehicle for automating bad processes.
As with anything involving people, the organizational and psychological barriers that encourage staff to break rules, or their colleagues to remain silent, are where most attention should be focused.