It is almost common knowledge that more than half of all IT projects fail. The exact definition of failure varies from study to study, but missing budget, scope, or schedule are popular criteria. By this standard, the 2013 CHAOS Manifesto reports that a whopping 90% of large IT projects fail. Even more concerning, a 2012 McKinsey report found that 17% of large IT projects go so badly that they threaten the existence of the company.

Lists of the top three or ten root causes of IT project failure are a popular topic of discussion on social media and in the press. My own unscientific impression is that lack of focus, poor technical readiness, bad planning or project management, inexperience, weak leadership, poor risk management, and insufficient funding are common villains. By far the most common complaint, though, is poor requirements, a perspective borne out by more rigorous research studies.

To my ear, these explanations sound like complaints about the weather or the rules of chess. When asked whether the armor on Army vehicles was sufficient, Donald Rumsfeld famously replied, “You go to war with the army you have.” I’d say the same applies to software projects: your team will march the path with the budget, skills, experience, tools, and management talent it has. Wishing for different conditions is pointless.

To be fair, a team can ask for more money or time. You can send individuals to project management training. You can adopt architectural frameworks like Zachman, use UML to model your business processes, or hope that Agile or business architecture will improve the outcomes. The question is, which of these investments will yield the best returns?

The answer depends largely on the frame through which you choose to view the problem. “All the world is a nail to a man with a hammer,” and to a portfolio manager, for example, every problem will be solved by improving the way we prioritize funding.

Nearly all prescriptions for better IT outcomes assume that more rigor or discipline will improve the predictability of outcomes and that predictability itself is an intrinsically valuable goal. They assume that there is a single right answer that can be derived through sufficient analysis or calculation. These common prescriptions prioritize control and completeness, and treat ambiguity as a threat to be contained or eliminated. With few exceptions, they are rooted in traditional manufacturing process control methods, with similar assumptions about quality, efficiency, and materials handling principles.

But is it possible to trade predictability and control for better results? If by "better results," we mean more impact, business value, and agility for less investment, it may be that executing on plan is less important than adapting to new insights. It may be that paying attention to the process of generating and testing new ideas pays better returns than conformance or reducing ambiguity.

Building software is inherently a creative process, so it makes sense to think about software projects as more like making a movie and less like manufacturing a product or building a house. The process of making a movie -- the roles, activities and milestones -- may be similar from one to the next, but the results will be different every time. The same holds true with software. Both are essentially creative efforts that depend on teams of people doing things that only people can do, like collaborate, be inspired, and invent. Great software requires great design, and design is a team sport. It depends on diversity, communication, a common purpose, and a commitment to learn.

Recognizing the process of design as essential to technical projects suggests that our allergy to ambiguity and our search for predictability may be misguided. If design is to happen at all, we have to accept that we can’t know the best solution to a problem at the outset and, further, that our understanding of the problem is likely to evolve as we learn more.

Design is a well-understood practice in many professions, but how it applies to IT is still largely unexplored territory. The discipline is taught at the Stanford d.school, RISD, MIT’s Media Lab, and many other well-respected institutions, especially in the context of product design. The word appears frequently in IT, but with a different meaning, a vague shadow of all that the practice might offer.

There is good evidence that adopting mature design practices holds stunning promise for improving IT outcomes and reducing project failures. I’ve worked with dozens of teams that have made a sustained commitment to building their design practices and, as a consequence, report a doubling of collaboration and stakeholder engagement scores, a 75% reduction in delivery cycle time, and a whopping 85% reduction in the total cost of defects and change requests.

Maybe it's time to move beyond listing the reasons for IT failure and start focusing on what works for IT success. Design works.