CROSSTALK The Journal of Defense Software Engineering, August 2002

In 1990, I declared that the 1980s were a lost decade from the perspective of software development progress. The question I posed was, "Will there be a breakthrough in the 1990s?" I went on to say, "It won't happen automatically; people are too satisfied with unsatisfactory ways. We dare not make the mistake of complacency a la the automobile industry; we must push awareness and resource commitment to get ahead of the power curve of demand."

In 1994, I closed the annual Software Technology Conference (STC) with the observation that the underlying need within the defense community is for predictability: "From a Pentagon perspective, it is not the fact that software costs are growing annually and consuming more and more of our defense dollars that worries us. Nor is it the fact that our weapons systems and commanders are becoming more and more reliant on software to perform their mission. Our inability to predict how much a software system will cost, when it will be operational, and whether or not it will satisfy user requirements is the major concern. What our senior managers and DoD (Department of Defense) leaders want most from us is to deliver on our promises. They want systems that are on time, within budget, that satisfy user requirements, and are reliable."

The question I pose now is: "Where are we today, and where will we be tomorrow?"

Did We Lose Our Religion?

Why did I use the metaphor of religion? Because religion is the traditional example of faith-based behavior – that is, behavior that is based on a belief system rather than on externally imposed rules such as the law of gravity or "she that has the gold, rules." Emotional discussions regarding whether Ada or C++ should be preferred are frequently described as religious arguments based on beliefs rather than facts. Sadly, I still see the world of software being governed by religious-like belief systems rather than objective appraisals.
When I left the Pentagon six years ago, I described some of what was happening as bumper sticker management, and the situation has not changed for the better. I sometimes have the feeling that the blind are leading the blind – the leadership is blissfully ignorant of the direction in which they are headed. The only meaningful direction from either the Office of the Secretary of Defense (OSD) or the military services in the last few years was the Gansler memo that directed the use of Software Engineering Institute (SEI) Capability Maturity Model® (CMM®) Level 3 contractor organizations for Acquisition Category (ACAT) 1 systems. Do you know how many large-dollar (by that I mean $50 million or more) software-intensive acquisitions are not ACAT 1? Virtually all Management Information System (MIS) and Command, Control, and Communications (C3) systems!

During the past two years, there has been a 5.5 percent annual growth in the cost of ACAT 1 programs due to cost and schedule estimating and engineering changes (sound like software?). Yet these programs have the most experienced DoD industry managers and a requirement for CMM Level 3. About two-thirds of DoD acquisition dollars are for programs below the ACAT 1 threshold, for which there is currently no CMM requirement. It is my guess that these non-ACAT 1 programs are at least twice as bad as ACAT 1 programs – in other words, about $9 billion per year in cost growth associated with estimating and engineering problems, many of which are likely software related. In my opinion, they deserve more software management attention than the requirement to use best commercial practice provides.

CMM Maturity Reality

What is wrong with best commercial practice? It just does not exist among DoD contractors. It is a fantasy created by those who want to streamline acquisition, making it possible to cut the number of oversight personnel by reducing the opportunity for oversight.
The best way to justify a hands-off attitude is to insist that contractors always do everything right! There are more mature software organizations today. Virtually every large DoD contractor can boast at least one organization at CMM Level 4 or above, and several organizations at CMM Level 3. On the other hand, most DoD software is still being developed in less mature organizations – mainly because the program executive officer or program manager (PM) doesn't demand that the part of the company that will actually build the software be Level 3!

Many people used to tell me that the DoD needed to get on the dot-com bandwagon – those folks develop and deliver software fast. Yes, the Internet and the availability of Web browsers have fundamentally changed the environment within which even mission-critical software is being developed. But instead of adapting proven software methods, the software research community has all but dropped its concerns with formal methods, peer reviews, clean-room processes, and other reliability techniques, including Ada, which was designed to promote reliability. Except for Barry Boehm at the University of Southern California, much of the academic community has more or less stopped investigating better ways of estimating system complexity and measuring software growth. Instead, invention of new user interfaces, new distributed computing architectures, and new (more flexible and less reliable) programming languages have been given top priority. The goals of reliable performance and pre-