
Communications of the ACM

Technical opinion

Software Process Improvement: It's a Journey, Not a Destination

AA Company1 (AAC) is a $2.3 billion publicly held service organization with approximately 12,000 employees located in the U.S. Its applications development group consists of 185 employees working at one location. Facing rapid growth of the IT staff and an increasing need for new and improved information systems, AAC began to look for ways to improve its software development processes and turned to the Capability Maturity Model (CMM) for guidance. Since 1986, the CMM has been used as a framework for improving the software development process within organizations. Many companies have successfully adopted the CMM and experienced its beneficial effects [3], but unfortunately, almost 70% of organizations attempting to adopt the CMM abandon the effort before realizing its potential benefits [4]. This column assesses the lessons learned by AAC in its quest to improve software development via the CMM.


The Software Process Improvement Initiative

The software process improvement (SPI) initiative at AAC began as an informal effort based largely on the CMM. As part of this effort, AAC performed a CMM self-assessment that placed the company clearly at Level 1. Subsequently, a formal team (the SPI team) was created to oversee the software process improvement effort. Shortly thereafter, the SPI team formed workgroups corresponding to each of the CMM Key Process Areas (KPAs) (for example, Requirements Management), with the goal that these groups' efforts would culminate in CMM Level 2 compliance in just 10 months. AAC planned to mark the attainment of this goal with a great celebration and by hanging a large banner proclaiming CMM Level 2 compliance in the entrance to the IS division. Unfortunately, the workgroups discovered that key practices were followed in some areas but not in others, or were followed differently across areas, making it difficult to document current processes. In many cases, no formal documentation existed, and the effort to document existing processes proved daunting. The overall response was dismay at the amount of work remaining to reach Level 2.

One year later, AAC conducted another self-assessment to gauge improvement; the results indicated AAC was still at Level 1. Discouraged by these results and lacking a clear consensus from management on how to continue, AAC hired an advisor to provide a fresh perspective and much-needed guidance. In consultation with the new advisor, AAC upper management set a new course and new deadlines. This course entailed continuing to document existing processes (as they had started the previous year) and, more importantly, using the insight from this documentation exercise to create new processes where no formal process had existed and to modify existing inefficient or ineffective processes. As a result, a new methodology was created.2

When the methodology was launched almost a year later, all developers within the organization were instructed, via a formal written policy statement from the CIO, to begin using the methodology for all projects. Subsequently, developers in all areas of the company received extensive training on the methodology. At this point the quality assurance group was formally given responsibility for methodology maintenance and training, which further solidified ownership of the project. A survey conducted approximately six months after the introduction of the methodology revealed no major problems with the methodology and no reluctance among developers to use it.

The following year, almost as an afterthought, a CMM assessment was performed, revealing that the company was now at CMM Level 2. Although the company reached Level 2 nearly two years past its original deadline, no one seemed to notice the delay. Instead of a celebration marking the achievement, an email message was sent from a member of the quality assurance group congratulating the employees on their success in reaching Level 2. Although significant, achieving Level 2 became little more than a signpost on the road to continued software process improvement (see the timeline figure here).


Lessons Learned

AAC's experience reveals several insights into factors that can affect an SPI project. Knowing the success (and failure) factors a priori gives a company the opportunity to plan for them and take appropriate action. Each of the lessons learned by AAC is discussed next and summarized in the table here.

Lesson 1. No guidance was provided to the initial teams given this process improvement charge other than: "using the CMM as your framework, look for ways to improve software development." One by-product of the lack of guidance was the fact that management established unrealistic expectations with regard to the time it would take to become Level 2 certified. Management set a 10-month deadline whereas the actual time to reach Level 2 was almost four years. Most companies undertaking an SPI effort agree that it takes longer and costs more than they originally anticipated [3]. Like many other companies, AAC needed more guidance in its SPI efforts early in the process [3].


Lesson 2. The early phases of the project also lacked commitment from management—usually a recipe for failure [1]. In the beginning, the project had a champion, but no real buy-in from IS management. However, as the project matured, the champion began to "sell" the importance of the project and moved it up the priority list. Consistent with findings from other studies [1, 3, 6], one of the eventual critical success factors for this project was clear, visible, top-down management commitment.

Lesson 3. On the positive side, AAC took actions that greatly supported the SPI effort. To forestall major developer resistance, the development group was fully engaged and had influence on the process and on the resulting products (such as the methodology). Employee surveys were also used extensively at various points in the initiative to obtain feedback, and top management quickly provided all employees with the aggregated results, interpretations, and action plans based on those results. Consistent with findings from other studies [3, 6], one of the critical success factors for this project was engaging all organization members in the process.

Lesson 4. With the creation of the SPI team, the responsibility of project leader was given to a well-respected and rapidly rising member of management. This person was well regarded by developers and management throughout the IS organization. Prior to the SPI effort, this person had been a project leader on several high-profile, and ultimately successful, projects. Management, at all levels, had faith in this person's ability to lead this effort. Having a well-respected individual or group is critical to the success of the effort [3, 5, 6].

Lesson 5. At AAC, the success of software development, and the success of the SPI initiative, was gauged by three primary indicators: customer satisfaction, number of days of deviation from scheduled delivery date, and the percentage of deviation from the proposed budget. It is important to have initial and post-implementation measures in order to determine success. Without initial measures there is no basis for comparison, and without post-implementation measures there is no indication of current status (to compare to the baseline). Many companies fail to determine their key measures and adequately collect the proper data to provide the measures (lack of measures is indicative of Level 1 organizations) [5].
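The baseline/post-implementation comparison described above can be illustrated with a small sketch. This is not AAC's actual tooling, and the data values, field names, and satisfaction scale are all hypothetical; it only shows how the three indicators (customer satisfaction, schedule deviation in days, budget deviation as a percentage) might be computed for a project so that a baseline and a later measurement can be compared.

```python
# Hypothetical sketch of the three SPI indicators mentioned above.
from dataclasses import dataclass


@dataclass
class ProjectOutcome:
    satisfaction: float  # customer satisfaction score (assumed 1-5 survey scale)
    planned_days: int    # scheduled duration in days
    actual_days: int     # actual duration in days
    budget: float        # proposed budget
    actual_cost: float   # actual cost


def indicators(p: ProjectOutcome) -> dict:
    """Return the three success indicators for one project."""
    return {
        "satisfaction": p.satisfaction,
        "schedule_deviation_days": p.actual_days - p.planned_days,
        "budget_deviation_pct": 100.0 * (p.actual_cost - p.budget) / p.budget,
    }


# Illustrative (invented) numbers: a pre-SPI baseline and a later measurement.
baseline = indicators(ProjectOutcome(3.1, 120, 150, 500_000, 610_000))
current = indicators(ProjectOutcome(3.8, 120, 127, 500_000, 520_000))
```

The point of the sketch is the lesson itself: without `baseline` there is nothing to compare `current` against, and without `current` there is no indication of progress.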

Lesson 6. "Unfreezing" (reducing the factors maintaining current behavior) is generally considered a key first step in changing behavior [2]. In this case, the early self-assessment turned out to be a valuable unfreezing tool: it clearly demonstrated to AAC that the software development environment was in chaos and that immediate action was warranted, thus providing the impetus management needed to push the SPI effort forward.

Lesson 7. Although not universally accepted, a few studies have urged organizations to treat their SPI efforts as a project [5, 6]. By doing so, AAC avoided simply piling SPI work onto an already overworked IS staff, because the assignments became part of each employee's normal workload. Treating the SPI effort as a project also emphasized its importance by putting it on par with software development projects, rather than something to be worked on in one's spare time.

Lesson 8. AAC initially approached the use of the CMM framework with an emphasis on certification rather than continuous improvement. This emphasis has been found to be a critical failure factor [5]. It took several months before management at AAC bought into the philosophy of continuous improvement. To reach CMM Level 2, a company must document its key processes and follow them. AAC took the stance that if it was going to go through the effort of documenting what it did (qualifying for Level 2), then it should, at the same time, look for ways to improve those processes. Thus, rather than simply concentrating on doing the things needed to get to Level 2, the company began to look beyond Level 2 to an environment of continuous improvement.

Lesson 9. To be successful, an SPI initiative should be tailored to the needs of the organization [6]. At AAC, this concept was extended further because the CMM was used as a framework for an evolving SPI effort. The organization had the vision to determine when it was necessary to diverge from the CMM doctrine. One of the by-products of shifting from a position of "reach Level 2" to "continuous improvement" was the fact that AAC was able to capitalize on the opportunity to create a methodology while documenting existing processes.



Self-assessment and the subsequent investigation provide an introspection that is otherwise not undertaken, giving an organization the opportunity to look in a mirror—to see its strengths and weaknesses. Often, this self-discovery is one of the greatest benefits of a process improvement effort. In the company studied here, using the CMM as a framework with the goal of achieving Level 2 proved to be a long and arduous process in which many mistakes were made, but many benefits were realized. Although the company ultimately reached its goal, somewhere along the way the original goal ceased to be the driving force. Rather, the company began to realize the benefits of continuous software process improvement. AAC used the CMM for what it truly is—a framework for improvement, not the clear-cut guide to development some take it to be.

One possible reason many companies give up on an SPI effort is that they improperly set a goal of reaching a particular certification level rather than letting the process of reaching the goal be their reward. Overall, AAC's experience provides a valuable lesson for other companies looking to improve software development: software process improvement is a journey, not a destination.


References
1. Baddoo, N. and Hall, T. Motivators of software process improvement: An analysis of practitioners' views. The Journal of Systems and Software 62, 2 (2002), 85–96.

2. Hellriegel, D., Slocum, J.W., Jr., and Woodman, R.W. Organizational Behavior, Seventh Edition. West Publishing Company, Minneapolis/St. Paul, MN, 1995.

3. Herbsleb, J., Zubrow, D., Goldenson, D., Hayes, W., and Paulk, M. Software quality and the capability maturity model. Commun. ACM 40, 6 (June 1997), 30–40.

4. Krasner, H. Accumulating the body of evidence for the payoff of software process improvement—1997.

5. Moitra, D. Managing change for software process improvement initiatives: A practical experience-based approach. Software Process: Improvement and Practice 4, 4 (1998), 199–207.

6. Stelzer, D. and Mellis, W. Success factors of organizational change in software process improvement. Software Process: Improvement and Practice 4, 4 (1998), 227–250.



Bill C. Hardgrave (bhardgrave@walton.) is the Edwin and Karlee Bradberry Chair in Information Systems and the executive director of the Information Technology Research Institute in the Sam M. Walton College of Business at the University of Arkansas.

Deborah J. Armstrong (darmstrong@) is an assistant professor in the Information Systems department in the Sam M. Walton College of Business at the University of Arkansas.



1 Not the actual name of the company.

2 Methodology is defined here as a comprehensive guide to software development. AAC viewed its methodology as the "how-to" for software development within the organization; it was comprehensive, yet flexible enough to be used across the many different platforms/areas within the IT organization.



Figure. SPI initiative timeline.



Table. Summary of lessons learned for process improvement.


©2005 ACM  0001-0782/05/1100  $5.00



