7 Summary, Conclusion, and Future Planned Work
This report is the second in a series funded by SAF/AQX. At the end of our first report, we
provided a list of topics that we thought needed further research. Many of those topics have been addressed here. However, we also acknowledge that our treatment of these subjects is by no means complete, as the body of knowledge on these topics continues to evolve.
Since the publication of the first report and while we have been pursuing the contents for this
report, there has been considerable movement within the government and DoD to identify and
implement a new acquisition process that can take advantage of Agile methods. This movement
extends from recognition by the former Secretary of Defense, Robert Gates, that conventional
modernization programs may not meet the existing demand in today’s environment, to Congress’
inclusion of Section 804 in the National Defense Authorization Act for Fiscal Year 2010.
Pursuant to Section 804, OSD published a report providing updates on DoD’s progress toward
developing a new acquisition process for information technology capabilities [OSD 2010].
This recognition shows an understanding that the acquisition tempo must respond to the
operational tempo. In addition, there is still a need for process discipline that ensures effective use
of resources in providing needed capabilities. While some may say that Agile does not have the
appropriate discipline for use within the DoD, Agile does require adherence to processes, and there is evidence that using Agile methods in conjunction with other methods, such as CMMI, is a powerful
approach to achieving the needed effectiveness. In addition to effectiveness, a focus on value
must be maintained. Agile practitioners have evolved the classic iron triangle to include value.
Even though value is included, those within the DoD who have adopted Agile methods have learned that a change in mindset and culture for the PMOs and other acquisition entities is required. To change a culture, you need to understand your current culture (the assumptions, shared values, and artifacts that make it up) and how it differs from an Agile culture. Table 2 shows a comparison of the Agile and traditional DoD cultural elements.
This table should help those who want to adopt Agile methods understand some of the differences
and changes they will need to make. Table 3 provides some potential PMO actions and enablers
that can support the use of Agile methods. The end goal of many of these cultural shifts is to
enable partnership and shared understanding of what value means from a customer perspective.
When the acquirer, developer, and end user all share the same understanding of value, it is much
easier to agree upon priorities.
To make the transition to Agile, one must understand the terminology used by Agile practitioners and by DoD acquisition personnel, and the differences between them. This is only the beginning of
“being Agile.” “Being Agile” is more than adopting just another methodology. The key to “being
Agile” is embracing change.
Another part of becoming Agile is learning the common traits of Agile managers. Agile managers
are leaders, coaches, expeditors, and champions. Agile managers must be good team builders as
the team within the Agile culture is the cornerstone of the thought process and method. Managers
also need to master the time box, understand what contract types work best, learn which metrics
apply to Agile programs, and deal with distributed teams in an Agile manner.
One of the main sticking points with using Agile methods in DoD acquisition is how to
accommodate large capstone events such as CDR. There are other challenges that must be
addressed such as incentives to collaborate, definitions, and regulatory language. While there are
many issues in this arena, the main thing to remember is the purpose and intent of holding these
reviews in the first place. The purpose is to evaluate progress on and/or review specific aspects of
the proposed technical solution. Thus, expectations and criteria should be created that reflect the
level and type of documentation that would be acceptable for the milestone. This is no different from business as usual. However, the key here is for the government PMO and the
contractor to define the level and type of documentation needed, while they work within an Agile
environment that is unique to each program.
Estimating for Agile acquisition is another area that required considerable exploration. Estimation
in Agile is different from traditional estimation. In Agile, estimates tend to be just-in-time: a high-level estimate is refined into detailed estimates as more is learned about the requirements. Traditional methods are more detailed up front, with those details refined as more is learned. There are several parametric tools, and even an AgileEVM tool, that can be used
in the Agile environment. Both the estimator and the reviewer need to be aware that estimation
within Agile is different from traditional estimation and should act accordingly. How specific
issues can be dealt with is highly contract-specific, but general guidance is provided in Section 5.
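To make the flavor of Agile earned value concrete, the following minimal Python sketch follows the commonly published AgileEVM idea of deriving planned and earned value from sprint counts and story points. It is illustrative only, not the specific tool referenced above; the function name and all figures are hypothetical assumptions.

    # Sketch of release-level earned value from Agile planning data
    # (illustrative only; not the specific AgileEVM tool cited above).

    def agile_evm(bac, sprints_planned, sprints_done,
                  points_planned, points_done, actual_cost):
        """Derive basic EVM indices from release burn-down data."""
        planned_pct = sprints_done / sprints_planned  # plan: sprints elapsed
        earned_pct = points_done / points_planned     # done: points delivered
        pv = bac * planned_pct                        # planned value
        ev = bac * earned_pct                         # earned value
        return {"PV": pv, "EV": ev,
                "SPI": ev / pv,                       # schedule performance
                "CPI": ev / actual_cost}              # cost performance

    # Hypothetical release: $600K budget, 6 sprints, 120 story points;
    # after 3 sprints, 54 points are done at an actual cost of $280K.
    print(agile_evm(600_000, 6, 3, 120, 54, 280_000))
    # SPI = 0.90 (behind plan), CPI ~= 0.96 (slightly over cost)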
We addressed the road to Agile adoption in DoD software acquisition. Our interviews revealed
two main reasons to adopt Agile: moving away from a burning platform and meeting an operational need that cannot wait for traditional delivery times. Change is hard. Understanding the scope of the change
is essential. Table 4 addresses different adoption factors and expectations for the developer and
the customer/acquirer. Lastly, Table 5 provides some candidate transition mechanisms for DoD
adoption of Agile methods. Remember to find and nurture good sponsors for your adoption,
understand the adoption population, conduct a readiness assessment, and determine what adoption
mechanisms you will need and have them on hand early in the adoption process.
Our journey into finding ways for Agile methods to be adopted within DoD has been challenging
and exciting, and has taken unexpected turns. With Congressional direction and subsequent
involvement of OSD policy makers, we have come to realize the need for practical guidance for
the adoption of Agile. Our next endeavor will be to outline and then create a guidebook that DoD
users can employ to help them first understand if their program is a good candidate for using
Agile and then understand how to go about adopting Agile. In the meantime, we are working with
the PEX program (which we interviewed) to jointly publish a paper outlining some potential
changes to Air Force regulations that would ease the adoption of Agile methods.
These additional potential topics for future work were identified during the review of this
document:
- Conduct a more in-depth treatment of contracting types.
- Explore views from the lean software development community that it is more important for
the contract to define how to negotiate changes than to define what will be built [Poppendieck 2003].
- As one of the reviewers pointed out, Agile methods should be unencumbered by regulation,
policy, and law. Agile must put people, rather than bureaucratic processes, in charge, or it will fail. The development teams can be as Agile as they want, and they can have a total understanding of the customers’ desires (which helps a lot for requirements analysis and trade-offs), but if the touch points (listed below) to the rest of the world are not fixed, we will not see much difference. Thus, there is a need to explore how the following elements impact, and need to work with, Agile methods:
− funding (colors of money, time limits on expiration of funds)
− funding approval (Investment Review Board [IRB], $250,000 limit on sustainment
enhancements, standard financial information structure [SFIS], Federal Financial
Management Improvement Act [FFMIA])
− architecture requirements approval (Department of Defense Architecture Framework
(DODAF), common logistics operating environment [CLOE])
− documentation (capability production document (CPD), test and evaluation master plan
(TEMP), information support plan (ISP), economic analysis (EA), capabilities-based
assessment (CBA), business case)
− data center lead times
− contract lead times (justifications and approvals [J&As] for add-on products by a vendor
whose proprietary stack you already own)
- Explore how the relationship between systems engineering and Agile development methods determines the naming and content of technical milestones within an Agile context.
As should be clear from the above list, there are still significant implementation areas for using
Agile methods in DoD acquisition that warrant study. Some of these topics will be covered in our
planned work.
Appendix A: Acronyms
ACAP Analyst Capability
AFB Air Force Base
AFEI Association for Enterprise Information
AFI Air Force Instruction
APO Acquisition Program Office
APEX Application Experience
AsD adaptive software development
BCR baseline change request
CBA capabilities based assessment
CDR Critical Design Review
CDRL contract data requirements list
CEO chief executive officer
CLOE common logistics operating environment
CMMI Capability Maturity Model Integration
COCOMO Constructive Cost Model
COTR contracting officer’s technical representative
CPD capability production document
CPFF cost plus fixed fee
CPIF cost plus incentive fee
DAG Defense Acquisition Guidebook
DARFAR Defense Acquisition Reform Findings and Recommendations
DASA CE Deputy Assistant Secretary of the Army for Cost & Economics
DAU Defense Acquisition University
DBS defense business systems
DFAR Defense Federal Acquisition Regulation
DIACAP DoD Information Assurance Certification and Accreditation Process
DID data item description
DoD Department of Defense
DoDAF Department of Defense Architecture Framework
DoDD DoD Directive
DoDI DoD Instruction
DSDM Dynamic Systems Development Method
DT&E development test and evaluation
EA economic analysis
EVM earned value management
FAR Federal Acquisition Regulation
FCs functional capabilities
FFMIA Federal Financial Management Improvement Act
FFP firm fixed price
FLEX Development Flexibility
FP fixed price
FRP full rate production
FY fiscal year
GAO Government Accountability Office
IA information assurance
IDIQ indefinite delivery indefinite quantity
INVEST Innovation for New Value, Efficiency, and Savings
IOT&E initial operational test and evaluation
IRB Investment Review Board
ISP information support plan
IT information technology
J&As justification and approvals
LCA life cycle architecture
LCO life cycle objectives
Lt Col Lieutenant Colonel
LRIP limited/low rate initial production
MDA Milestone Decision Authority
MOA memorandum of agreement
MOU memorandum of understanding
NDAA National Defense Authorization Act
O&M operations and maintenance
OSD Office of the Secretary of Defense
OT&E operational test and evaluation
PCAP Programmer Capability
PDR Preliminary Design Review
PEX Patriot Excalibur
PHP Hypertext Preprocessor
PL public law
PLCCE program life cycle cost estimate
PM program manager
PMB performance measurement baseline
PMO program management office
PWS performance work statement
QR quality review
RESL Architecture / Risk Resolution
RICE reports, interfaces, conversions, enhancements, or extensions
RFP request for proposal
RUP Rational Unified Process
SAF Secretary of the Air Force
SDR System Design Review
SEER Software Evaluation and Estimation of Resources
SEER-SEM Software Evaluation and Estimation of Resources – Software Estimation Model
SEI Software Engineering Institute
SETA Systems Engineering and Technical Assistance
SFIS standard financial information structure
SIDRE Software Intensive Innovative Development and Reengineering/Evolution
SLIM Software Lifecycle Management-Estimate
SLOC source lines of code
SOW statement of work
SRR System Requirements Review
SSR Software Specification Review
TEAM Team Cohesion
TEMP Test and Evaluation Master Plan
TN technical note
TSP Team Software Process
V&V verification and validation
WBS work breakdown structure
XP eXtreme Programming
Appendix B: Glossary
Backlog
An accumulation, especially of unfinished work or unfilled orders.39
Done
1. Having been carried out or accomplished; finished.40 Author’s note: In an Agile context, the definition of done can include software, documentation, testing, and certification being complete, or any subset of this list. The developer and product owner must agree on what is included in “done.” With this in mind, another definition:
2. The useful definition of doneness stresses the goal of all Agile iterations: the product must remain shippable.
- All visible features work
− as advertised
− within the expected environment
− in any combination
− without degradation over time
− with graceful handling of errors
- Hide all broken or unfinished features
This definition of doneness emphasizes this result: we want a stable app at all times. When we
start the app, we know what is expected to work because we can see it and try it. We can prioritize
new features by seeing how they must be reconciled with already-visible features.41
Epic
A connected or bundled set of stories that result in a definable (in the case of software, desirable)
capability or outcome. An epic is a large user story. It is possible to break up an epic into several
user stories.42
Iteration
In Agile software development,43 a single development cycle, usually measured as one or two
weeks. An iteration may also be defined as the elapsed time between iteration planning sessions.
Just enough
Combining the two dictionary definitions of “just” and “enough,” you get “exactly sufficient.” Within the Agile community, this is an appropriate definition. Thus: just enough to be successful, to get started, to support the user story queue, to accomplish our goal.
39 http://www.thefreedictionary.com/backlog
40 http://www.thefreedictionary.com/done
41 http://billharlan.com/pub/papers/Agile_Essentials.html
42 http://www.targetprocess.com/LearnAgile/AgileGlossary/ThemeEpic.aspx
43 http://searchsoftwarequality.techtarget.com/definition/iteration
Pattern
1. A form of knowledge management. It is a literary form for documenting a common, successful practice. It articulates a recurring problem, as well as the context of the problem and the conditions that contribute to creating it. Likewise, the solution, the rationale for the solution, and consequences of using it are given.44
2. A way to capture expertise. Patterns document good ideas—strategies that have been shown to work well for a variety of people in a variety of circumstances.45
Product Backlog
The master list of all functionality desired in the product.46
Release
1. The act or an instance of issuing something for publication, use, or distribution. 2. Something thus released: a new release of a software program.47
Sprint
A set period of time during which specific work must be completed and made ready for review.48
Often used as a synonym for iteration.
Story
In Agile software development, a story is a particular business need assigned to the software
development team. Stories must be broken down into small enough components that they may be
delivered in a single development iteration.49
Story Point
According to Cohn, “Story points are a unit of measure for expressing the overall size of a user
story, feature, or other piece of work …The number of story points associated with a story
represents the overall size of the story. There is no set formula for defining the size of a story.
Rather a story-point estimate is an amalgamation of the amount of effort involved in developing
the feature, the complexity of developing it, the risk inherent in it and so on.”50
Technical Debt
Technical debt and design debt are synonymous, neologistic metaphors referring to the eventual
consequences of slapdash software architecture and hasty software development. Code debt
refers to technical debt within a codebase.
44 www.eberly.iup.edu/abit/proceedings%5CPatternsAPromisingApproach.pdf
45 Manns, M. L. & Rising, L., Fearless Change: Patterns for Introducing New Ideas, Addison-Wesley, 2005.
46 http://www.mountaingoatsoftware.com/scrum/product-backlog
47 http://www.thefreedictionary.com/release
48 http://searchsoftwarequality.techtarget.com/definition/Scrum-sprint
49 http://searchsoftwarequality.techtarget.com/definition/story
50 Cohn, M., Agile Estimating and Planning, p. 36.
Ward Cunningham first drew the comparison between technical complexity and debt in a 1992
experience report:
Shipping first time code is like going into debt. A little debt speeds development so long as it
is paid back promptly with a rewrite… The danger occurs when the debt is not repaid. Every
minute spent on not-quite-right code counts as interest on that debt. Entire engineering
organizations can be brought to a stand-still under the debt load of an unconsolidated
implementation, object-oriented or otherwise [Ozkaya 2011].
Timebox
A fixed number of hours or days in which to accomplish something.51
Timeboxing
A planning technique common in planning projects (typically for software development), where
the schedule is divided into a number of separate time periods (timeboxes, normally two to six
weeks long), with each part having its own deliverables, deadline, and budget.52
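As a small structural sketch of this definition, a timeboxed schedule can be modeled as an ordered list of fixed periods, each with its own deliverables, deadline, and budget. All names and values below are illustrative assumptions, not taken from the source.

    from dataclasses import dataclass
    from datetime import date
    from typing import List

    @dataclass
    class Timebox:
        """One fixed period in a timeboxed schedule."""
        name: str
        deliverables: List[str]  # due by the deadline
        deadline: date           # fixed; scope flexes, the date does not
        budget: float            # funds allocated to this timebox

    # Three illustrative two-week timeboxes; a slipped feature moves to a
    # later timebox rather than extending the deadline.
    schedule = [
        Timebox("Iteration 1", ["user login"], date(2011, 3, 11), 40_000.0),
        Timebox("Iteration 2", ["search"], date(2011, 3, 25), 40_000.0),
        Timebox("Iteration 3", ["reporting"], date(2011, 4, 8), 40_000.0),
    ]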
User Story
A description of discrete functionality known to be needed by a particular user segment that is part of the project’s audience. Other user stories address infrastructure and quality attributes that are pervasive to the product (e.g., security or usability).
Velocity
Velocity is a measure of a team’s rate of progress. It is calculated by summing the number of story points assigned to each user story that the team completed during the iteration. If the team completes three stories, each estimated at five story points, its velocity is fifteen. If the team completes two five-point stories, its velocity is ten.53 Velocity, in the Agile community, refers to the capacity of a particular team to produce working software. It does not have a general analogue in traditional DoD projects.
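As a minimal illustration of this arithmetic (the function name is ours, not from the source), velocity is simply the sum of completed story points:

    # Velocity: sum of the story points of stories completed in an
    # iteration; partially done stories contribute nothing until finished.

    def velocity(completed_points):
        return sum(completed_points)

    print(velocity([5, 5, 5]))  # three five-point stories -> 15
    print(velocity([5, 5]))     # two five-point stories   -> 10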
51 http://www.agileadvice.com/archives/2006/02/timeboxing_a_cr.html
52 http://en.wikipedia.org/wiki/Timeboxing
53 Cohn, M., Agile Estimating and Planning, p. 38.
Appendix C: Culture Details
This appendix adds detail on some of the research that underlies the approach to cultural issues presented in Section 2.
Dimensions of Culture
Formality and formal communication define the absolutes of a group or organization, and
breaking these rules creates anger. This is because there is often emotion around what is formal
and rule-based. Informality describes the latitude or surroundings that exist around the absolutes, and pushing the limits of the informal can generate anxiety. Informal understanding is often tacit;
it is what you can gather from mentors, role models, and exemplars in the organization. It includes
- unwritten rules
- everyday behavior
- common sense approaches
- common courtesy
- what makes people angry
- what is insulting, admirable, or praiseworthy
- interactions with others
- where people spend their time
Operational concerns are made up of formal and informal elements, and changes in this area are
often seen as bothersome or irritating [Hall 1980]. The operational or intentional aspects of the
culture have been codified, and people can talk about them. Operational aspects include:
- policies enforced
- teaching/training mechanisms
- rites of passage; myths and legends
- rituals
- celebrations
These dimensions of culture are different from actual communication styles, which can also be
described as formal or informal. Because of the emphasis on flexibility, interactions among people, collaboration, and working software (as opposed to processes, tools, plans, and documentation),
Agile styles are seen as informal. The DoD context is often described as the reverse—as formal.
But it is important to pause here: in looking at dimensions of culture and communication styles, it
is essential to consider our own unconscious assumptions. For example, someone who believes
that people need controls, authority, and tight structures in order to be productive is likely to interpret the informality and flexibility of an Agile environment as reflecting laziness or a lack of focus and discipline. Our own unconscious assumptions can blind us to the values, norms, rules, and practices of another culture.
Appendix D: COCOMO Factors List
One popular parametric cost-estimation tool is the COCOMO model. First published by Dr. Barry
Boehm in his 1981 book, Software Engineering Economics, COCOMO (Constructive Cost
Model) is an algorithmic parametric software cost-estimation model that estimates a software project with an “effort equation,” which applies a value to tasks based on the scope of the project (ranging from a small, familiar system to a complex system that is new to the organization). COCOMO II is the successor to COCOMO 81, incorporating more contemporary
software development processes, such as code reuse, use of off-the-shelf software components,
and updated project databases [Boehm 1981].
At the heart of the COCOMO II model are the cost parameters themselves. These parameters are
scale factors (5) and effort multipliers (17). Scale factors represent areas where economies of
scale may apply. Effort multipliers represent the established cost drivers for software system
development. They are used to adjust the nominal software development effort to reflect the
reality of the current product being developed.
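For reference, the effort equation these parameters feed is PM = A × Size^E × (product of the effort multipliers), with E = B + 0.01 × (sum of the scale factor ratings). The sketch below uses the published COCOMO II.2000 calibration constants (A = 2.94, B = 0.91) and nominal ratings; the example project size is an illustrative assumption.

    # Sketch of the COCOMO II effort equation using the COCOMO II.2000
    # calibration constants A = 2.94 and B = 0.91.

    def cocomo2_effort(ksloc, scale_factors, effort_multipliers,
                       a=2.94, b=0.91):
        """Person-months: PM = A * Size^E * product(EM),
        where E = B + 0.01 * sum(SF) and Size is in KSLOC."""
        e = b + 0.01 * sum(scale_factors)
        product = 1.0
        for em in effort_multipliers:
            product *= em
        return a * (ksloc ** e) * product

    # Illustrative 50-KSLOC project with the five scale factors at their
    # nominal ratings (PREC, FLEX, RESL, TEAM, PMAT) and all 17 effort
    # multipliers at nominal (1.0).
    nominal_sf = [3.72, 3.04, 4.24, 3.29, 4.68]
    print(round(cocomo2_effort(50, nominal_sf, [1.0] * 17)))  # ~217 person-months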
It would be reasonable to assert that an Agile development process would have an impact on some
of these parameters. The following scale factors and effort multipliers, pulled from COCOMO II,
might be impacted by the use of an Agile development process:
Development Flexibility (FLEX) Scale Factor
Definition: The FLEX scale factor is related to the flexibility in conforming to stated
requirements.
Rationale: The participation of the user in the Agile development process, coupled with an
iterative approach to building, should lower cost and schedule variance, because
appropriate use of the methods assures continual communication as situations
change. This permits appropriate reprioritization when needed.
Architecture/Risk Resolution (RESL) Scale Factor
Definition: The RESL scale factor is related to early, proactive risk identification and
elimination. The goal is to eliminate software risk by Preliminary Design Review
(PDR). This factor is also related to the need for software architecture.
Rationale: Although there is opportunity to tackle high-risk items early in the product
lifecycle with an Agile approach, there is no guarantee that this will actually
happen. The lack of clear guidance regarding how to accomplish a milestone
review in an Agile development process, and the general lack of consensus in the
Agile community on the need for or approach to developing a viable architecture,
could increase the cost estimate.
Team Cohesion (TEAM) Scale Factor
Definition: The TEAM scale factor accounts for sources of project turbulence and entropy
because of difficulties in synchronizing the project’s stakeholders (e.g., users,
customers, developers, maintainers, and interfacers). These difficulties may arise
from differences in stakeholder objectives and cultures, difficulties in reconciling
objectives, and stakeholders’ lack of experience and familiarity with operating as
a team.
Rationale: The Agile culture, in addition to the frequent interchanges between the user and
the developers, should provide plenty of opportunity to improve team cohesion
and should lower the cost estimate.
Analyst Capability (ACAP) Effort Multiplier
Definition: Analysts are personnel who work on requirements, high-level design, and detailed
design. The major attributes that should be considered in this rating are analysis
and design ability, efficiency and thoroughness, and the ability to communicate
and cooperate.
Rationale: The participation of users in the development process should improve the
knowledge of the analysts that elaborate the requirements and produce the
software design. The impact of the improvement should lower the cost estimate.
Programmer Capability (PCAP) Effort Multiplier
Definition: Current trends continue to emphasize the importance of highly capable analysts.
However, the increasing role of complex COTS packages, and the significant
productivity leverage associated with programmers’ ability to deal with these
COTS packages, indicates a trend toward higher importance of programmer
capability as well. Evaluation should be based on the capability of the
programmers as a team rather than as individuals. Major factors that should be
considered in the rating are ability, efficiency, and thoroughness, and the ability to
communicate and cooperate.
Rationale: The participation of users in the development process should improve the
knowledge of the programmers who write the software code. This is the factor
that most relates to the Agile measure of velocity. The impact of the improvement
should lower the cost estimate.
Application Experience (APEX) Effort Multiplier
Definition: The cost-estimating multiplier based on the domain knowledge and capability of
the software development staff is called APEX. The rating for this cost driver is
dependent on the level of applications experience of the project team developing
the software system or subsystem. The ratings are defined in terms of the project
team’s equivalent level of experience with this type of application.
Rationale: The participation of users in the development process should improve the domain
knowledge of the development team. The impact of the improvement should
lower the cost estimate.