Software project estimation : the fundamentals for providing high quality information to decision makers /

Software projects are often late and over budget, and this leads to major problems for software customers. Clearly, there is a serious issue in estimating realistic software project budgets. Furthermore, generic estimation models cannot be trusted to provide credible estimates for projects as compl...

Full description

Bibliographic record details
Main author: Abran, Alain, 1949-
Format: E-book
Language: English
Published: Hoboken, New Jersey : John Wiley & Sons Inc., [2015]
Subjects:
Available online: Full Text via HEAL-Link
Table of contents:
  • Foreword xiii
  • Overview xvii
  • Acknowledgments xxiii
  • About the Author xxv
  • Part One Understanding the Estimation Process 1
  • 1. The Estimation Process: Phases and Roles 3
  • 1.1. Introduction 3
  • 1.2. Generic Approaches in Estimation Models: Judgment or Engineering? 4
  • 1.2.1. Practitioner's Approach: Judgment and Craftsmanship 4
  • 1.2.2. Engineering Approach: Modest - One Variable at a Time 5
  • 1.3. Overview of Software Project Estimation and Current Practices 6
  • 1.3.1. Overview of an Estimation Process 6
  • 1.3.2. Poor Estimation Practices 7
  • 1.3.3. Examples of Poor Estimation Practices 9
  • 1.3.4. The Reality: A Tally of Failures 10
  • 1.4. Levels of Uncertainty in an Estimation Process 11
  • 1.4.1. The Cone of Uncertainty 11
  • 1.4.2. Uncertainty in a Productivity Model 12
  • 1.5. Productivity Models 14
  • 1.6. The Estimation Process 16
  • 1.6.1. The Context of the Estimation Process 16
  • 1.6.2. The Foundation: The Productivity Model 17
  • 1.6.3. The Full Estimation Process 18
  • 1.7. Budgeting and Estimating: Roles and Responsibilities 23
  • 1.7.1. Project Budgeting: Levels of Responsibility 23
  • 1.7.2. The Estimator 25
  • 1.7.3. The Manager (Decision-Taker and Overseer) 25
  • 1.8. Pricing Strategies 27
  • 1.8.1. Customers-Suppliers: The Risk Transfer Game in Estimation 28
  • 1.9. Summary - Estimating Process, Roles, and Responsibilities 28
  • Exercises 30
  • Term Assignments 31
  • 2. Engineering and Economics Concepts for Understanding Software Process Performance 32
  • 2.1. Introduction: The Production (Development) Process 32
  • 2.2. The Engineering (and Management) Perspective on a Production Process 34
  • 2.3. Simple Quantitative Process Models 36
  • 2.3.1. Productivity Ratio 36
  • 2.3.2. Unit Effort (or Unit Cost) Ratio 38
  • 2.3.3. Averages 39
  • 2.3.4. Linear and Non-Linear Models 42
  • 2.4. Quantitative Models and Economics Concepts 45
  • 2.4.1. Fixed and Variable Costs 45
  • 2.4.2. Economies and Diseconomies of Scale 48
  • 2.5. Software Engineering Datasets and Their Distribution 49
  • 2.5.1. Wedge-Shaped Datasets 49
  • 2.5.2. Homogeneous Datasets 50
  • 2.6. Productivity Models: Explicit and Implicit Variables 52
  • 2.7. A Single and Universal Catch-All Multidimensional Model or Multiple Simpler Models? 54
  • 2.7.1. Models Built from Available Data 55
  • 2.7.2. Models Built on Opinions on Cost Drivers 55
  • 2.7.3. Multiple Models with Coexisting Economies and Diseconomies of Scale 56
  • Exercises 58
  • Term Assignments 59
  • 3. Project Scenarios, Budgeting, and Contingency Planning 60
  • 3.1. Introduction 60
  • 3.2. Project Scenarios for Estimation Purposes 61
  • 3.3. Probability of Underestimation and Contingency Funds 65
  • 3.4. A Contingency Example for a Single Project 67
  • 3.5. Managing Contingency Funds at the Portfolio Level 69
  • 3.6. Managerial Prerogatives: An Example in the AGILE Context 69
  • 3.7. Summary 71
  • Further Reading: A Simulation for Budgeting at the Portfolio Level 71
  • Exercises 74
  • Term Assignments 75
  • Part Two Estimation Process: What Must be Verified? 77
  • 4. What Must be Verified in an Estimation Process: An Overview 79
  • 4.1. Introduction 79
  • 4.2. Verification of the Direct Inputs to an Estimation Process 81
  • 4.2.1. Identification of the Estimation Inputs 81
  • 4.2.2. Documenting the Quality of These Inputs 82
  • 4.3. Verification of the Productivity Model 84
  • 4.3.1. In-House Productivity Models 84
  • 4.3.2. Externally Provided Models 85
  • 4.4. Verification of the Adjustment Phase 86
  • 4.5. Verification of the Budgeting Phase 87
  • 4.6. Re-Estimation and Continuous Improvement to the Full Estimation Process 88
  • Further Reading: The Estimation Verification Report 89
  • Exercises 92
  • Term Assignments 93
  • 5. Verification of the Dataset Used to Build the Models 94
  • 5.1. Introduction 94
  • 5.2. Verification of DIRECT Inputs 96
  • 5.2.1. Verification of the Data Definitions and Data Quality 96
  • 5.2.2. Importance of the Verification of the Measurement Scale Type 97
  • 5.3. Graphical Analysis - One-Dimensional 100
  • 5.4. Analysis of the Distribution of the Input Variables 102
  • 5.4.1. Identification of a Normal (Gaussian) Distribution 102
  • 5.4.2. Identification of Outliers: One-Dimensional Representation 103
  • 5.4.3. Log Transformation 107
  • 5.5. Graphical Analysis - Two-Dimensional 108
  • 5.6. Size Inputs Derived from a Conversion Formula 111
  • 5.7. Summary 112
  • Further Reading: Measurement and Quantification 113
  • Exercises 116
  • Term Assignments 117
  • Exercises - Further Reading Section 117
  • Term Assignments - Further Reading Section 118
  • 6. Verification of Productivity Models 119
  • 6.1. Introduction 119
  • 6.2. Criteria Describing the Relationships Across Variables 120
  • 6.2.1. Simple Criteria 120
  • 6.2.2. Practical Interpretation of Criteria Values 122
  • 6.2.3. More Advanced Criteria 124
  • 6.3. Verification of the Assumptions of the Models 125
  • 6.3.1. Three Key Conditions Often Required 125
  • 6.3.2. Sample Size 126
  • 6.4. Evaluation of Models by Their Own Builders 127
  • 6.5. Models Already Built - Should You Trust Them? 128
  • 6.5.1. Independent Evaluations: Small-Scale Replication Studies 128
  • 6.5.2. Large-Scale Replication Studies 129
  • 6.6. Lessons Learned: Distinct Models by Size Range 133
  • 6.6.1. In Practice, Which is the Better Model? 138
  • 6.7. Summary 138
  • Exercises 139
  • Term Assignments 139
  • 7. Verification of the Adjustment Phase 141
  • 7.1. Introduction 141
  • 7.2. Adjustment Phase in the Estimation Process 142
  • 7.2.1. Adjusting the Estimation Ranges 142
  • 7.2.2. The Adjustment Phase in the Decision-Making Process: Identifying Scenarios for Managers 144
  • 7.3. The Bundled Approach in Current Practices 145
  • 7.3.1. Overall Approach 145
  • 7.3.2. Detailed Approach for Combining the Impact of Multiple Cost Drivers in Current Models 146
  • 7.3.3. Selecting and Categorizing Each Adjustment: The Transformation of Nominal Scale Cost Drivers into Numbers 147
  • 7.4. Cost Drivers as Estimation Submodels! 148
  • 7.4.1. Cost Drivers as Step Functions 148
  • 7.4.2. Step Function Estimation Submodels with Unknown Error Ranges 149
  • 7.5. Uncertainty and Error Propagation 151
  • 7.5.1. Error Propagation in Mathematical Formulas 151
  • 7.5.2. The Relevance of Error Propagation in Models 153
  • Exercises 156
  • Term Assignments 157
  • Part Three Building Estimation Models: Data Collection and Analysis 159
  • 8. Data Collection and Industry Standards: The ISBSG Repository 161
  • 8.1. Introduction: Data Collection Requirements 161
  • 8.2. The International Software Benchmarking Standards Group 163
  • 8.2.1. The ISBSG Organization 163
  • 8.2.2. The ISBSG Repository 164
  • 8.3. ISBSG Data Collection Procedures 165
  • 8.3.1. The Data Collection Questionnaire 165
  • 8.3.2. ISBSG Data Definitions 167
  • 8.4. Completed ISBSG Individual Project Benchmarking Reports: Some Examples 170
  • 8.5. Preparing to Use the ISBSG Repository 173
  • 8.5.1. ISBSG Data Extract 173
  • 8.5.2. Data Preparation: Quality of the Data Collected 173
  • 8.5.3. Missing Data: An Example with Effort Data 175
  • Further Reading 1: Benchmarking Types 177
  • Further Reading 2: Detailed Structure of the ISBSG Data Extract 179
  • Exercises 183
  • Term Assignments 183
  • 9. Building and Evaluating Single Variable Models 185
  • 9.1. Introduction 185
  • 9.2. Modestly, One Variable at a Time 186
  • 9.2.1. The Key Independent Variable: Software Size 186
  • 9.2.2. Analysis of the Work-Effort Relationship in a Sample 188
  • 9.3. Data Preparation 189
  • 9.3.1. Descriptive Analysis 189
  • 9.3.2. Identifying Relevant Samples and Outliers 189
  • 9.4. Analysis of the Quality and Constraints of Models 193
  • 9.4.1. Small Projects 195
  • 9.4.2. Larger Projects 195
  • 9.4.3. Implication for Practitioners 195
  • 9.5. Other Models by Programming Language 196
  • 9.6. Summary 202
  • Exercises 203
  • Term Assignments 203
  • 10. Building Models with Categorical Variables 205
  • 10.1. Introduction 205
  • 10.2. The Available Dataset 206
  • 10.3. Initial Model with a Single Independent Variable 208
  • 10.3.1. Simple Linear Regression Model with Functional Size Only 208
  • 10.3.2. Nonlinear Regression Models with Functional Size 208
  • 10.4. Regression Models with Two Independent Variables 210
  • 10.4.1. Multiple Regression Models with Two Independent Quantitative Variables 210
  • 10.4.2. Multiple Regression Models with a Categorical Variable: Project Difficulty 210
  • 10.4.3. The Interaction of Independent Variables 215
  • Exercises 216
  • Term Assignments 217
  • 11. Contribution of Productivity Extremes in Estimation 218
  • 11.1. Introduction 218
  • 11.2. Identification of Productivity Extremes 219
  • 11.3. Investigation of Productivity Extremes 220
  • 11.3.1. Projects with Very Low Unit Effort 221
  • 11.3.2. Projects with Very High Unit Effort 222
  • 11.4. Lessons Learned for Estimation Purposes 224
  • Exercises 225
  • Term Assignments 225
  • 12. Multiple Models from a Single Dataset 227
  • 12.1. Introduction 227
  • 12.2. Low and High Sensitivity to Functional Size Increases: Multiple Models 228
  • 12.3. The Empirical Study 230
  • 12.3.1. Context 230
  • 12.3.2. Data Collection Procedures 231
  • 12.3.3. Data Quality Controls 231
  • 12.4. Descriptive Analysis 231
  • 12.4.1. Project Characteristics 231
  • 12.4.2. Documentation Quality and Its Impact on Functional Size Quality 233
  • 12.4.3. Unit Effort (in Hours) 234
  • 12.5. Productivity Analysis 234
  • 12.5.1. Single Model with the Full Dataset 234
  • 12.5.2. Model of the Least Productive Projects 235
  • 12.5.3. Model of the Most Productive Projects 237
  • 12.6. External Benchmarking with the ISBSG Repository 238
  • 12.6.1. Project Selection Criteria and Samples 238
  • 12.6.2. External Benchmarking Analysis 239
  • 12.6.3. Further Considerations 240
  • 12.7. Identification of the Adjustment Factors for Model Selection 241
  • 12.7.1. Projects with the Highest Productivity (i.e., the Lowest Unit Effort) 241
  • 12.7.2. Lessons Learned 242
  • Exercises 243
  • Term Assignments 243
  • 13. Re-Estimation: A Recovery Effort Model 244
  • 13.1. Introduction 244
  • 13.2. The Need for Re-Estimation and Related Issues 245
  • 13.3. The Recovery Effort Model 246
  • 13.3.1. Key Concepts 246
  • 13.3.2. Ramp-Up Process Losses 247
  • 13.4. A Recovery Model When a Re-Estimation Need is Recognized at Time T > 0 248
  • 13.4.1. Summary of Recovery Variables 248
  • 13.4.2. A Mathematical Model of a Recovery Course in Re-Estimation 248
  • 13.4.3. Probability of Underestimation - p(u) 249
  • 13.4.4. Probability of Acknowledging the Underestimation on a Given Month - p(t) 250
  • Exercises 251
  • Term Assignments 251
  • References 253
  • Index 257