Solid Software
by Shari Lawrence Pfleeger
Software Quality Institute Series



For courses in Software Reliability, Software Testing and Verification, Software Requirements, Software Metrics, and Software Engineering (Advanced).

Solid Software presents realistic techniques for analyzing and improving the quality and robustness of any software system or software-intensive product. It isn't theoretical: it's a relentlessly practical decision-maker's guide to making intelligent, responsible trade-offs that lead to the best software at the best cost. The book draws on dozens of real-world examples, based on the author's extensive experience as a software quality consultant and on interviews with key software decision makers worldwide. Whether you're a developer, project manager, architect, executive, manager, or regulator, it's your single source for improving software quality in the real world.

Table of Contents

Preface

Why Is This Book Needed?
  Software: The Universal Weak Link?
  Unreliability and Unavailability
  Lack of Security
  Unpredictable Performance
  Difficulty in Upgrading
  Trade-offs and Correlation in Aspects of Fragility
  Why Is This So Hard?
  Programmer Optimism and Gutless Estimating
  Discrete versus Continuous Systems
  Immaturity Combined with Rapid Change
  Repeating Our Mistakes
  Solid, Survivable Software
  Critical Systems
  Stakeholders
  Surviving a Software Project
  The Road Ahead
  References

Defining Quality: What Do You Want?
  Five Views of Quality
  Risky Business
  Risk and Quality
  Consequences of Failure
  Product Failure
  Process Failure
  Resource Failure
  Rules of the Road
  References

Hazard Analysis
  The Rewards of Caution
  What Is Hazard Analysis?
  HAZOP
  Fault-Tree Analysis
  Failure Modes and Effects Analysis
  How to Describe Problems
  Failure Modes
  Consequences and Probability
  Planning for Hazard Analysis
  Who Performs the Hazard Analysis?
  When Are You Done?
  For Additional Information
  References

Testing
  Types of Faults
  Orthogonal Defect Classification
  Testing Strategies
  Types of Testing
  Approaches to Unit Testing
  Approaches to Integration Testing
  Comparison of Integration Strategies
  Approaches to Acceptance Testing
  Results of Acceptance Tests
  Approaches to Installation Testing
  Test Cases and Results
  Keeping Test Cases and Data
  Who Should Test?
  Automated Testing Tools
  Code Analysis Tools
  Test Execution Tools
  Testing: Good and Bad
  How Much Testing Is Enough?
  Testing Planning
  Stopping Criteria
  Assessing Testing Risk and Trade-offs
  Comparing Techniques
  Cost and Return on Investment
  References

Software Design
  The Audience for Design
  The Meaning of Good Design
  Modularity, Levels of Abstraction, and Information Hiding
  Component Independence
  Issues to Consider in Good Design
  Collaborative Design
  Designing the User Interface
  Concurrency
  Design Leverage Points
  Fault Tolerance Philosophy
  Error-Handling Design
  Design Rationale and History
  Design Patterns
  References

Prediction
  Predicting Software Characteristics
  The Jelinski-Moranda Model
  The Littlewood Model
  Importance of the Operational Environment
  Predicting Effort
  Expert Judgment
  Algorithmic Methods
  Machine Learning Methods
  Evaluating Model Accuracy
  Predicting and Evaluating Return on Investment
  Technical Quality Alone Is Misleading
  Customer Satisfaction Alone Is Inadequate
  Market Leadership Alone Is Inappropriate
  Using Economic Value to Support Investment Decisions
  Predicting and Managing Risk
  What Is a Risk?
  Risk Management Activities
  Cautions about Risk
  Cleaning Up Our Risks
  Acknowledgments
  References

Peer Reviews
  What Is a Review?
  Review Effectiveness
  Product Inspection
  Planning
  Individual Preparation
  Logging Meeting
  Reworking
  Reinspection
  Process Improvement
  Reviewing Speed
  Fault Discovery: What the Reviewers Find
  Fault Evasion: What the Reviewers Miss
  How to Improve Review Results: The Psychological Basis
  Automating the Review Process
  Pitfalls of the Review Process
  The Role of Checklists
  Example Checklist for Source Code Inspections
  References

Static Analysis
  Static Fault versus Dynamic Failure
  When Faults Cause Failures
  Early versus Late Detection
  Measurements for Static Analysis
  Coverage: How Much Is Enough?
  Approaches to Static Analysis
  Static Analysis of Designs
  Using Automation to Find Code Faults
  Code Faults That Cannot Be Found by Automation
  Static Noise
  References

Configuration Management
  Constant Change
  Corrective Changes
  Adaptive Changes
  Perfective Changes
  Preventive Changes
  Worth the Effort?
  Getting Control
  Versions, Releases, and the Challenge of Commercial Components
  The Four Facets of SCM
  Configuration Identification
  Configuration Control and Change Management
  Configuration Auditing
  Status Accounting
  Applying the Principles: Regression Testing
  Change Control Boards
  Impact Analysis
  One Size Does Not Fit All
  Tool Support
  Text Editors
  File Comparators
  Compilers and Linkers
  Cross-reference Generators
  Static Code Analyzers
  Configuration Management Repositories
  Which Tools to Use?
  Begin with the End, but Start Where You Are
  References

Using Appropriate Tools
  How Tools Develop
  The Evolution of Software Tools
  Tool Properties
  The Anatomy of a Valuable Tool
  The Unix Pipeline
  TCL/TK
  SKS
  Tool Quality
  Compiler Validation
  Tooling and Process
  SAM 2000
  Tooling and the Organization
  References

Trust but Verify
  Where We Are
  Learning from Mistakes
  The Survey
  Objective Information
  Debriefing Meeting
  Project History Day
  Publishing the Results
  The Importance of Being Human
  Best Practices
  Making Decisions
  Group Decision Making
  How We Really Decide
  How Groups Really Make Decisions
  Deciding What Is Right for Your Situation
  What's Next?
  S-systems
  P-systems
  E-systems
  References

Index
Table of Contents provided by Syndetics. All Rights Reserved.

ISBN: 9780130912985
ISBN-10: 0130912980
Series: Software Quality Institute Series
Audience: Tertiary; University or College
Format: Paperback
Language: English
Number Of Pages: 336
Published: 12th July 2001
Country of Publication: US
Dimensions (cm): 22.86 x 17.15 x 2.54
Weight (kg): 0.64