- Number of slides: 83
USC C S E University of Southern California Center for Software Engineering
The Future of Software Engineering
Barry Boehm, USC
Boeing Presentation, May 10, 2002
©USC-CSE
Acknowledgments: USC-CSE Affiliates
• Commercial Industry (15) – DaimlerChrysler, Freshwater Partners, Galorath, GroupSystems.com, Hughes, IBM, Cost Xpert Group, Microsoft, Motorola, Price Systems, Rational, Reuters Consulting, Sun, Telcordia, Xerox
• Aerospace Industry (6) – Boeing, Lockheed Martin, Northrop Grumman, Raytheon, SAIC, TRW
• Government (8) – DARPA, DISA, FAA, NASA-Ames, NSF, OSD/ARA/SIS, US Army Research Labs, US Army TACOM
• FFRDCs and Consortia (4) – Aerospace, JPL, SEI, SPC
• International (1) – Chung-Ang U. (Korea)
Trends: Traditional → Future
• Standalone systems → Everything connected (maybe)
• Mostly source code → Mostly COTS components
• Requirements-driven → Requirements are emergent
• Control over evolution → No control over COTS evolution
• Focus on software → Focus on systems and software
• Stable requirements → Rapid change
• Premium on cost → Premium on value, speed, quality
• Staffing workable → Scarcity of critical talent
Large-System Example: Future Combat Systems
[Diagram, "From This... To This...": a network-centric force of network-centric distributed platforms – small-unit UAV, other layered sensors, robotic sensor, distributed fire mechanisms, robotic direct fire, robotic NLOS fire, manned C2/infantry squad]
Exploit battlefield non-linearities, using technology to reduce the size of platforms and the force.
Total Collaborative Effort to Support FCS
[Schedule chart, FY00–FY06: design competition → breadboard → brassboard → redesign/test → concept development / modeling and simulation → government-run experiments → shakeout → preliminary design (PDR) → detailed design (CDR) → build → T&E. Tracks include: CHPS robotics (autonomy, vision, mobility design); unmanned ground vehicle; command & control (SUO C3 comms); networked fires weapon (loiter attack, BLOS, AFSS, precision attack); organic all-weather targeting vehicle (A160, ~$5.0M, near-all-weather to all-weather); all-weather surveillance and targeting sensor (precision sensing); BLOS surveillance & targeting system. Milestones: IOR 1, IOR 2, SOG reviews.]
FCS Product/Process Interoperability Challenge
Small-Systems Example: e-Commerce (LA SPIN, Copyright © 2001 C-bridge)
• 16-24 week fixed schedule
• Iteration: scope, listening, delivery focus
• Phases: Define → Design → Develop → Deploy, with "lines of readiness" gates ("Are we ready for the next step?")
[Process diagram: steps run from identifying and documenting basic business use cases through object and domain modeling (logical data model, system design), Builds 1 and 2, a stabilization build, beta test, release to pilot program, and production]
Future Software/Systems Challenges
• Distribution, mobility, autonomy
• Systems of systems, networks of networks, agents of agents
• Group-oriented user interfaces
• Nonstop adaptation and upgrades
• Synchronizing many independent evolving systems
• Hybrid, dynamic, evolving architectures
• Complex adaptive systems behavior
Complex Adaptive Systems Characteristics
• Order is not imposed; it emerges (Kauffman)
• Viability determined by adaptation rules – too inflexible: gridlock; too flexible: chaos
• Equilibrium determined by "fitness landscape" – a function assigning values to system states
• Adaptation yields better outcomes than optimization – software approaches: Agile Methods
Getting the Best from Agile Methods
• Representative agile methods
• The Agile Manifesto
• The planning spectrum – relations to Software CMM and CMMI
• Agile and plan-driven home grounds
• How much planning is enough? – risk-driven hybrid approaches
Leading Agile Methods
• Adaptive Software Development (ASD)
• Agile Modeling
• Crystal methods
• Dynamic Systems Development Method (DSDM)
• eXtreme Programming (XP)
• Feature Driven Development
• Lean Development
• Scrum
XP: The 12 Practices
• The Planning Game • Small Releases • Metaphor • Simple Design • Testing • Refactoring • Pair Programming • Collective Ownership • Continuous Integration • 40-Hour Week • On-site Customer • Coding Standards
– Used generatively, not imperatively

The Agile Manifesto
We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
• Individuals and interactions over processes and tools
• Working software over comprehensive documentation
• Customer collaboration over contract negotiation
• Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.

Counterpoint: A Skeptical View
– Letter to Computer, Steven Rakitin, Dec. 2000
"Customer collaboration over contract negotiation" – Translation: Let's not spend time haggling over the details; it only interferes with our ability to spend all our time coding. We'll work out the kinks once we deliver something...
"Responding to change over following a plan" – Translation: Following a plan implies we would have to spend time thinking about the problem and how we might actually solve it. Why would we want to do that when we could be coding?
The Planning Spectrum
Hackers → XP → Adaptive SW Devel. → Milestone Risk-Driven Models → Milestone Plan-Driven Models → Inch-Pebble Ironbound Contract (micro-milestones)
(Agile Methods cluster toward the left of the spectrum; Software CMM and CMMI toward the right)
Agile and Plan-Driven Home Grounds
Agile Home Ground vs. Plan-Driven Home Ground:
• Agile, knowledgeable, collocated, collaborative developers vs. plan-oriented developers with a mix of skills
• Above plus representative, empowered customers vs. mix of customer capability levels
• Reliance on tacit interpersonal knowledge vs. reliance on explicit documented knowledge
• Largely emergent requirements, rapid change vs. requirements knowable early, largely stable
• Architected for current requirements vs. architected for current and foreseeable requirements
• Refactoring inexpensive vs. refactoring expensive
• Smaller teams, products vs. larger teams, products
• Premium on rapid value vs. premium on high assurance
How Much Planning Is Enough? – A Risk Analysis Approach
• Risk Exposure: RE = Prob(Loss) × Size(Loss)
  – "Loss": financial; reputation; future prospects, …
• For multiple sources of loss: RE = Σ_sources [Prob(Loss) × Size(Loss)]
Example RE Profile: Planning Detail – Loss Due to Inadequate Plans
RE = P(L) × S(L), plotted against time and effort invested in plans:
• Little planning: high P(L) (inadequate plans), high S(L) (major problems: oversights, delays, rework)
• Thorough planning: low P(L), low S(L) (minor problems)
Example RE Profile: Planning Detail – Adding Loss Due to Market Share Erosion
RE = P(L) × S(L), plotted against time and effort invested in plans:
• Inadequate-plans loss: high P(L), high S(L) (major problems: oversights, delays, rework) with little planning; low P(L), low S(L) (minor problems) with thorough plans
• Market-erosion loss: low P(L) (few plan delays), low S(L) (early value capture) with little planning; high P(L) (plan breakage, delay), high S(L) (value capture delays) with heavy planning
Example RE Profile: Planning Detail – Sum of Risk Exposures
RE = P(L) × S(L), summed over both loss sources. As time and effort invested in plans increase, the inadequate-plans curve falls while the market-erosion curve rises; their sum has a minimum, the "Sweet Spot".
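The sweet-spot argument above can be sketched numerically. This is a toy model, not data from the talk: the two exponential loss curves and all their constants are invented purely to reproduce the qualitative picture of a planning level that minimizes summed risk exposure.

```python
# Toy model of RE = P(L) * S(L) summed over two loss sources, as a function of
# planning effort p in [0, 1]. Curve shapes and constants are hypothetical.
import math

def re_inadequate_plans(p):
    # Little planning -> high probability and size of rework loss; falls with p.
    return 10 * math.exp(-4 * p)

def re_market_erosion(p):
    # Heavy planning -> delayed value capture as the market moves on; rises with p.
    return 0.5 * math.exp(3 * p)

def total_re(p):
    return re_inadequate_plans(p) + re_market_erosion(p)

# Scan planning effort p from 0 (none) to 1 (exhaustive) for the minimum.
grid = [i / 100 for i in range(101)]
sweet_spot = min(grid, key=total_re)
print(f"sweet spot at p = {sweet_spot:.2f}, total RE = {total_re(sweet_spot):.2f}")
```

With these made-up curves the minimum lands strictly between the endpoints, matching the slide's interior sweet spot; shifting the loss-size constants moves it left (agile home ground) or right (plan-driven home ground), as the next two slides illustrate.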
Comparative RE Profile: Plan-Driven Home Ground
RE = P(L) × S(L). Higher S(L) (large-system rework) shifts the plan-driven sweet spot to the right of the mainstream sweet spot: more time and effort invested in plans.

Comparative RE Profile: Agile Home Ground
RE = P(L) × S(L). Lower S(L) (easy rework) shifts the agile sweet spot to the left of the mainstream sweet spot: less time and effort invested in plans.
Conclusions: CMMI and Agile Methods
• Agile and plan-driven methods have best-fit home grounds – increasing pace of change requires more agility
• Risk considerations help balance agility and planning – risk-driven "How much planning is enough?"
• Risk-driven agile/plan-driven hybrid methods available – Adaptive Software Development, RUP, MBASE, CeBASE Method – explored at USC-CSE Affiliates' Executive Workshop, March 12-14
• CMMI provides enabling criteria for hybrid methods – Risk Management, Integrated Teaming
Trends (transition slide: repeats the Traditional → Future comparison from the opening)
Getting the Best from COTS
• COTS software: a behavioral definition
• COTS empirical hypotheses: top-10 list – COTS best-practice implications
• COTS-based systems (CBS) challenges
COTS Software: A Behavioral Definition
• Commercially sold and supported – avoids expensive development and support
• No access to source code – special case: commercial open-source
• Vendor controls evolution – vendor motivation to add, evolve features
• No vendor guarantees – of dependability, interoperability, continuity, …
COTS Top-10 Empirical Hypotheses – I
– Basili & Boehm, Computer, May 2001, pp. 91-93
1. Over 99% of all executing computer instructions come from COTS – each has passed a market test for value
2. More than half of large-COTS-product features go unused
3. New COTS release frequency averages 8-9 months – average of 3 releases before becoming unsupported
4. CBS life-cycle effort can scale as high as N², where N = # of independent COTS products in a system
5. CBS post-deployment costs exceed development costs – except for short-lifetime systems
Usual Hardware-Software Trend Comparison
[Chart, log N vs. time: HW curve = new transistors in service per year; SW curve = new source lines of code per year]
• Different counting rules
• Try counting software as Lines of Code in Service: LOCS = (# platforms) × (# object LOCS/platform)
Lines of Code in Service: U.S. Dept. of Defense
1. DoD LOCS: % COTS, 2000 (M = million, T = trillion)

Platform  | # P'forms (M) | LOCS/P'form (M) | LOCS (T) | % COTS | Non-COTS LOCS (T)
Mainframe | .015          | 200             | 3        | 95     | .15
Mini      | .10           | 100             | 10       | 99     | .10
Micro     | 2             | 100             | 200      | 99.5   | 1.00
Combat    | 2             | 2               | 4        | 80     | .80
Total     |               |                 | 217      |        | 2.05 (<1%)

• COTS often an economic necessity
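The table's row and column totals can be checked against the stated counting rule, LOCS = (# platforms) × (# object LOCS/platform); a couple of per-platform cells the text extraction dropped are back-filled here from the row arithmetic, so treat those as reconstructions.

```python
# Check the DoD LOCS table arithmetic: LOCS = (# platforms) x (LOCS/platform),
# and non-COTS LOCS = (1 - %COTS) x LOCS. Counts are in millions, so the
# product of two "millions" columns comes out directly in trillions.
platforms = {
    # name: (# platforms, M; object LOCS per platform, M; % COTS)
    "Mainframe": (0.015, 200, 95.0),
    "Mini":      (0.10,  100, 99.0),   # per-platform value reconstructed
    "Micro":     (2.0,   100, 99.5),
    "Combat":    (2.0,     2, 80.0),
}

total_locs_t = 0.0
total_non_cots_t = 0.0
for name, (n_m, locs_per_pform_m, pct_cots) in platforms.items():
    locs_t = n_m * locs_per_pform_m             # trillions of LOCS
    non_cots_t = locs_t * (1 - pct_cots / 100)  # trillions of non-COTS LOCS
    total_locs_t += locs_t
    total_non_cots_t += non_cots_t
    print(f"{name}: {locs_t:g} T LOCS, {non_cots_t:.2f} T non-COTS")

print(f"Total: {total_locs_t:g} T LOCS ({total_non_cots_t:.2f} T non-COTS, <1%)")
```

The totals reproduce the slide's 217 T LOCS and 2.05 T (under 1%) non-COTS.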
2. Use of COTS Features: Microsoft Word and PowerPoint
– K. Torii Lab, Japan: ISERN 2000 Workshop
• Individuals: 12-16% of features used
• 10-person group: 26-29% of features used
• Extra features cause extra work
• Consider build vs. buy for a small # of features
3. COTS Release Frequency: Survey Data
– Ron Kohl / GSAW surveys, 1999-2002*

GSAW Survey | Release Frequency (months)
1999        | 6.3
2000        | 8.5
2001        | 8.75
2002        | 9.6

• Adaptive maintenance often biggest CBS life-cycle cost
• Average of 3 releases before becoming unsupported
* Ground System Architecture Workshops, Aerospace Corp.
4. CBS Effort Scaling As High As N²
• Strong COTS coupling (COTS products interconnected with each other and with the custom SW): Effort ~ N + N(N-1)/2 = N(N+1)/2 (here, for N = 3: 6)
• Weak COTS coupling (each COTS product connected only to the custom SW): Effort ~ N (here = 3)
4. CBS Effort Scaling – II
[Chart: relative COTS adaptation effort vs. # COTS (1-4); the strong-coupling curve rises steeply (toward ~10), the weak-coupling curve rises linearly]
• Reduce # COTS or weaken COTS coupling – COTS choices, wrappers, domain architectures, open standards, COTS refresh synchronization
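The two coupling cases on the preceding slides amount to counting interfaces, which a minimal sketch makes explicit:

```python
# Interface counts behind the "effort ~ N^2" hypothesis. With N COTS products
# strongly coupled to each other and to the custom software, there are N
# custom-COTS interfaces plus N(N-1)/2 COTS-COTS interfaces, N(N+1)/2 total;
# with weak coupling (wrappers, open standards) only the N custom-COTS
# interfaces remain.
def strong_coupling_interfaces(n: int) -> int:
    return n + n * (n - 1) // 2  # = n(n+1)/2

def weak_coupling_interfaces(n: int) -> int:
    return n

for n in range(1, 5):
    print(f"N={n}: strong={strong_coupling_interfaces(n)}, "
          f"weak={weak_coupling_interfaces(n)}")
```

For N = 3 this reproduces the slide's 6 vs. 3; the quadratic term in the strong-coupling count is what drives effort toward N².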
COTS Top-10 Empirical Hypotheses – II
6. Less than half of CBS development effort comes from glue code – but glue code costs 3× more per instruction
7. Non-development CBS costs are significant – worth addressing, but not overaddressing
8. COTS assessment and tailoring costs vary by COTS product class
9. Personnel factors are the leading CBS effort drivers – different skill factors are needed for CBS and custom software
10. CBS project failure rates are higher than for custom software – but CBS benefit rates are higher also
6. COCOTS Effort Distribution: 20 Projects
[Bar chart: mean % of total COTS effort by activity (±1 SD) for assessment, tailoring, glue code, and system volatility; the standard deviations are large, with some −1 SD bounds falling below zero]
• Glue code generally less than half of total
• No standard CBS effort distribution profile
9. Personnel Factors Are Leading CBS Effort Drivers: Different Skill Factors Needed
Custom Development vs. CBS Development:
• Rqts./implementation assessment vs. rqts./COTS assessment
• Algorithms; data & control structures vs. COTS mismatches; connector structures
• Code reading and writing vs. COTS mismatch assessment, tailoring, glue-code development; coding avoidance
• Adaptation to rqts. changes vs. adaptation to rqts. and COTS changes
• White-box/black-box testing vs. black-box testing
• Rethink recruiting and evaluation criteria
10. CBS Projects Tend to Fail Hard
Major sources of CBS project failure:
• CBS skill mismatches
• CBS inexperience
• CBS optimism
• Weak life-cycle planning
• CBS product mismatches
• Hasty COTS assessment
• Changing vendor priorities
• New COTS market entries
These are major topics for COTS risk assessment.

CBS Challenges
• Process specifics – milestones; role of "requirements"; multi-COTS refresh frequency and process
• Life-cycle management – progress metrics; development/support roles
• Cost-effective COTS assessment and test aids – COTS mismatch detection and redressal
• Increasing CBS controllability – technology watch; wrappers; vendor win-win
• Better empirical data and organized experience
Trends (transition slide: repeats the Traditional → Future comparison from the opening)
Integrating Software and Systems Engineering
• The software "separation of concerns" legacy – erosion of underlying modernist philosophy; resulting project social structure
• Responsibility for system definition
• The CMMI software paradigm
The "Separation of Concerns" Legacy
"The notion of 'user' cannot be precisely defined, and therefore has no place in CS or SE." – Edsger Dijkstra, ICSE 4, 1979
"Analysis and allocation of the system requirements is not the responsibility of the SE group but is a prerequisite for their work." – Mark Paulk et al., SEI Software CMM v1.1, 1993
Cosmopolis: The Erosion of Modernist Philosophy
• Dominant since the 17th century
• Formal, reductionist – apply laws of the cosmos to the human polis
• Focus on written vs. oral; universal vs. particular; general vs. local; timeless vs. timely – one-size-fits-all solutions
• Strong influence on the focus of computer science
• Weak in dealing with human behavior, rapid change
Resulting Project Social Structure
[Cartoon: management at the head of the table with aero, electrical, manufacturing, comm, G&C, and payload engineering; software sits off to the side, wondering "when they'll give us our requirements"]
Responsibility for System Definition
• Every success-critical stakeholder has knowledge of factors that will thwart a project's success – software developers, users, customers, …
• It is irresponsible to underparticipate in system definition – Govt./TRW large system: 1-second response time
• It is irresponsible to overparticipate in system definition – Scientific American: focus on the part with a software solution

The CMMI Software Paradigm
• System and software engineering are integrated – software has a seat at the center table
• Requirements, architecture, and process are developed concurrently – along with prototypes and key capabilities
• Developments done by integrated teams – collaborative vs. adversarial process, based on shared vision and negotiated stakeholder concurrence
Systems Engineering Cost Model (COSYSMO)
Barry Boehm, Donald Reifer, and Ricardo Valerdi

Development Approach
• Use the USC seven-step model-building process
• Keep things simple – start with the Inception phase, when only large-grain information tends to be available; use EIA 632 to bound the tasks that are part of the effort; focus on software-intensive subsystems first (see next chart for explanation), then extend to total systems
• Develop the model in three steps – Step 1: limit effort to software-intensive systems; Step 2: tackle the remaining phases of the life cycle; Step 3: extend the work to a broad range of systems
• Build on previous work
The Form of the Model
Size drivers (# requirements, # interfaces, # scenarios, # algorithms, plus a volatility factor) → COSYSMO → effort and duration
• COCOMO II-based model, with calibration; WBS guided by EIA 632
• Effort multipliers: application factors (8), team factors (8), schedule driver
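Since the slide gives only the model's form (a COCOMO II-style equation: effort as a calibrated function of weighted size drivers times a product of effort multipliers), here is a hedged sketch. The constants A and E, the driver weights, and the multiplier count and values are placeholders of my own, not COSYSMO's eventual calibration, which the later slides leave to data collection.

```python
# Sketch of a COCOMO II-form effort model:
#   Effort = A * (weighted size)^E * product(effort multipliers)
# All constants below are illustrative placeholders, not COSYSMO calibrations.
from math import prod

def cosysmo_style_effort(size_drivers, effort_multipliers,
                         a=0.5, e=1.1, weights=None):
    """size_drivers: dict of counts (# requirements, interfaces, ...);
    effort_multipliers: rating values, nominal = 1.0."""
    weights = weights or {"requirements": 1.0, "interfaces": 3.0,
                          "scenarios": 5.0, "algorithms": 4.0}
    size = sum(weights[k] * n for k, n in size_drivers.items())
    return a * size ** e * prod(effort_multipliers)

# All-nominal multipliers: 8 application + 8 team factors + schedule driver.
nominal = cosysmo_style_effort(
    {"requirements": 100, "interfaces": 10, "scenarios": 5, "algorithms": 5},
    effort_multipliers=[1.0] * 17)
print(f"estimated effort: {nominal:.1f} person-months")
```

The structure is the point here: size drivers sum into one weighted size measure, the exponent makes effort grow superlinearly with size, and each off-nominal factor rating scales the estimate multiplicatively.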
Step 2: Size the Effort Using Function Point Analogy
• To compute size, develop estimates for the drivers and then sum the columns
• Weightings of the factors' relative work use an average requirement as the basis
Basic Effort Multipliers – I
• Application factors (8): requirements understanding; architecture understanding; level of service requirements (criticality, difficulty); legacy transition complexity; COTS assessment complexity; platform difficulty; required business process reengineering; technology maturity
Detailed rating conventions will be developed for these multipliers.

Basic Effort Multipliers – II
• Team factors (8): number and diversity of stakeholder communities; stakeholder team cohesion; personnel capability; personnel experience/continuity; process maturity; multi-site coordination; formality of deliverables; tool support
Detailed rating conventions will be developed for these multipliers.
Next Steps
• Data collection – gather actual effort and duration data from actual systems engineering efforts
• Model calibration – calibrate the model first using expert opinion, then refine as actual data are collected
• Model validation – validate that the model generates answers reflecting a mix of expert opinion and real-world data (moving toward the actual data as they become available)
• Model publication – publish the model periodically as new versions become available (at least biannually)
• Model extension and refinement – extend the model to increase its scope toward the right (later life-cycle phases); refine the model to increase its predictive accuracy
Trends (transition slide: repeats the Traditional → Future comparison from the opening)
Value-Based Software Engineering (VBSE)
• Essential to integration of software and systems engineering – software increasingly determines system content; software architecture determines system adaptability
• 7 key elements of VBSE
• Implementing VBSE – CeBASE Method and method patterns; Experience Factory, GQM, and MBASE elements; means of achieving CMMI Level 5 benefits
7 Key Elements of VBSE
1. Benefits Realization Analysis
2. Stakeholders' Value Proposition Elicitation and Reconciliation
3. Business Case Analysis
4. Continuous Risk and Opportunity Management
5. Concurrent System and Software Engineering
6. Value-Based Monitoring and Control
7. Change as Opportunity

The Information Paradox (Thorp)
• No correlation between companies' IT investments and their market performance
• Field of Dreams – build the (field; software) and the great (players; profits) will come
• Need to integrate software and systems initiatives
DMR/BRA* Results Chain
ASSUMPTION: Order-to-delivery time is an important buying criterion
INITIATIVE: Implement a new order entry system → Contribution: reduce time to process order → OUTCOME: reduced order-processing cycle (intermediate outcome) → Contribution: reduce time to deliver product → OUTCOME: increased sales
* DMR Consulting Group's Benefits Realization Approach
The Model-Clash Spider Web: MasterNet
EasyWinWin OnLine Negotiation Steps
Red cells indicate lack of consensus. Oral discussion of the cell graph reveals unshared information, unnoticed assumptions, hidden issues, constraints, etc.
Example of Business Case Analysis
ROI = (Benefits − Costs) / Costs
[Chart: return on investment (starting at −1) vs. time for Option A, Option B, and Option B-Rapid]
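The chart's starting point follows directly from the formula: before any benefits accrue, ROI = (0 − Costs)/Costs = −1, and ROI then climbs as cumulative benefits grow. A quick check with invented figures:

```python
# ROI = (Benefits - Costs) / Costs, as on the slide. Before delivery, benefits
# are zero and costs are sunk, so ROI = -1 (the chart's starting point).
# The numbers below are illustrative only.
def roi(benefits: float, costs: float) -> float:
    return (benefits - costs) / costs

costs = 100.0
for benefits in (0.0, 100.0, 300.0):
    print(f"benefits={benefits:g} -> ROI={roi(benefits, costs):+.1f}")
```

This is also why the B-Rapid option can dominate in the chart: capturing benefits earlier moves the curve's climb forward in time even if the eventual ROI is similar.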
Architecture Choices
[Diagram: a spectrum from rigid through flexible to hyper-flexible architectures, allocating features F1-F12 across components; the hyper-flexible option routes everything through an intelligent router]
Architecture Flexibility Determination
Value-Based Software Engineering (VBSE) (transition slide: repeats the VBSE outline above)
Experience Factory Framework – III
[Diagram: an organizational shared vision and improvement strategy sets org. improvement goals (with goal-related questions and metrics), improvement strategies, goal-achievement models, and planning context; these drive initiative plans (with initiative-related questions and metrics) and initiative monitoring and control. Project experience feeds experience-base analysis (analyzed experience, updated models) and an experience base of models and data, which in turn support the project shared vision and strategy and project planning and control. Progress/plan/goal mismatches feed back into the shared vision.]
VBSE Experience Factory Example
• Shared vision: increased market share and profit via rapid development
• Improvement goal: reduce system development time 50%
• Improvement strategy: reduce all task durations 50%
• Pilot project: rqts., design, and code reduced 50%; test delayed – analysis: test preparation insufficient, not on the critical path
• Experience base: OK to increase effort on non-critical-path activities if it decreases time on critical-path activities
CeBASE Method Strategic Framework
– Applies to an organization's and projects' people, processes, and products
[Diagram, two linked control loops: the organization/portfolio level is guided by the Experience Factory and GQM, the project level by MBASE.
• Org-portfolio shared vision (org. value propositions reflecting stakeholder values; current situation w.r.t. them; improvement goals and priorities; global scope and results chain; value/business case models) → org. strategic plans (strategy elements; evaluation criteria/questions; improvement plans with progress metrics and experience base) → org. monitoring & control (monitor environment and update models; implement plans; evaluate progress w.r.t. goals and models; determine and apply corrective actions; update experience base)
• Project shared vision (project value propositions; current situation; improvement goals and priorities; project scope and results chain; value/business case models) → project plans (LCO/LCA package: ops concept, prototypes, rqts., architecture, life-cycle plan, rationale; IOC/transition/support package: design, increment plans, quality plans, T/S plans; evaluation criteria/questions; progress metrics) → project monitoring & control (as above, w.r.t. goals, models, and plans)
• Feedback: plan/goal mismatches (shortfalls, opportunities, risks) flow back up each loop; project experience and progress w.r.t. plans and goals flow into the organizational loop]
LCO: Life Cycle Objectives; LCA: Life Cycle Architecture; IOC: Initial Operational Capability; GQM: Goal-Model-Question-Metric paradigm; MBASE: Model-Based (System) Architecting and Software Engineering
CeBASE Method Coverage of CMMI – I
• Process Management – Organizational Process Focus: 100+; Organizational Process Definition: 100+; Organizational Training: 100; Organizational Process Performance: 100; Organizational Innovation and Deployment: 100+
• Project Management – Project Planning: 100; Project Monitoring and Control: 100+; Supplier Agreement Management: 50; Integrated Project Management: 100; Risk Management: 100; Integrated Teaming: 100; Quantitative Project Management: 70-

CeBASE Method Coverage of CMMI – II
• Engineering – Requirements Management: 100; Requirements Development: 100; Technical Solution: 60+; Product Integration: 70+; Verification: 70; Validation: 80+
• Support – Configuration Management: 70; Process and Product Quality Assurance: 70; Measurement and Analysis: 100; Decision Analysis and Resolution: 100; Organizational Environment for Integration: 80; Causal Analysis and Resolution: 100
CeBASE Method Simplified via Method Patterns
• Schedule as Independent Variable (SAIV) – also applies for cost, cost/schedule/quality
• Opportunity trees – cost, schedule, defect reduction
• Agile COCOMO II – analogy-based estimate adjuster
• COTS integration method patterns
• Process model decision table
The SAIV Process Model
1. Shared vision and expectations management
2. Feature prioritization
3. Schedule range estimation
4. Architecture and core capabilities determination
5. Incremental development
6. Change and progress monitoring and control
Shared Vision, Expectations Management, and Feature Prioritization
• Use stakeholder win-win approach
• Developer win condition: don't overrun the fixed 24-week schedule
• Clients' win conditions: 56 weeks' worth of features
• Win-win negotiation – Which features are most critical? COCOMO II: How many features can be built within a 24-week schedule?
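One way to read that negotiation step: given per-feature schedule estimates (which the talk would take from COCOMO II ranges; the figures and feature names below are invented for illustration), fit the highest-priority features into the fixed 24-week schedule.

```python
# Hedged sketch of schedule-constrained feature selection for SAIV: greedily
# take features in priority order while they fit the fixed schedule budget.
# Feature names, priorities, and week estimates are hypothetical.
def fit_features(features, budget_weeks=24.0):
    """features: (name, priority, estimated_weeks) tuples; returns the names
    selected within the schedule budget, plus the weeks used."""
    selected, used = [], 0.0
    for name, _priority, weeks in sorted(features, key=lambda f: -f[1]):
        if used + weeks <= budget_weeks:
            selected.append(name)
            used += weeks
    return selected, used

features = [("order entry", 10, 8.0), ("search", 9, 6.0),
            ("reporting", 5, 12.0), ("admin UI", 7, 8.0)]
chosen, used = fit_features(features)
print(f"fits in 24 weeks: {chosen} ({used:g} weeks)")
```

A real negotiation would iterate this with the stakeholders, since priorities and estimate ranges shift as win conditions are reconciled; the next slides add the core-capability constraint on what gets selected.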
COCOMO II Estimate Ranges
[Figure: COCOMO II optimistic-to-pessimistic estimate ranges]
Core Capability, Incremental Development, and Coping with Rapid Change
• Core capability is not just the top-priority features
  – Useful end-to-end capability
  – Architected for ease of adding, dropping marginal features
• Worst case: deliver core capability in 24 weeks, with some extra effort
• Most likely case: finish core capability in 16-20 weeks
  – Add next-priority features
• Cope with change by monitoring progress
  – Renegotiate plans as appropriate
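The "core capability first, then next-priority features" planning above can be sketched as a simple greedy pass: commit the architected core, then add lower-priority features only while the effort estimate still fits the fixed schedule. The feature names and effort figures below are invented for illustration.

```python
def saiv_plan(core_weeks, features, schedule_weeks=24):
    """Greedy SAIV increment plan: core capability first, then
    next-priority features while the fixed schedule still holds.
    `features` is a priority-ordered list of (name, est_weeks)."""
    plan, used = ["core capability"], core_weeks
    for name, weeks in features:
        if used + weeks <= schedule_weeks:
            plan.append(name)
            used += weeks
    return plan, used

# Hypothetical feature list, highest priority first.
features = [("search", 3), ("reporting", 4), ("export", 2), ("theming", 5)]
plan, used = saiv_plan(16, features)
print(plan, used)  # core + search + reporting fit: 23 of 24 weeks
```

Monitoring progress then amounts to re-running this selection as actuals replace estimates, which is how SAIV renegotiates scope instead of schedule.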
SAIV Experience I: USC Digital Library Projects
• Life Cycle Architecture package in fixed 12 weeks
  – Compatible operational concept, prototypes, requirements, architecture, plans, feasibility rationale
• Initial Operational Capability in 12 weeks
  – Including 2-week cold-turkey transition
• Successful on 24 of 26 projects
  – Failure 1: too-ambitious core capability (cover 3 image repositories at once)
  – Failure 2: team disbanded (graduation, summer job pressures)
RAD Opportunity Tree
• Eliminating Tasks
  – Business process reengineering
  – Reusing assets
  – Applications generation
  – Schedule as independent variable (SAIV)
• Reducing Time Per Task
  – Tools and automation
  – Work streamlining (80-20)
• Increasing Parallelism
  – Reducing Risks of Single-Point Failures: reducing failures, reducing their effects
  – Reducing Backtracking: early error elimination, process anchor points, improving process maturity
  – Activity Network Streamlining: collaboration technology; minimizing task dependencies; avoiding high fan-in, fan-out; reducing task variance; removing tasks from critical path
• Increasing Effective Workweek
  – 24x7 development; nightly builds, testing; weekend warriors
• Better People and Incentives
• Transition to Learning Organization
Trends
Traditional → Future
• Standalone systems → Everything connected (maybe)
• Mostly source code → Mostly COTS components
• Requirements-driven → Requirements are emergent
• Control over evolution → No control over COTS evolution
• Focus on software → Focus on systems and software
• Stable requirements → Rapid change
• Premium on cost → Premium on value, speed, quality
• Staffing workable → Scarcity of critical talent
Critical Success Factors for Software Careers
• Understanding and dealing with sources of software value
  – And associated stakeholders, application domains
• Software/systems architecting and engineering
  – Creating and/or integrating software components
  – Using risk to balance disciplined and agile processes
• Ability to continuously learn and adapt
Critical Success Factors for Software Careers - USC Software Engineering Certificate Courses
• Understanding and dealing with sources of software value
  – And associated stakeholders, application domains
  – CS 510: Software Management and Economics
• Software/systems architecting and engineering
  – Creating and/or integrating software components
  – Using risk to balance disciplined and agile processes
  – CS 577ab: Software Engineering Principles and Practice
  – CS 578: Software Architecture
• Ability to continuously learn and adapt
  – CS 592: Emerging Best Practices in Software Engineering
USC SWE Certificate Program and MS Programs Provide Key Software Talent Strategy Enablers
• Infusion of latest SWE knowledge and trends
• Tailorable framework of best practices
• Continuing education for existing staff
  – Including education on doing their own lifelong learning
• Package of career-enhancing perks for new recruits
  – Career reward for earning Certificate
  – Option to continue for MS degree
  – Many CS BA grads want both income and advanced degree
• Available via USC Distance Education Network
Conclusions
• World of future: complex adaptive systems
  – Distribution; mobility; rapid change
• Need adaptive vs. optimized processes
  – Don't be a process dinosaur
  – Or a change-resistant change agent
• Marketplace trends favor value/risk-based processes
  – Software a/the major source of product value
  – Software the primary enabler of system adaptability
• Individuals and organizations need proactive software talent strategies
  – Consider USC SW Engr. Certificate and MS programs
References
D. Ahern, A. Clouse, & R. Turner, CMMI Distilled, Addison Wesley, 2001.
C. Baldwin & K. Clark, Design Rules: The Power of Modularity, MIT Press, 1999.
B. Boehm, D. Port, A. Jain, & V. Basili, "Achieving CMMI Level 5 with MBASE and the CeBASE Method," CrossTalk, May 2002.
B. Boehm & K. Sullivan, "Software Economics: A Roadmap," The Future of Software Economics, A. Finkelstein (ed.), ACM Press, 2000.
R. Kaplan & D. Norton, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School Press, 1996.
D. Reifer, Making the Software Business Case, Addison Wesley, 2002.
K. Sullivan, Y. Cai, B. Hallen, and W. Griswold, "The Structure and Value of Modularity in Software Design," Proceedings, ESEC/FSE 2001, ACM Press, pp. 99-108.
J. Thorp and DMR, The Information Paradox, McGraw-Hill, 1998.
Software Engineering Certificate website: sunset.usc.edu/SWE-Cert
CeBASE website: www.cebase.org
CMMI website: www.sei.cmu.edu/cmmi
MBASE website: sunset.usc.edu/research/MBASE


