

  • Number of slides: 38

Software Engineering Tools Research on Only $10 a Day
William Griswold, University of California, San Diego
UW-MSR Workshop: Accelerating the Pace of Software Tools Research: Sharing Infrastructure
August 2001

Goals
1. How program analysis is used in software engineering, and how that impacts research
2. Issues for tool implementation, infrastructure
3. Infrastructure approaches and example uses
  – My lab's experiences, interviews with 5 others
  – Not every infrastructure out there, or IDE infrastructures
4. Lessons learned
5. Challenges and opportunities

Base Assumptions (Background)
• Software engineering is about coping with the complexity of software and its development
  – Scale, scope, arbitrariness of the real world
• Evaluation of SE tools is best done in settings that manifest these complexities
  – Experiment X involves a tool user with a need
  – Hard to bend real settings to your tool
• Mature infrastructure can put more issues within reach at lower cost
  – Complete and scalable tools, suitable for more settings

Role of Program Analysis in SE
Discover hidden or dispersed program properties, display them in a natural form, and assist in their change
• Behavioral: find/prevent bugs; find invariants
  – PREfix, Purify, HotPath, JInsight, DejaVu, TestTube
• Structural: find design anomalies, architecture
  – Lackwit, Womble, RM, Seesoft, RIGI, slicers
• Evolutionary: enhance, subset, restructure
  – Restructure, StarTool, WolfPack

Analysis Methods
• Dynamic
  – Trace analysis
  – Testing
• Static
  – Lexical (e.g., grep, diff)
  – Syntactic
  – Data-flow analysis, abstract interpretation
  – Constraint equation solving
  – Model checking; theorem proving
Issues are remarkably similar across methods
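
The lexical end of this spectrum is cheap enough to sketch in a few lines. The following grep-style scan flags calls to unbounded C string functions with a regular expression; the C snippet and the pattern are invented for illustration, and the method's imprecision (no parsing, no scoping) is exactly what the slide means by "lexical".

```python
import re

# Hypothetical C snippet to scan; both it and the pattern are illustrative only.
SOURCE = """
int main(void) {
    char *buf = malloc(64);
    strcpy(buf, input);   /* suspicious: unbounded copy */
    strncpy(dst, src, n);
    return 0;
}
"""

# Lexical analysis a la grep: flag calls to unbounded string functions.
# No parsing or symbol tables -- fast and language-tolerant, but imprecise.
PATTERN = re.compile(r'\b(strcpy|strcat|gets)\s*\(')

def lexical_scan(source):
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for m in PATTERN.finditer(line):
            hits.append((lineno, m.group(1)))
    return hits

print(lexical_scan(SOURCE))
```

Note that `strncpy` is correctly skipped only because of the `\b` word boundary; lexical tools live and die by such pattern details.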

Use in Iterative Analysis Cycle
1. Programmer identifies problem or task
   "Is this horrific hack the cause of my bug?"
2. Choose program source model and analysis
   "I think I'll do a slice with Sprite." [data-flow analysis]
3. Extract (and analyze) model
   [Programmer feeds code to slicer, chooses variable reference in code that has the wrong value]
4. Render model (and analysis)
   [Tool highlights reached source text]
5. Reason about results, plan course of action
   "Nope, that hack didn't get highlighted…"
Steps 2-4 may be done manually, with ad hoc automation, an interactive tool, or a batch tool. The user-tool "interface" is rich and dynamic.
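
Steps 2-4 of the cycle can be sketched with a toy backward slicer over straight-line code. Here a "program" is just a list of (target, used-variables) assignments, and we slice backwards from a variable whose value looks wrong; the program and variable names are hypothetical, and a real slicer like Sprite works over data-flow equations, not this list walk.

```python
# Minimal backward slice over straight-line assignments (no control flow).
def backward_slice(stmts, criterion):
    """Return indices of statements that may affect `criterion`."""
    needed = {criterion}
    in_slice = []
    for i in range(len(stmts) - 1, -1, -1):  # walk backwards
        target, uses = stmts[i]
        if target in needed:
            in_slice.append(i)
            needed.discard(target)   # this definition satisfies the demand...
            needed.update(uses)      # ...and creates demand for its inputs
    return sorted(in_slice)

program = [
    ("a", set()),          # 0: a = input()
    ("b", {"a"}),          # 1: b = a + 1
    ("c", set()),          # 2: c = 7        (the unrelated "horrific hack")
    ("d", {"b"}),          # 3: d = b * 2    (d has the wrong value)
]
print(backward_slice(program, "d"))
```

Statement 2 stays out of the slice, which is the "Nope, that hack didn't get highlighted" moment in the cycle above.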

Interactive, Graphical, Integrated

The Perfect Tool User
"Your tool will solve all sorts of problems. But it'll have to analyze my entire 1 MLOC program, which doesn't compile right now, and is written in 4 languages. I want the results as fast as compilation, with an intuitive graphical display linked back to the source and integrated into our IDE. I want to save the results, and have them automatically updated as I change the program. Oh, I use Windows, but some of my colleagues use Unix. It's OK if the tool misses stuff or returns lots of data, we can post-process. We just want a net win."
For our most recent tool, the first study involved a 500 KLOC Fortran/C app developed on SGIs.

Unique Infrastructure Challenges
• Wide-spectrum needs (e.g., GUI)
  – Provide function and/or outstanding interoperability
• Whole-program analysis versus interactivity
  – Demand, precompute, reuse [Harrold], modularize
• Source-to-source analysis and transformation
  – Analyze, present, modify as the programmer sees it
• Ill-defined task space and process structure
Saving grace is the programmer: intelligent, adaptive
  – Can interpret, interpolate, iterate; adjust process
  – Requires tool (and hence infrastructure) support

Infrastructure Spectrum
• Monolithic environment
  – Generative environment (Gandalf, Synthesizer Generator), programming language (Refine)
  – Reuse model: high-level language
    • Generator (compiler) or interpreter
• Component-based
  – Frameworks, toolkits (Ponder), IDE plug-in support
  – Reuse model: interface
    • Piecewise replacement and innovation
    • Subclassing (augmentation, specialization)

Monolithic Environments
• Refine: syntactic analysis & transformation environment [Reasoning]
  – Powerful C-like functional language with lazy evaluation
  – AST datatype with grammar and pattern language
  – Aggregate ADTs, GUI, persistence, C/Cobol targets
  – WolfPack C function splitter took 11 KLOC (1/2 reps, 5% LISP), no pointer analysis; slow [Lakhotia]
• CodeSurfer: C program slicing tool [GrammaTech]
  – Rich GUI, PDG in repository, Scheme "back door"
  – ~500 LOC to prototype globals model [Lakhotia]
  – Not really meant for extension, code transformation
Great for prototyping and one-shot tasks

Components Overview
1. Standalone components
   – Idea: "ad hoc" composition, lots of choices
   – Example component: EDG front-ends
   – Example tools: static: Alloy [Jackson]; dynamic: Daikon [Ernst]
2. Component architectures
   – Idea: components must conform to design rules
   – Examples: data arch: Aristotle [Harrold]; control arch: Icaria [Atkinson]
3. Analyses (tools) as components
   – Idea: infrastructure-independent tool design
   – Example: StarTool [Hayes]

Standalone Components
• Component generators
  – Yacc, lex, JavaCC, JLex, JJTree, ANTLR, …
  – Little help for scoping, type checking (symbol tables)
• Representation packages for various languages
  – Icaria (C AST), GNAT (Ada), EDG (*), …
• GUI systems galore, mostly generic
  – WFC, Visual Basic, Tcl/Tk, Swing; dot, vcg
• Databases and persistence frameworks
• Few OTS analyses available
  – Model checkers and SAT constraint solvers

Edison Design Group Front-Ends
• Front-ends for C/C++, Fortran, Java (new)
  – Lexing, parsing, elaborated AST; generates C
• Thorough static error checking
  – Know what you get, but not robust to errors
• APIs best for translation to IR
  – Simple things can be hard; white-box reuse
• Precise textual mappings
  – C/C++ AST is post-processed, but columns correct
• C++ front-end can't handle some features

Ad Hoc Component Example, Static Analysis: Alloy Tool [Jackson]
• Property checker for the Alloy OO spec language
  – Takes spec and property, finds counterexamples
  – Uses SAT constraint solvers for the analysis back-end
  – Spec language designed explicitly for analyzability
• Front-end
  – Wrote own lexer (JLex), parser (CUP), AST
  – Eased because of analyzability
• Translation to SAT formula "IR"
  – Aggregate is mapped to a collection of scalars
  – Several stages of formula rewriting
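
The SAT back-end that Alloy plugs into can be sketched as a toy DPLL solver over CNF clauses. The clauses below are invented for illustration (a negated "no error" property whose model is a counterexample); a real checker translates the spec through several rewriting stages and uses industrial solvers, not this sketch.

```python
# Toy DPLL SAT solver: clauses are lists of nonzero ints, where n means
# "variable n is true" and -n means "variable n is false".
def dpll(clauses, assignment=None):
    """Return a satisfying assignment {var: bool} or None if unsatisfiable."""
    assignment = dict(assignment or {})
    changed = True
    while changed:                               # unit propagation to fixpoint
        changed = False
        for clause in clauses:
            unassigned, satisfied = [], False
            for lit in clause:
                var, want = abs(lit), lit > 0
                if var in assignment:
                    satisfied |= assignment[var] == want
                else:
                    unassigned.append(lit)
            if satisfied:
                continue
            if not unassigned:
                return None                      # empty clause: conflict
            if len(unassigned) == 1:             # unit clause forces a value
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    free = {abs(l) for c in clauses for l in c} - set(assignment)
    if not free:
        return assignment
    var = min(free)
    for guess in (True, False):                  # branch on a free variable
        result = dpll(clauses, {**assignment, var: guess})
        if result is not None:
            return result
    return None

# Hypothetical "spec": 1 = acquired, 2 = released, 3 = error.
# We negate the property "no error occurs" and ask for a model.
clauses = [[1], [-1, 2, 3], [-2], [3]]
print(dpll(clauses))
```

A returned model corresponds to the "scalars mapped back to aggregates" step on the next slide: the solver's raw assignment has to be translated back into spec-level terms to be readable.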

Alloy, cont'd
• Uses 3 SAT solvers, each with strengths
  – National challenge resulted in a standard SAT "IR"
  – Allowed a declarative format for hooking in a solver
• Java Swing for general GUI, dot for graphs
  – Scalars are mapped back to aggregates, etc., and results are reported as counterexamples
  – Currently don't map results directly back to the program
    • Expects to use variables as a way to map to source
• About 20 KLOC of new code to build Alloy

Alloy: Lessons
• Designing for analyzability a major benefit
  – Eases all aspects of front-end and translation to SAT
  – Adding 3 kinds of polymorphism added 20 KLOC!
• SAT solver National Challenge a boon
  – Several good solver components
  – Standard IR eased integration
• SAT solver start/stop protocol the hardest
  – Primitive form of computational steering
  – Subprocess control, capturing/interpreting output

Ad Hoc Component Example, Dynamic Analysis: Daikon Tool [Ernst]
• Program invariant detector for C and Java
  – Instruments the program at procedure entries/exits, runs it
  – Infers variable value patterns at program points
• Programs with test suites have been invaluable
  – Class programs with grading suites
  – Siemens/Rothermel C programs with test suites
• Front-end the least interesting, 1/2 the work
  – Parser, symbol table, AST/IR manipulation, unparser
    • Get any two: manipulation toys with the symbol table
  – Symbol table the hardest, unparser the easiest
  – Lots of choices, a few false starts
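
The invariant-detection idea itself fits in a short sketch: record variable values at a procedure's exit over a test suite, then keep only the candidate invariant templates that hold in every sample. The procedure, trace format, and templates here are hypothetical stand-ins for Daikon's real instrumentation and template library.

```python
# The procedure under analysis (illustrative).
def absolute(x):
    return -x if x < 0 else x

samples = []
def traced_absolute(x):
    r = absolute(x)
    samples.append({"x": x, "return": r})   # exit-point trace record
    return r

for x in [-5, -1, 0, 3, 8]:                 # a small "test suite"
    traced_absolute(x)

# Candidate invariants: keep only those that hold in every observed sample.
candidates = {
    "return >= 0":      lambda s: s["return"] >= 0,
    "return == x":      lambda s: s["return"] == s["x"],
    "return == abs(x)": lambda s: s["return"] == abs(s["x"]),
}
invariants = [name for name, check in candidates.items()
              if all(check(s) for s in samples)]
print(invariants)
```

The dependence on a good test suite is visible even here: without negative inputs, the false invariant `return == x` would survive, which is why the slide calls programs with test suites "invaluable".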

Daikon: Choosing a Java Front-End
• Byte-code instrumenters (JOIE, Bobby)
  – Flexible and precise insertion points
  – Loss of names complicates mapping to source
  – Byte codes generated are compiler dependent
  – Debugging voluminous instrumentation is hard
• Source-level instrumentation
  – Java lacks "insertability", e.g., no comma operator
  – Invalidates symbol table, etc.
  – Chose Jikes, an open-source compiler (got 2 of 4)
    • Added AST manipulation good enough to unparse
• New byte-code instrumenters; EDG for Java
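
The entry/exit probes whose insertion is so awkward in Java are trivial in a language with good "insertability". As a sketch of the concept only (not Daikon's mechanism), a Python decorator wraps a function with entry and exit probes without touching its source; the function and trace format are hypothetical.

```python
import functools

trace = []

def instrument(fn):
    """Wrap fn with entry/exit probes that append to a shared trace."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        trace.append(("enter", fn.__name__, args))    # entry probe
        result = fn(*args, **kwargs)
        trace.append(("exit", fn.__name__, result))   # exit probe
        return result
    return wrapper

@instrument
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

gcd(12, 18)
print(trace)
```

The slide's trade-off is visible by contrast: here names survive and mapping to source is free, which is exactly what byte-code instrumentation loses.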

Ad Hoc Components: Critique
• Freedom is great, but integration is weak
  – Data bloat: replicated and unused functionality
  – Minimal support for mapping between reps
    • Data: implementation of precise mappings
    • Control: synchronize to compute only what's needed
• Scalability a huge issue; data-flow information for a 1 MLOC program, highly optimized:
    500 MB AST   500 MB BB/CFG   500 MB bit-vectors
  Space translates to time by stressing the memory hierarchy
Component-based architecture to the rescue
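
The bit-vectors cited above are the compact representation that makes whole-program data-flow feasible at all: one bit per definition, with set union and difference as machine-word operations. A sketch using Python integers as bit-vectors, with invented gen/kill sets for three statements:

```python
# Classic data-flow transfer over bit-vectors: OUT = GEN | (IN - KILL).
def transfer(in_set, gen, kill):
    return gen | (in_set & ~kill)

# Three statements; bit i set means "definition i reaches this point".
blocks = [
    {"gen": 0b001, "kill": 0b010},   # s0 defines d0, kills d1
    {"gen": 0b010, "kill": 0b001},   # s1 defines d1, kills d0
    {"gen": 0b100, "kill": 0b000},   # s2 defines d2
]

reaching = 0
for b in blocks:
    reaching = transfer(reaching, b["gen"], b["kill"])
print(bin(reaching))
```

Even so, at 1 MLOC the vectors alone run to hundreds of megabytes, which is why the following slides turn to architectures that share, demand, and discard these representations.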

Data-based Component Architecture: Aristotle Infrastructure [Harrold]
• Data-flow analysis and testing infrastructure for C
• Database is the universal integration mechanism
  – Provides uniform, loose integration
• Separately compiled tools can write and read the DB
  – Added ProLangs framework [Ryder] at modest cost
• Scalability benefits
  – Big file system overcomes the space problem
  – Persistence mitigates the time problem
• Performance still an issue, hasn't been the focus
  – Loose control integration produces reps in toto
  – DB implemented with flat files
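
The data-architecture idea can be sketched concretely: one tool writes facts to a database, and an independently written tool that knows only the schema reads them back. The SQLite schema, tool names, and call-graph facts below are hypothetical, not Aristotle's actual (flat-file) format.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE calls (caller TEXT, callee TEXT)")

def extractor_tool(conn):
    """Producer: writes call-graph facts computed from (imaginary) source."""
    facts = [("main", "parse"), ("main", "report"), ("parse", "lex")]
    conn.executemany("INSERT INTO calls VALUES (?, ?)", facts)
    conn.commit()

def reachability_tool(conn, root):
    """Consumer: a separately built tool that only knows the schema."""
    reached, frontier = set(), [root]
    while frontier:
        f = frontier.pop()
        if f in reached:
            continue
        reached.add(f)
        rows = conn.execute("SELECT callee FROM calls WHERE caller = ?", (f,))
        frontier.extend(r[0] for r in rows)
    return reached

extractor_tool(db)
print(sorted(reachability_tool(db, "main")))
```

The loose coupling is the point: neither tool links against the other, which is also why such integration "produces reps in toto" rather than computing only what is demanded.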

Control-based Component Architecture: Icaria Infrastructure [Atkinson]
• Scalable data-flow (and syntactic) infrastructure for C
  – Hypothesis: need optimized components, control integration, and user control for good performance
• Space- and time-tuned data structures
  – AST, BBs, CFG; bit-vectors semi-sparse and factored
  – Memory allocation pools, free "block"
  – Steensgaard pointer analysis
    • Also piggybacked with the CFG build pass for locality
• Event-based demand-driven architecture
  – Compute all on demand; even discard/recompute
  – Persistently store "undemandable" information
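
The Steensgaard analysis mentioned above gets its near-linear speed from union-find: each assignment unifies equivalence classes of abstract locations instead of propagating sets. This is a heavily simplified sketch of that idea, with hypothetical statements; a real implementation handles fields, calls, and much more.

```python
parent = {}
points_to = {}     # representative -> representative of what it points to

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def union(x, y):
    rx, ry = find(x), find(y)
    if rx == ry:
        return rx
    parent[ry] = rx
    # Unify what both sides point to (the source of Steensgaard's imprecision).
    px, py = points_to.pop(rx, None), points_to.pop(ry, None)
    if px and py:
        points_to[rx] = union(px, py)
    elif px or py:
        points_to[rx] = px or py
    return rx

def assign_addr(p, v):      # models: p = &v
    r = find(p)
    if r in points_to:
        union(points_to[r], v)
    else:
        points_to[r] = find(v)

def assign(p, q):           # models: p = q  (unify their pointees)
    rp, rq = find(p), find(q)
    if rp in points_to and rq in points_to:
        union(points_to[rp], points_to[rq])
    elif rq in points_to:
        points_to[rp] = points_to[rq]
    elif rp in points_to:
        points_to[rq] = points_to[rp]

assign_addr("p", "a")   # p = &a
assign_addr("q", "b")   # q = &b
assign("p", "q")        # p = q  => a and b fall into one equivalence class
print(find("a") == find("b"))
```

Merging `a` and `b` into one class is precisely the coarseness that the later "Java Retarget" slide proposes to filter with language types.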

Event-based Demand Architecture
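
The demand architecture's core contract can be sketched in a few lines: representations are computed only when demanded, may be discarded under memory pressure, and are transparently recomputed on the next demand. The "AST" and "CFG" producers below are hypothetical; real Icaria adds events, tuned representations, and persistence for information that cannot be re-demanded.

```python
class DemandCache:
    def __init__(self):
        self.producers = {}
        self.cache = {}
        self.builds = {}          # how often each rep was (re)built

    def register(self, name, producer):
        self.producers[name] = producer

    def demand(self, name):
        if name not in self.cache:
            self.cache[name] = self.producers[name](self)
            self.builds[name] = self.builds.get(name, 0) + 1
        return self.cache[name]

    def discard(self, name):      # e.g., under memory pressure
        self.cache.pop(name, None)

infra = DemandCache()
infra.register("ast", lambda i: ["fn main", "fn helper"])
infra.register("cfg", lambda i: {f: ["entry", "exit"] for f in i.demand("ast")})

infra.demand("cfg")               # transitively pulls in the AST on demand
infra.discard("ast")              # throw the AST away...
infra.demand("ast")               # ...and recompute it when demanded again
print(infra.builds)
```

Note how the CFG producer demands the AST through the same interface: this is the control integration that lets the architecture decide what exists in memory at any moment.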

Icaria: User Control
• Declarative precision management
  – Context sensitivity (call-stack modelling)
  – Pointer analysis (e.g., distinguish struct fields)
• Iteration strategies
  – With tuned bit-vector stealing and reclamation
• Declarative programmer input
  – ANSI/non-ANSI typing, memory allocators, …
  – Adds precision, sometimes speed-up
• Termination control
  – Suspend/resume buttons, procedural hook
  – Because analysis is a means to an end (a task)

Icaria: The Price of Performance
• Must conform to architectural rules to get the performance benefits
  – E.g., can't demand/discard/redemand your AST unless it meets the architecture's protocol
• May cascade into a lot of front-end work
  – Can buy in modularly, incrementally
    • "Demand" in batch
    • Don't discard
• Reconsider the demand strategy for each new analysis
  – I.e., when to discard, what to save persistently

Icaria: Scenario – Java Retarget
• Use existing AST or derive off of Ponder's
• Rethink pointer analysis
  – Calls through function pointers mean a bad call graph
  – Intersect (filter) Steensgaard with language types?
    • Modular; variant works for C
• Rethink 3-address code and call graph
  – Small methods (many, deep calling contexts)
  – "Allocation contexts" instead of calling contexts?
    • Context sensitivity module would support this
• Existing analyses not likely reusable OTS

Icaria: Applications
• Icaria supports Cawk, the Sprite slicer, StarTool
  – Cawk generated by the Ponder syntactic infrastructure [Atkinson]
  – Slicer is 6 KLOC: 50% GUI, 20% equations
    • Discards AST, CFG
    • Persistently stores the backwards call graph
• Scalability
  – Simple Cawk scripts run at 500 KLOC/minute
  – Sliced gcc (200 KLOC) on a 200 MHz/200 MB UltraSparc
    • 1 hour --> 1/2 minute by tuning function pointers
    • Dependent on program and slice
    • Other parameters less dramatic

Analysis Components: Designing for Reusable Analyses
• Approaches assume that the tool is coded "within" the infrastructure
  – Complicates migration to a new infrastructure
• Genoa [Devanbu] and sharlit [Tjiang] are "monolithic" language/generator solutions
• How to design a reusable "analysis component"?
  – A client of infrastructure, so incomplete
• Addressed for the StarTool reengineering tool
  – Only front-end infra and target language, not the Tcl/Tk GUI

StarTool: Main View
• "Referenced-by" relation for an entity in a clustered hierarchy
• Views are navigable, customizable, and annotatable

StarTool: Adapter Approach [Hayes]
• Interpose an adapter [GHJV] to increase separation of analysis and infrastructure
• What adapter interface allows the best retargets?
• Low-level: a few small, simple operations
  – E.g., generic tree traversal ops
• [Diagram: Star ↔ Adapter ↔ Infra]
• More responsibility in Star relieves all future adapters
• Did 3 retargets, including to the GNAT Ada AST [Dewar]
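
The adapter approach can be sketched as follows: the tool programs only against a tiny abstract interface, and each infrastructure supplies an adapter, so nothing in the tool assumes an AST. The interface, the toy "infrastructure" (a cross-reference dictionary), and all names are invented for illustration, not StarTool's actual API.

```python
class Adapter:
    """What the tool needs from any infrastructure: just these operations."""
    def all_entities(self):
        raise NotImplementedError
    def references_to(self, name):
        raise NotImplementedError

class DictInfraAdapter(Adapter):
    """Adapter over a trivially simple 'infrastructure': a cross-ref dict."""
    def __init__(self, xref):
        self.xref = xref            # name -> list of referencing entities
    def all_entities(self):
        return sorted(self.xref)
    def references_to(self, name):
        return sorted(self.xref.get(name, []))

def referenced_by_view(adapter, name):
    """The 'tool': renders a referenced-by view via the adapter only."""
    return f"{name} <- " + ", ".join(adapter.references_to(name))

infra = DictInfraAdapter({"lex": ["parse"], "parse": ["main"], "main": []})
print(referenced_by_view(infra, "lex"))
```

Retargeting to a new infrastructure means writing only a new `Adapter` subclass; the view code never changes, which is the 500-2000 LOC cost quoted on the next slide.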

StarTool: Lessons Learned
• Retargets range from 500 to 2000 LOC
  – Precise mappings to source, language complexity
• Best interface assumes nothing about the infra
  – In the extreme, don't assume there's an AST at all
  – Means providing operations that make StarTool's implementation easy (despite that there's just one)
    • E.g., iterator for "all references similar to this"
• Metaquery operations resolve feature specifics
  – Gives the adapter lots of design room, can choose best
  – More, bigger ops; mitigated by a template class [GHJV]
  – Got a multi-language tool using 2 levels of adapters

Observations (Conclusion!)
• Infrastructures for prototyping or scalability
  – A 1000 LOC effort won't scale up, yet
  – Absolute effort is lessening, scale increasing
  – Boring stuff is still 1/2+ of the effort
• Trend towards components
  – Span of requirements, performance, IDE integration
  – Many components are programmable, however
• Interactive whole-program analysis stresses modularity (reuse) of infrastructure
  – Much reuse is white-box

Observations, cont'd
• Retargeting is expensive, defies infrastructure
  – Symbol table (scoping, typing), and base analyses
  – Language proliferation and evolution continue, slowly
  – Tool retargets lag language definition, maybe by a lot
• Bigger components are better [Sullivan]
  – Many small components complicate integration
  – Mitigates the symbol-table issue
  – Reuse still hard, sometimes white-box
• Language analyzability has a big impact
  – Front-end, mappings, precise and fast analysis
  – Designers need to consider the consequences

Open Issues
• Effective infrastructures for "deep" analysis
  – In principle not hard
  – In practice, performance/precision tradeoffs can require significant rewrites for a "small" change
• Out of the private toolbox, beyond white-box reuse
  – Fragile modularity, complexity, documentation
• Robustness
  – Useful for incomplete or evolving systems
  – Complicates the analysis; results harder to interpret
• Modification: beyond instrumentation and translation

Emerging Challenges
• Integration into IDEs
  – GUI dependence, native AST; reuse across IDEs
• What is a program? What is the program?
  – Multi-language programs
  – Federated applications, client-server apps
  – Trend is towards writing component glue
    • Less source code (maybe), but huge apps
    • How to treat vast, numerous packages? Sans source?
    • Current tools provide/require stub code
• Multi-threading is entering the mainstream

Opportunities
• Faster computers, better OSes and compilers
  – Basic Dells can take two processors, and it works
  – Compatibility packages: Cygwin, VMware, Exceed
• Emergence of Java, etc., for tool construction
  – Better type systems, garbage collection
  – API model, persistence, GUI, multi-threading
  – (Maybe better analyzability, too)
• Infrastructure
  – Modular analyses [Ryder], incremental update
  – Visualization toolkits (e.g., SGI's MineSet)
• Open source: share, improve; benchmarks

URLs
Refine        www.reasoning.com
CodeSurfer    www.grammatech.com
EDG           www.edg.com
Alloy         sdg.lcs.mit.edu/alloy
Daikon        cs.washington.edu/homes/mernst/daikon
Aristotle     www.cc.gatech.edu/aristotle
ProLangs      www.prolangs.rutgers.edu
Icaria, etc.  www.cs.ucsd.edu/~wgg/Software

Thanks!
Michael Ernst: Dynamic analysis
Daniel Jackson: Alloy
Mik Kersten: IDE integration
Mary Jean Harrold: Aristotle
Arun Lakhotia: Refine and CodeSurfer
Nicholas Mitchell: Compiler infras, EDG
John Stasko: Visualization
Michelle Strout: Compiler infrastructures
Kevin Sullivan: Mediators and components