- Number of slides: 51
LCG Applications Area Status
Torre Wenaus, BNL/CERN
LCG Applications Area Manager
http://cern.ch/lcg/peb/applications
LHCC Meeting, January 27, 2003
Outline
- Applications area scope and organization
- Architecture
- Personnel and planning (little on planning, since I talked to most of you about it in Nov)
- Status of applications area projects: SPI, POOL, SEAL, PI, Math libraries, Simulation
- Relationship to other LCG activity areas
- Conclusion
The LHC Computing Grid Project Structure
[Organization diagram: Project Overview Board; Project Leader; Project Execution Board (PEB) with Grid Projects and Project Work Packages (WPs); Software and Computing Committee (SC2) with RTAGs, providing requirements, work plan, and monitoring]
LCG Areas of Work
Fabric (Computing System):
- Physics Data Management
- Fabric Management
- Physics Data Storage
- LAN Management
- Wide-area Networking
- Security
- Internet Services
Grid Technology:
- Grid middleware
- Standard application services layer
- Inter-project coherence/compatibility
Physics Applications Software:
- Application Software Infrastructure: libraries, tools
- Object persistency, data management tools
- Common Frameworks: Simulation, Analysis, ...
- Adaptation of Physics Applications to Grid environment
- Grid tools, Portals
Grid Deployment:
- Data Challenges
- Grid Operations
- Network Planning
- Regional Centre Coordination
- Security & access policy
Applications Area Organization
[Organization diagram: Apps Area Leader (overall management, coordination, architecture); Project Leaders; Work Package (WP) Leaders; Architects Forum]
Direct technical collaboration between experiment participants, IT, EP, ROOT, and LCG personnel.
Focus on Experiment Need
Project structured and managed to ensure a focus on real experiment needs:
- SC2/RTAG process to identify, define (need-driven requirements), initiate, and monitor common project activities in a way guided by the experiments themselves
- Architects Forum to involve experiment architects in day-to-day project management and execution
- Open information flow and decision making
- Direct participation of experiment developers in the projects
- Tight iterative feedback loop to gather user feedback from frequent releases
- Early deployment and evaluation of LCG software in experiment contexts
- Success defined by experiment adoption and production deployment
Applications Area Projects (bold: recent developments, last 3 months)
- Software Process and Infrastructure (SPI) (operating, A. Aimar): librarian, QA, testing, developer tools, documentation, training, ...
- Persistency Framework (POOL) (operating, D. Duellmann): POOL hybrid ROOT/relational data store
- Mathematical libraries (operating, F. James): math and statistics libraries; GSL etc. as NAG C replacement; group in India will work on this (workplan in development)
- Core Tools and Services (SEAL) (operating, P. Mato): foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid-enabled services
- Physics Interfaces (PI) (launched, V. Innocente): interfaces and tools by which physicists directly use the software; interactive (distributed) analysis, visualization, grid portals
- Simulation (launch planning in progress): Geant4, FLUKA, simulation framework, geometry model, ...
- Generator Services (launch as part of simulation): generator librarian, support, tool development
Project Relationships
[Diagram: LCG Applications Area projects (POOL, PI, SEAL, Math Libraries, SPI, ...) connecting the LHC experiments with other LCG projects in other areas]
Candidate RTAG timeline from March
[Timeline chart. Blue: RTAG/activity launched or (light blue) imminent]
LCG Applications Area Timeline Highlights
[Timeline by quarter, 2002-2005:
- LCG launch week
- POOL V0.1 internal release
- Architectural blueprint complete
- POOL first production release
- First Global Grid Service (LCG-1) available
- LCG-1 reliability and performance targets
- Distributed production using grid services
- LCG TDR
- Distributed end-user interactive analysis
- Full Persistency Framework
- "50% prototype" (LCG-3)]
Architecture Blueprint
- RTAG established in June: experiment architects, other experts
- After 14 meetings, much email...
- A 36-page final report, accepted by SC2 on October 11
Report contents:
- Executive summary
- Response of the RTAG to the mandate
- Blueprint scope
- Requirements
- Use of ROOT
- Blueprint architecture design precepts: high-level architectural issues, approaches
- Blueprint architectural elements: specific architectural elements, suggested patterns, examples
- Domain decomposition
- Schedule and resources
- Recommendations
http://lcgapp.cern.ch/project/blueprint/
Principal architecture requirements
- Long lifetime: support technology evolution
- C++ today; support language evolution
- Seamless distributed operation
- Usability off-network
- Component modularity, public interfaces
- Interchangeability of implementations
- Integration into coherent framework and experiment software
- Design for the end-user's convenience more than the developer's
- Re-use existing implementations
- Software quality at least as good as any LHC experiment
- Meet performance, quality requirements of trigger/DAQ software
- Platforms: Linux/gcc, Linux/icc, Solaris, Windows
Component Model
- Communication via public interfaces
- APIs targeted to end-users, embedding frameworks, internal plug-ins
- Plug-ins: logical modules encapsulating a service that can be loaded, activated, and unloaded at run time
- Granularity driven by:
  - component replacement criteria
  - dependency minimization
  - development team organization
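The plug-in idea above can be illustrated in a few lines. This is a minimal Python sketch of the concept, not the actual SEAL plug-in manager (which is C++); names like `PluginRegistry` are invented for the example:

```python
# Illustrative sketch of a run-time plug-in registry (hypothetical names).
# Components communicate only through a declared public interface and can
# be swapped without touching client code.

class PluginRegistry:
    """Maps service names to factories; plug-ins are activated on demand."""
    def __init__(self):
        self._factories = {}

    def register(self, name, factory):
        self._factories[name] = factory

    def create(self, name):
        return self._factories[name]()   # load/activate at run time

class RootPersistency:
    def save(self, obj):
        return f"saved {obj!r} via ROOT I/O"

class RdbmsPersistency:
    def save(self, obj):
        return f"saved {obj!r} via RDBMS"

registry = PluginRegistry()
# Two interchangeable implementations behind one public interface:
registry.register("persistency/root", RootPersistency)
registry.register("persistency/rdbms", RdbmsPersistency)

svc = registry.create("persistency/root")
print(svc.save("event42"))
```

Replacing one implementation by the other changes only the registration, which is exactly the replacement criterion that drives plug-in granularity.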
Software Structure
[Layered diagram:
- Applications
- Frameworks: Reconstruction, Simulation, Visualization, other frameworks
- Basic Framework (implementation-neutral services), alongside Optional Libraries (ROOT, Qt, ...) and Grid middleware
- Foundation Libraries: STL, ROOT libs, CLHEP, Boost, ...]
Distributed Operation
- Architecture should enable but not require the use of distributed resources via the Grid
- Configuration and control of Grid-based operation via dedicated services
- Making use of optional grid middleware services at the foundation level of the software structure
  - Insulating higher-level software from the middleware
  - Supporting replaceability
- Apart from these services, Grid-based operation should be largely transparent
- Services should gracefully adapt to 'unplugged' environments
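The 'unplugged' behaviour can be sketched as a service that uses a grid replica catalogue when one is plugged in and falls back to a local lookup otherwise. This is a hypothetical Python illustration of the design principle, not POOL's actual API:

```python
# Hypothetical sketch: higher-level code sees one catalogue interface
# whether or not grid middleware is available ('unplugged' operation).

class LocalCatalog:
    """Stand-in local backend used when no grid catalogue is plugged in."""
    def __init__(self):
        self._entries = {}
    def add(self, lfn, pfn):
        self._entries[lfn] = pfn
    def lookup(self, lfn):
        return self._entries.get(lfn)

class FileCatalogService:
    def __init__(self, grid_catalog=None):
        self._grid = grid_catalog      # optional middleware plug-in
        self._local = LocalCatalog()
    def _backend(self):
        return self._grid or self._local   # graceful adaptation
    def register(self, lfn, pfn):
        self._backend().add(lfn, pfn)
    def lookup(self, lfn):
        return self._backend().lookup(lfn)

svc = FileCatalogService(grid_catalog=None)   # unplugged environment
svc.register("lfn:run1.events", "/data/run1.root")
print(svc.lookup("lfn:run1.events"))
```

The insulation layer means the middleware implementation can be replaced, or absent, without any change to client code.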
Managing Objects
- Object Dictionary
  - To query a class about its internal structure
  - Essential for persistency, data browsing, etc.
  - The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (timescale ~1 yr?)
- Object Whiteboard
  - Uniform access to application-defined transient objects, including in the ROOT environment
- Object definition based on C++ header files
  - Used by ATLAS and CMS
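What "query a class about its internal structure" buys you can be shown with a toy example. The real LCG/ROOT dictionary reflects C++ classes; Python's built-in introspection stands in here purely as an illustration of the idea:

```python
# Toy illustration of an object dictionary: given only a class, recover
# its member names and types at run time, as persistency and browsing need.

import dataclasses

@dataclasses.dataclass
class Track:
    px: float
    py: float
    pz: float
    charge: int

def describe(cls):
    """Return (member name, type name) pairs for a described class."""
    return [(f.name, f.type.__name__) for f in dataclasses.fields(cls)]

print(describe(Track))
# With this information a persistency layer can stream any described
# object generically, and a browser can display members without
# compiled-in knowledge of the class.
```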
Other Architectural Elements
- Python-based Component Bus
  - Plug-in integration of components providing a wide variety of functionality
  - Component interfaces to the bus derived from their C++ interfaces
- Scripting Languages
  - Python and CINT (ROOT) to both be available
  - Access to objects via the object whiteboard in these environments
- Interface to the Grid
  - Must support convenient, efficient configuration of computing elements with all needed components
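A flavour of the component bus: heterogeneous components are plugged onto one bus object and combined interactively from a script. The class and component names below are invented for illustration; in the real design the bus would wrap C++ component interfaces:

```python
# Sketch of the Python component-bus idea (hypothetical names): components
# with different functionality are plugged in and wired together in a script.

class Bus:
    def __init__(self):
        self._components = {}
    def plug(self, name, component):
        self._components[name] = component
    def __getattr__(self, name):
        # Called only for attributes not found normally, i.e. components.
        try:
            return self._components[name]
        except KeyError:
            raise AttributeError(name)

class Histogrammer:
    def __init__(self):
        self.entries = []
    def fill(self, x):
        self.entries.append(x)

class Fitter:
    def mean(self, values):
        return sum(values) / len(values)

bus = Bus()
bus.plug("hist", Histogrammer())
bus.plug("fit", Fitter())

for x in (1.0, 2.0, 3.0):
    bus.hist.fill(x)
print(bus.fit.mean(bus.hist.entries))   # components combined via the bus
```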
Domain Decomposition
[Diagram of software domains. Products mentioned are examples, not a comprehensive list. Grey: not in common project scope (also event processing framework, TDAQ)]
Use of ROOT in LCG Software
- Among the LHC experiments:
  - ALICE has based its applications directly on ROOT
  - The 3 others base their applications on components with implementation-independent interfaces, and look for software that can be encapsulated into these components
- All experiments agree that ROOT is an important element of LHC software
  - Leverage existing software effectively and do not unnecessarily reinvent wheels
- Therefore the blueprint establishes a user/provider relationship between the LCG applications area and ROOT
  - LCG AA software will make use of ROOT as an external product
- Draws on a great ROOT strength: users are listened to very carefully!
  - So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL
Blueprint RTAG Outcomes
SC2 decided in October:
- Blueprint is accepted
- RTAG recommendations accepted to:
  - Start common project on core tools and services
  - Start common project on physics interfaces
Applications Area Personnel Status
- 18 LCG apps hires in place and working; +1 last week, +1 in Feb
  - Manpower ramp is on target (expected to reach 20-23)
  - Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US, India, and Russia
- ~12 FTEs from IT (DB and API groups)
- ~12 FTEs from CERN EP/SFT, experiments
- CERN established a new software group as the EP home of the LCG applications area (EP/SFT)
  - Led by John Harvey
  - Taking shape well; localized in B.32
  - Soon to be augmented by IT/API staff working in the applications area; they will move to EP/SFT
  - Will improve cohesion, sense of project participation, technical management effectiveness
Applications Area Personnel Summary
Details at http://lcgapp.cern.ch/project/mgmt/AppManpower.xls

                                                 People   FTEs
  Total LCG hires working                          19     19.0
    Working directly for apps area projects        15     15.0
    ROOT                                            2      2.0
    Grid integration work with experiments          2      2.0
  Contributions from IT/DB                          3      2.1
  Contributions from IT/API                        11      9.7
  EP/SFT + experiments total                       22     11.9
    Working directly for apps area projects        19      9.9
    Architecture, management                        5      2.0
  Total directly working on apps area projects     48     36.7
  Overall applications area total                  55     42.7
Current Personnel Distribution
[Chart: current personnel distribution, in FTE-years]
Personnel Resources - Required and Available
[Chart: estimate of required effort in FTEs (0-60), by quarter ending Sep-02 through Mar-05, stacked by project: SPI, Math libraries, Physicist interface, Generator services, Simulation, SEAL, POOL; "Now" marker; blue = available effort]
Future estimate based on 20 LCG, 12 IT, 28 EP + experiments.
Schedule and Resource Tracking (example)
[Screenshot]
MS Project Integration - POOL Milestones
[Screenshot]
Apps area planning materials
- Planning page linked from applications area page
- Applications area plan spreadsheet: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  - High-level schedule, personnel resource requirements
- Applications area plan document: overall project plan
  - http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  - Incomplete draft
- Personnel spreadsheet
  - http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
  - Currently active/foreseen apps area personnel, activities
- WBS, milestones, assigned personnel resources
  - http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
L1 Milestones (1)
[Chart]
L1 Milestones (2)
[Chart]
Software Process and Infrastructure (SPI) Project
Alberto Aimar, CERN IT/API
[Diagram: software development phases - specifications, analysis and design, development, release planning, testing, deployment and installation, debugging, ..., support]
A general service for software projects:
a. Provide general services needed by each project
- CVS repository, web site, software library
- Mailing lists, bug reports, collaborative facilities
b. Provide components specific to the software phases
- Tools, templates, training, examples, etc.
SPI Services
- CVS repositories
  - One repository per project
  - Standard repository structure and #include conventions
  - Will eventually move to the IT CVS service when it is proven
- AFS delivery area, Software Library
  - /afs/cern.ch/sw/lcg
  - Installations of LCG-developed and external software
  - LCG Software Library 'toolsmith' started in December
- Build servers
  - Machines with needed Linux, Solaris configurations
- Project portal (similar to SourceForge): http://lcgappdev.cern.ch
  - Very nice new system using Savannah (savannah.gnu.org)
  - Used by POOL, SEAL, SPI, CMS, ...
  - Bug tracking, project news, FAQ, mailing lists, download area, CVS access, ...
SPI Components
- Code documentation, browsing: Doxygen, LXR, ViewCVS
- Testing framework: CppUnit, Oval
- Memory leaks: Valgrind
- Automatic builds: NICOS (ATLAS)
- Coding and design guidelines: RuleChecker
- Standard CVS organization
- Configuration management: SCRAM
  - Acceptance of the SCRAM decision shows 'the system works'
- Project workbook
All components and services should be in place mid-Feb. Missing at this point:
- Nightly build system (being integrated with POOL for testing)
- Software library (prototype being set up now)
POOL Project
Dirk Duellmann, CERN IT/DB
- Pool of persistent objects for LHC: targeted at event data but not excluding other data
- Hybrid technology approach
  - Object-level data storage using a file-based object store (ROOT)
  - RDBMS for metadata: file catalogs, object collections, etc. (MySQL)
- Leverages existing ROOT I/O technology and adds value
  - Transparent cross-file and cross-technology object navigation
  - RDBMS integration
  - Integration with Grid technology (e.g. EDG/Globus replica catalog)
  - Network- and grid-decoupled working modes
- Follows and exemplifies the LCG blueprint approach
  - Components with well-defined responsibilities
  - Communicating via public component interfaces
  - Implementation technology neutral
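The hybrid approach can be sketched conceptually: bulk objects go to file-based storage while a relational-style catalogue maps logical names to files, enabling navigation across files. This is an invented Python miniature (pickle and sqlite3 standing in for ROOT I/O and MySQL), not the POOL API:

```python
# Conceptual sketch of a hybrid store (hypothetical names): file-based
# object storage plus a relational catalogue of logical-to-physical names.

import os
import pickle
import sqlite3
import tempfile

class HybridStore:
    def __init__(self, directory):
        self.directory = directory
        # Stand-in for the RDBMS file catalogue (MySQL in POOL):
        self.catalog = sqlite3.connect(":memory:")
        self.catalog.execute("CREATE TABLE files(lfn TEXT PRIMARY KEY, pfn TEXT)")

    def write(self, lfn, obj):
        pfn = os.path.join(self.directory, lfn + ".dat")
        with open(pfn, "wb") as f:          # stand-in for ROOT object I/O
            pickle.dump(obj, f)
        self.catalog.execute("INSERT INTO files VALUES (?, ?)", (lfn, pfn))

    def read(self, lfn):
        # Catalogue lookup makes navigation independent of physical location.
        (pfn,) = self.catalog.execute(
            "SELECT pfn FROM files WHERE lfn = ?", (lfn,)).fetchone()
        with open(pfn, "rb") as f:
            return pickle.load(f)

store = HybridStore(tempfile.mkdtemp())
store.write("event1", {"px": 1.2, "py": -0.4})
print(store.read("event1"))
```

Because clients address objects by logical name, the catalogue backend (local database, grid replica catalogue) can be swapped without touching the object store.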
POOL Release Schedule
- End September: V0.1 (released Oct 2)
  - All core components for navigation exist and interoperate
  - Assumes ROOT object (TObject) on read and write
- End October: V0.2 (released Nov 15)
  - First collection implementation
- End November: V0.3 (released Dec 18)
  - First public release; meets the principal apps area milestone defined in the March LCG launch: hybrid prototype by year end
  - EDG/Globus FileCatalog integrated
  - Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  - Event metadata annotation and query
- End February: V0.4
  - Persistency of more complex objects, e.g. with STL containers
  - Support object descriptions from C++ header files (gcc-xml)
- June 2003: production release
Dictionary: Reflection / Population / Conversion
[Diagram of the dictionary reflection, population, and conversion flow. New in POOL 0.3; in progress]
POOL Milestones
[Chart]
Core Libraries and Services (SEAL) Project
Pere Mato, CERN EP/SFT/LHCb
- Launched in Oct
  - 6-member (~3 FTE) team initially; build up to 8 FTEs by the summer
  - Growth mainly from experiment contributions
- Scope:
  - Foundation, utility libraries
  - Basic framework services
  - Object dictionary (taken over from POOL)
  - Grid-enabled services
- Purpose:
  - Provide a coherent and complete set of core classes and services in conformance with the overall architectural vision (Blueprint RTAG)
  - Facilitate the integration of LCG and non-LCG software to build coherent applications
  - Avoid duplication of software; use community standards
- Areas of immediate relevance to POOL given priority
- Users are software developers in other projects and the experiments
SEAL Work Packages
- Foundation and utility libraries
  - Boost, CLHEP, experiment code, complementary in-house development
  - Participation in CLHEP workshop this week
- Component model and plug-in manager
  - The core expression in code of the component architecture described in the blueprint; mainly in-house development
- LCG object dictionary
  - C++ class reflection, dictionary population from C++ headers, ROOT gateways, Python binding
- Basic framework services
  - Object whiteboard, message reporting, component configuration, 'event' management
- Scripting services
  - Python bindings for LCG services, ROOT
- Grid services
  - Common interface to middleware
- Education and documentation
  - Assisting experiments with integration
SEAL Schedule
- Jan 2003: initial work plan delivered to SC2 on Jan 10th and approved
  - Including contents of version v1 alpha
- March 2003: v1 alpha
  - Essential elements with sufficient functionality for the other existing LCG projects (POOL, ...)
  - Frequent internal releases (monthly?)
- June 2003: v1 beta
  - Complete list of the currently proposed elements implemented with sufficient functionality to be adopted by experiments
- June 2003: grid-enabled services defined
  - The SEAL services which must be grid-enabled are defined and their implementation prioritized
Estimation of Needed Resources for SEAL

  WBS  Name                                  FTE (available/required)
  1    Foundation and Utility Libraries      0.5 / 1.0
  2    Component Model and Plug-in Manager   0.5 / 0.5
  3    LCG Object Dictionary                 0.5 / 1.5
  4    Basic Framework Services              0.5 / 1.0
  5    Scripting Services                    0.5 / 1.0
  6    Grid Services                         0.0 / 1.5
  7    Education and Documentation           0.5 / 1.5
       Total                                 3.0 / 8.0

Current resources should be sufficient for v1 alpha (March).
Math Libraries Project
Fred James, CERN EP/SFT
- Many different libraries in use
  - General purpose (NAG-C, GSL, ...)
  - HEP-specific (CLHEP, ZOOM, ROOT)
  - Modern libraries dealing with matrices and vectors (Blitz++, Boost, ...)
- Financial considerations: can we replace NAG with open source?
  - RTAG: yes
- Do comparative evaluation of NAG-C and GSL
  - Collect information on what is used/needed
  - Evaluation of functionality and performance
- HEP-specific libraries expected to evolve to meet future needs
Math library recommendations & status
- Establish a support group to provide advice and info about existing libraries, and identify and develop new functionality
  - Group established in October
- Determine which libraries and modules are in use by experiments
  - Little response to experiment requests for info; the group in India is scanning experiment code to determine usage
- A detailed study should be undertaken to assess needed functionality and how to provide it, particularly via free libraries such as GSL
  - The group in India is undertaking this study
  - Initial plan of work developed with Fred James in December
  - Targets completion of the first round of GSL enhancements for April, based on a priority needs assessment
- Work plan needs to be presented to the SC2 soon
  - Scheduled tomorrow, but will be late
Physicist Interface (PI) Project
Vincenzo Innocente, CERN EP/SFT/CMS
- Interfaces and tools by which physicists will directly use the software
- Planned scope:
  - Interactive environment: physicist's desktop
  - Analysis tools
  - Visualization
  - Distributed analysis, grid portals
- Currently developing plans and trying to clarify the grid area
  - Completed survey of experiments on their needs and interests
  - Talking also to grid projects, other apps area projects, ROOT, ...
- Initial workplan proposal will be presented to PEB, SC2 this week
- Will plan inception workshops for identified work areas
  - Identify contributors, partners, users; deliverables and schedules; personnel assignments
Proposed Near-Term Work Items for PI
- Abstract interface to analysis services
  - GEANT4 and some experiments do not wish to depend on a specific implementation
  - One implementation must be ROOT
  - Request for a coherent LCG analysis tool-set
- Interactive analysis environment
  - Access to experiment objects (event data, algorithms, etc.)
  - Access to high-level POOL services (collections, metadata)
  - Transparent use of LCG and grid services, with the possibility to 'expose' them for debugging and detailed monitoring
  - GUI (point & click) and scripting interface
- Interactive event and detector visualization
  - Integrated with the analysis environment
  - Offering a large palette of 2D and 3D rendering
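The "abstract interface to analysis services" item can be sketched as an abstract histogram interface with interchangeable implementations. The names below (`Histogram1D`, `SimpleHistogram`) are hypothetical; the point is only that user code depends on the interface, never on a particular implementation such as ROOT:

```python
# Sketch of an abstract analysis-service interface (invented names).
# User code written against Histogram1D works with any implementation.

from abc import ABC, abstractmethod

class Histogram1D(ABC):
    @abstractmethod
    def fill(self, x, weight=1.0): ...
    @abstractmethod
    def entries(self): ...

class SimpleHistogram(Histogram1D):
    """Lightweight in-memory implementation; ROOT would provide another."""
    def __init__(self, nbins, lo, hi):
        self.bins = [0.0] * nbins
        self.lo, self.hi, self.n = lo, hi, 0
    def fill(self, x, weight=1.0):
        if self.lo <= x < self.hi:
            i = int((x - self.lo) / (self.hi - self.lo) * len(self.bins))
            self.bins[i] += weight
        self.n += 1          # count every fill attempt
    def entries(self):
        return self.n

def analyze(histo: Histogram1D, data):
    for x in data:
        histo.fill(x)        # user code never names the implementation
    return histo.entries()

print(analyze(SimpleHistogram(10, 0.0, 1.0), [0.1, 0.5, 0.9, 1.5]))
```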
Simulation Project
- Mandated by SC2 in Dec to initiate the simulation project, following RTAG recommendations
- Project being organized now
  - Discussions with experiments, G4, FLUKA, ROOT, ... on organization and roles in progress
  - Probable that I will lead the overall project during a startup period, working with a slate of strong subproject leaders
- Scope (tentative subprojects):
  - Generic simulation framework
    - Multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average 'GEANE' tracking, utilities
    - ALICE virtual MC as a starting point, if it meets requirements
  - Geant4 development and integration
  - FLUKA integration
  - Physics validation: simulation test and benchmark suite
  - Fast (shower) parameterisation
  - Generator services
Collaborations
- The LCG apps area needs to collaborate well with projects broader than the LHC
  - See that LHC requirements are met, and provide help and support targeted at LHC priorities, while being good collaborators (e.g. not assimilators)
  - e.g. CLHEP: discussions at the workshop this week
  - e.g. Geant4 in the context of the simulation project
- We also welcome collaborators on LCG projects
  - e.g. (renewed?) interest from BaBar in POOL
- We also depend on the other LCG activity areas
  - Grid Technology: ensuring that the needed middleware is/will be there, tested, selected, and of production grade; AA distributed software will be robust and usable only if the grid middleware it uses is so
  - Grid Deployment: AA software deployment
  - Fabrics: CERN infrastructure, AA development and test facilities
A note on my own time
- Nearing the one-year mark of doing the apps area leader job with 75% of my time, resident at CERN
  - The other 25% I am an ATLAS/BNL person in the US (~1 week/mo)
- Working well, and sustainable (the LCG job, I mean)
  - From my perspective at least!
- I will be lightening my ATLAS/US load, which will be welcome
  - Expect to hand over the US ATLAS Software Manager job to a highly capable person in a few days or weeks
  - Expect to hand over the ATLAS Planning Officer job shortly to someone else
- Remaining formal US responsibility is BNL Physics Applications Software Group Leader, for which I have a strong deputy (David Adams)
Concluding Remarks
- Expected area scope essentially covered by projects now defined
- Manpower is in quite good shape
- Buy-in by the experiments is good
  - Direct participation in leadership, development, prompt testing and evaluation, RTAGs
- EP/SFT group taking shape well as a CERN hub
- Participants remote from CERN are contributing, but it isn't always easy
- POOL and SPI are delivering, and the other projects are ramping up
  - Persistency prototype released in 2002, as targeted in March
- Important benchmark to come: delivering production-capable POOL, scheduled for June