

  • Number of slides: 30

Tony Doyle, a.doyle@physics.gla.ac.uk
GridPP – From Prototype To Production, HEPiX Meeting, Edinburgh, 25 May 2004
Tony Doyle - University of Glasgow

Outline
• GridPP Project Introduction
• UK Context
• Components:
  A. Management
  B. Middleware
  C. Applications
  D. Tier-2
  E. Tier-1
  F. Tier-0
• Challenges:
  – Middleware Validation
  – Improving Efficiency
  – Meeting Experiment Requirements... Via The Grid?
  – Work Group Computing
  – Events... To Files... To Events
  – Software Distribution
  – Distributed Analysis
• Historical Perspective
• What is the Grid Anyway?
• Is GridPP a Grid?
• Summary

GridPP – A UK Computing Grid for Particle Physics
GridPP: 19 UK Universities, CCLRC (RAL & Daresbury) and CERN
Funded by the Particle Physics and Astronomy Research Council (PPARC)
• GridPP1 – Sept. 2001–2004, £17m, "From Web to Grid"
• GridPP2 – Sept. 2004–2007, £16(+1)m, "From Prototype to Production"

GridPP in Context
Context diagram (not to scale!): Experiments, Institutes and Tier-2 Centres; GridPP with the Tier-1/A; UK Core e-Science Programme (Apps Int, Apps Dev, Grid Support Centre; Middleware, Security, Networking); CERN (LCG, EGEE)

GridPP1 Components
• UK Tier-1/A Regional Centre: Hardware and Manpower
• Grid Application Development: LHC and US Experiments + Lattice QCD
• Management, Travel etc.
• European DataGrid (EDG): Middleware Development
• LHC Computing Grid Project (LCG): Applications, Fabrics, Technology and Deployment

GridPP2 Components
A. Management, Travel, Operations
B. Middleware, Security, Network Development
C. Grid Application Development: LHC and US Experiments + Lattice QCD + Phenomenology
D. Tier-2 Deployment: 4 Regional Centres – M/S/N support and System Management
E. Tier-1/A Deployment: Hardware, System Management, Experiment Support
F. LHC Computing Grid Project (LCG Phase 2) [review]

A. GridPP Management
Organisation chart: Collaboration Board; GridPP1 (GridPP2) Project Leader; Project Management Board (Production Manager, Dissemination Officer); Technical (Deployment) Board; Experiments (User) Board; Project Map and Risk Register; liaison with GGF, LCG, EDG (EGEE) and UK e-Science

A. Management Structure In LCG Context
Diagram: CB and PMB over a Deployment Board (Tier 1/Tier 2, Testbeds, Rollout; service specification & provision towards LCG and EGEE) and a User Board (Requirements, Application Development, User feedback from the Experiments and LCG ARDA); middleware areas: Metadata, Storage, Workload, Network, Security, Info. Mon.

B. Middleware, Security and Network Development
• Grid Data Management
• Network Monitoring
• Security Middleware
• Networking
• Configuration Management
• Storage Interfaces
• Information Services
• Security
M/S/N builds upon UK strengths as part of international development

C. Application Development
BaBar, AliEn → ARDA, GANGA, SAMGrid, Lattice QCD, CMS

D. UK Tier-2 Centres
• NorthGrid ****: Daresbury, Lancaster, Liverpool, Manchester, Sheffield
• SouthGrid *: Birmingham, Bristol, Cambridge, Oxford, RAL PPD, Warwick
• ScotGrid *: Durham, Edinburgh, Glasgow
• LondonGrid ***: Brunel, Imperial, QMUL, RHUL, UCL
Current UK Status: 10 sites via LCG

D. The UK Testbed: Hidden Sector

E. The UK Tier-1/A Centre
• High quality data services
• National and International Role
• UK focus for International Grid development
April 2004: 700 dual-CPU nodes, 80 TB disk, 60 TB tape (capacity 1 PB)
Grid Operations Centre; experiments: CMS, LHCb, BaBar, ATLAS

Real Time Grid Monitoring – LCG2, 24 May 2004

E. Grid Operations
• Grid Operations Centre – Core Operational Tasks:
  – Monitor infrastructure, components and services
  – Troubleshooting
  – Verification of new sites joining Grid
  – Acceptance tests of new middleware releases
  – Verify suppliers are meeting SLA
  – Publishing use figures and accounts
• Grid Support Centre – Core Support Tasks:
  – Running UK Certificate Authority
  – Performance tuning and optimisation
  – Grid information services
  – Monitoring services
  – Resource brokering
  – Allocation and scheduling services
  – Replica data catalogues
  – Authorisation services
  – Accounting services

F. Tier 0 and LCG: Foundation Programme
• Aim: build upon Phase 1
• Ensure development programmes are linked
• Project management: GridPP and LCG, shared expertise
• LCG establishes the global computing infrastructure
• Allows all participating physicists to exploit LHC data
• Earmarked UK funding to be reviewed in Autumn 2004
Required Foundation: LCG Fabric, Technology and Deployment

The Challenges Ahead I: Implementing the Validation Process
Release pipeline diagram:
• Build system: WPs add unit-tested code to the repository; nightly build & auto tests; individual WP tests; tagged packages
• Development Testbed (~15 CPU): Integration Team runs overall release tests and fixes problems; a tagged release is selected for certification
• Certification Testbed (~40 CPU): Grid certification by the Test Group (test frameworks, test support, test policies, test documentation, test platforms/compilers); application certification with apps representatives; certified candidate releases
• Production / Application Testbed (~1000 CPU, 24x7): certified public release selected for deployment and use by apps; users feed problem reports back into the process

The Challenges Ahead II: Improving Grid “Efficiency”

The Challenges Ahead III: Meeting Experiment Requirements (UK)
Total requirement in international context; Q2 2004 LCG resources

The Challenges Ahead IV: Using (Anticipated) Grid Resources
Dynamic Grid optimisation over the JANET network
• 2004: ~7,000 1 GHz CPUs, ~400 TB disk (note ×2 scale change)
• 2007: ~30,000 1 GHz CPUs, ~2,200 TB disk

The Challenges Ahead V: Work Group Computing

The Challenges Ahead VI: Events... to Files... to Events
• VOMS-enhanced Grid certificates to access databases via metadata
• Non-Trivial...
Data model: each event (Event 1, Event 2, Event 3) has RAW, ESD, AOD and TAG representations held in data files distributed across Tier-0 (International), Tier-1 (National), Tier-2 (Regional) and Tier-3 (Local); an “Interesting Events List” maps selected events back to their files
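The events-to-files mapping above can be sketched as a toy metadata lookup. Everything in this sketch (the `EventCatalogue` class, the per-format tier assignments, the file naming) is invented for illustration; it is not a GridPP, VOMS or LCG API:

```python
# Toy sketch: resolving an "interesting events list" to the files that
# hold a given event representation, with the tier where they live.
# Tier assignments here are illustrative only.
TIER_OF = {"RAW": "Tier-0", "ESD": "Tier-1", "AOD": "Tier-2", "TAG": "Tier-3"}

class EventCatalogue:
    def __init__(self):
        self._files = {}  # event id -> {format: filename}

    def register(self, event_id, fmt, filename):
        self._files.setdefault(event_id, {})[fmt] = filename

    def files_for(self, interesting_events, fmt):
        """Map selected events to (filename, tier) for one representation."""
        tier = TIER_OF[fmt]
        return {ev: (self._files[ev][fmt], tier)
                for ev in interesting_events
                if fmt in self._files.get(ev, {})}

cat = EventCatalogue()
for ev in (1, 2, 3):
    for fmt in TIER_OF:
        cat.register(ev, fmt, f"run007_{fmt.lower()}_{ev:04d}.dat")

# Select events 1 and 3 from the "interesting events list" and find
# their AOD files (held, in this sketch, at a regional Tier-2 centre).
print(cat.files_for([1, 3], "AOD"))
```

The point of the slide is that this lookup is non-trivial in practice: the catalogue itself is a distributed database, reached with VOMS-enhanced Grid credentials rather than an in-memory dictionary.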

The Challenges Ahead VII: Software Distribution
• ATLAS Data Challenge (DC2) this year to validate the world-wide computing model
• Packaging, distribution and installation – scale: one release build takes 10 hours and produces 2.5 GB of files
• Complexity: 500 packages, MLoC, 100s of developers and 1000s of users
  – ATLAS collaboration is widely distributed: 140 institutes, all wanting to use the software
  – needs ‘push-button’ easy installation...
Data-flow diagram – Step 1 (Monte Carlo): Physics Models → Monte Carlo Truth Data → Detector Simulation → MC Raw Data → Reconstruction → MC Event Summary Data and MC Event Tags. Step 2 (Real Data): Trigger System → Data Acquisition → Level 3 trigger → Raw Data (with Run Conditions, Trigger Tags and Calibration Data) → Reconstruction → Event Summary Data (ESD) and Event Tags

The Challenges Ahead VIII: LCG/ARDA Distributed Analysis
Complex workflow…
1. AliEn (ALICE Grid) provided a pre-Grid implementation [Perl scripts]
2. ARDA provides a framework for PP application middleware

Historical Perspective
• “I wrote in 1990 a program called ‘WorldWideWeb’, a point and click hypertext editor which ran on the ‘NeXT’ machine. This, together with the first Web server, I released to the High Energy Physics community at first, and to the hypertext and NeXT communities in the summer of 1991.” – Tim Berners-Lee
• “The first three years were a phase of persuasion, aided by my colleague and first convert Robert Cailliau, to get the Web adopted…”
• “We needed servers to provide incentive and examples, and all over the world inspired people put up all kinds of things…”
• “Between the summers of 1991 and 1994, the load on the first Web server (‘info.cern.ch’) rose steadily by a factor of 10 every year…”

What is The Grid Anyway? From a Particle Physics Perspective
The Grid is:
• not hype, but surrounded by it
• a working prototype running on testbed(s)…
• about seamless discovery of PC resources around the world
• using evolving standards for interoperation
• the basis for particle physics computing in the 21st Century
• not (yet) as transparent as end-users want it to be

What is “The Grid” Anyway? Is GridPP a Grid?
http://www-fp.mcs.anl.gov/~foster/Articles/WhatIsTheGrid.pdf
1. Coordinates resources that are not subject to centralized control – YES. This is why development and maintenance of a UK-EU-US testbed is important.
2. …using standard, open, general-purpose protocols and interfaces – YES… Globus/Condor-G/EDG meet this requirement. Common experiment application layers are also important here.
3. …to deliver nontrivial qualities of service – NO(T YET)… Experiments define whether this is true; currently only ~100,000 jobs submitted via the testbed c.f. internal component tests of up to 10,000 jobs per day. Next step: LCG-2 deployment outcome this year.

GridPP Summary: Theory and Experiment – From Web to Grid
• UK GridPP started 1/9/01
• EU DataGrid: first middleware ~1/9/01
• Development requires a testbed with feedback – an “Operational Grid”
• Fit into UK e-Science structures
• Experience in distributed computing essential to build and exploit the Grid
• Scale in UK? 0.5 PBytes and 2,000 distributed CPUs – GridPP in Sept 2004
• Grid jobs are being submitted now… the user feedback loop is important… all experiments have immediate requirements
• Current experiment production: “The Grid” is a small component
• Non-technical issues: recognising context, building upon expertise, defining roles, sharing resources
• Major deployment activity is LCG – we contribute significantly to LCG and our success depends critically on LCG
• A “Production Grid” will be difficult to realise: GridPP2 planning underway as part of LCG/EGEE
• Many Challenges Ahead…

GridPP Summary: From Prototype to Production
• 2001: BaBarGrid, BaBar, CDF, ATLAS, LHCb, ALICE, CMS; CERN Computer Centre, RAL Computer Centre, 19 UK Institutes – separate experiments, resources, multiple accounts
• 2004: EGEE, SAMGrid, D0, GANGA, EDG, ARDA, LCG; CERN Prototype Tier-0 Centre, UK Prototype Tier-1/A Centre, 4 UK Prototype Tier-2 Centres – prototype Grids
• 2007: LCG; CERN Tier-0 Centre, UK Tier-1/A Centre, 4 UK Tier-2 Centres – ‘One’ Production Grid