
GridPP Oversight Committee, 15 May 2002. Tony Doyle - University of Glasgow

Document Mapping
- Exec Summary: PMB-02-EXEC
- Goals: PMB-01-VISION
- Metrics for success: PMB-02-EXEC
- Project Elements: Gantt Charts, PMB-05-LCG, TB-01-Q5-Report, TB-02-UKRollout, PMB-06-TierAstatus, PMB-04-Resources
- Risks/Dependencies (and mechanisms): PMB-03-STATUS, PMB-07-INSTRUMENTS
- Summary: PMB-02-EXEC

Outline
- The Vision Thing…
- Grid: 1. Scale, 2. Integration, 3. Dissemination, 4. LHC Analyses, 5. Other Analyses, 6. DataGrid, 7. LCG, 8. Interoperability, 9. Infrastructure, 10. Finances
- Summary

GridPP Documents

GridPP Vision: From Web to Grid - Building the next IT Revolution
Premise: The next IT revolution will be the Grid. The Grid is a practical solution to the data-intensive problems that must be overcome if the computing needs of many scientific communities and industry are to be fulfilled over the next decade.
Aim: The GridPP Collaboration aims to develop and deploy the largest-scale science Grid in the UK for use by the worldwide particle physics community.
Many challenges… a shared distributed infrastructure for all experiments.

GridPP Objectives
1. SCALE: GridPP will deliver the Grid software (middleware) and hardware infrastructure to enable the testing of a prototype of the Grid for the LHC of significant scale.
2. INTEGRATION: The GridPP project is designed to integrate with the existing Particle Physics programme within the UK, thus enabling early deployment and full testing of Grid technology and efficient use of limited resources.
3. DISSEMINATION: The project will disseminate the GridPP deliverables in the multidisciplinary e-science environment and will seek to build collaborations with emerging non-PPARC Grid activities both nationally and internationally.
4. UK PHYSICS ANALYSES (LHC): The main aim is to provide a computing environment for the UK Particle Physics Community capable of meeting the challenges posed by the unprecedented data requirements of the LHC experiments.
5. UK PHYSICS ANALYSES (OTHER): The process of creating and testing the computing environment for the LHC will naturally provide for the needs of the current generation of highly data-intensive Particle Physics experiments: these will provide a live test environment for GridPP research and development.
6. DATAGRID: Grid technology is the framework used to develop this capability: key components will be developed as part of the EU DataGrid project and elsewhere.
7. LCG: The collaboration builds on the strong computing traditions of the UK at CERN. The CERN working groups will make a major contribution to the LCG research and development programme.
8. INTEROPERABILITY: The proposal is also integrated with developments from elsewhere in order to ensure the development of a common set of principles, protocols and standards that can support a wide range of applications.
9. INFRASTRUCTURE: Provision is made for facilities at CERN (Tier-0), RAL (Tier-1) and use of up to four Regional Centres (Tier-2).
10. OTHER FUNDING: These centres will provide a focus for dissemination to the academic and commercial sector and are expected to attract funds from elsewhere such that the full programme can be realised.
(… WHAT WE SAID WE COULD DO IN THE PROPOSAL)

Grid - A Single Resource
Many millions of events; petabytes of data storage; many 1000s of computers required; various conditions; many samples; distributed resources; heterogeneous operating systems; worldwide collaboration.
GRID: a unified approach.

Grid - What's been happening? A lot…
- OGSA: GGF4, OGSA and support of IBM (and others) [as opposed to the .NET development framework and passports to access services]. Timescale? September 2002.
- W3C architecture for web services.
- Chose (gzipped) XML as opposed to other solutions for metadata descriptions… and web-based interfaces.
- Linux [as opposed to other platforms… Lindows??].
- C++ (experiments) and C, Java (middleware) APIs [Mono - Open Source implementation of the .NET Development Framework??].

GridPP Context
Provide architecture and middleware. Build Tier-A/prototype Tier-1 and Tier-2 centres in the UK and join the worldwide effort to develop middleware for the experiments. Running US experiments use the Grid with real data; future LHC experiments use the Grid with simulated data.

EDG TestBed 1 Status
Web interface showing status of (~400) servers at testbed 1 sites. GRID: a unified approach; extend to all expts.

LHC computing at a glance (1. Scale)
- The investment in LHC computing will be massive: the LHC Review estimated 240 MCHF (before the LHC delay); 80 MCHF/y afterwards.
- These facilities will be distributed: political as well as sociological and practical reasons.
- Europe: 267 institutes, 4603 users. Elsewhere: 208 institutes, 1632 users.

RTAG Status (7. LCG)
6 RTAGs created to date:
- RTAG 1 (Persistency Framework; status: completed)
- RTAG 2 (Managing LCG Software; status: running)
- RTAG 3 (Math Library Review; status: running)
- RTAG 4 (GRID Use Cases; status: starting)
- RTAG 5 (Mass Storage; status: running)
- RTAG 6 (Regional Centres; status: starting)
Two more in an advanced state of preparation: Simulation components; Data Definition Tools.

Fabrics & Grid Deployment (7. LCG)
Level 1 Milestone: deploy a Global Grid Service within 1 year:
- sustained 24x7 service
- including sites from three continents
- identical or compatible Grid middleware and infrastructure
- several times the capacity of the CERN facility
- and as easy to use
Ongoing work at CERN to increase automation and streamline configuration, especially for the migration to Red Hat 7.2. Aim to phase out old CERN solutions by mid-2003.

LCG Timeline (1. Timescale)
Milestones by quarter, 2002-2005: Prototype of Hybrid Event Store (Persistency Framework); Hybrid Event Store available for general users; distributed production using grid services; Full Persistency Framework; distributed end-user interactive analysis; First Global Grid Service (LCG-1) available; LCG-1 reliability and performance targets; "50% prototype" (LCG-3) available; LHC Global Grid TDR.

Be a part of this? LCG Development - Long Term Attachment at CERN
This will enable Grid developments in the UK to be (more) fully integrated with long-term Grid development plans at CERN. The proposed mechanism is:
1. Submit a short one-page outline of current and proposed work, noting how this work can best be developed within a named team at CERN, by e-mail to the GridPP Project Leader (Tony Doyle) and GridPP CERN Liaison (Tony Cass).
2. This case will be discussed at the following weekly GridPP PMB meeting and outcomes will be communicated as soon as possible by e-mail after that meeting.
Notes
1. The minimum period for LTA is 3 months. It is expected that a work programme will typically be for 6 months (or more).
2. Prior DataGrid and LHC (or other) experiments' Grid work is normally expected.
3. It is worthwhile reading http://cern.ch/lcg/peb/applications in order to get an idea of the areas covered, and the emphasis placed, by the LCG project on specific areas (building upon DataGrid and LHC experiments' developments).
4. Please send all enquiries and proposals to Tony Doyle and Tony Cass.

Summary of LCG (7. LCG)
- Project got under way early this year.
- Launch workshop and early RTAGs give good input for high-level planning… to be presented to LHCC in July.
- New plan takes account of first beam in 2007.
- No serious problems foreseen in synchronising LCG plans with those of the experiments.
- Collaboration with the many Grid projects needs more work.
- Technical collaboration with the Regional Centres has to be established.
- Recruitment of special staff going well (but need to keep the recruitment momentum going).
- Serious problem with materials funding.

Building upon Success (6. DataGrid)
The most important criterion for establishing the status of this project was the European Commission review on March 1st 2002. The review report of project IST-2000-25182 DATAGRID is available from PPARC. The covering letter states: "As a general conclusion, the reviewers found that the overall performance of the project is good and in some areas beyond expectations." The reviewers state: "The deliverables due for the first review were in general of excellent quality, and all of them were available on time… All deliverables are approved. The project is doing well, exceeding expectations in some areas, and coping successfully with the challenges due to its size."

6. DataGrid

WP1 - Workload Management (Job Submission) (6. DataGrid)
1. Authentication: grid-proxy-init
2. Job submission to DataGrid: dg-job-submit (important to implement this for all experiments…)
3. Monitoring and control: dg-job-status, dg-job-cancel, dg-job-get-output
4. Data publication and replication (WP2): globus-url-copy, GDMP
5. Resource scheduling - use of CERN MSS: JDL, sandboxes, storage elements
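To make the sequence concrete, the sketch below walks through steps 1-4 as a user would type them on an EDG Testbed 1 user-interface machine. This is a minimal sketch: the JDL attributes are typical of EDG releases of this period, while the script name, job identifier and hostnames are hypothetical placeholders.

    # 1. Authentication: create a short-lived proxy from the user's certificate
    grid-proxy-init

    # 2. Describe the job in a JDL file, then submit it to the Resource Broker
    cat > myjob.jdl <<'EOF'
    Executable    = "run-analysis.sh";            // hypothetical user script
    StdOutput     = "analysis.out";
    StdError      = "analysis.err";
    InputSandbox  = {"run-analysis.sh"};          // shipped to the worker node
    OutputSandbox = {"analysis.out", "analysis.err"};
    EOF
    dg-job-submit myjob.jdl                       # prints a job identifier

    # 3. Monitoring and control, quoting the identifier returned above
    dg-job-status <jobid>
    dg-job-cancel <jobid>                         # only if the job must be killed
    dg-job-get-output <jobid>                     # retrieves the output sandbox

    # 4. Data publication and replication (WP2)
    globus-url-copy file:///tmp/analysis.out gsiftp://<se-host>/flatfiles/analysis.out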

WP2 - Spitfire (6. DataGrid)

WP3 - R-GMA (6. DataGrid)
Architecture diagram: sensor code publishes through the Producer API and user code monitors output through the Consumer API; Producer, Consumer, Archiver, DBProducer, Registry and Schema servlets mediate between them, with user code also able to use the Archiver and Registry APIs. Builds on R-GMA database structures ("Event Dictionary").
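R-GMA presents monitoring information as if it were a single relational database: a producer declares a table and streams tuples into it, a consumer poses an SQL-style query, and the Registry matches consumers to the producers that can answer them. The following is a conceptual sketch of that contract, not the literal WP3 API; the table name and columns are invented for illustration.

    -- Producer side: declare the published table, then stream rows into it
    CREATE TABLE cpuLoad (site VARCHAR, host VARCHAR, load REAL, measured TIMESTAMP);
    INSERT INTO cpuLoad VALUES ('RAL', 'node01', 0.42, '2002-05-15 09:00:00');

    -- Consumer side: a continuous query; the Registry locates matching producers
    SELECT host, load FROM cpuLoad WHERE site = 'RAL';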

WP4 - LCFG (6. DataGrid)

WP5 - Storage Element (6. DataGrid)
Data flow diagram for the SE. The interface layer (GridFTP, GridRFIO, /grid, OGSA and SRM interfaces) presents a consistent interface to MSS. The core layer routes requests through a request manager, queue manager and pipe manager, using named pipes, a handler and a pipe store. The bottom layer reaches disk (RAID arrays), tape and MSS back ends (Castor, HPSS, DMF, Enstore, MSM) over the network interface.

WP6 - TestBed 1 Status (6. DataGrid)
Web interface showing status of (~400) servers at testbed 1 sites. GRID: extend to all expts.

WP7 - Network Monitoring (6. DataGrid)

WP7 - EDG Authorisation: grid-mapfile generation (6. DataGrid)
Users hold authentication certificates under ou=People, o=xyz, dc=eu-datagrid, dc=org (e.g. CN=Mario Rossi, CN=John Smith, CN=Franz Elmer). VO membership is listed in a VO Directory (ou=Testbed1, o=testbed, dc=eu-datagrid, dc=org) alongside an "Authorization Directory". The mkgridmap tool combines the VO directories with local users and a ban list to generate the grid-mapfile.
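Schematically, mkgridmap reads a small configuration naming the VO directory to trust plus any local overrides, and emits grid-mapfile lines mapping certificate subjects to local accounts. The sketch below reuses the DNs from the diagram; the directive spelling, the LDAP hostname and the pool-account name are assumptions, as the exact syntax varied between EDG releases.

    # mkgridmap.conf (schematic)
    group ldap://vo-server.eu-datagrid.org/ou=Testbed1,o=testbed,dc=eu-datagrid,dc=org  .testbed
    deny  "/o=xyz/dc=eu-datagrid/dc=org/ou=People/CN=Banned User"      # ban list

    # resulting grid-mapfile: certificate subject -> local account
    "/o=xyz/dc=eu-datagrid/dc=org/ou=People/CN=Mario Rossi"  .testbed
    "/o=xyz/dc=eu-datagrid/dc=org/ou=People/CN=John Smith"   .testbed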

WP8 - Applications (6. DataGrid)
1. Realistic Large-Scale Tests: reliability! Need a reliable dg-job-* command suite.
2. Data management: reliability! Need a reliable gdmp-* command suite and file-transfer commands.
3. Mass Storage Support: working access to MSS (CASTOR and HPSS at CERN, Lyon).
4. Lightweight User Interface: put on a laptop or std. desktop machine.
5. Portability: demonstrable portability of middleware: a) use other resources, b) debugging.
6. Scratch Space: a job requests X amount of scratch space to be available during execution; the system tells the job where it is.
7. Output File Support: JDL support for output files: specify where output should go in the JDL, not in the job script.

Expt. Feedback (4. and 5. Expts)

5. Other Expts / 8. Interoperability = Minimal e-Bureaucracy

GRID JOB SUBMISSION - External User Experience (5. Other Expts)

Things Missing, apparently… (5. Other Expts)

Expt. Feedback (4. and 5. Expts)

GridPP Poster (3. Dissemination)

Tier-1/A EDG Poster (3. Dissemination)

BaBar Poster (3. Dissemination)

LHCb Poster (3. Dissemination)

ScotGRID Poster (3. Dissemination)

Identifiable Progress... (3. Dissemination): t0 to t1.

WebLog: allows every area/sub-group to have its own 'news' pages.

GridPP & Core e-Science Centres (3. Dissemination)
Written formally to all e-Science centres inviting contact and collaboration with GridPP.
- NeSC: close ties; hosted the 2nd GridPP Collaboration Meeting; collaboration on the EDIKT project? Training...
- Belfast: replied but not yet up and running.
- Cambridge: close ties; hosted the 3rd GridPP Collaboration Meeting; share one post with GridPP; will collaborate on ATLAS Data Challenges.
- Cardiff: replied - contacts through QM (Vista) and the Brunel GridPP group.

GridPP & Core e-Science Centres (3. Dissemination)
- London: no formal reply but close contacts through the IC HEP group; IC will host the 5th GridPP Collaboration Meeting.
- Manchester: no collaborative projects so far; the Manchester HEP group will host the 4th GridPP Collaboration Meeting.
- Newcastle: in contact - database projects?
- Oxford: close ties; collaboration between the Oxford HEP group and GridPP on establishing a central Tier-2 centre? CS/Core-GridPP-EDG links? Probably host the 6th GridPP Collaboration Meeting.
- Southampton: replied but no collaboration as yet.

GLUE (8. Interoperability)
How do we integrate with developments from elsewhere in order to ensure the development of a common set of principles, protocols and standards that can support a wide range of applications? GGF… Within the Particle Physics community, these ideas are currently encapsulated in the Grid Laboratory Uniform Environment (GLUE). Recommend this as a starting point for the wider deployment of Grids across the Atlantic. See http://www.hicb.org/glue/GLUE-v0.1.doc (Ruth Pordes et al.)

8. Interoperability

UK Tier-A/prototype Tier-1 Centre (9. Infrastructure)
Roles:
- Tier-A Centre for BaBar
- EDG testbed(s)
- LCG prototype Tier-1 Centre
- prototype Tier-1 for LHC experiments (Data Challenges independent of LCG development…)
- interworking with other UK resources (JIF, JREI, eSC)
- UK portal for existing LEP, DESY and non-accelerator experiments
Purchases: first year = Hardware Advisory Group (HAG1) to determine the balance between CPU, disk and tape, with experts on specific technologies; propose more HAGs (2 and 3)..
Needs to be successful in all roles...

Rollout of the UK Grid for PP (9. Infrastructure)
- Operational stability of GridPP middleware = Testbed team, the "gang of four": Andrew McNab, Steve Traylen, Dave Colling (other half) and Owen Moroney.
- Ensures the release of "Testbed"-quality EDG software: documentation; a lead for other system managers in terms of implementation; pre-defined software cycle releases (2 months..).
- Subject of the Rollout Plan: "Planning for EDG Testbed software deployment and support at participating UK sites" (Pete Clarke, John Gordon).
- LCG is the proposed mechanism by which the EDG testbed at CERN becomes an LCG Grid Service. The evolution of the EDG testbed to the LCG Grid Service will take account of both EDG and US grid technology. Need to take account of this..

Longer Term.. (9. Infrastructure)
The LCG Grid Service takes account of EDG and US grid technology: a large-scale Grid resource, consistent with the LCG timeline, within the UK. Scale in UK? 0.5 PBytes and 2,000 distributed CPUs = GridPP in Sept 2004 ("50% prototype").

£17m 3-Year Project (Dave Britton) (10. Finances)
Five components:
- Tier-1/A = Hardware + ITD Support Staff
- DataGrid = DataGrid Posts + PPD Staff
- Applications = Experiments Posts
- Operations = Travel + Management + Early Investment
- CERN = LCG posts + Tier-0 + LTA

1. Recruitment
- EDG Funded Posts (Middleware/Testbed): all 5 in post + 1 additional.
- EDG Unfunded Posts (Middleware/Testbed): 15 out of 15 in post.
- GridPP Posts (Applications + Tier-1/A): allocated Dec 2001; 13 out of 15 in post.
- CERN Posts: first round = 105 applicants, 12 offers, 9 accepted (4 in Applications, 2 in Data Management, 3 in Systems); second round = 140 applicants, 9 offers; third round ~ 70 applicants; aim ~ 28 posts.

2. Monitoring Staff Effort [SM] (Robin Middleton)

3. Progress towards deliverables.. (Pete Clarke)

Next steps.. (10. Finances)
- O(100k): CLRC support through to Sept 04; other experiments - unfunded in the peer review process; Tier-2 centres - unfunded initially.
- £2.3m: eDIKT (e-Data, Information and Knowledge Transformation) [SHEFC]. Particle Physics = application area; assignment of two (of twelve) FTEs in initial planning. Discussions ongoing with EPCC.
- O(€100m): the first call for Framework VI will be early next year. A call is out now for expressions of interest for new networks and integrated projects. A draft document led by David Williams (CERN), "Enabling Grids and e-Science in Europe", plans to extend the current paradigms with CERN at its focus as the European e-Science Centre. We believe this is the right approach: it incorporates the UK's e-Science agenda, adding a European dimension, and it recognises the central role of CERN and builds upon the recent successes of EDG. PPARC contact: Neil Geddes.

Testbed Status Overview (Metrics)
Table of UK testbed sites (Birmingham, Bristol, Brunel, Cambridge, Edinburgh, Glasgow, Imperial, Lancaster, Liverpool, Manchester, Oxford, QMUL, RAL, RHUL, UCL) against the columns Green Dot, G1.1.3, G2.0(b), EDG-CE and Babar-CE; Birmingham, Bristol, Edinburgh, Glasgow, Manchester, Oxford, QMUL and RAL carry green dots.
Andrew McNab - Manchester HEP - 10 May 2002

What is in place in the UK testbed? (an RB-centric view of the world) (Metrics)
Only GridPP and BaBar VOs. The Resource Broker, Information Index, Job Submission Service and Logging & Bookkeeping run at Imperial, alongside a Replica Catalogue. Site services: Imperial (CE, SE, UI), RAL (CE, SE, UI), Birmingham (CE, UI), Liverpool (CE, UI), Bristol (CE, UI), QMUL, RHUL, and an IN2P3-BaBar UI.

Grid Support Centre (8. Interoperability, Metrics)
- The UKHEP CA uses primitive technology: it works but takes effort. 201 personal certs issued (98 still valid); 119 other certs issued (93 still valid).
- The GSC will run a CA for the UK e-Science Certification Authority: it uses openCA, and the Registration Authority uses the web. We plan to use it. The namespace identifies the RA, not the project.
- Through the GSC we have access to the skills of CLRC eSC. Use the helpdesk to formalise support later in the rollout.
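To illustrate what "namespace identifies the RA, not the project" means in practice: a subject DN issued by the new CA would encode the registration authority (institution and unit) rather than an experiment or project name. The concrete fields below are an assumption for illustration, not a statement of the CA's actual naming policy.

    # Hypothetical subject DN under the UK e-Science CA namespace:
    # C (country), O (CA organisation), OU and L (registration authority), CN (person)
    /C=UK/O=eScience/OU=Glasgow/L=Compserv/CN=Jane Smith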

Summary
A vision is only useful if it's shared. Grid success is fundamental for PP.
1. Scale in UK? 0.5 PBytes and 2,000 distributed CPUs = GridPP in Sept 2004
2. Integration - ongoing..
3. Dissemination - external and internal
4. LHC Analyses - ongoing feedback mechanism..
5. Other Analyses - closely integrated using EDG tools
6. DataGrid - major investment = must be (and is so far) successful
7. LCG - Grid as a Service
8. Interoperability - sticky subject
9. Infrastructure - Tier-A/1 in place, Tier-2s to follow…
10. Finances - (very well) under control
11. Next steps on Framework VI.. CERN = EU's e-science centre? Co-operation required with other disciplines/industry
12. Monitoring mechanisms in place; emphasis on deliverables

Executive Summary
- Significant progress... The project is now well defined in a broad sense and is progressing on a series of fronts.
- We have responded and outlined our plans to address the concerns of the last OC concerning: 1. WP5; 2. the rollout plan; 3. monitoring instruments; 4. metrics for success.
- The project has demonstrated progress in: 1. widespread deployment of EDG testbeds in the UK; 2. integration with specific experimental areas (BaBar, UKDMC and LISA); and 3. demonstrating Grid deployment in the UK at the NeSC opening.
- We see various challenges ahead: 1. development of more detailed metrics and monitoring of outputs; 2. management of changes due to external developments (e.g. OGSA); 3. development of Tier-2 deployment; 4. engagement of the UK HEP community; and 5. future funding initiatives such as Framework VI.