Data Grids: Enabling Data Intensive Global Science. Paul Avery, University of Florida, avery@phys.ufl.edu. Physics Colloquium, University of Mississippi, November 11, 2003
Overview Ø Grids, Data Grids and examples Ø LHC computing as principal Grid driver Ø Grid & network projects Ø Sample Grid R&D efforts Ø Promising new directions Ø Success story for HEP u Leadership, partnership, collaboration
The Grid Concept Ø Grid: Geographically distributed computing resources configured for coordinated use u Fabric: Physical resources & networks provide raw capability u Middleware: Software ties it all together: tools, services, etc. u Ownership: Resources controlled by owners and shared w/ others Ø Goal: Transparent resource sharing
Grids and Resource Sharing Ø Resources for complex problems are distributed u Advanced scientific instruments (accelerators, telescopes, …) u Storage, computing, people, institutions Ø Organizations require access to common services u Research collaborations (physics, astronomy, engineering, …) u Government agencies, health care organizations, corporations, … Ø Grids make possible “Virtual Organizations” u Create a “VO” from geographically separated components u Make all community resources available to any VO member u Leverage strengths at different institutions Ø Grids require a foundation of strong networking u Communication tools, visualization u High-speed data transmission, instrument operation
Grid Challenges Ø Operate a fundamentally complex entity u Geographically distributed resources u Each resource under different administrative control u Many failure modes Ø Manage workflow of 1000s of jobs across Grid u Balance policy vs. instantaneous capability to complete tasks u Balance effective resource use vs. fast turnaround for priority jobs u Match resource usage to policy over the long term Ø Maintain a global view of resources and system state u Coherent end-to-end system monitoring u Adaptive learning for execution optimization Ø Build managed system & integrated user environment
Data Grids & Data Intensive Sciences Ø Scientific discovery increasingly driven by data collection u Computationally intensive analyses u Massive data collections u Data distributed across networks of varying capability u Internationally distributed collaborations Ø Dominant factor: data growth (1 Petabyte = 1000 TB) u 2000: ~0.5 Petabyte u 2005: ~10 Petabytes u 2015: ~1000 Petabytes? How to collect, manage, access and interpret this quantity of data? Drives demand for “Data Grids” to handle additional dimension of data access & movement
Data Intensive Physical Sciences Ø High energy & nuclear physics u Belle/BaBar, Tevatron, RHIC, JLAB, LHC Ø Astronomy u Digital sky surveys: SDSS, VISTA, other Gigapixel arrays u VLBI arrays: multiple-Gbps data streams u “Virtual” Observatories (multi-wavelength astronomy) Ø Gravity wave searches u LIGO, GEO, VIRGO, TAMA Ø Time-dependent 3-D systems (simulation & data) u Earth observation u Climate modeling, oceanography, coastal dynamics u Geophysics, earthquake modeling u Fluids, aerodynamic design u Pollutant dispersal
Data Intensive Biology and Medicine Ø Medical data and imaging u X-ray, mammography data, etc. (many petabytes) u Radiation oncology (real-time display of 3-D images) Ø X-ray crystallography u Bright X-ray sources, e.g. Argonne Advanced Photon Source Ø Molecular genomics and related disciplines u Human Genome, other genome databases u Proteomics (protein structure, activities, …) u Protein interactions, drug delivery Ø High-res brain scans (1–10 µm, time dependent)
LHC and Data Grids
Large Hadron Collider (LHC) @ CERN: 27 km tunnel in Switzerland & France; experiments ATLAS, CMS, ALICE, LHCb, TOTEM. Search for Origin of Mass & Supersymmetry (2007 – ?)
CMS Experiment at LHC: “Compact” Muon Solenoid at the LHC (CERN), shown with the Smithsonian standard man for scale
LHC: Key Driver for Data Grids Ø Complexity: Millions of individual detector channels Ø Scale: PetaOps (CPU), Petabytes (Data) Ø Distribution: Global distribution of people & resources (2000+ physicists, 159 institutes, 36 countries)
LHC Data Rates: Detector to Storage (physics filtering): 40 MHz collision rate → Level 1 Trigger (special hardware): 75 KHz at 75 GB/sec → Level 2 Trigger (commodity CPUs): 5 KHz at 5 GB/sec → Level 3 Trigger (commodity CPUs): 100 Hz at 0.1–1.5 GB/sec raw data to storage (+ simulated data)
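The reduction factors implied by these rates can be checked with a few lines of arithmetic. A minimal sketch using only the numbers quoted on this slide (illustrative values, not an official experiment calculation):

    # Rough check of the trigger-chain reductions quoted above (rates from the slide).
    rates_hz = {"collisions": 40e6, "level 1": 75e3, "level 2": 5e3, "level 3": 100}
    stages = list(rates_hz)
    for prev, cur in zip(stages, stages[1:]):
        print(f"{prev} -> {cur}: rejection ~ {rates_hz[prev] / rates_hz[cur]:,.0f}x")
    # Overall: 40 MHz of collisions down to ~100 Hz written to storage,
    # i.e. a reduction of roughly 400,000x before any offline selection.
    print(f"overall: ~{rates_hz['collisions'] / rates_hz['level 3']:,.0f}x")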
LHC: Higgs Decay into 4 muons. 10^9 collisions/sec, selectivity: 1 in 10^13
LHC Data Requirements (CMS, ATLAS, LHCb) Storage: Ø Raw recording rate 0.1–1 GB/s Ø Accumulating at 5–8 PB/year Ø 10 PB of disk Ø ~100 PB total within a few years Processing: Ø 200,000 of today’s fastest PCs
Hierarchy of LHC Data Grid Resources (CMS Experiment): Online System → (100–1500 MBytes/s) → Tier 0: CERN Computer Center (>20 TIPS) → (10–40 Gbps) → Tier 1: national centers (Korea, Russia, UK, USA) → (2.5–10 Gbps) → Tier 2 Centers → (1–2.5 Gbps) → Tier 3: institutes (physics caches) → (1–10 Gbps) → Tier 4: PCs. ~10s of Petabytes/yr by 2007–8; ~1000 Petabytes in < 10 yrs?
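A back-of-envelope transfer-time estimate makes clear why multi-Gbps tier links are needed. A minimal sketch, assuming ideal, fully utilized links and the illustrative volumes and speeds from the slide above:

    # How long would it take to move ~10 PB (roughly a year of data) over one link?
    def transfer_days(volume_pb, link_gbps):
        bits = volume_pb * 1e15 * 8            # petabytes -> bits (decimal units)
        seconds = bits / (link_gbps * 1e9)     # ideal throughput, no protocol overhead
        return seconds / 86400

    for gbps in (1, 2.5, 10, 40):
        print(f"10 PB over a {gbps:>4} Gbps link: ~{transfer_days(10, gbps):.0f} days")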
Most IT Resources Outside CERN (2008 resources)
LHC and Global Knowledge Communities: Non-hierarchical use of Data Grid
Data Grid Projects
Global Context: Data Grid Projects Ø U.S. Infrastructure Projects u GriPhyN (NSF) u iVDGL (NSF) u Particle Physics Data Grid (DOE) u PACIs and TeraGrid (NSF) u DOE Science Grid (DOE) u NSF Middleware Infrastructure (NSF) Ø EU, Asia major projects u European DataGrid (EU) u EDG-related national projects u LHC Computing Grid (CERN) u EGEE (EU) u CrossGrid (EU) u DataTAG (EU) u GridLab (EU) u Japanese Grid projects u Korea Grid project Ø Not exclusively HEP (LIGO, SDSS, ESA, Biology, …) Ø But most driven/led by HEP Ø Many $M brought into the field
EU DataGrid Project
U.S. Particle Physics Data Grid (DOE funded) u Funded 1999–2004 @ US$9.5M (DOE) u Driven by HENP experiments: D0, BaBar, STAR, CMS, ATLAS u Maintains practical orientation: Grid tools for experiments
U.S. GriPhyN and iVDGL Projects Ø Both funded by NSF (ITR/CISE + Physics) u GriPhyN: $11.9M (NSF) (2000–2005) u iVDGL: $14.0M (NSF) (2001–2006) Ø Basic composition (~120 people) u GriPhyN: 12 universities, SDSC, 3 labs u iVDGL: 20 universities, SDSC, 3 labs, foreign partners u Expts: CMS, ATLAS, LIGO, SDSS/NVO Ø Grid research/infrastructure vs. Grid deployment u GriPhyN: CS research, Virtual Data Toolkit (VDT) development u iVDGL: Grid laboratory deployment using VDT u 4 physics experiments provide frontier challenges Ø Extensive student involvement u Undergrads, postdocs participate at all levels u Strong outreach component
GriPhyN/iVDGL Science Drivers Ø LHC experiments (2007) u High energy physics u 100s of Petabytes Ø LIGO (2002) u Gravity wave experiment u 100s of Terabytes Ø Sloan Digital Sky Survey (2001) u Digital astronomy (1/4 sky) u 10s of Terabytes. Common requirements (growing in both data and community size): Ø Massive CPU (PetaOps) Ø Large distributed datasets (>100 PB) Ø International (global) communities (1000s)
GriPhyN: PetaScale Virtual-Data Grids (architecture figure): production teams, workgroups, and single researchers use interactive user tools; virtual data tools, request planning & scheduling tools, and request execution & management tools sit on top of resource management, security and policy, and other Grid services; beneath them are distributed resources (code, storage, CPUs, networks), transforms, and raw data sources; the target scale is PetaOps, Petabytes, and high performance
International Virtual Data Grid Laboratory (Fall 2003): Tier 1, Tier 2, and Tier 3 sites across the U.S., including SKC, LBL, Caltech, Wisconsin, Michigan, PSU, Fermilab, Argonne, Indiana, Oklahoma, FSU, Arlington, Brownsville, UF, BNL, J. Hopkins, Hampton, Vanderbilt, UCSD/SDSC, Boston U, FIU. Partners: u EU u Brazil u Korea u Japan?
Testbed Successes: US-CMS Example Ø Extended runs for Monte Carlo data production u 200K event test run identified many bugs in core Grid middleware u 2 months continuous running across 5 testbed sites (1.5M events) u Demonstrated at Supercomputing 2002
LCG: LHC Computing Grid Project Ø Prepare & deploy computing environment for LHC expts u Common applications, tools, frameworks and environments u Emphasis on collaboration, coherence of LHC computing centers u Deployment only: no middleware development Ø Move from testbed systems to real production services u Operated and supported 24x7 globally u Computing fabrics run as production physics services u A robust, stable, predictable, supportable infrastructure Ø Need to resolve some issues u Federation vs. integration u Grid tools and technologies
Sep. 29, 2003 announcement
Current LCG Sites
Sample Grid R&D
Sphinx Grid-Scheduling Service (architecture figure): a Sphinx client side (Chimera Virtual Data System, Condor-G/DAGMan; the VDT client) sends requests to the Sphinx server, whose request processing, data management, information gathering, and information warehouse components draw on the Replica Location Service and the MonALISA monitoring service to schedule work onto Globus resources at VDT server sites
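The kind of decision such a scheduler makes can be illustrated with a toy example. This is a sketch of policy-aware site ranking under assumed monitoring data, not the actual Sphinx interface or algorithm; site names and fields are made up:

    # Toy site selection: rank candidate sites by monitored capacity and queue depth,
    # subject to a simple per-VO usage cap.
    sites = [
        {"name": "site_a", "free_cpus": 120, "queued_jobs": 40, "vo_share_used": 0.30},
        {"name": "site_b", "free_cpus": 10,  "queued_jobs": 5,  "vo_share_used": 0.90},
        {"name": "site_c", "free_cpus": 60,  "queued_jobs": 0,  "vo_share_used": 0.50},
    ]

    def eligible(site, vo_cap=0.80):
        return site["vo_share_used"] < vo_cap               # long-term policy check

    def score(site):
        return site["free_cpus"] - 2 * site["queued_jobs"]  # crude turnaround estimate

    best = max((s for s in sites if eligible(s)), key=score)
    print("schedule next job at:", best["name"])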
GAE: Grid Analysis Environment Ø GAE is crucial for LHC experiments u Large, diverse, distributed community of users u Support for 100s–1000s of analysis tasks, over dozens of sites u Dependent on high-speed networks Ø GAE is where the physics gets done: analysis teams u Team structure: local, national, global u Teams share computing, storage & network resources Ø But the global system has finite resources u Widely varying task requirements and priorities u Need for robust authentication and security u Need to define and implement collaboration policies & strategies
CAIGEE: A Prototype GAE Ø CMS Analysis – an Integrated Grid Enabled Environment u Exposes “Global System” to physicists u Supports data requests, preparation, production, movement, analysis u Targets US-CMS physicists
Grid-Enabled Analysis Prototypes: Collaboration Analysis Desktop, JASOnPDA (via Clarens), COJAC (via Web Services), ROOT (via Clarens)
Virtual Data: Derivation and Provenance Ø Most scientific data are not simple “measurements” u They are computationally corrected/reconstructed u They can be produced by numerical simulation Ø Science & eng. projects are more CPU and data intensive u Programs are significant community resources (transformations) u So are the executions of those programs (derivations) Ø Management of dataset transformations important! u Derivation: Instantiation of a potential data product u Provenance: Exact history of any existing data product. We already do this, but manually!
Virtual Data Motivations (1) (diagram: data is consumed-by/generated-by a transformation; a derivation is an execution-of a transformation; data is a product-of a derivation) “I’ve found some interesting data, but I need to know exactly what corrections were applied before I can trust it.” “I’ve detected a muon calibration error and want to know which derived data products need to be recomputed.” “I want to search a database for 3 muon SUSY events. If a program that does this analysis exists, I won’t have to write one from scratch.” “I want to apply a forward jet analysis to 100M events. If the results already exist, I’ll save weeks of computation.”
Virtual Data Motivations (2) Ø Data track-ability and result audit-ability u Universally sought by scientific applications Ø Facilitate resource sharing and collaboration u Data is sent along with its recipe u A new approach to saving old data: economic consequences? Ø Manage workflow u Organize, locate, specify, request data products Ø Repair and correct data automatically u Identify dependencies, apply transformations Ø Optimize performance u Re-create data or copy it (caches). Overall: from manual/error-prone to automated/robust
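A minimal sketch of the bookkeeping behind these motivations, assuming a toy in-memory catalog rather than the real Chimera virtual data catalog; names and fields are illustrative only:

    # Toy virtual-data catalog: record how each product was derived so its exact
    # history can be audited and the product re-created (or reused) on demand.
    catalog = {}   # product name -> {"transformation", "inputs", "params"}

    def register(product, transformation, inputs, params):
        catalog[product] = {"transformation": transformation,
                            "inputs": list(inputs), "params": dict(params)}

    def provenance(product):
        """Walk back through recorded derivations: the product's exact history."""
        rec = catalog.get(product)
        if rec is None:
            return [f"{product} (raw data)"]
        line = f"{product} = {rec['transformation']}({', '.join(rec['inputs'])}, {rec['params']})"
        history = [line]
        for parent in rec["inputs"]:
            history.extend(provenance(parent))
        return history

    register("hits_v2", "calibrate", ["raw_run_17"], {"calib_version": 2})
    register("muons", "reconstruct", ["hits_v2"], {"pt_min_gev": 20})
    print("\n".join(provenance("muons")))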
LHC Analysis with Virtual Data (example derivation tree): a simulated sample with mass = 160 branches into decay = bb, decay = ZZ, and decay = WW; the WW branch is refined further (WW → leptons; WW → eν with other cuts); a scientist then adds a new derived data branch, e.g. mass = 160, decay = WW, WW → eν, Pt > 20, other cuts
Chimera Virtual Data System Ø Virtual Data Language (VDL) u Describes virtual data products Ø Virtual Data Catalog (VDC) u Used to store VDL Ø Abstract Job Flow Planner u Creates a logical DAG (dependency graph) Ø Concrete Job Flow Planner u Interfaces with a Replica Catalog u Provides a physical DAG submission file to Condor-G/DAGMan Ø Generic and flexible u As a toolkit and/or a framework u In a Grid environment or locally. (Figure: VDL (XML) → Virtual Data Catalog → Abstract Planner → DAX → Concrete Planner + Replica Catalog → DAG → DAGMan; logical on the VDC side, physical on the DAG side; applied to virtual data & CMS production with MCRunJob)
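The abstract-to-concrete planning step can be illustrated with a toy example. This is not real Chimera/VDL or DAX syntax, just a sketch of resolving logical files against an assumed replica catalog and skipping work whose outputs already exist:

    # Toy abstract -> concrete planning. The abstract DAG refers only to logical
    # file names; the concrete plan binds them to physical replicas and drops
    # steps whose outputs are already materialized somewhere.
    abstract_dag = [
        {"job": "generate",    "inputs": [],             "outputs": ["events.lfn"]},
        {"job": "simulate",    "inputs": ["events.lfn"], "outputs": ["hits.lfn"]},
        {"job": "reconstruct", "inputs": ["hits.lfn"],   "outputs": ["aod.lfn"]},
    ]
    replica_catalog = {"events.lfn": "gsiftp://site_a/store/events.root"}  # assumed entry

    def concretize(dag, replicas):
        plan = []
        for step in dag:
            if all(f in replicas for f in step["outputs"]):
                continue                      # products exist: reuse, don't recompute
            plan.append({"run": step["job"],
                         "stage_in": [replicas[f] for f in step["inputs"] if f in replicas]})
            for f in step["outputs"]:         # pretend the job registers its outputs
                replicas[f] = f"gsiftp://site_b/store/{f}"
        return plan

    for job in concretize(abstract_dag, dict(replica_catalog)):
        print(job)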
Test: Sloan Galaxy Cluster Analysis (Sloan data; galaxy cluster size distribution)
Promising New Directions
HEP: Driver for International Networks Ø BW in Mbps (2001 estimates) Ø Now seen as too conservative!
HEP & Network Land Speed Records Ø 9/01: 102 Mbps CIT-CERN Ø 5/02: 450–600 Mbps SLAC-Manchester Ø 9/02: 1900 Mbps Chicago-CERN Ø 11/02 [LSR]: 930 Mbps California-CERN Ø 11/02 [LSR]: 9.4 Gbps in 10 flows California-Chicago Ø 2/03 [LSR]: 2.38 Gbps in 1 stream California-Geneva Ø 10/03 [LSR]: 5 Gbps in 1 stream. HEP/LHC driving network developments: Ø New network protocols Ø Land speed records Ø ICFA networking chair Ø HENP working group in Internet2
U.S. Grid Coordination: Trillium Ø Trillium = GriPhyN + iVDGL + PPDG u Large overlap in leadership, people, experiments u HEP primary driver, but other disciplines too Ø Benefit of coordination u Common software base + packaging: VDT + Pacman u Wide deployment of new technologies, e.g. Virtual Data u Stronger, broader outreach effort u Unified U.S. entity to interact with international Grid projects Ø Goal: establish permanent production Grid u Short term: Grid2003 u Medium term: Grid2004, etc. (increasing scale) u Long term: Open Science Grid
Grid2003 Ø 27 sites (U.S., Korea) Ø ~2000 CPUs Ø SC2003 demo
Open Science Grid Ø http://www.opensciencegrid.org/ u Specific goal: Support US-LHC research program u General goal: U.S. Grid supporting other disciplines Ø Funding mechanism: DOE/NSF u Laboratories (DOE) and universities (NSF) u Sep. 17 NSF meeting: physicists, educators, NSF/EHR, QuarkNet Ø Getting there: “Functional Demonstration Grids” u Grid2003, Grid2004, Grid2005, … u New release every 6–12 months, increasing functionality & scale u Constant participation in LHC computing exercises
CHEPREO: Center for High Energy Physics Research and Educational Outreach, Florida International University § E/O Center in Miami area § iVDGL Grid Activities § CMS Research § AMPATH network (S. America) Funded September 2003
UltraLight: 10 Gb/s Network (submitted Nov. 10, 2003): 10 Gb/s+ network • Caltech, UF, FIU, UM, MIT • SLAC, FNAL, BNL • Int’l partners • Cisco, Level(3), Internet2
A Global Grid Enabled Collaboratory for Scientific Research (GECSR) Ø Main participants u Michigan u Caltech u Maryland u FIU Ø First Grid-enabled Collaboratory Ø Tight integration between u Science of Collaboratories u Globally scalable work environment u Sophisticated collaborative tools (VRVS, VNC; next-gen) u Monitoring system (MonALISA) Ø Initial targets: global HEP collaborations Ø Applicable to other large-scale scientific endeavors
Dynamic Workspaces Enabling Global Analysis Communities
GLORIAD: US-Russia-China Network Ø New 10 Gb/s network linking US-Russia-China u Plus Grid component linking science projects Ø Meeting at NSF April 14 with US-Russia-China reps. u HEP people (Hesheng, et al.) Ø Broad agreement that HEP can drive Grid portion u Other applications will be solicited Ø More meetings planned
Grids: Enhancing Research & Learning Ø Fundamentally alters conduct of scientific research u “Lab-centric”: Activities center around large facility u “Team-centric”: Resources shared by distributed teams u “Knowledge-centric”: Knowledge generated/used by a community Ø Strengthens role of universities in research u Couples universities to data intensive science u Couples universities to national & international labs u Brings front-line research and resources to students u Exploits intellectual resources of formerly isolated schools u Opens new opportunities for minority and women researchers Ø Builds partnerships to drive advances in IT/science/eng u HEP with physics, astronomy, biology, CS, etc. u “Application” sciences with Computer Science u Universities with laboratories u Scientists with students u Research community with IT industry
HEP’s Broad Impact and Relevance Ø HEP is recognized as the strongest science driver for Grids u (In collaboration with computer scientists) u LHC a particularly strong driving function Ø Grid projects are driving important network developments u “Land speed records” attract much attention u ICFA-SCIC, I-HEPCCC, US-CERN link, ESNET, Internet2 Ø We are increasing our impact on education and outreach u Providing technologies, resources for training, education, outreach Ø HEP involvement in Grid projects has helped us! u Many $M brought into the field u Many visible national and international initiatives u Partnerships with other disciplines increasing our visibility u Recognition at high levels (NSF, DOE, EU, Asia)
Summary Ø Progress occurring on many fronts u CS research, 11 VDT releases, simplified installation u Testbeds, productions based on Grid tools using iVDGL resources Ø Functional Grid testbeds providing excellent experience u Real applications u Scaling up sites, CPUs, people Ø Collaboration occurring with more partners u National, international (Asia, South America) u Testbeds, monitoring, deploying VDT more widely Ø New directions being followed u Networks: Increased capabilities (bandwidth, efficiency, services) u Packaging: Auto-install + run + exit at remote computing sites u Virtual data: Powerful paradigm for scientific computing u Research: Collaborative and Grid tools for distributed teams
Grid References Ø Grid Book u www.mkp.com/grids Ø Globus u www.globus.org Ø Global Grid Forum u www.gridforum.org Ø PPDG u www.ppdg.net Ø GriPhyN u www.griphyn.org Ø iVDGL u www.ivdgl.org Ø TeraGrid u www.teragrid.org Ø EU DataGrid u www.eu-datagrid.org
Extra Slides
Some (Realistic) Grid Examples Ø High energy physics u 3,000 physicists worldwide pool Petaflops of CPU resources to analyze Petabytes of data Ø Fusion power (ITER, etc.) u Physicists quickly generate 100 CPU-years of simulations of a new magnet configuration to compare with data Ø Astronomy u An international team remotely operates a telescope in real time Ø Climate modeling u Climate scientists visualize, annotate, & analyze Terabytes of simulation data Ø Biology u A biochemist exploits 10,000 computers to screen 100,000 compounds in an hour
GriPhyN Goals Ø Conduct CS research to achieve vision u “Virtual Data” as unifying principle Ø Disseminate through Virtual Data Toolkit (VDT) u Primary deliverable of GriPhyN Ø Integrate into GriPhyN science experiments u Common Grid tools, services Ø Impact other disciplines u HEP, biology, medicine, virtual astronomy, eng. Ø Educate, involve, train students in IT research u Undergrads, postdocs, underrepresented groups
iVDGL Goals and Context Ø International Virtual-Data Grid Laboratory u A global Grid laboratory (US, EU, E. Europe, Asia, S. America, …) u A place to conduct Data Grid tests “at scale” u A mechanism to create common Grid infrastructure u A laboratory for other disciplines to perform Data Grid tests u A focus of outreach efforts to small institutions Ø Context of iVDGL in LHC computing program u Develop and operate proto-Tier 2 centers u Learn how to do Grid operations (GOC) Ø International participation u DataTAG partner project in EU u New international partners: Korea and Brazil u UK e-Science programme: support 6 CS Fellows per year in U.S.
ATLAS Simulations on iVDGL Resources (joint project with iVDGL): testbed map showing Fermilab (SDSS), UW Milwaukee (LIGO), Argonne, Chicago, Boston U, Michigan, BNL, LBL, Indiana, OU, UTA, and Florida (US CMS); legend distinguishes Tier 1, prototype Tier 2, and testbed sites
US-CMS Testbed: Korea, Wisconsin, MIT, Taiwan, Fermilab, Russia, Caltech, FSU, UCSD, Florida, Rice, FIU, Brazil
WorldGrid Demonstration (Nov. 2002) Ø Joint iVDGL + EU effort u Resources from both sides (15 sites) u Monitoring tools (Ganglia, MDS, NetSaint, …) u Visualization tools (Nagios, MapCenter, Ganglia) Ø Applications u CMS: CMKIN, CMSIM u ATLAS: ATLSIM Ø Submit jobs from US or EU u Jobs can run on any cluster Ø Major demonstrations u IST 2002 (Copenhagen) u SC 2002 (Baltimore)
WorldGrid Sites (Nov. 2002)
International Grid Coordination Ø Global Grid Forum (GGF) u International forum for general Grid efforts u Many working groups, standards definitions Ø Close collaboration with EU DataGrid (EDG) u Many connections with EDG activities Ø HICB: HEP Inter-Grid Coordination Board u Non-competitive forum, strategic issues, consensus u Cross-project policies, procedures and technology, joint projects Ø HICB-JTB Joint Technical Board u Definition, oversight and tracking u GLUE interoperability group Ø Participation of joint projects in LHC Computing Grid (LCG) u Software Computing Committee (SC2) u Project Execution Board (PEB) u Grid Deployment Board (GDB)