- Number of slides: 42
TeraGrid Science Support
Nancy Wilkins-Diehr, San Diego Supercomputer Center
Area Director for Science Gateways
Talk Outline
• User Support
– User Engagements
– Advanced Support for TeraGrid Applications (ASTA)
• Science Gateways
– Initial projects
– Deployment strategies
– Preparation for expansion
• Education, Outreach and Training (EOT)
Questions answered in this presentation
• Has user engagement been effective?
– How are user requirements investigated and defined?
– How are uncertainty and change in user requirements managed?
– How is usability evaluated, e.g., formatively and summatively?
– How are applications prioritized for implementation?
– What refinements or changes to 2.1-4 are envisaged?
• Has outreach been effective?
– How is the potential for the wider take-up of applications assessed?
– How are applications being adapted for use by wider user communities?
• Has training been effective?
– How is effectiveness being assessed?
– What quality control measures are in place for training materials?
– What refinements or changes to 4.1-2 are envisaged?
• Science Gateways: The TeraGrid report refers to a document, a Science Gateway primer, that reports on the general strategy for portal deployment. The reference given is http://wg.teragrid.org/Gateways, but this site is private (a password is needed). Please forward a copy of this document.
• We would like to be able to assess the maturity of the Science Gateways activities. Please provide appropriate information during the presentations.
– Are effective science portal building environments available to the user community? If so, what is available? I.e., what science portals that invoke simulations and/or manage massive data sets are in operation across TeraGrid and used by discipline science communities? If not, what is the progress toward this?
– Has a Grid/Web Services environment been established? To what extent is it used by the science community?
– What cross connections / resource sharing have been made with other Grids? How much effort and funding have been/will be invested in developing and testing inter-grid interoperability?
TeraGrid User Services
Sergiu Sanielevici, Pittsburgh Supercomputing Center
Area Director for User Services
Components of User Support
• 24/7 help desk integrating all sites
• Training and tutorials
• Extensive documentation
• TeraGrid User Portal
• User contact team
• Intensive support
– ASTA
– Science Gateways
• User Survey
TeraGrid User Portal Vision
• Integrate important user capabilities in one place:
– Information services:
• Documentation, training, real-time consulting
• Notification (news, MOTDs, next downtimes, etc.)
• Resource info, calendars, cross-site run scheduling
• Network info
– Account services:
• Allocation requests
• Allocation management & usage reporting
• Account management (including setting up grid credentials)
– Interactive services:
• Job launching
• File transfers
• Linear workflow
• Data mining
– Listing of and access to data collections
– Remote visualization (interactive), and eventually collaborative
• With personalizability and customizability, it can be a foundation for application portals and (some) science gateways
Proactive Approach to Discovering and Meeting User Requirements
• User Contact team for each allocated LRAC/MRAC project
– Results in the ability to understand, track and anticipate the evolving needs of users
• Codes specifically written or requested by allocated users receive highest installation priority
– Optimization, scaling, I/O, ETF network utilization, workflow mapping
– Overcoming application-level obstacles to portability and interoperation
– Resolving third-party package issues
• Intensive support for selected projects: the ASTA Program
Plans for 2006
• Improve reach and quality of the personalized, proactive user support system
• Improve tracking and logging of staff-user interactions
• Improve User Survey content, administration, and follow-up
• Work with external evaluators
• Consider new tools, e.g., a User Forum
Advanced Support for TeraGrid Applications (ASTA)
• Inaugurated 6/1/05; 10 projects now underway
• Has already produced remarkable new science using TG-deployed software, including the SC05 Analytics Challenge winner
• Helps users to:
– Achieve their science objectives
– Utilize TeraGrid resources in new and effective ways
• Improves the quality of the TeraGrid infrastructure
– Provides feedback to staff when testing, piloting and exercising TeraGrid capabilities
• Selection by TG staff and NSF; PIs must be willing and able to assign developer time from within their project
Simulation of Blood Flow in the Human Arterial Tree on the TeraGrid
Supported by NSF and TeraGrid
Team Members
• Brown University: S. Dong, L. Grinberg, A. Yakhot, G. E. Karniadakis
• Imperial College, London: S. J. Sherwin
• Argonne National Lab: N. T. Karonis, J. Insley, J. Binns, M. Papka
• ASTA: D. C. O'Neal, C. Guiang, J. Lim
Simulating & Visualizing the Human Arterial Tree
[Figure: computation runs in the USA; flow data streams to visualization servers at ANL and on to a viewer client in the UK. SC05, Seattle, WA]
What ASTA Helps With
• NekTar development and porting
• MPICH-G2 on heterogeneous platforms
• Cross-platform access and "firefighting"
• Visualization
• Project coordination
CMS on the TeraGrid
Compact Muon Solenoid Experiment, Large Hadron Collider
PI: Harvey Newman, Caltech
• The CMS experiment is looking for the Higgs particles, thought to be responsible for mass, and for supersymmetry, a necessary element of string theory. It is currently running event simulations and reconstructions to validate methods before experimental data become available.
• "Using the NSF TeraGrid for Parametric Sweep CMS Applications", to appear in Proceedings of the International Symposium on Nuclear Electronics and Computing (NEC'2005), Sofia, Bulgaria, Sept. 2005
• TeraGrid ASTA Team: Tommy Minyard, Edward Walker, Kent Milfeld, Jeff Gardner
• Simulations run simultaneously across multiple TeraGrid sites (SDSC, NCSA and TACC) using the grid middleware tool GridShell
• Complex workflow consisting of multiple execution stages running a large number of serial jobs (~1000s), with very large datasets stored on SDSC HPSS and staged to local sites prior to job runs
• Used 420K CPU hours on TeraGrid systems last year; usage is expected to increase this year and in coming years
What TeraGrid Staff Helped With (pre-ASTA)
• GridShell development allows the TeraGrid to be used as a personal Condor pool
– Condor jobs scheduled across multiple sites
– No need for shared architectures or queuing systems
– Makes use of TeraGrid protocols for data transfer
– Fits into the existing TeraGrid software stack
• CMS production chain run through this system
– 40,000 jobs
– SC05 demo
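The two slides above describe fanning out roughly a thousand serial jobs per sweep across SDSC, NCSA and TACC under GridShell's personal-pool model. A minimal sketch of how such a parametric sweep might be partitioned and placed round-robin across sites; the site names, record fields and the `make_sweep_jobs` helper are illustrative assumptions, not GridShell's actual interface:

```python
from itertools import cycle

# Hypothetical site list standing in for the SDSC, NCSA and TACC queues.
SITES = ["sdsc", "ncsa", "tacc"]

def make_sweep_jobs(n_events, events_per_job):
    """Split a large event-simulation run into serial jobs and assign
    each job to a site round-robin, GridShell-style (sketch only)."""
    jobs = []
    site = cycle(SITES)
    for start in range(0, n_events, events_per_job):
        jobs.append({
            "site": next(site),  # round-robin placement across sites
            "first_event": start,
            "last_event": min(start + events_per_job, n_events) - 1,
            # Illustrative stage-in path; real runs staged data from SDSC HPSS.
            "stage_in": f"hpss://sdsc/cms/dataset_{start // events_per_job:04d}.dat",
        })
    return jobs

jobs = make_sweep_jobs(n_events=100_000, events_per_job=100)
print(len(jobs))                         # → 1000 serial jobs, as in the CMS runs
print(jobs[0]["site"], jobs[1]["site"])  # → sdsc ncsa (consecutive jobs alternate sites)
```

In the real system each record would become a Condor job description submitted through GridShell; here the point is only the partition-and-place structure of the workflow.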
Current ASTA Projects Span Disciplines
• Cellulose + cellulase interactions using CHARMM, PI Brady (Molecular Dynamics, ends 3/31/2006): port, scale and optimize code
• MD Data Repository, PI Jakobsson (Molecular Dynamics, ends 3/31/2006)
• Liquid Rocket Engine Coaxial Injector Modeling, PI Heister (Computational Fluid Dynamics, ends 3/31/2006): computational model development and implementation
• NekTar Arterial Tree Simulations, PI Karniadakis (Computational Fluid Dynamics, ends 3/31/2006): code porting and optimization; MPICH-G2 and visualization support
• Vortonics: CFD with Vortex Degrees of Freedom, PI Boghosian (Computational Fluid Dynamics, ends 3/31/2006): MPICH-G2 and visualization support
• SPICE Non-Equilibrium Simulations, PIs Coveney and Boghosian (DNA Modeling, ends 3/31/2006): code deployment, grid and steering implementation support
• ENZO Cosmic Simulator, PI Norman (Cosmology, ends 3/31/2006): code optimization and scaling, network data handling and archiving
• SCEC TeraShake-2 and CyberShake, PI Olsen (Seismology, ends 3/31/2006): code optimization, TG data handling and archiving, task flow mapping; develop and optimize codes; map task flows to TG
• CIG: Cyberinfrastructure for Geodynamics, PI Gurnis (Geophysics, ends 5/31/2006): develop software framework, repository, portal and training
• BIRN (Biomedical Informatics Research Network), PI Ellisman (Biomedical Imaging, ends 9/30/2006): implementation of architectural components
Proposed ASTA Candidates
• LEAD: Storm-Scale Forecasts and Library (atmospheric modeling)
• CERN LHC support: CMS; ATLAS (high energy physics)
• BNL RHIC experiment: STAR (high energy physics)
• NanoHub: NEMO-3D (nanotechnology)
• NAMD-G (molecular dynamics)
• PPM: Turbulent Astrophysical Flows, interactive simulations (astrophysics)
TeraGrid Science Gateways
Nancy Wilkins-Diehr, San Diego Supercomputer Center
Area Director for Science Gateways
Science Gateways: A New Initiative for the TeraGrid
• Communities are increasingly investing in their own cyberinfrastructure, but it is heterogeneous in:
– Resources
– Users, from experts to K-12
– Software stacks and policies
• Science Gateways:
– Provide "TeraGrid Inside" capabilities
– Leverage community investment
• Three common forms:
– Web-based portals
– Application programs running on users' machines but accessing services on the TeraGrid
– Coordinated access points enabling users to move seamlessly between the TeraGrid and other grids
[Figure: Workflow Composer]
Initial Focus on 10 Gateways Listed in the Program Plan
• Linked Environments for Atmospheric Discovery (LEAD); Atmospheric; science partner Droegemeier (OU); TG liaisons Gannon (IU), Pennington (NCSA)
• National Virtual Observatory (NVO); Astronomy; science partner Szalay (Johns Hopkins); TG liaison Williams (Caltech)
• Network for Computational Nanotechnology (NCN) and "nanoHUB"; Nanotechnology; science partner Lundstrum (PU); TG liaison Goasguen (PU)
• Open Life Sciences Gateway; Biomedicine and Biology; science partners Schneewind (UC), Osterman (Burnham/UCSD), DeLong (MIT), Dusko (INRA); TG liaison Stevens (UC/Argonne)
• Biology and Biomedical Science Gateway; Biomedicine and Biology; science partners Cunningham (Duke), Magnuson (UNC); TG liaisons Reed (UNC), Blatecky (UNC)
• Neutron Science Instrument Gateway; Physics; Cobb (ORNL)
• Grid Analysis Environment; High-Energy Physics; science partner Newman (Caltech); TG liaison Bunn (Caltech)
• Transportation System Decision Support; Homeland Security; science partner Stephen Eubanks (LANL); TG liaison Beckman (Argonne)
• Groundwater/Flood Modeling; Environmental; science partners Wells (UT-Austin), Engel (ORNL); TG liaison Boisseau (TACC)
• Science Grid [GriPhyN/ivDGL/Grid3]; Multiple; science partners Pordes (FNAL), Huth (Harvard), Avery (UFlorida); TG liaisons Foster (UC/Argonne), Kesselman (USC ISI), Livny (UW)
Proposed supplemental activity: Empowering Science, Research, and Discovery (Russ Miller, Mark Green, University of Buffalo)
• Enabling scientific and engineering domain applications using Grid-enabling Application Templates (GATs)
• Porting 16 applications per year, as well as training 20-30 research groups per year
So how will we meet all these needs?
• With RATs (Requirements Analysis Teams)!
• Collection, analysis and consolidation of requirements to jump-start the work
– Interviews with 10 Gateways
– Common user models, accounting needs, scheduling needs
• Summarized requirements for each TeraGrid working group
– Accounting, security, web services, software
• Areas for more study identified
• Primer outline for new Gateways in progress
• And milestones
Implications for TeraGrid working groups
• Accounting
– Support for accounts with differing capabilities
– Ability to associate a compute job with an individual portal user
– Scheme for portal registration and usage tracking
– Support for OSG's Grid User Management System (GUMS)
– Dynamic accounts
• Security
– Community account privileges
– Need to identify the human responsible for a job, for incident response
– Acceptance of other grids' certificates
– TG-hosted web servers, cgi-bin code
• Web Services
– Initial analysis completed 12/05
– Some Gateways (LEAD, Open Life Sciences) have immediate needs
– Many will build on capabilities offered by GT4, but interoperability could be an issue
– Web service security
– Interfaces to scheduling and account management are common requirements
• Software
– Interoperability of software stacks between TG and peer grids
– Software installations for gateways across all TG sites
– Community software areas
– Management (pacman, other options)
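The accounting and security requirements above hinge on tying a job run under a shared community account back to the individual portal user. As a hedged sketch of the kind of gateway-side audit entry that could satisfy both working groups, the record layout, field names and `audit_record` helper below are all illustrative assumptions, not a TeraGrid-specified format:

```python
import hashlib
import json

def audit_record(portal_user, community_account, job_id, job_script=""):
    """Sketch of a gateway audit entry: maps a compute job run under a
    shared community account to the individual portal user, so the person
    responsible can be identified for incident response."""
    return {
        "portal_user": portal_user,              # the human behind the job
        "community_account": community_account,  # shared TG login, e.g. a gateway account
        "job_id": job_id,
        # Digest of the submitted script, so the exact workload can be audited later.
        "script_sha256": hashlib.sha256(job_script.encode()).hexdigest(),
    }

rec = audit_record("jdoe", "bioportal", "tg-12345", "echo hello")
print(json.dumps(rec, indent=2))
```

A gateway would append one such record per submission; the security working group's incident-response need is then a lookup from `job_id` to `portal_user`.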
Significant Progress in CY2005
• January-March
– Initial Gateway interviews and requirements analysis completed
• April
– Internal web page: project descriptions, RAT reports, staffing, milestones, email archives, presentations
• May
– Biweekly calls begin: variety of issues discussed, special presentations
– Accounts for all developers
– Progress tracking for all gateways
– Special presentations: Edward Walker, GridShell; Lee Liming, GT4
– Address recommendations to and from tgacctmgmt and security-wg
– Three new RATs: portal technology (John Cobb), web services (Ivan Judson), OSG (Stuart Martin)
• June
– International Science Gateways workshop at GGF14
• August
– Repository area for software exchanges; JDBC SQL for accounting queries to be the first piece of contributed code
• September
– security-wg provides requirements for community accounts
• October
– Gateways provide means to collect required info; expanded user responsibilities form for community accounts in production
– Production community accounts in use (nanoXX, bioportal)
– Discussions with security-wg about portal hosting within TG (NVO, HEP)
– SC05 prep begins: demos, posters, movie clips, images, booth scheduling
– Web Services recommendations complete
– "How to become a gateway" at www.teragrid.org; user-friendly listing of gateways
• November
– SC05 focus continues
– GT4 deployment evaluation; Mike Showerman joins call
– Special presentations: GridChem; PURSE and GAMA
– Call with Roy Williams and security-wg to discuss the "weak cert" concept
– Gateway plans collected for the Program Plan
• December
– Finalize Program Plan input
– Outline plans for next quarter
Early CY2006 Plans
• CI Channel presentation (March)
• Montana State workshop sponsored by Lariat (March)
– How Grid Computing Can Accelerate Research
– Special talks on bioinformatics and the Grid
• Portal Technology RAT, John Cobb
• Account management through the User Portal, Eric Roberts
• Audit trails for community accounts
• Begin implementation of TG- and Gateway-provided web services
• Complete further analysis of scheduling requirements and implementation ideas
• Full-day training session at the TG AHM
Gateways Under the Hood: Open Life Science Gateway and Web Services
• OLSG integrates four components:
– Tools from the National Microbial Pathogen Data Resource (http://www.nmpdr.org) and TheSeed (http://theseed.uchicago.edu/FIG/index.cgi)
– Open bioinformatics tools and data
– Web services
– TeraGrid resources
• Providing:
– Web-based access for account administration, trivial access to resources, and documentation
– Web-service-based access to tools, including:
• Taverna, Kepler, other workflow tools
• Microsoft development environment
• Open-source web service toolkits: SOAP::Lite [Perl], ZSI [Python], Apache Axis [C/Java]
• Bioinformatics toolkits such as BioPerl and BioPython
– Data access
• TeraGrid presentation requested for February NIH meeting
• http://lsgw.mcs.anl.gov/
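The toolkits named above (SOAP::Lite, ZSI, Apache Axis) all speak SOAP, so any of them can call an OLSG tool exposed as a web service. As a sketch of what such a client request looks like on the wire, the snippet below builds a minimal SOAP 1.1 envelope using only the Python standard library; the `runBlast` operation, its parameters and the `urn:olsg-demo` namespace are invented for illustration and are not the gateway's actual interface:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_envelope(method, params, service_ns="urn:olsg-demo"):
    """Build a minimal SOAP 1.1 request envelope for a hypothetical
    gateway tool: Envelope > Body > method, one element per parameter."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{service_ns}}}{method}")
    for name, value in params.items():
        ET.SubElement(call, name).text = str(value)
    return ET.tostring(env, encoding="unicode")

request = soap_envelope("runBlast", {"sequence": "ACGT", "database": "nr"})
print(request)
```

A real client generated from the service's WSDL (as ZSI or Axis would do) produces the same Envelope/Body structure; this sketch just makes that structure visible.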
OLSG Helps Define TG-wide Policies
• Q1 FY06 accomplishments
– Web-service-enabled SEED software
– Developed Life Science Gateway architecture
– Led Web Services RAT, working to develop the right model for Gateways with respect to TeraGrid resources, security, and user model
• Q2 FY06 plans
– Deploy prototype web/grid-services-based, TeraGrid-hosted access to community-developed computational phylogeny tools (e.g., the PHYLIP suite)
– Develop a strategy for supporting the large-scale computing needs of the National Centers for Biomedical Computing (i.e., the BISTI centers)
Gateways Under the Hood: LEAD, Workflows and Web Services
• Providing tools needed to make accurate predictions of tornadoes and hurricanes
• Data exploration and Grid workflow
Log in and see your MyLEAD space
Creating a workflow for Data Mining
• Uses ADaM services from UAH
[Figure: workflow pipeline. NEXRAD II radar data feeds 3D mesocyclone detection (feature extraction service); an ESML descriptor drives the ESML_Converter (data transformation service); then MinMaxNormalizer (data normalization service) and BayesClassifying (classification service), ending in visualization.]
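The mining pipeline above is a linear composition of services: feature extraction, transformation, normalization, classification. A toy sketch of that composition, where each function is a stand-in for the named ADaM service rather than its real interface, and the radar values and threshold are invented:

```python
def feature_extraction(radar_scan):
    """Stand-in for the 3D mesocyclone detection service: keep candidates."""
    return [v for v in radar_scan if v > 0]

def transformation(features):
    """Stand-in for the ESML_Converter data-transformation service."""
    return [float(v) for v in features]

def normalization(features):
    """Stand-in for MinMaxNormalizer: rescale to [0, 1].
    Assumes the features are not all identical (no zero range)."""
    lo, hi = min(features), max(features)
    return [(v - lo) / (hi - lo) for v in features]

def classification(features, threshold=0.5):
    """Stand-in for the Bayes classifying service (here a simple threshold)."""
    return ["mesocyclone" if v >= threshold else "noise" for v in features]

def run_workflow(scan):
    """Chain the services in the order shown in the LEAD figure."""
    for step in (feature_extraction, transformation, normalization):
        scan = step(scan)
    return classification(scan)

print(run_workflow([3, -1, 9, 6, 0]))  # → ['noise', 'mesocyclone', 'mesocyclone']
```

In LEAD the same chaining is expressed as a composed Grid workflow of web services rather than local function calls, but the dataflow structure is the point of the sketch.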
Monitor results in real time
Large workflows can be composed
Educational Resources
Gateways Under the Hood: OSG and Grid Interoperation
• OSG RAT led by Stuart Martin
– Implementation of grid service interoperability
• Deploying and supporting common grid services and protocols
• Creating OSG gateways
– Basic grid interoperability services
• Authentication / authorization / accounting (AAA)
• Information services
• Job execution
• Data handling
– User- and application-level grid interoperability services
• Resource discovery / selection
• Resource brokering
• Job submission and bookkeeping
• Data management
– Interoperability quality assessment
• User support and troubleshooting
• Application performance
• Grid Interoperability WG formed 12/05
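Among the services listed above, resource discovery/selection and brokering are the ones with the simplest core logic: match a job's needs against published site information and pick a destination. A toy broker sketch, with invented site names and load figures, standing in for a real information-service query:

```python
def select_site(sites, job_cpus):
    """Toy resource broker: among sites that can fit the job, pick the one
    with the most free CPUs. `sites` stands in for records a real broker
    would pull from a grid information service."""
    fitting = [s for s in sites if s["free_cpus"] >= job_cpus]
    if not fitting:
        return None  # no site can run the job right now
    return max(fitting, key=lambda s: s["free_cpus"])["name"]

# Invented snapshot of TeraGrid and OSG site state.
sites = [
    {"name": "teragrid-ncsa", "free_cpus": 128},
    {"name": "osg-fnal", "free_cpus": 512},
    {"name": "teragrid-sdsc", "free_cpus": 64},
]
print(select_site(sites, 256))  # → osg-fnal
```

Real brokers weigh many more factors (queue wait, data locality, VO policy), but interoperability reduces to agreeing on the schema of those site records across grids, which is exactly what the information-services work targets.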
Grid Interoperation
• TeraGrid/OSG interop work (Stuart Martin et al.) drove the organization of a multi-grid interoperation initiative begun in 2005.
• Leaders from TeraGrid, OSG, EGEE, APAC, NAREGI, DEISA, PRAGMA, UK NGS, and KISTI will lead an interoperation initiative in 2006.
• Six international "RATs" will meet for the first time at GGF-16 in February 2006:
– Application use cases (Bair/TeraGrid, Alessandrini/DEISA)
– Authentication/identity management (Skow/TeraGrid)
– Job description language (Newhouse/UK NGS)
– Data location/movement (Pordes/OSG)
– Information schemas (Matsuoka/NAREGI)
– Testbeds (Arzberger/PRAGMA)
• Leaders from nine Grid initiatives met at SC05 to plan an application-driven "Interop Challenge" in 2006.
TeraGrid Education, Outreach and Training (EOT) and External Relations (ER)
Scott Lathrop, Argonne National Laboratory
Director for EOT
Mission, Goals, and Strategies
The mission is to engage larger and more diverse communities of researchers, educators and learners in discovering, using, and contributing to the TeraGrid.
The goals are to:
– Enable awareness of and access to TeraGrid resources
– Provide education and training for all disciplines and all stages of learning (K-12 through professional)
– Promote diversity among all TeraGrid activities
– Expand the community of TeraGrid users
The strategies are to:
– Work with TeraGrid Science Gateways, User Support and the Core program
– Leverage strategic external partnerships
– Assess the community impact
EOT and ER Team Members
Using the User Support model, the GIG coordinates a TG-wide EOT and ER program with an enthusiastic group of RP and Core/CIP staff.
• Argonne/UChicago: Scott Lathrop, Ray Bair, Joe Insley
• Caltech: Sarah Bunn
• Indiana: Craig Stewart, Julie Wernert
• NCSA: Sandie Kappes, Edee Wiziecki, Mike Freemon, Bill Bell, Trish Barker
• ORNL: John Cobb, Betsy Riley
• PSC: Sergiu Sanielevici, Beverly Clayton, Cheryl Begandy, Mike Schneider, Sean Fulton
• Purdue: Sebastien Goasguen, Gary Bertoline, Krishna Madhavan, Steve Dunlop
• SDSC: Diane Baxter, Ange Mason, Don Frederick, Ashley Wood, Greg Lund, Diana Diehl, Tim Gumto
• TACC: Stephenie McLean, Faith Singer-Villalobos
Education Plans and Effectiveness
Plans
• Professional development for and with undergraduate faculty and secondary school teachers
• Development and dissemination of resources including software, curricular materials, and lesson plans
• Mentoring of students in using cyberinfrastructure to learn math and science, and in pursuing advanced studies
Effectiveness
• Leading the SC Education Programs, SC05-SC06
• NanoHUB used by 10 universities in dozens of undergraduate and graduate courses
• Scaling up successful EOT-PACI/EPIC projects (e.g., TeacherTECH)
• External partnerships: EPIC, the NSDL Computational Science Education Reference Desk, the National Computational Science Institute, and CIP
SC Education Program Plans and Effectiveness
• Purdue is leading the SC05 and SC06 Education Program, including summer workshops
• The TeraGrid team has been asked to propose a multi-year Education Program covering SC07-09
– Goal is to provide greater continuity and broader, sustained integration of computational science education for undergraduate education
– Proposal being made to the SC Steering Committee next week to initiate the program in 2006, in preparation for SC07
– Engages a large national planning team representing multiple state and national programs that can help leverage and sustain the program
Outreach Plans and Effectiveness
Plans
• Raise awareness of TeraGrid's impact on research and education
• Engage under-represented people in TeraGrid development and use, with a focus on MSI college faculty and students
• Outreach to new communities that have not traditionally been users of cyberinfrastructure and grid computing
Effectiveness
• New Science Gateways: Telescience, BIRN and NEES
• Community engagement with applications via professional society meetings, conferences, and workshops; usage has increased
• External partnerships: Minority Serving Institution Network; Humanities, Arts, and Social Sciences (HASTAC and CHASS)
Training Plans and Effectiveness
Plans
• Hands-on training for researchers on topics from introductory to advanced applications of grid computing
• Training venues include live workshops, Access Grid sessions, and online WebCT courses
• Coordination of training opportunities across the TeraGrid
Effectiveness
• Review of training materials by experts in the field
• Post-workshop surveys of participants assessing quality
• Tracking of WebCT course usage for enhancement
• User surveys provide feedback on quality and needs
• Identification of needs by ASTA, Science Gateways, and User Support
• Joint workshops and training activities by GIG, RPs, and CIP
• PSC is investigating a Standardized User Monitoring Suite
• Established partnerships: NMI, the National Microbial Pathogen Data Resource (NMPDR), and CIP
External Relations Plans and Effectiveness
Plans
• Promote TeraGrid use and adoption via publicity
• Organize public relations efforts
• Highlight TeraGrid's value via communications
• Communicate technical changes to ensure smooth transitions
• Provide internal communications strategies for all of TG
Effectiveness
• Press releases, news stories, science nuggets
• Publications: TeraGrid brochure, user publications lists
• Website: increased usage
• Presentations: multiple venues and multiple events
• Event management and logistics (e.g., SCxx)
• External partnerships: OSG, ASCRIBE, GridToday, HPCwire


