
  • Number of slides: 39

The OptIPuter and Its Applications (Invited Talk)
Cyberinfrastructure for Humanities, Arts, and Social Sciences: A Summer Institute, SDSC, UCSD, July 26, 2006
Dr. Larry Smarr, Director, California Institute for Telecommunications and Information Technologies; Harry E. Gruber Professor, Dept. of Computer Science and Engineering, Jacobs School of Engineering, UCSD

From “Supercomputer-Centric” to “Supernetwork-Centric” Cyberinfrastructure
Optical WAN research bandwidth has grown much faster than supercomputer speed!
[Chart: bandwidth of NYSERNet research network backbones, from Megabit/s (T1) to Terabit/s (32 x 10 Gb “lambdas”), versus computing speed in GFLOPS, from the 1 GFLOP Cray 2 to the 60 TFLOP Altix. Data source: Timothy Lance, President, NYSERNet]

The OptIPuter Project: Creating a “SuperWeb” for Data-Intensive Researchers
• NSF Large Information Technology Research Proposal
– Calit2 (UCSD, UCI) and UIC Lead Campuses—Larry Smarr PI
– Partners: SDSC, USC, SDSU, NCSA, NW, TA&M, UvA, SARA, NASA Goddard, KISTI, AIST, CRC (Canada), CICESE (Mexico)
• Industrial Partners: IBM, Sun, Telcordia, Chiaro, Calient, Glimmerglass, Lucent
• $13.5 Million Over Five Years—Now in the Fourth Year
• NIH Biomedical Informatics Research Network; NSF EarthScope and ORION

What is the OptIPuter?
• Applications Drivers: Interactive Analysis of Large Data Sets
• OptIPuter Nodes: Scalable PC Clusters with Graphics Cards
• IP over Lambda Connectivity: Predictable Backplane
• Open Source LambdaGrid Middleware: Network is Reservable
• Data Retrieval and Mining: Lambda-Attached Data Servers
• High Defn. Vis., Collab. SW: High Performance Collaboratory
www.optiputer.net (see the Nov 2003 Communications of the ACM for articles on OptIPuter technologies)

Dedicated Optical Channels (“Lambdas”) Make High Performance Cyberinfrastructure Possible (WDM)
Parallel lambdas are driving optical networking the way parallel processors drove 1990s computing.
Source: Steve Wallach, Chiaro Networks

National LambdaRail (NLR) and TeraGrid Provide the Cyberinfrastructure Backbone for U.S. Researchers
• NSF’s TeraGrid has a 4 x 10 Gb lambda backbone
• NLR: 4 x 10 Gb lambdas initially, capable of 40 x 10 Gb wavelengths at buildout
• Links two dozen state and regional optical networks
• International collaborators connect at Seattle and Chicago (UIC/NW-StarLight); UC-TeraGrid
• DOE, NSF, & NASA using NLR
[Map: NLR route through Seattle, Portland, Boise, Ogden/Salt Lake City, Chicago, Cleveland, New York City, Denver, San Francisco, Pittsburgh, Washington DC, Kansas City, Los Angeles, San Diego, Albuquerque, Raleigh, Tulsa, Atlanta, Phoenix, Dallas, Baton Rouge, Las Cruces/El Paso, Jacksonville, Pensacola, San Antonio, and Houston]

Creating a North American Superhighway for High Performance Collaboration
Next Step: Adding Mexico to Canada’s CANARIE and the U.S. National LambdaRail

OptIPuter Scalable Adaptive Graphics Environment (SAGE) Allows Integration of HD Streams
OptIPortal: Termination Device for the OptIPuter Global Backplane

OptIPortal: Termination Device for the OptIPuter Global Backplane
• 20 dual-CPU nodes, 20 24” monitors, ~$50,000
• 1/4 teraflop, 5 terabytes storage, 45 megapixels: nice PC!
• Scalable Adaptive Graphics Environment (SAGE), Jason Leigh, EVL-UIC
Source: Phil Papadopoulos, SDSC, Calit2
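The 45-megapixel figure follows directly from the tile count; a quick sketch recovers it, assuming 1920x1200 WUXGA panels for the 24” monitors (a detail the slide does not state):

```python
# Sanity check of the OptIPortal's "45 megapixels" figure. The panel
# resolution is an assumption; 1920x1200 was typical for 24" monitors then.
def tiled_display_megapixels(tiles: int, width: int, height: int) -> float:
    """Total resolution of a tiled display wall, in megapixels."""
    return tiles * width * height / 1e6

print(tiled_display_megapixels(tiles=20, width=1920, height=1200))  # 46.08
```

The same arithmetic reproduces the HIPerWall slide below: 50 Apple 30” Cinema Displays at 2560 x 1600 give about 204.8 megapixels, the “200 million pixels” quoted there.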

The World’s Largest Tiled Display Wall: Calit2@UCI’s HIPerWall
• HDTV, digital cameras, digital cinema, Zeiss scanning electron microscope
• Center of Excellence in Calit2@UCI, Albert Yee, PI
• Apple tiled display wall driven by 25 dual-processor G5s
• 50 Apple 30” Cinema Displays: 200 million pixels of viewing real estate!
• Falko Kuester and Steve Jenks, PIs
• Featured in Apple Computer’s “Hot News”

3D Videophones Are Here! The Personal Varrier Autostereo Display
• Varrier is a head-tracked autostereo virtual reality display
– 30” LCD widescreen display with 2560 x 1600 native resolution
– A photographic film barrier screen affixed to a glass panel
– The barrier screen reduces the horizontal resolution to 640 lines
• Cameras track the face with a neural net to locate the eyes
• The display eliminates the need to wear special glasses
Source: Daniel Sandin, Thomas DeFanti, Jinghua Ge, Javier Girado, Robert Kooima, Tom Peterka, EVL, UIC

How Do You Get From Your Lab to the National LambdaRail?
“Research is being stalled by ‘information overload,’ Mr. Bement said, because data from digital instruments are piling up far faster than researchers can study. In particular, he said, campus networks need to be improved. High-speed data lines crossing the nation are the equivalent of six-lane superhighways, he said. But networks at colleges and universities are not so capable. ‘Those massive conduits are reduced to two-lane roads at most college and university campuses,’ he said. Improving cyberinfrastructure, he said, ‘will transform the capabilities of campus-based scientists.’”
-- Arden Bement, Director of the National Science Foundation
www.ctwatch.org

To Build a Campus Dark Fiber Network—First, Find Out Where All the Campus Conduit Is!

UCSD Campus-Scale Routed OptIPuter with Nodes for Storage, Computation and Visualization

The New Optical Core of the UCSD Campus-Scale Testbed: Evaluating Packet Routing versus Lambda Switching
Goals by 2007:
• >= 50 endpoints at 10 GigE
• >= 32 packet switched
• >= 32 switched wavelengths
• >= 300 connected endpoints
• Approximately 0.5 Tbit/s arriving at the “optical” center of campus
Switching will be a hybrid combination of packet, lambda, and circuit; OOO and packet switches already in place (Lucent, Glimmerglass, Force10)
Funded by NSF MRI Grant
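The “approximately 0.5 Tbit/s” figure is just the endpoint goal times the per-link rate, both taken from the slide; a minimal sketch:

```python
# Aggregate bandwidth arriving at the campus optical core:
# 50 endpoints, each on a 10 Gigabit Ethernet link.
def aggregate_tbps(endpoints: int, gbps_per_link: float) -> float:
    """Total offered load in Tbit/s, assuming all links run at line rate."""
    return endpoints * gbps_per_link / 1000

print(aggregate_tbps(50, 10))  # 0.5
```

This is the peak offered load, not sustained traffic; the hybrid packet/lambda/circuit core is what lets individual 10 Gb flows bypass routed bottlenecks.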

OptIPuter@UCI is Up and Working
• ONS 15540 WDM at the UCI campus MPOE (CPL)
• 10 GE and 1 GE DWDM network lines to the Calit2 Building
• Kim: jitter measurements this week!
[Network diagram: Tustin CENIC CalREN POP and Los Angeles; Wave-1 (1 GE): UCSD address space 137.110.247.242-246, NACS-reserved for testing; Wave-2 (1 GE, layer-2 GE): UCSD address space 137.110.247.210-222/28; Catalyst 6500 switches on floors 2-4 of the Engineering Gateway Building (SPDS Viz Lab); MDF Catalyst 6500 w/ firewall, 1st floor closet; HIPerWall; Catalyst 3750s in the 3rd floor IDF, the NACS machine room (OptIPuter), and CSI; UCInet; ESMF 10 GE]
Created 09-27-2005 by Garrett Hildebrand; modified 11-03-2005 by Jessica Yu

Calit2/SDSC Proposal to Create a UC Cyberinfrastructure of OptIPuter “On-Ramps” to TeraGrid Resources
OptIPuter + CalREN-XD + TeraGrid = “OptiGrid”
UC Davis, UC San Francisco, UC Berkeley, UC Merced, UC Santa Cruz, UC Los Angeles, UC Santa Barbara, UC Riverside, UC Irvine, UC San Diego
Creating a critical mass of end users on a secure LambdaGrid
Source: Fran Berman, SDSC; Larry Smarr, Calit2

OptIPuter Software Architecture: a Service-Oriented Architecture Integrating Lambdas Into the Grid
[Layer diagram, top to bottom:
• Distributed Applications / Web Services
• Visualization (SAGE, JuxtaView, Vol-a-Tile), Telescience, and Data Services (LambdaRAM)
• Distributed Virtual Computer (DVC) API and DVC Runtime Library: DVC Configuration, DVC Services, DVC Communication, DVC Job Scheduling, DVC Core Services
• Resource Namespace, Identify/Acquire, Management, Security Management, High Speed Communication, Storage Services (Globus, GSI XIO, RobuStore, GRAM; PIN/PDC for discovery and control of lambdas)
• IP transport over lambdas: GTP, XCP, CEP, LambdaStream, UDT, RBUDP]
Source: Andrew Chien, UCSD

PI Larry Smarr. Announced January 17, 2006. $24.5 M Over Seven Years

Marine Genome Sequencing Project: Measuring the Genetic Diversity of Ocean Microbes
CAMERA’s Sorcerer II Data Will Double the Number of Proteins in GenBank!

Announced January 17, 2006

CAMERA’s Direct Access Core Architecture Will Create a Next-Generation Metagenomics Server
[Architecture diagram: data sources (Sargasso Sea data, Moore Marine Microbial Project, NASA Goddard satellite data, JGI Community Sequencing Project, Sorcerer II Expedition (GOS)) feed a community microbial metagenomics database farm and a flat file server farm on a 10 GigE fabric. Traditional users send requests and receive responses through a web portal and web services; direct-access lambda connections link local clusters and local environments. A dedicated compute farm (1000 CPUs) serves interactive work, while the TeraGrid cyberinfrastructure backplane (10,000s of CPUs) handles scheduled activities, e.g. all-by-all comparison.]
Source: Phil Papadopoulos, SDSC, Calit2

The Future Home of the Moore Foundation-Funded Marine Microbial Ecology Metagenomics Complex
First Implementation of the CAMERA Complex
Major Buildout of the Calit2 Server Room Underway
Photo Courtesy Joe Keefe, Calit2

Venter Institute Will Combine Telepresence with Remote Interactive Analysis
Live Demonstration of 21st Century National-Scale Team Science (25 Miles)
Venter Institute | OptIPuter Visualized Data | HDTV Over Lambda

UIC/UCSD 10 GE CAVEWave on the National LambdaRail
Emerging OptIPortal Sites: UW, UIC EVL, MIT (new!), JCVI (new!), UCI, SIO, UCSD SunLight, SDSU, CICESE
CAVEWave connects Chicago to Seattle to San Diego…and Washington D.C. as of 4/1/06 and JCVI as of 5/15/06

Borderless Collaboration Between Global University Research Centers at 10 Gbps
Maxine Brown, Tom DeFanti, Co-Chairs, iGrid 2005: The Global Lambda Integrated Facility (www.igrid2005.org)
September 26-30, 2005, Calit2 @ University of California, San Diego, California Institute for Telecommunications and Information Technology
• 100 Gb of bandwidth into the Calit2@UCSD building
• More than 150 Gb GLIF transoceanic bandwidth!
• 450 attendees, 130 participating organizations
• 20 countries driving 49 demonstrations
• 1- or 10-Gbps per demo

CineGrid Leverages OptIPuter Cyberinfrastructure to Enable Global “Extreme Media” Collaboration
• CineGrid experiments aim to push the state of the art in:
– Streaming & store-and-forward file transfer using high-speed, low-latency network protocols
– HDTV in various formats for teleconferencing, telepresence, and production
– 2K and 4K digital cinema workflows and distribution
– Stereo in high resolution (2K, 4K); virtual reality in higher resolution (24 megapixel)
– Distributed tiled displays with 20-100 megapixels
– Long-term digital archiving
• International workshops: Tokyo, July 2006; Calit2, Dec 2006
Source: Tom DeFanti, Laurin Herr

OptIPuter 4K Telepresence over IP at iGrid 2005 Demonstrated the Technical Basis for CineGrid
New Calit2 Digital Cinema Auditorium Lays the Technical Basis for Global Digital Cinema
Keio University President Anzai and UCSD Chancellor Fox
Sony, NTT, SGI

Calit2 Works with CENIC to Provide the California Optical Core for CineGrid
• Partnering with SFSU’s Institute for Next Generation Internet
• CineGrid™ will link UCSD/Calit2 and the USC School of Cinema-Television with the Keio University Research Institute for Digital Media and Content
• Extended the SoCal OptIPuter to USC
• Plus, 1 Gb and 10 Gb connections to:
– Seattle, Canada, Japan, Asia, Australia, New Zealand
– Chicago, Canada, Japan, Europe, Russia, China
– Tijuana
• Prototype of CineGrid™: SFSU, UCB (digital archive of films), USC, Calit2 UCI, Calit2 UCSD
Source: Laurin Herr, Pacific Interface, CineGrid™ Project Leader

Calit2 and the Venter Institute Test CineGrid™ with an HDTV Movie by John Carter
Live Demonstration of 21st Century Entertainment Delivery, June 14, 2006
StarLight Chicago | Sony HDTV JH3 | JC Venter Institute, Rockville, MD | Calit2 Auditorium

iGrid 2005: Kyoto Nijo Castle
Interactive VR streamed live from Tokyo to Calit2 over dedicated GigE and projected at 4K resolution
Source: Toppan Printing

iGrid 2005 Cultural Heritage: China and USA
• Great Wall Cultural Heritage International Media Centre, China
• San Diego State University, USA
• Great Wall Society, China
• San Diego Supercomputer Center, USA
• Chinese Academy of Sciences, China
• GLORIAD, USA
• Chinese Institute of Surveying and Mapping, China
• Cybermapping Lab, University of Texas-Dallas, USA
• GEON viz/3D-scanning lab, University of Idaho, USA
• Stanford University, USA
Data acquisition from laser scanning combined with photogrammetry enables the construction of unique cultural heritage images from China. 3D designs of the Great Wall are combined with 3D scans of physical images and satellite imagery stored on servers in China and San Diego.
www.internationalmediacentre.com/imc/index.html
Source: Maxine Brown, EVL UIC

NSF’s Ocean Observatories Initiative (OOI) Envisions Global, Regional, and Coastal Scales
$300 M in the President’s Budget for OOI
LEO-15 inset courtesy of Rutgers University, Institute of Marine and Coastal Sciences

Coupling Regional and Coastal Ocean Observatories Using OptIPuter and Web/Grid Services
LOOKING (Laboratory for the Ocean Observatory Knowledge Integration Grid), funded by NSF ITR (John Delaney, UWash, PI)
www.neptune.washington.edu | www.mbari.org/mars/ | www.sccoos.org/

Using the OptIPuter to Couple Data Assimilation Models to Remote Data Sources, Including Biology
NASA MODIS mean primary productivity for April 2001 in the California Current System
Regional Ocean Modeling System (ROMS): http://ourocean.jpl.nasa.gov/

Interactive Remote Data and Visualization Services
• Visualization services: scientific-info visualization, AMR volume visualization, glyph and feature visualization
– Multiple scalable displays, hardware pixel streaming, distributed collaboration
• Data mining services: data mining for areas of interest, analysis and feature extraction
• NCSA Altix data and vis server linking to the OptIPuter
National Laboratory for Advanced Data Research: an SDSC/NCSA data collaboration

The Synergy of Digital Art and Science: Visualization of a JPL Simulation of Monterey Bay at 4K Resolution
Source: Donna Cox, Robert Patterson, NCSA; funded by NSF LOOKING grant

First Remote Interactive High Definition Video Exploration of Deep Sea Vents: Prototyping NEPTUNE Canadian-U.S. Collaboration
Source: John Delaney & Deborah Kelley, UWash

High Definition Still Frame of Hydrothermal Vent Ecology, 2.3 km Deep (scale: 1 cm)
White filamentous bacteria on ‘pill bug’ outer carapace
Source: John Delaney and ResearchChannel, U Washington