10.25.05 – As the world focuses increasingly on cyberinfrastructure, UC vice chancellors and campus chief information officers began meeting in February of this year to map out a strategy for development of an integrated UC cyberinfrastructure (CI) to support research. The two California Institutes for Science and Innovation closest in mission to this area, Calit2 and CITRIS, co-hosted the second meeting held October 10-11 in the new Calit2 building at UCSD.
Some 70 people attended the event, some of them remotely over video links. They represented nine of the 10 UC campuses; the UC Office of the President; Lawrence Berkeley, Lawrence Livermore, and Los Alamos national laboratories; CENIC; and the National Science Foundation. Many technical staff from both divisions of Calit2 participated as well.
Beth Burnside, Research Vice Chancellor at UC Berkeley, set the context for the importance of the meeting. She said UC was interested in exploiting CENIC capabilities more fully and called for developing appropriate partnerships; replacing research tools with a more persistent environment built on robust infrastructure; and placing greater emphasis on data repositories, performance standards, organizing principles for developing infrastructure in a decentralized environment, and an investment strategy.
As outlined by Kris Hafner, Associate Vice President at UCOP, the meeting’s objectives included (1) identifying current UC capabilities and understanding the research drivers across a broad range of disciplines, (2) beginning to develop a UC blueprint for cyberinfrastructure, and (3) identifying opportunities for UC-wide collaboration and action.
“Cyberinfrastructure includes computing cycles, networking, storage, information management, and leadership on shared standards, middleware, and basic applications,” she said. “We’re taking a view that is broader than the NSF definition, as we see a compelling need for cyberinfrastructure across all disciplines. We need a continuing dialog across the vice chancellors and IT leaders on our campuses, and involving CENIC and the California ISIs.”
Guy Almes, the new Director of the NSF cyberinfrastructure office formed in late July this year, also addressed the audience. “Since we’re now within the office of the director, we’re going to be more accountable to all directorates,” he said. He described a strategic plan for 2006-2010, recently published, to enable petascale science and engineering through a world-class high-performance computing environment.
NSF’s strategy is based on integrating four categories of tools and services: high-end computing, data, collaboration and communications, and education and training. “We want to ensure high-speed campus-, metro-, and state-level networks with nothing slower than 10 Gbps. We want to encourage campus clusters to strengthen capacity, flexibility, and synergy with the TeraGrid and the Open Science Grid.”
Fran Berman, Director of the San Diego Supercomputer Center, defined CI as extending the reach of individual, institute, department, and campus capabilities. “UC can lead in development of regional CI by improving connections on the campuses, improving connectivity to the national grids, and coordinating access to, and interoperability among, computational resources.” UC should link regional CI with the TeraGrid, in which SDSC is participating: “The OptIPuter plus CENIC plus the TeraGrid gives you the ‘OptiGrid,’” she said, a play on words underscoring the need to leverage key projects and capabilities to create the highest-performance grid possible.
She suggested pursuing “low-hanging fruit”: integrating data collections, providing high-bandwidth access to UC computational resources for greater capacity and capability, and providing higher-bandwidth access to UC scientific instruments.
Leadership starts at home, she said: the next step should be developing a blueprint for regional CI through coordination among UCOP, the campus CIOs, the Cal ISIs, and the national labs.
Horst Simon, Associate Laboratory Director, Computing Sciences, at Lawrence Berkeley National Laboratory, cited three trends that need to be addressed: the widening gap between application performance and peak performance of high-end computing systems, the recent emergence of large, multidisciplinary computational science teams in the DOE research community, and the flood of scientific data from simulations and experiments along with the convergence of simulation with experimental data collection and analysis in complex workflows.
Calit2 Director Larry Smarr discussed recent developments showcased at iGrid 2005, the first large event held in the Calit2 building, during the last week of September. These developments included a demonstration of high-definition video of a thermal vent 2.5 kilometers below the sea surface and the first trans-Pacific, super-high-definition telepresence meeting in the Calit2 digital cinema theater. That meeting enabled a real-time conversation between the president of Keio University in Tokyo and his counterpart, Marye Anne Fox, chancellor of UCSD, across a distance of 9,000 miles. “It was truly photorealistic with no discernible jitter,” he said with amazement.
Smarr also sees the opportunity for California to provide regional optical networks to support an emerging concept called CineGrid: Grid infrastructure to support distributed production, editing, and dissemination of digital cinema. These networks could connect such groups as the San Francisco State Institute for the Next-Generation Internet and those working in the movie industry, especially in Los Angeles and San Francisco, as well as extend the southern California OptIPuter project to the USC School of Cinema-Television.
Other speakers addressed CI successes and challenges in a range of research application areas, the role of CENIC, and current computational (cluster) capabilities across the UC campuses. The attendees then broke into four breakout groups focused on networking, computational capability, data science and stewardship, and research competitiveness, which reported back in plenary session during the last half-day of the meeting.