Located at University of California San Diego, the San Diego Supercomputer Center (SDSC) enables international science and engineering discoveries through advances in computational science and data-intensive, high-performance computing.
Continuing this legacy into the era of cyberinfrastructure, SDSC is considered a leader in data-intensive computing, providing resources, services, and expertise to the national research community, including industry and academia. The mission of SDSC is to extend the reach of scientific accomplishments by providing tools such as high-performance hardware technologies, integrative software technologies, and deep interdisciplinary expertise to these communities. SDSC was founded in 1985 with a $170 million grant from the National Science Foundation's (NSF) Supercomputer Centers program. From 1997 to 2004, SDSC extended its leadership in computational science and engineering to form the National Partnership for Advanced Computational Infrastructure (NPACI), teaming with approximately 40 university partners around the country. Today, SDSC is an Organized Research Unit of the University of California, San Diego, with a staff of talented scientists, software developers, and support personnel.
SDSC is led by Dr. Michael Norman, who was named SDSC interim director in June 2009 and appointed to the position of director in September 2010. Norman is a distinguished professor of physics at UC San Diego and a globally recognized astrophysicist. As a leader in using advanced computational methods to explore the universe and its beginnings, Norman directed the Laboratory for Computational Astrophysics, a collaborative effort between UC San Diego and SDSC.
A broad community of scientists, engineers, students, commercial partners, museums, and other facilities work with SDSC to develop cyberinfrastructure-enabled applications that help manage their extreme data needs. Projects run the gamut from creating astrophysics visualizations for the American Museum of Natural History, to supporting more than 20,000 users per day of the Protein Data Bank, to performing large-scale, award-winning simulations of the origin of the universe and of how a major earthquake would affect densely populated areas such as southern California. Along with these data cyberinfrastructure tools, SDSC also offers users full-time support, including code optimization, training, 24-hour help desk services, portal development, and a variety of other services.
As one of the NSF's first national supercomputer centers, SDSC served as the data-intensive site lead in the agency's TeraGrid program, a multiyear effort to build and deploy the world's first large-scale infrastructure for open scientific research. SDSC currently provides advanced user support and expertise for XSEDE (Extreme Science and Engineering Discovery Environment), the five-year NSF-funded program that succeeded TeraGrid in mid-2011.
Within just the last two years, SDSC has launched several new supercomputer systems. The Triton Resource, an integrated, data-intensive compute system primarily designed to support UC San Diego and UC researchers, was launched in 2009, along with Dash, the first high-performance compute system to leverage super-sized "flash memory" to accelerate investigation of a wide range of data-intensive science problems. Trestles, a 100-teraflops cluster launched in early 2010 as one of the newest XSEDE resources, is already delivering increased productivity and fast turnaround times to a diverse range of researchers.
In early 2012, SDSC will deploy Gordon, a much larger version of the Dash prototype. With 250 trillion bytes of flash memory and 64 I/O nodes, Gordon will be capable of handling massive databases while delivering speeds up to 100 times faster than hard disk drive systems for some queries.
Cross-Link:
- Gordon for genomics research (2012.09.26) - how an HPC system is armed for large data problems