The development of emerging technologies such as molecular computing, nanotechnology, and next-generation catalysts will continue to place increasing demands on chemical simulation software, requiring more capabilities and more sophisticated simulation environments. Such software will be too complex for a single group, or even a single discipline, to develop independently. Coupling multiple physical models in one domain and coupling simulations across multiple time and length scales will become the norm rather than the exception. These simulations will also run on more complicated and diverse hardware platforms, potentially with hundreds of thousands of processors and performance exceeding one petaFLOP/s. This evolution will transform the way chemists must think about scientific problems, models and algorithms, the software lifecycle, and the use of computational resources. Advances in chemical science critical to DOE and national challenges require the adoption of new approaches for large-scale collaborative development and a flexible, community-based architecture. We propose to employ the infrastructure of the Common Component Architecture to develop interfaces among three of the most important computational chemistry codes in the world: the General Atomic and Molecular Electronic Structure System (GAMESS), the Massively Parallel Quantum Chemistry program (MPQC), and NWChem.
Mark Gordon, Masha Sosonkina, Theresa Windus
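The component-based coupling described above can be illustrated with a minimal sketch. The `EnergyEvaluator` interface, the registry, and the `ToyEvaluator` backend below are all hypothetical, not part of GAMESS, MPQC, or NWChem; they only show how a common abstract interface lets independently developed solvers be swapped behind one API, which is the role the Common Component Architecture plays at much larger scale.

```python
from abc import ABC, abstractmethod

class EnergyEvaluator(ABC):
    """Hypothetical common interface that any electronic-structure
    backend (GAMESS, MPQC, NWChem, ...) could implement."""

    @abstractmethod
    def total_energy(self, geometry, basis):
        """Return the total energy for a list of (x, y, z) atom positions."""

    @abstractmethod
    def gradient(self, geometry, basis):
        """Return the energy gradient, one (gx, gy, gz) tuple per atom."""

_REGISTRY = {}

def register(name):
    """Class decorator: make a backend discoverable by name."""
    def deco(cls):
        _REGISTRY[name] = cls
        return cls
    return deco

@register("toy")
class ToyEvaluator(EnergyEvaluator):
    """Stand-in backend: a harmonic 'energy' so the example is runnable."""

    def total_energy(self, geometry, basis):
        return sum(x * x + y * y + z * z for x, y, z in geometry)

    def gradient(self, geometry, basis):
        return [(2 * x, 2 * y, 2 * z) for x, y, z in geometry]

# A driver only sees the interface, never the concrete backend.
solver = _REGISTRY["toy"]()
energy = solver.total_energy([(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)], basis="sto-3g")
```

A multi-scale driver written against such an interface can mix backends per subsystem, which is exactly the kind of cross-code coupling the proposed interfaces would enable.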
High-performance applications executing on distributed systems achieve only a fraction of the peak aggregate performance of the underlying hardware and middleware. This is due mainly to the mismatch between the way parallel computation and communication are organized within applications and the optimal way to use the processor, memory, and interconnect hardware. Different programming models, language primitives, and supporting services are needed as platforms evolve from "single-box" systems to distributed systems with nodes located thousands of miles apart. The purpose of the project is twofold: to achieve transparent tuning of high-performance applications to the communication subsystem while facilitating the transition to future programming models, and to augment the newly developed and enhanced programming models with information about the communication environment.
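One concrete form of tuning an application to the communication subsystem is choosing how aggressively to coalesce small messages. The sketch below uses the standard latency-bandwidth (alpha-beta) cost model, T = alpha + n/beta; the function names and parameter values are illustrative assumptions, not taken from any particular middleware.

```python
import math

def transfer_time(nbytes, latency_s, bandwidth_bps):
    """Alpha-beta cost model: one message of nbytes costs alpha + n / beta."""
    return latency_s + nbytes / bandwidth_bps

def best_batch(msg_bytes, n_msgs, latency_s, bandwidth_bps, max_batch):
    """Predict, for each candidate batch size, the total time to send
    n_msgs messages coalesced into batches, and return the best choice."""
    best_size, best_time = None, None
    for batch in range(1, max_batch + 1):
        n_batches = math.ceil(n_msgs / batch)
        t = n_batches * transfer_time(batch * msg_bytes, latency_s, bandwidth_bps)
        if best_time is None or t < best_time:
            best_size, best_time = batch, t
    return best_size, best_time

# On a high-latency link (1 ms, 100 MB/s), coalescing 100 small messages
# into one batch amortizes the per-message latency.
size, _ = best_batch(msg_bytes=1024, n_msgs=100, latency_s=1e-3,
                     bandwidth_bps=1e8, max_batch=100)
```

An adaptive runtime would measure the latency and bandwidth at startup (or continuously) and re-derive the batch size, which is the kind of transparent, environment-aware tuning the project targets.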
It is proposed to further the present understanding of circulating fluidized beds from the conceptual standpoint of kinetic theory. The primary purpose is to provide a theoretical underpinning for the construction of computer codes to better understand and predict multiphase flow behavior in circulating fluidized beds, and, in particular, to provide theoretical estimates for the transport coefficient analogues that parameterize the computer simulations.
Rodney Fox, Shankar Subramaniam
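As a flavor of the transport-coefficient analogues mentioned above, the sketch below evaluates one widely quoted kinetic-theory closure for the solids pressure of a granular phase, P_s = rho_s * eps_s * Theta * [1 + 2(1 + e) * eps_s * g0], with a common algebraic form for the radial distribution function g0. The function names, default restitution coefficient, and packing limit are illustrative assumptions, not results of this project.

```python
def radial_distribution(eps_s, eps_max=0.63):
    """Radial distribution function g0: diverges as the solids
    fraction eps_s approaches the close-packing limit eps_max."""
    return 1.0 / (1.0 - (eps_s / eps_max) ** (1.0 / 3.0))

def solids_pressure(rho_s, eps_s, theta, e=0.9, eps_max=0.63):
    """Kinetic-theory solids pressure:
    rho_s  particle material density [kg/m^3]
    eps_s  solids volume fraction
    theta  granular temperature [m^2/s^2]
    e      particle-particle restitution coefficient
    """
    g0 = radial_distribution(eps_s, eps_max)
    return rho_s * eps_s * theta * (1.0 + 2.0 * (1.0 + e) * eps_s * g0)

# Dilute limit: the collisional term vanishes and the pressure reduces
# to the ideal-gas analogue rho_s * eps_s * theta.
p_dilute = solids_pressure(rho_s=2500.0, eps_s=1e-6, theta=0.01)
```

In a multiphase CFD code this closure (and its analogues for viscosity and conductivity) parameterizes the solids-phase momentum equation, which is precisely where the theoretical estimates this project proposes would enter.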
Researchers will arrive at a comprehensive and unified description of nuclei and their reactions that is grounded in the interactions between the constituent nucleons. Current phenomenological models of nuclear structure and reactions will be replaced with a well-founded microscopic theory delivering maximum predictive power with minimal, well-quantified uncertainties. A national effort will link theoretical physics and computational science to develop forefront software for state-of-the-art architectures. A national capability will be built to calculate nuclear structure and low-energy nuclear cross sections, and to assess their uncertainties, relevant to several DOE programs. Nuclear structure and reactions play an essential role in the science to be investigated at the Rare Isotope Accelerator (RIA) and in nuclear physics applications to the Science-Based Stockpile Stewardship program, next-generation reactors, and threat reduction. To build this capability, we will develop a multi-pronged program of theoretical, algorithmic, and computational advances that will deliver nuclear cross-section information critical to DOE programs, more accurate than what is currently available. We anticipate an expansion of the computational techniques and methods we currently employ, and the development of new treatments, to take advantage of petascale architectures.
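A core numerical task behind microscopic descriptions of nuclear structure is extracting the lowest eigenvalues of a very large sparse Hamiltonian matrix. The toy sketch below, assuming only a small dense symmetric matrix, uses shifted power iteration rather than the Lanczos-type solvers production codes employ, but the idea (iterate a matrix-vector product until the extremal state dominates) is the same; all names here are illustrative.

```python
def matvec(H, v):
    """Dense matrix-vector product; production codes use sparse formats."""
    return [sum(hij * vj for hij, vj in zip(row, v)) for row in H]

def ground_state_energy(H, iters=500):
    """Lowest eigenvalue of a small real symmetric matrix via power
    iteration on sigma*I - H, where the Gershgorin bound sigma is
    guaranteed to sit above the whole spectrum."""
    n = len(H)
    sigma = max(H[i][i] + sum(abs(H[i][j]) for j in range(n) if j != i)
                for i in range(n))
    v = [1.0] * n
    for _ in range(iters):
        w = [sigma * vi - hvi for vi, hvi in zip(v, matvec(H, v))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    hv = matvec(H, v)
    return sum(vi * hvi for vi, hvi in zip(v, hv))  # Rayleigh quotient <v|H|v>

# Toy two-configuration Hamiltonian; the exact ground state is -sqrt(5)/2.
e0 = ground_state_energy([[-1.0, 0.5], [0.5, 1.0]])
```

The petascale challenge is that realistic configuration-interaction bases reach billions of states, so the matrix-vector product itself must be distributed across many thousands of processors.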
A multi-scale approach, spanning very high accuracy quantum chemistry methods, coarse-grained potentials derived directly from quantum mechanics, and large-scale stochastic models, will be developed and applied to important problems in bioremediation, including the interactions of heavy-metal complexes with proteins and their building blocks. To make this effort computationally viable, new parallel computing paradigms will be developed, including innovative methods designed to take advantage of new petascale hardware systems.
Mark Gordon, Masha Sosonkina, Theresa Windus
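Deriving a coarse-grained potential from quantum-mechanical data often reduces to fitting a simple functional form to computed energies. The sketch below fits U(r) = A/r^12 - B/r^6 to (distance, energy) samples; since U is linear in A and B, the 2x2 normal equations solve the least-squares problem in closed form. The sample data is synthetic, standing in for quantum-chemistry reference energies, and the function name is an illustrative assumption.

```python
def fit_pair_potential(samples):
    """Least-squares fit of U(r) = A / r**12 - B / r**6 to (r, U) samples.
    Linear in (A, B), so accumulate and solve the 2x2 normal equations."""
    s11 = s12 = s22 = b1 = b2 = 0.0
    for r, u in samples:
        x1, x2 = r ** -12, -(r ** -6)   # U = A*x1 + B*x2
        s11 += x1 * x1
        s12 += x1 * x2
        s22 += x2 * x2
        b1 += x1 * u
        b2 += x2 * u
    det = s11 * s22 - s12 * s12
    a = (b1 * s22 - b2 * s12) / det   # Cramer's rule on the normal equations
    b = (s11 * b2 - s12 * b1) / det
    return a, b

# Synthetic "reference" energies from a known potential (A = 4, B = 4,
# i.e. Lennard-Jones with epsilon = sigma = 1); the fit should recover it.
data = [(r, 4.0 * (r ** -12 - r ** -6)) for r in (0.9, 1.0, 1.1, 1.2, 1.5)]
A, B = fit_pair_potential(data)
```

With real data, each (r, U) pair would come from a high-accuracy electronic-structure calculation, and the fitted potential would then drive the much cheaper large-scale stochastic simulations described above.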