The Center for Turbulence Research (CTR) is a research consortium for fundamental study of turbulent flows. It is jointly operated by Stanford University and the National Aeronautics and Space Administration (NASA).
The principal objective of the CTR is to stimulate significant advances in the physical understanding of turbulence and related non-linear multi-scale phenomena. These advances are directed at improving capabilities for controlling turbulence and at modeling turbulence for engineering analysis. Particular emphasis is placed on probing turbulent flow fields developed by direct numerical simulations and/or laboratory experiments, on using new diagnostic techniques and mathematical methods, on concepts for turbulence control and modeling, and on complex effects on turbulence. These effects include complex geometry, chemical reactions, complex fluids, multiple fluid phases, magnetohydrodynamics, and hypersonics.
The Center's view is that the key to advances in turbulence is sustained interaction by researchers. It is essential that those working on theoretical aspects of the problem interact with those conducting experimental or computational research, and that a broad range of viewpoints and methods be brought together in a catalytic manner. The essence of the CTR is to provide the central core of this needed critical mass activity.
(Top) DNA molecules in shear flow and (bottom) solar convection patterns
The main elements of the Center are an extensive visiting Fellows program, a biennial summer program, seminars and workshops, and a core of PhD students and postdoctoral researchers. Although the emphasis of the CTR is to advance the understanding of turbulent flows for aerospace applications, it is an interdisciplinary program; researchers with interest in turbulence are sought from Mathematics, Aeronautics, Meteorology, Physics, Astrophysics, Solar Physics, Computer Science, Oceanography, and other areas. Traditionally, 30% of CTR Postdoctoral Fellows have had Ph.D.s in Physics and Applied Mathematics.
Studying Turbulence Using Numerical Simulations
The direct simulation of a turbulent flow had been out of reach until the first large parallel computers became available. CTR's early landmark numerical predictions represented a paradigm shift for researchers investigating the mechanics of turbulent flows. The simulations generated large databases that were archived and used by scientists around the world to test theories, evaluate modeling ideas, validate computational codes, and in some cases calibrate measuring instruments.
In the 1980s, CTR used a pioneering parallel computer, the ILLIAC-IV, to perform the largest turbulence simulations achieved until then. The work was well received; soberingly enough, however, it was not the quality of the data that won over many of our colleagues but rather a five-minute motion picture of the simulated flow. The movie showed trajectories of marker particles in a turbulent flow between parallel plates; remarkably, it resembled similar visualizations made two decades earlier by filming actual water flow in a laboratory at Stanford University.
ILLIAC-IV system information:
- 64 Processors
- 2,048 Words per processor
- 4 Mflops Peak Performance
- 10 MB hard disks
- I/O 500 Mbits per second
- $31 Million in 1972
- 5,000 Miles of Cables
- Arpanet Connection
- Custom Programming Languages: CFD, Vectoral
ILLIAC IV - 1977
Advanced Simulation and Computing
Today CTR researchers are using several computer clusters at Stanford which are many times more powerful than the ILLIAC IV. For large production runs, they have access to advanced computational facilities at the National Laboratories, including the Columbia supercomputer at NASA Ames, with 10,000 processors.
In 1997 Stanford became one of five university ASC centers sponsored by the U.S. Department of Energy. This was a major boost for complex integrated turbulence research activities and computational engineering at CTR. In addition, the ASC program provided access to unprecedented computational resources at DOE's laboratories. The most recent computer made accessible to the Center is the IBM BlueGene at Livermore, which has 130,000 CPUs and 360 TFlops peak performance.
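The scale of this growth can be illustrated with a quick back-of-the-envelope calculation using the peak figures quoted above (4 Mflops for the ILLIAC IV, 360 TFlops for the BlueGene); the comparison of peak numbers is only a rough indication, since sustained performance on real turbulence codes is lower on both machines:

```python
# Rough comparison of the peak-performance figures quoted in the text.
illiac_iv_flops = 4e6     # 4 Mflops peak, ILLIAC IV (1970s)
bluegene_flops = 360e12   # 360 TFlops peak, IBM BlueGene at Livermore

speedup = bluegene_flops / illiac_iv_flops
print(f"Peak-performance ratio: {speedup:.1e}")  # 9.0e+07, roughly 90 million times
```

In other words, the nominal peak of the BlueGene exceeds that of the ILLIAC IV by nearly eight orders of magnitude.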
ASC Exponentially Increasing Computing Resources
Scalable Parallel Tools: Ingredients for Modern Computational Engineering
With major help from ASC, CTR has devoted its attention to the development of large scalable multi-physics codes for prediction of complex engineering systems. CTR is developing high-fidelity computational tools for industrial applications. Its Industrial Affiliates Program provides the means for transfer of CTR technology to the affiliate members. The goal of CTR research is to develop validated and cost-effective prediction capability for simulations of complex engineering systems involving turbulent flows.
Parallel performance of one part of the ASC project in CTR
Real-Time Predictive Capability - We Are Not There Yet
Two and a half years after the Columbia disaster, the Space Shuttle Discovery was launched on July 26, 2005, and two weeks later returned to Earth safely after completing its mission. Discovery was instrumented extensively with video cameras to monitor any damage to its aerodynamic surfaces similar to that sustained on Columbia's last mission. A few minutes into Discovery's flight, a piece of foam was seen to break off from the external tank, but this time it appeared to miss the shuttle. This caused a concerned NASA to put future shuttle flights on hold until the flow/foam interaction problem was understood and corrective steps could be taken.
After reaching orbit, it was observed that a gap filler between two of the heat shielding tiles was protruding approximately an inch from the underbody of the shuttle just aft of the nose landing gear door. This location was sufficiently early in the development of the boundary layer to cause concern. NASA engineers became worried that the gap filler might promote early transition from laminar to turbulent flow, which would lead to substantially higher heat transfer to the shuttle's underbody and wing leading edge panels upon reentry into the Earth's atmosphere.

NASA had to decide whether to send an astronaut on a space walk and repair mission to remove the gap filler or leave it in place. If early transition took place at or near the peak heating condition, at Mach 23, it was estimated that the increased heat load due to turbulence would exceed the allowed margins of safety. If it took place at a lower Mach number, then the removal of the gap filler would not be necessary. This question was posed to the engineers on the ground, who had to come up with an answer quickly. NASA received conflicting answers on the question of transition; based on their computational models, one group predicted that transition would take place at Mach 24 and the other group predicted transition at Mach 21. NASA went with the conservative prediction and sent astronaut Steve Robinson (author of "Coherent Motions in the Turbulent Boundary Layer," published in Volume 23 of this series) to remove the gap filler, which he did very gracefully.
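The go/no-go reasoning described above reduces to a threshold comparison: because the shuttle decelerates through reentry, transition predicted at a higher Mach number occurs earlier, so a prediction at or above the Mach 23 peak-heating condition implies an unacceptable heat load. The sketch below is purely illustrative (the function and its names are invented here, not NASA's actual analysis); only the Mach numbers are taken from the text:

```python
PEAK_HEATING_MACH = 23.0  # peak-heating condition on reentry, as quoted above

def gap_filler_decision(predicted_transition_mach: float) -> str:
    """Illustrative decision rule: if laminar-to-turbulent transition is
    predicted at or above the peak-heating Mach number, the added turbulent
    heat load would exceed safety margins, so the gap filler must go."""
    if predicted_transition_mach >= PEAK_HEATING_MACH:
        return "remove gap filler"
    return "leave in place"

# The two conflicting model predictions quoted in the text:
print(gap_filler_decision(24.0))  # remove gap filler
print(gap_filler_decision(21.0))  # leave in place
```

NASA's choice of the Mach 24 prediction is "conservative" precisely in this sense: of the two answers, it is the one that triggers the repair.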
This emergency episode is a reminder that in spite of years of research, there are many phenomena in fluid mechanics that can literally determine our lives and yet elude our ability to predict them. We still do not have a definitive answer to Discovery's quandary in spite of the strides made in the field. In engineering science, maturity implies predictability, and as we have seen here, the claim that fluid mechanics is a mature field is slightly exaggerated!