TY - GEN
T1 - Plasma fusion code coupling using scalable I/O services and scientific workflows
AU - Podhorszki, Norbert
AU - Klasky, Scott
AU - Liu, Qing
AU - Docan, Ciprian
AU - Parashar, Manish
AU - Abbasi, Hasan
AU - Lofstead, Jay
AU - Schwan, Karsten
AU - Wolf, Matthew
AU - Zheng, Fang
AU - Cummings, Julian
PY - 2009
Y1 - 2009
N2 - In order to understand the complex physics of Mother Nature, physicists often make many approximations to focus on one area of physics and then write a simulation that reduces the governing equations to ones that can be solved on a computer. Different approximations lead to different equations that model different physics, which can often lead to completely different simulation codes. As computers become more powerful, scientists can either write one simulation that models all of the physics, or produce several codes, each covering a different portion of the physics, and then 'couple' these codes together. In this paper, we concentrate on the latter, looking at our code coupling approach for modeling a full-device fusion reactor. There are many approaches to code coupling. Our first approach used Kepler workflows to loosely couple three codes via files (memory-to-disk-to-memory coupling). This paper describes our new approach, which moves toward memory-to-memory data exchange to allow for tighter coupling. Our approach brings together scientific workflows with staging I/O methods for code coupling. Staging methods use additional compute nodes to perform tasks such as data analysis, visualization, and NxM transfers for code coupling. In order to allow application scientists to transparently switch from memory-to-memory coupling to memory-to-disk-to-memory coupling, we have been developing a framework that can switch between these two I/O methods and automate other workflow tasks. Our hybrid approach allows application scientists to easily switch between in-memory coupling and file-based coupling on the fly, which aids in debugging these complex configurations.
AB - In order to understand the complex physics of Mother Nature, physicists often make many approximations to focus on one area of physics and then write a simulation that reduces the governing equations to ones that can be solved on a computer. Different approximations lead to different equations that model different physics, which can often lead to completely different simulation codes. As computers become more powerful, scientists can either write one simulation that models all of the physics, or produce several codes, each covering a different portion of the physics, and then 'couple' these codes together. In this paper, we concentrate on the latter, looking at our code coupling approach for modeling a full-device fusion reactor. There are many approaches to code coupling. Our first approach used Kepler workflows to loosely couple three codes via files (memory-to-disk-to-memory coupling). This paper describes our new approach, which moves toward memory-to-memory data exchange to allow for tighter coupling. Our approach brings together scientific workflows with staging I/O methods for code coupling. Staging methods use additional compute nodes to perform tasks such as data analysis, visualization, and NxM transfers for code coupling. In order to allow application scientists to transparently switch from memory-to-memory coupling to memory-to-disk-to-memory coupling, we have been developing a framework that can switch between these two I/O methods and automate other workflow tasks. Our hybrid approach allows application scientists to easily switch between in-memory coupling and file-based coupling on the fly, which aids in debugging these complex configurations.
KW - Code coupling
KW - Parallel I/O
KW - Plasma simulation
KW - Workflow design
KW - Workflow execution
UR - http://www.scopus.com/inward/record.url?scp=74049156300&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=74049156300&partnerID=8YFLogxK
U2 - 10.1145/1645164.1645172
DO - 10.1145/1645164.1645172
M3 - Conference contribution
AN - SCOPUS:74049156300
SN - 9781605587172
T3 - Proceedings of the 4th Workshop on Workflows in Support of Large-Scale Science, WORKS '09, in Conjunction with SC 2009
BT - Proceedings of the 4th Workshop on Workflows in Support of Large-Scale Science, WORKS '09, in Conjunction with SC 2009
T2 - 4th Workshop on Workflows in Support of Large-Scale Science, WORKS '09, in Conjunction with SC 2009
Y2 - 16 November 2009 through 16 November 2009
ER -