Olivier Bressand is a computer scientist and research engineer with more than 30 years of experience, specializing in the development and maintenance of software components for data management. His expertise spans input/output operations for simulation codes, visualization, and the interconnection of codes and tools in high-performance computing (HPC) environments, from vector clusters to exascale systems.
As architect and project manager of the Hercule scientific data management platform for 25 years, Olivier Bressand has developed recognized expertise, holding a Senior Expert role at the French Alternative Energies and Atomic Energy Commission (CEA) for over 15 years. The Hercule platform addresses the need for simulation data exchange within complex multi-physics workflows, ensuring interoperability among multiple massively parallel computational codes.
In the early 2000s, Hercule’s architectural principles introduced generic mechanisms for describing typed objects specific to simulation domains within a database. These advancements incorporated Big Data concepts, such as metadata to optimize information retrieval, and the design of high-performance parallel I/O architectures tailored to parallel file systems.
Olivier Bressand has also played a key role in solving production challenges in the CEA's HPC centers, delivering tailored solutions for various computational codes, including the open-source Ramses code. He has contributed to national and international conferences and co-supervised Loïc Strafella's PhD thesis, which focused on the computational aspects of I/O for Hercule and the Ramses code.
Proceedings of the 21st ACM International Conference on Computing Frontiers: Workshops and Special Sessions, Association for Computing Machinery, pp. 94–100, 2024

Abstract
The new scientific workloads to be executed on upcoming exascale supercomputers face major storage challenges given their extreme data volumes. In particular, intelligent data placement, instrumentation, and workflow handling are central to application performance. The IO-SEA project developed multiple solutions to help the scientific community address these challenges: a Workflow Manager, a hierarchical storage management system, and a semantic API for storage. Each of these major products incorporates additional minor products that support its mission. In this paper, we discuss the roles of all these products and how they can assist the scientific community in achieving exascale performance.
Astronomy and Astrophysics, Volume 643, 2020


Abstract
We present the Extreme-Horizon (EH) cosmological simulation, which models galaxy formation with stellar and active galactic nuclei (AGN) feedback and uses a very high resolution in the intergalactic and circumgalactic medium. Its high resolution in low-density regions results in smaller-size massive galaxies at a redshift of z = 2, which is in better agreement with observations compared to other simulations. We achieve this result thanks to the improved modeling of cold gas flows accreting onto galaxies. In addition, the EH simulation forms a population of particularly compact galaxies with stellar masses of 10¹⁰⁻¹¹ M⊙ that are reminiscent of observed ultracompact galaxies at z ≃ 2. These objects form primarily through repeated major mergers of low-mass progenitors and independently of baryonic feedback mechanisms. This formation process can be missed in simulations with insufficient resolution in low-density intergalactic regions.