The CEA publishes various scientific and technical periodicals and videos. Through them, you can discover the CEA’s major research topics and the latest technological innovations produced by its laboratories.
The CEA Research News is a newsletter presenting the most significant scientific and societal advances enabled by the major European research projects led by the CEA, which cover the main priorities of the European Union.
Clefs CEA | Article | Nuclear energy
Clefs CEA n°64 - Les voix de la recherche - Journey to the heart of Big Data
Neutronics is the study of the path of neutrons through matter. The neutron population in a reactor core is governed by equations whose solution yields, in particular, the reactivity, the power released by the nuclear fission reactions (ultimately converted into electrical power) and the isotopic composition, under all operating conditions: normal, incident and accident.
Complete version of the article published in Clefs CEA n°64 - Journey to the heart of Big Data.
Apart from direct measurements, these physical parameters of interest are obtained by running a neutronics calculation called the “core calculation”. From its results, it is then possible to derive key safety parameters: the hot spot factors, corresponding to the power peaks in the core; the reactivity coefficients, which reflect the sensitivity of the chain reaction to variations in physical parameters such as the temperatures of the nuclear fuel and of the moderator; and the anti-reactivity margin, which indicates the amplitude of the reduction in core reactivity (or subcriticality level) during a reactor scram.
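As a minimal illustration of one such quantity (the numerical values below are illustrative, not taken from the article), reactivity is conventionally derived from the effective multiplication factor k_eff and expressed in pcm (10⁻⁵):

```python
def reactivity_pcm(k_eff: float) -> float:
    """Reactivity rho = (k_eff - 1) / k_eff, expressed in pcm (1e-5)."""
    return (k_eff - 1.0) / k_eff * 1e5

# A slightly supercritical core (illustrative value):
print(reactivity_pcm(1.005))   # ~ +497.5 pcm
# A scram drives the core subcritical; the reactivity becomes strongly negative:
print(reactivity_pcm(0.95))    # ~ -5263 pcm
```

The anti-reactivity margin mentioned above corresponds to the magnitude of such a negative reactivity reached after a scram.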
For this reactor, the criteria to be minimised are the variation in reactivity at the end of the cycle, the void coefficient, the maximum sodium temperature during an ULOF* accident sequence, an economic criterion linked to the mass of nuclear fuel, and the volume of the core. The constraints to be met are the maximum variation in reactivity over a cycle (2600 pcm*), the regeneration gain range (−0.1 to +0.1), the maximum number of dpa* (140), the maximum core volume (13.7 m³), the maximum sodium temperature during an ULOF accident sequence (1200 °C) and the maximum fuel temperature at the end of the cycle (2500 °C).
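The criteria and constraints listed above can be encoded straightforwardly; the sketch below is a hypothetical illustration (the field names and example configuration are invented for clarity, only the constraint bounds come from the article):

```python
from dataclasses import dataclass

@dataclass
class CoreConfig:
    """Hypothetical record of one candidate reactor configuration."""
    reactivity_swing_pcm: float    # variation in reactivity over a cycle
    void_coefficient: float        # criterion to minimise (no hard bound here)
    max_sodium_temp_ulof_c: float  # max sodium temperature during ULOF
    fuel_mass_t: float             # economic criterion (no hard bound here)
    core_volume_m3: float
    breeding_gain: float           # "regeneration gain"
    max_dpa: float                 # displacement damage to materials
    max_fuel_temp_eoc_c: float     # fuel temperature at end of cycle

def feasible(c: CoreConfig) -> bool:
    """Check the hard constraints quoted in the article."""
    return (c.reactivity_swing_pcm <= 2600
            and -0.1 <= c.breeding_gain <= 0.1
            and c.max_dpa <= 140
            and c.core_volume_m3 <= 13.7
            and c.max_sodium_temp_ulof_c <= 1200
            and c.max_fuel_temp_eoc_c <= 2500)
```

Configurations that pass this feasibility check are then ranked on the criteria to be minimised; the void coefficient and fuel mass appear only as criteria, not as hard constraints.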
To carry out this study, the performance of more than ten million reactors was evaluated in order to select the optimum reactor configurations with respect to the defined criteria and constraints. The search space is explored by means of meta-models (neural networks, for example) to reduce the computing time.
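The meta-model idea can be sketched as follows: a cheap model is fitted on a modest number of expensive simulations, then used to screen millions of candidates, with only the most promising ones re-checked by the full code. This is an illustrative sketch, not the CEA toolchain; the stand-in objective and a polynomial surrogate replace the real neutronics code and neural networks:

```python
import numpy as np

def expensive_core_calculation(x):
    # Stand-in for a costly neutronics run (seconds to hours in reality).
    return (x - 0.3) ** 2 + 0.1 * np.sin(8 * x)

rng = np.random.default_rng(0)
x_train = rng.uniform(0, 1, 50)               # a few expensive evaluations
y_train = expensive_core_calculation(x_train)

coeffs = np.polyfit(x_train, y_train, deg=4)  # fit the meta-model
surrogate = np.poly1d(coeffs)

x_candidates = rng.uniform(0, 1, 1_000_000)   # huge design space, cheap to screen
best = x_candidates[np.argmin(surrogate(x_candidates))]
# Only the most promising candidate(s) are re-run with the expensive code:
print(best, expensive_core_calculation(best))
```

In the actual studies the surrogate is a neural network over many design parameters, but the division of labour (few expensive runs, massive cheap screening) is the same.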
When applied to the ASTRID design studies, this method led to the characterisation of an “optimal Pareto population” (one in which no criterion can be improved without penalising the others) of 25,000 individual reactors. It gave results such as those illustrated in the following figure by means of a parallel-coordinates visualisation (also called COBWEB), in which each line corresponds to an individual (a reactor) and each axis represents the variation range, within the population, of one of the characteristics (criteria, parameters, properties) reflecting the reactor performance criteria mentioned above.
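The defining property of a Pareto population can be made concrete with a minimal non-dominated filter (all criteria to be minimised); the reactor tuples below are invented for illustration:

```python
def dominates(a, b):
    """a dominates b if a is no worse on every criterion and strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the points that no other point dominates."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# (reactivity swing in pcm, void coefficient, ULOF sodium temperature in °C):
reactors = [(2400, 1.2, 1150), (2500, 1.0, 1100), (2450, 1.3, 1180), (2400, 1.0, 1100)]
print(pareto_front(reactors))  # [(2400, 1.0, 1100)]
```

This naive filter is quadratic in the population size; dedicated algorithms are needed at the scale of millions of evaluated designs.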
These studies demonstrated the benefits of evolutionary algorithms run on HPC systems for solving complex multi-criteria optimisation problems with constraints. Today’s HPC resources make it possible to use meta-heuristics that naturally exploit the parallelism of the computing resources and combine various search strategies. Taken as a whole, this work highlighted possible improvements for handling so-called “many-objective” problems, in which the number of objectives is high. Other research topics have emerged, in particular optimisation under uncertainties, whether random and/or epistemic.
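The evolutionary loop itself can be sketched in a few lines; this toy (μ + λ) strategy on a single variable is an assumption-laden stand-in for the real studies, where each candidate evaluation is a full core calculation distributed over an HPC cluster:

```python
import random

def objective(x):
    # Stand-in for an expensive core evaluation (to be minimised).
    return (x - 2.0) ** 2

def evolve(mu=10, lam=40, generations=60, sigma=0.5, seed=1):
    """Toy (mu + lambda) evolution strategy: mutate, pool, keep the best mu."""
    rng = random.Random(seed)
    parents = [rng.uniform(-10, 10) for _ in range(mu)]
    for _ in range(generations):
        children = [rng.choice(parents) + rng.gauss(0, sigma) for _ in range(lam)]
        pool = parents + children          # elitist (mu + lambda) selection
        pool.sort(key=objective)
        parents = pool[:mu]
    return parents[0]

best = evolve()
print(best)  # converges towards 2.0
```

The λ child evaluations within a generation are independent, which is what makes the method embarrassingly parallel on HPC hardware.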
J.M. Do, G. Arnaud, A.M. Baudron, J.J. Lautard, “Fuel loading pattern for heterogeneous EPR core configuration using a distributed evolutionary algorithm”, International Conference on Mathematics, Computational Methods & Reactor Physics (M&C 2009), Saratoga Springs, New York, May 3–7, 2009.
J.M. Do, J.J. Lautard, A.M. Baudron, S. Douce, G. Arnaud, “Use of metaheuristics for design of fuel loading pattern in light reactors comprising some radial and axial heterogeneities”, Proceedings of the 2011 IEEE International Symposium on Parallel and Distributed Processing Workshop, pp. 374–380.
E. Hourcade, F. Gaudier, G. Arnaud, D. Funtowiez, K. Ammar, “Supercomputing application for reactors code design and optimization”, Joint International Conference on Supercomputing in Nuclear Applications and Monte Carlo 2010, Tokyo, Japan, October 17–21, 2010.
K. Ammar, “Conception multi-physique et multi-objectif des cœurs de RNR-Na hétérogènes : développement d’une méthode d’optimisation sous incertitudes”, PhD thesis, Université Paris-Sud, 2015.
VIZIR: F. Gaudier, “URANIE: The CEA/DEN Uncertainty and Sensitivity platform”, 6th International Conference on Sensitivity Analysis of Model Output, Procedia – Social and Behavioral Sciences, vol. 2, pp. 7660–7661, 2010.
URANIE: https://sourceforge.net/projects/uranie/
In order to understand the celestial objects making up the Universe, astrophysicists are developing 3-dimensional, time-dependent numerical simulations. They are generating a constantly increasing quantity of data, which have to be efficiently analysed in order to lift the many veils shrouding the mysteries of our Universe.
The new international projects, such as the Euclid space telescope, are ushering in the era of Big Data for cosmologists. Our questions about dark matter and dark energy, which together account for 95% of the content of our Universe, throw up new algorithmic, computational and theoretical challenges. A fourth challenge concerns reproducible research, a concept fundamental to the verification and credibility of published results.
CEA has a full role to play in the initiatives being taken at the national and European levels, to stimulate research and innovation in the field of Big Data.
From the economic intelligence (EI) standpoint, advanced processing techniques applied to vast data sets present as many risks as opportunities. How does one strike the right balance between the defensive and competitive aspects of EI?
Together with the CEA, we are planning for the future: that of the quantum computer, on which our engineers are already working closely with those at the CEA, and of “quantum-safe” cryptography.
CEA is a French government-funded technological research organisation in four main areas: low-carbon energies, defense and security, information technologies and health technologies. A prominent player in the European Research Area, it is involved in setting up collaborative projects with many partners around the world.