Online higher-order error correction of nonlinear diffusion generalized perturbation theory using neural networks

Date
1999
Authors
Kondapalli, Naveen
Major Professor
Maldonado, G.I.
Abstract
An online scheme is developed for reducing the errors associated with the second-order generalized perturbation theory (GPT) approximation of the neutron diffusion fundamental-mode eigenvalue (1/k_eff). The primary application of this work is nuclear fuel loading optimization. A k_eff approximation is generated that estimates perturbed conditions as a function of fuel material perturbations (i.e., fuel assembly shuffles) relative to a reference core loading pattern. Implementing GPT to approximate k_eff reduces the required computational time, since it is faster than a forward solution by a factor of 8 to 15; however, the errors associated with GPT's second-order approximation grow as the perturbations become larger. The main emphasis of this study is to achieve improved approximations of the end-of-cycle (EOC) k_eff with minimal increase in computational time. A simple feed-forward neural network (SimpleNet) is designed to predict the k_eff values, taking the zeroth-, first-, and second-order approximations of k_eff available from GPT as inputs. The online scheme trains and tests the ANN in parallel with the execution of GPT. The output from the ANN is read back and used for further calculations involving k_eff. This study shows that the online scheme can be applied to reduce the average errors and lower the computation time for calculating k_eff values.
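The scheme the abstract describes (feed the zeroth-, first-, and second-order GPT estimates of k_eff into a small feed-forward network, train it online as results arrive, and read back a corrected value) can be sketched as follows. This is a hypothetical illustration, not the thesis code: the two-layer network, the synthetic GPT orders, the cubic "true" residual, and all scale factors are invented stand-ins for the real diffusion and GPT calculations.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Tiny feed-forward net: 2 inputs -> 8 tanh units -> 1 output ------------
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def predict(f):
    h = np.tanh(f @ W1 + b1)
    return (h @ W2 + b2).ravel()

def train_step(f, target, lr=0.1):
    """One SGD step on mean squared error; plain backprop, no framework."""
    global W1, b1, W2, b2
    n = len(target)
    h = np.tanh(f @ W1 + b1)
    err = (h @ W2 + b2).ravel() - target
    gW2 = h.T @ err[:, None] / n
    gb2 = np.array([err.mean()])
    dh = (err[:, None] @ W2.T) * (1.0 - h * h)   # tanh derivative
    gW1 = f.T @ dh / n
    gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# --- Synthetic stand-ins for GPT output and the reference forward solve -----
def gpt_orders(p):
    """Invented zeroth/first/second-order k_eff estimates vs. perturbation p."""
    k0 = np.ones_like(p)
    k1 = k0 + 0.04 * p
    k2 = k1 + 0.01 * p**2
    return k0, k1, k2

def true_keff(p):
    # Adds a cubic residual that a second-order expansion cannot capture.
    return 1.0 + 0.04 * p + 0.01 * p**2 + 0.004 * p**3

def features(k0, k1, k2):
    # Differences between successive orders carry the perturbation
    # information; the scale factors just bring them to O(1) for tanh units.
    return np.stack([(k1 - k0) * 25.0, (k2 - k1) * 100.0], axis=1)

SCALE = 250.0  # residual scale so the training target is O(1)

# --- Online loop: train on each incoming batch, then apply the correction ---
for cycle in range(3000):
    p = rng.uniform(-1.0, 1.0, 32)
    k0, k1, k2 = gpt_orders(p)
    target = (true_keff(p) - k2) * SCALE   # residual the net must learn
    train_step(features(k0, k1, k2), target)

# Compare the ANN-corrected estimate with the raw second-order GPT estimate.
p = rng.uniform(-1.0, 1.0, 512)
k0, k1, k2 = gpt_orders(p)
corrected = k2 + predict(features(k0, k1, k2)) / SCALE
ann_err = np.abs(corrected - true_keff(p)).mean()
gpt_err = np.abs(k2 - true_keff(p)).mean()
```

Under these assumptions the corrected estimate tracks the higher-order behavior that the second-order expansion misses, while each correction costs only one forward pass through a very small network, which is consistent with the abstract's goal of a minimal increase in computation time.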
Type
thesis