
Prediction of Spent Nuclear Fuel Decay Heat Based on GPR-SVR Co-training

Liu Zihao, Liu Tong, Wen Xin, Li Yi, Wang Beiqi

Citation: Liu Zihao, Liu Tong, Wen Xin, Li Yi, Wang Beiqi. Prediction of Spent Nuclear Fuel Decay Heat Based on GPR-SVR Co-training[J]. Nuclear Power Engineering, 2025, 46(2): 272-281. doi: 10.13832/j.jnpe.2024.070016

doi: 10.13832/j.jnpe.2024.070016
Funding: National Nuclear Technology Development Research Project (HNKF202314-48); CNNC "Lingchuang" (Pioneering Innovation) Research Project (CNNC-LCKY-202242)
Article information
    Author biography:

    Liu Zihao (2000—), male, master's student, currently engaged in research on artificial intelligence and numerical simulation of nuclear fuel, E-mail: liuzihao968@sjtu.edu.cn

    Corresponding author:

    Liu Tong, E-mail: tongliu@sjtu.edu.cn

  • CLC number: TL363


  • Abstract: In a pressurized water reactor nuclear power plant, the decay heat of spent fuel assemblies is the main source of residual core heat, so accurate decay-heat prediction is essential to the design and safety analysis of the reactor cooling system. Conventional nuclide-decay simulation codes, however, are computationally expensive, while machine-learning models may overfit when data are scarce. In this paper, a co-training base model is built from Gaussian process regression (GPR) and support vector regression (SVR) to generate high-quality virtual spent-fuel decay-heat data; the virtual data are combined with measured nuclear-power-plant data into a mixed dataset, which is then used to train an extreme learning machine (ELM) model for decay-heat prediction. The results show that, compared with conventional machine-learning models, co-training markedly improves the stability and accuracy of decay-heat prediction. After training on the mixed dataset, the prediction stability of the ELM model improves by 39.9%, and the root-mean-square error (RMSE) of the decay-heat predictions is 25.7% lower than that of a conventional nuclide-decay simulation code. The proposed method offers a new approach to the small-dataset problem in nuclear engineering.
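The co-training scheme described in the abstract can be sketched in a few lines of Python. This is a minimal illustration, not the authors' implementation: the toy target function, the number of rounds, the batch size of 10, and the confidence criterion (agreement between the two learners; the paper's exact selection rule may differ, e.g. using the GPR posterior variance) are all assumptions. scikit-learn's `GaussianProcessRegressor` and `SVR` stand in for the GPR and SVR base models.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Toy stand-in for the paper's data: a small labeled set (measured decay
# heat) and a pool of unlabeled feature vectors (e.g. burnup, cooling time).
X_lab = rng.uniform(0.0, 1.0, size=(20, 2))
y_lab = np.sin(3.0 * X_lab[:, 0]) + X_lab[:, 1]      # illustrative target
X_pool = rng.uniform(0.0, 1.0, size=(200, 2))

Xg, yg = X_lab.copy(), y_lab.copy()   # GPR's growing training set
Xs, ys = X_lab.copy(), y_lab.copy()   # SVR's growing training set

for epoch in range(3):
    gpr = GaussianProcessRegressor(normalize_y=True).fit(Xg, yg)
    svr = SVR(kernel="rbf", C=10.0).fit(Xs, ys)

    # Confidence proxy (an assumption): keep the pool points where the
    # two learners agree most closely.
    pg, ps = gpr.predict(X_pool), svr.predict(X_pool)
    idx = np.argsort(np.abs(pg - ps))[:10]

    # Cross-teach: GPR's pseudo-labels extend SVR's set and vice versa,
    # turning the selected pool points into virtual samples.
    Xs, ys = np.vstack([Xs, X_pool[idx]]), np.concatenate([ys, pg[idx]])
    Xg, yg = np.vstack([Xg, X_pool[idx]]), np.concatenate([yg, ps[idx]])
    X_pool = np.delete(X_pool, idx, axis=0)

n_virtual = len(yg) - len(y_lab)   # 3 rounds x 10 points = 30 virtual samples
```

The virtual samples accumulated in `Xg`/`Xs` would then be merged with the measured data to form the mixed training set for the ELM.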

     

  • Figure 1. Predictions of Each Model on the Test Set $ {V}_{0} $

    Figure 2. Flowchart of Co-training

    Figure 3. Examples of Using Co-training to Reduce Prediction Error

    Figure 4. Variation of Prediction Error (RMSE) with the Number of Co-training Rounds

    Figure 5. Predictions of ELM Model on the Test Set $ {V}_{0} $

    Figure 6. Predictions and 95% CI of ELM Model

    Figure 7. Comparison of Predictions of ELM Model on All Test Sets

    Figure 8. Comparison of Calculated Values by Decay Simulation Code and Predicted Values by ELM Model

    Table 1. Summary of Spent Fuel Decay Heat Data

    | Reactor | Assembly array | Enrichment/% | Initial uranium mass/kg | Burnup/[MW·d·t−1(U)] | Cooling time after discharge/d |
    | --- | --- | --- | --- | --- | --- |
    | Ringhals 3 | 17×17 | 2.1~3.10 | 451.743~489.050 | 19699~47308 | 4717~7304 |
    | Ringhals 2, Turkey Point | 15×15 | 3.095~3.252 | 423.896~455.789 | 33973~50962 | 5804~8468 |
    | San Onofre 1, Point Beach 2 | 14×14 | 3.397~4.005 | 361.72~386.80 | 26540~39384 | 1078~3012 |

    Table 2. Summary of GPR and SVR Prediction Errors

    | Model | RE/% (Point 1) | RE/% (Point 2) | RE/% (Point 3) | RMSE/% | MAPE/% |
    | --- | --- | --- | --- | --- | --- |
    | $ \mathrm{M}_{\mathrm{GPR}0} $ | 72.8 | 2.4 | 9.0 | 72.0 | 5.7 |
    | $ \mathrm{M}_{\mathrm{SVR}0} $ | 10.5 | 32.9 | 1.3 | 76.3 | 3.8 |
    | Neural network | — | — | — | 90.8 | 15.4 |

    Table 3. Summary of Decay Heat Prediction Errors

    | Model | Max RE/% | Min RE/% | RMSE/% |
    | --- | --- | --- | --- |
    | $ \mathrm{M}_{\mathrm{ELM}}^{\mathrm{co}} $ | 4.54 | 0.01 | 12.05 |
    | $ \mathrm{M}_{\mathrm{ELM}}^{0} $ | 10.39 | 0.35 | 39.42 |
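The two ELM variants compared above share the standard ELM recipe: a random, untrained hidden layer whose output weights are solved in closed form by least squares. A minimal NumPy sketch follows; the hidden-layer width, weight scale, and sigmoid activation are illustrative choices, not the authors' configuration, and the toy 1D target stands in for the decay-heat data.

```python
import numpy as np

def train_elm(X, y, n_hidden=50, seed=0):
    """Basic ELM: fixed random hidden layer, least-squares output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=6.0, size=(X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.uniform(-6.0, 6.0, size=n_hidden)               # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                  # sigmoid hidden activations
    beta = np.linalg.pinv(H) @ y                            # output weights via pseudoinverse
    return W, b, beta

def elm_predict(params, X):
    W, b, beta = params
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy check on a smooth 1D target (not the decay-heat data)
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
params = train_elm(X, y)
rmse = float(np.sqrt(np.mean((elm_predict(params, X) - y) ** 2)))
```

Because only `beta` is fitted, training reduces to one pseudoinverse, which is what makes the ELM cheap to retrain on the mixed (measured plus virtual) dataset.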
Figures (8) / Tables (3)
Publication history
  • Received: 2024-07-04
  • Revised: 2024-08-15
  • Published online: 2025-01-20
  • Issue date: 2025-04-15
