Cite this article: YANG Hui-zhong^1, LU Peng-fei^1, ZHANG Su-zhen^2, TAO Zhen-lin^2. Generalization of networks and random expanded training sets [J]. Control Theory & Applications, 2002, 19(6): 963-966.
(1. College of Communication & Control Engineering, Southern Yangtze University, Wuxi, Jiangsu 214036, China; 2. Research Institute of Automation, East China University of Science and Technology, Shanghai 200237, China)
|
Generalization of networks and random expanded training sets |
Abstract views: 2951   Full-text views: 2443   Received: 2001-11-28   Revised: 2002-07-01
DOI: 10.7641/j.issn.1000-8152.2002.6.032
2002, 19(6): 963-966
Keywords: feedforward neural networks; generalization; locally most entropic probability density function; Chebyshev inequality
Abstract (translated from the Chinese):
To address over-fitting and poor generalization in neural networks, the locally most entropic estimate of the joint input-output probability density of the sample data is studied, and a cluster-wise, batch self-revising method for the sample parameters based on the Chebyshev inequality is proposed. This estimate is used to stretch the sample set, yielding a new randomly expanded training set with a higher-quality estimate and better results. Simulation results show that feedforward neural networks trained with this method have good generalization performance.
Abstract (English):
To address the problems of over-fitting and poor generalization in neural networks, the locally most entropic colored-Gaussian estimate of the joint input-output probability density function (PDF) is studied, and a new method based on the Chebyshev inequality is proposed to self-revise the sample parameters separately for each cluster. Using this method, a randomly expanded training set is obtained. Simulation results show that the generalization of feedforward neural networks trained on the expanded training sets is greatly improved.
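The expansion idea described in the abstracts — fit a joint input-output density to the samples, then draw perturbed points screened by a Chebyshev-style bound — might be sketched as follows. This is a minimal illustration only, not the authors' algorithm: the Gaussian perturbation, the threshold `k`, and the function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: inputs x with targets y = sin(x) plus noise,
# stacked as joint (x, y) samples.
x = rng.uniform(-3, 3, size=20)
y = np.sin(x) + 0.1 * rng.standard_normal(20)
samples = np.column_stack([x, y])

def expand_training_set(samples, n_new=100, k=2.0, noise_scale=0.1):
    """Generate new (x, y) pairs by perturbing existing samples with
    Gaussian noise scaled by the per-coordinate std, keeping only draws
    within k standard deviations of the mean in every coordinate
    (Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2)."""
    mu = samples.mean(axis=0)
    sigma = samples.std(axis=0)
    new = []
    while len(new) < n_new:
        base = samples[rng.integers(len(samples))]          # pick a seed sample
        cand = base + noise_scale * sigma * rng.standard_normal(samples.shape[1])
        if np.all(np.abs(cand - mu) <= k * sigma):          # Chebyshev-style screen
            new.append(cand)
    return np.vstack([samples, np.array(new)])

expanded = expand_training_set(samples)
print(expanded.shape)  # original 20 samples plus 100 accepted draws
```

A network trained on `expanded` sees many nearby variants of each original point, which is the mechanism by which an expanded training set can discourage over-fitting; the paper's per-cluster self-revision of the sample parameters is not reproduced here.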
|