Performance Analysis of Neural Network with Non-differentiable Activation Function
Cite this article: Mu Wenquan, Liao Xiaofeng, Yu Juebang. Performance Analysis of Neural Network with Non-differentiable Activation Function[J]. Journal of University of Electronic Science and Technology of China (Social Sciences Edition), 1996, 0(4)
Authors: Mu Wenquan, Liao Xiaofeng, Yu Juebang
Affiliation: Department of Optoelectronic Technology, University of Electronic Science and Technology of China, Chengdu
Abstract: Neural networks have been applied to a wide variety of problems; however, the BP algorithm requires a continuous and differentiable activation function. This paper presents a learning algorithm for training neural networks whose activation functions are non-differentiable. Using the relative entropy error measure, the algorithm is derived in full. Experimental results show that the algorithm converges very quickly when solving the XOR problem, the encoder/decoder problems, and their complements, and the converged results are satisfactory.

Keywords: neural network, activation function, non-differentiable activation function, relative entropy error measure

Performance Analysis of Neural Network with Non-differentiable Activation Function
Mu Wenquan, Liao Xiaofeng, Yu Juebang. Performance Analysis of Neural Network with Non-differentiable Activation Function[J]. Journal of University of Electronic Science and Technology of China (Social Sciences Edition), 1996, 0(4)
Authors: Mu Wenquan  Liao Xiaofeng  Yu Juebang
Abstract: Neural networks are used to solve a wide variety of problems. However, the BP algorithm requires a continuous and differentiable activation function. This paper presents a learning algorithm that can train neural networks whose neurons have non-differentiable activation functions. The algorithm is demonstrated using the linear ramp activation function with the relative entropy error measure. Experimental results show that the proposed algorithm is much faster in solving the XOR problem, the encoder/decoder problems, and their complements.
Keywords: neural network  activation function  non-differentiable activation function  relative entropy error measure
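The abstract only outlines the method, and the paper's derivation is not reproduced here. As a rough illustrative sketch of the general idea, the following code trains a small network on XOR using a linear ramp activation, substituting a subgradient for the derivative at the non-differentiable corner points, with the relative entropy (cross-entropy) error measure. The network size, initialization, learning rate, and all variable names are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def ramp(x):
    # Linear ramp activation: 0 for x <= 0, x on (0, 1), 1 for x >= 1.
    # Non-differentiable at the two corner points x = 0 and x = 1.
    return np.clip(x, 0.0, 1.0)

def ramp_subgrad(x):
    # Generalized derivative: 1 inside the linear region, 0 in the flat
    # regions. At the corners any value in [0, 1] is a valid subgradient;
    # choosing 1 keeps updates flowing at the boundaries.
    return ((x >= 0.0) & (x <= 1.0)).astype(float)

# XOR training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

# Two ramp hidden units and one ramp output unit. Biases start at 0.5 so
# pre-activations begin inside the linear region rather than a flat one.
W1 = rng.normal(scale=0.5, size=(2, 2)); b1 = np.full(2, 0.5)
W2 = rng.normal(scale=0.5, size=(2, 1)); b2 = np.full(1, 0.5)
lr, eps = 0.2, 1e-7

for epoch in range(20000):
    a1 = X @ W1 + b1              # forward pass
    h = ramp(a1)
    a2 = h @ W2 + b2
    y = np.clip(ramp(a2), eps, 1 - eps)

    # Relative entropy (cross-entropy) error between targets and outputs.
    loss = -np.mean(T * np.log(y) + (1 - T) * np.log(1 - y))

    # Backward pass: dE/dy chained through the ramp subgradient in place
    # of the (undefined) derivative at the corners.
    dy = (y - T) / (y * (1 - y)) / len(X)
    d2 = dy * ramp_subgrad(a2)
    d1 = (d2 @ W2.T) * ramp_subgrad(a1)

    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

print("final loss:", round(loss, 4), "outputs:", y.ravel().round(3))
```

Because the ramp's flat regions pass a zero subgradient, a unit that saturates on the wrong side stops learning; starting the biases in the linear region and counting the corner points as part of the active region are simple conventions to mitigate this, and are not necessarily the choices made in the paper.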
This article is indexed in CNKI and other databases.
