We describe the application of tools from statistical mechanics to analyse the dynamics of various classes of supervised learning rules in perceptrons. The character of this paper is mostly that of a cross between a biased non-encyclopaedic review and lecture notes: we try to present a coherent and self-contained picture of the basics of this field, to explain the ideas and tricks, to show how the predictions of the theory compare with (simulation) experiments, and to bring together scattered results. Technical details are given explicitly in an appendix. In order to avoid distraction we concentrate the references in a final section. In addition this paper contains some new results: (i) explicit solutions of the macroscopic equations that describe the error evolution for on-line and batch learning rules; (ii) an analysis of the dynamics of arbitrary macroscopic observables (for complete and incomplete training sets), leading to a general Fokker–Planck equation; and (iii) the macroscopic laws describing batch learning with complete training sets. We close the paper with a preliminary exposé of ongoing research on the dynamics of learning for the case where the training set is incomplete (i.e. where the number of examples scales linearly with the network size).
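As a minimal illustration of the kind of (simulation) experiment the abstract refers to, the sketch below runs on-line perceptron learning of a teacher rule and tracks the generalization error. All specifics here are assumptions for illustration, not taken from the paper: the mistake-driven perceptron rule, the network size `N`, the learning rate `eta`, and the standard identity that the generalization error of a student vector J against a teacher B equals arccos of their normalized overlap divided by pi.

```python
import numpy as np

def generalization_error(J, B):
    # eps = arccos( J.B / (|J||B|) ) / pi : probability that student and
    # teacher disagree on a random input (standard perceptron result)
    overlap = J @ B / (np.linalg.norm(J) * np.linalg.norm(B))
    return np.arccos(np.clip(overlap, -1.0, 1.0)) / np.pi

rng = np.random.default_rng(42)
N = 200                                   # network size (illustrative choice)
eta = 0.1                                 # learning rate (illustrative choice)
B = rng.standard_normal(N)                # teacher weight vector
B /= np.linalg.norm(B)
J = 0.01 * rng.standard_normal(N)         # student, small random initialization

errors = []
for step in range(20 * N):                # ~20 examples per weight (on-line regime)
    xi = rng.choice([-1.0, 1.0], size=N)  # random binary input pattern
    T = np.sign(B @ xi)                   # teacher's label for this pattern
    if np.sign(J @ xi) != T:              # perceptron rule: update only on mistakes
        J += (eta / N) * T * xi
    errors.append(generalization_error(J, B))

print(f"initial error {errors[0]:.3f}, final error {errors[-1]:.3f}")
```

Starting from a nearly random student (error close to 0.5), the error decays as the student–teacher overlap grows; it is exactly this error-versus-time curve whose macroscopic evolution equations the paper solves analytically.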