Backpropagation through structure

From Wikipedia, the free encyclopedia
Technique for training recursive neural networks

Backpropagation through structure (BPTS) is a gradient-based technique for training recursive neural networks, proposed in a 1996 paper by Christoph Goller and Andreas Küchler.[1] The network is unfolded over the tree structure of each input, standard backpropagation is applied to the unfolded graph, and the gradients for the shared weights are accumulated across all node applications, analogously to backpropagation through time for sequences.
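The idea can be illustrated with a minimal sketch, not the authors' original formulation: a recursive network over binary trees applies the same weight matrix at every internal node, and the backward pass walks the tree, summing each node's contribution to the shared-parameter gradients. All names and the toy loss below are illustrative assumptions.

```python
import numpy as np

# Hypothetical minimal BPTS sketch (assumption: tanh units, binary trees,
# squared-norm toy loss). Shared parameters W, b combine two child
# vectors of size D into one parent vector of size D.
rng = np.random.default_rng(0)
D = 4  # representation size (assumption)
W = rng.standard_normal((D, 2 * D)) * 0.1
b = np.zeros(D)

def forward(tree):
    """tree: a leaf vector (np.ndarray) or a (left, right) pair.
    Returns (node_vector, cache); the cache records each node's children
    activations, its own output, and the child caches for the backward pass."""
    if isinstance(tree, np.ndarray):
        return tree, ('leaf',)
    (l, lc), (r, rc) = forward(tree[0]), forward(tree[1])
    h = np.tanh(W @ np.concatenate([l, r]) + b)
    return h, ('node', l, r, h, lc, rc)

def backward(cache, grad_h, grads):
    """Accumulate dL/dW and dL/db into grads, then push the gradient
    down to both children -- this recursion is the 'through structure' part."""
    if cache[0] == 'leaf':
        return
    _, l, r, h, lc, rc = cache
    dz = grad_h * (1.0 - h ** 2)                        # tanh derivative
    grads['W'] += np.outer(dz, np.concatenate([l, r]))  # shared-weight sum
    grads['b'] += dz
    dchild = W.T @ dz                                   # gradient w.r.t. [l; r]
    backward(lc, dchild[:D], grads)
    backward(rc, dchild[D:], grads)

# Toy example: loss L = ||root||^2, so dL/d(root) = 2 * root.
leaves = [rng.standard_normal(D) for _ in range(3)]
tree = ((leaves[0], leaves[1]), leaves[2])
root, cache = forward(tree)
grads = {'W': np.zeros_like(W), 'b': np.zeros_like(b)}
backward(cache, 2.0 * root, grads)
```

Because `W` and `b` are reused at every internal node, their gradient entries in `grads` are sums over all nodes of the tree, which is exactly what distinguishes BPTS from backpropagation on a fixed-topology network.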

References

  1. Goller, Christoph; Küchler, Andreas (1996). "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". Proceedings of International Conference on Neural Networks (ICNN'96). Vol. 1. pp. 347–352. CiteSeerX 10.1.1.49.1968. doi:10.1109/ICNN.1996.548916. ISBN 0-7803-3210-5. S2CID 6536466.


