Journal: Neural Computation

Volume 4, Issue 2

141 -- 166 Roberto Battiti. First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method
167 -- 190 Douglas A. Miller, Steven W. Zucker. Efficient Simplex-Like Methods for Equilibria of Nonsymmetric Analog Networks
191 -- 195 Joshua Alspector, Torsten Zeppenfeld, Stephan Luna. A Volatility Measure for Annealing in Feedback Neural Networks
196 -- 210 Joseph J. Atick, A. Norman Redlich. What Does the Retina Know about Natural Scenes?
211 -- 223 Christof Koch, Heinz G. Schuster. A Simple Network Showing Burst Synchronization without Frequency Locking
224 -- 233 Bruce W. Suter, Matthew Kabrisky. On a Magnitude Preserving Iterative MAXnet Algorithm
234 -- 242 Jürgen Schmidhuber. Learning Complex, Extended Sequences Using the Principle of History Compression
243 -- 248 Jürgen Schmidhuber. A Fixed Size Storage O(n^3) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks
249 -- 269 David A. Cohn, Gerald Tesauro. How Tight Are the Vapnik-Chervonenkis Bounds?
270 -- 286 Gary R. Bradski, Gail A. Carpenter, Stephen Grossberg. Working Memory Networks for Learning Temporal Order with Application to Three-Dimensional Visual Object Recognition