141 | -- | 166 | Roberto Battiti. First- and Second-Order Methods for Learning: Between Steepest Descent and Newton's Method |
167 | -- | 190 | Douglas A. Miller, Steven W. Zucker. Efficient Simplex-Like Methods for Equilibria of Nonsymmetric Analog Networks |
191 | -- | 195 | Joshua Alspector, Torsten Zeppenfeld, Stephan Luna. A Volatility Measure for Annealing in Feedback Neural Networks |
196 | -- | 210 | Joseph J. Atick, A. Norman Redlich. What Does the Retina Know about Natural Scenes? |
211 | -- | 223 | Christof Koch, Heinz G. Schuster. A Simple Network Showing Burst Synchronization without Frequency Locking |
224 | -- | 233 | Bruce W. Suter, Matthew Kabrisky. On a Magnitude Preserving Iterative MAXnet Algorithm |
234 | -- | 242 | Jürgen Schmidhuber. Learning Complex, Extended Sequences Using the Principle of History Compression |
243 | -- | 248 | Jürgen Schmidhuber. A Fixed Size Storage O(n³) Time Complexity Learning Algorithm for Fully Recurrent Continually Running Networks |
249 | -- | 269 | David A. Cohn, Gerald Tesauro. How Tight Are the Vapnik-Chervonenkis Bounds? |
270 | -- | 286 | Gary R. Bradski, Gail A. Carpenter, Stephen Grossberg. Working Memory Networks for Learning Temporal Order with Application to Three-Dimensional Visual Object Recognition |