ICNN 1988: San Diego, CA, USA
Proceedings of International Conference on Neural Networks (ICNN'88), San Diego, CA, USA, July 24-27, 1988. IEEE, 1988.
Volume 1
- Craig Gotsman, Eli Shamir, Daniel Lehmann: Asynchronous dynamics of random Boolean networks. 1-7
- Bill Baird: Bifurcation theory methods for programming static or periodic attractors and their bifurcations in dynamic neural networks. 9-16
- Victor Eliashberg: Neuron layer with reciprocal inhibition as a mechanism of random choice. 17-25
- David L. Standley, John L. Wyatt Jr.: Stability theorem for lateral inhibition networks that is robust in the presence of circuit parasitics. 27-36
- K. E. Kürten: Dynamical properties of threshold automata with nearest-neighbor interactions on a regular lattice. 37-43
- Bang W. Lee, Bing J. Sheu: An investigation on local minima of a Hopfield network for optimization circuits. 45-51
- Luzian Wolf: Recurrent nets for the storage of cyclic sequences. 53-60
- Teuvo Kohonen, György Barna, Ronald L. Chrisley: Statistical pattern recognition with neural networks: benchmarking studies. 61-68
- David J. Burr: An improved elastic net method for the traveling salesman problem. 69-76
- Daryl H. Graf, Wilf R. LaLonde: A neural controller for collision-free movement of general robot manipulators. 77-84
- Geoffrey J. Hueter: Solution of the traveling salesman problem with an adaptive ring. 85-92
- Stephen P. Luttrell: Self-organising multilayer topographic mappings. 93-100
- Walter J. Freeman: Why neural networks don't yet fly: inquiry into the neurodynamics of biological intelligence. 1-7
- James N. Templeman: Race networks: a theory of competitive recognition networks based on the rate of reactivation of neurons in cortical columns. 9-16
- Nasser M. Nasrabadi, Yushu Feng: Vector quantization of images based upon the Kohonen self-organizing feature maps. 101-108
- Helge J. Ritter, Klaus Schulten: Kohonen's self-organizing maps: exploring their computational capabilities. 109-116
- Duane DeSieno: Adding a conscience to competitive learning. 117-124
- Joachim M. Buhmann, Klaus Schulten: Invariant pattern recognition by means of fast synaptic plasticity. 125-132
- George G. Lendaris: Conceptual graph knowledge systems as problem context for neural networks. 133-140
- Bart Kosko: Feedback stability and unsupervised learning. 141-152
- Tzi-Dar Chiueh, Rodney M. Goodman: High-capacity exponential associative memories. 153-160
- Sukhan Lee, Rhee Man Kil: Multilayer feedforward potential function network. 161-171
- Fathi M. A. Salam: A formulation for the design of neural processors. 173-180
- Karen Haines, Robert Hecht-Nielsen: A BAM with increased information storage capacity. 181-190
- Olli Ventä, Teuvo Kohonen: A content-addressing software method for the emulation of neural networks. 191-198
- Robert J. T. Morris, Larry D. Rubin, Wing Shing Wong: A decentralized tunable short term neural network memory and application to tracking. 199-206
- Chan S. Bak, Michael J. Little: Memory capacity of artificial neural networks with high order node connections. 207-216
- Armin Fuchs, Hermann Haken: Pattern recognition and associative memory as dynamical processes in nonlinear systems. 217-224
- Bohdan Macukow, Henri H. Arsenault: Neural network model using a normalized inner product as a measure of similarity. 225-230
- Wen-Ran Zhang, Su-Shing Chen: A logical architecture for cognitive maps. 231-238
- M. L. Rossen, Les T. Niles, Gary N. Tajchman, Marcia A. Bush, J. A. Anderson: Training methods for a connectionist model of consonant-vowel syllable recognition. 239-246
- Cris Koutsougeras, Christos A. Papachristou: Training of a neural network for pattern classification based on an entropy measure. 247-254
- Kazumi Saito, Ryohei Nakano: Medical diagnostic expert system based on PDP model. 255-262
- Ronald J. Williams: On the use of backpropagation in associative reinforcement learning. 263-270
- Christopher L. Scofield: Learning internal representations in the Coulomb energy network. 271-276
- Erkki Oja, Teuvo Kohonen: The subspace learning algorithm as a formalism for pattern recognition and neural networks. 277-284
- Risto Miikkulainen, Michael G. Dyer: Forming global representations with extended backpropagation. 285-292
- Gregg C. Oden: FuzzyProp: a symbolic superstrate for connectionist models. 293-300
- Karen Kukich: Backpropagation topologies for sequence generation. 301-308
- Joseph C. Pemberton, Jacques J. Vidal: When is the generalized delta rule a learning rule? A physical analogy. 309-315
- Peter A. Sandon, Leonard Uhr: A local interaction heuristic for adaptive networks. 317-324
- Jocelyn Sietsma, Robert J. F. Dow: Neural net pruning - why and how. 325-333
- A. Von Lehmen, Eung Gi Paek, P. F. Liao, A. Marrakchi, Jay S. Patel: Factors influencing learning by backpropagation. 335-341
- Paul J. Werbos: Backpropagation: past and future. 343-353
- Lucio Prina Ricotti, Susanna Ragazzini, Giuseppe Martinelli: Learning of word stress in a sub-optimal second order back-propagation neural network. 355-361
- Sun-Yuan Kung, Jenq-Neng Hwang: An algebraic projection analysis for optimal hidden units size and learning rates in back-propagation learning. 363-370
- Demetri Psaltis, Mark A. Neifeld: The emergence of generalization in networks with constrained representations. 371-381
- Stefanos D. Kollias, Dimitris Anastassiou: Adaptive training of multilayer neural networks using a least squares estimation technique. 383-390
- Patrick Gallinari, Sylvie Thiria, Françoise Fogelman: Multilayer perceptrons and data analysis. 391-399
- Rodney Winter, Bernard Widrow: MADALINE RULE II: a training algorithm for neural networks. 401-408
- Kiumi Akingbehin: A decentralized algorithm for learning in adaptable networks. 409-416
- Patrice Gelband, Edison Tse: Neural 'selective' processing and learning. 417-424
- Allan Hartstein, Roger H. Koch: A self-learning threshold-controlled neural network. 425-430
- M. H. Hassoun, D. W. Clark: An adaptive attentive learning algorithm for single-layer neural networks. 431-440
- Don R. Hush, John M. Salas: Improving the learning rate of back-propagation with the gradient reuse algorithm. 441-447
- John F. Kolen: Faster learning through a probabilistic approximation algorithm. 449-454
- Dziem D. Nguyen, James S. J. Lee: A new LMS-based algorithm for rapid adaptive classification in dynamic environments: theory and preliminary results. 455-463
- J. F. Shepanski: Fast learning in artificial neural systems: multilayer perceptron training using optimal estimation. 465-472
- Donald Woods: Back and counter propagation aberrations. 473-479
- David E. van den Bout, Thomas K. Miller III: A stochastic architecture for neural nets. 481-488
- Guo-Zheng Sun, H. H. Chen, Yee-Chun Lee: Parallel sequential induction networks: a new paradigm of neural network architecture. 489-496
- James S. J. Lee, James C. Bezdek: A feature projection based adaptive pattern recognition network. 497-505
- Myungsook Klassen, Yoh-Han Pao, Victor Chen: Characteristics of the functional link net: a higher order delta rule net. 507-513
- Steven C. Suddarth, Stewart A. Sutton, Alistair D. C. Holden: A symbolic-neural method for solving control problems. 516-523
- Donald F. Specht: Probabilistic neural networks for classification, mapping, or associative memory. 525-532
- Pasi Koikkalainen, Erkki Oja: Specification and implementation environment for neural networks using communicating sequential processes. 533-540
- Elizabeth C. Botha, Etienne Barnard, David P. Casasent: Optical neural networks for image analysis: imaging spectroscopy and production systems. 541-546
- John G. Daugman: Relaxation neural network for nonorthogonal image transforms. 547-560
- Dwight D. Egbert, Edward E. Rhodes, Philip H. Goodman: Preprocessing of biomedical images for neurocomputer analysis. 561-568
- David E. Glover: An optical Fourier/electronic neurocomputer automated inspection system. 569-576
- Sandra P. Clifford, Nasser M. Nasrabadi: Integration of stereo vision and optical flow using Markov random fields. 577-584
- Thierry Troudet, Ali Tabatabai: An adaptive neural net approach to the segmentation of mixed gray-level and binary pictures. 585-592
- S. E. Troxel, Steven K. Rogers, Matthew Kabrisky: The use of neural networks in PSRI target recognition. 593-600
- John A. Vlontzos, Sun-Yuan Kung: A hierarchical system for character recognition with stochastic knowledge representation. 601-608
- Jack H. Winters: Superresolution for ultrasonic imaging in air using neural networks. 609-616
- Sheldon Gardner: Ultradiffusion, scale space transformation, and the morphology of neural networks. 617-623
- Alireza Khotanzad, Jiin-Her Lu: Distortion invariant character recognition by a multi-layer perceptron and back-propagation learning. 625-632
- Shun-ichi Amari: Statistical neurodynamics of various versions of correlation associative memory. 633-640
- Bunpei Irie, Sei Miyake: Capabilities of three-layered perceptrons. 641-648
- Martin L. Brady, Raghu Raghavan, Joseph Slawny: Gradient descent fails to separate. 649-656
- A. Ronald Gallant, Halbert White: There exists a neural network that does not make avoidable mistakes. 657-664
- Katsunori Shimohara, Tadasu Uchiyama, Yukio Tokunaga: Back-propagation networks for event-driven temporal sequence processing. 665-672
- Thomas W. Ryan: The resonance correlation network. 673-680
- Ashok K. Goel, J. Ramanujam, P. Sadayappan: Towards a 'neural' architecture for abductive reasoning. 681-688
- Chen-Han Sung, Carey E. Priebe: Temporal pattern recognition. 689-696
Volume 2
- Valeriy I. Nenov, Michael G. Dyer: DETE: connectionist/symbolic model of visual and verbal association. 17-24
- Reza Shadmehr, Gary D. Lindquist: A neural network for pattern generation in the scratch reflex. 25-32
- Judith E. Dayhoff: Temporal structure in neural networks with impulse train connections. 33-45
- Bill Betts: The toad optic tectum as a recurrent on-center off-surround neural net with quenching threshold. 47-54
- Kevin G. Kirby, Michael Conrad: Bit-vector optimization algorithms for control of learning in neurons with second-messenger dynamics. 55-62
- Thaddeus F. Pawlicki, Dar-Shyang Lee, Jonathan J. Hull, Sargur N. Srihari: Neural network models and their application to handwritten digit recognition. 63-70
- Yi-Tong Zhou, Rama Chellappa: Computation of optical flow using a neural network. 71-78
- H. Taichi Wang, Bimal Mathur, Christof Koch: A model of motion computation in primates. 79-86
- Shiro Usui, Yoshimi Kamiyama, Manabu Sakakibara: Physiological engineering model of the retinal horizontal cell layer. 87-94
- A. Jean Maren, V. Minsky, M. Ali: A multilayer cooperative/competitive method for creating hierarchical structures by clustering maximally-related nodes. 95-105
- Lawrence D. Jackel, Hans Peter Graf, Wayne E. Hubbard, John S. Denker, Don Henderson, Isabelle Guyon: An application of neural net chips: handwritten digit recognition. 107-115
- Ken Johnson, Cindy Daniell, Jerry Burman: Feature extraction in the Neocognitron. 117-126
- Ganapathy Krishnan, Deborah Walters: Psychologically plausible features for shape recognition in a neural network. 127-134
- Ted Pawlicki: Recognizing image invariants in a neural network architecture. 135-142
- Dean A. Pomerleau, George L. Gusciora, David S. Touretzky, H. T. Kung: Neural network simulation at Warp speed: how we got 17 million connections per second. 143-150
- Lex A. Akers, Mark R. Walker: A limited-interconnect synthetic neural IC. 151-158
- George A. Works: The creation of Delta: a new concept in ANS processing. 159-164
- Sun-Yuan Kung, Jenq-Neng Hwang: Parallel architectures for artificial neural nets. 165-172
- Jim Bailey, Dan Hammerstrom: Why VLSI implementations of associative VLCNs require connection multiplexing. 173-180
- Dimitris Anastassiou: Nonstandard A/D conversion based on symmetric neural networks. 181-188
- Vernon G. Dobson: Decrementing associative networks. 189-196
- James M. Goodwin, Bruce E. Rosen, Jacques J. Vidal: Exploration of learning in an associative magnetic processor. 197-204
- Shuichi Kurogi: Abilities and limitations of a neural network model for spoken word recognition. 205-214
- Stefano Nolfi, Domenico Parisi: Learning to understand sentences in a connectionist network. 215-219
- Z. G. An, Susan M. Mniszewski, Y. C. Lee, G. J. Papcun, Gary D. Doolen: HIERtalker: a default hierarchy of high order neural networks that learns to read English aloud. 221-228
- J. P. Lewis: Creation by refinement: a creativity paradigm for gradient descent learning networks. 229-233
- Manoel Fernando Tenorio, M. Daniel Tom, Richard G. Schwartz: Adaptive networks as a model for human speech development. 235-242
- Noboru Sugie, Jie Huang, Noboru Ohnishi: Localizing sound source by incorporating biological auditory mechanism. 243-250
- Jane Y. Murdock, Abdo A. Husseiny, Enju Liang, Sam A. Abolrous, Rodrigo J. Rodriguez: Improvement on speech recognition and synthesis for disabled individuals using fuzzy neural net retrofits. 251-258
- Harold Szu: Fast TSP algorithm based on binary neuron output and analog neuron input using the zero-diagonal interconnect matrix and necessary and sufficient constraints of the permutation matrix. 259-266
- M. Goldstein, Nikzad Benny Toomarian, Jacob Barhen: A comparison study of optimization methods for the bipartite matching problem (BMP). 267-273
- Yoon-Pin Simon Foo, Yoshiyasu Takefuji: Stochastic neural networks for solving job-shop scheduling. I. Problem representation. 275-282
- Yoon-Pin Simon Foo, Yoshiyasu Takefuji: Stochastic neural networks for solving job-shop scheduling. II. Architecture and simulations. 283-290
- Shailesh U. Hegde, Jeffrey L. Swee, William B. Levy: Determination of parameters in a Hopfield/Tank computational network. 291-298
- David E. van den Bout, Thomas K. Miller III: A traveling salesman objective function that works. 299-303
- E. Wacholder, J. Han, R. C. Mann: An extension of the Hopfield-Tank model for solution of the multiple traveling salesmen problem. 305-324
- J. Ramanujam, P. Sadayappan: Optimization by neural networks. 325-332
- Robert D. Brandt, Yao Wang, Alan J. Laub, Sanjit K. Mitra: Alternative networks for solving the traveling salesman problem and the list-matching problem. 333-340
- Yoon-Pin Simon Foo, Yoshiyasu Takefuji: Integer linear programming neural networks for job-shop scheduling. 341-348
- Alexander Moopenn, A. P. Thakoor, Tuan Duong: A neural network for Euclidean distance minimization. 349-356