IJCNN 1999: Washington, DC, USA
- International Joint Conference on Neural Networks, IJCNN 1999, Washington, DC, USA, July 10-16, 1999. IEEE 1999
Neural Memory, Learning and Adaptation
- David D. Vogel:
A partial model of cortical memory based on disinhibition. 1-5
- Javier Ropero Peláez, Marcelo Godoy Simões:
A computational model of synaptic metaplasticity. 6-11
- Simona Doboli, Ali A. Minai, Phillip J. Best:
Generating smooth context-dependent neural representations. 12-15
- A. N. Radchenko:
Biophysical basis of neural memory. 16-20
- Witali L. Dunin-Barkowski, Donald C. Wunsch:
Cerebellar learning: a possible phase switch in evolution. 21-26
- Rushi Bhatt, Karthik Balakrishnan, Vasant Honavar:
A hybrid model for rodent spatial learning and localization. 27-32
- K. F. Wilson, Charles F. Osborne:
A numerical exploration of a stochastic model of human list learning. 33-37
- Neill R. Taylor, John G. Taylor:
Learning to generate temporal sequences by models of frontal lobes. 38-41
- Roman Borisyuk, Frank Hoppensteadt:
Oscillatory model of the hippocampal memory. 42-45
- John G. Taylor:
A mathematical analysis of adaptive synapses. 46-51
- Robert Kozma, Walter J. Freeman:
A possible mechanism for intermittent oscillations in the KIII model of dynamic memories - the case study of olfaction. 52-57
- András Lörincz, György Buzsáki:
Computational model of the entorhinal-hippocampal region derived from a single principle. 58-63
- Peter Aszalos, Szabolcs Kéri, György Kovács, György Benedek, Zoltán Janka, András Lörincz:
Generative network explains category formation in Alzheimer patients. 64-68
Global Brain Models
- Simon Y. Berkovich:
Probing the architecture of the brain in experimentation with afterimages. 69-73
- Asim Roy:
Brain's internal mechanisms - a new paradigm. 74-79
- Thomas E. Portegys:
A connectionist model of motivation. 80-85
- Frank R. Funderburk, Karen I. Bolla:
Steps toward development of an integrated neurobiological model of cocaine misuse effects. 86-90
- John G. Taylor:
Neural networks for consciousness: the central representation. 91-96
- Shannon R. Campbell:
Rate of synchrony in locally coupled chains of relaxation oscillators. 97-102
- Robert Homer:
A neural network model of personality. 103-108
- Iren Valova, Yukio Kosugi:
Modeling higher level processing functions inherent to the human brain. 109-112
- Steven L. Bressler, Mingzhou Ding:
Coordination dynamics in large-scale cortical networks. 113-116
- Carlos Alberto Rossi, Ladislao Bodnar:
Artificial learning. 117-120
- Svetlana Levitan, Ioana Stoica, James A. Reggia:
A model of lateralization and asymmetries in cortical maps. 121-124
Vision
- Javier Ropero Peláez, Marcelo Godoy Simões:
Pattern completion through thalamo-cortical interaction. 125-130
- Karen G. Haines, John A. Moya, Thomas P. Caudell:
Modeling nonsynaptic communication between neurons in the lamina ganglionaris of Musca domestica. 131-136
- Guy D. Chun, Thomas P. Caudell:
A model of saccadic generation based on the neurobiology of the superior colliculus. 137-141
- Thaddeus A. Roppel, Denise M. Wilson, Kevin Dunman, Vlatko Becanovic, Mary Lou Padgett:
Design of a low-power, portable sensor system using embedded neural networks and hardware preprocessing. 142-145
- Jeremy B. Badler, Edward L. Keller:
Decoding of information from distributed motor maps. 146-151
- Alexander G. Sukhov, Tatiana G. Bezdudnaya:
Peculiarities of frequency-phase filtering of signals at different stages of information processing in rat barrel cortex. 152-155
- Dmitry Shaposhnikov, Lubov N. Podladchikova:
Local nonuniformity of the visual perception in the peripheral vision field. 156-159
- Kô Sakai, S. Tanaka:
Analysis of spatial nonlinear responses in cortical complex cells. 160-163
- Yu Bo, Liming Zhang:
Knowledge matching model with dynamic weights based on the primary visual cortex. 164-169
- Shigeki Nakauchi, Shiro Usui, Jussi Parkkinen, Pertti Silfsten:
Computational explanations for color transparency. 170-173
- Vitali V. Gavrik:
Single-pigment optical mechanism for color opponency in a photoreceptor cell. 174-177
- Y. Ito, Tohru Yagi, H. Kanda, S. Tanaka, M. Watanabe, Yoshiki Uchikawa:
Cultures of neurons on micro-electrode array and control of their axon growth in hybrid retinal implant. 178-181
- Pamela Abshire, Andreas G. Andreou:
Relating information capacity to a biophysical model for blowfly retina. 182-187
- Chota M. Markan, Basabi Bhaumik:
Diffusive Hebbian model for orientation map formation. 188-191
- Michael Becker, Rolf Eckmiller, Ralph Hünermann:
Psychophysical test of a tunable retina encoder for retina implants. 192-195
- Takehiko Ogawa, Takashi Minohara, Hajime Kanada, Yukio Kosugi:
Realization of geometric illusions using artificial visual model based on acute-angled expansion among crossing lines. 196-199
- Hiroshi Kume, Yuko Osana, Masafumi Hagiwara:
Solving the binding problem with feature integration theory. 200-205
Modeling of the Hippocampal Function (Special Session)
- Nestor A. Schmajuk:
The hippocampus and the brain: a neural network model. 206-209
Sensor/Motor and Neural Processes
- Mark Glezer, Yuri Shkuro, James A. Reggia:
Simulated callosal lesions in a neural model of left and right hemispheric regions. 210-214
- Natalia Shevtsova, James A. Reggia:
Lesion effects in a bihemispheric letter-identification model. 215-218
- Witali L. Dunin-Barkowski, Sergey N. Markin, Lubov N. Podladchikova, Donald C. Wunsch:
Climbing fibre Purkinje cell twins are found. 219-222
- Hiroshi Wakuya, Katsunori Shida:
Acquired sensorimotor coordinated signal transformation in a bi-directional neural network model. 223-228
- Yong Li, Ning Lan, Fusheng Yang:
The inverse identification of neuromuscular system by using neural networks in elbow function self-correcting system. 229-232
- Kazutaka Someya, Atsushi Fujita, Yoshifumi Sekine, Kazuyuki Aihara:
Chaotic phenomena of an active axon. 233-238
- I. E. Kanounikov, E. V. Antonova:
Comparison of fractal characteristics of the electroencephalogram at schoolchildren 10-12 years old in norm and with difficulties in learning. 239-242
- I. E. Kanounikov, E. V. Antonova, B. V. Kiselev, D. R. Belov:
Dependence of one of the fractal characteristics (Hurst exponent) of the human electroencephalogram on the cortical area and type of activity. 243-246
- Nicolas P. Rougier, Frédéric Alexandre:
Spatial knowledge transfer between models of hippocampus and associative cortex. 247-251
- Hervé Frezza-Buet, Frédéric Alexandre:
Modeling prefrontal functions for robot navigation. 252-257
- Shenchu Xu, Zhenxiang Chen, Jinshui Zhong, Boxi Wu, Guoqiang Gao, Hang Xiao, Jianping Wu, Yun Shi, Jian Chen, Xiaofan Yang:
The animal tests of chaotic signal therapy for epilepsy (CSTE). 258-261
- Zhenya He, Yifeng Zhang, Luxi Yang, Yuhui Shi:
Control chaos in nonautonomous cellular neural networks using impulsive control methods. 262-267
- Anthony N. Burkitt, Graeme M. Clark:
Synchronization of the neural response to noisy periodic synaptic input. 268-273
Auditory System and Language
- Guy J. Brown, DeLiang Wang:
The separation of speech from interfering sounds: an oscillatory correlation approach. 274-279
- Xugang Lu, Daowen Chen:
Integrating spatial and temporal mechanisms in auditory neural fiber's computational model. 280-283
- Bing Xiang, Xihong Wu, Zhimin Liu, Huisheng Chi:
Auditory model based speech feature extraction and its application to speaker identification. 284-287
- Michiro Negishi, Daniel Bullock, Michael Cohen:
A self-organizing two-stream model of language comprehension. 288-292
- Ryuta Fukuda, Junko Hara, William Rodman Shankle, Toshio Inui, Masaru Tomita:
Predicting human cortical connectivity for language areas using the Conel database. 293-295
- Susan L. Denham, Michael J. Denham:
Synaptic depression may explain many of the temporal response properties observed in primary auditory cortex: a computational investigation. 296-300
- Gang Wang, Miki Haseyama, Nobuo Suga:
A recurrent network model for range processing of the mustached bat. 301-304
- Anthony N. Burkitt:
Analysis of neural response for excitation-inhibition balanced networks with reversal potentials for large numbers of inputs. 305-308
The Electronic Nose (Special Session)
- Paul E. Keller:
Overview of electronic nose algorithms. 309-312
- Dumitru Dumitrescu, Beatrice Lazzerini, Francesco Marcelloni:
Olfactory signal classification based on evolutionary computation. 313-316
- Geza Szekely, Mary Lou Padgett, Gerry V. Dozier, Thaddeus A. Roppel:
Odor detection using pulse coupled neural networks. 317-321
Higher Order and Recurrent Networks
- Tertulien Ndjountche, Rolf Unbehauen, Fa-Long Luo:
Signal separation processor based on second-order statistic algorithms. 322-327
- Shuxiang Xu, Ming Zhang:
Adaptive higher-order feedforward neural networks. 328-332
- Ming Zhang, Shuxiang Xu, Bo Lu:
Neuron-adaptive higher order neural network group models. 333-336
- Satoshi Matsuda:
"Optimal" neural representation of higher order for quadratic combinatorial optimization. 337-340
- Olivier Moynot, Manuel Samuelides:
Dynamics of large random recurrent neural networks: oscillations of 2-population model. 341-344
- Habtom W. Ressom:
Modeling a compression plant using recurrent neural networks. 345-348
- Alexander G. Parlos, Omar T. Rais, Amir F. Atiya:
Multi-step-ahead prediction using dynamic recurrent neural networks. 349-352
- Yee Chin Wong, Malur K. Sundareshan:
A simplex optimization approach for recurrent neural network training and for learning time-dependent trajectory patterns. 353-358
- Devasis Bassu, James T. Lo, Justin Nave:
Training recurrent neural networks with noisy input measurements. 359-363
- James T. Lo, Devasis Bassu:
Mathematical justification of recurrent neural networks with long and short-term memories. 364-369
Functional Approximation and Prediction
- Shuxiang Xu, Ming Zhang:
Approximation to continuous functionals and operators using adaptive higher-order feedforward neural networks. 370-374
- Gerald H. L. Cheang, Andrew R. Barron:
Estimation with two hidden layer neural nets. 375-378
- E. D. Avedyan, G. V. Barkan, I. K. Levin:
Synthesis of multilayer neural networks architecture (for the case of cascaded NNs). 379-382
- Danil V. Prokhorov, Lee A. Feldkamp:
Application of SVM to Lyapunov function approximation. 383-387
- Wolfgang Eppler, Hans N. Beck:
Piecewise linear networks (PLN) for function approximation. 388-391
- Guy Ferland, Tet Hin Yeap:
Prediction of nonlinear dynamical system output with multilayer perceptron and radial basis function neural networks. 392-397
- Irwin W. Sandberg:
Time-delay polynomial networks and quality of approximation. 398-403
- Andrzej Lozowski, Rafal Komendarczyk, Jacek M. Zurada:
Hamiltonian vector field for the Lorenz invariant set. 404-408
- Thomas Hanselmann, Anthony Zaknich, Yianni Attikiouzel:
Learning functions and their derivatives using Taylor series and neural networks. 409-412
- Takamasa Koshizen, Yves Rosseel, Yoshihiro Tonegawa:
A new EM algorithm using Tikhonov regularization. 413-418
- E. Basson, Andries P. Engelbrecht:
Approximation of a function and its derivatives in feedforward neural networks. 419-421
Math Foundations I
- Evgeny E. Dudnikov:
Increase the number of stable equilibrium points in a Hopfield-type neural network. 422-426
- Hiroki Suyari, Ikuo Matsuba:
New approach to the storage capacity of neural networks using the minimum distance between input patterns. 427-431
- Xinchuan Zeng, Tony R. Martinez:
Extending the power and capacity of constraint satisfaction networks. 432-437
- Hugo de Garis, Norberto Eiji Nawa, Michael Hough, Michael Korkin:
Evolving an optimal de/convolution function for the neural net modules of ATR's artificial brain project. 438-443
- Masahiko Yoshioka, Masatoshi Shiino:
Pattern coding based on firing times in a network of spiking neurons. 444-449
- Ernst M. Kussul, Dmitri A. Rachkovskij, Donald C. Wunsch:
The random subspace coarse coding scheme for real-valued vectors. 450-455
- Howard E. Michel, Abdul Ahad S. Awwal:
Enhanced artificial neural networks using complex numbers. 456-461
- Dan Ventura:
Implementing competitive learning in a quantum system. 462-466
- Arturo Hernández Aguirre, Cris Koutsougeras, Bill P. Buckles:
On model selection in SLT and linear basis neural networks. 467-472
- Ana Maria Tomé:
The generalized eigen-decomposition approach to blind source problems. 473-476
- Kayvan Najarian, Guy Albert Dumont, Michael S. Davies:
A learning-theory-based training algorithm for variable-structure dynamic neural modeling. 477-482
- Kazunori Miyamoto, Hirohisa Aman, Torao Yanaru, Masahiro Nagamatsu:
Symbolized particles store type neuron model and its application. 483-487
- Adrian Silvescu:
Fourier neural networks. 488-491
Math Foundations II
- Michael J. Healy:
Colimits in memory: category theory and neural systems. 492-496
- Xiuwen Liu, DeLiang Wang:
A boundary-pair representation for perception modeling. 497-501
- Yan Zhang:
On logical semantics of hybrid symbolic-neural networks for commonsense reasoning. 502-505
- Toshiyuki Tanaka:
Exploration of mean-field approximation for feedforward networks. 506-509
- Arnulfo P. Azcarraga, Marlene Rose Lim:
2D order of self-organizing Kristal maps. 510-513
- Osamu Araki, Kazuyuki Aihara:
Dual coding in a network of spiking neurons: aperiodic spikes and stable firing rates. 514-518
- Sorin Draghici:
Some new results on the capabilities of integer weights neural networks in classification problems. 519-524
- Michikio Yamana, Masatoshi Shiino, Masahiko Yoshioka:
Oscillator neural network model with distributed native frequencies. 525-528
- Ján Jockusch, Helge J. Ritter:
An instantaneous topological mapping model for correlated stimuli. 529-534
- Neil M. Branston, Wael El-Deredy:
Parameter selection and state dominance in hidden Markov models of neuronal activity. 535-539
- Lei Xu:
Bayesian ying-yang supervised learning, modular models, and three layer nets. 540-545
- Lei Xu:
BYY data smoothing based learning on a small size of samples. 546-551
- Lei Xu:
Bayesian ying-yang theory for empirical learning, regularization and model selection: general formulation. 552-557
- Kazuyoshi Tsutsumi, Kazuo Nakajima:
Maximum/minimum detection by a module-based neural network with redundant architecture. 558-561
- John D. Carew, Amir Assadi, Hamid Eghbalnia:
A method for investigating the nonlinear dynamics of the human brain from analysis of functional MRI data. 562-567
Nonlinear Problems and Programming
- Satoshi Matsuda:
Theoretical analysis of quantized Hopfield network for integer programming. 568-571
- Eric Bax:
Validation of fusion through linear programming. 572-575
- Anton M. Sirota, Alexander A. Frolov, Dusan Húsek:
Nonlinear factorization in the hippocampal neural structure. 576-581
- Youshen Xia, Jun Wang:
Primal neural networks for solving convex quadratic programs. 582-587
- Jun Wang, Youshen Xia:
A dual neural network solving quadratic programming problems. 588-593
- Jaques Reifman, Earl E. Feldman:
Nonlinear programming with feedforward neural networks. 594-598
- David B. McCaughan, David A. Medler, Michael R. W. Dawson:
Internal representation in networks of nonmonotonic processing units. 599-604
- Gerhard X. Ritter, Thad W. Beaver:
Morphological perceptrons. 605-610
- Michael Ciraula, Irwin W. Sandberg:
Uniform approximation of discrete-time nonlinear systems. 611-616