Spiking Neuron Models

Author: Wulfram Gerstner

Publisher: Cambridge University Press

Published: 2002-08-15

Total Pages: 498

ISBN-13: 9780521890793

Neurons in the brain communicate by short electrical pulses, the so-called action potentials or spikes. How can we understand the process of spike generation? How can we understand information transmission by neurons? What happens if thousands of neurons are coupled together in a seemingly random network? How does the network connectivity determine the activity patterns? And, vice versa, how does the spike activity influence the connectivity pattern? These questions are addressed in this 2002 introduction to spiking neurons aimed at those taking courses in computational neuroscience, theoretical biology, biophysics, or neural networks. The approach will suit students of physics, mathematics, or computer science; it will also be useful for biologists who are interested in mathematical modelling. The text is enhanced by many worked examples and illustrations. There are no mathematical prerequisites beyond what the audience would meet as undergraduates: more advanced techniques are introduced in an elementary, concrete fashion when needed.
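
To make the opening question about spike generation concrete, here is a minimal leaky integrate-and-fire simulation, the simplest of the model classes the book treats. The parameter values, function name, and Euler integration step are illustrative assumptions of this sketch, not taken from the book.

```python
import numpy as np

def lif_simulate(I, dt=0.1, tau_m=10.0, v_rest=-65.0, v_reset=-65.0,
                 v_thresh=-50.0, R=10.0):
    """Euler integration of a leaky integrate-and-fire neuron.

    I: input current array (nA), one value per time step of length dt (ms).
    Returns the voltage trace (mV) and the spike times (ms).
    """
    v = np.full(len(I), v_rest, dtype=float)
    spikes = []
    for t in range(1, len(I)):
        # dv/dt = (-(v - v_rest) + R*I) / tau_m
        dv = (-(v[t - 1] - v_rest) + R * I[t - 1]) / tau_m
        v[t] = v[t - 1] + dt * dv
        if v[t] >= v_thresh:          # threshold crossing -> spike
            spikes.append(t * dt)
            v[t] = v_reset            # reset after the spike
    return v, spikes

# A constant 2 nA step current drives the neuron to fire regularly.
I = np.full(5000, 2.0)                # 500 ms of input
v, spike_times = lif_simulate(I)
print(f"{len(spike_times)} spikes, first at {spike_times[0]:.1f} ms")
```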

Neuronal Dynamics

Author: Wulfram Gerstner

Publisher: Cambridge University Press

Published: 2014-07-24

Total Pages: 591

ISBN-10: 1107060834

This solid introduction uses the principles of physics and the tools of mathematics to approach fundamental questions of neuroscience.

Principles of Neural Design

Author: Peter Sterling

Publisher: MIT Press

Published: 2015-05-22

Total Pages: 567

ISBN-10: 0262028700

Neuroscience research has exploded, with more than fifty thousand neuroscientists applying increasingly advanced methods. A mountain of new facts and mechanisms has emerged. And yet a principled framework to organize this knowledge has been missing. In this book, Peter Sterling and Simon Laughlin, two leading neuroscientists, strive to fill this gap, outlining a set of organizing principles to explain the whys of neural design that allow the brain to compute so efficiently. Setting out to "reverse engineer" the brain -- disassembling it to understand it -- Sterling and Laughlin first consider why an animal should need a brain, tracing computational abilities from bacterium to protozoan to worm. They examine bigger brains and the advantages of "anticipatory regulation"; identify constraints on neural design and the need to "nanofy"; and demonstrate the routes to efficiency in an integrated molecular system, phototransduction. They show that the principles of neural design at finer scales and lower levels apply at larger scales and higher levels; describe neural wiring efficiency; and discuss learning as a principle of biological design that includes "save only what is needed." Sterling and Laughlin avoid speculation about how the brain might work and endeavor to make sense of what is already known. Their distinctive contribution is to gather a coherent set of basic rules and exemplify them across spatial and functional scales.

Advances in Computational Intelligence

Author: Joan Cabestany

Publisher: Springer Science & Business Media

Published: 2011-05-30

Total Pages: 601

ISBN-10: 3642215009

This two-volume set LNCS 6691 and 6692 constitutes the refereed proceedings of the 11th International Work-Conference on Artificial Neural Networks, IWANN 2011, held in Torremolinos-Málaga, Spain, in June 2011. The 154 revised papers were carefully reviewed and selected from 202 submissions for presentation in two volumes. The first volume includes 69 papers organized in topical sections on mathematical and theoretical methods in computational intelligence; learning and adaptation; bio-inspired systems and neuro-engineering; hybrid intelligent systems; applications of computational intelligence; new applications of brain-computer interfaces; optimization algorithms in graphic processing units; computing languages with bio-inspired devices and multi-agent systems; computational intelligence in multimedia processing; and biologically plausible spiking neural processing.

Membrane Computing Models: Implementations

Author: Gexiang Zhang

Publisher: Springer Nature

Published: 2021-07-01

Total Pages: 292

ISBN-10: 9811615667

The theoretical basis of membrane computing was established in the early 2000s with fundamental research into the computational power, complexity aspects and relationships with other (un)conventional computing paradigms. Although this core theoretical research has continued to grow rapidly and vigorously, another area of investigation has since been added, focusing on the applications of this model in many areas, most prominently in systems and synthetic biology, engineering optimization, power system fault diagnosis and mobile robot controller design. The further development of these applications and their broad adoption by other researchers, as well as the expansion of the membrane computing modelling paradigm to other applications, call for a set of robust, efficient, reliable and easy-to-use tools supporting the most significant membrane computing models. This work provides comprehensive descriptions of such tools, making it a valuable resource for anyone interested in membrane computing models.
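
As a rough illustration of what a membrane computing model looks like in code, the toy sketch below performs one maximally parallel multiset-rewriting step inside a single membrane. The greedy rule ordering, the example rules, and the function names are assumptions of this sketch; the tools described in the book implement far richer variants (nested membranes, priorities, spiking neural P systems, and so on).

```python
from collections import Counter

def p_system_step(multiset, rules):
    """One maximally parallel step of a single-membrane P system.

    multiset: Counter of objects currently in the membrane.
    rules: list of (lhs, rhs) Counter pairs (lhs is consumed, rhs produced).
    Rules are applied greedily in the given order until none is applicable,
    which resolves the nondeterminism of 'real' P systems in one fixed way.
    """
    remaining = Counter(multiset)
    produced = Counter()
    for lhs, rhs in rules:
        # How many times does lhs still fit into what is left?
        times = min(remaining[obj] // n for obj, n in lhs.items())
        if times > 0:
            for obj, n in lhs.items():
                remaining[obj] -= n * times
            for obj, n in rhs.items():
                produced[obj] += n * times
    return remaining + produced

# Toy system: each 'a' becomes two 'b'; pairs of 'b' fuse into one 'c'.
rules = [(Counter("a"), Counter("bb")), (Counter("bb"), Counter("c"))]
state = Counter("aaa")
for step in range(3):
    state = p_system_step(state, rules)
    print(dict(state))        # {'b': 6}, then {'c': 3}, then {'c': 3}
```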

Pulsed Neural Networks

Author: Wolfgang Maass

Publisher: MIT Press

Published: 2001-01-26

Total Pages: 414

ISBN-13: 9780262632218

Most practical applications of artificial neural networks are based on a computational model involving the propagation of continuous variables from one processing unit to the next. In recent years, data from neurobiological experiments have made it increasingly clear that biological neural networks, which communicate through pulses, use the timing of the pulses to transmit information and perform computation. This realization has stimulated significant research on pulsed neural networks, including theoretical analyses and model development, neurobiological modeling, and hardware implementation. This book presents the complete spectrum of current research in pulsed neural networks and includes the most important work from many of the key scientists in the field. Terrence J. Sejnowski's foreword, "Neural Pulse Coding," presents an overview of the topic. The first half of the book consists of longer tutorial articles spanning neurobiology, theory, algorithms, and hardware. The second half contains a larger number of shorter research chapters that present more advanced concepts. The contributors use consistent notation and terminology throughout the book. Contributors Peter S. Burge, Stephen R. Deiss, Rodney J. Douglas, John G. Elias, Wulfram Gerstner, Alister Hamilton, David Horn, Axel Jahnke, Richard Kempter, Wolfgang Maass, Alessandro Mortara, Alan F. Murray, David P. M. Northmore, Irit Opher, Kostas A. Papathanasiou, Michael Recce, Barry J. P. Rising, Ulrich Roth, Tim Schönauer, Terrence J. Sejnowski, John Shawe-Taylor, Max R. van Daalen, J. Leo van Hemmen, Philippe Venier, Hermann Wagner, Adrian M. Whatley, Anthony M. Zador
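
A small sketch of the central idea, that pulse timing itself can carry information: below, analog values are encoded as first-spike latencies and read back from those times. The specific coding scheme, the parameter t_max, and the function names are illustrative assumptions, not an example drawn from the book.

```python
import numpy as np

def encode_latency(x, t_max=50.0):
    """Encode values in [0, 1] as first-spike latencies (ms).

    Larger values spike earlier; a value of 0 never spikes (returns inf).
    This is one simple timing code among many discussed in the literature.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, t_max * (1.0 - x), np.inf)

def decode_latency(t, t_max=50.0):
    """Invert the code: earlier spikes mean larger values."""
    t = np.asarray(t, dtype=float)
    return np.where(np.isfinite(t), 1.0 - t / t_max, 0.0)

stimulus = np.array([0.9, 0.2, 0.0, 0.55])
spike_times = encode_latency(stimulus)      # [5.0, 40.0, inf, 22.5]
print(spike_times)
print(decode_latency(spike_times))          # recovers the stimulus
```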

How to Build a Brain

Author: Chris Eliasmith

Publisher: Oxford University Press

Published: 2013-04-16

Total Pages: 475

ISBN-10: 0199794693

How to Build a Brain provides a detailed exploration of a new cognitive architecture - the Semantic Pointer Architecture - that takes biological detail seriously while addressing cognitive phenomena. Topics ranging from semantics and syntax to neural coding and spike-timing-dependent plasticity are integrated to develop the world's largest functional brain model.
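
The Semantic Pointer Architecture builds structured representations by binding vectors with circular convolution, in the spirit of holographic reduced representations. The NumPy sketch below shows that binding/unbinding step in isolation; the dimensionality, vector generation, and function names are illustrative choices of this sketch, not code from the book or from any particular library.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 512  # dimensionality of the semantic pointers (illustrative choice)

def unit_vector(dim):
    """Random unit-length vector standing in for a semantic pointer."""
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def bind(a, b):
    """Circular convolution, the binding operation for semantic pointers."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=len(a))

def inverse(a):
    """Approximate inverse under circular convolution (reverse all but element 0)."""
    return np.concatenate(([a[0]], a[-1:0:-1]))

role, filler = unit_vector(d), unit_vector(d)
bound = bind(role, filler)                 # e.g. SUBJECT bound to "dog"
recovered = bind(bound, inverse(role))     # unbinding gives a noisy copy of filler
print(np.dot(recovered, filler))           # near 1, well above chance (~1/sqrt(d))
```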

Spike-timing dependent plasticity

Author: Henry Markram

Publisher: Frontiers E-books

Published:

Total Pages: 575

ISBN-10: 2889190439

Hebb's postulate provided a crucial framework for understanding the synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which gave a logical basis for the strengthening of synapses. The weakening of synapses, however, was addressed only as "not being strengthened"; an active decrease of synaptic strength was introduced later, with the discovery of long-term depression caused by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determines not only the magnitude but also the direction of synaptic alterations when two neurons are active together. Neurons that fire together may therefore not wire together if the timing of their spikes is not tightly correlated. In the subsequent 15 years, spike-timing-dependent plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes can be induced vary across brain regions, but the core principle of spike-timing-dependent change remains. A large number of theoretical studies conducted during this period explore the computational function of this principle, and STDP algorithms have become the main learning rule used when modeling neural networks. This Research Topic brings together key experimental and theoretical research on STDP.
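
The "time windows" described above are commonly summarized by a pair-based exponential STDP rule: a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, while the reverse order weakens it. The sketch below implements that standard rule; the time constants, learning rates, and all-pairs accumulation are illustrative assumptions rather than values from any chapter of the collection.

```python
import numpy as np

def stdp_weight_change(delta_t, a_plus=0.01, a_minus=0.012,
                       tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP window.

    delta_t = t_post - t_pre (ms). Positive delta_t (pre before post)
    potentiates, negative delta_t depresses; in both cases the change
    decays exponentially with the time difference.
    """
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t >= 0,
                    a_plus * np.exp(-delta_t / tau_plus),
                    -a_minus * np.exp(delta_t / tau_minus))

def apply_stdp(w, pre_times, post_times, w_min=0.0, w_max=1.0):
    """Accumulate pairwise updates for two spike trains and clip the weight."""
    for t_pre in pre_times:
        for t_post in post_times:
            w += stdp_weight_change(t_post - t_pre)
    return float(np.clip(w, w_min, w_max))

# Pre fires 5 ms before post on every pairing -> the synapse strengthens.
pre = [10.0, 50.0, 90.0]
post = [15.0, 55.0, 95.0]
print(apply_stdp(0.5, pre, post))
```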

Theoretical Neuroscience

Author: Peter Dayan

Publisher: MIT Press

Published: 2005-08-12

Total Pages: 477

ISBN-10: 0262541858

Theoretical neuroscience provides a quantitative basis for describing what nervous systems do, determining how they function, and uncovering the general principles by which they operate. This text introduces the basic mathematical and computational methods of theoretical neuroscience and presents applications in a variety of areas including vision, sensory-motor integration, development, learning, and memory. The book is divided into three parts. Part I discusses the relationship between sensory stimuli and neural responses, focusing on the representation of information by the spiking activity of neurons. Part II discusses the modeling of neurons and neural circuits on the basis of cellular and synaptic biophysics. Part III analyzes the role of plasticity in development and learning. An appendix covers the mathematical methods used, and exercises are available on the book's Web site.
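
As a taste of the spike-train statistics that Part I builds on, the sketch below generates a homogeneous Poisson spike train and checks two of its textbook properties: the mean firing rate and a Fano factor near one. The bin sizes and function names are assumptions of this illustration, not exercises from the book.

```python
import numpy as np

rng = np.random.default_rng(1)

def poisson_spike_train(rate_hz, duration_s, dt=0.001):
    """Homogeneous Poisson spike train: in each small bin of width dt,
    a spike occurs with probability rate*dt (valid when rate*dt << 1)."""
    n_bins = int(duration_s / dt)
    spikes = rng.random(n_bins) < rate_hz * dt
    return np.nonzero(spikes)[0] * dt        # spike times in seconds

def mean_rate(spike_times, duration_s):
    """Spike-count estimate of the firing rate."""
    return len(spike_times) / duration_s

train = poisson_spike_train(rate_hz=40.0, duration_s=10.0)
print(f"estimated rate: {mean_rate(train, 10.0):.1f} Hz")    # ~40 Hz
# Fano factor of counts in 100 ms windows is ~1 for a Poisson process.
counts, _ = np.histogram(train, bins=np.linspace(0.0, 10.0, 101))
print(f"Fano factor: {counts.var() / counts.mean():.2f}")
```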