  • computation
  • Neural networks have been a subject of intense research activity over the past few decades due to their wide applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization. (hindawi.com)
  • neuron
  • A memory cell is composed of four main elements: an input gate, a neuron with a self-recurrent connection (a connection to itself), a forget gate and an output gate. (danmackinlay.name)
  • Our group develops (i) computational models for cognitive processes like decision making, and recognition and learning of complex spatiotemporal patterns, and (ii) neuronal models at several spatiotemporal scales, e.g. neural mass or single neuron models. (mpg.de)
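The four-gate memory cell described above maps directly onto the standard LSTM update equations. The sketch below is a minimal NumPy illustration under that assumption; the parameter layout (dicts W, U, b keyed by gate, where W maps the input, U the previous hidden state, and b is a bias) is hypothetical and not any particular library's API.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, W, U, b):
    """One step of an LSTM memory cell: input gate i, forget gate f,
    output gate o, and a self-recurrent cell state c."""
    i = sigmoid(W["i"] @ x + U["i"] @ h_prev + b["i"])  # input gate
    f = sigmoid(W["f"] @ x + U["f"] @ h_prev + b["f"])  # forget gate
    o = sigmoid(W["o"] @ x + U["o"] @ h_prev + b["o"])  # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h_prev + b["g"])  # candidate update
    c = f * c_prev + i * g      # self-recurrent connection: c feeds back into itself
    h = o * np.tanh(c)          # cell output
    return h, c
```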
  • 2018
  • Arana-Daniel N., Valdés-López J., Alanís A.Y., López-Franco C. (2018) Traversability Cost Identification of Dynamic Environments Using Recurrent High Order Neural Networks for Robot Navigation. (springer.com)
  • theory
  • Concepts from linear system theory were adapted to represent some aspects of neural dynamics, including solutions of simultaneous linear equations \(Y = AX\) using matrix theory, and concepts about cross-correlation. (scholarpedia.org)
  • Rovithakis, G.A., Christodoulou, M.A.: Adaptive Control with Recurrent High-Order Neural Networks: Theory and Industrial Applications. (springer.com)
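As a concrete instance of the \(Y = AX\) formulation above, the following sketch recovers \(X\) from a small square system with NumPy; the matrix values are made up for illustration.

```python
import numpy as np

# Y = A X: given the coefficient matrix A and observations Y, recover X.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
Y = np.array([5.0, 10.0])

X = np.linalg.solve(A, Y)   # exact solution when A is square and invertible
print(X)                    # [1. 3.]
```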
  • dynamics
  • Direction-of-Change Forecasting Using a Volatility-Based Recurrent Neural Network, CeNDEF Working Papers 06-16, Universiteit van Amsterdam, Center for Nonlinear Dynamics in Economics and Finance. (repec.org)
  • Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store recurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. (readbyqxmd.com)
  • delays
  • Therefore, increasing attention has been paid to the problem of stability analysis of neural networks with time-varying delays, and recently many research works have been reported for delayed neural networks and systems (see [1-17] and references therein). (hindawi.com)
  • It is also shown that memristor-based recurrent neural networks with time-varying delays are stabilizable at the origin of the state space by using a linear state feedback control law with appropriate gains. (edu.au)
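To make the stabilization claim concrete, a commonly studied model for a recurrent network with a time-varying delay \(\tau(t)\) is, as an illustrative sketch rather than the exact system of the cited work,

\[
\dot{x}(t) = -Cx(t) + A f(x(t)) + B f(x(t-\tau(t))) + u(t), \qquad u(t) = Kx(t),
\]

where \(f\) is the neuron activation and the feedback gain matrix \(K\) is chosen so that the origin \(x = 0\) of the closed-loop system is globally asymptotically stable.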
  • stability
  • Therefore, many initial findings on the global dissipativity [18, 22-30] or Lagrange stability [31-36] analysis of neural networks have been reported. (hindawi.com)
  • Hopfield
  • Robust Exponential Memory in Hopfield Networks. (readbyqxmd.com)
  • It is also shown that the memory recovery process developed for the new network can be used to greatly expand the radius of attraction of standard Hopfield networks for "incomplete" (as opposed to "noisy") prompts. (ubc.ca)
  • The mathematical analysis begins by deriving an expression for the free energy of a Hopfield network when a near-orthogonal memory set has been stored. (ubc.ca)
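A minimal Hopfield network makes the attractor and recall ideas in the excerpts above concrete: memories are stored with the classical Hebbian outer-product rule, and recall iterates the deterministic sign dynamics until the state settles into a fixed point. This is a generic textbook sketch, not the specific constructions analyzed in the cited papers.

```python
import numpy as np

def store(patterns):
    """Hebbian outer-product rule: W = (1/n) * sum_p x_p x_p^T, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, x, steps=20):
    """Synchronous updates of the deterministic dynamics x <- sign(W x)."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1          # break ties consistently
        if np.array_equal(x_new, x):   # fixed point (attractor) reached
            return x_new
        x = x_new
    return x

# Store two +-1 patterns, then recover one from a corrupted prompt.
rng = np.random.default_rng(0)
memories = rng.choice([-1.0, 1.0], size=(2, 64))
W = store(memories)
noisy = memories[0].copy()
noisy[:8] *= -1                         # flip 8 bits: a "noisy" prompt
print(np.array_equal(recall(W, noisy), memories[0]))  # likely True
```

Flipping more bits, or blanking components to mimic an "incomplete" prompt, probes how far the radius of attraction extends for a given memory load.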
  • decades
  • Over the decades, such neuronal feedback has attracted a huge amount of theoretical modeling [1-3], and one of the most prominent functions proposed for recurrent synaptic connections is that of associative memory. (springeropen.com)
  • time
  • Feedback networks are structured to have memory and a notion of "current" and "past" states, which can encode time (or whatever). (danmackinlay.name)
  • Recurrent networks, on the other hand, take as their input not just the current input example they see, but also what they have perceived previously in time. (skymind.ai)
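The "current plus past" description above reduces to a hidden state threaded through time: each step mixes the new input with a summary of everything seen so far. A minimal sketch, with hypothetical weight names and plain NumPy:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One recurrent step: the new state depends on the current input x_t
    and on what the network has already perceived, summarized in h_prev."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Run a sequence through the recurrence; h carries the past forward.
rng = np.random.default_rng(1)
W_xh, W_hh, b_h = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
h = np.zeros(4)
for x_t in rng.normal(size=(5, 3)):     # five time steps of 3-d input
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)
print(h)
```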
  • memories
  • As a result, the network actually learns a memory set which is "near-orthogonal", even though the visible components of the memories are randomly selected. (ubc.ca)
  • Finally, the theoretical maximum information content of sets of near-orthogonal memories is calculated as a function of the level of orthogonality, and is compared to the amount of information that can be stored in the new network. (ubc.ca)
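"Near-orthogonal" can be quantified by the normalized overlaps between stored patterns: an exactly orthogonal set has zero off-diagonal overlaps, while random \(\pm 1\) patterns have overlaps of order \(1/\sqrt{N}\). A small sketch of that measurement (a generic illustration, not the cited paper's construction):

```python
import numpy as np

rng = np.random.default_rng(2)
memories = rng.choice([-1.0, 1.0], size=(10, 1024))   # 10 random +-1 patterns

# Normalized pairwise overlaps: an exactly orthogonal set would give 0
# off the diagonal; random patterns give overlaps of order 1/sqrt(N).
overlaps = memories @ memories.T / memories.shape[1]
off_diag = overlaps[~np.eye(10, dtype=bool)]
print(np.abs(off_diag).max())   # small, so the set is "near-orthogonal"
```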
  • nodes
  • Both of these networks are named after the way they channel information through a series of mathematical operations performed at the nodes of the network. (skymind.ai)