  • Prediction
  • We propose a detection methodology by adapting a three-layered, deep learning architecture of (1) recurrent neural network [bi-directional long short-term memory (Bi-LSTM)] for character-level word representation to encode the morphological features of the medical terminology, (2) Bi-LSTM for capturing the contextual information of each word within a sentence, and (3) conditional random fields for the final label prediction by also considering the surrounding words. (springer.com)
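The three-layer shape described above can be sketched in NumPy. This is a minimal illustration only: plain tanh-RNN cells stand in for the Bi-LSTMs, the weights are untrained random values, and only the CRF's Viterbi decoding (not its training objective) is shown; all names and sizes here are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def bi_rnn(xs, Wf, Wb):
    """Bidirectional tanh-RNN: concatenate forward and backward states."""
    H = Wf.shape[0]
    hf, hb = np.zeros(H), np.zeros(H)
    fwd, bwd = [], []
    for x in xs:
        hf = np.tanh(Wf @ np.concatenate([x, hf]))
        fwd.append(hf)
    for x in reversed(xs):
        hb = np.tanh(Wb @ np.concatenate([x, hb]))
        bwd.append(hb)
    return [np.concatenate([f, b]) for f, b in zip(fwd, reversed(bwd))]

def viterbi(emissions, transitions):
    """CRF decoding: best tag path under emission + transition scores."""
    score = emissions[0].copy()
    back = []
    for e in emissions[1:]:
        cand = score[:, None] + transitions + e[None, :]
        back.append(cand.argmax(axis=0))
        score = cand.max(axis=0)
    path = [int(score.argmax())]
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

E, H, n_tags = 8, 6, 3
# Layer 1: char-level bi-RNN -> one vector per word (toy 3-word sentence).
sent = [[rng.normal(size=E) for _ in range(n)] for n in (4, 7, 5)]
Wf_c, Wb_c = rng.normal(size=(2, H, E + H))
word_vecs = [bi_rnn(w, Wf_c, Wb_c)[-1] for w in sent]
# Layer 2: word-level bi-RNN captures each word's sentence context.
Wf_w, Wb_w = rng.normal(size=(2, H, 2 * H + H))
ctx = bi_rnn(word_vecs, Wf_w, Wb_w)
# Layer 3: linear emissions + CRF transition scores, Viterbi decode.
W_out = rng.normal(size=(n_tags, 2 * H))
emissions = np.stack([W_out @ h for h in ctx])
trans = rng.normal(size=(n_tags, n_tags))
labels = viterbi(emissions, trans)   # one tag per word
```

The transition matrix is what lets the final prediction "also consider the surrounding words": the best tag at each position depends on the tags chosen around it, not only on that position's emission scores.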
  • Smith, C., Doherty, J. and Jin, Y. (2014) Multi-Objective Evolutionary Recurrent Neural Network Ensemble for Prediction of Computational Fluid Dynamic Simulations. In: IEEE Congress on Evolutionary Computation (CEC), 2014-07-06 - 2014-07-11, Beijing, China. (surrey.ac.uk)
  • This is a simple framework for time-series prediction of the time to the next event applicable when we have any or all of the problems of continuous or discrete time, right censoring, recurrent events, temporal patterns, time varying covariates or time series of varying lengths. (chalmers.se)
  • Martinsson, Egil (2017). WTTE-RNN: Weibull Time To Event Recurrent Neural Network, a model for sequential prediction of time-to-event in the case of discrete or continuous censored data, recurrent events or time-varying covariates. From the abstract: "In this thesis we propose a new model for predicting time to events: the Weibull Time To Event RNN." (chalmers.se)
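The core of the WTTE-RNN is its loss: at each step the network emits Weibull parameters, and training maximizes a censored Weibull log-likelihood. Below is a sketch of the continuous-time version of that likelihood (the thesis also derives a discrete-time variant, and in the full model an RNN head produces `alpha` and `beta` per step, e.g. through exponential/softplus activations; those parts are omitted here).

```python
import numpy as np

def weibull_loglik(t, u, alpha, beta):
    """Censored continuous-time Weibull log-likelihood (negate for a loss).
    t: time to event or censoring; u: 1 if the event was observed,
    0 if right-censored; alpha (scale), beta (shape): per-step outputs."""
    ya = t / alpha
    # Observed events contribute log f(t) = log(b/a) + (b-1)log(t/a) - (t/a)^b;
    # censored observations contribute only log S(t) = -(t/a)^b.
    return u * (np.log(beta) - np.log(alpha) + (beta - 1) * np.log(ya)) - ya ** beta

# With beta = 1 the Weibull reduces to an exponential with mean alpha:
ll_event = weibull_loglik(1.0, 1, 2.0, 1.0)      # log(1/2) - 1/2
ll_censored = weibull_loglik(1.0, 0, 2.0, 1.0)   # survival term only: -1/2
```

Censoring is handled simply by dropping the hazard term for unobserved events, which is what lets the same loss cover "any or all" of the censoring and recurrent-event settings listed above.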
  • networks with time-varying delays
  • Therefore, increasing attention has been paid to the problem of stability analysis of neural networks with time-varying delays, and recently many research works have been reported for delayed neural networks and systems (see [ 1 - 17 ] and references therein). (hindawi.com)
  • It is also shown that memristor-based recurrent neural networks with time-varying delays are stabilizable at the origin of the state space by using a linear state feedback control law with appropriate gains. (edu.au)
  • neuron
  • A memory cell is composed of four main elements: an input gate, a neuron with a self-recurrent connection (a connection to itself), a forget gate and an output gate. (danmackinlay.name)
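The four elements of that memory cell map directly onto the standard LSTM update. A minimal NumPy sketch of one step, with invented sizes and untrained random weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h, c, W, b):
    """One step of the memory cell described above: input gate i,
    forget gate f, output gate o, and the self-recurrent cell state c
    (the neuron whose previous value feeds back into itself)."""
    H = h.size
    z = W @ np.concatenate([x, h]) + b        # all four blocks in one matmul
    i = sigmoid(z[:H])                        # input gate
    f = sigmoid(z[H:2 * H])                   # forget gate
    o = sigmoid(z[2 * H:3 * H])               # output gate
    g = np.tanh(z[3 * H:])                    # candidate written by the input
    c = f * c + i * g                         # self-recurrent connection
    h = o * np.tanh(c)                        # gated output
    return h, c

rng = np.random.default_rng(0)
D, H = 3, 4
W, b = rng.normal(size=(4 * H, D + H)), np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):             # unroll over 5 time steps
    h, c = lstm_cell(x, h, c, W, b)
```

The line `c = f * c + i * g` is the self-recurrent connection: the forget gate decides how much of the old cell value survives, and the input gate decides how much new content is written.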
  • It is proven herein that there are 2^(2n²−n) equilibria for an n-neuron memristor-based neural network and they are located in the derived globally attractive sets. (edu.au)
  • Our group develops (i) computational models for cognitive processes like decision making, and recognition and learning of complex spatiotemporal patterns, and (ii) neuronal models at several spatiotemporal scales, e.g. neural mass or single neuron models. (mpg.de)
  • computation
  • Neural networks have been a subject of intense research activities over the past few decades due to their wide applications in many areas such as signal processing, pattern recognition, associative memories, parallel computation, and optimization solution. (hindawi.com)
  • 2018
  • Arana-Daniel N., Valdés-López J., Alanís A.Y., López-Franco C. (2018) Traversability Cost Identification of Dynamic Environments Using Recurrent High Order Neural Networks for Robot Navigation. (springer.com)
  • interval
  • This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delay, and infinite distributed time delay. (hindawi.com)
  • It is also possible that there is no equilibrium point in some situations, especially for interval recurrent neural networks with infinite distributed delays. (hindawi.com)
  • Identification
  • Recurrent High Order Neural Networks (RHONN) trained with Extended Kalman Filter (EKF) are used to identify rough terrain traversability costs, and besides the good results in the identification tasks, we get the advantages of using a robust machine learning method such as RHONNs. (springer.com)
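The "high order" in RHONN refers to the network feeding products of (sigmoided) states through the trainable weights, not just the states themselves. A hypothetical minimal form with a plain Euler step is sketched below; the cited work trains the weight matrix online with an Extended Kalman Filter, which is omitted here, and the structure shown (sigmoids plus pairwise products) is just one common choice of high-order terms.

```python
import numpy as np
from itertools import combinations

def high_order_z(x):
    """Sigmoids of each state plus their pairwise products (high-order terms)."""
    s = 1.0 / (1.0 + np.exp(-x))
    pairs = [s[i] * s[j] for i, j in combinations(range(len(s)), 2)]
    return np.concatenate([s, np.array(pairs)])

def rhonn_step(x, u, a, W, dt=0.01):
    """Euler step of x' = -a*x + W @ z(x) + u, a common RHONN form."""
    return x + dt * (-a * x + W @ high_order_z(x) + u)

rng = np.random.default_rng(0)
n = 2                                   # z(x) has 2 sigmoids + 1 pair product
W = 0.1 * rng.normal(size=(n, 3))
x = np.zeros(n)
for _ in range(200):                    # drive with a constant input
    x = rhonn_step(x, np.array([0.5, -0.5]), a=1.0, W=W)
```

For identification, `W` would be adapted (here by an EKF) so that the model state tracks measured quantities such as the terrain traversability cost.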
  • spatial
  • Anderson (1968) initially described his intuitions about neural pattern recognition using a spatial cross-correlation function. (scholarpedia.org)
  • However, neural networks usually have a spatial extent due to the presence of a multitude of parallel pathways with a variety of axon sizes and lengths. (hindawi.com)
  • patterns
  • Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. (readbyqxmd.com)
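The attractor storage just described can be demonstrated with a tiny Hopfield network: Hebbian outer-product weights store ±1 patterns, and a corrupted probe then relaxes back to the stored memory. The two example patterns below are my own (chosen orthogonal so the toy demo recovers cleanly); real capacity analysis, including the exponential-capacity question raised above, is far more subtle.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian outer-product rule over ±1 patterns, zero self-connections."""
    P = np.array(patterns, dtype=float)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, max_steps=20):
    """Synchronous sign updates until the state stops changing (an attractor)."""
    s = np.array(state, dtype=float)
    for _ in range(max_steps):
        new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(new, s):
            break
        s = new
    return s

p1 = np.array([1, -1, 1, -1, 1, -1, 1, -1], dtype=float)
p2 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
W = hebbian_weights([p1, p2])
noisy = p1.copy()
noisy[0] = -noisy[0]                 # flip one bit
recovered = recall(W, noisy)         # relaxes back to the stored pattern
```

Each stored pattern is a fixed point of the update, and nearby (noise-corrupted) states fall into its basin of attraction; that noise tolerance is exactly what makes the dynamics usable as associative memory.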
  • theory
  • Concepts from linear system theory were adapted to represent some aspects of neural dynamics, including solutions of simultaneous linear equations \(Y = AX\) using matrix theory, and concepts about cross-correlation. (scholarpedia.org)
  • Rovithakis, G.A., Christodoulou, M.A.: Adaptive Control with Recurrent High-Order Neural Networks: Theory and Industrial Applications. (springer.com)
  • continuous
  • Continuous-nonlinear network laws typically arose from an analysis of behavioral or neural data. (scholarpedia.org)
  • Decoding continuous hind limb joint angles from sensory recordings of neural system provides a feedback for closed-loop control of hind limb movement using functional electrical stimulation. (readbyqxmd.com)
  • stability
  • Therefore, many initial findings on the global dissipativity [ 18 , 22 - 30 ] or Lagrange stability [ 31 - 36 ] analysis of neural networks have been reported. (hindawi.com)
  • decades
  • Over decades such neuronal feedback has attracted a huge amount of theoretical modeling [ 1 - 3 ], and one of the most prominent functions proposed for the recurrent synaptic connections is that of associative memory. (springeropen.com)
  • paper
  • In this paper, a neural network methodology is presented for learning traversability cost maps to aid autonomous robotic navigation. (springer.com)
  • output
  • Here's a diagram of an early, simple recurrent net proposed by Elman, where the BTSXPE at the bottom of the drawing represents the input example in the current moment, and CONTEXT UNIT represents the output of the previous moment. (skymind.ai)
  • approach
  • However, from a practical point of view, it is not always the case that the neural network trajectories will approach a single equilibrium point; that is, the equilibrium point may be unstable. (hindawi.com)
  • time
  • Recurrent networks, on the other hand, take as their input not just the current input example they see, but also what they have perceived previously in time. (skymind.ai)
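That Elman-style context loop is easy to sketch (names and sizes below are invented, weights untrained): the hidden state at each moment mixes the current input with a context copy of the previous moment's hidden output, so identical inputs produce different states when the history differs.

```python
import numpy as np

def elman_step(x, context, W_in, W_ctx, b):
    """One Elman step: current input plus CONTEXT UNIT (previous hidden output)."""
    return np.tanh(W_in @ x + W_ctx @ context + b)

rng = np.random.default_rng(1)
D, H = 2, 3
W_in = rng.normal(size=(H, D))
W_ctx = rng.normal(size=(H, H))
b = np.zeros(H)

seq_a = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # histories differ,
seq_b = [np.array([0.0, 1.0]), np.array([0.0, 1.0])]   # final input is the same
ctx_a = np.zeros(H)
ctx_b = np.zeros(H)
for xa, xb in zip(seq_a, seq_b):
    ctx_a = elman_step(xa, ctx_a, W_in, W_ctx, b)       # output feeds back
    ctx_b = elman_step(xb, ctx_b, W_in, W_ctx, b)       # as the next context
```

After the loop, both sequences ended on the same input, yet `ctx_a` and `ctx_b` differ, because each state carries what the network "perceived previously in time."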