Search results

Results 21 – 40 of 47
  • ...tensive operations commonly associated with [[deep learning]] and [[neural networks]], such as matrix multiplications and convolutions. TPU slices are integral * [[Convolutional Neural Networks]] (CNNs) for image classification and object detection
    3 KB (416 words) - 22:23, 21 March 2023
  • * '''Recurrent Neural Networks (RNNs)''': A class of neural networks with loops that allow information to persist over time, making them well-su * '''Long Short-Term Memory (LSTM) networks''': A type of RNN specifically designed to address the vanishing gradient p
    4 KB (598 words) - 15:46, 19 March 2023
  • '''tf.keras''' is a high-level neural networks API, integrated within the [[TensorFlow]] machine learning framework. Devel ...model''': A linear stack of layers, suitable for simple feedforward neural networks.
    4 KB (566 words) - 22:28, 21 March 2023
  • ...utational challenges posed by the increasing size and complexity of modern neural network models. It involves the concurrent execution of different parts of ...useful for models with a sequential structure, such as [[Recurrent Neural Networks (RNNs)]] or [[Transformers]].
    4 KB (597 words) - 13:23, 18 March 2023
  • * [[Recurrent Neural Networks]] (RNNs): A type of neural network specifically designed for handling sequential data. RNNs have feedb * [[Long Short-Term Memory]] (LSTM) networks: A variant of RNNs that addresses the vanishing gradient problem, enabling
    3 KB (483 words) - 12:18, 19 March 2023
  • ...dvanced architectures like long short-term memory (LSTM) networks or gated recurrent units (GRUs). The outputs of these layers are combined to create a context-
    3 KB (453 words) - 13:14, 18 March 2023
  • [[Keras]] is an open-source, high-level neural networks [[API]] (Application Programming Interface) designed to simplify the proces ...erstand APIs. This enables developers to build, train, and evaluate neural networks with only a few lines of code, simplifying the development process and lowe
    4 KB (562 words) - 05:02, 20 March 2023
  • ...nt attention in recent years due to the advancements in [[recurrent neural networks]] (RNNs) and other sequence modeling techniques. A sequence-to-sequence model is a type of neural network architecture designed specifically to handle input and output seque
    3 KB (480 words) - 13:28, 18 March 2023
  • ...standing. Google Blog. https://ai.googleblog.com/2017/08/transformer-novel-neural-network.html?m=1</ref> As of 2022, the paper has become one of the most cit ...were based on a complex [[recurrent neural network]] or a [[convolutional neural network]] that included an encoder and decoder. The top performing models a
    7 KB (904 words) - 16:58, 16 June 2023
  • * [[Convolutional neural networks]] (CNNs), which utilize multi-dimensional tensors to represent images and p * [[Recurrent neural networks]] (RNNs), which use tensors to store and process sequences of data.
    3 KB (473 words) - 22:25, 21 March 2023
  • ...tistical NLU include [[Hidden Markov Models]], [[n-grams]], and [[Bayesian Networks]]. ...as [[Recurrent Neural Networks]] (RNNs), [[Long Short-Term Memory]] (LSTM) networks, and [[Transformer]]-based architectures like [[GPT]] and [[BERT]] have ach
    4 KB (566 words) - 13:23, 18 March 2023
  • ...s, such as [[Convolutional Neural Networks]] (CNNs) and [[Recurrent Neural Networks]] (RNNs), which require significant computational power. GPUs have become a
    3 KB (498 words) - 19:16, 19 March 2023
  • ...graphical information. Machine learning models, like convolutional neural networks (CNNs), have been developed for tasks like image classification, object det ...ic, and other audio signals. Machine learning models like recurrent neural networks (RNNs) and [[WaveNet]] have been employed to perform tasks such as speech r
    4 KB (564 words) - 13:22, 18 March 2023
  • ...=Sepp|last2=Schmidhuber|first2=Jürgen|title=Long short-term memory|journal=Neural Computation|date=1997|volume=9|issue=8|pages=1735–1780|doi=10.1162/neco.1 LSTM networks consist of a series of memory cells, which are designed to store and manipu
    4 KB (567 words) - 12:13, 19 March 2023
  • ...s, such as [[Convolutional Neural Networks]] (CNNs) and [[Recurrent Neural Networks]] (RNNs), are particularly adept at learning representations from raw data.
    3 KB (477 words) - 01:14, 21 March 2023
  • ...g models, such as [[convolutional neural networks]] and [[recurrent neural networks]], where complex data structures are required.
    4 KB (561 words) - 13:24, 18 March 2023
  • ...s, such as [[convolutional neural networks]] (CNNs) and [[recurrent neural networks]] (RNNs). Their efficient tensor computation capabilities make them ideal f
    3 KB (476 words) - 22:22, 21 March 2023
  • ...]] models such as Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN).
    4 KB (534 words) - 13:27, 18 March 2023
  • ...e to traditional recurrent neural networks (RNNs) and convolutional neural networks (CNNs) for natural language processing tasks. The key innovation in the Tra
    4 KB (548 words) - 13:11, 18 March 2023
  • ...lutional Neural Networks]] (CNNs) for image processing, [[Recurrent Neural Networks]] (RNNs) for sequential data, and [[Transformer]]-based models for natural
    4 KB (548 words) - 13:23, 18 March 2023