  1. We propose a general Convolutional Neural Network (CNN) encoder model for machine translation that fits within the framework of Encoder-Decoder models proposed by Cho et al. [1]. A CNN takes as input a sentence in the source language, performs multiple convolution and pooling operations, and uses a fully … (a minimal sketch of such a CNN encoder follows the results list)

  2. An NMT model consists of an encoder that generates intermediate representations from source-language sentences and a decoder that predicts target-language sentences from those intermediate representations.

  3. Encoders-Decoders, Sequence to Sequence Architecture. - Medium

    Mar 10, 2021 · The encoder-decoder architecture for recurrent neural networks is the standard neural machine translation method that rivals and in some cases outperforms classical statistical machine...

  4. Demystifying Encoder Decoder Architecture & Neural Network

    Jan 12, 2024 · Machine translation: One of the most common applications of the Encoder-Decoder architecture is Machine Translation, where a sequence of words in one language (as shown above, with an encoder-decoder RNN) is translated into another language.

  5. Neural machine translation with a Transformer and Keras

    May 31, 2024 · The encoder and decoder. Build & train the Transformer. Generate translations. Export the model. To get the most out of this tutorial, it helps if you know about the basics of text generation and attention mechanisms. A Transformer is a sequence-to-sequence encoder-decoder model similar to the model in the NMT with attention tutorial. A single ... (a Transformer-block sketch follows the results list)

  6. Encoder-Decoder Recurrent Neural Network Models for Neural Machine

    Aug 7, 2019 · The encoder-decoder architecture for recurrent neural networks is the standard neural machine translation method that rivals and in some cases outperforms classical statistical machine translation methods.

  7. How to Develop an Encoder-Decoder Model for Sequence-to …

    Aug 27, 2020 · The encoder-decoder model provides a pattern for using recurrent neural networks to address challenging sequence-to-sequence prediction problems such as machine translation. Encoder-decoder models can be developed in the Keras Python deep learning library, and an example of a neural machine translation system developed with this model has been ... (a minimal Keras encoder-decoder sketch follows the results list)

  8. Transformers to the rescue: alleviating data scarcity in arabic ...

    1 day ago · Current methodologies predominantly rely on neural machine translation (NMT) models, which are hindered by the scarcity of adequately annotated training data. ... who employed an encoder–decoder model with nine CNN layers and an attention mechanism. They incrementally trained their model on words with rare segmentations to overcome the encoder–decoder ...

  9. Multi-scale convolutional transformer network for motor imagery …

    Apr 15, 2025 · Brain-computer interface (BCI) systems allow users to communicate with external devices by translating neural signals into real-time commands. Convolutional neural networks (CNNs) have been ...

  10. Image captioning deep learning model using ResNet50 encoder

    Apr 16, 2025 · This paper presents an attention-based Encoder-Decoder deep architecture that makes use of convolutional features extracted from a CNN model pre-trained on ImageNet (Xception), together with ...
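
Result 1 describes a CNN sentence encoder: embed the source tokens, apply stacked convolution and pooling, and project to a fixed-length sentence encoding with a fully connected layer. The sketch below is an illustrative Keras reconstruction under assumed sizes; the vocabulary, embedding width, and filter counts are placeholders, not values from the cited paper.

```python
import tensorflow as tf

VOCAB_SIZE = 8000   # assumed source-vocabulary size (placeholder)
EMBED_DIM = 256     # assumed embedding width (placeholder)
MAX_LEN = 40        # assumed maximum source-sentence length (placeholder)

source_tokens = tf.keras.Input(shape=(MAX_LEN,), dtype="int32")
x = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)(source_tokens)

# Stacked convolution + pooling over the token positions
x = tf.keras.layers.Conv1D(256, kernel_size=3, padding="same", activation="relu")(x)
x = tf.keras.layers.MaxPooling1D(pool_size=2)(x)
x = tf.keras.layers.Conv1D(256, kernel_size=3, padding="same", activation="relu")(x)
x = tf.keras.layers.GlobalMaxPooling1D()(x)

# Fully connected layer producing the fixed-length sentence encoding
sentence_encoding = tf.keras.layers.Dense(512, activation="tanh")(x)

cnn_encoder = tf.keras.Model(source_tokens, sentence_encoding, name="cnn_sentence_encoder")
cnn_encoder.summary()
```

In an encoder-decoder setup, `sentence_encoding` would then condition a decoder that generates the target sentence.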
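Result 5 refers to a Transformer, i.e. a sequence-to-sequence encoder-decoder built from attention blocks rather than recurrence. Below is a minimal sketch of one encoder block and one decoder block using tf.keras.layers.MultiHeadAttention; the layer widths are assumptions, the use_causal_mask argument requires a reasonably recent TensorFlow (2.10 or later), and positional encodings, dropout, and embeddings are omitted for brevity.

```python
import tensorflow as tf

D_MODEL, NUM_HEADS, FF_DIM = 128, 4, 512   # assumed sizes (placeholders)

def encoder_block(x):
    """One Transformer encoder block: self-attention + feed-forward, with residuals."""
    attn = tf.keras.layers.MultiHeadAttention(NUM_HEADS, D_MODEL // NUM_HEADS)(x, x)
    x = tf.keras.layers.LayerNormalization()(x + attn)
    ff = tf.keras.layers.Dense(FF_DIM, activation="relu")(x)
    ff = tf.keras.layers.Dense(D_MODEL)(ff)
    return tf.keras.layers.LayerNormalization()(x + ff)

def decoder_block(x, enc_out):
    """One Transformer decoder block: causal self-attention, cross-attention, feed-forward."""
    self_attn = tf.keras.layers.MultiHeadAttention(NUM_HEADS, D_MODEL // NUM_HEADS)(
        x, x, use_causal_mask=True   # causal mask over the target prefix (TF >= 2.10)
    )
    x = tf.keras.layers.LayerNormalization()(x + self_attn)
    # Cross-attention: the decoder queries the encoder output
    cross = tf.keras.layers.MultiHeadAttention(NUM_HEADS, D_MODEL // NUM_HEADS)(x, enc_out)
    x = tf.keras.layers.LayerNormalization()(x + cross)
    ff = tf.keras.layers.Dense(FF_DIM, activation="relu")(x)
    ff = tf.keras.layers.Dense(D_MODEL)(ff)
    return tf.keras.layers.LayerNormalization()(x + ff)

# Usage sketch: encode the (already embedded) source, decode the target prefix against it.
src = tf.keras.Input(shape=(None, D_MODEL))
tgt = tf.keras.Input(shape=(None, D_MODEL))
memory = encoder_block(src)
decoded = decoder_block(tgt, memory)
toy_transformer = tf.keras.Model([src, tgt], decoded)
```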
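Results 6 and 7 describe the standard encoder-decoder recurrent model for NMT and note that it can be built in Keras. The sketch below shows only the training-time graph (teacher forcing, with the target shifted right as decoder input); inference would additionally need a token-by-token decoding loop, and all sizes are assumed placeholders.

```python
import tensorflow as tf

SRC_VOCAB, TGT_VOCAB = 8000, 8000   # assumed vocabulary sizes (placeholders)
EMBED_DIM, HIDDEN = 256, 512        # assumed model widths (placeholders)

# Encoder: reads the source sentence and keeps only its final LSTM state,
# i.e. the "intermediate representation" mentioned in results 2 and 4.
enc_in = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(SRC_VOCAB, EMBED_DIM)(enc_in)
_, state_h, state_c = tf.keras.layers.LSTM(HIDDEN, return_state=True)(enc_emb)

# Decoder: predicts the target sentence conditioned on the encoder state.
dec_in = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(TGT_VOCAB, EMBED_DIM)(dec_in)
dec_out = tf.keras.layers.LSTM(HIDDEN, return_sequences=True)(
    dec_emb, initial_state=[state_h, state_c]
)
logits = tf.keras.layers.Dense(TGT_VOCAB)(dec_out)

seq2seq = tf.keras.Model([enc_in, dec_in], logits)
seq2seq.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```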
