  1. Backpropagation in Neural Network - GeeksforGeeks

Apr 5, 2025 · Backpropagation is a technique used in deep learning to train artificial neural networks, particularly feed-forward networks. It works iteratively to adjust weights and biases to minimize the cost function. In each epoch, the model adapts these parameters, reducing loss by following the error gradient.


  2. What is a backpropagation algorithm? - TechTarget

    A backpropagation algorithm, or backward propagation of errors, is an algorithm that's used to help train neural network models. The algorithm adjusts the network's weights to minimize any gaps -- referred to as errors -- between predicted outputs and the actual target output.


  3. Backpropagation

    1. Identify intermediate functions (forward prop)
    2. Compute local gradients
    3. Combine with upstream error signal to get full gradient
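The three steps above can be sketched for a single sigmoid unit; the concrete numbers here (input 0.5, upstream gradient 2.0) are illustrative assumptions, not taken from any of the results:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 1. Forward prop: identify the intermediate function and its value
x = 0.5
y = sigmoid(x)

# 2. Local gradient of the sigmoid: dy/dx = y * (1 - y)
local_grad = y * (1.0 - y)

# 3. Combine with the upstream error signal dL/dy to get the full gradient dL/dx
upstream = 2.0          # assumed upstream gradient arriving from the loss
full_grad = upstream * local_grad
```

The same three-step pattern repeats at every node of a computation graph; the product of local and upstream gradients is exactly the chain rule.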


  4. Backpropagation - Wikipedia

    In machine learning, backpropagation is a gradient estimation method commonly used for training a neural network to compute its parameter updates. It is an efficient application of the chain rule to neural networks.


  5. 14 Backpropagation – Foundations of Computer Vision

Backpropagation is an algorithm that efficiently calculates the gradient of the loss with respect to each and every parameter in a computation graph. It relies on a special new operation, called backward, that, just like forward, can be defined for each layer and acts in isolation from the rest of …
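The per-layer forward/backward pairing this excerpt describes can be sketched as a hypothetical fully connected layer; the class name, shapes, and initialization are assumptions for illustration:

```python
import numpy as np

class Linear:
    """A layer defining both forward and backward, each acting in
    isolation from the rest of the computation graph (illustrative sketch)."""

    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, (n_in, n_out))
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x                       # cache the input for backward
        return x @ self.W + self.b

    def backward(self, grad_out):
        # Combine local gradients with the upstream signal grad_out
        self.grad_W = self.x.T @ grad_out
        self.grad_b = grad_out.sum(axis=0)
        return grad_out @ self.W.T       # gradient w.r.t. this layer's input
```

Stacking such layers and calling backward in reverse order of the forward pass yields the gradient of the loss with respect to every parameter.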

  6. In this lecture we will discuss the task of training neural networks using the Stochastic Gradient Descent algorithm. Even though we cannot guarantee this algorithm will converge to the optimum, state-of-the-art results are often obtained with it, and it has become a benchmark algorithm for ML.
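The training procedure the lecture refers to can be sketched as plain SGD on a one-parameter least-squares fit; the data, learning rate, and epoch count below are all assumed for illustration:

```python
import random

# Fit y = w * x to data generated from y = 3x, minimizing squared error
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]
w = 0.0
lr = 0.1

random.seed(0)
for epoch in range(200):
    random.shuffle(data)               # "stochastic": visit samples in random order
    for x, y in data:
        grad = 2.0 * (w * x - y) * x   # d/dw of (w*x - y)^2
        w -= lr * grad                 # gradient descent step per sample

# w ends up close to 3.0, though SGD offers no general optimality guarantee
```

In a neural network the scalar gradient above is replaced by the full parameter gradient that backpropagation computes, but the update rule is the same.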


  7. Back Propagation in Neural Network: Machine Learning Algorithm

Jun 12, 2024 · The backpropagation algorithm in a neural network computes the gradient of the loss function for a single weight by the chain rule. It efficiently computes one layer at a time, unlike a naive direct computation.
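Computing the gradient for a single weight by the chain rule, one factor per layer, can be sketched and checked numerically; the two-weight tanh network below is a made-up example:

```python
import math

# Hypothetical tiny network: h = tanh(w1 * x), y = w2 * h, loss = (y - t)^2
x, t = 1.0, 0.5
w1, w2 = 0.3, -0.2

def loss(w1, w2):
    return (w2 * math.tanh(w1 * x) - t) ** 2

# Chain rule for the single weight w1, one factor per layer:
h = math.tanh(w1 * x)
dL_dy = 2.0 * (w2 * h - t)        # loss layer
dy_dh = w2                        # output layer
dh_dw1 = (1.0 - h * h) * x        # tanh layer: d tanh(z)/dz = 1 - tanh(z)^2
dL_dw1 = dL_dy * dy_dh * dh_dw1

# Central finite difference as an independent check
eps = 1e-6
numeric = (loss(w1 + eps, w2) - loss(w1 - eps, w2)) / (2 * eps)
```

The analytic and numeric values agree to several decimal places, which is the standard sanity check for a hand-derived backprop gradient.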


  8. Backpropagation Process - Tpoint Tech - Java

Mar 17, 2025 · Backpropagation algorithms are a set of methods used to efficiently train artificial neural networks following a gradient descent approach that exploits the chain rule.

  9. Backpropagation ("backprop" for short) is a way of computing the partial derivatives of a loss function with respect to the parameters of a network; we use these derivatives in gradient descent …


  10. What is Backpropagation Algorithm - Online Tutorials Library

Feb 15, 2022 · The backpropagation algorithm is used to train a neural network more effectively through the chain rule. The resulting gradient is used in a simple stochastic gradient descent algorithm to find weights that minimize the error. The error propagates backward from the output nodes to the inner nodes.
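The backward flow described here, from the output nodes to the inner nodes, can be sketched as a reverse loop over a chain of sigmoid units; the weights and target are assumed values for illustration:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Chain of two sigmoid nodes: a1 = sigmoid(w1 * x), a2 = sigmoid(w2 * a1)
x, target = 1.0, 0.8
weights = [0.4, 0.7]

# Forward pass, caching every activation
acts = [x]
for w in weights:
    acts.append(sigmoid(w * acts[-1]))

# Squared-error signal at the output node
delta = 2.0 * (acts[-1] - target)

# Backward pass: the error propagates from the output node inward
grads = [0.0] * len(weights)
for i in reversed(range(len(weights))):
    a_out, a_in = acts[i + 1], acts[i]
    local = a_out * (1.0 - a_out)       # sigmoid local gradient
    grads[i] = delta * local * a_in     # gradient for weight i
    delta = delta * local * weights[i]  # pass the error one node deeper
```

Each iteration reuses the error signal computed one node closer to the output, which is what makes backpropagation cheaper than differentiating every weight independently.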

