Computational Graphs

We can use this kind of graph to represent any function, where the nodes of the graph are the intermediate steps of the computation.

  • The inputs here are x and W.
  • The multiplication node represents the matrix multiplication x*W.
  • The “hinge loss” node computes the hinge loss (data) term, and the “R” node computes the regularization term R.
  • Finally, the total loss is the sum of the regularization term and the data term (see the sketch after this list).
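
To make this concrete, here is a minimal sketch of the forward pass through this graph, assuming a multiclass SVM hinge loss with L2 regularization; the function name and the `reg` parameter are illustrative, not from the original:

```python
import numpy as np

def svm_loss_forward(x, W, y, reg=1e-3):
    """Forward pass through the computational graph for a linear SVM.

    x: input vector of shape (D,), W: weight matrix of shape (D, C),
    y: index of the correct class, reg: regularization strength (assumed).
    """
    scores = x.dot(W)                                 # multiplication node: x*W
    margins = np.maximum(0, scores - scores[y] + 1)   # hinge loss node
    margins[y] = 0                                    # correct class contributes no loss
    data_loss = np.sum(margins)                       # data term
    reg_loss = reg * np.sum(W * W)                    # regularization node R
    return data_loss + reg_loss                       # total loss: sum of both terms
```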

Once we can represent our function as a computational graph, we can implement a technique called backpropagation, which recursively applies the chain rule to compute the gradient of the output with respect to every variable in the computational graph.
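
As an illustration of how the chain rule walks backward through a graph, consider a simpler function than the SVM loss, f(x, y, z) = (x + y) * z; the values and variable names below are hypothetical:

```python
# Forward pass through a tiny graph: f(x, y, z) = (x + y) * z
x, y, z = -2.0, 5.0, -4.0
q = x + y          # addition node
f = q * z          # multiplication node

# Backward pass: apply the chain rule from the output back to each input.
df_df = 1.0                 # gradient of f with respect to itself
df_dq = z * df_df           # multiply node: local gradient w.r.t. q is z
df_dz = q * df_df           # multiply node: local gradient w.r.t. z is q
df_dx = 1.0 * df_dq         # add node routes the gradient to both inputs
df_dy = 1.0 * df_dq

print(df_dx, df_dy, df_dz)  # -4.0 -4.0 3.0
```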

This becomes useful once we start working with more complex functions such as a Convolutional Neural Network. For example, we have an input at the top and a loss at the bottom, and the input has to pass through many layers of transformations to reach the loss function.
