This is simply wrong. Backprop has the same asymptotic time complexity as the forward pass.
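To see why, here is a minimal numpy sketch (assuming a plain tanh MLP; the layer widths are made up): each layer's forward step is one matmul, and its backward step is two matmuls of the same shapes, so backprop costs roughly a constant 2x more, not a different complexity class.

    import numpy as np

    def forward(Ws, x):
        # One matmul per layer: cost ~ n_out * n_in each.
        acts = [x]
        for W in Ws:
            acts.append(np.tanh(W @ acts[-1]))
        return acts

    def backward(Ws, acts, grad_out):
        # Two matmuls of the *same shapes* per layer:
        #   dW = outer(delta, h)  (n_out x n_in, same size as W)
        #   dh = W.T @ delta      (same matmul shape as forward)
        delta = grad_out * (1 - acts[-1] ** 2)  # through the last tanh
        grads = []
        for W, h in zip(reversed(Ws), reversed(acts[:-1])):
            grads.append(np.outer(delta, h))
            delta = (W.T @ delta) * (1 - h ** 2)  # final delta (w.r.t. the raw input) is never used
        return grads[::-1]

    sizes = [784, 256, 10]  # hypothetical widths
    rng = np.random.default_rng(0)
    Ws = [0.1 * rng.normal(size=(b, a)) for a, b in zip(sizes[:-1], sizes[1:])]
    acts = forward(Ws, rng.normal(size=sizes[0]))
    grads = backward(Ws, acts, grad_out=acts[-1])  # stand-in loss gradient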
bobmarleybiceps•6mo ago
I think they're misusing "forward propagation" and "backward propagation" to basically mean "post-training inference" and "training".
They seem to be assuming n iterations of the backward pass, which is why it's larger...
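If that reading is right, the comparison the article is making would be roughly (T, L, n here are my notation for iteration count, depth, and width, not necessarily theirs):

    training ≈ $T \, (C_{\mathrm{fwd}} + C_{\mathrm{bwd}}) = O(T L n^2)$   vs.   inference ≈ $C_{\mathrm{fwd}} = O(L n^2)$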
vrighter•6mo ago
n iterations would just be a constant factor (assuming the iteration count doesn't grow with the network), which is omitted from asymptotic complexity
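Spelled out: as long as T is independent of the network size, $O(T L n^2) = O(L n^2)$; the iterations change wall-clock time, not the growth rate in n and L.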
constantcrying•6mo ago
"If we once again assume that there are the same number of neurons in each layer, and that the number of layers equal the number of neurons in each layer we find:"
These are terrible assumptions.
Why not compute the runtime as the product of the actual layer sizes? Then the comparison would also make more sense.
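A quick sketch of that suggestion, with made-up layer widths; the "simplified" line reproduces the article's assumption (n layers of n neurons each, so n · n² = n³ per pass):

    # Hypothetical layer widths; replace with the network's actual ones.
    sizes = [784, 512, 128, 10]

    # Per-pass cost as the sum of products of consecutive layer sizes.
    actual = sum(a * b for a, b in zip(sizes[:-1], sizes[1:]))

    # The article's simplification: n layers of n neurons each -> n * n**2 = n**3.
    n = 256
    simplified = n ** 3

    print(f"actual: {actual:,} MACs vs simplified: {simplified:,} MACs")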