Question
Hello Everyone, following is my question:
Suppose you had a neural network with linear activation functions. That is, for each unit the output is some constant c times the weighted sum of the inputs.
a. Assume that the network has one hidden layer. For a given assignment to the weights w, write down equations for the value of the units in the output layer as a function of w and the input layer x, without any explicit mention of the output of the hidden layer. Show that there is a network with no hidden units that computes the same function.
b. Repeat the calculation in part (a), but this time do it for a network with any number of hidden layers.
c. Suppose a network with one hidden layer and linear activation functions has n input and output nodes and h hidden nodes. What effect does the transformation in part (a) to a network with no hidden layer have on the total number of weights? Discuss in particular the case h ≪ n.
Thanks in advance,
Sam
Explanation / Answer
The update formula for a neuron with weight vector Wv(t) is:
Wv(t + 1) = Wv(t) + θ(v, t) · α(t) · (D(t) − Wv(t))
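This rule looks like the Kohonen self-organizing-map weight update rather than anything specific to the linear-activation question. If that is what is intended, θ(v, t) would be a neighborhood function, α(t) a learning-rate schedule, and D(t) the input presented at step t. A minimal Python sketch of one such update step, with all names and values as illustrative assumptions:

import numpy as np

def som_update(W_v, D, theta_vt, alpha_t):
    # One Kohonen-style step: Wv(t+1) = Wv(t) + theta(v,t) * alpha(t) * (D(t) - Wv(t))
    return W_v + theta_vt * alpha_t * (D - W_v)

# Illustrative values only (not from the question).
W_v = np.array([0.2, 0.8, 0.5])   # current weight vector of neuron v
D   = np.array([1.0, 0.0, 0.5])   # input vector presented at time t
print(som_update(W_v, D, theta_vt=0.6, alpha_t=0.1))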
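For the question itself, parts (a) and (b) follow from composing linear maps: with activation g(z) = c·z, a one-hidden-layer network computes y = c·W2·(c·W1·x) = (c²·W2·W1)·x, so the single matrix W = c²·W2·W1 defines a network with no hidden units that computes the same function. With k hidden layers the same argument collapses the product of all the layer matrices (times c^(k+1)) into one matrix. The NumPy sketch below checks this numerically and also counts weights for part (c); the sizes n and h, the constant c, and the random matrices are illustrative assumptions, not data from the question.

import numpy as np

rng = np.random.default_rng(0)
n, h, c = 5, 2, 1.5                 # n input/output nodes, h hidden nodes, activation constant c
x  = rng.standard_normal(n)
W1 = rng.standard_normal((h, n))    # input -> hidden weights
W2 = rng.standard_normal((n, h))    # hidden -> output weights

# Part (a): one hidden layer with linear activation g(z) = c * z.
hidden      = c * (W1 @ x)
y_two_layer = c * (W2 @ hidden)

# Equivalent network with no hidden layer: W = c^2 * W2 @ W1.
W           = (c ** 2) * (W2 @ W1)
y_collapsed = W @ x
print(np.allclose(y_two_layer, y_collapsed))   # True

# Part (c): weight counts (biases ignored).
print("two-layer weights:", W1.size + W2.size)  # 2 * n * h
print("collapsed weights:", W.size)             # n * n

When h ≪ n, the two-layer network has 2nh weights, far fewer than the n² weights of the collapsed network, so the hidden layer amounts to constraining the overall linear map to have rank at most h.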