    def costFunction(self, X, y):
        # Compute cost function with weights already stored in class
        self.yHat = self.forward(X)
        J = 0.5 * np.sum((y - self.yHat)**2)
        return J

    def relucostFunction(self, X, y):
        # Compute cost function with weights already stored in class
        self.yHat = self.reluforward(X)
        J = 0.5 * np.sum((y - self.yHat)**2)
        return J

    def costFunctionPrime(self, X, y):
        # Compute derivatives with respect to W1 and W2
        self.yHat = self.forward(X)
        delta3 = np.multiply(-(y - self.yHat), self.sigmoidPrime(self.z3))
        dJdW2 = np.dot(self.a2.T, delta3)
        delta2 = np.dot(delta3, self.W2.T) * self.sigmoidPrime(self.z2)
        dJdW1 = np.dot(X.T, delta2)
        return dJdW1, dJdW2
a) This code calculates the cost function for a neural network.
b) The code computes derivatives for gradient descent optimization.
c) It defines functions for backpropagation in a neural network.
d) All of the above.
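For context, here is a minimal runnable sketch of how these methods fit into a complete class. The layer sizes, the `sigmoid` helper, and the `forward` pass are illustrative assumptions (the snippet above only shows the cost and gradient methods), but they match the shapes the gradient code expects:

```python
import numpy as np

class NeuralNetwork:
    # Hypothetical 2-3-1 architecture; sizes chosen only for illustration
    def __init__(self):
        self.W1 = np.random.randn(2, 3)  # input -> hidden weights
        self.W2 = np.random.randn(3, 1)  # hidden -> output weights

    def sigmoid(self, z):
        return 1.0 / (1.0 + np.exp(-z))

    def sigmoidPrime(self, z):
        # Derivative of the sigmoid, evaluated at z
        s = self.sigmoid(z)
        return s * (1 - s)

    def forward(self, X):
        # Propagate inputs through the network, caching intermediates
        self.z2 = np.dot(X, self.W1)
        self.a2 = self.sigmoid(self.z2)
        self.z3 = np.dot(self.a2, self.W2)
        return self.sigmoid(self.z3)

    def costFunction(self, X, y):
        # Sum-of-squares error between predictions and targets
        self.yHat = self.forward(X)
        return 0.5 * np.sum((y - self.yHat)**2)

    def costFunctionPrime(self, X, y):
        # Backpropagate the error to get gradients w.r.t. W1 and W2
        self.yHat = self.forward(X)
        delta3 = np.multiply(-(y - self.yHat), self.sigmoidPrime(self.z3))
        dJdW2 = np.dot(self.a2.T, delta3)
        delta2 = np.dot(delta3, self.W2.T) * self.sigmoidPrime(self.z2)
        dJdW1 = np.dot(X.T, delta2)
        return dJdW1, dJdW2

nn = NeuralNetwork()
X = np.array([[0.3, 0.5], [0.5, 0.1]])
y = np.array([[0.75], [0.82]])
dJdW1, dJdW2 = nn.costFunctionPrime(X, y)  # gradients for a descent step
```

A quick way to see that all three statements hold is a numerical gradient check: perturb each weight by a small epsilon, recompute the cost, and confirm the finite-difference slope matches `costFunctionPrime`.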