Algorithm Discussion

Once you have completed the implementation of the 3 learning algorithms, you should qualitatively verify your implementations. In main.py we have included a block of code that you should uncomment. This code loads a 2D dataset from toy_data.txt and trains your models using T = 10, λ = 0.2. main.py will compute θ and θ₀ for each of the learning algorithms that you have written. Then, it will call plot_toy_data to plot the resulting model and boundary.

Plots (6 points possible, graded)

In order to verify your plots, please enter the values of θ and θ₀ for all three algorithms. (For example, if θ = (1, 0.5), then type 1, 0.5 without the brackets. Make sure your answers are correct up to 4 decimal places.)

For the perceptron algorithm: θ = ___, θ₀ = ___
For the average perceptron algorithm: θ = ___, θ₀ = ___
For the Pegasos algorithm: θ = ___, θ₀ = ___
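For reference, below is a minimal, self-contained sketch of the three update rules the exercise refers to, to make the quantities θ and θ₀ concrete. The 2D example data, the in-order pass over the points, and the η = 1/√t step size in Pegasos are assumptions for illustration; the course's main.py uses the real toy_data.txt and may visit the examples in a different (pseudo-random) order, so the values printed here will not match the graded answers.

    import numpy as np

    def perceptron(X, y, T):
        # Plain perceptron: update on every mistake, return the final parameters.
        theta = np.zeros(X.shape[1])
        theta_0 = 0.0
        for _ in range(T):
            for x_i, y_i in zip(X, y):
                if y_i * (theta @ x_i + theta_0) <= 0:   # mistake (or on the boundary)
                    theta = theta + y_i * x_i
                    theta_0 = theta_0 + y_i
        return theta, theta_0

    def average_perceptron(X, y, T):
        # Same updates, but return the average of the parameters over every step.
        theta = np.zeros(X.shape[1]); theta_0 = 0.0
        theta_sum = np.zeros(X.shape[1]); theta_0_sum = 0.0
        steps = 0
        for _ in range(T):
            for x_i, y_i in zip(X, y):
                if y_i * (theta @ x_i + theta_0) <= 0:
                    theta = theta + y_i * x_i
                    theta_0 = theta_0 + y_i
                theta_sum += theta
                theta_0_sum += theta_0
                steps += 1
        return theta_sum / steps, theta_0_sum / steps

    def pegasos(X, y, T, lam):
        # Pegasos: stochastic (sub)gradient steps on the L2-regularized hinge loss,
        # with a decaying step size eta = 1/sqrt(t); theta_0 is not regularized.
        theta = np.zeros(X.shape[1]); theta_0 = 0.0
        t = 0
        for _ in range(T):
            for x_i, y_i in zip(X, y):
                t += 1
                eta = 1.0 / np.sqrt(t)
                if y_i * (theta @ x_i + theta_0) <= 1:   # inside the margin
                    theta = (1 - eta * lam) * theta + eta * y_i * x_i
                    theta_0 = theta_0 + eta * y_i
                else:
                    theta = (1 - eta * lam) * theta
        return theta, theta_0

    # Hypothetical stand-in for toy_data.txt; the real file ships with the project,
    # so these numbers will NOT match the graded answers.
    X = np.array([[1.0, 2.0], [2.0, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
    y = np.array([1.0, 1.0, -1.0, -1.0])

    for name, (th, th0) in [("perceptron", perceptron(X, y, T=10)),
                            ("average perceptron", average_perceptron(X, y, T=10)),
                            ("Pegasos", pegasos(X, y, T=10, lam=0.2))]:
        print(f"{name}: theta = {th}, theta_0 = {th0:.4f}")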

Answer:

With a large enough number of iterations, if the data is linearly separable, then all three algorithms converge:

The average perceptron and the Pegasos algorithm converge quickly, since averaging the parameters (for the former) and the decaying step size with λ-regularization (for the latter) damp the effect of any single update.

The plain perceptron converges last: it changes θ and θ₀ on every mistake and only stops once it completes a full pass without mistakes.

So all three converge, but the average perceptron and Pegasos stabilize earlier than the plain perceptron (see the sketch below).

Explanation:
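One qualitative way to see the difference is to count how many perceptron updates happen on each pass over the data: on linearly separable data the count drops to zero after a few epochs, and from that point the plain perceptron's θ and θ₀ stop changing, while the averaged parameters and the Pegasos iterates have already been smoothed out earlier in training. Below is a minimal sketch of that check; the data points are hypothetical stand-ins, not the contents of toy_data.txt, so the epoch counts on the real dataset will differ.

    import numpy as np

    # Hypothetical linearly separable 2D data (NOT the contents of toy_data.txt).
    X = np.array([[1.0, 2.0], [2.0, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
    y = np.array([1.0, 1.0, -1.0, -1.0])

    theta = np.zeros(2)
    theta_0 = 0.0
    for epoch in range(10):                      # T = 10 passes, as in the exercise
        mistakes = 0
        for x_i, y_i in zip(X, y):
            if y_i * (theta @ x_i + theta_0) <= 0:
                theta = theta + y_i * x_i        # perceptron update on a mistake
                theta_0 = theta_0 + y_i
                mistakes += 1
        print(f"epoch {epoch + 1}: {mistakes} mistakes")
        if mistakes == 0:                        # no further updates: parameters are fixed
            break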