Question
Computer exercises
Several of the exercises use the data in the following table.
Section 5.4
1. Consider basic gradient descent (Algorithm 1) and Newton's algorithm (Algorithm 2) applied to the data in the table.
(a) Apply both to the three-dimensional data in categories 1 and 3. For the gradient descent use a learning rate η(k) = 0.1. Plot the criterion function as a function of the iteration number.
(b) Estimate the total number of mathematical operations in the two algorithms.
(c) Plot the convergence time versus learning rate. What is the minimum learning rate that fails to lead to convergence?
W3 sample: 0.1 1.1 7.1 4.2 3.0 2.9 6.8 7.1 1.4 4.3 0.5 8.7 8.9 0.2 -3.5 4.1 4.5 0.0 2.9 2.1 4.2 7.7 2.0 2.7 6.3 1.6 0.1 5.2 4.1 2.8 4.2 1.9 4.0 2.2 -6.7 -4.0 3.1 5.0 1.4 3.2 1.3 3.7 0.5 9.2 -0.8 1.3 2.4 4.0 3.4 6.2 0.9 1.2 2.5 6.1 4.1 3.4 5.0 6.4 8.4 3.7 5.1 1.6 7.1 9.7 10 3.9 4.0 4.1 2.2 1.9 5.1 -8.0 -6.3

Explanation / Answer
main.m
% Load the dataset: column 1 is the feature, column 2 the target
dataSet = load('DataSet.txt');
% Store the values in separate matrices
x = dataSet(:, 1);
w3 = dataSet(:, 2);
% Design matrix with a bias column, matching the two-parameter update in gradient.m
w1 = [ones(length(w3), 1) x];
% Run gradient descent with learning rate 0.1 and plot the criterion per iteration
[parameters, criterionHistory] = gradient(w1, w3, zeros(2, 1), 0.1, 1000);
plot(criterionHistory);
gradient.m
function [parameters, criterionHistory] = gradient(w1, w3, parameters, learningRate, iterations)
% Batch gradient descent on the least-squares criterion
m = length(w3);
criterionHistory = zeros(iterations, 1);
for i = 1:iterations
    % Row vector of residuals: hypothesis minus target
    h = (w1 * parameters)' - w3';
    % Update each parameter with the averaged gradient
    parameters(1) = parameters(1) - learningRate * (1/m) * h * w1(:, 1);
    parameters(2) = parameters(2) - learningRate * (1/m) * h * w1(:, 2);
    % Keep track of the criterion function
    criterionHistory(i) = criterion(w1, w3, parameters);
end
end
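The exercise also calls for Newton's algorithm (Algorithm 2), which the answer does not implement. A minimal sketch of a companion newton.m is given below; the file name and interface are assumptions, not part of the original answer, and it steps by the inverse Hessian times the gradient of the same least-squares criterion:

```matlab
function [parameters, criterionHistory] = newton(w1, w3, parameters, iterations)
% Newton descent: step by the inverse Hessian times the gradient.
% For the least-squares criterion the Hessian is constant: (1/m) * (w1' * w1).
m = length(w3);
H = (1 / m) * (w1' * w1);
criterionHistory = zeros(iterations, 1);
for i = 1:iterations
    g = (1 / m) * w1' * (w1 * parameters - w3);  % averaged gradient
    parameters = parameters - H \ g;             % Newton update
    criterionHistory(i) = criterion(w1, w3, parameters);
end
end
```

Because this criterion is quadratic, the Newton step reaches the minimum in a single iteration, which makes the operation-count comparison in part (b) concrete: one Hessian solve versus many cheap gradient steps.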
criterion.m
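The body of criterion.m is missing from the answer. A minimal sketch, assuming the usual sum-of-squared-errors criterion scaled by 1/(2m) so that its gradient matches the (1/m)-averaged updates in gradient.m:

```matlab
function J = criterion(w1, w3, parameters)
% Sum-of-squared-errors criterion, scaled by 1/(2m) so its
% gradient is exactly the averaged update used in gradient.m
m = length(w3);
residuals = w1 * parameters - w3;
J = (1 / (2 * m)) * (residuals' * residuals);
end
```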