2013-06-03

Machine Learning Programming Assignment 5 – Regularized Linear Regression and Bias/Variance

Only the core code is listed:

1. linearRegCostFunction.m

h = X * theta;
% Regularized cost: squared error plus penalty on theta(2:end) (theta(1) is not regularized)
J = (h - y).' * (h - y) / (2*m) ...
    + (lambda/(2*m)) * sum(theta(2:end).^2);
% Gradient: the bias term gets no regularization
grad(1) = (X(:, 1).' * (h - y)) / m;
grad(2:end) = (X(:, 2:end).' * (h - y)) / m ...
    + (lambda/m) * theta(2:end);
grad = grad(:);
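For reference, the code above vectorizes the regularized cost and gradient from the assignment (the bias term θ₀ is excluded from the penalty):

```latex
J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2
          + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2

\frac{\partial J}{\partial \theta_0} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_0^{(i)}

\frac{\partial J}{\partial \theta_j} = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}
          + \frac{\lambda}{m}\theta_j \qquad (j \ge 1)
```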

2. learningCurve.m

for i = 1:m
    % Train on the first i examples, using the lambda passed into learningCurve
    Xi = X(1:i, :);
    yi = y(1:i);
    theta = trainLinearReg(Xi, yi, lambda);
    % Training error is computed on the same training subset, with lambda = 0
    [error_train(i), ~] = linearRegCostFunction(Xi, yi, theta, 0);
    % Validation error is computed over the entire cross validation set, with lambda = 0
    [error_val(i), ~] = linearRegCostFunction(Xval, yval, theta, 0);
end

3. polyFeatures.m

for i = 1:p
    % Column i holds the i-th power of the input feature
    X_poly(:, i) = X(:, 1).^i;
end
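The loop above maps each one-dimensional example to a row of its first p powers (feature normalization is applied afterward, in a separate step of the assignment):

```latex
X_{\text{poly}}(i, :) = \left[\, x^{(i)},\; (x^{(i)})^2,\; \dots,\; (x^{(i)})^p \,\right]
```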

4. validationCurve.m

for i = 1:length(lambda_vec)
    % Train with each candidate lambda in turn
    theta = trainLinearReg(X, y, lambda_vec(i));
    % Training error: evaluate on the full training set with lambda = 0
    [error_train(i), ~] = linearRegCostFunction(X, y, theta, 0);
    % Validation error: evaluate over the entire cross validation set with lambda = 0
    [error_val(i), ~] = linearRegCostFunction(Xval, yval, theta, 0);
end

Course link: https://www.coursera.org/course/ml
