> Information Digest, April 16, 2021

### A Little Machine Learning Every Day

##### Vectorization

Vectorization example:

$$
h_\theta(x) = \sum_{j=0}^n \theta_j x_j = \theta^Tx
$$

Unvectorized implementation:

```octave
prediction = 0.0;
for j = 1:n+1,
  prediction = prediction + theta(j) * x(j);
end;
```

Vectorized implementation:

```octave
prediction = theta' * x;
```

Let $A$ be a 10x10 matrix and $x$ be a 10-element vector. Your friend wants to compute the product $Ax$ and writes the following code:

```octave
v = zeros(10, 1);
for i = 1:10
  for j = 1:10
    v(i) = v(i) + A(i, j) * x(j);
  end
end
```

This can be vectorized as a single matrix-vector product:

```octave
v = A * x;
```

##### Exercise

https://matlab.mathworks.com/

It is important that you read and follow the instructions below before attempting a programming exercise in MATLAB Online. The necessary files for each exercise are contained in the 'machine-learning-ex' folder along with this script. To begin a programming exercise, right-click the corresponding exercise folder inside the machine-learning-ex folder and select 'Open'.

Note:

1. It is important that you set your Current Folder to the exercise folder before working on the exercise; otherwise you may experience unexpected behavior and will not be able to submit.
2. If you are logged out of MATLAB Online, you will have to reset your Current Folder to the exercise folder before continuing to work on that exercise.

#### Completing a Programming Exercise

##### Open the exercise script

To start the programming exercise, open the exercise script, exn.mlx, where n is the exercise number. The exercise script contains instructions to guide you through the exercise as well as the necessary MATLAB code to load and visualize data and to test your functions.

##### Complete the function definitions

At several points in the exercise you will be prompted to open an existing function file and complete the function definition according to the instructions in the exercise script. After completing and saving the function file, you will usually be prompted to run a code section in the Live Script. The code will call your function and compare your result with the expected output. An example of how to complete the first function file in ex1, warmUpExercise.m, is shown below.

##### Submit your solutions

1. Enter the command submit at the command prompt (>>) in the Command Window.

#### Ex1 Linear regression with one variable

```octave
data = load('ex1data1.txt'); % read comma separated data
X = data(:, 1); y = data(:, 2);
```

##### 2.1 Plotting the data

plotData.m

```octave
function plotData(x, y)
  figure;
  plot(x, y, 'rx', 'MarkerSize', 10);       % Plot the data
  ylabel('Profit in $10,000s');             % Set the y-axis label
  xlabel('Population of City in 10,000s');  % Set the x-axis label
end
```

##### 2.2 Gradient Descent

[Gradient Descent in Matlab | stackoverflow](https://stackoverflow.com/questions/23984925/gradient-descent-in-matlab)

[The math behind gradient descent, illustrated](https://zhuanlan.zhihu.com/p/60535541)

The hypothesis $h_{\theta}(x)$ is given by the linear model $h_\theta(x) = \theta^Tx = \theta_0 + \theta_1x_1$.

Recall that the parameters of your model are the *θ* values. These are the values you will adjust to minimize cost. One way to do this is to use the batch gradient descent algorithm.
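For reference, the cost being minimized here is the usual squared-error cost for linear regression, the same quantity that the computeCost function further below computes:

$$
J(\theta) = \frac{1}{2m}\sum_{i=1}^m\left(h_\theta(x^{(i)})-y^{(i)}\right)^2
$$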
In batch gradient descent, each iteration performs the following update (simultaneously updating $\theta_j$ for all $j$):

$$
\theta_j := \theta_j - \alpha\frac{1}{m}\sum_{i=1}^m\left(h_\theta(x^{(i)})-y^{(i)}\right)x_j^{(i)}
$$

gradientDescent.m

```octave
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    theta = theta - (alpha/m) * (X' * (X * theta - y));
    J_history(iter) = computeCost(X, y, theta);
end

end
```

Implementation:

```octave
m = length(X);                 % number of training examples
X = [ones(m, 1), data(:, 1)];  % Add a column of ones to x
theta = zeros(2, 1);           % initialize fitting parameters
iterations = 1500;
alpha = 0.01;

theta = gradientDescent(X, y, theta, alpha, iterations);
```

##### 2.2.3 Computing the cost $J(\theta)$

[How to compute Cost function for linear regression](https://jp.mathworks.com/matlabcentral/answers/468415-how-to-compute-cost-function-for-linear-regression)

In this section, you will implement a function to calculate $J(\theta)$ so you can check the convergence of your gradient descent implementation.

computeCost.m

```octave
function J = computeCost(X, y, theta)
% Initialize some useful values
m = length(y);          % number of training examples

h = X * theta;          % predictions for all m examples
sError = (h - y) .^ 2;  % squared errors
J = sum(sError) / (2 * m);
end
```

```octave
% Compute and display initial cost with theta all zeros
computeCost(X, y, theta)

% Compute and display initial cost with non-zero theta
computeCost(X, y, [-1; 2])
```

##### Feature Normalization

FEATURENORMALIZE(X) returns a normalized version of X in which each feature has a mean of 0 and a standard deviation of 1.

```matlab
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

for i = 1:size(X, 2)
    mu(i) = mean(X(:,i));
    sigma(i) = std(X(:,i));
    X_norm(:,i) = X_norm(:,i) - mu(i);
    X_norm(:,i) = X_norm(:,i) / sigma(i);
end

end
```

```matlab
% Add intercept term to X
X = [ones(m, 1) X];
```

### Other Worthwhile Reads

#### This Is What Exercise Does to Your Brain | BDNF, HIIT

Original article: [This Is What Exercise Does to Your Brain](https://elemental.medium.com/this-is-what-exercise-does-to-your-brain-7068b6a1af81)

> So what type of exercise is best for your brain? Most of the research has been done on moderate aerobic exercise like jogging, but recent evidence suggests that weight lifting and high-intensity interval training are good for you, too. Julia Basso, a senior research associate in the Department of Human Nutrition, Foods, and Exercise at Virginia Tech, says people who experience the biggest gains in their fitness show the biggest cognitive changes, suggesting higher-intensity workouts provide extra benefit. However, the mood boosts occur no matter the intensity of the activity.
> One of the most crucial changes is the release of a growth hormone called brain-derived neurotrophic factor, or BDNF. When it comes to exercise’s positive effects on the brain, BDNF is the star.

> BDNF helps the brain build new connections, or synapses, between neurons — a process called synaptic plasticity that is thought to be the foundation for learning.

### One Takeaway

[**Working Backwards**](https://click.convertkit-mail.com/4zu50rprmotehep3mohx/3ohphkhqn67w26br/aHR0cHM6Ly9hbXpuLnRvLzN1T2hxQ0M=)**:** I’ve long been obsessed with Amazon’s business operations practices, and [besides this essay from Zack Kanter](https://click.convertkit-mail.com/4zu50rprmotehep3mohx/n2hohvhnkw3dx8b6/aHR0cHM6Ly96YWNra2FudGVyLmNvbS8yMDE5LzAzLzEzL3doYXQtaXMtYW1hem9uLw==), this book is the best thing I’ve seen written about them. The two authors worked directly with Bezos, so [the book](https://click.convertkit-mail.com/4zu50rprmotehep3mohx/3ohphkhqn67w26br/aHR0cHM6Ly9hbXpuLnRvLzN1T2hxQ0M=) is filled with one-liners like this: “**Good intentions don’t work. Mechanisms do.**” Long-term, you can’t solve problems by working more or trying harder. If you want persistent change, you have to fix the underlying system.

--Friday Finds (4/16/21)