# Linear Regression and Gradient Descent

#### Linear regression

The model function for linear regression, which is a function that maps from `x` to `y`, is represented as

$$f_{w,b}(x) = wx + b$$

To train a linear regression model, you want to find the best parameters $(w,b)$ that fit your dataset.

- To compare how one choice of $(w,b)$ is better or worse than another choice, you can evaluate it with a cost function $J(w,b)$. $J$ is a function of $(w,b)$. That is, the value of the cost $J(w,b)$ depends on the value of $(w,b)$.
- The choice of $(w,b)$ that fits your data the best is the one that has the smallest cost $J(w,b)$.
- To find the values $(w,b)$ that get the smallest possible cost $J(w,b)$, you can use a method called **gradient descent**.
- With each step of gradient descent, your parameters $(w,b)$ come closer to the optimal values that will achieve the lowest cost $J(w,b)$.

The trained linear regression model can then take the input feature $x$ and output a prediction $f_{w,b}(x)$.
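As a minimal sketch of the model function, assuming a NumPy-based setup (the name `predict` and the sample values are illustrative, not from the lab itself):

```python
import numpy as np

def predict(x, w, b):
    """Linear model f_wb(x) = w * x + b for a scalar or NumPy array input x."""
    return w * x + b

# Illustrative parameters: with w = 2 and b = 1, each input x maps to 2x + 1.
x_train = np.array([1.0, 2.0, 3.0])
print(predict(x_train, 2.0, 1.0))  # [3. 5. 7.]
```

Because NumPy broadcasts the scalar parameters over the array, the same function handles one example or the whole training set.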

#### Gradient Descent

Gradient descent involves repeated steps to adjust the values of your parameters $(w,b)$.

- At each step of gradient descent, it will be helpful for you to monitor your progress by computing the cost $J(w,b)$ as $(w,b)$ gets updated.
- In this section, you will implement a function to calculate $J(w,b)$ so that you can check the progress of your gradient descent implementation.

###### Cost function

As you may recall from the lecture, for one variable, the cost function for linear regression $J(w,b)$ is defined as

$$J(w,b) = \frac{1}{2m} \sum_{i=0}^{m-1} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right)^2$$

where

- $f_{w,b}(x^{(i)})$ is the model's prediction for example $i$ through the mapping $f_{w,b}$, as opposed to $y^{(i)}$, which is the actual value from the dataset.
- $m$ is the number of training examples in the dataset.
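A vectorized sketch of this cost function in NumPy (the function name `compute_cost` and the sample data are assumptions for illustration):

```python
import numpy as np

def compute_cost(x, y, w, b):
    """Cost J(w,b) = (1 / (2m)) * sum over i of (f_wb(x_i) - y_i)^2."""
    m = x.shape[0]
    f_wb = w * x + b                      # model predictions for all m examples
    return np.sum((f_wb - y) ** 2) / (2 * m)

x_train = np.array([1.0, 2.0])
y_train = np.array([300.0, 500.0])

# A line that passes through both points has zero cost.
print(compute_cost(x_train, y_train, 200.0, 100.0))  # 0.0
```

A perfect fit yields $J = 0$; any other choice of $(w,b)$ gives a strictly positive cost, which is what gradient descent will drive down.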

###### Model prediction

- For linear regression with one variable, the prediction of the model $f_{w,b}$ for an example $x^{(i)}$ is represented as:

$$f_{w,b}(x^{(i)}) = wx^{(i)} + b$$

This is the equation for a line, with an intercept $b$ and a slope $w$.

###### Algorithm

The gradient descent algorithm is:

$$\begin{align*}
&\text{repeat until convergence:} \; \lbrace \\
&\quad w = w - \alpha \frac{\partial J(w,b)}{\partial w} \\
&\quad b = b - \alpha \frac{\partial J(w,b)}{\partial b} \\
&\rbrace
\end{align*}$$

where parameters $w$ and $b$ are updated simultaneously, and where

$$\frac{\partial J(w,b)}{\partial w} = \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right) x^{(i)}$$

$$\frac{\partial J(w,b)}{\partial b} = \frac{1}{m} \sum_{i=0}^{m-1} \left( f_{w,b}(x^{(i)}) - y^{(i)} \right)$$

Here $f_{w,b}(x^{(i)})$ is the model's prediction, while $y^{(i)}$ is the target value.
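The update rule above can be sketched in NumPy as follows. The function names, learning rate, iteration count, and toy dataset are all illustrative assumptions, not values prescribed by the lab:

```python
import numpy as np

def compute_gradient(x, y, w, b):
    """Partial derivatives of J(w,b) with respect to w and b."""
    m = x.shape[0]
    err = (w * x + b) - y                 # prediction error for each example
    dj_dw = np.sum(err * x) / m
    dj_db = np.sum(err) / m
    return dj_dw, dj_db

def gradient_descent(x, y, w, b, alpha, num_iters):
    """Run num_iters steps; w and b are updated simultaneously each step."""
    for _ in range(num_iters):
        dj_dw, dj_db = compute_gradient(x, y, w, b)
        w = w - alpha * dj_dw             # both updates use the gradient
        b = b - alpha * dj_db             # evaluated at the old (w, b)
    return w, b

x_train = np.array([1.0, 2.0])
y_train = np.array([300.0, 500.0])
w_final, b_final = gradient_descent(x_train, y_train, 0.0, 0.0, 0.01, 20000)
print(w_final, b_final)  # approaches the exact fit w = 200, b = 100
```

Note that both gradients are computed before either parameter is changed; updating $w$ first and then using the new $w$ inside the $b$ update would no longer be the simultaneous update the algorithm specifies.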