
Cost Function - Intuition Ⅱ

Machine Learning

by 찌르렁 2020. 10. 26. 19:32


Let's look at a case of a regression problem.

 

We can draw a straight-line hypothesis $h_\theta(x)$ in this case, and it depends on the values of $\theta_0$ and $\theta_1$. The picture below shows the cost value, which depends on the hypothesis.
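As a minimal sketch of this dependence (the data points here are hypothetical, chosen for illustration), the cost for a given $(\theta_0, \theta_1)$ pair can be computed directly from the squared-error definition:

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Squared-error cost J(theta0, theta1) = (1/2m) * sum((h(x) - y)^2)
    for the linear hypothesis h(x) = theta0 + theta1 * x."""
    m = len(x)
    residuals = theta0 + theta1 * x - y
    return np.sum(residuals ** 2) / (2 * m)

# Hypothetical training data lying exactly on the line y = x
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])

perfect = cost(0.0, 1.0, x, y)  # h(x) = x passes through every point, so J = 0
flat = cost(0.0, 0.0, x, y)     # h(x) = 0 misses every point, so J is large
```

Changing $\theta_0$ or $\theta_1$ changes the line, and therefore changes $J$.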

 

Using this graph, we can make a contour plot. A contour plot is a graph that contains many contour lines. A contour line of a two-variable function takes a constant value at every point on the same line. An example of such a graph is the one to the right below.
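Such a contour plot can be produced by evaluating $J$ over a grid of $(\theta_0, \theta_1)$ values; here is a sketch using hypothetical data, with the actual plotting call left as a comment:

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Squared-error cost for the linear hypothesis h(x) = theta0 + theta1 * x."""
    m = len(x)
    return np.sum((theta0 + theta1 * x - y) ** 2) / (2 * m)

# Hypothetical data lying on the line y = x, so J is minimized at (0, 1)
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])

# Evaluate J on an 81 x 81 grid of parameter values
t0 = np.linspace(-2, 2, 81)
t1 = np.linspace(-1, 3, 81)
J = np.array([[cost(a, b, x, y) for b in t1] for a in t0])
# matplotlib's plt.contour(t1, t0, J) would then draw the contour lines
```

The grid's minimum sits at the parameters of the best-fitting line, which is where the innermost contour ring is centered.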

 

Taking any color and going along the 'circle', one would expect to get the same value of the cost function. For example, the three green points found on the green line above have the same value for $J(\theta_0, \theta_1)$, and as a result, they are found along the same line. The red dot displays the value of the cost function for the graph on the left when $\theta_0 = 800$ and $\theta_1 = -0.15$.
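To see two different parameter settings landing on the same contour line, a tiny symmetric dataset makes the equality exact. The data below is hypothetical, chosen so that $J(\theta_0, \theta_1) = (\theta_0^2 + \theta_1^2)/2$, whose contour lines are circles around the origin:

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Squared-error cost for the linear hypothesis h(x) = theta0 + theta1 * x."""
    m = len(x)
    return np.sum((theta0 + theta1 * x - y) ** 2) / (2 * m)

# Hypothetical symmetric data: the cross terms cancel and
# J(theta0, theta1) reduces to (theta0^2 + theta1^2) / 2
x = np.array([-1.0, 1.0])
y = np.array([0.0, 0.0])

j_a = cost(1.0, 0.0, x, y)  # one point on the contour of radius 1
j_b = cost(0.0, 1.0, x, y)  # a different point on the same contour
```

Both parameter pairs sit on the same circle, so they share the same cost.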

 

Taking another h(x) and plotting its contour plot, one gets the following graphs:

 

$h(x) = 360 + 0 \cdot x$

When $\theta_0=360$ and $\theta_1=0$, the value of $J(\theta_0, \theta_1)$ in the contour plot gets closer to the center thus reducing the cost function error. Now giving our hypothesis function a slightly positive slope results in a better fit of the data.
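As a quick check of this claim, we can compare the flat hypothesis $h(x) = 360$ against a slightly sloped one on some hypothetical data with an upward trend (e.g. house size vs. price):

```python
import numpy as np

def cost(theta0, theta1, x, y):
    """Squared-error cost for the linear hypothesis h(x) = theta0 + theta1 * x."""
    m = len(x)
    return np.sum((theta0 + theta1 * x - y) ** 2) / (2 * m)

# Hypothetical data with an upward trend
x = np.array([1000.0, 2000.0, 3000.0, 4000.0])
y = np.array([370.0, 490.0, 610.0, 730.0])

flat = cost(360.0, 0.0, x, y)     # horizontal line h(x) = 360
sloped = cost(250.0, 0.12, x, y)  # slightly positive slope tracks the trend
```

The sloped line's cost is far smaller, which on the contour plot corresponds to moving toward the center.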

 

The graph above minimizes the cost function as much as possible, and consequently the results for $\theta_0$ and $\theta_1$ tend to be around 250 and 0.12 respectively. Plotting those values on the graph to the right seems to put our point in the center of the innermost 'circle'.
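Rather than reading the minimizing parameters off the contour plot, they can also be found in closed form by least squares; a sketch with hypothetical data placed exactly on the line $y = 250 + 0.12x$:

```python
import numpy as np

# Hypothetical data lying exactly on y = 250 + 0.12 * x
x = np.array([1000.0, 2000.0, 3000.0, 4000.0])
y = 250.0 + 0.12 * x

# Least squares on the design matrix [1, x] recovers (theta0, theta1),
# the point at the center of the innermost contour ring
A = np.column_stack([np.ones_like(x), x])
theta0, theta1 = np.linalg.lstsq(A, y, rcond=None)[0]
```

Because the data sits exactly on the line, the recovered parameters match it to numerical precision.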

