ML | Data Science / Machine Learning

Cost Function for Logistic Regression

by 노아론 2018. 1. 26.
linear regression, cost, logistic

Cost function for logistic regression

Logistic hypothesis

$$H(x)=\frac{1}{1+e^{-x}}$$
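As a quick sanity check (a minimal sketch in plain Python, not from the original post), the sigmoid hypothesis squashes any real input into the interval (0, 1):

```python
import math

def sigmoid(x):
    # H(x) = 1 / (1 + e^{-x}); the output is always between 0 and 1
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))   # 0.5, the decision boundary
print(sigmoid(6))   # close to 1
print(sigmoid(-6))  # close to 0
```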

$$cost(W)=\frac{1}{m}\sum{c(H(x),y)}$$

$$c(H(x),y)=\begin{cases} -\log(H(x)) & y=1 \\ -\log(1-H(x)) & y=0 \end{cases}$$


Since implementing this piecewise form in TensorFlow would require an if statement, it can instead be written as

$$c(H(x),y)=-y\log(H(x))-(1-y)\log(1-H(x))$$


This uses the fact that when $y=1$, $(1-y)$ becomes 0, and when $y=0$, $-y$ becomes 0 while $(1-y)$ becomes 1.
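A short check (plain Python, not from the original post) that the single formula reproduces both branches of the piecewise definition:

```python
import math

def piecewise_cost(h, y):
    # The case-by-case definition: -log(H) when y=1, -log(1-H) when y=0
    return -math.log(h) if y == 1 else -math.log(1 - h)

def combined_cost(h, y):
    # c(H(x), y) = -y*log(H(x)) - (1-y)*log(1-H(x))
    return -y * math.log(h) - (1 - y) * math.log(1 - h)

# The two forms agree for every label and any hypothesis value in (0, 1)
for h in (0.1, 0.5, 0.9):
    for y in (0, 1):
        assert abs(piecewise_cost(h, y) - combined_cost(h, y)) < 1e-12
print("both forms agree")
```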

$$cost(W)=-\frac{1}{m}\sum \left[ y\log(H(x))+(1-y)\log(1-H(x)) \right]$$
$$W:=W-\alpha\frac{\partial}{\partial W}cost(W)$$

# Cost function: cross-entropy averaged over the batch
cost = -tf.reduce_mean(Y * tf.log(hypothesis) + (1 - Y) * tf.log(1 - hypothesis))

#Minimize
a = tf.Variable(0.1) # Learning rate, alpha.
optimizer = tf.train.GradientDescentOptimizer(a)
train = optimizer.minimize(cost)


tf.train.GradientDescentOptimizer(a) is equivalent to the update rule $W:=W-\alpha\frac{\partial}{\partial W}cost(W)$.
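The same update rule can be sketched directly in NumPy without TensorFlow (the toy data and the names `W`, `b`, `lr` are illustrative, not from the original post):

```python
import numpy as np

# Toy 1-D data: inputs of 3 or more belong to class 1 (illustrative only)
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = np.array([0.0, 0.0, 1.0, 1.0])

W, b = 0.0, 0.0
lr = 0.1  # learning rate, the alpha in the update rule

for _ in range(2000):
    H = 1.0 / (1.0 + np.exp(-(W * X + b)))  # logistic hypothesis
    # Gradients of cost(W) = -(1/m) * sum(Y*log(H) + (1-Y)*log(1-H));
    # for the sigmoid cross-entropy they simplify to (H - Y) terms
    dW = np.mean((H - Y) * X)
    db = np.mean(H - Y)
    W -= lr * dW  # W := W - alpha * d(cost)/dW
    b -= lr * db

H = 1.0 / (1.0 + np.exp(-(W * X + b)))
print((H > 0.5).astype(int))  # predicted classes
```

After training, thresholding the hypothesis at 0.5 recovers the labels for this separable toy set.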

