import numpy as np
import matplotlib.pyplot as plt


class RegressionCurve:
    def __init__(self, t0, t1):
        self.theta_0 = t0
        self.theta_1 = t1

    def f(self, x):
        # Prediction function: f(x) = theta_0 + theta_1 * x
        return self.theta_0 + x * self.theta_1

    def E(self, x, y):
        # Objective: half the sum of squared errors
        return 0.5 * np.sum((y - self.f(x)) ** 2)

    def standardize(self, x):
        # Scale inputs to zero mean and unit variance so gradient descent converges reliably
        mu = x.mean()
        sigma = x.std()
        return (x - mu) / sigma

    def Regression(self, x, y):
        # Batch gradient descent; stop once the error decreases by less than 1e-2 per step
        ETA, diff, count = 1e-3, 1, 0
        error = self.E(x, y)
        while diff > 1e-2:
            # Update both parameters simultaneously, from the same predictions
            tmp_theta_0 = self.theta_0 - ETA * np.sum(self.f(x) - y)
            tmp_theta_1 = self.theta_1 - ETA * np.sum((self.f(x) - y) * x)
            self.theta_0, self.theta_1 = tmp_theta_0, tmp_theta_1
            current_error = self.E(x, y)
            diff = error - current_error
            error = current_error
            count += 1
            log = 'times:{} theta_0:{:.3f} theta_1:{:.3f} diff:{:.4f}'.format(
                count, self.theta_0, self.theta_1, diff)
            print(log)

    def draw(self, train_x, train_y):
        # Plot the training points together with the fitted line
        x = np.linspace(3, -3, 100)
        plt.plot(train_x, train_y, 'o')
        plt.plot(x, self.f(x))
        plt.show()


if __name__ == "__main__":
    # click.csv: a header row followed by two comma-separated columns (x, y)
    train = np.loadtxt("click.csv", delimiter=",", skiprows=1)
    train_x, train_y = train[:, 0], train[:, 1]

    # Random initial parameters
    theta_0 = np.random.rand()
    theta_1 = np.random.rand()
    regression = RegressionCurve(theta_0, theta_1)

    # Standardize the inputs and inspect the raw scatter plot
    train_z = regression.standardize(train_x)
    plt.plot(train_z, train_y, 'o')
    plt.show()

    # Fit by gradient descent, then draw the result
    regression.Regression(train_z, train_y)
    regression.draw(train_z, train_y)
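For reference, Regression above is batch gradient descent on the squared-error objective computed by E, with model and objective

$$ f_\theta(x) = \theta_0 + \theta_1 x, \qquad E(\theta) = \frac{1}{2} \sum_{i=1}^{n} \bigl( y^{(i)} - f_\theta(x^{(i)}) \bigr)^2 $$

and the simultaneous parameter updates (learning rate \eta = ETA)

$$ \theta_0 \leftarrow \theta_0 - \eta \sum_{i=1}^{n} \bigl( f_\theta(x^{(i)}) - y^{(i)} \bigr), \qquad \theta_1 \leftarrow \theta_1 - \eta \sum_{i=1}^{n} \bigl( f_\theta(x^{(i)}) - y^{(i)} \bigr) \, x^{(i)} $$

The script expects a click.csv next to it, which is not part of this listing. Judging only from np.loadtxt("click.csv", delimiter=",", skiprows=1), the file should be a header row followed by two comma-separated columns (x, then y). Below is a minimal sketch that generates a synthetic file of that shape so the script can be tried end to end; the value ranges, slope, and noise level are arbitrary assumptions, not the book's data.

import numpy as np

# Hypothetical stand-in for click.csv: x values with noisy, roughly linear y values.
rng = np.random.default_rng(0)
x = rng.uniform(50, 450, size=20)
y = 1.2 * x + rng.normal(0.0, 30.0, size=20)
np.savetxt("click.csv", np.column_stack([x, y]),
           delimiter=",", header="x,y", comments="", fmt="%.1f")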