

Implementing Linear Regression and Batch Gradient Descent in Python: A Code Example

2020-02-16 11:26:50
Source: reprinted (contributed by a reader)

After studying the Stanford open-course lectures on linear regression and gradient descent, I ran my own tests based on other people's code and wrapped the result in a class. I plan to extend it when I have time, and the code comments will be filled in later; homework is piling up:
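Before the full class, it helps to see the core of batch gradient descent in isolation: the update θ ← θ − α·Xᵀ(Xθ − y), applied over the whole dataset each step. Here is a minimal sketch; the toy X and y below are made up purely for illustration (y = 1 + 2x exactly, so the fit should recover [1, 2]):

```python
import numpy as np

# Toy design matrix (first column is the intercept term) and targets;
# these values are made up purely for illustration.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([1.0, 3.0, 5.0])

theta = np.ones(2)   # initial parameters
alpha = 0.1          # learning rate

for _ in range(1000):
    loss = X.dot(theta) - y           # residuals over ALL points (batch)
    gradient = X.T.dot(loss)          # full-batch gradient of the squared error
    theta = theta - alpha * gradient  # gradient descent update

print(theta)  # approaches [1.0, 2.0], since y = 1 + 2*x exactly
```

The class below follows exactly this pattern, just with data generation and plotting wrapped around it.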

import numpy as np
import matplotlib.pyplot as plt
import random

class dataMinning:
  datasets = []
  labelsets = []
  addressD = ''  # Data folder
  addressL = ''  # Label folder
  npDatasets = np.zeros(1)
  npLabelsets = np.zeros(1)
  cost = []
  numIterations = 0
  alpha = 0
  theta = np.ones(2)
  #pCols = 0
  #dRows = 0

  def __init__(self, addressD, addressL, theta, numIterations, alpha, datasets=None):
    if datasets is None:
      self.datasets = []
    else:
      self.datasets = datasets
    self.addressD = addressD
    self.addressL = addressL
    self.theta = theta
    self.numIterations = numIterations
    self.alpha = alpha

  def readFrom(self):
    fd = open(self.addressD, 'r')
    for line in fd:
      tmp = line[:-1].split()
      self.datasets.append([int(i) for i in tmp])
    fd.close()
    self.npDatasets = np.array(self.datasets)
    fl = open(self.addressL, 'r')
    for line in fl:
      tmp = line[:-1].split()
      self.labelsets.append([int(i) for i in tmp])
    fl.close()
    tm = []
    for item in self.labelsets:
      tm = tm + item
    self.npLabelsets = np.array(tm)

  def genData(self, numPoints, bias, variance):
    self.genx = np.zeros(shape=(numPoints, 2))
    self.geny = np.zeros(shape=numPoints)
    for i in range(0, numPoints):
      self.genx[i][0] = 1  # intercept term
      self.genx[i][1] = i
      self.geny[i] = (i + bias) + random.uniform(0, 1) * variance

  def gradientDescent(self):
    xTrans = self.genx.transpose()
    i = 0
    while i < self.numIterations:
      hypothesis = np.dot(self.genx, self.theta)
      loss = hypothesis - self.geny
      # record the cost
      self.cost.append(np.sum(loss ** 2))
      # calculate the gradient
      gradient = np.dot(xTrans, loss)
      # update theta (batch gradient descent step)
      self.theta = self.theta - self.alpha * gradient
      i = i + 1

  def show(self):
    print('yes')

if __name__ == "__main__":
  c = dataMinning('c://city.txt', 'c://st.txt', np.ones(2), 100000, 0.000005)
  c.genData(100, 25, 10)
  c.gradientDescent()
  cx = range(len(c.cost))
  plt.figure(1)
  plt.plot(cx, c.cost)
  plt.ylim(0, 25000)
  plt.figure(2)
  plt.plot(c.genx[:, 1], c.geny, 'b.')
  x = np.arange(0, 100, 0.1)
  y = x * c.theta[1] + c.theta[0]
  plt.plot(x, y)
  plt.margins(0.2)
  plt.show()
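Since the class fits a straight line iteratively, the result can be sanity-checked against the closed-form least-squares solution via `np.linalg.lstsq`. The sketch below assumes the same synthetic data scheme as `genData` (bias 25, uniform noise scaled by 10) and the same iteration count and learning rate as the main block:

```python
import numpy as np
import random

random.seed(0)  # make the noise reproducible

numPoints, bias, variance = 100, 25, 10
X = np.zeros((numPoints, 2))
y = np.zeros(numPoints)
for i in range(numPoints):
    X[i][0] = 1  # intercept column
    X[i][1] = i
    y[i] = (i + bias) + random.uniform(0, 1) * variance

# Closed-form least-squares fit: solves min ||X @ theta - y||^2 directly
theta_exact, *_ = np.linalg.lstsq(X, y, rcond=None)

# Iterative batch gradient descent, mirroring gradientDescent()
theta = np.ones(2)
alpha = 0.000005
for _ in range(100000):
    loss = X.dot(theta) - y
    theta = theta - alpha * X.T.dot(loss)

print(theta_exact)  # intercept near bias + variance/2 = 30, slope near 1
print(theta)        # close to theta_exact after enough iterations
```

If the two vectors disagree noticeably, the learning rate is too large (divergence) or the iteration count too small (underfitting); with these values they should match to a few decimal places.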

Figure 1. The cost over the course of the iterations

Figure 2. Scatter plot of the data and the fitted line

Summary

That is all of this article's code example of implementing linear regression and batch gradient descent in Python. I hope it is helpful to everyone.
