

Implementing a BP Neural Network in Python with NumPy

2020-01-04 15:43:22
Source: reprinted, contributed by a reader

This article implements a simple BP (backpropagation) neural network entirely in NumPy. Because the task here is regression rather than classification, the output layer uses the identity activation f(x) = x. The underlying theory of BP networks is not covered here.
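Written out, the weight updates applied in the code are the standard gradient-descent rules for a single hidden layer. With the identity output activation, f'(x) = 1, so the output delta is simply the raw error (symbols here are the usual backprop notation, not names from the code):

```latex
\begin{align}
\delta_o &= t - y && \text{(identity output, } f'(x) = 1\text{)} \\
\delta_h &= \left(W_{ho}^{\top} \delta_o\right) \odot h \odot (1 - h) && \text{(sigmoid derivative } h(1-h)\text{)} \\
W_{ho} &\leftarrow W_{ho} + \eta \, \delta_o h^{\top} \\
W_{ih} &\leftarrow W_{ih} + \eta \, \delta_h x^{\top}
\end{align}
```

Here \(x\) is the input column vector, \(h\) the hidden activations, \(y\) the output, \(t\) the target, and \(\eta\) the learning rate.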

import numpy as np

class NeuralNetwork(object):
    def __init__(self, input_nodes, hidden_nodes, output_nodes, learning_rate):
        # Set the number of nodes in the input, hidden and output layers
        self.input_nodes = input_nodes
        self.hidden_nodes = hidden_nodes
        self.output_nodes = output_nodes

        # Initialize the weights and the learning rate
        self.weights_input_to_hidden = np.random.normal(
            0.0, self.hidden_nodes ** -0.5,
            (self.hidden_nodes, self.input_nodes))
        self.weights_hidden_to_output = np.random.normal(
            0.0, self.output_nodes ** -0.5,
            (self.output_nodes, self.hidden_nodes))
        self.lr = learning_rate

        # The hidden layer's activation function is the sigmoid
        self.activation_function = lambda x: 1 / (1 + np.exp(-x))

    def train(self, inputs_list, targets_list):
        # Convert the input lists to 2d column vectors,
        # shape [feature_dimension, 1]
        inputs = np.array(inputs_list, ndmin=2).T
        targets = np.array(targets_list, ndmin=2).T

        ### Forward pass ###
        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)  # signals into hidden layer
        hidden_outputs = self.activation_function(hidden_inputs)      # signals from hidden layer

        # Output layer; its activation function is the identity, y = x
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)  # signals into final layer
        final_outputs = final_inputs                                          # signals from final layer

        ### Backward pass: update the weights with gradient descent ###
        # Output layer error is the difference between the desired
        # target and the actual output
        output_errors = targets - final_outputs

        # Error backpropagated to the hidden layer, scaled by the
        # sigmoid derivative hidden_outputs * (1 - hidden_outputs)
        hidden_errors = (np.dot(self.weights_hidden_to_output.T, output_errors)
                         * hidden_outputs * (1 - hidden_outputs))

        # Update the hidden-to-output weights with a gradient descent step
        self.weights_hidden_to_output += self.lr * np.dot(output_errors, hidden_outputs.T)
        # Update the input-to-hidden weights with a gradient descent step
        self.weights_input_to_hidden += self.lr * np.dot(hidden_errors, inputs.T)

    def run(self, inputs_list):
        # Run a forward pass through the network to make a prediction
        inputs = np.array(inputs_list, ndmin=2).T

        # Hidden layer
        hidden_inputs = np.dot(self.weights_input_to_hidden, inputs)
        hidden_outputs = self.activation_function(hidden_inputs)

        # Output layer (identity activation)
        final_inputs = np.dot(self.weights_hidden_to_output, hidden_outputs)
        final_outputs = final_inputs

        return final_outputs
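To see the training loop converge, the network can be exercised on a tiny regression problem. The snippet below is a self-contained sketch that re-implements the same forward and backward steps with bare arrays; the toy task (fitting y = 2x), the layer sizes, the learning rate, and the epoch count are all assumptions chosen for illustration, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1 input node, 4 hidden nodes, 1 output node (illustrative sizes)
W_ih = rng.normal(0.0, 4 ** -0.5, (4, 1))   # input -> hidden weights
W_ho = rng.normal(0.0, 1.0, (1, 4))         # hidden -> output weights
lr = 0.1                                    # learning rate

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def forward(x):
    h = sigmoid(W_ih @ x)   # hidden activations (sigmoid)
    y = W_ho @ h            # output (identity activation)
    return h, y

# Toy training data: x in [0, 1], target y = 2x
xs = rng.random((50, 1, 1))
ts = 2 * xs

for epoch in range(500):
    for x, t in zip(xs, ts):
        h, y = forward(x)
        out_err = t - y                             # output error
        hid_err = (W_ho.T @ out_err) * h * (1 - h)  # backpropagated error
        W_ho += lr * out_err @ h.T                  # gradient descent updates
        W_ih += lr * hid_err @ x.T

_, pred = forward(np.array([[0.5]]))
print(float(pred))  # should land close to 1.0
```

Because the hidden layer is sigmoid and the output is linear, a few hundred epochs of these plain gradient-descent steps are enough to fit the line closely on this range.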

That concludes this article. Hopefully it helps with your studies, and thanks for supporting VEVB武林网.


注:相關(guān)教程知識(shí)閱讀請(qǐng)移步到python教程頻道。
發(fā)表評(píng)論 共有條評(píng)論
用戶名: 密碼:
驗(yàn)證碼: 匿名發(fā)表