
1. The Concept of the BP Neural Network

A BP neural network is a multi-layer feedforward neural network. Its defining characteristic is that the signal propagates forward while the error propagates backward. Concretely, consider a neural network model containing only a single hidden layer.

Training a BP neural network proceeds in two phases. The first phase is the forward propagation of the signal, from the input layer through the hidden layer to the output layer. The second phase is the backward propagation of the error, from the output layer back through the hidden layer to the input layer, adjusting in turn the weights and biases from the hidden layer to the output layer and then those from the input layer to the hidden layer.
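
As a concrete illustration of these two phases, here is a minimal sketch for a single-hidden-layer network; the layer sizes, quadratic cost, and learning rate are assumptions made for the example, not taken from the code below:

# Minimal sketch of one forward/backward pass (sizes and cost are assumed)
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    return sigmoid(z) * (1.0 - sigmoid(z))

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 2)), rng.standard_normal((3, 1))  # input -> hidden
W2, b2 = rng.standard_normal((1, 3)), rng.standard_normal((1, 1))  # hidden -> output
x, y = rng.standard_normal((2, 1)), np.array([[1.0]])              # one training sample

# Phase 1: forward propagation (input layer -> hidden layer -> output layer)
z1 = W1 @ x + b1
a1 = sigmoid(z1)
z2 = W2 @ a1 + b2
a2 = sigmoid(z2)

# Phase 2: error back-propagation (output layer -> hidden layer -> input layer),
# using the quadratic cost C = 0.5 * ||a2 - y||**2
delta2 = (a2 - y) * sigmoid_prime(z2)         # output-layer error
delta1 = (W2.T @ delta2) * sigmoid_prime(z1)  # hidden-layer error

eta = 3.0  # learning rate (assumed value)
W2 -= eta * (delta2 @ a1.T); b2 -= eta * delta2  # hidden -> output update
W1 -= eta * (delta1 @ x.T);  b1 -= eta * delta1  # input -> hidden update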

The complete BP workflow is therefore: initialize the weights and biases, propagate an input forward to produce an output, compute the output error, propagate that error backward to obtain the gradients, update the weights and biases accordingly, and repeat until the error is small enough.

Key code:

# Forward pass
    def feedforward(self, a):
        """Return the output of the network if ``a`` is input."""
        for b, w in zip(self.biases, self.weights):
            a = sigmoid(np.dot(w, a) + b)
        return a
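
Both feedforward above and SGD below are written as methods of a network class and rely on per-layer self.biases and self.weights plus a sigmoid helper, none of which appear in the excerpts. Here is a minimal sketch of that surrounding context, assuming the common Gaussian random initialization:

# Minimal class skeleton assumed by the two methods in this article
import random
import numpy as np

def sigmoid(z):
    """The sigmoid activation used in feedforward."""
    return 1.0 / (1.0 + np.exp(-z))

class Network:
    def __init__(self, sizes):
        """``sizes`` gives the number of neurons per layer, e.g.
        Network([784, 30, 10]). The Gaussian initialization below is
        an assumption for illustration."""
        self.num_layers = len(sizes)
        self.sizes = sizes
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]

    # feedforward and SGD (shown in this article) are defined here.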

# Stochastic gradient descent; eta is the learning rate
    def SGD(self, training_data, epochs, mini_batch_size, eta,
            test_data=None):
        """Train the neural network using mini-batch stochastic
        gradient descent.  The ``training_data`` is a list of tuples
        ``(x, y)`` representing the training inputs and the desired
        outputs.  The other non-optional parameters are
        self-explanatory.  If ``test_data`` is provided then the
        network will be evaluated against the test data after each
        epoch, and partial progress printed out.  This is useful for
        tracking progress, but slows things down substantially."""

        training_data = list(training_data)  # e.g. 50,000 training samples
        n = len(training_data)

        if test_data:
            test_data = list(test_data)  # e.g. 10,000 test samples
            n_test = len(test_data)

        for j in range(epochs):
            random.shuffle(training_data)
            mini_batches = [
                training_data[k:k + mini_batch_size]
                for k in range(0, n, mini_batch_size)]  # split the shuffled data into mini-batches of size mini_batch_size
            for mini_batch in mini_batches:
                self.update_mini_batch(mini_batch, eta)  # take one gradient-descent step per mini-batch
            if test_data:
                print("Epoch {} : {} / {}".format(j, self.evaluate(test_data), n_test))
            else:
                print("Epoch {} complete".format(j))


Reference: 深入理解BP神经网络 - 简书 (jianshu.com)

Related materials: https://pan.baidu.com/s/11ySWBrYYeQXGUMaQA6ZtLg (extraction code: 9hf5)