This article describes the concrete usage of next_batch in tensorflow and shares it for reference, as follows:
Several different next_batch implementations are given here. This article only explains the code snippets, for later reference:
def next_batch(self, batch_size, fake_data=False):
  """Return the next `batch_size` examples from this data set."""
  if fake_data:
    fake_image = [1] * 784
    if self.one_hot:
      fake_label = [1] + [0] * 9
    else:
      fake_label = 0
    return [fake_image for _ in xrange(batch_size)], [
       fake_label for _ in xrange(batch_size)
    ]
  start = self._index_in_epoch
  self._index_in_epoch += batch_size
  if self._index_in_epoch > self._num_examples:
    # Check whether the index within this epoch has run past the total number of
    # examples; if True, one pass over the data is done and a new one begins.
    # Finished epoch
    self._epochs_completed += 1
    # Shuffle the data
    perm = numpy.arange(self._num_examples)  # arange creates an evenly spaced index array
    numpy.random.shuffle(perm)               # shuffle the indices
    self._images = self._images[perm]
    self._labels = self._labels[perm]
    # Start next epoch
    start = 0
    self._index_in_epoch = batch_size
    assert batch_size <= self._num_examples
  end = self._index_in_epoch
  return self._images[start:end], self._labels[start:end]
This code is taken from the mnist.py file. Starting the explanation at line 12 of the code, start = self._index_in_epoch: _index_in_epoch - 1 is the index of the last image of the previous batch, so the first image of the current batch has index _index_in_epoch and the last one has index _index_in_epoch + batch_size - 1. If the updated _index_in_epoch (start + batch_size) is greater than the number of images in the corpus, the batch would run past the end of the data, which counts as having completed one full pass (epoch) over the corpus; the images are therefore shuffled and a new round of batches begins from the start.
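To see that wrap-around behaviour concretely, here is a minimal, self-contained sketch rather than the original mnist.py class: a hypothetical ToyDataSet with 5 examples is driven past the epoch boundary with batch_size = 2, so the shuffle-and-restart branch described above is exercised.

import numpy as np

class ToyDataSet(object):
  """Hypothetical, minimal stand-in for the DataSet class in mnist.py."""
  def __init__(self, images, labels):
    self._images = images
    self._labels = labels
    self._num_examples = len(images)
    self._index_in_epoch = 0
    self._epochs_completed = 0

  def next_batch(self, batch_size):
    start = self._index_in_epoch
    self._index_in_epoch += batch_size
    if self._index_in_epoch > self._num_examples:
      # One full pass finished: shuffle and restart from index 0
      self._epochs_completed += 1
      perm = np.arange(self._num_examples)
      np.random.shuffle(perm)
      self._images = self._images[perm]
      self._labels = self._labels[perm]
      start = 0
      self._index_in_epoch = batch_size
      assert batch_size <= self._num_examples
    end = self._index_in_epoch
    return self._images[start:end], self._labels[start:end]

data = ToyDataSet(np.arange(5), np.arange(5) * 10)
for step in range(4):
  xs, ys = data.next_batch(2)
  print(step, xs, ys, "epochs_completed =", data._epochs_completed)
# Steps 0 and 1 return examples [0 1] and [2 3]; at step 2, 4 + 2 > 5 triggers
# the shuffle, and the first 2 examples of the reshuffled epoch are returned.

Note that the example left over at the end of the epoch (index 4 here) is simply dropped: the wrap-around logic restarts from 0 instead of padding or carrying over the partial batch.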
def ptb_iterator(raw_data, batch_size, num_steps):
  """Iterate on the raw PTB data.

  This generates batch_size pointers into the raw PTB data, and allows
  minibatch iteration along these pointers.

  Args:
    raw_data: one of the raw data outputs from ptb_raw_data.
    batch_size: int, the batch size.
    num_steps: int, the number of unrolls.

  Yields:
    Pairs of the batched data, each a matrix of shape [batch_size, num_steps].
    The second element of the tuple is the same data time-shifted to the
    right by one.

  Raises:
    ValueError: if batch_size or num_steps are too high.
  """
  raw_data = np.array(raw_data, dtype=np.int32)
  data_len = len(raw_data)
  batch_len = data_len // batch_size  # batch_len: how many word ids go into each of the batch_size rows
  data = np.zeros([batch_size, batch_len], dtype=np.int32)  # batch_size rows of batch_len word ids each
  for i in range(batch_size):  # split the corpus into batch_size contiguous segments
    data[i] = raw_data[batch_len * i:batch_len * (i + 1)]

  epoch_size = (batch_len - 1) // num_steps  # how many num_steps-long windows fit in each row
  # epoch_size = ((len(data) // model.batch_size) - 1) // model.num_steps
  # // means integer division
  if epoch_size == 0:
    raise ValueError("epoch_size == 0, decrease batch_size or num_steps")

  for i in range(epoch_size):
    x = data[:, i*num_steps:(i+1)*num_steps]
    y = data[:, i*num_steps+1:(i+1)*num_steps+1]  # targets are the inputs shifted one step to the right
    yield (x, y)
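As a rough illustration of the shapes this iterator yields, one can feed it a small toy sequence; the numbers below come from running the sketch as written, not from the original PTB tutorial output.

import numpy as np

raw_data = list(range(20))  # a toy "corpus" of 20 word ids
for x, y in ptb_iterator(raw_data, batch_size=2, num_steps=3):
  print(x)
  print(y)
  break
# data_len = 20, batch_len = 20 // 2 = 10, epoch_size = (10 - 1) // 3 = 3
# First yield:
#   x = [[ 0  1  2]      y = [[ 1  2  3]
#        [10 11 12]]          [11 12 13]]
# y is x shifted one position to the right in time, as the docstring states.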