Is there any relationship between number_examples and batch_size?
Source: 10-4 Building the Encoder
慕九州3352676
2021-06-27
Hi, while running this part of the code I don't quite understand how number_examples and batch_size affect each other.
With number_examples set to 12, everything works fine when batch_size is 9, but setting it to 10, 11, or 12 all throw an exception. Why is that?
The code is as follows (just your code, tidied up in PyCharm):
from seq2seq import preprocessing as pre
import tensorflow as tf

data_path = 'data/toydata.txt'
num_examples = 12
input_tensor, target_tensor, inp_lang, targ_lang = pre.load_dataset(data_path, num_examples)
input_tensor_train, _, target_tensor_train, _ = pre.data_split(data_path, num_examples)

BUFFER_SIZE = len(input_tensor_train)
BATCH_SIZE = 9
steps_per_epoch = len(input_tensor_train) // BATCH_SIZE
embedding_dim = 256
units = 1024
vocab_inp_size = len(inp_lang.word_index) + 1
vocab_tar_size = len(targ_lang.word_index) + 1
print(vocab_inp_size, vocab_tar_size)

dataset = tf.data.Dataset.from_tensor_slices((input_tensor_train, target_tensor_train)).shuffle(BUFFER_SIZE)
dataset = dataset.batch(BATCH_SIZE, drop_remainder=True)
example_input_batch, example_target_batch = next(iter(dataset))
print(example_input_batch)
正十七
2021-06-27
Duplicate question: https://coding.imooc.com/learn/questiondetail/Ene1kYr20gvYBD5q.html
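For context alongside that thread, here is a minimal sketch of the likely cause. It assumes pre.data_split holds out part of the 12 examples for validation (for instance a 75/25 split, as with sklearn's train_test_split default, leaving 9 training examples); that split ratio is an assumption, not confirmed in this thread:

```python
# Hypothetical sketch (not the course code): why BATCH_SIZE = 9 works
# but 10, 11, or 12 fail when num_examples = 12.
# Assumption: pre.data_split keeps 75% of examples for training,
# so input_tensor_train ends up with only 9 examples.

def num_full_batches(train_size: int, batch_size: int) -> int:
    """Batches yielded by Dataset.batch(batch_size, drop_remainder=True)."""
    return train_size // batch_size

num_examples = 12
train_size = int(num_examples * 0.75)  # assumed split -> 9 training examples

print(num_full_batches(train_size, 9))   # 1 full batch
print(num_full_batches(train_size, 10))  # 0 full batches
```

With drop_remainder=True, a batch size larger than the training-set size produces an empty dataset, so next(iter(dataset)) raises StopIteration. That matches the reported behavior: batch_size=9 yields exactly one batch, while 10, 11, and 12 yield none.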