TensorFlow `from_tensor_slices` hangs

I get a warning that memory exceeded 10% of system memory.

I load the local tfrecords into NumPy arrays:

```python
import cv2
import tensorflow_datasets as tfds


def CUB_load_data():
    ds = tfds.load('caltech_birds2011', download=False, data_dir='../../datasets/')
    train_data = ds['train']
    test_data = ds['test']

    train_x = []
    train_y = []
    test_x = []
    test_y = []

    # Iterate eagerly and materialize every image as a resized NumPy array.
    for i in train_data:  # datasets are iterable; no need to call __iter__() explicitly
        resized = cv2.resize(i['image'].numpy(), dsize=(224, 224))
        train_x.append(resized)
        train_y.append(i['label'].numpy())  # .numpy() so the list holds plain ints, not EagerTensors

    for i in test_data:
        resized = cv2.resize(i['image'].numpy(), dsize=(224, 224))
        test_x.append(resized)
        test_y.append(i['label'].numpy())

    return (train_x, train_y), (test_x, test_y)
```

This part should be fine.
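
One optional tweak (my suggestion, not part of the original post): pack the Python lists into contiguous NumPy arrays before handing them to `from_tensor_slices`, so it slices a single tensor instead of converting thousands of individual arrays:

```python
import numpy as np

# Hypothetical packing step: one contiguous array is far cheaper to slice
# than a Python list of ~6000 separate (224, 224, 3) arrays.
(train_x, train_y), (test_x, test_y) = CUB_load_data()
train_x = np.stack(train_x)                    # (N, 224, 224, 3), uint8
train_y = np.asarray(train_y, dtype=np.int64)  # (N,)
test_x = np.stack(test_x)
test_y = np.asarray(test_y, dtype=np.int64)
```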

I then use `from_tensor_slices` to create the `tf.data.Dataset`. I have tried changing the batch size, but it makes no difference: the process hangs and warns about exceeding 10% of system memory.
The images are CUB-200-2011, and all of them together are only around 1 GB. The tfrecords were generated with `tfds.load`.
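
For scale, a rough back-of-the-envelope check (my own diagnostic, not from the original post) suggests why the warning fires even though the dataset is small on disk: the ~1 GB is JPEG-compressed, while the decoded, resized arrays are considerably larger in RAM.

```python
# CUB-200-2011 has 11788 images; after cv2.resize each is 224*224*3 bytes as uint8.
n_images = 11788
bytes_uint8 = n_images * 224 * 224 * 3
print(f"decoded uint8 arrays:  {bytes_uint8 / 1e9:.2f} GB")      # ~1.77 GB
print(f"after cast to float32: {bytes_uint8 * 4 / 1e9:.2f} GB")  # ~7.10 GB
```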

```python
import tensorflow as tf


def load_data():
    (train_x, train_y), (test_x, test_y) = CUB_load_data()

    SHUFFLE_BUFFER_SIZE = 500
    BATCH_SIZE = 2

    # Note: .map() traces the mapped function itself, so @tf.function is redundant here.
    @tf.function
    def _parse_function(img, label):
        feature = {}
        img = tf.cast(img, dtype=tf.float32)
        img = img / 255.0
        feature["img"] = img
        feature["label"] = label
        return feature

    train_dataset_raw = tf.data.Dataset.from_tensor_slices(
        (train_x, train_y)).map(_parse_function)
    test_dataset_raw = tf.data.Dataset.from_tensor_slices(
        (test_x, test_y)).map(_parse_function)

    train_dataset = train_dataset_raw.shuffle(SHUFFLE_BUFFER_SIZE).batch(BATCH_SIZE)
    test_dataset = test_dataset_raw.shuffle(SHUFFLE_BUFFER_SIZE).batch(BATCH_SIZE)

    return train_dataset, test_dataset
```
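
If the goal is simply to avoid the hang and the warning, a common workaround (a sketch assuming TF 2.x; the function name `load_data_streaming` and the `as_supervised=True` flag are my choices, not the original author's) is to skip the NumPy round-trip entirely and keep everything inside the `tf.data` pipeline. `from_tensor_slices` embeds the whole stacked array as a constant in the graph, which is typically what triggers the "exceeds 10% of system memory" warning; mapping the resize over the tfds dataset instead streams images from the tfrecords on demand.

```python
import tensorflow as tf
import tensorflow_datasets as tfds


def load_data_streaming():
    # as_supervised=True makes tfds yield (image, label) tuples directly.
    train_data, test_data = tfds.load('caltech_birds2011', download=False,
                                      data_dir='../../datasets/',
                                      split=['train', 'test'],
                                      as_supervised=True)

    def _preprocess(img, label):
        # tf.image.resize returns float32 (bilinear by default), unlike cv2.resize.
        img = tf.image.resize(img, (224, 224)) / 255.0
        return {"img": img, "label": label}

    # tf.data.AUTOTUNE needs TF >= 2.4; use tf.data.experimental.AUTOTUNE on older versions.
    train_dataset = (train_data
                     .map(_preprocess, num_parallel_calls=tf.data.AUTOTUNE)
                     .shuffle(500)
                     .batch(2)
                     .prefetch(tf.data.AUTOTUNE))
    test_dataset = (test_data
                    .map(_preprocess, num_parallel_calls=tf.data.AUTOTUNE)
                    .batch(2)
                    .prefetch(tf.data.AUTOTUNE))
    return train_dataset, test_dataset
```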
