Category: dataloader

I'm quite new to programming and have no clue where my error comes from. I got the following code to set up my dataset for training my classifier:

    class cows_train(Dataset):
        def __init__(self, folder_path):
            self.image_list = glob.glob(folder_path + '/content/cows/train')
            self.data_len = len(self.image_list)

        def __getitem__(self, index):
            single_image_path = self.image_list[index]
            im_as_im = Image.open(single_image_path)
            im_as_np = np.asarray(im_as_im) / 255
            im_as_np = np.expand_dims(im_as_np, 0)
            ..
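
A likely culprit is the glob call: without a wildcard it matches the directory path itself (or nothing at all), never the image files inside it, so image_list ends up without any images to open. A minimal sketch of a working version, assuming the images sit under folder_path as JPEGs and are grayscale (the file pattern and missing methods here are assumptions, not the asker's confirmed setup):

    import glob

    import numpy as np
    import torch
    from PIL import Image
    from torch.utils.data import Dataset

    class CowsTrain(Dataset):
        def __init__(self, folder_path):
            # '*.jpg' is an assumption; adjust the pattern to the actual file type.
            self.image_list = glob.glob(folder_path + '/*.jpg')
            self.data_len = len(self.image_list)

        def __getitem__(self, index):
            single_image_path = self.image_list[index]
            im_as_im = Image.open(single_image_path)
            im_as_np = np.asarray(im_as_im) / 255   # scale pixels to [0, 1]
            im_as_np = np.expand_dims(im_as_np, 0)  # add a channel axis (assumes grayscale input)
            return torch.from_numpy(im_as_np).float()

        def __len__(self):
            return self.data_len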

Read more

I have a dataframe with 35 variables (columns) and millions of timesteps (rows). I would like to cut the data into windows with the TimeseriesGenerator (https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/sequence/TimeseriesGenerator). I have done this before for some neural nets. I would like to stick with the generator because I need to skip some samples, but how many isn't clear at ..
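
For reference, a minimal sketch of windowing a multivariate dataframe with TimeseriesGenerator; the window length, stride, and target column below are placeholder assumptions, not values from the question. stride skips samples between consecutive windows, while sampling_rate would skip samples within a window:

    import numpy as np
    import pandas as pd
    from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

    # Stand-in for the real dataframe: 10,000 timesteps, 35 variables.
    df = pd.DataFrame(np.random.randn(10_000, 35))
    values = df.to_numpy()

    gen = TimeseriesGenerator(
        data=values,
        targets=values[:, 0],  # placeholder target: predict the first variable
        length=50,             # timesteps per window
        stride=5,              # consecutive windows start 5 samples apart
        batch_size=32,
    )

    x, y = gen[0]
    print(x.shape, y.shape)    # (32, 50, 35) (32,)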

Read more

I'm doing:

    train_set = AudioLoader(time_mask_max=TIME_MASK_MAX,
                            sequence_length=args.sequence_length)
    print(train_set.len)
    train_loader = torch.utils.data.DataLoader(train_set, shuffle=True,
                                               num_workers=1,
                                               batch_size=args.batch_size,
                                               collate_fn=collate_batch,
                                               **params)

Where:

    def collate_batch(batch):
        '''Pads a batch of variable length.

        Note: it converts things to tensors manually here, since the ToTensor
        transform assumes it takes in images rather than arbitrary tensors.
        '''
        # get sequence lengths
        print('HHH')
        lengths = torch.tensor([t.shape[0] for t ..
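
A complete version of that collate function, as a sketch: assuming each item in the batch is already a tensor whose first dimension is the variable-length time axis, torch.nn.utils.rnn.pad_sequence does the zero-padding, and the lengths can be returned alongside so the model can mask padded positions later:

    import torch
    from torch.nn.utils.rnn import pad_sequence

    def collate_batch(batch):
        # record each sequence's true length before padding
        lengths = torch.tensor([t.shape[0] for t in batch])
        padded = pad_sequence(batch, batch_first=True)  # (batch, max_len, ...)
        return padded, lengths

    # usage sketch: three sequences of different lengths
    batch = [torch.randn(5, 4), torch.randn(3, 4), torch.randn(7, 4)]
    padded, lengths = collate_batch(batch)
    print(padded.shape, lengths)  # torch.Size([3, 7, 4]) tensor([5, 3, 7])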

Read more

I am writing a custom dataloader, and the returned value confuses me.

    import torch
    import torch.nn as nn
    import numpy as np
    import torch.utils.data as data_utils

    class TestDataset:
        def __init__(self):
            self.db = np.random.randn(20, 3, 60, 60)

        def __getitem__(self, idx):
            img = self.db[idx]
            return img, img.shape[1:]

        def __len__(self):
            return self.db.shape[0]

    if __name__ == '__main__':
        test_dataset ..
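
The loader's output is probably the confusing part, so here is a sketch of the same dataset run through a DataLoader with the default collate_fn: the per-sample (60, 60) shape tuple is not kept as one tuple per sample but transposed into a list of two batch-sized tensors, which is the documented default_collate behavior for sequences:

    import numpy as np
    import torch.utils.data as data_utils

    class TestDataset:
        def __init__(self):
            self.db = np.random.randn(20, 3, 60, 60)

        def __getitem__(self, idx):
            img = self.db[idx]
            return img, img.shape[1:]

        def __len__(self):
            return self.db.shape[0]

    loader = data_utils.DataLoader(TestDataset(), batch_size=4)
    imgs, shapes = next(iter(loader))
    print(imgs.shape)  # torch.Size([4, 3, 60, 60])
    print(shapes)      # [tensor([60, 60, 60, 60]), tensor([60, 60, 60, 60])]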

Read more

So my issue is that when I don't use DataLoader and just run the training for 1000 epochs, the results are OK and the loss drops to ~0.2. However, when I try to use DataLoader, the output is:

    batch  index  loss
    8      0      0.6232748031616211
    ..
    8      23     0.6030591726303101
    9      0      0.5626393556594849
    9      1      0.6434788703918457
    ..
    9      20     0.6232720017433167

I ..
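
One thing worth ruling out (an assumption about the setup, since the excerpt cuts off): with a DataLoader, one pass over the loader is one epoch, so a loop that counts batches instead of epochs makes far fewer passes over the data than 1000 epochs of full-batch training. A sketch of a loop that keeps the epoch count comparable; the model, loss, and data below are placeholders, not the asker's:

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # stand-in data: 256 samples, 10 features, binary-style targets
    dataset = TensorDataset(torch.randn(256, 10), torch.rand(256, 1))
    loader = DataLoader(dataset, batch_size=32, shuffle=True)

    model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
    criterion = nn.BCEWithLogitsLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for epoch in range(1000):       # outer loop counts epochs ...
        for x, y in loader:         # ... inner loop covers every batch once
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()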

Read more