I get the following error: RuntimeError: Error(s) in loading state_dict for XceptionHourglass: Missing key(s) in state_dict: "conv1.weight", "conv1.bias", "bn1.weight", "bn1.bias", "bn1.running_mean", "bn1.running_var", "conv2.weight", "conv2.bias", "bn2.weight", "bn2.bias", "bn2.running_mean", "bn2.running_var" … I start training with: model = train_mask_net(64). This calls the function train_mask_net, where I have included torch.save inside the epoch loop. I wanted to load one of ..
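A common cause of this "Missing key(s)" error is that the checkpoint was saved from a model wrapped in nn.DataParallel, which prefixes every parameter key with "module.". A minimal sketch of stripping that prefix before loading, assuming that is the cause here (TinyNet below is a stand-in, since the question's XceptionHourglass definition is not shown):

```python
import torch
import torch.nn as nn

# Tiny stand-in model: only illustrates the key-prefix problem, not the
# question's actual XceptionHourglass architecture.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 8, 3)
        self.bn1 = nn.BatchNorm2d(8)

# Simulate a checkpoint saved from nn.DataParallel(model): every key gains a
# "module." prefix, which later triggers "Missing key(s)" on a bare model.
saved = {"module." + k: v for k, v in TinyNet().state_dict().items()}

# Fix: strip the prefix before calling load_state_dict.
clean = {k[len("module."):] if k.startswith("module.") else k: v
         for k, v in saved.items()}

model = TinyNet()
model.load_state_dict(clean)  # now loads without missing-key errors
```

If the key names match but only some are missing, load_state_dict(state, strict=False) will load the overlapping subset and report what was skipped.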
I’m trying to get list-type data while iterating over the DataLoader. Here is a simple example: from torch.utils.data import DataLoader,Dataset tests = [('test resume1',[1,2,3]), ('test resume2',['a','b','c']), ('test resume3',['Q','W','E']), ('test resume4',[',','.','/']), ('test resume5',['!','@','#'])] class testdataset(Dataset): def __init__(self,data): self.x = [item for item in data] self.y = [item for item in data] def __getitem__(self,index): return self.x[index],self.y[index] def __len__(self): ..
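The usual surprise here is that the DataLoader's default collate function zips the list fields together element-wise, transposing them across the batch. A minimal sketch, assuming you want each sample back untouched, is to pass an identity collate_fn (the dataset below is a simplified version of the one in the question):

```python
from torch.utils.data import DataLoader, Dataset

tests = [('test resume1', [1, 2, 3]),
         ('test resume2', ['a', 'b', 'c'])]

class TestDataset(Dataset):
    def __init__(self, data):
        self.data = data
    def __getitem__(self, index):
        return self.data[index]
    def __len__(self):
        return len(self.data)

# default_collate would try to stack the list fields element-wise, which
# transposes them; an identity collate_fn returns the raw samples as-is.
loader = DataLoader(TestDataset(tests), batch_size=2,
                    collate_fn=lambda batch: batch)

batches = list(loader)
# each element of a batch is the original (name, list) tuple
```

With this collate_fn a batch is simply a Python list of the original tuples, and any per-field stacking is left to you.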
I am trying to implement a bidirectional LSTM on time-series data. The main file calls the dataloader to load the data for the model. Main.py import copy import torch import torch.nn as nn import torch.nn.functional as F import torch.optim as optim from torch.optim.lr_scheduler import StepLR import numpy as np import time import utils import models ..
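For reference, a minimal sketch of a bidirectional LSTM over time-series windows; all sizes below are placeholders, not the question's actual configuration:

```python
import torch
import torch.nn as nn

class BiLSTM(nn.Module):
    def __init__(self, n_features=4, hidden=32, n_out=1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True,
                            bidirectional=True)
        # bidirectional=True doubles the feature size fed to the head
        self.head = nn.Linear(2 * hidden, n_out)

    def forward(self, x):                # x: (batch, seq_len, n_features)
        out, _ = self.lstm(x)            # out: (batch, seq_len, 2*hidden)
        return self.head(out[:, -1, :])  # predict from the last time step

x = torch.randn(8, 20, 4)                # 8 windows, 20 steps, 4 features
y = BiLSTM()(x)                          # shape (8, 1)
```

The key detail is the 2 * hidden input size of the final linear layer; forgetting it is the most common shape error when switching an LSTM to bidirectional.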
class CoalDataset(Dataset): def __init__(self, images_dir: str, masks_dir: str): super(CoalDataset, self).__init__() self.images_files = [os.path.join(images_dir, filename) for filename in os.listdir(images_dir)] self.masks_files = [os.path.join(masks_dir, filename) for filename in os.listdir(masks_dir)] self.cache_data = Queue() self.point = 0 self.num_part = 16 self.len = len(self.images_files) @classmethod def load(cls, filename) -> torch.Tensor: if os.path.splitext(filename)[-1] == ".npy": return torch.as_tensor(np.load(filename)) return torch.as_tensor(cv2.imread(filename).transpose((2, 0, 1))) @classmethod ..
Encountering a problem when trying to use "dataloader" to segment my dataset. I use a gray picture with pixel dimensions (2294 * 1914); after applying "transforms" I got tensor.size() = (1, 2294, 1914). But after using "dataloader", the returned .size() is (5, 1, 2294, 1914, 3), when it should be (5, 1, ..
I am switching from a much older version of PyTorch from 3 years ago to stable PyTorch 1.9 on CentOS 7 (GPU-based) and, with no change to the original paper code, I get the following error. Is there a quick fix for this? (fashcomp) [[email protected] fashion-compatibility]$ python main.py --name test_baseline --learned --l2_embed --datadir ../../../data/fashion/ /scratch3/venv/fashcomp/lib/python3.8/site-packages/torchvision/transforms/transforms.py:310: ..
I have a dataset that I recorded on my own. I recorded it to be balanced, i.e. the same number of recordings for each class. However, when I split my data into train, test, and validation sets, I do not apply stratification. My question is: should we apply stratify to balanced ..
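Even with a perfectly balanced recording protocol, a purely random split can end up slightly unbalanced; stratification guarantees the class ratios are preserved in every split. A minimal sketch with scikit-learn's train_test_split (the labels below are synthetic placeholders, not the question's data):

```python
from sklearn.model_selection import train_test_split

X = list(range(12))
y = [0, 1, 2] * 4          # balanced: 4 samples per class

# stratify=y keeps the per-class proportions identical in train and test
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)
# the 3-sample test set contains exactly one sample of each class
```

For a train/validation/test split, the same call is applied twice: once to carve off the test set, then again on the remainder, stratifying on the corresponding label subset each time.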
I’m trying to learn AI. I have GAN (generative adversarial network) code for images with an ALPHA channel (transparency). All images have an alpha channel. To prove that, I wrote a small image_validator.py program like the one below: from PIL import Image import glob def main(): image_list = [] img_number = 0 for filename in glob.glob('data/*/*.*'): try: im = Image.open(filename) # ..
I have an image set that has transparency. I’m trying to train a GAN (generative adversarial network). How can I preserve the transparency? In the output images, every transparent area is BLACK. How can I avoid that? I think this is called the "Alpha Channel". Anyway, how can I keep my transparency? Below is my ..
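Keeping transparency means carrying 4 channels end to end: load the images as RGBA, configure the networks for nc=4 instead of nc=3, and save RGBA output. The black areas appear when the alpha plane is silently dropped during loading. A minimal sketch of the loading side, assuming a PIL-based pipeline (the tiny fake image stands in for the real dataset):

```python
from PIL import Image
import numpy as np
import torch

# tiny fully-transparent fake image standing in for a real dataset file
img = Image.fromarray(np.zeros((8, 8, 4), dtype=np.uint8), mode="RGBA")

# force RGBA so the alpha plane survives even for palette/RGB-coded files
rgba = img.convert("RGBA")

arr = np.asarray(rgba)                      # (H, W, 4), alpha is channel 3
t = torch.from_numpy(arr).permute(2, 0, 1)  # (4, H, W) for an nc=4 network
```

On the output side the inverse applies: take the generator's 4-channel tensor back to (H, W, 4) and save it with an alpha-capable format such as PNG, not JPEG, which has no alpha channel.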
I downloaded this image set https://www.kaggle.com/jessicali9530/stanford-dogs-dataset and extracted its images folder into my data folder, so now it looks like this. Below is my code: from __future__ import print_function import torch.nn as nn import torch.optim as optim import torch.utils.data import torchvision.datasets as dset import torchvision.transforms as transforms import torchvision.utils as vutils from torch.autograd import Variable ..