File "C:\Users\Pichau\anaconda3\envs\py37\lib\site-packages\torch\nn\functional.py", line 1370, in linear
ret = torch.addmm(bias, input, weight.t())
RuntimeError: size mismatch, m1: [1 x 1024], m2: [256 x 8] at C:/w/1/s/windows/pytorch/aten/src\THC/generic/THCTensorMathBlas.cu:290
Even when I specify other --which_model_netE resnet512 like junyanz sugested on the original source code, but then the code freezes and doesn't process any image.
Could you tell what networks were used to process results with the dataset 512x512 you used in the article... it seems this repo is the only one BicycleGAN repository that implemented networks that support bigger images in comparison to the 256x256...
NOTE: i've used the sabe dataset to train a model using the original source code, but it requires imgs to be 256* and the results are not satisfatory.
HELP PLZ <3
The text was updated successfully, but these errors were encountered:
I'm trying to run your code with a 512x512 paired dataset (like in pix2pix models) so the train images are 1024x512, but every time I try
python train.py --dataroot ./datasets/08K_Rplan_Black --name 08K_RPlan_Black --model bicycle_gan --use_dropout --loadSize 512 --fineSize 512 --display_winsize 512
I GOT:
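If it helps, my guess (and it is only a guess) at what's happening: the encoder's final Linear layer looks like it's sized for the feature map a 256x256 input produces, so a 512x512 input leaves a 2x2 map instead of 1x1 before the flatten, and the Linear layer receives 1024 features instead of 256. A minimal standalone sketch that reproduces the same shape mismatch (this is NOT the repo's actual E network, just toy layers with made-up channel counts):

```python
import torch
import torch.nn as nn

# Toy encoder, NOT the repo's E network: just enough layers to show why a
# Linear sized for 256x256 inputs breaks at 512x512. nz=8 matches the
# "[256 x 8]" weight shape in my traceback.
nz = 8
enc = nn.Sequential(
    nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(True),    # H -> H/2
    nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(True),  # H/2 -> H/4
    nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(True), # H/4 -> H/8
    nn.AvgPool2d(32),    # a 256x256 input gives a 32x32 map here -> 1x1 after pooling
    nn.Flatten(),
    nn.Linear(256, nz),  # expects exactly 256 features
)

print(enc(torch.randn(1, 3, 256, 256)).shape)  # works: Linear sees [1, 256]
print(enc(torch.randn(1, 3, 512, 512)).shape)  # fails with the same 1024-vs-256
                                               # mismatch as my traceback
```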
Even when I specify --which_model_netE resnet512, like junyanz suggested on the original source code, the code just freezes and doesn't process any image.

Could you tell me which networks you used to produce the 512x512 results in the article? This repo seems to be the only BicycleGAN repository that implements networks supporting images bigger than 256x256.

NOTE: I've used the same dataset to train a model with the original source code, but it requires 256x256 images and the results are not satisfactory.
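For what it's worth, one workaround I've been considering (untested, and it changes the encoder, so it may not reproduce the article's results): make the pooling before the encoder's final Linear adaptive, so the flattened feature size stays the same regardless of the input resolution. Rough idea only, with hypothetical layer shapes, not code from this repo:

```python
import torch
import torch.nn as nn

# Hypothetical tweak: if E ends with a fixed pool -> Flatten -> Linear(256, nz),
# an adaptive pool keeps the Linear input at 256 features for any input size.
nz = 8
head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(256, nz))

for size in (32, 64):  # stand-ins for the conv maps a 256 / 512 input would leave
    z = head(torch.randn(1, 256, size, size))
    print(size, z.shape)  # both print torch.Size([1, 8])
```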
HELP PLZ <3