
RuntimeError: size of dimension does not match previous size, operand 1, dim 0. #30

Open
hahaha12379 opened this issue Feb 14, 2022 · 1 comment

Comments

@hahaha12379

I don't know how to solve this problem >< Thanks, everyone.
Traceback (most recent call last):
  File "train.py", line 177, in <module>
    main()
  File "train.py", line 88, in main
    metrics = engine.train(trainx, trainy[:,0,:,:])
  File "/home/mist/Graph-WaveNet-master/engine.py", line 17, in train
    output = self.model(input)
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/mist/Graph-WaveNet-master/model.py", line 192, in forward
    x = self.gconv[i](x, new_supports)
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/mist/Graph-WaveNet-master/model.py", line 36, in forward
    x1 = self.nconv(x,a)
  File "/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
  File "/home/mist/Graph-WaveNet-master/model.py", line 13, in forward
    x = torch.einsum('ncvl,vw->ncwl',(x,A))
  File "/usr/local/lib/python3.6/dist-packages/torch/functional.py", line 342, in einsum
    return einsum(equation, *_operands)
  File "/usr/local/lib/python3.6/dist-packages/torch/functional.py", line 344, in einsum
    return _VF.einsum(equation, operands)  # type: ignore
RuntimeError: size of dimension does not match previous size, operand 1, dim 0
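For reference: in the einsum equation 'ncvl,vw->ncwl', operand 1 is the support matrix A and its dim 0 is the shared node index v, so the error says A.shape[0] does not equal x.shape[2] (the number of nodes). That usually points to a mismatch between the node count of the data and the size of the adjacency/support matrices. A minimal sketch of the shape requirement, with illustrative shapes that are not taken from the repository:

    import torch

    # x follows Graph-WaveNet's nconv layout 'ncvl':
    # (batch, channels, num_nodes, seq_len); A is a (num_nodes, num_nodes) support.
    batch, channels, num_nodes, seq_len = 64, 32, 207, 12

    x = torch.randn(batch, channels, num_nodes, seq_len)   # 'ncvl'
    A_ok = torch.randn(num_nodes, num_nodes)                # 'vw', v == x.shape[2]
    out = torch.einsum('ncvl,vw->ncwl', (x, A_ok))          # works: (64, 32, 207, 12)

    A_bad = torch.randn(num_nodes + 1, num_nodes + 1)       # v no longer matches x.shape[2]
    # torch.einsum('ncvl,vw->ncwl', (x, A_bad))
    # -> RuntimeError: size of dimension does not match previous size, operand 1, dim 0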

@Meelisha

Were you able to figure out this issue?
