
Params argument given to the optimizer should be an iterable

I tried to set up a simple optimization example in PyTorch, but I get an error saying that the params argument should be an iterable, even though the weights variable is a tensor.

import torch

weights = torch.nn.Parameter(data=torch.Tensor(1, 1, 2, 2), requires_grad=True)
optimizer = torch.optim.SGD(weights, lr=0.1, momentum=0.9, weight_decay=1e-4, nesterov=False)

Here is the traceback I get when I run the script above:

TypeError                                 Traceback (most recent call last)
<ipython-input-175-023057ab67ac> in <module>
----> 1 optimizer = torch.optim.SGD(weights, lr=0.1, momentum=0.9, weight_decay=1e-4, nesterov=False)

/usr/local/lib/python3.9/site-packages/torch/optim/optimizer.py in __init__(self, params, defaults)
     39
     40         if isinstance(params, torch.Tensor):
---> 41             raise TypeError("params argument given to the optimizer should be "
     42                             "an iterable of Tensors or dicts, but got " +
     43                             torch.typename(params))

TypeError: params argument given to the optimizer should be an iterable of Tensors or dicts, but got torch.FloatTensor

PyTorch 1.8
Python 3.9

1 Answer

You just have to pass a list (or any iterable) of Tensors to the optimizer, something like this:

import torch

weights = torch.nn.Parameter(data=torch.Tensor(1, 1, 2, 2), requires_grad=True)
# Wrapping the Parameter in a list gives the optimizer an iterable of Tensors
optimizer = torch.optim.SGD([weights], lr=0.1, momentum=0.9, weight_decay=1e-4, nesterov=False)
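In practice the iterable usually comes from a model's parameters() method rather than a hand-built list. A minimal sketch of both patterns, with a single optimization step on a made-up loss (the shapes and learning rate here are just for illustration):

```python
import torch

# A single Parameter wrapped in a list, as in the answer above
weights = torch.nn.Parameter(torch.randn(1, 1, 2, 2))
optimizer = torch.optim.SGD([weights], lr=0.1, momentum=0.9)

# One step: minimize the squared norm of the weights
optimizer.zero_grad()
loss = (weights ** 2).sum()
loss.backward()
optimizer.step()

# For a full model, parameters() already returns an iterable,
# so no wrapping is needed
model = torch.nn.Linear(2, 2)
model_optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
```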