pytorch: Random classifier: ValueError: optimizer got an empty parameter list

Is there any best practice or efficient way to have a random classifier in pytorch? My random classifier basically looks like this:

```python
def forward(self, inputs):
    # get a random tensor
    logits = torch.rand(batch_size, num_targets, num_classes)
    return logits
```

This should be fine in principle, but the optimizer raises a ValueError because the classifier – in contrast to all other classifiers / models in the system – does not have any parameters that can be optimized, obviously. Is there a torch built-in solution to this or must I change the system’s architecture (to not perform optimization)?

Edit: If I add some arbitrary parameters to the model as shown below, computing gradients on the loss raises a RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn

```python
def __init__(self, transformer_models: Dict, opt: Namespace):
    super(RandomMulti, self).__init__()
    self.num_classes = opt.polarities_dim
    # add some parameters so that the optimizer doesn't raise an exception
    self.some_params = nn.Linear(2, 2)
```

My assumption really would be that there is a simpler solution, since having a random baseline classifier is a rather common thing in machine learning.

1 Answer

Indeed, having a "random" baseline is common practice, but you usually do not need to explicitly generate one, let alone "train" it. In most cases you can compute the expected performance of the random baseline analytically. For instance, in ImageNet classification you have 1000 categories of equal size, so predicting a category uniformly at random gives an expected accuracy of 1/1000. You do not need to instantiate a random classifier to produce that number.
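If you want a quick sanity check of that expected value, you can simulate it in a few lines; this is a sketch with arbitrary sample counts, not part of any real training pipeline:

```python
import torch

num_classes = 1000
num_samples = 100_000

# arbitrary ground-truth labels and uniformly random predictions
targets = torch.randint(num_classes, (num_samples,))
random_preds = torch.randint(num_classes, (num_samples,))

# empirical accuracy should hover around 1 / num_classes = 0.001
accuracy = (random_preds == targets).float().mean().item()
print(accuracy)
```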

If you insist on explicitly instantiating a random classifier, ask yourself: what would it mean to "train" it? That is what the errors are telling you; PyTorch simply cannot make sense of optimizing a model that has no parameters and no gradient path from the loss. You can have a random classifier and evaluate its performance, but there is nothing meaningful to train.
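A minimal sketch of what that looks like in practice: keep the random classifier as a parameterless nn.Module, skip the optimizer for this model entirely, and run it only under torch.no_grad() at evaluation time. The module and shapes here are hypothetical, not from the question's codebase:

```python
import torch
import torch.nn as nn

class RandomClassifier(nn.Module):
    """A baseline that ignores its input and predicts uniformly at random."""

    def __init__(self, num_classes):
        super().__init__()
        self.num_classes = num_classes

    def forward(self, inputs):
        # derive the batch size from the input instead of a global variable
        batch_size = inputs.shape[0]
        return torch.rand(batch_size, self.num_classes)

model = RandomClassifier(num_classes=3)
model.eval()

# evaluation only: no optimizer, no backward pass
with torch.no_grad():
    inputs = torch.randn(8, 16)   # hypothetical batch of 8 feature vectors
    logits = model(inputs)
    preds = logits.argmax(dim=-1)

print(preds.shape)
```

Since no gradients ever flow through this model, the empty-parameter-list and missing-grad_fn errors simply never come up.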
