Drop last batch

I am trying to develop a custom model. In my model, I have an LSTM where I use the previous hidden and cell states (h_0, c_0). So, if the number of samples isn't divisible by the batch_size, I get the following error: RuntimeError: Expected hidden[0] size (1, 3, 256), got [1, 8, 256].
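
For reference, here is a minimal reproduction with a plain PyTorch LSTM (the input size and sequence length below are arbitrary; only the batch dimension matters):

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=40, hidden_size=256, num_layers=1, batch_first=True)

# Hidden/cell states cached from a full batch of 8
h_0 = torch.zeros(1, 8, 256)
c_0 = torch.zeros(1, 8, 256)

# Final partial batch with only 3 samples
last_batch = torch.randn(3, 50, 40)

# Raises: RuntimeError: Expected hidden[0] size (1, 3, 256), got [1, 8, 256]
out, (h_n, c_n) = lstm(last_batch, (h_0, c_0))
```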

The PyTorch DataLoader has a drop_last option. How can I achieve the same behaviour with sb.dataio.dataset.DynamicItemDataset?

Well, I solved this by overriding the make_dataloader function in my class and adding loader_kwargs['drop_last'] = True before calling the sb.dataio.dataloader.make_dataloader function.
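
Roughly, the override looks like this (a sketch assuming a sb.Brain subclass; I forward to super().make_dataloader rather than calling sb.dataio.dataloader.make_dataloader directly, but the idea is the same):

```python
import speechbrain as sb

class SeqBrain(sb.Brain):  # hypothetical Brain subclass
    def make_dataloader(self, dataset, stage, **loader_kwargs):
        # Drop the final partial batch during training so the cached
        # (h_0, c_0) always match the batch dimension of the input.
        if stage == sb.Stage.TRAIN:
            loader_kwargs["drop_last"] = True
        return super().make_dataloader(dataset, stage, **loader_kwargs)
```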

Interesting. @Gastron Should we allow passing this argument to our data loader?


Oh, yep, no need to override make_dataloader for this, though that also works, sure. But drop_last can simply be passed as part of the normal train_loader_kwargs.
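
For example (a sketch; brain, train_data, valid_data, and epoch_counter are placeholders from a typical recipe):

```python
# Pass drop_last through fit()'s train_loader_kwargs instead of overriding
# make_dataloader; the kwargs are forwarded to the training dataloader.
brain.fit(
    epoch_counter,
    train_data,
    valid_data,
    train_loader_kwargs={"batch_size": 8, "drop_last": True},
    valid_loader_kwargs={"batch_size": 8},
)
```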