fynance.models.rolling.RollMultiLayerPerceptron.set_lr_scheduler

RollMultiLayerPerceptron.set_lr_scheduler(lr_scheduler, **kwargs)

Set a dynamic learning rate by wrapping the optimizer in a scheduler.

Parameters:
lr_scheduler : torch.optim.lr_scheduler._LRScheduler

Scheduler class from torch.optim.lr_scheduler used to wrap self.optimizer; see the torch.optim.lr_scheduler module in the PyTorch documentation [2].

**kwargs

Keyword arguments to pass to the learning rate scheduler.

References

[2] https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate