add last_epoch argument to salt/modelwrapper.py
This MR adds an option for resetting the learning rate scheduler. With OneCycleLR, the learning rate for every step is computed once, at the start of training, so the schedule is fixed to the originally configured number of steps. When training is resumed beyond the original maximum number of epochs, the scheduler therefore has to be reset. This MR exposes the scheduler's `last_epoch` argument in the config file so the reset can be configured there.
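For context, a minimal sketch (not the salt code itself; the toy model and optimizer are illustrative) of why the reset is needed: OneCycleLR precomputes its schedule from `total_steps` at construction time, and stepping past that count raises an error, so continuing training requires constructing a fresh scheduler with `last_epoch=-1` (the default), which restarts the cycle.

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

# Toy model/optimizer purely for illustration.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# OneCycleLR derives the LR for every step from total_steps when it is
# constructed, so the schedule cannot be extended after the fact.
total_steps = 100
scheduler = OneCycleLR(optimizer, max_lr=0.1, total_steps=total_steps)

# ... train for total_steps steps, calling scheduler.step() each step ...

# To train beyond the original schedule, build a new scheduler with
# last_epoch=-1, which resets the cycle from the beginning.
scheduler = OneCycleLR(optimizer, max_lr=0.1, total_steps=total_steps,
                       last_epoch=-1)
```

With the default `div_factor=25`, the reset scheduler starts the optimizer back at the initial learning rate `max_lr / 25`.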