There's a new
`timm` release, v1.0.12, with a focus on optimizers. The optimizer factory has been refactored; there's now a `timm.optim.list_optimizers()` and a new way to register optimizers and their attributes. As always, you can use a `timm` optimizer like a `torch` one, just replace `torch.optim` with `timm.optim`.
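A minimal sketch of listing the registered optimizers and building one through the long-standing factory entry point `create_optimizer_v2`; the model choice and hyperparameters below are placeholders, not from the post:

```python
import timm
from timm.optim import create_optimizer_v2, list_optimizers

# Any timm model works here; resnet18 / num_classes are just placeholders.
model = timm.create_model("resnet18", num_classes=10)

# Enumerate every registered optimizer name (the new list_optimizers() call).
print(list_optimizers())

# Build an optimizer by its registered string name via the factory.
optimizer = create_optimizer_v2(model, opt="adamw", lr=1e-3, weight_decay=0.05)
```

For the class-level drop-in usage the post describes, the same `Optimizer(model.parameters(), lr=...)` pattern you would use with `torch.optim` applies under the `timm.optim` namespace.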
New optimizers include:
* AdafactorBigVision - `adfactorbv`
* ADOPT - `adopt` / `adoptw` (decoupled decay)
* MARS - `mars`
* LaProp - `laprop`
* Cautious Optimizers - a modification to all of the above, prefix with `c`, as well as `cadamw`, `cnadamw`, `csgdw`, `clamb`, `crmsproptf` (see the sketch after this list)
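A small sketch of selecting the new optimizers by the short names above, again via the factory; only names that appear in this post are used, and the model and hyperparameters are placeholders:

```python
import torch.nn as nn
from timm.optim import create_optimizer_v2

model = nn.Linear(10, 2)  # stand-in for any model

# ADOPT with decoupled weight decay, selected by its short name from the list above.
opt = create_optimizer_v2(model, opt="adoptw", lr=1e-3, weight_decay=0.05)

# Cautious variant: prefix the name with "c"; cadamw is one of the listed names.
copt = create_optimizer_v2(model, opt="cadamw", lr=1e-3, weight_decay=0.05)
```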
I shared some caution comparisons in this model repo: rwightman/timm-optim-caution
For details and references, see the code: https://github.com/huggingface/pytorch-image-models/tree/main/timm/optim