Deep Learning Optimizer: "Lookahead Optimizer: k steps forward, 1 step back"



Project:

https://github.com/michaelrzhang/lookahead


PyTorch implementation:

https://github.com/michaelrzhang/lookahead/blob/master/lookahead_pytorch.py


Paper:

https://arxiv.org/abs/1907.08610


Usage (PyTorch):

from lookahead_pytorch import Lookahead  # the repo file linked above

optimizer = torch.optim.Adam(model.parameters())  # any inner optimizer works
if args.lookahead:
    optimizer = Lookahead(optimizer, la_steps=args.la_steps, la_alpha=args.la_alpha)
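Training then proceeds exactly as with the inner optimizer; the wrapper counts calls to step() and performs the slow-weight update every la_steps steps. A minimal sketch, where model, loader, and loss_fn are placeholder names and the wrapper is assumed to forward zero_grad() to the inner optimizer:

for x, y in loader:                 # model, loader, loss_fn are placeholders
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()                # fast step; slow weights sync every la_steps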

We found that evaluation performance is typically better using the slow weights. In PyTorch, you can temporarily swap the slow weights in around your eval loop:


if args.lookahead:
    optimizer._backup_and_load_cache()   # back up fast weights, load slow weights
    val_loss = eval_func(model)          # evaluate with the slow weights
    optimizer._clear_and_load_backup()   # restore fast weights to resume training
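For reference, the update the paper describes is: run the inner (fast) optimizer for k steps, then move the slow weights one interpolation step toward the result, slow ← slow + α·(fast − slow), and reset the fast weights to the slow weights. A minimal sketch of that loop, not the repo's implementation (see lookahead_pytorch.py for the real version):

import torch

class LookaheadSketch:
    """Minimal sketch of Lookahead: k steps forward, 1 step back."""
    def __init__(self, base_optimizer, k=5, alpha=0.5):
        self.opt = base_optimizer
        self.k = k
        self.alpha = alpha
        self.counter = 0
        # slow weights start as a detached copy of the fast weights
        self.slow = [p.detach().clone()
                     for group in self.opt.param_groups
                     for p in group['params']]

    def step(self):
        self.opt.step()                 # one fast step with the inner optimizer
        self.counter += 1
        if self.counter % self.k == 0:  # every k fast steps: 1 step back
            fast = [p for group in self.opt.param_groups
                    for p in group['params']]
            with torch.no_grad():
                for p, s in zip(fast, self.slow):
                    s.add_(p - s, alpha=self.alpha)  # slow += α * (fast - slow)
                    p.copy_(s)                       # reset fast weights to slow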




Citation (BibTeX):

@article{zhang2019lookahead,
  title={Lookahead Optimizer: k steps forward, 1 step back},
  author={Zhang, Michael R and Lucas, James and Hinton, Geoffrey and Ba, Jimmy},
  journal={arXiv preprint arXiv:1907.08610},
  year={2019}
}

