The root cause is that some of the base libraries no longer match the model:
Q: rope_scaling must be a dictionary with two fields, name and factor, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
pip install --upgrade transformers
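As a quick sanity check after the upgrade, the rope_scaling dict quoted in the error can be passed straight to a LlamaConfig; a transformers release that understands the 'llama3' rope_type accepts it, while older releases raise the ValueError above. This is only a sketch, and the max_position_embeddings value is an assumption matching Llama 3.1 checkpoints.

```python
# Sanity-check sketch: does the installed transformers accept the Llama 3.1
# style rope_scaling dict quoted in the error above?
from transformers import LlamaConfig

config = LlamaConfig(
    max_position_embeddings=131072,  # assumption: Llama 3.1 context length
    rope_scaling={                   # values copied from the error message
        "rope_type": "llama3",
        "factor": 8.0,
        "low_freq_factor": 1.0,
        "high_freq_factor": 4.0,
        "original_max_position_embeddings": 8192,
    },
)
print("rope_scaling accepted:", config.rope_scaling)  # raises on an old transformers
```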
Q: ImportError: cannot import name 'top_k_top_p_filtering' from 'transformers'
pip install --upgrade trl
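This error typically means the installed trl is old enough to still import top_k_top_p_filtering from transformers, which newer transformers no longer exports; upgrading trl resolves it. A minimal check sketch, assuming both packages are installed:

```python
# Minimal check sketch: print the installed versions and confirm that trl
# imports cleanly against the installed transformers.
from importlib.metadata import version

print("transformers:", version("transformers"))
print("trl:", version("trl"))

try:
    import trl  # an older trl fails here with the ImportError quoted above
    print("trl imported OK")
except ImportError as err:
    print("trl/transformers mismatch, upgrade trl:", err)
```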
Q: ImportError: Using the Trainer with PyTorch requires accelerate>=0.26.0:
pip install -U accelerate
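The same kind of check works for the accelerate requirement the Trainer enforces; a sketch using packaging (already a transformers dependency) to compare against the 0.26.0 minimum quoted in the error:

```python
# Sketch: verify that the installed accelerate meets the >=0.26.0 minimum
# that the Trainer error message asks for.
from importlib.metadata import version
from packaging.version import parse

installed = parse(version("accelerate"))
if installed >= parse("0.26.0"):
    print("accelerate", installed, "is new enough")
else:
    print("accelerate", installed, "is too old -> pip install -U accelerate")
```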
Tags: scaling, LLaMa, ImportError, rope, install, factor, pip
From: https://www.cnblogs.com/epicmo/p/18509986