[LLM / flash attention installation] Resolving the flash-attention install error site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops9_pad_enum4callERKNS_6TensorEN3c108ArrayRefINS5_6SymIntEEElNS5_8optionalIdEE
The fix is described below.
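The symbol in the error demangles to at::_ops::_pad_enum::call(at::Tensor const&, c10::ArrayRef<c10::SymInt>, long, c10::optional<double>), an internal PyTorch 2.x operator. The dynamic loader cannot resolve it because the flash_attn_2_cuda extension was compiled against a different PyTorch version than the one currently installed, so the two binaries no longer agree on the C++ ABI. The usual remedy is to make the flash-attn build match the local torch: uninstall the mismatched package, then either rebuild from source against the installed torch or install a prebuilt wheel whose tags match your torch/CUDA/Python versions. The commands below are a minimal sketch of that procedure; the wheel filename in the last step is only an illustration of the naming scheme used on the project's GitHub releases page, so substitute the asset that matches your environment.

```bash
# Inspect the installed PyTorch, CUDA, and Python versions; the
# flash-attn binary must be built against exactly these.
python -c "import torch; print(torch.__version__, torch.version.cuda)"
python --version

# Remove the mismatched flash-attn build.
pip uninstall -y flash-attn

# Option A: compile flash-attn locally against the installed torch
# (needs nvcc and ninja; compilation can take a long time).
pip install ninja packaging
pip install flash-attn --no-build-isolation

# Option B: install a prebuilt wheel matching the versions printed
# above, downloaded from
#   https://github.com/Dao-AILab/flash-attention/releases
# The filename below is illustrative only (torch 2.3 + CUDA 12.2 +
# CPython 3.10); pick the release asset that matches your setup.
pip install flash_attn-2.5.8+cu122torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

If `import flash_attn` then succeeds without the undefined-symbol error, the binaries are in agreement. Note that the error will reappear whenever torch is upgraded without rebuilding or reinstalling flash-attn to match.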
Source: https://blog.csdn.net/lov1993/article/details/140804472