Commit
Merge pull request #669 from OptimalScale/rpan-support-a6000
Add flash attention install for A6000
2003pro authored Oct 29, 2023
2 parents 2a311b3 + 7932633 commit f846912
Showing 1 changed file with 2 additions and 2 deletions.
install.sh: 4 changes (2 additions & 2 deletions)
@@ -3,6 +3,6 @@
 pip install -e .
 
 gpu_state="$(nvidia-smi --query-gpu=name --format=csv,noheader)"
-if [[ "${gpu_state}" == *"A100"* || "${gpu_state}" == *"A40"* ]]; then
+if [[ "${gpu_state}" == *"A100"* || "${gpu_state}" == *"A40"* || "${gpu_state}" == *"A6000"* ]]; then
 pip install flash-attn==2.0.2
-fi
+fi
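As a usage note, the condition above matches against the raw output of nvidia-smi, so a quick way to confirm whether a given machine will trigger the flash-attn install is to run the same query by hand. The snippet below is a minimal sketch under the assumption that the driver reports the card name with an "A6000" substring (for example "NVIDIA RTX A6000"); the exact string can vary by driver version:

gpu_state="$(nvidia-smi --query-gpu=name --format=csv,noheader)"
echo "Detected GPU(s): ${gpu_state}"
# Mirror the check in install.sh and report whether flash-attn 2.0.2 would be installed.
if [[ "${gpu_state}" == *"A100"* || "${gpu_state}" == *"A40"* || "${gpu_state}" == *"A6000"* ]]; then
    echo "install.sh would run: pip install flash-attn==2.0.2"
else
    echo "install.sh would skip the flash-attn install"
fi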
