Add _support_flash_attn_2 to Llama 2 32k

#37

Required to enable Flash Attention 2 for this model.
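
For context, a minimal sketch of what a change like this typically looks like in a custom modeling file. The class name below is an assumption, and note that upstream transformers spells the attribute `_supports_flash_attn_2`:

```python
from transformers import LlamaConfig, PreTrainedModel


class LlamaPreTrainedModel(PreTrainedModel):  # assumed class name
    config_class = LlamaConfig
    base_model_prefix = "model"
    # Class-level flag that transformers checks before dispatching
    # attention through the Flash Attention 2 kernels for this
    # architecture. Upstream transformers names it `_supports_flash_attn_2`.
    _supports_flash_attn_2 = True
```

With the flag in place, loading the model with Flash Attention 2 enabled looks roughly like this (the repo id is an assumption, and the `attn_implementation` argument requires a recent transformers version; older releases used `use_flash_attention_2=True`):

```python
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "togethercomputer/LLaMA-2-7B-32K",  # assumed repo id
    torch_dtype=torch.float16,
    trust_remote_code=True,
    attn_implementation="flash_attention_2",
)
```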

arshzahed (Together org) changed pull request status to merged
