1 parent df5dafa commit 3132a93
CMakeLists.txt
@@ -576,7 +576,7 @@ else()
   FetchContent_Declare(
     vllm-flash-attn
     GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
-    GIT_TAG 0aff05f577e8a10086066a00618609199b25231d
+    GIT_TAG 9732b0ce005d1e6216864788502d5570004678f5
     GIT_PROGRESS TRUE
     # Don't share the vllm-flash-attn build between build types
     BINARY_DIR ${CMAKE_BINARY_DIR}/vllm-flash-attn
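For context, this change bumps the commit that the vllm-flash-attn dependency is pinned to. A minimal sketch of the surrounding `FetchContent` usage is below; the `FetchContent_MakeAvailable` call is not shown in the diff and is assumed here as the usual companion step:

```cmake
include(FetchContent)

# Declare the dependency, pinned to an exact commit SHA so builds are
# reproducible (a branch name would silently move between builds).
FetchContent_Declare(
  vllm-flash-attn
  GIT_REPOSITORY https://github.com/vllm-project/flash-attention.git
  GIT_TAG 9732b0ce005d1e6216864788502d5570004678f5
  GIT_PROGRESS TRUE
)

# Download the source and add it to the build at configure time
# (assumed usage; not part of the diff above).
FetchContent_MakeAvailable(vllm-flash-attn)
```

Pinning `GIT_TAG` to a full 40-character SHA also lets CMake skip the remote ref lookup on re-configure, since the commit can never change.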