
Commit 69724e8

khluu and SzymonOzog authored and committed
[misc] Fix setup.py condition to avoid AMD from being mistaken with CPU (vllm-project#13022)
Signed-off-by: kevin <[email protected]>
Signed-off-by: SzymonOzog <[email protected]>
1 parent b6761ea commit 69724e8

File tree

1 file changed (+3, -2 lines)

setup.py

Lines changed: 3 additions & 2 deletions
@@ -48,8 +48,9 @@ def load_module_from_path(module_name, path):
         "so vLLM may not be able to run correctly", sys.platform)
     VLLM_TARGET_DEVICE = "empty"
 elif (sys.platform.startswith("linux") and torch.version.cuda is None
-      and os.getenv("VLLM_TARGET_DEVICE") is None):
-    # if cuda is not available and VLLM_TARGET_DEVICE is not set,
+      and os.getenv("VLLM_TARGET_DEVICE") is None
+      and torch.version.hip is None):
+    # if cuda or hip is not available and VLLM_TARGET_DEVICE is not set,
     # fallback to cpu
     VLLM_TARGET_DEVICE = "cpu"
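For context, here is a minimal sketch of how the platform-detection fallback behaves after this change. It assumes only that torch exposes torch.version.cuda and torch.version.hip (each None when the corresponding build is absent); the initial default value below is illustrative and not taken from vLLM's actual setup.py.

import os
import sys

import torch

# Illustrative default; in vLLM the initial value comes from its own
# environment handling, not a plain os.getenv call.
VLLM_TARGET_DEVICE = os.getenv("VLLM_TARGET_DEVICE", "cuda")

if (sys.platform.startswith("linux") and torch.version.cuda is None
        and os.getenv("VLLM_TARGET_DEVICE") is None
        and torch.version.hip is None):
    # Neither a CUDA nor a ROCm (HIP) build of torch is present and the user
    # did not set VLLM_TARGET_DEVICE, so fall back to a CPU build. Before
    # this commit the torch.version.hip check was missing, so a ROCm (AMD)
    # torch install could be mistaken for a CPU-only environment.
    VLLM_TARGET_DEVICE = "cpu"

With the extra torch.version.hip check, a Linux machine with a ROCm build of torch no longer trips the CPU fallback and keeps whatever target device vLLM's normal detection selects.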

0 commit comments
