Commit 027acd9

[Fix] Tests breaking with vLLM:Main (#306)
Closes #305

The upstream changes that break tests with the vLLM:main branch are twofold:

1. [This PR](vllm-project/vllm#20061) introduces [an import](https://github.com/vllm-project/vllm/blob/8020e98c9f033e76c97eb8261f772d59eba49c9a/vllm/model_executor/model_loader/bitsandbytes_loader.py#L23) that is incompatible with torch 2.5.1.
2. Previously, when `model_config.task` was set to `auto`, vLLM could resolve the task automatically: after the post-init method of `ModelConfig`, the `task` field was replaced from `auto` with e.g. `generate` or `embed`. With the changes in [this PR](vllm-project/vllm@020f58a), the task in `model_config` can no longer be automatically resolved when it is `generate`. This does not look intentional, but rather like a side effect of adding newly supported tasks. A cleaner (but longer) approach would be a PR to vLLM reintroducing the automatic resolution of the `auto` task field.

This PR addresses both breaking changes:

* For 1, it installs pytorch==2.7.0 in the GitHub workflows when the vLLM:main branch is used.
* For 2, it now infers the task through `model_config.supported_tasks` rather than `model_config.task`.

---------

Signed-off-by: Sophie du Couédic <[email protected]>
1 parent 5975e98 commit 027acd9

File tree

2 files changed: +10 −2 lines changed


.github/workflows/test.yml

Lines changed: 7 additions & 2 deletions

@@ -70,11 +70,16 @@ jobs:
            tests/**/*.py
            vllm_spyre/**/*.py

-     - name: "Install PyTorch"
-       if: steps.changed-src-files.outputs.any_changed == 'true'
+     - name: "Install PyTorch 2.5.1"
+       if: (steps.changed-src-files.outputs.any_changed == 'true' && !matrix.vllm_version.repo)
        run: |
          pip install torch=="2.5.1+cpu" --index-url https://download.pytorch.org/whl/cpu

+     - name: "Install PyTorch 2.7.0"
+       if: (steps.changed-src-files.outputs.any_changed == 'true' && matrix.vllm_version.repo)
+       run: |
+         pip install torch=="2.7.0+cpu" --index-url https://download.pytorch.org/whl/cpu
+
      - name: "Install uv"
        if: steps.changed-src-files.outputs.any_changed == 'true'
        uses: astral-sh/setup-uv@v5

vllm_spyre/platform.py

Lines changed: 3 additions & 0 deletions

@@ -67,6 +67,9 @@ def check_and_update_config(cls, vllm_config: VllmConfig) -> None:

         is_decoder = model_config.task == "generate"
         is_embedding = model_config.task == "embed"
+        if model_config.task == "auto":
+            is_embedding = "embed" in model_config.supported_tasks
+            is_decoder = "generate" in model_config.supported_tasks

         # v0 is only supported for embedding models, and embedding models must
         # be run on v0
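The fallback above can be sketched in isolation. This is a minimal, self-contained illustration of the task-resolution logic; `StubModelConfig` and `resolve_roles` are stand-ins invented for this example, not vLLM's real `ModelConfig` or any vllm-spyre API.

```python
from dataclasses import dataclass, field

@dataclass
class StubModelConfig:
    # Stand-in for vLLM's ModelConfig: only the two fields the fix reads.
    task: str = "auto"
    supported_tasks: set = field(default_factory=lambda: {"generate"})

def resolve_roles(model_config: StubModelConfig) -> tuple[bool, bool]:
    # Original checks: only meaningful when vLLM has already resolved "auto".
    is_decoder = model_config.task == "generate"
    is_embedding = model_config.task == "embed"
    # Fix: when the task is still "auto", fall back to the supported-tasks
    # set instead of the (unresolved) task field.
    if model_config.task == "auto":
        is_embedding = "embed" in model_config.supported_tasks
        is_decoder = "generate" in model_config.supported_tasks
    return is_decoder, is_embedding

print(resolve_roles(StubModelConfig()))              # (True, False)
print(resolve_roles(StubModelConfig(task="embed")))  # (False, True)
```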

0 commit comments

Comments
 (0)