[gpt-oss] guard import when triton kernel is not installed #22529
Conversation
Signed-off-by: Yongye Zhu <[email protected]>
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging. 🚀
Code Review
This pull request aims to make `triton_kernels` an optional dependency by guarding its import. My review focuses on the correctness of this implementation. I've identified a critical issue where the current change is incomplete and will likely cause runtime errors if `triton_kernels` is not installed: the code that depends on the optional imports also needs to be conditionally defined.
```diff
+from vllm.utils import has_triton_kernels

-if True:
+if has_triton_kernels():
```
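The `has_triton_kernels` helper itself isn't shown in this diff. For context, a typical availability probe of this kind looks roughly like the sketch below; the body is an assumption for illustration, not the actual `vllm.utils` implementation.

```python
# Illustrative sketch only: assumes has_triton_kernels() is a cached
# import-availability probe, not the exact vLLM implementation.
import importlib.util
from functools import cache


@cache
def has_triton_kernels() -> bool:
    """Return True if the optional triton_kernels package is importable."""
    return importlib.util.find_spec("triton_kernels") is not None
```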
While this change correctly guards the imports, the code that uses these imported modules (e.g. `swiglu`, `PrecisionConfig`) also needs to be conditionally defined. Without this, if `triton_kernels` is not installed, the Python interpreter will raise a `NameError` when it tries to define functions or classes that reference these names.

To fix this, you should move all the code that depends on `triton_kernels` inside this `if` block. You should also consider adding an `else` block that defines dummy implementations or raises an `ImportError` for the exported names (like `BatchedOAITritonExperts` and `triton_kernel_moe_forward`) to provide a clear error message to users who try to use this functionality without `triton_kernels` installed.
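A minimal sketch of the dummy-fallback pattern suggested here, using the import list that appears later in this thread; the placeholder class and error message are illustrative assumptions, not the code that was actually committed in this PR:

```python
from vllm.utils import has_triton_kernels

if has_triton_kernels():
    import triton_kernels.swiglu
    from triton_kernels.matmul_ogs import (FnSpecs, FusedActivation,
                                           PrecisionConfig, matmul_ogs)
    from triton_kernels.routing import routing
else:
    class _TritonKernelsUnavailable:
        """Dummy stand-in that fails loudly only when actually used."""

        def __init__(self, *args, **kwargs):
            raise ImportError(
                "triton_kernels is not installed; it is required for the "
                "gpt-oss Triton MoE path.")

    # The module still imports cleanly; any attempt to instantiate or call
    # these names raises a clear ImportError instead of a NameError.
    FnSpecs = FusedActivation = PrecisionConfig = _TritonKernelsUnavailable
    matmul_ogs = routing = _TritonKernelsUnavailable
```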
@zyongye I think Gemini's comment makes sense. Can you add something like this?
```python
if has_triton_kernels():
    import ...
else:
    FnSpecs = None
    FusedActivation = None
    ...
```
But this function will never touch those lines if there's no triton kernel installed. I put an error in the `FusedMoE` init:
https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/layers/fused_moe/layer.py#L729-L730
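For readers following along, the kind of init-time guard being referred to looks roughly like the fragment below; the parameter name and error message are assumptions for illustration, not the exact code at the linked lines.

```python
from vllm.utils import has_triton_kernels


class FusedMoE:  # fragment for illustration only
    def __init__(self, use_triton_kernels: bool = False) -> None:
        # Fail at layer construction time with a clear message instead of
        # hitting an error deep inside the Triton MoE path later.
        if use_triton_kernels and not has_triton_kernels():
            raise ImportError(
                "triton_kernels is required for this MoE backend; "
                "please install it.")
```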
Just tried and I have to use:

```diff
-if True:
+if has_triton_kernels():
     import triton_kernels.swiglu
     from triton_kernels.matmul_ogs import (FnSpecs, FusedActivation,
                                            PrecisionConfig, matmul_ogs)
     from triton_kernels.routing import routing
+else:
+    PrecisionConfig = None
```

Otherwise, I get:

```
(APIServer pid=5435)   File "/root/vllm/vllm/model_executor/layers/fused_moe/gpt_oss_triton_kernels_moe.py", line 146, in BatchedOAITritonExperts
(APIServer pid=5435)     w1_precision: PrecisionConfig, w2_precision: PrecisionConfig):
(APIServer pid=5435)                                    ^^^^^^^^^^^^^^^
(APIServer pid=5435) NameError: name 'PrecisionConfig' is not defined
```
Ok, new change pushed.
@zyongye Actually, we can just use a `TYPE_CHECKING` guard. I fixed the PR.
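For reference, a sketch of the `TYPE_CHECKING` approach mentioned here: the import is only evaluated by static type checkers, and the annotations stay as strings, so nothing from `triton_kernels` is resolved at runtime. The function below mirrors the signature from the earlier traceback but is otherwise a hypothetical example, not the code in this PR.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only type checkers evaluate this import; it never runs at runtime.
    from triton_kernels.matmul_ogs import PrecisionConfig


def make_batched_experts(w1_precision: "PrecisionConfig",
                         w2_precision: "PrecisionConfig") -> None:
    # String annotations keep the module importable even when
    # triton_kernels is absent; no NameError at definition time.
    ...
```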
Signed-off-by: Yongye Zhu <[email protected]>
Signed-off-by: Yongye Zhu <[email protected]>
Signed-off-by: Woosuk Kwon <[email protected]>
Thanks for the fix. Will merge after it passes the pre-commit and lints.