Is there a way to run vLLM without a torch.compiled model? #11051

carlesoctav asked in Q&A

I'm trying to debug with print statements, but that doesn't work on a torch.compiled model.


Replies: 1 comment


VLLM_USE_V1=0

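For context, a minimal sketch of how the suggestion could be applied. The environment variable comes from the reply above; the assumption is that `VLLM_USE_V1=0` falls back to the older engine path that does not wrap the model in torch.compile. The model name and generation call are purely illustrative.

```python
import os

# Assumption: VLLM_USE_V1=0 selects the legacy engine, which does not wrap
# the model forward pass in torch.compile, so print statements inside model
# code run normally. Set it before importing vllm so the engine sees it.
os.environ["VLLM_USE_V1"] = "0"

from vllm import LLM, SamplingParams

# Small model chosen only to make a quick debugging run cheap.
llm = LLM(model="facebook/opt-125m")
outputs = llm.generate(
    ["Hello, my name is"],
    SamplingParams(max_tokens=16),
)
print(outputs[0].outputs[0].text)
```

The same effect should be achievable from the shell, e.g. `VLLM_USE_V1=0 python debug_script.py`, where `debug_script.py` stands in for your own script.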