Conversation


@tobrun tobrun commented Sep 18, 2025

This is a first step toward improving the usability and robustness of this repository.
This change ensures we clean up all vLLM instances whenever we exit, whether:

  • the task finishes executing
  • an error occurs
  • a keyboard interrupt is received

IMO this isn't the optimal solution. I'd prefer to decouple the LLM engine's lifetime from the actual job processing, but that is for a future PR.

closes #109

@callanwu
Member

Thanks for the PR! We'll review it soon.
