Fix: LLMInferenceService reconciliation for Gateway refs in baseRefs #4944

Merged: terrytangyuan merged 1 commit into kserve:master from vivekk16:fix-gateway-baseref-reconciliation on Jan 12, 2026

Conversation

@vivekk16 (Contributor) commented:

What this PR does / why we need it:

When an LLMInferenceService references Gateway configurations only through baseRefs (that is, via an LLMInferenceServiceConfig), changes to those Gateways did not trigger reconciliation of the LLMInferenceService. As a result, the service could get stuck in the GatewaysNotReady state or be slow to become Ready.

This PR modifies the enqueueOnGatewayChange handler to resolve baseRefs and check the combined spec for gateway references, so that Gateway events also enqueue services whose gateway references are inherited from a base config.
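
To make the idea concrete, here is a minimal, self-contained Go sketch of the check described above (not the actual kserve controller code): when a Gateway changes, resolve the service's baseRefs into a combined spec before deciding whether that Gateway is referenced. All type and helper names here (gatewayRef, llmSpec, configStore, resolveCombinedSpec, specReferencesGateway) are hypothetical stand-ins for the real kserve API types and handler.

```go
package main

import "fmt"

// gatewayRef identifies a Gateway by namespace/name; a stand-in for the real ref type.
type gatewayRef struct{ Namespace, Name string }

// llmSpec is a minimal stand-in for an LLMInferenceService spec: it may carry its own
// gateway refs and/or baseRefs naming LLMInferenceServiceConfig objects.
type llmSpec struct {
	GatewayRefs []gatewayRef
	BaseRefs    []string // names of LLMInferenceServiceConfig objects
}

// configStore stands in for looking up LLMInferenceServiceConfig specs from the cluster.
type configStore map[string]llmSpec

// resolveCombinedSpec merges the gateway refs of all baseRefs with the service's own refs,
// so refs declared only in a base config are still visible to the handler.
func resolveCombinedSpec(svc llmSpec, configs configStore) llmSpec {
	combined := llmSpec{}
	for _, name := range svc.BaseRefs {
		if base, ok := configs[name]; ok {
			combined.GatewayRefs = append(combined.GatewayRefs, base.GatewayRefs...)
		}
	}
	combined.GatewayRefs = append(combined.GatewayRefs, svc.GatewayRefs...)
	return combined
}

// specReferencesGateway reports whether the given spec points at the changed Gateway.
func specReferencesGateway(spec llmSpec, gw gatewayRef) bool {
	for _, ref := range spec.GatewayRefs {
		if ref == gw {
			return true
		}
	}
	return false
}

func main() {
	configs := configStore{
		"shared-config": {GatewayRefs: []gatewayRef{{Namespace: "default", Name: "llm-gateway"}}},
	}
	// The service declares no gateway directly; it inherits one only via baseRefs.
	svc := llmSpec{BaseRefs: []string{"shared-config"}}
	changed := gatewayRef{Namespace: "default", Name: "llm-gateway"}

	// Checking only the service's own refs misses the match (the pre-fix behavior);
	// checking the combined spec finds it, so the service would be enqueued.
	fmt.Println("own refs match:     ", specReferencesGateway(svc, changed))
	fmt.Println("combined refs match:", specReferencesGateway(resolveCombinedSpec(svc, configs), changed))
}
```

In the real controller, a check like this runs inside the enqueueOnGatewayChange handler so that services matching the changed Gateway, directly or through a base config, are enqueued for reconciliation.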

Type of changes

  • Bug fix (non-breaking change which fixes an issue)

Feature/Issue validation/testing:

  1. Create a Gateway in the target namespace
  2. Create an LLMInferenceServiceConfig with spec.router.gateway.refs pointing to the Gateway
  3. Create an LLMInferenceService with baseRefs referencing the LLMInferenceServiceConfig (illustrative manifests for steps 2 and 3 are sketched after this list)
  4. Wait for the Gateway to become Ready
  5. Verify that the LLMInferenceService reconciles and transitions from GatewaysNotReady to Ready promptly without manual intervention
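
For steps 2 and 3, manifests along these lines exercise the scenario. This is a sketch, not copied from kserve docs: the apiVersion, resource names, and the exact placement of baseRefs and router.gateway.refs are assumptions based on the field paths mentioned in this PR; consult the kserve CRDs for the authoritative schema.

```yaml
# Assumes the Gateway from step 1 exists as default/llm-gateway.
apiVersion: serving.kserve.io/v1alpha1   # assumed API group/version
kind: LLMInferenceServiceConfig
metadata:
  name: shared-gateway-config
  namespace: default
spec:
  router:
    gateway:
      refs:
        - name: llm-gateway              # the Gateway from step 1
          namespace: default
---
apiVersion: serving.kserve.io/v1alpha1   # assumed API group/version
kind: LLMInferenceService
metadata:
  name: demo-llm
  namespace: default
spec:
  baseRefs:                              # assumed field path for baseRefs
    - name: shared-gateway-config
  # No gateway refs here: the Gateway is referenced only through the base config,
  # which is exactly the case this PR's reconciliation fix covers.
```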

Commit: …ceService reconciliation

Signed-off-by: Vivek Karunai Kiri Ragavan <[email protected]>
@pierDipi (Member) left a comment:

/lgtm
/approve

@terrytangyuan terrytangyuan merged commit f1f7bed into kserve:master Jan 12, 2026
71 of 72 checks passed
spolti added a commit to spolti/kserve that referenced this pull request Jan 29, 2026
Commits in this batch:
35755fc ci: PR style check (kserve#4499)
30d1b75 AIOHTTP's HTTP Parser auto_decompress feature is vulnerable to zip bomb (kserve#4939)
53004e7 (jooho/upstream_master) feat: add automatic modelFormat annotation for InferenceServices (kserve#4953)
7d10530 Bump Gateway API Inference Extension (GIE) to v1.2.0 (kserve#4886)
b43d60c Separate LocalModelCache webhook from KServe controller. (kserve#4941)
58fa35d Fix CVE-2025-68156: Update expr-lang/expr to v1.17.7 (kserve#4934)
7a7c5aa refactor: deduplicate test configs with helper functions (kserve#4952)
6f95f1a Fix: use correct image tag in LLMISvc E2E workflow (kserve#4948)
f1f7bed Fix: LLMInferenceService reconciliation for Gateway refs in baseRefs (kserve#4944)
a70985a CVE-2025-66418 - Unbounded number of links in the decompression chain (kserve#4928)
11aad94 chore: bump github.com/kedacore/keda/v2 from 2.16.1 to 2.17.3 (kserve#4927)
b253a50 Fix: opentelemetry helm installation script (kserve#4932)
4041412 refactor: replace bash script with Python and improve generate-version (kserve#4935)
2e987ef Add precommit check to sync golangci Go version with go.mod (kserve#4930)
f44cb66 fix: make deploy-dev for development env (kserve#4881)
f15f6ac Address several CVEs (kserve#4912)
54faf3b ci: add retry request for e2e tests to reduce transient failures (kserve#4795)
6b7bc43 chore: Add .gitattributes to mark vendored and generated code (kserve#4904)
29a6a2b ci: split KServe and Storage  publish workflow into separate jobs (kserve#4801)
299706d Improved CA Bundle Management For LLM Inference Services (kserve#4803)