
llama : remove llama_kv_cache_view API + remove deprecated #13653


Merged · 1 commit merged into master on May 20, 2025

Conversation

ggerganov (Member)

ref #13194

The llama_kv_cache_view_* API was used at some point to debug the state of the KV cache. However, it is not something that should be part of the public interface of the library. It will be replaced with environment-controlled flags that enable internal prints and statistics related to the KV cache.

This change also removes some old llama_kv_cache_* interfaces that have been deprecated for some time.

API changes

  • Remove llama_kv_cache_view_* API
  • Remove deprecated llama_kv_cache_* API
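The replacement mechanism described above — internal debug output gated by an environment variable instead of a public inspection API — can be sketched as follows. This is a generic illustration, not the actual llama.cpp implementation; the variable name `LLAMA_KV_CACHE_DEBUG` and the function names are hypothetical.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch: the debug flag is read from the environment once
 * and cached, so callers can sprinkle cheap debug hooks through the
 * KV-cache code without exposing any of it in the public header. */
static int kv_debug_enabled(void) {
    static int cached = -1;
    if (cached < 0) {
        const char * env = getenv("LLAMA_KV_CACHE_DEBUG");
        cached = (env != NULL && strcmp(env, "0") != 0) ? 1 : 0;
    }
    return cached;
}

/* Internal statistics print, active only when the flag is set. */
static void kv_cache_maybe_print(int n_cells, int n_used) {
    if (kv_debug_enabled()) {
        fprintf(stderr, "kv cache: cells = %d, used = %d\n", n_cells, n_used);
    }
}
```

With this pattern, debugging is enabled at run time (e.g. `LLAMA_KV_CACHE_DEBUG=1 ./main ...`) and the library's ABI carries no trace of the instrumentation.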

@ggerganov ggerganov requested a review from slaren May 20, 2025 12:42
@ggerganov ggerganov merged commit a4090d1 into master May 20, 2025
53 checks passed
@ggerganov ggerganov deleted the gg/llama-kv-cache-view-rm branch May 20, 2025 13:13