Tags: basnijholt/agent-cli

v0.22.0

Copy raw transcript before LLM step (#54)

## Summary
- copy the raw ASR transcript to the clipboard immediately when LLM cleanup is enabled
- leave the existing LLM clipboard update in place so the cleaned text still overwrites the raw transcript
- update the LLM-enabled transcribe test to cover the new clipboard behavior and log output

## Testing
- uv run pytest tests/agents/test_transcribe.py::test_transcribe_main_llm_enabled
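Below is a minimal sketch of the new clipboard flow, assuming a `pyperclip`-based clipboard helper and passing the ASR and LLM steps in as callables; the names are illustrative, not the project's actual API:

```python
from collections.abc import Callable

import pyperclip  # assumed clipboard backend; the project may use a different helper


def transcribe_to_clipboard(
    run_asr: Callable[[], str],                   # hypothetical: returns the raw ASR transcript
    clean_with_llm: Callable[[str], str] | None,  # hypothetical: LLM cleanup, or None if disabled
) -> str:
    raw_transcript = run_asr()

    # Copy the raw transcript immediately so usable text is on the clipboard
    # before the (slower) LLM cleanup finishes.
    pyperclip.copy(raw_transcript)

    if clean_with_llm is None:
        return raw_transcript

    # Existing behavior is preserved: the cleaned text still overwrites the raw transcript.
    cleaned = clean_with_llm(raw_transcript)
    pyperclip.copy(cleaned)
    return cleaned
```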

v0.21.1

Fix server crash in finally block when transcription fails early (#50)

* Fix server crash in finally block when transcription fails early

The server would crash with UnboundLocalError when an exception occurred
early in the transcribe_audio function before raw_transcript was initialized.
This happened because the finally block always tried to log the transcription
without checking if the variables existed.

Changes:
- Initialize raw_transcript and cleaned_transcript at the start of the function
- Add safety check in finally block to only log if there's content to log
- Wrap the logging operation in try/except to prevent crashes from logging failures
- Add comprehensive tests for various failure scenarios

This ensures the server remains stable even when errors occur during
transcription processing.
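A minimal sketch of the pattern described above (function and variable names are illustrative, not the project's actual code):

```python
import logging

logger = logging.getLogger(__name__)


def transcribe_audio(run_asr, run_cleanup, audio: bytes) -> tuple[str, str]:
    # run_asr and run_cleanup are hypothetical callables standing in for the real steps.
    # Initialize both variables up front so the finally block can never raise
    # UnboundLocalError, even if an exception occurs before transcription starts.
    raw_transcript = ""
    cleaned_transcript = ""
    try:
        raw_transcript = run_asr(audio)
        cleaned_transcript = run_cleanup(raw_transcript)
        return raw_transcript, cleaned_transcript
    finally:
        # Only log if there is content, and never let a logging failure crash the server.
        if raw_transcript or cleaned_transcript:
            try:
                logger.info("raw=%r cleaned=%r", raw_transcript, cleaned_transcript)
            except Exception:
                pass  # a logging failure must not mask the original error
```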

* Update README.md

* Fix additional server crash issue with LLM errors

Found and fixed a second crash issue where the server would exit when
LLM connections failed. The service was calling sys.exit(1), which
terminated the entire server process.

Changes:
- Set exit_on_error=False in process_and_update_clipboard for web API context
- Add proper handling for partial success (transcription ok, cleanup failed)
- Add test for LLM connection errors to ensure server stays running
- Return informative error messages when cleanup fails but transcription succeeds

This ensures the server remains stable even when LLM services are unavailable.
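A rough sketch of the partial-success handling, with `clean_with_llm` as a hypothetical stand-in for the LLM call (the actual fix passes exit_on_error=False to process_and_update_clipboard rather than calling sys.exit(1)):

```python
def handle_transcription(raw_transcript: str, clean_with_llm) -> dict:
    # clean_with_llm is a hypothetical callable representing the LLM cleanup step.
    try:
        cleaned = clean_with_llm(raw_transcript)
    except ConnectionError as exc:
        # Partial success: transcription worked but cleanup failed.
        # Return an informative error instead of terminating the server process.
        return {
            "transcript": raw_transcript,
            "cleaned": None,
            "error": f"LLM cleanup failed: {exc}",
        }
    return {"transcript": raw_transcript, "cleaned": cleaned, "error": None}
```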

* Fix test_process_and_update_clipboard

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

v0.21.0

feat: add support for llama-server via OpenAI-compatible API (#45)

* feat: add support for llama-server via OpenAI-compatible API

- Add openai_base_url configuration option to OpenAILLM config
- Update OpenAI provider to support custom base URLs (e.g., llama-server)
- Make API key optional when using custom base URLs
- Add --openai-base-url CLI option to all agents
- Update example configuration with llama-server usage instructions

This allows using llama-cpp's llama-server by setting:
  --llm-provider openai --openai-base-url http://localhost:8080/v1
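For reference, the flags above roughly correspond to pointing the standard openai Python client at a local server; the model name and prompt here are illustrative, and llama-server does not require a real API key:

```python
from openai import OpenAI  # official openai package, v1+ interface

# Point the client at llama-server's OpenAI-compatible endpoint instead of api.openai.com.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed",  # placeholder; custom base URLs make the key optional
)

response = client.chat.completions.create(
    model="local-model",  # llama-server typically serves whichever model it was started with
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```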

* Update README.md

* test: fix OpenAILLM config instantiations in tests

Add missing openai_base_url parameter to all OpenAILLM instantiations
in test files to match the updated config schema.

---------

Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>

v0.20.0

feat: add audio recovery mechanism for transcription failures (#44)

v0.19.0

fix(transcribe): improve system prompt to prevent unwanted prefixes

The transcribe agent sometimes added unwanted prefixes like "Sure. Here's
the cleaned-up text:" or wrapped output in quotes despite instructions.

Enhanced the system prompt with:
- CRITICAL directive emphasizing output format
- Specific examples of wrong responses to avoid
- Clear example of correct response format
- Reinforced reminders in agent instructions

This addresses the root cause through better prompting rather than
post-processing fixes.
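For illustration only, a prompt following the structure described above; the actual wording in agent-cli differs:

```python
# Illustrative sketch, not the project's real prompt text.
SYSTEM_PROMPT = """\
You clean up raw speech-to-text transcripts.

CRITICAL: Respond with ONLY the cleaned-up text. No preamble, no quotes, no explanations.

Wrong responses (never do this):
  "Sure. Here's the cleaned-up text: ..."
  "Here is your corrected transcript: ..."
  Wrapping the output in quotation marks.

Correct response for the input "uh so i think we should um ship it tomorrow":
  So I think we should ship it tomorrow.

Remember: output the cleaned text and nothing else.
"""
```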

v0.18.0

Fix test transcription_log

v0.17.0

Fix subcommand parsing

v0.16.0

feat: add Gemini support as an LLM provider (#21)

v0.15.0

Make all CLI options consistent (#20)

v0.14.0

Add extra-instructions to the config