
Only start tracing worker thread on first span/trace #804


Merged (1 commit) on Jun 2, 2025

Conversation

@rm-openai (Collaborator) commented Jun 2, 2025

Closes #796. We shouldn't start a busy-waiting thread if there aren't any traces.

Test plan

import threading
assert threading.active_count() == 1
import agents
assert threading.active_count() == 1

@Copilot (Copilot AI) left a comment

Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.

- self._worker_thread = threading.Thread(target=self._run, daemon=True)
- self._worker_thread.start()
+ # We lazily start the background worker thread the first time a span/trace is queued.
+ self._worker_thread: threading.Thread | None = None
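To illustrate the pattern the diff adopts, here is a minimal, self-contained sketch of lazily starting a worker thread on first enqueue. The class and method names (`BatchTraceProcessor`, `_ensure_thread_started`, `enqueue`) are illustrative, not necessarily the SDK's actual identifiers; the point is that importing or constructing the processor spawns no thread until the first span/trace arrives.

```python
import queue
import threading


class BatchTraceProcessor:
    """Sketch of lazy worker-thread startup (illustrative names, not SDK code)."""

    def __init__(self) -> None:
        self._queue: queue.Queue = queue.Queue()
        self._shutdown = threading.Event()
        self._lock = threading.Lock()
        # Lazily started on first enqueue instead of in __init__.
        self._worker_thread: threading.Thread | None = None

    def _ensure_thread_started(self) -> None:
        # Double-checked locking: a cheap unlocked check first, then a locked
        # re-check so concurrent callers start at most one thread.
        if self._worker_thread is None:
            with self._lock:
                if self._worker_thread is None:
                    self._worker_thread = threading.Thread(
                        target=self._run, daemon=True
                    )
                    self._worker_thread.start()

    def enqueue(self, item: object) -> None:
        self._ensure_thread_started()
        self._queue.put(item)

    def _run(self) -> None:
        # Poll with a timeout so the thread can notice shutdown requests.
        while not self._shutdown.is_set():
            try:
                self._queue.get(timeout=0.5)
            except queue.Empty:
                continue
            # ... export the item here ...
```

This matches the test plan above: merely constructing the processor leaves `threading.active_count()` unchanged; the thread appears only once something is queued.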
Contributor

Why not start a task plus an asyncio queue? Then you don't have an extra thread to worry about, and you can start the task unconditionally.

Collaborator Author

Hmm, I was thinking:

  1. There's a small overhead in cases where there are no traces (the unconditional task would still be scheduled every poll_interval seconds, even though it has no work).
  2. It wouldn't work well in fully synchronous contexts.
  3. Cleaning up the asyncio task is kind of annoying, imo.

Let me know if you feel I should change it; happy to.

Contributor
No strong opinions

@pakrym-oai (Contributor) left a comment
Approving, but an async queue might be cleaner.
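For reference, the alternative the reviewer suggests might look roughly like the sketch below (illustrative code, not SDK internals): an `asyncio.Queue` drained by a task started unconditionally. It also shows the cleanup concern the author raises, since the task must be explicitly signaled and awaited, and the whole pattern requires a running event loop, which is the sync-context objection.

```python
import asyncio

exported: list[str] = []


async def worker(q: asyncio.Queue) -> None:
    # Drain the queue; a None sentinel requests shutdown.
    while True:
        item = await q.get()
        if item is None:
            break
        exported.append(item)  # ... export the item here ...


async def main() -> None:
    q: asyncio.Queue = asyncio.Queue()
    task = asyncio.create_task(worker(q))
    await q.put("span")
    await q.put(None)  # request shutdown
    await task  # cleanup step: the task must be awaited (or cancelled)


asyncio.run(main())
```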

@rm-openai rm-openai merged commit 995af4d into main Jun 2, 2025
5 checks passed
@rm-openai rm-openai deleted the rm/pr804 branch June 2, 2025 19:32
Development

Successfully merging this pull request may close these issues.

Background thread is created on library import
2 participants