Revert "Make influxdb batch settings configurable" #155808
base: dev
Conversation
This reverts commit 0e2a460.
Hey there @mdegat01, mind taking a look at this pull request as it has been labeled with an integration (influxdb) you are listed as a code owner for? Thanks!
@jgaalen is this something you might want to fix without reverting the original PR?
Pull Request Overview
This PR refactors the InfluxDB integration to remove user-configurable batch parameters and replace them with hardcoded constants. The changes align with Home Assistant's coding guideline that polling intervals and batch parameters should not be user-configurable, but rather determined programmatically by the integration.
- Removed the user-configurable `batch_buffer_size` and `batch_timeout` parameters from the configuration schema
- Replaced the configurable batch parameters with hardcoded constants `BATCH_BUFFER_SIZE` (100) and `BATCH_TIMEOUT` (1), sketched below
- Simplified the `get_events_json()` method by removing the dynamic batch timeout calculation logic
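For reference, a minimal sketch of the new hardcoded constants in `const.py` (names and values as described above; the rest of the module is omitted):

```python
# homeassistant/components/influxdb/const.py (excerpt, sketch)
# Batch parameters are now fixed instead of user-configurable.
BATCH_BUFFER_SIZE = 100  # maximum events collected per write
BATCH_TIMEOUT = 1  # seconds to wait for further events before writing
```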
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 2 comments.
| File | Description |
|---|---|
| homeassistant/components/influxdb/const.py | Removed configuration constants and schema for batch parameters; added hardcoded `BATCH_BUFFER_SIZE` and `BATCH_TIMEOUT` constants |
| homeassistant/components/influxdb/__init__.py | Updated imports to use the new constant names; removed batch configuration from the schema; simplified `InfluxThread` initialization and the `batch_timeout()` method; streamlined the timeout calculation in `get_events_json()` |
Diff context in `get_events_json()` (the first two lines belong to the removed dynamic-timeout calculation; the last two show the code after this PR):

```python
                    else batch_timeout - (time.monotonic() - start_time)
                )
            while len(json) < BATCH_BUFFER_SIZE and not self.shutdown:
                timeout = None if count == 0 else self.batch_timeout()
```
Copilot AI commented on Nov 4, 2025:
The timeout calculation is incorrect. After the first event (count > 0), the timeout is always set to BATCH_TIMEOUT (1 second) regardless of how much time has already elapsed. The original code calculated a decreasing timeout based on batch_timeout - (time.monotonic() - start_time) to ensure the total batch collection time didn't exceed the timeout. This change means subsequent events will wait the full 1 second each time, potentially causing the batch collection to take much longer than intended (up to 100 seconds for a full batch instead of 1 second total).
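As a rough illustration of the decreasing timeout this comment describes, here is a minimal standalone sketch (the helper name `remaining_timeout` is hypothetical and not part of the integration):

```python
import time


def remaining_timeout(start_time: float, batch_timeout: float, count: int) -> float | None:
    """Sketch: timeout for the next queue.get() so the whole batch stays within batch_timeout."""
    if count == 0:
        return None  # block indefinitely while waiting for the first event
    # Wait only for whatever is left of the overall batch window.
    return max(0.0, batch_timeout - (time.monotonic() - start_time))
```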
Diff context at the start of `get_events_json()` (the `start_time` and `batch_timeout` assignments are among the lines removed by this PR):

```python
    def get_events_json(self):
        """Return a batch of events formatted for writing."""
        queue_seconds = QUEUE_BACKLOG_SECONDS + self.max_tries * RETRY_DELAY
        start_time = time.monotonic()
        batch_timeout = self.batch_timeout()

        count = 0
        json = []

        dropped = 0
```
Copilot AI commented on Nov 4, 2025:
The removal of start_time = time.monotonic() and the batch timeout check if count > 0 and time.monotonic() - start_time >= batch_timeout: break eliminates the mechanism that ensures batch collection completes within the timeout period. This was important for preventing the thread from spending excessive time collecting events when the queue has many items. Consider restoring this logic to maintain the original behavior where the total batch collection time is bounded.
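A self-contained sketch of the bounded collection loop this comment suggests restoring; the `collect_batch` function and its queue argument are illustrative stand-ins, with the constant values taken from this PR:

```python
import queue
import time
from contextlib import suppress

BATCH_BUFFER_SIZE = 100  # values from this PR's hardcoded constants
BATCH_TIMEOUT = 1


def collect_batch(event_queue: queue.Queue) -> list:
    """Collect events until the buffer fills or the batch window elapses."""
    start_time = time.monotonic()
    batch: list = []
    with suppress(queue.Empty):
        while len(batch) < BATCH_BUFFER_SIZE:
            # Bound the total collection time, not just each individual get().
            if batch and time.monotonic() - start_time >= BATCH_TIMEOUT:
                break
            timeout = (
                None
                if not batch
                else max(0.0, BATCH_TIMEOUT - (time.monotonic() - start_time))
            )
            batch.append(event_queue.get(timeout=timeout))
    return batch
```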
Reverts #134758
See the comment on #134758.
When trying to fix that problem, the tests don't pass (they lock up).
It apparently also breaks dev in its current state: https://github.com/home-assistant/core/actions/runs/19080794349/job/54509512666?pr=155802