fix(analytics): Fix group by feature in analytics v2 version #587
Conversation
Caution
Changes requested
Reviewed everything up to 958ae51 in 1 minute and 54 seconds.

- Reviewed 1609 lines of code in 4 files
- Skipped 0 files when reviewing
- Skipped posting 9 draft comments. View those below.
- Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. docs/prds/refactor-analytics-function.md:1
   - Draft comment: Excellent, detailed PRD document. Consider adding any migration or rollout notes if applicable.
   - Reason this comment was not posted: Confidence changes required: 0% <= threshold 30%.
2. internal/domain/events/feature_usage.go:5
   - Draft comment: Removal of the shopspring/decimal import is acceptable if the domain layer no longer exposes implementation details.
   - Reason this comment was not posted: Confidence changes required: 0% <= threshold 30%.
3. internal/repository/clickhouse/feature_usage.go:472
   - Draft comment: The grouping logic in getStandardAnalytics looks well-structured, ensuring 'feature_id' is always included. Double-check that the ordering of scan targets matches the SQL alias order.
   - Reason this comment was not posted: Comment did not seem useful. Confidence is useful = 0% <= threshold 30%. The comment starts with a purely informative statement about the grouping logic, which is not allowed. The second part of the comment asks the author to double-check the ordering of scan targets, which is also not allowed. Therefore, this comment should be removed.
4. internal/repository/clickhouse/feature_usage.go:1547
   - Draft comment: Consider injecting the FeatureService instead of instantiating it inline here. This follows dependency injection best practices.
   - Reason this comment was not posted: Comment was not on a location in the diff, so it can't be submitted as a review comment.
5. internal/service/feature_usage_tracking.go:1066
   - Draft comment: Ensure that dependencies like CustomerRepo and FeatureRepo are injected via ServiceParams or defined in the service struct to avoid nil dereferences at runtime.
   - Reason this comment was not posted: Comment was on unchanged code.
6. internal/service/feature_usage_tracking.go:1272
   - Draft comment: The enrichment of analytics with feature and meter data is clear. Ensure thorough testing for edge cases, especially for cases when points are empty or data is partially available.
   - Reason this comment was not posted: Confidence changes required: 30% <= threshold 30%.
7. internal/service/feature_usage_tracking.go:1520
   - Draft comment: MergeTimeSeriesPoints correctly aggregates points by timestamp. Review the usage of pointers in the loop to ensure there's no unintended aliasing (see the sketches after this list).
   - Reason this comment was not posted: Confidence changes required: 30% <= threshold 30%.
8. internal/service/feature_usage_tracking.go:2183
   - Draft comment: The weighted sum aggregation formula appears reasonable; ensure that period calculations and remaining seconds are unit-tested to handle edge cases (see the sketches after this list).
   - Reason this comment was not posted: Confidence changes required: 30% <= threshold 30%.
9. internal/service/feature_usage_tracking.go:2230
   - Draft comment: Overall, the refactoring into smaller helper functions improves readability. Consider addressing remaining TODOs and refining log messages for production readiness.
   - Reason this comment was not posted: Confidence changes required: 30% <= threshold 30%.
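Two of the draft comments above touch on subtle runtime behaviour; the sketches below illustrate them under stated assumptions. First, for draft comment 7, a minimal sketch of the pointer-aliasing pitfall: `TimeSeriesPoint`, its fields, and the package name are assumed here, not copied from the PR. Taking `&points[i]` yields a stable pointer to the slice element, whereas taking the address of a `range` loop variable (under pre-Go 1.22 semantics) would make every stored pointer alias the same variable.

```go
package analyticsutil // hypothetical package, for illustration only

import "time"

// TimeSeriesPoint is an assumed shape; the service's real struct may differ.
type TimeSeriesPoint struct {
	Timestamp time.Time
	Value     float64
}

// mergeByTimestamp sums values that share a timestamp.
// Taking &points[i] gives a stable pointer to the slice element; taking the
// address of a range loop variable would instead alias a single variable.
func mergeByTimestamp(points []TimeSeriesPoint) map[time.Time]*TimeSeriesPoint {
	merged := make(map[time.Time]*TimeSeriesPoint)
	for i := range points {
		p := &points[i]
		if existing, ok := merged[p.Timestamp]; ok {
			existing.Value += p.Value
		} else {
			merged[p.Timestamp] = p
		}
	}
	return merged
}
```

Second, for draft comment 8, one common form of time-weighted aggregation in the same illustrative package; the actual formula in the service may differ, and the zero-length period is exactly the kind of edge case the comment asks to cover with unit tests.

```go
// weightedSum is a sketch: each value contributes in proportion to the
// seconds it was active within the charge period.
func weightedSum(values, activeSeconds []float64, periodSeconds float64) float64 {
	if periodSeconds == 0 {
		return 0 // zero-length period: an edge case worth a unit test
	}
	total := 0.0
	for i, v := range values {
		total += v * (activeSeconds[i] / periodSeconds)
	}
	return total
}
```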
Workflow ID: wflow_CtmTyOZN5wEhXnCt
You can customize by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.
@ellipsis-dev can you review?
Caution
Changes requested
Reviewed everything up to 3356663 in 1 minute and 49 seconds.

- Reviewed 1655 lines of code in 4 files
- Skipped 0 files when reviewing
- Skipped posting 2 draft comments. View those below.
- Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. internal/service/feature_usage_tracking.go:950
   - Draft comment: GetDetailedUsageAnalytics is now broken into smaller helper functions (validateAnalyticsRequest, fetchAnalyticsData, buildAnalyticsResponse), which improves maintainability and readability. Ensure that any changes here keep the API contract unchanged (see the sketch after this list).
   - Reason this comment was not posted: Comment did not seem useful. Confidence is useful = 0% <= threshold 30%. The comment is purely informative, describing the refactoring changes made to the GetDetailedUsageAnalytics function. It does not provide any actionable feedback or specific suggestions for improvement. It also asks to ensure the API contract is unchanged, which is against the rules.
2. internal/service/feature_usage_tracking.go:1158
   - Draft comment: The analytics enrichment logic (enrichWithMetadata, calculateCosts, aggregateAnalyticsByGrouping) has been modularized well. Verify that the cost calculation, grouping, and metadata enrichment continue to match existing business logic.
   - Reason this comment was not posted: This comment is asking the PR author to verify that the cost calculation, grouping, and metadata enrichment continue to match existing business logic. This falls under asking the author to ensure behavior is intended, which is against the rules. Therefore, this comment should be removed.
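As a rough illustration of the decomposition mentioned in draft comment 1, the top-level flow presumably looks something like the sketch below. The response type and the exact method signatures are assumptions, not copied from the PR; only the helper names come from the review.

```go
// Sketch only: signatures and the response type are assumed for illustration.
func (s *featureUsageTrackingService) GetDetailedUsageAnalytics(
	ctx context.Context,
	req *dto.GetUsageAnalyticsRequest,
) (*dto.GetUsageAnalyticsResponse, error) {
	// 1. Validate the incoming request.
	if err := s.validateAnalyticsRequest(req); err != nil {
		return nil, err
	}
	// 2. Fetch customer, subscriptions, and analytics data.
	data, err := s.fetchAnalyticsData(ctx, req)
	if err != nil {
		return nil, err
	}
	// 3. Enrich, aggregate, and shape the response.
	return s.buildAnalyticsResponse(ctx, req, data)
}
```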
Workflow ID: wflow_vEy0X6qKe3jLONjR
You can customize by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.
@coderabbitai can you review?
Tip: For best results, initiate chat on the files or code changes.

🧩 Analysis chain

@hiteshshimpi-55, I'll review this PR for you. This is a substantial refactoring of the analytics function. Let me analyze the changes in detail.

🎯 Overview

This PR refactors `GetDetailedUsageAnalytics` in `internal/service/feature_usage_tracking.go`, breaking it into smaller helpers and adding parallel data fetching.

🔴 Critical Issues

1. Broken Parallel Fetching Logic

Location: `fetchAnalyticsData` in `internal/service/feature_usage_tracking.go`

The subscription goroutine reads the customer from a channel:

```go
go func() {
    // Wait for customer first
    customer := <-customerChan // ⚠️ DEADLOCK: reads before customer goroutine completes
    subscriptions, err := s.fetchSubscriptions(ctx, customer.ID)
    // ...
}()
```

Problem: The subscription goroutine reads from `customerChan` before the customer goroutine has completed, so it blocks on the channel read and can deadlock.

Fix: Remove the "parallel" approach here since subscriptions depend on customer data:

```go
func (s *featureUsageTrackingService) fetchAnalyticsData(ctx context.Context, req *dto.GetUsageAnalyticsRequest) (*AnalyticsData, error) {
    // 1. Fetch customer first (required for subscriptions)
    customer, err := s.fetchCustomer(ctx, req.ExternalCustomerID)
    if err != nil {
        return nil, err
    }

    // 2. Fetch subscriptions and analytics in parallel
    var wg sync.WaitGroup
    var subscriptions []*subscription.Subscription
    var analytics []*events.DetailedUsageAnalytic
    var subErr, analyticsErr error

    wg.Add(2)
    go func() {
        defer wg.Done()
        subscriptions, subErr = s.fetchSubscriptions(ctx, customer.ID)
    }()
    go func() {
        defer wg.Done()
        params := s.createAnalyticsParams(ctx, req)
        params.CustomerID = customer.ID
        analytics, analyticsErr = s.fetchAnalytics(ctx, params)
    }()
    wg.Wait()

    if subErr != nil {
        return nil, subErr
    }
    if analyticsErr != nil {
        return nil, analyticsErr
    }

    // Rest of the function...
}
```
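With this structure the customer lookup finishes before either goroutine starts, so the two goroutines only read `customer` and synchronize through the `WaitGroup`; there is no channel hand-off left to block on, while the subscription and analytics fetches still run concurrently.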
2. GROUP BY Logic Change May Break Queries

Location: `getStandardAnalytics` in `internal/repository/clickhouse/feature_usage.go`

```go
// Always include feature_id for cost calculation, then add requested grouping dimensions
groupByColumns := []string{"feature_id"}
groupByColumnAliases := []string{"feature_id"}
```

Problem: The code now always includes `feature_id` in the GROUP BY, even when the request does not group by feature.

Impact: query results come back at a finer granularity than requested, so totals for the requested dimensions are split across multiple rows.

Recommendation: Either restrict the grouping to the requested dimensions, or aggregate the finer-grained rows back to the requested grouping after the query (see the sketch below and issue 3).
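As a sketch of the first option, the grouping columns could be built conditionally. `buildGroupByColumns` and the `needsPerFeatureCosts` flag are hypothetical names for illustration, not the repository's actual code:

```go
// buildGroupByColumns is a hypothetical helper: it only forces feature_id into
// the grouping when per-feature cost calculation actually requires it.
func buildGroupByColumns(requested []string, needsPerFeatureCosts bool) []string {
	columns := make([]string, 0, len(requested)+1)
	if needsPerFeatureCosts {
		columns = append(columns, "feature_id")
	}
	for _, col := range requested {
		if col == "feature_id" && needsPerFeatureCosts {
			continue // already added; avoid a duplicate GROUP BY column
		}
		columns = append(columns, col)
	}
	return columns
}
```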
3. Aggregation Function Not Integrated

Location: `aggregateAnalyticsByGrouping` in `internal/service/feature_usage_tracking.go`

```go
// Aggregate results by requested grouping dimensions
analytics := s.aggregateAnalyticsByGrouping(data.Analytics, data.Params.GroupBy)
```

This function is meant to fix issue #2 above, but it is not fully integrated into the analytics flow (a sketch of this post-processing approach follows below).

Recommendation: Fix the SQL queries instead of post-processing.
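For contrast, the post-processing path that `aggregateAnalyticsByGrouping` represents would look roughly like the sketch below. `AnalyticRow`, its fields, and the `source` dimension are assumptions for illustration; the review's point stands that pushing the grouping into the SQL avoids this extra pass entirely.

```go
// AnalyticRow is an illustrative stand-in for the service's analytics row type.
type AnalyticRow struct {
	FeatureID  string
	Source     string
	TotalUsage float64
}

// aggregateByGrouping collapses rows that were split by the forced feature_id
// grouping back down to one row per requested grouping key.
// Note: dimensions not listed in groupBy keep the first row's value here;
// a real implementation would clear or omit them.
func aggregateByGrouping(rows []AnalyticRow, groupBy []string) []AnalyticRow {
	keyOf := func(r AnalyticRow) string {
		key := ""
		for _, dim := range groupBy {
			switch dim {
			case "feature_id":
				key += r.FeatureID + "|"
			case "source":
				key += r.Source + "|"
			}
		}
		return key
	}

	order := []string{}
	merged := map[string]*AnalyticRow{}
	for _, r := range rows {
		k := keyOf(r)
		if existing, ok := merged[k]; ok {
			existing.TotalUsage += r.TotalUsage
			continue
		}
		row := r
		merged[k] = &row
		order = append(order, k)
	}

	out := make([]AnalyticRow, 0, len(order))
	for _, k := range order {
		out = append(out, *merged[k])
	}
	return out
}
```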
Important

Refactor `GetDetailedUsageAnalytics` for improved performance and maintainability with parallel data fetching and reduced complexity.

- Refactor `GetDetailedUsageAnalytics` in `feature_usage_tracking.go` for improved performance and maintainability.
- Add data-fetching helpers `fetchCustomer`, `fetchSubscriptions`, and `fetchAnalytics`.
- Introduce an `AnalyticsData` struct to consolidate data handling.
- Split the flow into `validateAnalyticsRequest`, `fetchAnalyticsData`, and `buildAnalyticsResponse`.
- Update `feature_usage.go` and `clickhouse/feature_usage.go`.
- Rework `getStandardAnalytics` and `getMaxBucketAnalytics` for improved query handling.

This description was created by Ellipsis for 3356663. You can customize this summary. It will automatically update as commits are pushed.