Conversation

@mdkaifansari04
Contributor

@mdkaifansari04 mdkaifansari04 commented Dec 24, 2025

Fixes: #4662

Description

This PR adds a GitHub PR comment leaderboard to the global leaderboard page.
It shows contributors who actively comment on PRs across BLT repositories.

[Screenshot: 2025-12-24_22-43-33]

Note: the screenshot uses test data, so activity and dates may look random.

How it works

  • PR comments are captured via the existing GitHub webhooks (a minimal routing sketch follows below)
  • Data updates automatically in real time
  • Uses the same infrastructure as the other GitHub leaderboards
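A minimal sketch of how those webhook events could be routed to the new handlers; only the two handler names come from this PR, while the view name github_webhook and the surrounding signature check are assumptions:

import json

from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt


@csrf_exempt
def github_webhook(request):
    # Hypothetical dispatcher excerpt; signature verification and other events are omitted.
    event = request.headers.get("X-GitHub-Event", "")
    payload = json.loads(request.body or "{}")

    if event == "issue_comment":
        return handle_issue_comment_event(payload)
    if event == "pull_request_review_comment":
        return handle_review_comment_event(payload)

    return JsonResponse({"status": "ignored"}, status=200)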

What it shows

  • Top 10 commenters from the last 6 months
  • Real GitHub usernames and avatars
  • PR comment counts
  • Profile links
  • Bot comments are excluded

Maintainer notes

  • No new setup needed
  • Works out of the box with the current GitHub webhooks
  • Data can be backfilled if required

Why this matters

Code reviews matter. This leaderboard highlights the people who show up, help others, and improve PRs.
It rewards real engagement, not just commits.

Summary by CodeRabbit

  • New Features

    • Added GitHub comment tracking and a new comment leaderboard for PR commenters (last 6 months).
    • Automatic processing of GitHub issue and review comments for leaderboard updates.
  • Improvements

    • Leaderboard layout refreshed to a responsive grid with improved spacing and clear no-data messaging.
    • Token earnings now display a timestamp of when they were earned.
  • Tests

    • Added tests validating comment handling and filtering (including bot and non-PR comments).

✏️ Tip: You can customize this high-level summary in your review settings.

Copilot AI review requested due to automatic review settings December 24, 2025 17:25
@github-actions github-actions bot added files-changed: 5 PR changes 5 files migrations PR contains database migration files labels Dec 24, 2025
@github-actions
Contributor

👋 Hi @mdkaifansari04!

This pull request needs a peer review before it can be merged. Please request a review from a team member who is not:

  • The PR author
  • DonnieBLT
  • coderabbitai
  • copilot

Once a valid peer review is submitted, this check will pass automatically. Thank you!

@github-actions github-actions bot added the needs-peer-review PR needs peer review label Dec 24, 2025
@coderabbitai
Contributor

coderabbitai bot commented Dec 24, 2025

Walkthrough

Adds a GitHubComment model and migration, webhook handlers for GitHub comment events (issue_comment and pull_request_review_comment), tests for webhook behavior, leaderboard aggregation of recent PR comments, and template updates to display a GitHub Comment Leaderboard.

Changes

  • Migration & Model (website/migrations/0261_githubcomment.py, website/models.py): New GitHubComment model (fields: comment_id, body, comment_type, created_at, updated_at, url) with FKs to GitHubIssue, UserProfile, Contributor, Repo; migration added. BaconEarning gains timestamp and __str__.
  • Webhook Handlers & Leaderboard Logic (website/views/user.py): New handlers handle_issue_comment_event and handle_review_comment_event that validate PR comments, ignore bot authors, resolve the issue/PR, and upsert GitHubComment. Leaderboard aggregation (comment_leaderboard, last 6 months, non-bot, filtered repos) added; GitHubComment and Max imported.
  • Template UI (website/templates/leaderboard_global.html): Layout refactor to grid; new GitHub Comment Leaderboard card and avatar fallbacks; added no-data messages and responsive spacing adjustments.
  • Tests (website/tests/test_webhook_comments.py): New tests exercising webhook endpoints: PR comment creates GitHubComment, non-PR comment ignored, bot comment ignored. Posts JSON payloads and asserts DB state.
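A rough sketch of the GitHubComment model implied by the fields and foreign keys listed above; field options, related names, and on_delete choices are assumptions, not the PR's exact code:

from django.db import models


class GitHubComment(models.Model):
    # Hypothetical shape based on the walkthrough; only the field names are taken from this PR.
    comment_id = models.BigIntegerField(unique=True)
    body = models.TextField(blank=True)
    comment_type = models.CharField(max_length=50)  # "issue_comment" or "review_comment"
    created_at = models.DateTimeField()
    updated_at = models.DateTimeField()
    url = models.URLField(blank=True, null=True)

    github_issue = models.ForeignKey("GitHubIssue", on_delete=models.CASCADE, related_name="comments")
    user_profile = models.ForeignKey("UserProfile", on_delete=models.SET_NULL, null=True, blank=True)
    contributor = models.ForeignKey("Contributor", on_delete=models.SET_NULL, null=True, blank=True)
    repo = models.ForeignKey("Repo", on_delete=models.SET_NULL, null=True, blank=True)

    def __str__(self):
        return f"{self.comment_type} {self.comment_id}"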

Sequence Diagram

sequenceDiagram
    participant GitHub
    participant WebhookHandler as Webhook Handler
    participant DB as Database
    participant LeaderboardView as Leaderboard View
    participant Template

    GitHub->>WebhookHandler: POST "issue_comment" / "pull_request_review_comment"
    activate WebhookHandler
    WebhookHandler->>WebhookHandler: Validate payload (is PR?, not bot)
    alt valid PR comment & not bot
        WebhookHandler->>DB: Find GitHubIssue by repo+number
        WebhookHandler->>DB: Upsert GitHubComment (comment_id, type, body, url, timestamps, fks)
        WebhookHandler-->>GitHub: 200 OK
    else skip (non-PR or bot)
        WebhookHandler-->>GitHub: 200 OK (no-op)
    end
    deactivate WebhookHandler

    LeaderboardView->>DB: Query GitHubComment (last 6 months, filters)
    activate LeaderboardView
    DB-->>LeaderboardView: Comments grouped by user (count, latest)
    LeaderboardView->>LeaderboardView: Aggregate & sort (count DESC, recent DESC)
    LeaderboardView->>Template: Render context with `comment_leaderboard`
    deactivate LeaderboardView

    Template->>Template: Render GitHub Comment Leaderboard card

Estimated Code Review Effort

🎯 4 (Complex) | ⏱️ ~45 minutes

🚥 Pre-merge checks | ✅ 3 | ❌ 2
❌ Failed checks (2 warnings)
  • Out of Scope Changes check: ⚠️ Warning. The PR introduces an unrelated change, adding a timestamp field and __str__ method to the BaconEarning model, which is not part of the GitHub comment leaderboard feature or issue #4662. Resolution: move the BaconEarning modifications to a separate PR.
  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 25.00%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them.
✅ Passed checks (3 passed)
  • Description Check: ✅ Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately describes the main change, adding a GitHub comment leaderboard feature. It is clear, specific, and directly reflects the primary objective of the pull request.
  • Linked Issues check: ✅ Passed. The PR implements the core requirements from issue #4662: it displays top commenters from the last six months, filters to OWASP-BLT repos, excludes bot comments, captures comments via webhooks, and provides real-time updates.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing touches
  • 📝 Generate docstrings
  • 🧪 Generate unit tests (beta)
    • Create PR with unit tests
    • Post copyable unit tests in a comment

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

@github-actions
Contributor

📊 Monthly Leaderboard

Hi @mdkaifansari04! Here's how you rank for December 2025:

Rank User PRs Reviews Comments Total
#15 @aaditya8979 3 2 0 38
#16 @mdkaifansari04 1 5 4 38
#17 @nitinawari 1 6 0 34

Leaderboard based on contributions in December 2025. Keep up the great work! 🚀

@github-actions
Contributor

❌ Pre-commit checks failed

The pre-commit hooks found issues that need to be fixed. Please run the following commands locally to fix them:

# Install pre-commit if you haven't already
pip install pre-commit

# Run pre-commit on all files
pre-commit run --all-files

# Or run pre-commit on staged files only
pre-commit run

After running these commands, the pre-commit hooks will automatically fix most issues.
Please review the changes, commit them, and push to your branch.

💡 Tip: You can set up pre-commit to run automatically on every commit by running:

pre-commit install
Pre-commit output
[INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
[WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version.  Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this.  if it does not -- consider reporting an issue to that repo.
[INFO] Initializing environment for https://github.com/pycqa/isort.
[WARNING] repo `https://github.com/pycqa/isort` uses deprecated stage names (commit, merge-commit, push) which will be removed in a future version.  Hint: often `pre-commit autoupdate --repo https://github.com/pycqa/isort` will fix this.  if it does not -- consider reporting an issue to that repo.
[INFO] Initializing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Initializing environment for https://github.com/djlint/djLint.
[INFO] Initializing environment for local.
[INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/pycqa/isort.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/djlint/djLint.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for local.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
check python ast.........................................................Passed
check builtin type constructor use.......................................Passed
check yaml...............................................................Passed
fix python encoding pragma...............................................Passed
mixed line ending........................................................Passed
isort....................................................................Failed
- hook id: isort
- files were modified by this hook

Fixing /home/runner/work/BLT/BLT/website/tests/test_webhook_comments.py
Fixing /home/runner/work/BLT/BLT/website/views/user.py


For more information, see the pre-commit documentation.

@github-actions github-actions bot added the pre-commit: failed Pre-commit checks failed label Dec 24, 2025
Comment on lines +1434 to +1444
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )

This comment was marked as outdated.

Contributor

Copilot AI left a comment

Pull request overview

This PR adds a GitHub PR comment leaderboard to the global leaderboard page, displaying top contributors who actively comment on pull requests across BLT repositories. The implementation captures PR comments via GitHub webhooks and displays the top 10 commenters from the last 6 months.

Key Changes

  • New GitHubComment model to store PR comments with relationships to issues, users, and contributors
  • Two new webhook handlers (handle_issue_comment_event and handle_review_comment_event) to process GitHub comment events
  • Comment leaderboard query and UI component displaying contributor names, avatars, and comment counts

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 12 comments.

Show a summary per file
  • website/models.py: Adds GitHubComment model with fields for storing comment data, relationships to issues/users/contributors, and metadata
  • website/migrations/0261_githubcomment.py: Database migration creating the GitHubComment table with appropriate fields and foreign keys
  • website/views/user.py: Implements webhook handlers for issue comments and review comments; adds comment leaderboard query aggregating comments by contributor
  • website/templates/leaderboard_global.html: Adds new comment leaderboard UI card; updates grid layout to accommodate new leaderboard section
  • website/tests/test_webhook_comments.py: Adds tests for issue comment webhook handler covering PR comments, non-PR comments, and bot filtering
Comments suppressed due to low confidence (5)

website/views/user.py:1497

  • The contributor field is never populated in this webhook handler. The GitHubComment model has a contributor field that is used in the leaderboard query (line 642), but it's not being set here. You should create or fetch a Contributor object based on the sender's GitHub ID and assign it to the comment. Look at how other webhook handlers like update_github_issues.py handle this (lines 215-224), where they use Contributor.objects.get_or_create with github_id.
    if action == "closed" and is_merged and pr_user_profile:
        pr_user_instance = pr_user_profile.user
        assign_github_badge(pr_user_instance, "First PR Merged")

    return JsonResponse({"status": "success"}, status=200)


def handle_push_event(payload):
    pusher_profile = UserProfile.objects.filter(github_url=payload["sender"]["html_url"]).first()
    if pusher_profile:
        pusher_user = pusher_profile.user
        if payload.get("commits"):

website/views/user.py:1497

  • The repo field is not being populated when creating GitHubComment objects. The GitHubComment model includes a repo field (line 2335-2340 in models.py), but it's not being set in the defaults dictionary. Consider adding the repo field to ensure data consistency, similar to how it's done in the model definition.
    if action == "closed" and is_merged and pr_user_profile:
        pr_user_instance = pr_user_profile.user
        assign_github_badge(pr_user_instance, "First PR Merged")

    return JsonResponse({"status": "success"}, status=200)


def handle_push_event(payload):
    pusher_profile = UserProfile.objects.filter(github_url=payload["sender"]["html_url"]).first()
    if pusher_profile:
        pusher_user = pusher_profile.user
        if payload.get("commits"):

website/views/user.py:1443

  • The contributor field is never populated in this webhook handler. The GitHubComment model has a contributor field that is used in the leaderboard query (line 642), but it's not being set here. You should create or fetch a Contributor object based on the sender's GitHub ID and assign it to the comment. Look at how other webhook handlers like update_github_issues.py handle this (lines 215-224), where they use Contributor.objects.get_or_create with github_id.
        default=None,
        field_name="pull_request.merged_at",
    )

    # --- Repo mapping (same style as handle_issue_event) ---
    repo_html_url = repo_data.get("html_url")
    repo_full_name = repo_data.get("full_name")  # "owner/repo" (for logging only)

    if not pr_global_id or not repo_html_url:
        logger.warning("Pull request event missing required data (id or repo_html_url)")
        return JsonResponse({"status": "error", "message": "Missing required data"}, status=400)

website/views/user.py:1443

  • The repo field is not being populated when creating GitHubComment objects. The GitHubComment model includes a repo field (line 2335-2340 in models.py), but it's not being set in the defaults dictionary. Consider adding the repo field to ensure data consistency, similar to how it's done in the model definition.
        default=None,
        field_name="pull_request.merged_at",
    )

    # --- Repo mapping (same style as handle_issue_event) ---
    repo_html_url = repo_data.get("html_url")
    repo_full_name = repo_data.get("full_name")  # "owner/repo" (for logging only)

    if not pr_global_id or not repo_html_url:
        logger.warning("Pull request event missing required data (id or repo_html_url)")
        return JsonResponse({"status": "error", "message": "Missing required data"}, status=400)

website/views/user.py:1471

  • Bot filtering logic is inconsistent with handle_issue_comment_event. In handle_issue_comment_event (line 1415-1417), there's an additional condition checking if is_bot is true AND login is not in an exceptions list. This handler only checks if login is in the bots list OR if is_bot is true, making it more restrictive. Consider aligning the bot filtering logic between both handlers for consistency.
                issue_id=pr_global_id,  # unique per PR (avoids clash with issues)
                repo=repo,
                defaults={
                    "type": "pull_request",
                    "title": pr_title,
                    "body": pr_body,
                    "state": pr_state,
                    "url": pr_html_url,
                    "is_merged": is_merged,
                    "created_at": created_at,
                    "updated_at": updated_at,
                    "closed_at": closed_at,
                    "merged_at": merged_at if is_merged else None,
                    "user_profile": pr_user_profile,

@github-actions github-actions bot added the tests: passed Django tests passed label Dec 24, 2025
@github-actions
Contributor

❌ Pre-commit checks failed

The pre-commit hooks found issues that need to be fixed. Please run the following commands locally to fix them:

# Install pre-commit if you haven't already
pip install pre-commit

# Run pre-commit on all files
pre-commit run --all-files

# Or run pre-commit on staged files only
pre-commit run

After running these commands, the pre-commit hooks will automatically fix most issues.
Please review the changes, commit them, and push to your branch.

💡 Tip: You can set up pre-commit to run automatically on every commit by running:

pre-commit install
Pre-commit output
[INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
[WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version.  Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this.  if it does not -- consider reporting an issue to that repo.
[INFO] Initializing environment for https://github.com/pycqa/isort.
[WARNING] repo `https://github.com/pycqa/isort` uses deprecated stage names (commit, merge-commit, push) which will be removed in a future version.  Hint: often `pre-commit autoupdate --repo https://github.com/pycqa/isort` will fix this.  if it does not -- consider reporting an issue to that repo.
[INFO] Initializing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Initializing environment for https://github.com/djlint/djLint.
[INFO] Initializing environment for local.
[INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/pycqa/isort.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/astral-sh/ruff-pre-commit.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://github.com/djlint/djLint.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for local.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
check python ast.........................................................Passed
check builtin type constructor use.......................................Passed
check yaml...............................................................Passed
fix python encoding pragma...............................................Passed
mixed line ending........................................................Passed
isort....................................................................Failed
- hook id: isort
- files were modified by this hook

Fixing /home/runner/work/BLT/BLT/website/tests/test_webhook_comments.py
Fixing /home/runner/work/BLT/BLT/website/views/user.py


For more information, see the pre-commit documentation.

@github-actions github-actions bot added tests: failed Django tests failed and removed tests: passed Django tests passed labels Dec 24, 2025
@github-actions github-actions bot added last-active: 15d Issue last updated 15 days ago last-active: 16d Issue last updated 16 days ago last-active: 17d Issue last updated 17 days ago and removed last-active: 15d Issue last updated 15 days ago last-active: 16d Issue last updated 16 days ago labels Jan 9, 2026
Copilot AI review requested due to automatic review settings January 11, 2026 23:49
@github-actions github-actions bot removed changes-requested PR has requested changes from a reviewer last-active: 17d Issue last updated 17 days ago labels Jan 11, 2026
Collaborator

@DonnieBLT DonnieBLT left a comment

looks good, can we also please have a command to load the comments in addition to the webhook?
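A backfill command along these lines could satisfy this request; the command name, authentication, and the issue lookup below are assumptions and would need to match how the webhook handlers store PRs (see the review threads further down):

# Hypothetical website/management/commands/backfill_github_comments.py
# Sketch only: pagination, auth headers, and rate-limit handling are omitted.
import requests
from dateutil import parser as dateutil_parser
from django.core.management.base import BaseCommand

from website.models import GitHubComment, GitHubIssue, Repo


class Command(BaseCommand):
    help = "Backfill GitHubComment records for PR comments in OWASP-BLT repos"

    def handle(self, *args, **options):
        for repo in Repo.objects.filter(repo_url__startswith="https://github.com/OWASP-BLT/"):
            owner_repo = repo.repo_url.rstrip("/").split("github.com/")[-1]
            url = f"https://api.github.com/repos/{owner_repo}/issues/comments"
            response = requests.get(url, params={"per_page": 100}, timeout=30)
            response.raise_for_status()
            for comment in response.json():
                issue_number = int(comment["issue_url"].rstrip("/").split("/")[-1])
                # The lookup key must match how the webhook stores PRs
                # (global id vs. PR number) - see the review discussion below.
                github_issue = GitHubIssue.objects.filter(repo=repo, issue_id=issue_number).first()
                if github_issue is None or github_issue.type != "pull_request":
                    continue  # only PR comments count toward the leaderboard
                GitHubComment.objects.update_or_create(
                    comment_id=comment["id"],
                    defaults={
                        "github_issue": github_issue,
                        "body": comment.get("body", ""),
                        "comment_type": "issue_comment",
                        "created_at": dateutil_parser.parse(comment["created_at"]),
                        "updated_at": dateutil_parser.parse(comment["updated_at"]),
                        "url": comment.get("html_url"),
                    },
                )
            self.stdout.write(f"Backfilled comments for {owner_repo}")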

@github-actions
Contributor

❌ Tests failed

The Django tests found issues that need to be fixed. Please review the test output below and fix the failing tests.

How to run tests locally

# Install dependencies
poetry install --with dev

# Run all tests
poetry run python manage.py test

# Run tests with verbose output
poetry run python manage.py test -v 3

# Run a specific test
poetry run python manage.py test app.tests.TestClass.test_method
Test output
WARNING:root:No DATABASE_URL environment variable set, and so no databases setup
Reading .env file from /home/runner/work/BLT/BLT/.env
DATABASE_URL: not set
no database url detected in settings, using sqlite
DEBUG This service is instrumented using OpenTelemetry. OpenTelemetry or one of its components could not be imported; please add compatible versions of opentelemetry-api and opentelemetry-instrumentation packages in order to get Storage Tracing data.
   [ERROR] Error: no such table: auth_user
Traceback (most recent call last):
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/sqlite3/base.py", line 360, in execute
    return super().execute(query, params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sqlite3.OperationalError: no such table: auth_user

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/runner/work/BLT/BLT/test_bacon_system.py", line 39, in <module>
    user, created = User.objects.get_or_create(username="bacon_test_user", defaults={"email": "[email protected]"})
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/manager.py", line 87, in manager_method
    return getattr(self.get_queryset(), name)(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/query.py", line 948, in get_or_create
    return self.get(**kwargs), False
           ^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/query.py", line 631, in get
    num = len(clone)
          ^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/query.py", line 368, in __len__
    self._fetch_all()
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/query.py", line 1954, in _fetch_all
    self._result_cache = list(self._iterable_class(self))
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/query.py", line 93, in __iter__
    results = compiler.execute_sql(
              ^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/models/sql/compiler.py", line 1623, in execute_sql
    cursor.execute(sql, params)
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/utils.py", line 79, in execute
    return self._execute_with_wrappers(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/utils.py", line 92, in _execute_with_wrappers
    return executor(sql, params, many, context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/utils.py", line 100, in _execute
    with self.db.wrap_database_errors:
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/utils.py", line 91, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/utils.py", line 105, in _execute
    return self.cursor.execute(sql, params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/django/db/backends/sqlite3/base.py", line 360, in execute
    return super().execute(query, params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
django.db.utils.OperationalError: no such table: auth_user
   [ERROR] Error: no such table: socialaccount_socialaccount
   [ERROR] Error: no such table: website_activity
============================================================
BACON TOKEN SYSTEM TEST
============================================================

1. Checking signal module...
   [OK] Signal module loaded
   [OK] Function exists: True

2. Testing giveBacon function...

3. Checking users with GitHub connected...

4. Checking connection activities...

5. Checking signal receivers...
   [INFO] Found 1 receivers
   - ((139969401281312, 139969740179584), <weakref at 0x7f4d2abfdc10; to 'function' at 0x7f4d2a715b20 (reward_social_account_connection)>, False)

============================================================
TEST COMPLETE
============================================================

If giveBacon works but users aren't getting tokens,
the signal might not be firing during OAuth signup.

Watch logs during signup: docker-compose logs -f app
Look for: SIGNAL FIRED messages
============================================================
/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/dj_rest_auth/registration/serializers.py:228: UserWarning: app_settings.USERNAME_REQUIRED is deprecated, use: app_settings.SIGNUP_FIELDS['username']['required']
  required=allauth_account_settings.USERNAME_REQUIRED,
/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/dj_rest_auth/registration/serializers.py:230: UserWarning: app_settings.EMAIL_REQUIRED is deprecated, use: app_settings.SIGNUP_FIELDS['email']['required']
  email = serializers.EmailField(required=allauth_account_settings.EMAIL_REQUIRED)
/home/runner/.cache/pypoetry/virtualenvs/blt-yuw0N2NF-py3.11/lib/python3.11/site-packages/dj_rest_auth/registration/serializers.py:288: UserWarning: app_settings.EMAIL_REQUIRED is deprecated, use: app_settings.SIGNUP_FIELDS['email']['required']
  email = serializers.EmailField(required=allauth_account_settings.EMAIL_REQUIRED)
Creating test database for alias 'default' ('file:memorydb_default?mode=memory&cache=shared')...
CommandError: Conflicting migrations detected; multiple leaf nodes in the migration graph: (0261_githubcomment, 0263_githubissue_githubissue_pr_merged_idx_and_more in website).
To fix them run 'python manage.py makemigrations --merge'
Found 314 test(s).

For more information, see the Django testing documentation.
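The conflicting-migrations error in the log above is resolved with the suggested python manage.py makemigrations --merge, which generates an empty merge migration roughly like this (the file name is an assumption):

# Hypothetical website/migrations/0264_merge.py produced by `makemigrations --merge`;
# it only joins the two leaf nodes named in the error and performs no schema changes.
from django.db import migrations


class Migration(migrations.Migration):
    dependencies = [
        ("website", "0261_githubcomment"),
        ("website", "0263_githubissue_githubissue_pr_merged_idx_and_more"),
    ]

    operations = []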

Contributor

Copilot AI left a comment

Pull request overview

Copilot reviewed 5 out of 5 changed files in this pull request and generated 13 comments.

Comment on lines +1654 to +1664
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
Copilot AI Jan 11, 2026

The GitHubComment model has a repo field (line 2347), but the webhook handler doesn't populate it. While the repo can be accessed through github_issue.repo, having it directly on the comment could improve query performance for the leaderboard.

Consider adding the repo to the defaults dictionary ("repo": repo). This would make it easier to filter comments by repo without joining through the github_issue table.

Comment on lines +1654 to +1665
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )
Copilot AI Jan 11, 2026

The webhook handlers for comments are missing the contributor field population. The leaderboard query filters by contributor__isnull=False (line 651), but these handlers only set user_profile and don't create or link a Contributor object. This means all comments created through these webhooks will be excluded from the leaderboard, making it always empty.

The handlers should follow the same pattern as the PR handler (lines 1398-1418) to get or create a Contributor object using the sender's github_id, login, avatar_url, and github_url, then include it in the defaults dictionary.
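A minimal sketch of what that could look like inside the comment handlers; the Contributor field names follow the ones named in this review, and the defaults are assumptions rather than the PR's code:

    # Hypothetical contributor resolution mirroring the PR handler's pattern (excerpt meant
    # to sit inside the handler; field names on Contributor are taken from this review).
    contributor, _ = Contributor.objects.get_or_create(
        github_id=sender.get("id"),
        defaults={
            "name": sender.get("login", ""),
            "github_url": sender.get("html_url", ""),
            "avatar_url": sender.get("avatar_url", ""),
        },
    )

    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "contributor": contributor,  # satisfies contributor__isnull=False in the leaderboard query
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )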

Comment on lines +1641 to +1646
    # Find Issue
    issue_number = issue_data.get("number")
    repo_html_url = repo_data.get("html_url")
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=issue_number, repo=repo)
Copilot AI Jan 11, 2026

The GitHub issue lookup logic has a critical mismatch. The PR handler stores PRs using pr_global_id (GitHub's unique PR ID) as the issue_id field (line 1463), but this handler is trying to look up the issue using issue_number (the visible PR number like #123). These are different values, so this lookup will fail for all PRs created through the webhook.

The handler should either:

  1. Use the PR's global ID from issue_data.get("id") instead of number, or
  2. Store PRs using the PR number instead of the global ID in the PR handler

The first option is preferred to maintain consistency with how PRs are currently stored.

Suggested change
-    # Find Issue
-    issue_number = issue_data.get("number")
-    repo_html_url = repo_data.get("html_url")
-    try:
-        repo = Repo.objects.get(repo_url=repo_html_url)
-        github_issue = GitHubIssue.objects.get(issue_id=issue_number, repo=repo)
+    # Find Issue (for PRs we use the global GitHub ID, not the PR number)
+    issue_global_id = issue_data.get("id")
+    repo_html_url = repo_data.get("html_url")
+    try:
+        repo = Repo.objects.get(repo_url=repo_html_url)
+        github_issue = GitHubIssue.objects.get(issue_id=issue_global_id, repo=repo)

Comment on lines +1695 to +1699
    pr_number = pr_data.get("number")
    repo_html_url = repo_data.get("html_url")
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=pr_number, repo=repo)
Copilot AI Jan 11, 2026

The GitHub issue lookup logic has a critical mismatch similar to the issue_comment handler. The PR handler stores PRs using pr_global_id (GitHub's unique PR ID) as the issue_id field (line 1463), but this handler is trying to look up using pr_number (the visible PR number like #123). These are different values, so this lookup will fail for all PRs created through the webhook.

The handler should use the PR's global ID from pr_data.get("id") instead of number to match how PRs are stored.

Suggested change
-    pr_number = pr_data.get("number")
-    repo_html_url = repo_data.get("html_url")
-    try:
-        repo = Repo.objects.get(repo_url=repo_html_url)
-        github_issue = GitHubIssue.objects.get(issue_id=pr_number, repo=repo)
+    pr_global_id = pr_data.get("id")
+    repo_html_url = repo_data.get("html_url")
+    try:
+        repo = Repo.objects.get(repo_url=repo_html_url)
+        github_issue = GitHubIssue.objects.get(issue_id=pr_global_id, repo=repo)

Comment on lines +657 to +658
.exclude(contributor__name__icontains="copilot") # Exclude copilot contributors
.exclude(contributor__name__icontains="dependabot") # Exclude dependabot
Copilot AI Jan 11, 2026

The bot exclusion logic in the comment leaderboard query is inconsistent with the PR leaderboard above it. The PR leaderboard (lines 589-608) uses a comprehensive bot list including "copilot", "[bot]", "dependabot", "github-actions", and "renovate" with dynamic exclusion logic, while the comment leaderboard only excludes "copilot" and "dependabot" with separate exclude statements.

For consistency and maintainability, the comment leaderboard should use the same bot list and exclusion pattern as the PR leaderboard. Consider reusing the bots list and bot_exclusions query that's already defined above.

Suggested change
-.exclude(contributor__name__icontains="copilot")  # Exclude copilot contributors
-.exclude(contributor__name__icontains="dependabot")  # Exclude dependabot
+.exclude(bot_exclusions)  # Reuse shared bot exclusion logic

Comment on lines +1714 to +1715
"created_at": dateutil_parser.parse(comment_data.get("created_at")),
"updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
Copilot AI Jan 11, 2026

The webhook handler directly calls dateutil_parser.parse() without error handling. If GitHub sends malformed or missing timestamps, this will raise an exception and crash the webhook handler.

Use the safe_parse_github_datetime() helper function (defined at line 1269) instead, which handles parsing errors gracefully and logs warnings. This pattern is used in the PR handler (lines 1421-1440).

Suggested change
"created_at": dateutil_parser.parse(comment_data.get("created_at")),
"updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
"created_at": safe_parse_github_datetime(comment_data.get("created_at")),
"updated_at": safe_parse_github_datetime(comment_data.get("updated_at")),

Comment on lines +1707 to +1718
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "review_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )
Copilot AI Jan 11, 2026

The webhook handler for review comments is also missing the contributor field population. Like the issue_comment handler, this will cause all review comments to be excluded from the leaderboard since it filters by contributor__isnull=False.

The handler should follow the same pattern as the PR handler (lines 1398-1418) to get or create a Contributor object and include it in the defaults dictionary.

Comment on lines +1638 to +1692
    if login.lower() in bots or (is_bot and login.lower() not in ["..."]):  # Add exceptions if any bot is allowed
        return JsonResponse({"status": "ignored", "reason": "Bot comment"}, status=200)

    # Find Issue
    issue_number = issue_data.get("number")
    repo_html_url = repo_data.get("html_url")
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=issue_number, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)

    # Find/Create User Profile
    user_profile = UserProfile.objects.filter(github_url=sender.get("html_url")).first()

    # Create Comment
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )
    return JsonResponse({"status": "success"}, status=200)


def handle_review_comment_event(payload):
    action = payload.get("action")
    if action not in ["created", "edited"]:
        return JsonResponse({"status": "ignored"}, status=200)

    comment_data = payload.get("comment", {})
    pr_data = payload.get("pull_request", {})
    sender = payload.get("sender", {})
    repo_data = payload.get("repository", {})

    # Bot filtering
    login = sender.get("login", "")
    is_bot = sender.get("type") == "Bot" or "bot" in login.lower()
    bots = [
        "copilot",
        "coderabbitai",
        "github-actions[bot]",
        "dependabot",
        "sonarcloud",
        "vercel",
        "netlify",
    ]
    if login.lower() in bots or is_bot:
        return JsonResponse({"status": "ignored", "reason": "Bot comment"}, status=200)
Copilot AI Jan 11, 2026

The bot filtering logic is inconsistent between the two handlers. In handle_issue_comment_event, there's a complex condition login.lower() not in ["..."] that checks for bot exceptions (line 1638), but in handle_review_comment_event, bots are blocked unconditionally with just is_bot (line 1691).

If bot exceptions are needed, both handlers should use the same logic. If no exceptions are needed, the complex condition with ["..."] should be simplified to match the cleaner logic in the review_comment handler.

Comment on lines +1629 to +1690
    bots = [
        "copilot",
        "coderabbitai",
        "github-actions[bot]",
        "dependabot",
        "sonarcloud",
        "vercel",
        "netlify",
    ]
    if login.lower() in bots or (is_bot and login.lower() not in ["..."]):  # Add exceptions if any bot is allowed
        return JsonResponse({"status": "ignored", "reason": "Bot comment"}, status=200)

    # Find Issue
    issue_number = issue_data.get("number")
    repo_html_url = repo_data.get("html_url")
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=issue_number, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)

    # Find/Create User Profile
    user_profile = UserProfile.objects.filter(github_url=sender.get("html_url")).first()

    # Create Comment
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )
    return JsonResponse({"status": "success"}, status=200)


def handle_review_comment_event(payload):
    action = payload.get("action")
    if action not in ["created", "edited"]:
        return JsonResponse({"status": "ignored"}, status=200)

    comment_data = payload.get("comment", {})
    pr_data = payload.get("pull_request", {})
    sender = payload.get("sender", {})
    repo_data = payload.get("repository", {})

    # Bot filtering
    login = sender.get("login", "")
    is_bot = sender.get("type") == "Bot" or "bot" in login.lower()
    bots = [
        "copilot",
        "coderabbitai",
        "github-actions[bot]",
        "dependabot",
        "sonarcloud",
        "vercel",
        "netlify",
    ]
Copilot AI Jan 11, 2026

The bot filtering list is duplicated between the two webhook handlers. This creates a maintenance burden as any updates to the bot list must be made in multiple places.

Consider extracting the bot list to a constant at the module level (e.g., GITHUB_BOT_USERNAMES) and reusing it in both handlers. This follows the DRY principle and ensures consistency.
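A sketch of the shared constant and a small helper that both handlers could call; the names GITHUB_BOT_USERNAMES and _is_bot_sender are illustrative, not from this PR:

# Module-level bot list shared by both comment handlers (hypothetical names).
GITHUB_BOT_USERNAMES = [
    "copilot",
    "coderabbitai",
    "github-actions[bot]",
    "dependabot",
    "sonarcloud",
    "vercel",
    "netlify",
]


def _is_bot_sender(sender):
    # True when the webhook sender looks like a bot account.
    login = sender.get("login", "").lower()
    return sender.get("type") == "Bot" or "bot" in login or login in GITHUB_BOT_USERNAMES


# In both handle_issue_comment_event and handle_review_comment_event:
# if _is_bot_sender(sender):
#     return JsonResponse({"status": "ignored", "reason": "Bot comment"}, status=200)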

"html_url": "https://github.com/owner/repo/issues/1#issuecomment-1001",
},
"repository": {"html_url": "https://github.com/owner/repo", "full_name": "owner/repo"},
"sender": {"login": "testuser", "html_url": "https://github.com/testuser", "type": "User"},
Copilot AI Jan 11, 2026

The test payload is missing the sender's id field which is required for creating Contributor objects. The PR webhook handler uses github_id (from sender.id) as the primary unique identifier for contributors (see lines 1399-1402). Without this field in the test payload, the contributor creation logic cannot work properly.

Add "id" to the sender dictionary with a test value like 12345.

Contributor

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 5

🤖 Fix all issues with AI agents
In @website/views/user.py:
- Around line 1694-1701: The lookup is using the PR's number (pr_number) but
GitHubIssue records use the global PR id, so change the lookup to use the PR
global id from pr_data (e.g., pr_id = pr_data.get("id")) instead of pr_number
when calling GitHubIssue.objects.get(issue_id=..., repo=repo); ensure the
variable used in the query matches the stored identifier and update any related
variable names (pr_number → pr_id) to avoid mismatched lookups.
- Around line 1707-1718: The GitHubComment.update_or_create call is missing the
contributor field and uses unsafe dateutil_parser.parse for
created_at/updated_at; update the defaults to include "contributor":
user_profile.contributor (or the correct contributor attr on user_profile) and
replace the direct dateutil_parser.parse(...) calls with a safe parse (e.g., a
parse_datetime_safe helper or a try/except around dateutil_parser.parse that
falls back to timezone.now() or None on failure) so malformed timestamps don’t
crash the handler; modify the GitHubComment.objects.update_or_create defaults
accordingly.
- Around line 1661-1662: The current construction uses dateutil_parser.parse on
comment_data.get("created_at") and ("updated_at") which can raise ParserError on
None/malformed input; replace those calls with the existing
safe_parse_github_datetime helper (defined earlier) when building the comment
dict so created_at and updated_at use
safe_parse_github_datetime(comment_data.get("created_at")) and
safe_parse_github_datetime(comment_data.get("updated_at")) to avoid crashes and
keep behavior consistent.
- Around line 1654-1665: The GitHubComment is being created without setting the
contributor field, so it never matches leaderboard filters
(contributor__isnull=False); update the handler to perform the same contributor
lookup used in handle_pull_request_event (e.g., resolve or get/create the
Contributor for the given user_profile and repository) and include the resulting
Contributor in the defaults passed to GitHubComment.objects.update_or_create
(add the contributor key to the defaults for GitHubComment).
- Around line 1644-1648: The lookup fails because handle_issue_comment_event
queries GitHubIssue by the visible issue_number instead of the PR's stored
global id; change the code that pulls the issue identifier to use pr_id =
issue_data.get("id") and query Repo and GitHubIssue with issue_id=pr_id
(matching how handle_pull_request_event stores it). Replace
dateutil_parser.parse(...) calls with safe_parse_github_datetime(...) when
setting datetime fields on GitHubComment to avoid crashes on None/malformed
timestamps. When creating the GitHubComment, populate the contributor field by
finding-or-creating a Contributor from the comment author (link the found
Contributor to the GitHubComment) so contributor is not left null; keep using
the existing Repo, GitHubIssue, GitHubComment, Contributor model names to locate
the code to update.
🧹 Nitpick comments (2)
website/views/user.py (2)

645-671: Inconsistent bot filtering compared to other leaderboards.

The PR leaderboard (lines 589-608) excludes more bots using a dynamic query: ["copilot", "[bot]", "dependabot", "github-actions", "renovate"]. The comment leaderboard only excludes copilot and dependabot, which could allow other bots like github-actions or renovate to appear.

Consider using the same bot_exclusions pattern for consistency:

♻️ Proposed fix to use consistent bot filtering
         comment_leaderboard = (
             GitHubComment.objects.filter(
                 github_issue__type="pull_request",
                 created_at__gte=since_date,
                 contributor__isnull=False,  # Only comments with contributors
             )
             .filter(
                 Q(github_issue__repo__repo_url__startswith="https://github.com/OWASP-BLT/")
                 | Q(github_issue__repo__repo_url__startswith="https://github.com/owasp-blt/")
             )
-            .exclude(contributor__name__icontains="copilot")  # Exclude copilot contributors
-            .exclude(contributor__name__icontains="dependabot")  # Exclude dependabot
+            .exclude(bot_exclusions)  # Exclude bot contributors (same as PR leaderboard)
             .select_related("contributor")

1638-1638: Confusing bot filtering condition with placeholder.

The condition login.lower() not in ["..."] appears to be a placeholder. This makes the logic confusing. If there are no bot exceptions, simplify to just check is_bot:

♻️ Proposed cleanup
-    if login.lower() in bots or (is_bot and login.lower() not in ["..."]):  # Add exceptions if any bot is allowed
+    if login.lower() in bots or is_bot:
         return JsonResponse({"status": "ignored", "reason": "Bot comment"}, status=200)
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

Knowledge base: Disabled due to Reviews -> Disable Knowledge Base setting

📥 Commits

Reviewing files that changed from the base of the PR and between d6654e2 and 77182ee.

📒 Files selected for processing (1)
  • website/views/user.py
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (4)
  • GitHub Check: Run Tests
  • GitHub Check: Agent
  • GitHub Check: CodeQL analysis (python)
  • GitHub Check: docker-test
🔇 Additional comments (2)
website/views/user.py (2)

20-20: LGTM!

The new imports (Max for aggregation and GitHubComment model) are correctly added and properly used by the new leaderboard query and webhook handlers.

Also applies to: 46-46


1334-1335: LGTM!

The new webhook event handlers are correctly wired to their respective GitHub event types (issue_comment and pull_request_review_comment).

Comment on lines +1644 to +1648
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=issue_number, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)
Contributor

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

# First, locate the user.py file and check its structure
fd -t f "user\.py" | grep -E "(views|website)" | head -5

Repository: OWASP-BLT/BLT

Length of output: 307


🏁 Script executed:

# Find the GitHubIssue model definition
fd -t f -e py | xargs rg -l "class GitHubIssue" | head -5

Repository: OWASP-BLT/BLT

Length of output: 160


🏁 Script executed:

# Look for handle_pull_request_event to see how it stores PR data
rg -n "def handle_pull_request_event" -A 50 | head -100

Repository: OWASP-BLT/BLT

Length of output: 3661


🏁 Script executed:

# Look for handle_issue_event to see how it stores issue data
rg -n "def handle_issue_event" -A 50 | head -100

Repository: OWASP-BLT/BLT

Length of output: 3409


🏁 Script executed:

# Get the handle_issue_comment_event function
rg -n "def handle_issue_comment_event" -A 70 website/views/user.py | head -150

Repository: OWASP-BLT/BLT

Length of output: 3135


🏁 Script executed:

# Check the GitHubIssue model definition
rg -n "class GitHubIssue" -A 30 website/models.py

Repository: OWASP-BLT/BLT

Length of output: 1346


🏁 Script executed:

# Check if there are any other references to issue_number in handle_issue_comment_event
rg -n "issue_number\|issue_id" website/views/user.py | grep -A 5 -B 5 "1644\|1646"

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Search for how other parts of the code access PR data from issue_data
rg -n "issue_data\.get" website/views/user.py | head -30

Repository: OWASP-BLT/BLT

Length of output: 372


🏁 Script executed:

# Check if issue_data.get("id") is used elsewhere
rg -n "issue_data.*get.*id" website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Look at handle_review_comment_event to see if it has similar code
rg -n "def handle_review_comment_event" -A 30 website/views/user.py | head -60

Repository: OWASP-BLT/BLT

Length of output: 1271


🏁 Script executed:

# Check if there are any comments or documentation about the payload structure
rg -n "pull_request.*issue_data\|issue_data.*id" website/views/user.py -B 3 -A 3

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Look at the entire handle_issue_comment_event section to see all uses of issue_data
sed -n '1612,1667p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 2280


🏁 Script executed:

# Check what the PR handler stores and how it queries
sed -n '1353,1440p' website/views/user.py | grep -E "issue_id|get_or_create|filter"

Repository: OWASP-BLT/BLT

Length of output: 381


🏁 Script executed:

# Check the GitHub API webhook documentation or any comments about issue_data structure
rg -n "issue_data.*id\|issue_data.*number" website/views/user.py -B 2 -A 2

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Compare how handle_review_comment_event handles the PR lookup
sed -n '1669,1710p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 1565


🏁 Script executed:

# Check if there are any tests that show what issue_data looks like for PR comments
rg -n "issue_comment\|pull_request" website/tests/test_issues.py -A 5 -B 5 | head -80

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Check the GitHub API event payload structure by looking at any documentation in comments
rg -n "GitHub.*payload\|webhook" website/views/user.py -B 2 -A 5 | head -50

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Look at how handle_pull_request_event accesses pr_data to understand the structure
sed -n '1373,1378p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 427


🏁 Script executed:

# Search for any usage of issue_data.get("id")
rg -n 'issue_data\.get\("id"\)' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Look at the update_or_create call in handle_pull_request_event
sed -n '1420,1480p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 2473


🏁 Script executed:

# Also check the type field when saving
rg -n "GitHubIssue.objects" website/views/user.py | grep -E "update_or_create|filter.*type"

Repository: OWASP-BLT/BLT

Length of output: 135


🏁 Script executed:

# Check if safe_parse_github_datetime exists and how it's used
rg -n "safe_parse_github_datetime\|dateutil_parser" website/views/user.py | head -20

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Check the GitHubComment model to see if it has a contributor field
rg -n "class GitHubComment" -A 20 website/models.py

Repository: OWASP-BLT/BLT

Length of output: 772


🏁 Script executed:

# Verify if contributor field is required in GitHubComment
rg -n "contributor" website/models.py | grep -i "comment\|field" | head -10

Repository: OWASP-BLT/BLT

Length of output: 799


🏁 Script executed:

# Search for safe_parse_github_datetime definition
rg -n "def safe_parse_github_datetime" website/

Repository: OWASP-BLT/BLT

Length of output: 154


🏁 Script executed:

# Check how dateutil_parser is imported and used in the entire file
rg -n "from dateutil\|import dateutil\|dateutil_parser" website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Look at what's actually being used for datetime parsing in handle_issue_comment_event
sed -n '1661,1662p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 218


🏁 Script executed:

# Check how handle_review_comment_event or other handlers set contributor on GitHubComment
rg -n "GitHubComment.*update_or_create" -A 15 website/views/user.py | head -80

Repository: OWASP-BLT/BLT

Length of output: 1455


🏁 Script executed:

# Check if there's a leaderboard query that filters on contributor__isnull
rg -n "contributor__isnull\|contributor=None" website/ -r

Repository: OWASP-BLT/BLT

Length of output: 470


🏁 Script executed:

# Check the safe_parse_github_datetime function implementation
sed -n '1269,1290p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 703


🏁 Script executed:

# Check GitHub API docs reference in comments or look for payload examples
rg -n "GitHub.*PR.*comment\|issue_comment.*PR" website/views/user.py -B 2 -A 2

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Let me verify: does issue_data have an "id" field that we could use instead of "number"?
# Check if issue_data.get("id") is available by looking at how it's used elsewhere
rg -n "\.get\(\"id\"\)" website/views/user.py | grep issue

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Check the pull_request reference structure in issue_data
sed -n '1622,1625p' website/views/user.py

Repository: OWASP-BLT/BLT

Length of output: 294


🏁 Script executed:

# Search for any test fixtures or mock data showing PR comment payload structure
rg -n "issue_comment\|pull_request.*webhook" website/tests/ -B 3 -A 10 | head -100

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Check if there's any GitHub API reference or documentation in comments
rg -n "GitHub API\|github.com.*issue" website/views/user.py | head -20

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Final check: does the code ever use issue_data.get("id") for a PR?
rg -n 'issue_data.*"id"' website/

Repository: OWASP-BLT/BLT

Length of output: 484


🏁 Script executed:

# Check context around issue.py line 2408
sed -n '2400,2415p' website/views/issue.py

Repository: OWASP-BLT/BLT

Length of output: 715


🏁 Script executed:

# Also check organization.py line 3299
sed -n '3290,3310p' website/views/organization.py

Repository: OWASP-BLT/BLT

Length of output: 1052


🏁 Script executed:

# Now verify: when handle_issue_comment_event checks "pull_request" in issue_data, 
# is issue_data actually the PR object with its "id" field?
# Let's look at what the GitHub webhook payload structure looks like
rg -n "pull_request.*issue\|issue.*pull_request" website/views/user.py | head -10

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Check if there are any comments in the code explaining the issue_data structure
rg -n "GitHub webhook\|payload.*issue\|issue_data.*structure" website/ -B 2 -A 2 | head -40

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# One more check: let's see if issue_data actually contains "id" by checking other comment handlers
rg -n "issue_data\|comment_data" website/views/user.py | grep -E "1612|1669" -A 5 -B 5

Repository: OWASP-BLT/BLT

Length of output: 39


🏁 Script executed:

# Final verification: Check if there's any code that uses issue_data.get("id") in a PR comment context
sed -n '1612,1720p' website/views/user.py | grep -n "get.*id\|get.*number"

Repository: OWASP-BLT/BLT

Length of output: 401


PR comment lookups will fail due to identifier mismatch.

When handle_pull_request_event (line 1462) stores a PR, it uses issue_id=pr_global_id (the PR's unique numeric id field). However, handle_issue_comment_event (line 1646) queries for the same PR using issue_id=issue_number (the visible PR number like #42). These are different values—the lookup will always fail, and PR comments won't be recorded.

Additionally:

  • Lines 1661-1662 use unsafe dateutil_parser.parse() instead of safe_parse_github_datetime(), risking crashes if timestamps are None or malformed
  • The contributor field is never set on GitHubComment, leaving it null despite being queryable

Change line 1642 to extract the PR's global ID: pr_id = issue_data.get("id") and query with issue_id=pr_id. Use safe_parse_github_datetime() for datetime fields and create/link a Contributor record if available.
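For illustration, a minimal sketch of that lookup under the assumptions above (payload is the parsed webhook body; the helper name and return shape are illustrative, not the project's actual code):

from django.http import JsonResponse

from website.models import GitHubIssue, Repo

def resolve_pr_for_issue_comment(payload):
    # Sketch only: issue_comment payloads carry a "pull_request" key when the
    # comment is on a PR; plain issue comments are ignored.
    issue_data = payload.get("issue", {})
    repo_data = payload.get("repository", {})
    if "pull_request" not in issue_data:
        return None, JsonResponse({"status": "ignored", "reason": "Not a PR comment"}, status=200)

    # Assumption: the stored GitHubIssue.issue_id matches this id. If
    # handle_pull_request_event stores pull_request["id"], note that GitHub's
    # issue id and pull request id are different numbers for the same PR,
    # so the chosen key should be verified against what that handler saves.
    pr_id = issue_data.get("id")
    try:
        repo = Repo.objects.get(repo_url=repo_data.get("html_url"))
        github_issue = GitHubIssue.objects.get(issue_id=pr_id, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return None, JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)
    return github_issue, None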

🤖 Prompt for AI Agents
In @website/views/user.py around lines 1644-1648: the lookup fails because
handle_issue_comment_event queries GitHubIssue by the visible issue_number
instead of the PR's stored global id; change the code that pulls the issue
identifier to use pr_id = issue_data.get("id") and query Repo and GitHubIssue
with issue_id=pr_id (matching how handle_pull_request_event stores it). Replace
dateutil_parser.parse(...) calls with safe_parse_github_datetime(...) when
setting datetime fields on GitHubComment to avoid crashes on None/malformed
timestamps. When creating the GitHubComment, populate the contributor field by
finding or creating a Contributor from the comment author (link the found
Contributor to the GitHubComment) so contributor is not left null; keep using
the existing Repo, GitHubIssue, GitHubComment, Contributor model names to locate
the code to update.

Comment on lines +1654 to +1665
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "issue_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )

⚠️ Potential issue | 🔴 Critical

Missing contributor association breaks leaderboard display.

The GitHubComment is created without setting the contributor field. However, the leaderboard query (line 651) filters with contributor__isnull=False, so comments created by this handler will never appear on the leaderboard.

Add contributor lookup similar to handle_pull_request_event:

🐛 Proposed fix to add contributor association
     # Find/Create User Profile
     user_profile = UserProfile.objects.filter(github_url=sender.get("html_url")).first()

+    # Find/Create Contributor for leaderboard
+    contributor = None
+    gh_id = sender.get("id")
+    gh_login = sender.get("login")
+    gh_avatar = sender.get("avatar_url")
+    gh_github_url = sender.get("html_url")
+    if gh_id is not None:
+        try:
+            contributor, _ = Contributor.objects.get_or_create(
+                github_id=gh_id,
+                defaults={
+                    "github_url": gh_github_url or "",
+                    "name": gh_login or "",
+                    "avatar_url": gh_avatar or "",
+                    "contributor_type": "User",
+                    "contributions": 0,
+                },
+            )
+        except Exception as e:
+            logger.error(f"Error getting/creating Contributor for comment: {e}")
+
     # Create Comment
     GitHubComment.objects.update_or_create(
         comment_id=comment_data.get("id"),
         defaults={
             "github_issue": github_issue,
             "user_profile": user_profile,
+            "contributor": contributor,
             "body": comment_data.get("body", ""),
🤖 Prompt for AI Agents
In @website/views/user.py around lines 1654-1665: the GitHubComment is being
created without setting the contributor field, so it never matches leaderboard
filters (contributor__isnull=False); update the handler to perform the same
contributor lookup used in handle_pull_request_event (e.g., resolve or
get/create the Contributor for the given user_profile and repository) and
include the resulting Contributor in the defaults passed to
GitHubComment.objects.update_or_create (add the contributor key to the defaults
for GitHubComment).
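For context, a hedged sketch of the kind of aggregation the leaderboard query at line 651 likely performs; the field names, bot filter, and six-month window are assumptions taken from the PR description, not the exact BLT code. It shows why a comment saved with contributor=None can never surface:

from datetime import timedelta

from django.db.models import Count
from django.utils import timezone

from website.models import GitHubComment

six_months_ago = timezone.now() - timedelta(days=182)
comment_leaderboard = (
    GitHubComment.objects.filter(
        created_at__gte=six_months_ago,
        contributor__isnull=False,  # rows without a linked Contributor are dropped here
    )
    .exclude(contributor__name__icontains="bot")  # assumed bot filter
    .values("contributor__name", "contributor__avatar_url", "contributor__github_url")
    .annotate(comment_count=Count("id"))
    .order_by("-comment_count")[:10]
)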

Comment on lines +1661 to +1662
"created_at": dateutil_parser.parse(comment_data.get("created_at")),
"updated_at": dateutil_parser.parse(comment_data.get("updated_at")),

⚠️ Potential issue | 🟠 Major

Unsafe datetime parsing may crash on malformed input.

Calling dateutil_parser.parse() directly raises a TypeError when created_at or updated_at is None and a ParserError when the string is malformed. Use the existing safe_parse_github_datetime() helper (defined at line 1269) for consistency and robustness.

🔧 Proposed fix
-            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
-            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
+            "created_at": safe_parse_github_datetime(
+                comment_data.get("created_at"),
+                default=timezone.now(),
+                field_name="comment.created_at",
+            ),
+            "updated_at": safe_parse_github_datetime(
+                comment_data.get("updated_at"),
+                default=timezone.now(),
+                field_name="comment.updated_at",
+            ),
🤖 Prompt for AI Agents
In @website/views/user.py around lines 1661-1662: the current construction
uses dateutil_parser.parse on comment_data.get("created_at") and ("updated_at")
which can raise ParserError on None/malformed input; replace those calls with
the existing safe_parse_github_datetime helper (defined earlier) when building
the comment dict so created_at and updated_at use
safe_parse_github_datetime(comment_data.get("created_at")) and
safe_parse_github_datetime(comment_data.get("updated_at")) to avoid crashes and
keep behavior consistent.
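Purely for illustration, a fallback wrapper of the kind the prompt describes might look like the following; parse_github_datetime_or_now is a hypothetical name, and the project's real safe_parse_github_datetime (line 1269) should be preferred where it exists:

from dateutil import parser as dateutil_parser
from django.utils import timezone

def parse_github_datetime_or_now(value):
    # Hypothetical helper: tolerate None and malformed timestamps instead of crashing.
    if not value:
        return timezone.now()
    try:
        return dateutil_parser.parse(value)
    except (ValueError, TypeError, OverflowError):
        return timezone.now()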

Comment on lines +1694 to +1701
    # Find Issue (PR is also a GitHubIssue)
    pr_number = pr_data.get("number")
    repo_html_url = repo_data.get("html_url")
    try:
        repo = Repo.objects.get(repo_url=repo_html_url)
        github_issue = GitHubIssue.objects.get(issue_id=pr_number, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)

⚠️ Potential issue | 🔴 Critical

Same GitHubIssue lookup issue applies here.

This handler has the same problem: it queries by pr_number but PRs are stored with their global id. The lookup will fail and review comments won't be recorded.
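A minimal fragment sketch of the corresponding change inside handle_review_comment_event, with the same caveat about which identifier handle_pull_request_event actually stores:

    # Fragment (inside handle_review_comment_event): the payload's pull_request
    # object exposes the PR's global id directly, so no detour via the issue is needed.
    pr_id = pr_data.get("id")  # assumed to match the issue_id stored when the PR was saved
    try:
        repo = Repo.objects.get(repo_url=repo_data.get("html_url"))
        github_issue = GitHubIssue.objects.get(issue_id=pr_id, repo=repo)
    except (Repo.DoesNotExist, GitHubIssue.DoesNotExist):
        return JsonResponse({"status": "ignored", "reason": "Repo or Issue not found"}, status=200)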

🤖 Prompt for AI Agents
In @website/views/user.py around lines 1694-1701: the lookup is using the PR's
number (pr_number) but GitHubIssue records use the global PR id, so change the
lookup to use the PR global id from pr_data (e.g., pr_id = pr_data.get("id"))
instead of pr_number when calling GitHubIssue.objects.get(issue_id=...,
repo=repo); ensure the variable used in the query matches the stored identifier
and update any related variable names (pr_number → pr_id) to avoid mismatched
lookups.

Comment on lines +1707 to +1718
    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "body": comment_data.get("body", ""),
            "comment_type": "review_comment",
            "created_at": dateutil_parser.parse(comment_data.get("created_at")),
            "updated_at": dateutil_parser.parse(comment_data.get("updated_at")),
            "url": comment_data.get("html_url"),
        },
    )

⚠️ Potential issue | 🔴 Critical

Missing contributor and unsafe datetime parsing (same issues as issue_comment handler).

This handler has the same issues:

  1. Missing contributor field - comments won't appear on leaderboard
  2. Unsafe dateutil_parser.parse() calls that can crash on malformed input

Apply the same fixes as suggested for handle_issue_comment_event.
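As a hedged sketch only, the review-comment write with both fixes folded in might look roughly like this, assuming contributor has been resolved from the webhook sender as in the earlier suggestion and that safe_parse_github_datetime accepts the keyword arguments shown in the proposed fix above:

    GitHubComment.objects.update_or_create(
        comment_id=comment_data.get("id"),
        defaults={
            "github_issue": github_issue,
            "user_profile": user_profile,
            "contributor": contributor,  # without this, the contributor__isnull=False filter drops the row
            "body": comment_data.get("body", ""),
            "comment_type": "review_comment",
            "created_at": safe_parse_github_datetime(
                comment_data.get("created_at"),
                default=timezone.now(),
                field_name="review_comment.created_at",
            ),
            "updated_at": safe_parse_github_datetime(
                comment_data.get("updated_at"),
                default=timezone.now(),
                field_name="review_comment.updated_at",
            ),
            "url": comment_data.get("html_url"),
        },
    )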

🤖 Prompt for AI Agents
In @website/views/user.py around lines 1707-1718: the
GitHubComment.update_or_create call is missing the contributor field and uses
unsafe dateutil_parser.parse for created_at/updated_at; update the defaults to
include "contributor": user_profile.contributor (or the correct contributor attr
on user_profile) and replace the direct dateutil_parser.parse(...) calls with a
safe parse (e.g., a parse_datetime_safe helper or a try/except around
dateutil_parser.parse that falls back to timezone.now() or None on failure) so
malformed timestamps don’t crash the handler; modify the
GitHubComment.objects.update_or_create defaults accordingly.

@github-actions github-actions bot added last-active: 0d PR last updated 0 days ago last-active: 1d PR last updated 1 day ago last-active: 2d PR last updated 2 days ago last-active: 3d PR last updated 3 days ago last-active: 4d PR last updated 4 days ago last-active: 5d Issue last updated 5 days ago last-active: 6d Issue last updated 6 days ago and removed last-active: 0d PR last updated 0 days ago last-active: 1d PR last updated 1 day ago last-active: 2d PR last updated 2 days ago last-active: 3d PR last updated 3 days ago last-active: 4d PR last updated 4 days ago last-active: 5d Issue last updated 5 days ago labels Jan 12, 2026
@github-actions github-actions bot added last-active: 7d Issue last updated 7 days ago and removed last-active: 6d Issue last updated 6 days ago labels Jan 19, 2026

Labels

files-changed: 5 (PR changes 5 files)
last-active: 7d (Issue last updated 7 days ago)
migrations (PR contains database migration files)
needs-peer-review (PR needs peer review)
pre-commit: passed (Pre-commit checks passed)
quality: high
reviewed
tests: failed (Django tests failed)

Projects

Status: Ready

Development

Successfully merging this pull request may close these issues.

Have a GitHub comment leaderboard

2 participants