Comparing changes

base repository: crmne/ruby_llm
base: 1.7.1
head repository: crmne/ruby_llm
compare: 1.8.0
  • 12 commits
  • 335 files changed
  • 8 contributors

Commits on Sep 11, 2025

  1. Bust README gem cache

    crmne authored Sep 11, 2025
    fa10f0c

Commits on Sep 13, 2025

  1. Fix create_table migrations to prevent foreign key errors (#409) (#411)

    ## Description
    This PR fixes issue #409, where the `rails generate ruby_llm:install`
    generator creates migrations with foreign key references to tables that
    haven't been created yet, causing migration failures.
    
    ## Problem
    When running the install generator on a fresh Rails application, the
    generated migrations contain `t.references` calls that try to create
    foreign key constraints on tables that are created in later migrations.
    PostgreSQL (and other databases) then fails with "relation does not
    exist" errors because the referenced tables don't exist yet.
    
    ## Solution
    1. Removes the `t.references` lines from the initial table creation
    migrations
    2. Creates a new migration that runs after all tables are created to add
    the foreign key references.
    
    This can be tested with a brand new rails app by adding the gem from
    this branch:
    `gem "ruby_llm", git: "https://github.com/matiasmoya/ruby_llm", branch:
    "409-fix-install-migrations"`
    
    ## What this does
    
    - Remove `t.references` calls from initial table creation migrations
    - Add new migration generator template for adding references separately
    - Update install generator to create tables first, then add references
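
    The two-step ordering can be sketched as follows. This is a hedged
    illustration using Rails' standard migration API; the class, table, and
    column names are placeholders, not the generator's actual templates.

    ```ruby
    # Illustrative only -- names are placeholders for the real templates.
    class CreateMessages < ActiveRecord::Migration[7.1]
      def change
        create_table :messages do |t|
          # No t.references here: the referenced tables may not exist yet.
          t.string :role
          t.timestamps
        end
      end
    end

    # Generated to run after every create_table migration, so the foreign
    # key targets are guaranteed to exist.
    class AddReferences < ActiveRecord::Migration[7.1]
      def change
        add_reference :messages, :chat, foreign_key: true
      end
    end
    ```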
    
    ## Type of change
    
    - [x] Bug fix
    
    ## Scope check
    
    - [x] I read the [Contributing
    Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
    - [x] This aligns with RubyLLM's focus on **LLM communication**
    - [x] This isn't application-specific logic that belongs in user code
    - [x] This benefits most users, not just my specific use case
    
    ## Quality check
    
    - [x] I ran `overcommit --install` and all hooks pass
    - [x] I tested my changes thoroughly
    - [x] For provider changes: Re-recorded VCR cassettes with `bundle exec
    rake vcr:record[provider_name]`
      - [x] All tests pass: `bundle exec rspec`
    - [x] I updated documentation if needed
    - [x] I didn't modify auto-generated files manually (`models.json`,
    `aliases.json`)
    
    ## API changes
    
    - [x] No API changes
    
    ## Related issues
    
    Fixes #409
    matiasmoya authored Sep 13, 2025
    0d23da4
  2. Fix: Add resolve method delegation from Models instance to class (#407)

    ## Summary
    - Adds missing `resolve` instance method that delegates to the class
    method
    - Fixes usage of `RubyLLM.models.resolve` in the migration template
    - Includes comprehensive test coverage with TDD approach
    
    ## Context
    The upgrade migration template at
    `lib/generators/ruby_llm/upgrade_to_v1_7/templates/migration.rb.tt` uses
    `RubyLLM.models.resolve` on lines 74 and 139. However, `RubyLLM.models`
    returns a `Models` instance, which didn't have a `resolve` method; it
    was only available as a class method.
    
    ## Changes
    - Added `resolve` instance method in `Models` class that delegates to
    `self.class.resolve`
    - Added test coverage for all three usage patterns:
      - With provider specified
      - Without provider (auto-detection)
      - With `assume_exists` option for unknown models
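
    The delegation itself is simple; here is a hedged, self-contained sketch
    in which the class-level lookup is replaced by a stub that returns its
    arguments (the real `Models.resolve` performs model resolution):

    ```ruby
    class Models
      # Stub standing in for the real class-level lookup logic.
      def self.resolve(model_id, provider: nil, assume_exists: false)
        [model_id, provider, assume_exists]
      end

      # The fix: forward instance calls to the class method, so code holding
      # a Models instance (as RubyLLM.models returns) can call resolve.
      def resolve(...)
        self.class.resolve(...)
      end
    end

    Models.new.resolve("gpt-4o", provider: :openai)
    # => ["gpt-4o", :openai, false]
    ```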
    
    ## Test plan
    - [x] Run `bundle exec rspec spec/ruby_llm/models_spec.rb -e resolve` -
    all tests pass
    - [x] Verified the migration template can now use
    `RubyLLM.models.resolve` without errors
    
    🤖 Generated with [Claude Code](https://claude.ai/code)
    
    ---------
    
    Co-authored-by: Claude <[email protected]>
    Co-authored-by: Carmine Paolino <[email protected]>
    3 people authored Sep 13, 2025
    078ef25
  3. Models helpers should return all models supporting their modalities (#408)

    `image_models`, `audio_models`, and `embedding_models` should each
    return every model that outputs the corresponding modality.
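
    The intended behavior can be sketched with stand-in data; the model IDs
    and the `output_modalities` attribute below are illustrative, not
    RubyLLM's internals:

    ```ruby
    Model = Struct.new(:id, :output_modalities)

    # Stand-in catalog; real data comes from RubyLLM's model registry.
    CATALOG = [
      Model.new("gpt-4o", [:text, :image]),
      Model.new("tts-1", [:audio]),
      Model.new("text-embedding-3-small", [:embeddings])
    ].freeze

    # Each helper returns every model whose *output* includes the modality.
    def models_outputting(modality)
      CATALOG.select { |m| m.output_modalities.include?(modality) }
    end

    def image_models     = models_outputting(:image)
    def audio_models     = models_outputting(:audio)
    def embedding_models = models_outputting(:embeddings)

    image_models.map(&:id) # => ["gpt-4o"]
    ```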
    
    ## What this does
    
    
    ## Type of change
    
    - [x] Bug fix
    - [ ] New feature
    - [ ] Breaking change
    - [ ] Documentation
    - [ ] Performance improvement
    
    ## Scope check
    
    - [x] I read the [Contributing
    Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
    - [x] This aligns with RubyLLM's focus on **LLM communication**
    - [x] This isn't application-specific logic that belongs in user code
    - [x] This benefits most users, not just my specific use case
    
    ## Quality check
    
    - [x] I ran `overcommit --install` and all hooks pass
    - [x] I tested my changes thoroughly
    - [ ] For provider changes: Re-recorded VCR cassettes with `bundle exec
    rake vcr:record[provider_name]`
      - [x] All tests pass: `bundle exec rspec`
    - [x] I updated documentation if needed
    - [x] I didn't modify auto-generated files manually (`models.json`,
    `aliases.json`)
    
    ## API changes
    
    - [ ] Breaking change
    - [ ] New public methods/classes
    - [ ] Changed method signatures
    - [ ] No API changes
    
    ## Related issues
    
    
    Co-authored-by: Carmine Paolino <[email protected]>
    dacamp and crmne authored Sep 13, 2025
    32b3648

Commits on Sep 14, 2025

  1. Add Content Moderation Feature (#383)

    ## What this does
    
    This PR adds content moderation functionality to RubyLLM, allowing
    developers to identify potentially harmful content before sending it to
    LLM providers. This helps prevent API key bans and ensures safer user
    interactions.
    
    ### New Features
    
    - Content Moderation API: New `RubyLLM.moderate` method for screening
    text content
    - Safety Categories: Detects sexual, hate, harassment, violence,
    self-harm, and other harmful content types
    - Convenience Methods: Easy-to-use helpers like `flagged?`,
    `flagged_categories`, and `category_scores`
    - Provider Integration: Currently supports OpenAI's moderation API with
    an extensible architecture for future providers
    
    ### Usage Examples
    ```ruby
    # Basic usage
    result = RubyLLM.moderate("User input text")
    puts result.flagged?  # => true/false
    
    # Get flagged categories
    puts result.flagged_categories  # => ["harassment", "hate"]
    
    # Integration pattern - screen before chat
    def safe_chat(user_input)
      moderation = RubyLLM.moderate(user_input)
      return "Content not allowed" if moderation.flagged?
      
      RubyLLM.chat.ask(user_input)
    end
    ```
    
    ### Changes Made
    #### Core Implementation
    
    - New Class: `RubyLLM::Moderate` - Main moderation interface following
    existing patterns
    - Provider Method: Added `moderate()` to base Provider class
    - OpenAI Integration: `OpenAI::Moderation` module with API
    implementation
    - Main Module: Added `RubyLLM.moderate()` method for global access
    
    #### Configuration
    
    - Default Model: Added `default_moderation_model` configuration option
    (defaults to `omni-moderation-latest`)
    - API Requirements: Requires OpenAI API key (follows existing provider
    pattern)
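
    Based on the option named above, configuring moderation would presumably
    look like this (a sketch; `default_moderation_model` is the only setting
    taken from this PR, and it already defaults to `omni-moderation-latest`):

    ```ruby
    RubyLLM.configure do |config|
      config.openai_api_key = ENV["OPENAI_API_KEY"]
      config.default_moderation_model = "omni-moderation-latest"
    end
    ```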
    
    #### Documentation
    
    - Complete Guide: New `moderation.md` with examples
    - Integration Patterns: Real-world usage examples including Rails
    integration
    - Best Practices: Performance considerations and user experience
    guidelines
    
    #### Testing
    
    - Test Suite: `moderation_spec.rb` with 4 test cases
    - VCR Cassettes: Mock API responses for testing
    - Tests Passing: No regressions in existing functionality
    
    ## Type of change
    
    - [ ] Bug fix
    - [x] New feature
    - [ ] Breaking change
    - [ ] Documentation
    - [ ] Performance improvement
    
    ## Scope check
    
    - [x] I read the [Contributing
    Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
    - [x] This aligns with RubyLLM's focus on **LLM communication**
    - [x] This isn't application-specific logic that belongs in user code
    - [x] This benefits most users, not just my specific use case
    
    ## Quality check
    
    - [x] I ran `overcommit --install` and all hooks pass
    - [x] I tested my changes thoroughly
    - [x] I updated documentation if needed
    - [x] I didn't modify auto-generated files manually (`models.json`,
    `aliases.json`)
    
    ## API changes
    
    - [ ] Breaking change
    - [x] New public methods/classes
    - [ ] Changed method signatures
    - [ ] No API changes
    
    ## Related issues
    
    N/A
    
    ---------
    
    Co-authored-by: Carmine Paolino <[email protected]>
    iraszl and crmne authored Sep 14, 2025
    497e3d8
  2. e9f8d50
  3. Updated appraisal gemfiles

    crmne committed Sep 14, 2025
    aacd639
  4. Add video file support (#405)

    ## What this does
    
    Adds video file support to RubyLLM.
    
    Supersedes #260, originally authored by @arnodirlam. Thank you for the
    groundwork.
    
    > Maintainers: happy to close this if you prefer waiting for the
    original PR.
    
    ### What changed vs #260
    - Rebased on current `main`
    - Resolved conflicts in README/docs and Gemini capabilities
    - Addressed PR comment reviews
    - Included `gemini-2.5-flash` as a video model for tests
    
    ## Type of change
    
    - [ ] Bug fix
    - [x] New feature
    - [ ] Breaking change
    - [ ] Documentation
    - [ ] Performance improvement
    
    ## Scope check
    
    - [x] I read the [Contributing
    Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
    - [x] This aligns with RubyLLM's focus on **LLM communication**
    - [x] This isn't application-specific logic that belongs in user code
    - [x] This benefits most users, not just my specific use case
    
    ## Quality check
    
    - [x] I ran `overcommit --install` and all hooks pass
    - [x] I tested my changes thoroughly
    - [x] For provider changes: Re-recorded VCR cassettes with `bundle exec
    rake vcr:record[provider_name]`
      - [x] All tests pass: `bundle exec rspec`
    - [x] I updated documentation if needed
    - [x] I didn't modify auto-generated files manually (`models.json`,
    `aliases.json`)
    
    ## API changes
    
    - [ ] Breaking change
    - [ ] New public methods/classes
    - [ ] Changed method signatures
    - [x] No API changes
    
    ## Related issues
    
    Closes #259
    
    ---------
    
    Co-authored-by: Arno Dirlam <[email protected]>
    Co-authored-by: Carmine Paolino <[email protected]>
    3 people authored Sep 14, 2025
    4ff2231
  5. 2ace2d3
  6. a4fae99
  7. Updated models

    crmne committed Sep 14, 2025
    e27eb10
  8. Bump version to 1.8.0

    crmne committed Sep 14, 2025
    0cb6299