Comparing changes

- base repository: crmne/ruby_llm (base: 1.7.1)
- head repository: crmne/ruby_llm (compare: 1.8.0)
- 12 commits
- 335 files changed
- 8 contributors
Commits on Sep 11, 2025
- Commit fa10f0c
Commits on Sep 13, 2025
-
Fix create_table migrations to prevent foreign key errors (#409) (#411)
## Description

This PR fixes issue #409, where the `rails generate ruby_llm:install` generator creates migrations with foreign key references to tables that haven't been created yet, causing migration failures.

## Problem

When running the install generator on a fresh Rails application, the generated migrations contain `t.references` calls that try to create foreign key constraints on tables that are created in later migrations. This causes PostgreSQL (and other databases) to fail with "relation does not exist" errors, because the referenced tables haven't been created yet.

## Solution

1. Remove the `t.references` lines from the initial table creation migrations.
2. Create a new migration that runs after all tables are created to add the foreign key references.

This can be tested with a brand new Rails app by adding the gem from this branch: `gem "ruby_llm", git: "https://github.com/matiasmoya/ruby_llm", branch: "409-fix-install-migrations"`

## What this does

- Remove `t.references` calls from initial table creation migrations
- Add a new migration generator template for adding references separately
- Update the install generator to create tables first, then add references

## Type of change

- [x] Bug fix

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [x] For provider changes: Re-recorded VCR cassettes with `bundle exec rake vcr:record[provider_name]`
- [x] All tests pass: `bundle exec rspec`
- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [x] No API changes

## Related issues

Fixes #409
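The two-step approach described above can be sketched as a pair of migrations. Table and column names here are hypothetical and for illustration only; the generator's actual templates differ.

```ruby
# Step 1: create tables WITHOUT t.references, so no foreign key
# constraint can point at a table that doesn't exist yet.
class CreateMessages < ActiveRecord::Migration[7.1]
  def change
    create_table :messages do |t|
      t.string :role
      t.text :content
      t.timestamps
      # no t.references :chat here -- :chats may be created by a later migration
    end
  end
end

# Step 2: a final migration, generated to run after every create_table
# migration, adds the references and their foreign key constraints.
class AddReferencesToMessages < ActiveRecord::Migration[7.1]
  def change
    add_reference :messages, :chat, foreign_key: true
  end
end
```

Because the reference-adding migration is timestamped last, every table it points at already exists when it runs.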
Commit 0d23da4
Fix: Add resolve method delegation from Models instance to class (#407)
## Summary

- Adds the missing `resolve` instance method that delegates to the class method
- Fixes usage of `RubyLLM.models.resolve` in the migration template
- Includes comprehensive test coverage written with a TDD approach

## Context

The upgrade migration template at `lib/generators/ruby_llm/upgrade_to_v1_7/templates/migration.rb.tt` uses `RubyLLM.models.resolve` on lines 74 and 139. However, `RubyLLM.models` returns a `Models` instance, which didn't have the `resolve` method; it was only available as a class method.

## Changes

- Added a `resolve` instance method in the `Models` class that delegates to `self.class.resolve`
- Added test coverage for all three usage patterns:
  - With provider specified
  - Without provider (auto-detection)
  - With the `assume_exists` option for unknown models

## Test plan

- [x] Ran `bundle exec rspec spec/ruby_llm/models_spec.rb -e resolve`; all tests pass
- [x] Verified the migration template can now use `RubyLLM.models.resolve` without errors

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-authored-by: Claude <[email protected]>
Co-authored-by: Carmine Paolino <[email protected]>
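The delegation itself is a one-liner. A minimal, self-contained sketch of the pattern follows; the class-method body is stubbed for illustration and is not RubyLLM's real lookup logic.

```ruby
class Models
  # Stubbed class-level lookup; the real method resolves a model id to a
  # model/provider pair. Here it just echoes its inputs.
  def self.resolve(model_id, provider: nil, assume_exists: false)
    [model_id, provider || :auto, assume_exists]
  end

  # The fix: an instance method that delegates to the class method, so
  # an instance call like RubyLLM.models.resolve(...) works just like
  # the class call Models.resolve(...).
  def resolve(model_id, provider: nil, assume_exists: false)
    self.class.resolve(model_id, provider: provider, assume_exists: assume_exists)
  end
end

p Models.new.resolve("gpt-4o", provider: :openai)
# => ["gpt-4o", :openai, false]
```

Keyword arguments are forwarded explicitly so both call sites keep the same signature.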
Commit 078ef25
Models helpers should return all models supporting their modalities (#408)
`image_models`, `audio_models`, and `embedding_models` should all return models that output their modality.

## Type of change

- [x] Bug fix

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [ ] For provider changes: Re-recorded VCR cassettes with `bundle exec rake vcr:record[provider_name]`
- [x] All tests pass: `bundle exec rspec`
- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

Co-authored-by: Carmine Paolino <[email protected]>
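The fix boils down to filtering on what a model *outputs* rather than what it accepts. A self-contained sketch with made-up registry data (the `Model` struct and `REGISTRY` are illustrative, not RubyLLM's actual model registry):

```ruby
# Stand-in for a registry entry; RubyLLM's real model info is richer.
Model = Struct.new(:id, :output_modalities)

REGISTRY = [
  Model.new("gpt-4o", ["text"]),
  Model.new("dall-e-3", ["image"]),
  Model.new("tts-1", ["audio"]),
  Model.new("text-embedding-3-small", ["embeddings"])
].freeze

# Helpers like image_models should select models by OUTPUT modality,
# so an image *generator* is returned, not a model that merely accepts images.
def models_outputting(modality)
  REGISTRY.select { |m| m.output_modalities.include?(modality) }
end

p models_outputting("image").map(&:id) # => ["dall-e-3"]
p models_outputting("audio").map(&:id) # => ["tts-1"]
```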
Commit 32b3648
Commits on Sep 14, 2025
-
Add Content Moderation Feature (#383)
## What this does

This PR adds content moderation functionality to RubyLLM, allowing developers to identify potentially harmful content before sending it to LLM providers. This helps prevent API key bans and ensures safer user interactions.

### New Features

- **Content Moderation API**: new `RubyLLM.moderate` method for screening text content
- **Safety Categories**: detects sexual, hate, harassment, violence, self-harm, and other harmful content types
- **Convenience Methods**: easy-to-use helpers like `flagged?`, `flagged_categories`, and `category_scores`
- **Provider Integration**: currently supports OpenAI's moderation API, with an extensible architecture for future providers

### Usage Examples

```ruby
# Basic usage
result = RubyLLM.moderate("User input text")
puts result.flagged? # => true/false

# Get flagged categories
puts result.flagged_categories # => ["harassment", "hate"]

# Integration pattern - screen before chat
def safe_chat(user_input)
  moderation = RubyLLM.moderate(user_input)
  return "Content not allowed" if moderation.flagged?

  RubyLLM.chat.ask(user_input)
end
```

### Changes Made

Core implementation:

- New class: `RubyLLM::Moderate`, the main moderation interface following existing patterns
- Provider method: added `moderate()` to the base `Provider` class
- OpenAI integration: `OpenAI::Moderation` module with the API implementation
- Main module: added `RubyLLM.moderate()` method for global access

Configuration:

- Default model: added `default_moderation_model` configuration option (defaults to `omni-moderation-latest`)
- API requirements: requires an OpenAI API key (follows the existing provider pattern)

Documentation:

- Complete guide: new `moderation.md` with examples
- Integration patterns: real-world usage examples, including Rails integration
- Best practices: performance considerations and user experience guidelines

Testing:

- Test suite: `moderation_spec.rb` with 4 test cases
- VCR cassettes: mock API responses for testing
- Tests passing: no regressions in existing functionality

## Type of change

- [x] New feature

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [x] New public methods/classes

## Related issues

N/A

Co-authored-by: Carmine Paolino <[email protected]>
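As a rough mental model of how the convenience helpers relate to the provider response, here is a self-contained sketch that derives `flagged?` and `flagged_categories` from a hash shaped like OpenAI's moderation payload. The `ModerationResult` class is illustrative, not RubyLLM's actual implementation.

```ruby
# Wraps a single moderation result hash; field names ("categories",
# "category_scores") follow OpenAI's moderation response schema.
class ModerationResult
  def initialize(raw)
    @categories = raw["categories"]       # e.g. { "hate" => true, ... }
    @scores     = raw["category_scores"]  # e.g. { "hate" => 0.91, ... }
  end

  # Flagged if any category is true.
  def flagged? = @categories.values.any?

  # Names of only the categories that were flagged.
  def flagged_categories = @categories.select { |_, v| v }.keys

  def category_scores = @scores
end

raw = {
  "categories"      => { "hate" => true, "violence" => false },
  "category_scores" => { "hate" => 0.91, "violence" => 0.02 }
}
result = ModerationResult.new(raw)
p result.flagged?            # => true
p result.flagged_categories  # => ["hate"]
```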
Commit 497e3d8
Commit e9f8d50

Commit aacd639
Add video file support

## What this does

Adds video file support to RubyLLM. Supersedes #260, originally authored by @arnodirlam. Thank you for the groundwork.

> Maintainers: happy to close this if you prefer waiting for the original PR.

### What changed vs #260

- Rebased on current `main`
- Resolved conflicts in README/docs and Gemini capabilities
- Addressed PR review comments
- Included `gemini-2.5-flash` as a video model for tests

## Type of change

- [x] New feature

## Scope check

- [x] I read the [Contributing Guide](https://github.com/crmne/ruby_llm/blob/main/CONTRIBUTING.md)
- [x] This aligns with RubyLLM's focus on **LLM communication**
- [x] This isn't application-specific logic that belongs in user code
- [x] This benefits most users, not just my specific use case

## Quality check

- [x] I ran `overcommit --install` and all hooks pass
- [x] I tested my changes thoroughly
- [x] For provider changes: Re-recorded VCR cassettes with `bundle exec rake vcr:record[provider_name]`
- [x] All tests pass: `bundle exec rspec`
- [x] I updated documentation if needed
- [x] I didn't modify auto-generated files manually (`models.json`, `aliases.json`)

## API changes

- [x] No API changes

## Related issues

Closes #259

Co-authored-by: Arno Dirlam <[email protected]>
Co-authored-by: Carmine Paolino <[email protected]>
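Video attachments are typically recognized by file type before being sent to a provider. A hedged, self-contained sketch of that classification step follows; the extension list and helper name are assumptions for illustration, not RubyLLM's actual attachment code.

```ruby
# Common video file extensions; real detection may rely on MIME types instead.
VIDEO_EXTENSIONS = %w[.mp4 .mov .webm .avi .mkv].freeze

# True if the path's extension (case-insensitive) looks like a video file.
def video_attachment?(path)
  VIDEO_EXTENSIONS.include?(File.extname(path).downcase)
end

p video_attachment?("demo.MP4")   # => true
p video_attachment?("photo.png")  # => false
```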
Commit 4ff2231
Commit 2ace2d3
Remove outdated version notes and clarify moderation being available in next version
Commit a4fae99
Commit e27eb10

Commit 0cb6299
This comparison is too large to render here. To see the full diff locally, run:

`git diff 1.7.1...1.8.0`