Conversation

chinganc
Member

@chinganc chinganc commented Sep 12, 2025

This PR finishes PrioritySearch.

It has been tested on convex optimization and on some prompt optimization problems.
It supports:

  • Running multiple optimizers in parallel
  • Branching out optimizers along their update chains. This is useful when using optimizers with memory, and when using multiple optimizers.
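The two features above can be pictured with a generic best-first search sketch. This is only an illustration of the idea (a priority queue of candidates, expanded by several optimizers in parallel, with parents kept so update chains can branch); it is not the PR's actual PrioritySearch code, and all names below are hypothetical:

```python
import heapq
from concurrent.futures import ThreadPoolExecutor

def priority_search(score, optimizers, init, rounds=5, beam=2):
    """Generic best-first search sketch: keep a priority queue of
    candidates, expand the top `beam` with every optimizer in
    parallel, and push parents back so their update chains can
    branch into multiple children."""
    heap = [(-score(init), init)]  # negate scores: heapq is a min-heap
    for _ in range(rounds):
        best = [heapq.heappop(heap) for _ in range(min(beam, len(heap)))]
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(opt, cand)
                       for _, cand in best for opt in optimizers]
        children = [f.result() for f in futures]
        for item in best:                  # keep parents: branching point
            heapq.heappush(heap, item)
        for child in children:
            heapq.heappush(heap, (-score(child), child))
    return heapq.heappop(heap)[1]          # highest-scoring candidate

# Toy convex objective: maximize -(x - 3)^2. The two "optimizers"
# are plain gradient steps with different learning rates.
score = lambda x: -(x - 3.0) ** 2
opts = [lambda x: x - 0.1 * 2 * (x - 3.0),
        lambda x: x - 0.3 * 2 * (x - 3.0)]
best_x = priority_search(score, opts, init=0.0, rounds=6)
```

On this toy problem the search converges toward the optimum at x = 3, with the faster-stepping optimizer's branch dominating the queue.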

Changes to core in PR:

  1. ModelWrapper is now moved to the top module as Model, so that classes decorated by @trace.model can be pickled.
  2. The old implementations of Algorithm.save and Algorithm.load at the abstract class level are commented out; we leave those to the subclasses.
  3. Some minor bug fixes.
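The pickling motivation in item 1 comes down to a general Python constraint rather than anything trace-specific: pickle serializes an instance by its class's module-level qualified name, so a wrapper class created inside a function or decorator cannot be re-imported on load. A minimal sketch of the failure mode (class and function names here are illustrative, not the PR's actual code):

```python
import pickle

# Module-top-level wrapper class: pickle records instances by the
# class's qualified name, which it can re-import, so this round-trips.
class Model:
    def __init__(self, wrapped):
        self.wrapped = wrapped

def make_local_wrapper(wrapped):
    # A wrapper class created inside a function (as a class decorator
    # might do) is a "local object" that pickle cannot look up again.
    class ModelWrapper:
        def __init__(self, w):
            self.w = w
    return ModelWrapper(wrapped)

restored = pickle.loads(pickle.dumps(Model("params")))  # works

pickle_failed = False
try:
    pickle.dumps(make_local_wrapper("params"))  # raises: local class
except (AttributeError, pickle.PicklingError):
    pickle_failed = True
```

This is presumably why moving the wrapper to the top of the module (as Model) makes decorated classes picklable.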

NOTE: Resuming experiments will be done in future PRs.

chinganc and others added 20 commits September 18, 2025 07:41
add a saving method to pre-save source code
- Make the copied modules' parameter nodes have the same names as the originals, so that the optimizer's memory works.
- Add a flag to allow using the same optimizer instance across search
- Remove commented code
added GEPA in examples/priority_search_on_convex_fn.py
…ers_benchmark.py (see howto: examples/trainer_benchmark_HOWTO.md)
Member

@allenanie allenanie left a comment


A few files need to be removed (including one that I accidentally committed/pushed). No immediate implementation problems spotted.

@chinganc chinganc merged commit ca80cea into experimental Oct 3, 2025
1 check passed
@chinganc chinganc deleted the fix/priority_search branch October 3, 2025 19:45

4 participants