Modern workflow orchestration library for Go — an evolutionary Go port of PocketFlow
Flow provides a single adaptive node that automatically changes behavior based on parameters, eliminating boilerplate while enabling unprecedented composability for building AI agents, complex workflows, and data processing pipelines.
- 🎯 Single Adaptive Node: One node type that automatically adapts behavior based on parameters
- 🔄 Parameter-Driven: Configure behavior through parameters, not inheritance
- ⚡ Auto-Parallel: Add `parallel: true` to any batch operation for instant concurrency
- 🔁 Auto-Retry: Set `retries > 0` to automatically enable retry logic with exponential backoff
- 📦 Auto-Batch: Set `batch: true` with `data` to automatically process collections
- 🧩 Composable: Mix retry + batch + parallel in a single node declaration
- 🪶 Ultra-Lightweight: ~440 lines total vs 748 lines of traditional approaches
- 🧵 Thread-Safe: SharedState management for safe concurrent data sharing
```bash
go get github.com/joemocha/flow
```

```go
package main

import (
    "fmt"

    "github.com/joemocha/flow"
)

func main() {
    state := flow.NewSharedState()

    // Single adaptive node - no custom types needed!
    node := flow.NewNode()
    node.SetParams(map[string]interface{}{
        "name": "World",
    })
    node.SetExecFunc(func(prep interface{}) (interface{}, error) {
        name := node.GetParam("name").(string)
        fmt.Printf("Hello, %s!\n", name)
        return "greeted", nil
    })

    result := node.Run(state)
    fmt.Printf("Result: %s\n", result)
}
```

```mermaid
flowchart TD
    A[Start] --> B[Create SharedState]
    B --> C[Create Adaptive Node]
    C --> D[Set Parameters<br/>name: 'World']
    D --> E[Set ExecFunc<br/>Hello, name!]
    E --> F[Run Node]
    F --> G{Node Execution}
    G --> H[Print: Hello, World!]
    H --> I[Return: 'greeted']
    I --> J[End]

    style C fill:#e1f5fe
    style G fill:#fff3e0
    style I fill:#e8f5e8
```
```go
package main

import (
    "fmt"
    "math/rand"
    "time"

    "github.com/joemocha/flow"
)

func main() {
    state := flow.NewSharedState()

    // Automatic retry when retries > 0 - no custom RetryNode needed!
    node := flow.NewNode()
    node.SetParams(map[string]interface{}{
        "retries":     3,
        "retry_delay": time.Millisecond * 100,
    })
    node.SetExecFunc(func(prep interface{}) (interface{}, error) {
        // Just business logic - retry is automatic!
        if rand.Float32() < 0.7 {
            return "", fmt.Errorf("API temporarily unavailable")
        }
        return "api_success", nil
    })

    result := node.Run(state)
    fmt.Printf("Final result: %s\n", result)
}
```

```mermaid
flowchart TD
    A[Start] --> B[Create SharedState]
    B --> C[Create Adaptive Node]
    C --> D[Set Parameters<br/>retries: 3<br/>retry_delay: 100ms]
    D --> E[Set ExecFunc<br/>API Call Logic]
    E --> F[Run Node]
    F --> G{Check Parameters}
    G -->|retries > 0| H[Enable Retry Logic]
    G -->|retries = 0| I[Single Execution]
    H --> J[Attempt 1]
    J --> K{Success?}
    K -->|Yes| L[Return Success]
    K -->|No| M{Retries Left?}
    M -->|Yes| N[Wait with Exponential Backoff]
    N --> O[Attempt 2]
    O --> P{Success?}
    P -->|Yes| L
    P -->|No| Q{Retries Left?}
    Q -->|Yes| R[Wait with Exponential Backoff]
    R --> S[Attempt 3]
    S --> T{Success?}
    T -->|Yes| L
    T -->|No| U[Return Error]
    I --> V[Single Attempt]
    V --> W{Success?}
    W -->|Yes| L
    W -->|No| U
    L --> X[End]
    U --> X

    style C fill:#e1f5fe
    style H fill:#fff3e0
    style L fill:#e8f5e8
    style U fill:#ffebee
```
```go
package main

import (
    "fmt"

    "github.com/joemocha/flow"
)

func main() {
    state := flow.NewSharedState()

    // Automatic batch processing when batch: true is set
    node := flow.NewNode()
    node.SetParams(map[string]interface{}{
        "data":  []int{1, 2, 3, 4, 5},
        "batch": true,
    })
    node.SetExecFunc(func(item interface{}) (interface{}, error) {
        // Called once per item automatically!
        num := item.(int)
        return fmt.Sprintf("processed-%d", num*2), nil
    })

    result := node.Run(state)
    fmt.Printf("Batch result: %s\n", result)

    // Results automatically stored in shared state
    results := state.Get("batch_results")
    fmt.Printf("Processed items: %v\n", results)
}
```

```mermaid
flowchart TD
    A[Start] --> B[Create SharedState]
    B --> C[Create Adaptive Node]
    C --> D["Set Parameters<br/>data: 1,2,3,4,5<br/>batch: true"]
    D --> E["Set ExecFunc<br/>Process Item Logic"]
    E --> F[Run Node]
    F --> G{Check Parameters}
    G -->|batch = true| H[Enable Batch Processing]
    G -->|batch = false| I[Single Execution]
    H --> J[Initialize batch_results]
    J --> K[Process Item 1]
    K --> L[processed-2]
    L --> M[Store in batch_results]
    M --> N[Process Item 2]
    N --> O[processed-4]
    O --> P[Store in batch_results]
    P --> Q[Process Item 3]
    Q --> R[processed-6]
    R --> S[Store in batch_results]
    S --> T[Process Item 4]
    T --> U[processed-8]
    U --> V[Store in batch_results]
    V --> W[Process Item 5]
    W --> X[processed-10]
    X --> Y[Store in batch_results]
    Y --> Z[Return: batch_complete]
    I --> AA[Single Item Processing]
    AA --> Z
    Z --> BB[Results in SharedState]
    BB --> CC[End]

    style C fill:#e1f5fe
    style H fill:#fff3e0
    style Z fill:#e8f5e8
    style BB fill:#f3e5f5
```
```go
package main

import (
    "fmt"
    "time"

    "github.com/joemocha/flow"
)

func main() {
    state := flow.NewSharedState()

    // Retry + Batch + Parallel in a single node declaration!
    node := flow.NewNode()
    node.SetParams(map[string]interface{}{
        // Batch configuration
        "data": []string{
            "https://api1.example.com",
            "https://api2.example.com",
            "https://api3.example.com",
            "https://api4.example.com",
            "https://api5.example.com",
        },
        "batch": true,

        // Parallel configuration
        "parallel":       true,
        "parallel_limit": 2, // Max 2 concurrent requests

        // Retry configuration
        "retries":     3,
        "retry_delay": time.Millisecond * 200,
    })
    node.SetExecFunc(func(item interface{}) (interface{}, error) {
        // Pure business logic - all patterns applied automatically!
        url := item.(string)
        return fetchURL(url), nil // fetchURL is your own helper; retry + parallel are automatic
    })

    result := node.Run(state)
    fmt.Printf("Result: %s\n", result)

    fmt.Println("Behaviors applied automatically:")
    fmt.Println("✅ Batch processing (5 URLs)")
    fmt.Println("✅ Parallel execution (max 2 concurrent)")
    fmt.Println("✅ Retry logic (up to 3 attempts per URL)")
    fmt.Println("✅ Results collection")
}
```

```mermaid
flowchart TD
    A[Start] --> B[Create SharedState]
    B --> C[Create Adaptive Node]
    C --> D["Set Parameters<br/>data: url1,url2,url3,url4,url5<br/>batch: true, parallel: true<br/>parallel_limit: 2, retries: 3<br/>retry_delay: 200ms"]
    D --> E["Set ExecFunc<br/>fetchURL Logic"]
    E --> F[Run Node]
    F --> G{Check Parameters}
    G --> H[Enable All Patterns:<br/>Batch + Parallel + Retry]
    H --> I[Initialize Goroutine Pool<br/>Max 2 concurrent]
    I --> J[Start Parallel Processing]
    J --> K1[Goroutine 1: URL1]
    J --> K2[Goroutine 2: URL2]
    K1 --> L1{Fetch Success?}
    K2 --> L2{Fetch Success?}
    L1 -->|No| M1[Retry 1 with backoff]
    L1 -->|Yes| N1[Store Result]
    L2 -->|No| M2[Retry 1 with backoff]
    L2 -->|Yes| N2[Store Result]
    M1 --> O1{Success?}
    M2 --> O2{Success?}
    O1 -->|No| P1[Retry 2 with backoff]
    O1 -->|Yes| N1
    O2 -->|No| P2[Retry 2 with backoff]
    O2 -->|Yes| N2
    P1 --> Q1{Success?}
    P2 --> Q2{Success?}
    Q1 -->|No| R1[Final Retry]
    Q1 -->|Yes| N1
    Q2 -->|No| R2[Final Retry]
    Q2 -->|Yes| N2
    R1 --> S1{Success?}
    R2 --> S2{Success?}
    S1 -->|Yes| N1
    S1 -->|No| T1[Store Error]
    S2 -->|Yes| N2
    S2 -->|No| T2[Store Error]
    N1 --> U[Wait for Goroutine Slot]
    N2 --> U
    T1 --> U
    T2 --> U
    U --> V{More URLs?}
    V -->|Yes| W[Process URL3, URL4, URL5<br/>with same pattern]
    V -->|No| X[Collect All Results]
    W --> X
    X --> Y[Return: batch_complete]
    Y --> Z[End]

    style C fill:#e1f5fe
    style H fill:#fff3e0
    style I fill:#e3f2fd
    style Y fill:#e8f5e8
```
```go
package main

import (
    "fmt"

    "github.com/joemocha/flow"
)

func main() {
    state := flow.NewSharedState()

    // Create adaptive nodes
    step1 := flow.NewNode()
    step1.SetExecFunc(func(prep interface{}) (interface{}, error) {
        // Processing step
        return "continue", nil
    })

    step2 := flow.NewNode()
    step2.SetParams(map[string]interface{}{
        "data":     []int{1, 2, 3, 4, 5},
        "batch":    true,
        "parallel": true,
    })
    step2.SetExecFunc(func(item interface{}) (interface{}, error) {
        return item.(int) * item.(int), nil // Square numbers in parallel
    })

    step3 := flow.NewNode()
    step3.SetExecFunc(func(prep interface{}) (interface{}, error) {
        return "done", nil
    })

    // Chain nodes with specific actions
    step1.Next(step2, "continue")
    step2.Next(step3, "batch_complete")

    // Create and run flow (named "pipeline" to avoid shadowing the flow package)
    pipeline := flow.NewFlow().Start(step1)
    result := pipeline.Run(state)
    fmt.Printf("Flow result: %s\n", result)
}
```

```mermaid
flowchart TD
    A[Start] --> B[Create SharedState]
    B --> C[Create Flow]
    C --> D1[Step1: Adaptive Node]
    D1 --> E1[ExecFunc: Processing]
    E1 --> F1[Return: 'continue']
    F1 --> G{Check Action}
    G -->|continue| D2[Step2: Adaptive Node]
    D2 --> E2["Set Parameters<br/>data: 1,2,3,4,5<br/>batch: true, parallel: true"]
    E2 --> F2["ExecFunc: Square Numbers"]
    F2 --> H2[Parallel Processing]
    H2 --> I21[Goroutine 1: 1² = 1]
    H2 --> I22[Goroutine 2: 2² = 4]
    H2 --> I23[Goroutine 3: 3² = 9]
    H2 --> I24[Goroutine 4: 4² = 16]
    H2 --> I25[Goroutine 5: 5² = 25]
    I21 --> J2[Collect Results]
    I22 --> J2
    I23 --> J2
    I24 --> J2
    I25 --> J2
    J2 --> K2[Return: 'batch_complete']
    K2 --> L{Check Action}
    L -->|batch_complete| D3[Step3: Adaptive Node]
    D3 --> E3[ExecFunc: Final Processing]
    E3 --> F3[Return: 'done']
    F3 --> M[Flow Complete]
    M --> N[End]

    style D1 fill:#e1f5fe
    style D2 fill:#e1f5fe
    style D3 fill:#e1f5fe
    style H2 fill:#e3f2fd
    style M fill:#e8f5e8
```
| Parameter | Type | Effect | Example |
|---|---|---|---|
| `retries` | `int` | Auto-enables retry logic with exponential backoff | `"retries": 3` |
| `retry_delay` | `time.Duration` | Base delay for exponential backoff calculation | `"retry_delay": time.Second` |
| `data` | `[]interface{}` | Data to process (used with `batch`) | `"data": []int{1,2,3}` |
| `batch` | `bool` | Enables batch processing of `data` | `"batch": true` |
| `parallel` | `bool` | Enables parallel batch execution | `"parallel": true` |
| `parallel_limit` | `int` | Max concurrent goroutines | `"parallel_limit": 5` |
- Batch Processing: `batch: true` → process each item in `data`
- Retry Logic: `retries > 0` → wrap execution with exponential backoff retry
- Single Execution: Default behavior

Composability: Batch + Retry + Parallel can all be combined!
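The detection rules above amount to a parameter inspection at run time. A hypothetical sketch of that dispatch (the names and logic here are illustrative, not the library's internals):

```go
package main

import "fmt"

// behaviors sketches the adaptive dispatch: batch (and parallel) are
// detected from the params map, retry stacks on top, and with nothing
// set the node falls back to a single execution. Illustrative only.
func behaviors(params map[string]interface{}) []string {
    var modes []string
    if v, _ := params["batch"].(bool); v {
        modes = append(modes, "batch")
    }
    if v, _ := params["parallel"].(bool); v {
        modes = append(modes, "parallel")
    }
    if r, _ := params["retries"].(int); r > 0 {
        modes = append(modes, "retry")
    }
    if len(modes) == 0 {
        modes = append(modes, "single")
    }
    return modes
}

func main() {
    fmt.Println(behaviors(map[string]interface{}{"batch": true, "retries": 2})) // [batch retry]
    fmt.Println(behaviors(map[string]interface{}{}))                            // [single]
}
```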
```go
// One node type handles everything
type Node struct {
    params     map[string]interface{}
    successors map[string]*Node
    execFunc   func(interface{}) (interface{}, error)
    // ... internal adaptive logic
}
```

```go
state := flow.NewSharedState()
state.Set("key", value)
value := state.Get("key")
state.Append("list", item)

// Thread-safe operations
state.GetInt("count")
state.GetSlice("results")
```

| Approach | Lines of Code | Boilerplate | Composability |
|---|---|---|---|
| Traditional OOP | 85+ lines/pattern | High (inheritance) | Limited |
| Flow Adaptive | ~15 lines/pattern | Zero | Unlimited |
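SharedState's thread safety comes from guarding its map behind a lock, the standard Go pattern for shared mutable state. A minimal sketch of that idea (illustrative, not the library's code):

```go
package main

import (
    "fmt"
    "sync"
)

// miniState mirrors the SharedState idea: a map guarded by a mutex so
// concurrent nodes can read and write safely. Illustrative only.
type miniState struct {
    mu   sync.Mutex
    data map[string]interface{}
}

func (s *miniState) Set(key string, v interface{}) {
    s.mu.Lock()
    defer s.mu.Unlock()
    s.data[key] = v
}

func (s *miniState) Append(key string, v interface{}) {
    s.mu.Lock()
    defer s.mu.Unlock()
    list, _ := s.data[key].([]interface{})
    s.data[key] = append(list, v)
}

func main() {
    s := &miniState{data: map[string]interface{}{}}
    var wg sync.WaitGroup
    for i := 0; i < 100; i++ {
        wg.Add(1)
        go func(i int) {
            defer wg.Done()
            s.Append("results", i) // safe from many goroutines
        }(i)
    }
    wg.Wait()
    s.mu.Lock()
    fmt.Println(len(s.data["results"].([]interface{}))) // 100
    s.mu.Unlock()
}
```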
```text
BenchmarkAdaptiveNodeBasic-48          197M ops    6.2 ns/op    0 allocs
BenchmarkAdaptiveBatchSequential-48    995K ops    1085 ns/op   3 allocs
BenchmarkAdaptiveBatchParallel-48      17K ops     65µs/op      205 allocs
```
The adaptive node that changes behavior based on parameters.
```go
type Node struct {
    // Contains filtered or unexported fields
}

// Constructor
func NewNode() *Node

// Configuration
func (n *Node) SetParams(params map[string]interface{})
func (n *Node) GetParam(key string) interface{}

// Workflow chaining
func (n *Node) Next(node *Node, action string) *Node
func (n *Node) GetSuccessors() map[string]*Node

// Execution functions
func (n *Node) SetExecFunc(fn func(interface{}) (interface{}, error))
func (n *Node) SetPrepFunc(fn func(*SharedState) interface{})
func (n *Node) SetPostFunc(fn func(*SharedState, interface{}, interface{}) string)

// Execution
func (n *Node) Run(shared *SharedState) string
```

Orchestrates node execution in workflows.
```go
type Flow struct {
    *Node
    // Contains filtered or unexported fields
}

// Constructor and configuration
func NewFlow() *Flow
func (f *Flow) Start(node *Node) *Flow
func (f *Flow) StartNode() *Node

// Execution
func (f *Flow) Run(shared *SharedState) string
```

Thread-safe data sharing between nodes.
```go
type SharedState struct {
    // Contains filtered or unexported fields
}

// Constructor
func NewSharedState() *SharedState

// Basic operations
func (s *SharedState) Set(key string, value interface{})
func (s *SharedState) Get(key string) interface{}

// Typed getters
func (s *SharedState) GetInt(key string) int
func (s *SharedState) GetSlice(key string) []interface{}

// Collection operations
func (s *SharedState) Append(key string, value interface{})
```

| Parameter | Type | Description | Example |
|---|---|---|---|
| `batch` | `bool` | Enable batch processing | `"batch": true` |
| `data` | `[]interface{}` | Data for batch processing | `"data": []int{1,2,3}` |
| `parallel` | `bool` | Enable parallel execution | `"parallel": true` |
| `parallel_limit` | `int` | Max concurrent goroutines | `"parallel_limit": 5` |
| `retries` | `int` | Number of retry attempts | `"retries": 3` |
| `retry_delay` | `time.Duration` | Base delay for backoff | `"retry_delay": time.Second` |
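`parallel_limit` caps how many goroutines run at once. The standard Go idiom for such a cap is a buffered-channel semaphore; a sketch under that assumption (not necessarily how the library implements it):

```go
package main

import (
    "fmt"
    "sync"
)

// squareAll processes items with at most `limit` goroutines at a time,
// using a buffered channel as a counting semaphore - a common Go idiom
// for the kind of cap "parallel_limit" describes.
func squareAll(items []int, limit int) []int {
    sem := make(chan struct{}, limit) // at most `limit` slots
    results := make([]int, len(items))
    var wg sync.WaitGroup

    for i, item := range items {
        wg.Add(1)
        go func(i, item int) {
            defer wg.Done()
            sem <- struct{}{}        // acquire a slot
            defer func() { <-sem }() // release it
            results[i] = item * item
        }(i, item)
    }
    wg.Wait()
    return results
}

func main() {
    fmt.Println(squareAll([]int{1, 2, 3, 4, 5}, 2)) // [1 4 9 16 25]
}
```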
```go
node := flow.NewNode()
node.SetExecFunc(func(prep interface{}) (interface{}, error) {
    return "result", nil
})
result := node.Run(state) // Returns: "result"
```

```go
node := flow.NewNode()
node.SetParams(map[string]interface{}{
    "data":  []int{1, 2, 3},
    "batch": true,
})
node.SetExecFunc(func(item interface{}) (interface{}, error) {
    return item.(int) * 2, nil
})
result := node.Run(state) // Returns: "batch_complete"
// Results stored in state.Get("batch_results")
```

```go
node := flow.NewNode()
node.SetParams(map[string]interface{}{
    "retries":     3,
    "retry_delay": time.Millisecond * 100,
})
node.SetExecFunc(func(prep interface{}) (interface{}, error) {
    // May fail and retry automatically
    return "success", nil
})
```

```go
node := flow.NewNode()
node.SetParams(map[string]interface{}{
    "data":           []string{"url1", "url2", "url3"},
    "batch":          true,
    "parallel":       true,
    "parallel_limit": 2,
    "retries":        3,
})
// All patterns applied automatically!
```

All adaptive behaviors are thoroughly tested:
```bash
# Run full test suite (11 test functions + 3 benchmarks)
go test ./... -v

# Run with race detection
go test ./... -race

# Performance benchmarks
go test ./... -bench=. -benchmem
```

Test Coverage:
- ✅ Basic adaptive behavior
- ✅ Retry pattern detection and execution
- ✅ Batch processing (sequential and parallel)
- ✅ Composed patterns (retry + batch + parallel)
- ✅ Flow integration with adaptive nodes
- ✅ Edge cases and error handling
Perfect for building intelligent agents:
```go
// Intelligent reasoning agent with adaptive patterns
func buildReasoningAgent() *flow.Flow {
    // Input processing with retry
    inputNode := flow.NewNode()
    inputNode.SetParams(map[string]interface{}{
        "retries": 2,
    })
    inputNode.SetExecFunc(parseUserInput)

    // Parallel tool execution
    toolNode := flow.NewNode()
    toolNode.SetParams(map[string]interface{}{
        "data":     []string{"search", "calculator", "weather"},
        "batch":    true,
        "parallel": true,
        "retries":  3,
    })
    toolNode.SetExecFunc(executeTool)

    // Response generation
    responseNode := flow.NewNode()
    responseNode.SetExecFunc(generateResponse)

    inputNode.Next(toolNode, "needs_tools")
    toolNode.Next(responseNode, "batch_complete")

    return flow.NewFlow().Start(inputNode)
}
```

```mermaid
flowchart TD
    A[User Input] --> B[Create SharedState]
    B --> C[Create Reasoning Agent Flow]
    C --> D1[Input Node: Adaptive Node]
    D1 --> E1[Set Parameters<br/>retries: 2]
    E1 --> F1[ExecFunc: parseUserInput]
    F1 --> G1{Parse Success?}
    G1 -->|No| H1[Retry with backoff]
    G1 -->|Yes| I1[Return: 'needs_tools']
    H1 --> G1
    I1 --> J{Check Action}
    J -->|needs_tools| D2[Tool Node: Adaptive Node]
    D2 --> E2["Set Parameters<br/>data: search, calculator, weather<br/>batch: true, parallel: true<br/>retries: 3"]
    E2 --> F2["ExecFunc: executeTool"]
    F2 --> G2[Parallel Tool Execution]
    G2 --> H21[Goroutine 1: Search Tool]
    G2 --> H22[Goroutine 2: Calculator Tool]
    G2 --> H23[Goroutine 3: Weather Tool]
    H21 --> I21{Search Success?}
    H22 --> I22{Calc Success?}
    H23 --> I23{Weather Success?}
    I21 -->|No| J21[Retry Search<br/>up to 3 times]
    I22 -->|No| J22[Retry Calculator<br/>up to 3 times]
    I23 -->|No| J23[Retry Weather<br/>up to 3 times]
    I21 -->|Yes| K21[Search Results]
    I22 -->|Yes| K22[Calc Results]
    I23 -->|Yes| K23[Weather Results]
    J21 --> I21
    J22 --> I22
    J23 --> I23
    K21 --> L[Collect Tool Results]
    K22 --> L
    K23 --> L
    L --> M[Return: 'batch_complete']
    M --> N{Check Action}
    N -->|batch_complete| D3[Response Node: Adaptive Node]
    D3 --> E3[ExecFunc: generateResponse]
    E3 --> F3[Synthesize Tool Results]
    F3 --> G3[Generate Final Response]
    G3 --> H3[Return: 'response_ready']
    H3 --> I[Agent Response]
    I --> Z[End]

    style D1 fill:#e1f5fe
    style D2 fill:#e1f5fe
    style D3 fill:#e1f5fe
    style G2 fill:#e3f2fd
    style I fill:#e8f5e8
```
| Aspect | PocketFlow | Flow Adaptive |
|---|---|---|
| Core Size | 100 lines | ~440 lines |
| Node Types | 1 BaseNode | 1 Adaptive Node |
| Patterns | User-built | Parameter-driven |
| Composability | Limited | Unlimited |
| Boilerplate | Minimal | Zero |
| Type Safety | Runtime | Compile-time |
Flow captures PocketFlow's constraint-based philosophy while advancing it with parameter-driven composability.
```bash
# Run all tests
go test ./flow/...

# Verbose output with test details
go test -v ./flow/...

# Race detection for concurrency safety
go test -race ./flow/...

# Performance benchmarks
go test -bench=. ./flow/...
```

```bash
# Build the project
go build ./...

# Format code
gofmt -w .

# Run linter
go vet ./...

# Tidy dependencies
go mod tidy

# Generate coverage report
go test -coverprofile=coverage.out ./...
go tool cover -html=coverage.out -o coverage.html

# View coverage in terminal
go test -cover ./...
```

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Write tests for your changes (see `flow/node_test.go` for examples)
- Ensure all tests pass (`go test ./...`)
- Commit your changes (`git commit -am 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
CC0 1.0 Universal (Public Domain) - see LICENSE file for details.
This work is dedicated to the public domain. You can copy, modify, distribute and perform the work, even for commercial purposes, all without asking permission.
- PocketFlow - Original Python implementation
- Flow Examples - Working examples of all adaptive patterns
🚀 Built with modern adaptive patterns for the Go community