Simple, unified interface to multiple Generative AI providers.
PSAISuite makes it easy for developers to use multiple LLMs through a standardized interface. Using an interface similar to OpenAI's, PSAISuite makes it easy to interact with the most popular LLMs and compare their results. It is a thin wrapper around the LLM endpoints, and allows creators to seamlessly swap out and test responses from different LLM providers without changing their code. Today, the library is primarily focused on chat completions. It will be expanded to cover more use cases in the near future.
Currently supported providers are:
- Anthropic
- Azure AI Foundry
- DeepSeek
- GitHub
- Groq
- Inception
- Mistral
- Nebius
- Ollama
- OpenAI
- OpenRouter
- Perplexity
- xAI
You can install the module from the PowerShell Gallery.
```powershell
Install-Module PSAISuite
```

To get started, you will need API keys for the providers you intend to use. The API keys need to be set as environment variables.
Set the API keys:

```powershell
$env:OpenAIKey = "your-openai-api-key"
$env:AnthropicKey = "your-anthropic-api-key"
$env:NebiusKey = "your-nebius-api-key"
$env:GITHUB_TOKEN = "your-github-token"
$env:INCEPTION_API_KEY = "your-inception-api-key"
# ... and so on for other providers
```

For Azure AI Foundry, you will need to set the AzureAIKey and AzureAIEndpoint environment variables:

```powershell
$env:AzureAIKey = "your-azure-ai-key"
$env:AzureAIEndpoint = "your-azure-ai-endpoint"
```

You can pipe data directly into Invoke-ChatCompletion (or its alias icc) to use it as context for your prompt. This is useful for summarizing files, analyzing command output, or providing additional information to the model.
For example:

```powershell
Get-Content .\README.md | icc -Messages "Summarize this document." -Model "openai:gpt-4o-mini"
```

You can also use the output of any command:

```powershell
Get-Process | Out-String | icc -Messages "What processes are running?" -Model "openai:gpt-4o-mini"
```

Tips:

- The `-Model` parameter supports tab completion for available providers and models. Start typing a provider (like `openai:` or `github:`) and press `Tab` to see suggestions.
- You can use the `icc` or `generateText` alias instead of `Invoke-ChatCompletion` in all of the examples above.
See PIPE-EXAMPLES.md for more details and examples.
Using PSAISuite to generate chat completion responses from different providers.
You can list all available AI providers using the Get-ChatProviders function:
```powershell
# Get a list of all available providers
Get-ChatProviders
```

You can list OpenRouter models by name using the Get-OpenRouterModel function. Use the -Raw switch to return all properties for matching models:

```powershell
# List all OpenRouter models with 'gpt' in their name
Get-OpenRouterModel -Name '*gpt*'

# List all OpenRouter models and return all properties
Get-OpenRouterModel -Raw

# List models by name and return all properties
Get-OpenRouterModel -Name '*gpt*' -Raw
```

The -Raw switch returns the full model object from the OpenRouter API, including all available properties.
To compare responses from several providers, loop over a list of models:

```powershell
# Import the module
Import-Module PSAISuite

$models = @(
    "openai:gpt-4o",
    "anthropic:claude-3-5-sonnet-20240620",
    "azureai:gpt-4o",
    "nebius:meta-llama/Llama-3.3-70B-Instruct"
)
$message = New-ChatMessage -Prompt "What is the capital of France?"

foreach ($model in $models) {
    Invoke-ChatCompletion -Messages $message -Model $model
}
```

To return the raw response object, use the -Raw switch:

```powershell
# Import the module
Import-Module PSAISuite

$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Messages $message -Raw

# You can also use the alias:
generateText -Messages $message -Raw
```

You can pass the model explicitly, or set a default with the PSAISUITE_DEFAULT_MODEL environment variable:

```powershell
# Import the module
Import-Module PSAISuite

$model = "openai:gpt-4o"
$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Model $model -Messages $message

# or by setting the environment variable
$env:PSAISUITE_DEFAULT_MODEL = "openai:gpt-4o"
$message = New-ChatMessage -Prompt "What is the capital of France?"
Invoke-ChatCompletion -Messages $message
```

Note that the model name in the Invoke-ChatCompletion call uses the format `<provider>:<model-name>`.
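As an illustration of that format (this snippet is not part of the module's API), the provider prefix can be separated from the model name by splitting on the first colon only, which keeps any colons inside the model name intact:

```powershell
# Illustration only: how a "<provider>:<model-name>" string breaks apart.
# The -split operator with a limit of 2 splits on the first ':' only.
$model = "nebius:meta-llama/Llama-3.3-70B-Instruct"
$provider, $modelName = $model -split ':', 2
$provider    # nebius
$modelName   # meta-llama/Llama-3.3-70B-Instruct
```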
Documentation coming soon.