This solution is designed for testing and experimenting with different .NET frameworks and libraries to build Generative AI applications and AI Agents. It provides a structured environment to explore various AI providers, chat interfaces, and LLM integrations.
- Framework: .NET 9.0
- Dependencies:
  - LangChain (v0.17.0) - Comprehensive LLM framework
  - LangChain.Providers.Ollama (v0.17.0) - Ollama integration
- Description: LangChainChat-01, a conversational AI chat application using LangChain with memory management and Ollama integration (see the sketch after this list)
- Features:
  - Multi-turn conversations with memory
  - Template-based prompting
  - Local Ollama model integration (gpt-oss:20b)
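Below is a minimal, illustrative sketch of this pattern. It assumes the tryAGI LangChain packages listed above; the chain helpers and parameter names (`Set`, `Template`, `LLM`, `LoadMemory`, `UpdateMemory`) follow the library's published samples and may differ slightly in v0.17.0, so treat it as a starting point rather than the project's exact code.

```csharp
using LangChain.Memory;
using LangChain.Providers.Ollama;
using static LangChain.Chains.Chain;

// Local Ollama endpoint (default: http://localhost:11434) and model.
var provider = new OllamaProvider();
var model = new OllamaChatModel(provider, id: "gpt-oss:20b");

// Conversation memory; constructor arguments vary between LangChain versions.
var memory = new ConversationBufferMemory();

const string template = """
    You are a helpful assistant.

    {history}
    Human: {input}
    AI:
    """;

while (true)
{
    Console.Write("You: ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;

    // Template-based prompt plus memory load/update around a single LLM call.
    var chain =
        Set(input, outputKey: "input")
        | LoadMemory(memory, outputKey: "history")
        | Template(template, outputKey: "prompt")
        | LLM(model, inputKey: "prompt", outputKey: "text")
        | UpdateMemory(memory, requestKey: "input", responseKey: "text");

    Console.WriteLine($"AI: {await chain.RunAsync("text")}");
}
```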
- Framework: .NET 9.0
- Dependencies:
  - Microsoft.Extensions.AI (v9.8.0) - Microsoft's AI abstraction layer
  - OllamaSharp (v5.3.6) - .NET client for Ollama
- Description: Ollama-MS-AI-Extension-Chat-01, a streamlined chat application using Microsoft's AI extensions with real-time streaming responses (see the sketch after this list)
- Features:
  - Real-time streaming responses
  - Chat history management
  - Direct Ollama API integration
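A condensed sketch of this pattern is shown below. `OllamaApiClient` implements `Microsoft.Extensions.AI.IChatClient`, so the chat loop depends only on the abstraction; the endpoint and model name are illustrative.

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// OllamaSharp's client implements IChatClient, so the loop below only
// depends on the Microsoft.Extensions.AI abstractions.
IChatClient client = new OllamaApiClient(new Uri("http://localhost:11434"), "gpt-oss:20b");

var history = new List<ChatMessage>();

while (true)
{
    Console.Write("You: ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;

    history.Add(new ChatMessage(ChatRole.User, input));

    // Stream tokens as they arrive instead of waiting for the full response.
    Console.Write("AI: ");
    var reply = string.Empty;
    await foreach (var update in client.GetStreamingResponseAsync(history))
    {
        Console.Write(update.Text);
        reply += update.Text;
    }
    Console.WriteLine();

    // Keep the assistant's answer in the history for multi-turn context.
    history.Add(new ChatMessage(ChatRole.Assistant, reply));
}
```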
Prerequisites:

- .NET 9.0 SDK
- Ollama installed and running locally on port 11434
- VS Code with C# Dev Kit extension
- Build the entire solution:

  ```bash
  dotnet build LLM-demos.sln
  ```

- Run individual projects:

  ```bash
  # LangChain demo
  dotnet run --project LangChainChat-01/LangChainChat-01.csproj

  # Microsoft AI Extensions demo
  dotnet run --project Ollama-MS-AI-Extension-Chat-01/Ollama-MS-AI-Extension-Chat-01.csproj
  ```

- Using VS Code Debug: Use F5 or the Run and Debug panel to launch any configured project.
```bash
# Navigate to the solution root
cd /path/to/LLM-demos

# Create a new console project
dotnet new console -n YourNewProject-01 -f net9.0

# Add the project to the solution
dotnet sln LLM-demos.sln add YourNewProject-01/YourNewProject-01.csproj

# Navigate to your new project directory
cd YourNewProject-01

# Example: Add common AI packages
dotnet add package Microsoft.Extensions.AI
dotnet add package OpenAI
# or
dotnet add package LangChain
dotnet add package LangChain.Providers.OpenAI
```

Edit `.vscode/tasks.json` and add a new task:
```json
{
  "label": "build-yournewproject",
  "command": "dotnet",
  "type": "process",
  "args": [
    "build",
    "${workspaceFolder}/YourNewProject-01/YourNewProject-01.csproj",
    "/property:GenerateFullPaths=true",
    "/consoleloggerparameters:NoSummary"
  ],
  "problemMatcher": "$msCompile"
}
```

Edit `.vscode/launch.json` and add a new configuration:
```json
{
  "name": "Launch YourNewProject-01",
  "type": "coreclr",
  "request": "launch",
  "preLaunchTask": "build-yournewproject",
  "program": "${workspaceFolder}/YourNewProject-01/bin/Debug/net9.0/YourNewProject-01.dll",
  "args": [],
  "cwd": "${workspaceFolder}/YourNewProject-01",
  "console": "internalConsole",
  "stopAtEntry": false
}
```

Then verify the setup:

```bash
# Build the solution to ensure everything is configured correctly
dotnet build LLM-demos.sln

# Test the new project
dotnet run --project YourNewProject-01/YourNewProject-01.csproj
```

The solution already defines these build tasks:

- `build-langchain`: Builds the LangChain demo project
- `build-ollama`: Builds the Ollama MS AI Extension demo
- `build-solution`: Builds the entire solution
Available launch configurations:

- `Launch LangChainChat-01`: Debug configuration for the LangChain demo
- `Launch Ollama-MS-AI-Extension-Chat-01`: Debug configuration for the Ollama demo
- Task naming convention: Use the `build-{projectname}` format
- Launch configuration naming: Use the `Launch {ProjectName}` format
- Console setting: Use `"console": "internalConsole"` for interactive applications
- PreLaunchTask: Always reference the corresponding build task
When creating new projects, follow this structure:
```
YourNewProject-01/
├── YourNewProject-01.csproj
├── Program.cs
├── README.md (optional - project-specific documentation)
├── bin/ (generated)
└── obj/ (generated)
```
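As a reference, a new project's `Program.cs` can start as small as the sketch below. It assumes the same OllamaSharp + Microsoft.Extensions.AI packages as the existing demos; the model name is just a placeholder.

```csharp
// Program.cs - minimal starting point for a new demo project.
using Microsoft.Extensions.AI;
using OllamaSharp;

// Swap in any other IChatClient implementation to target a different provider.
IChatClient client = new OllamaApiClient(new Uri("http://localhost:11434"), "gpt-oss:20b");

var response = await client.GetResponseAsync("Say hello from the new demo project.");
Console.WriteLine(response.Text);
```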
- Microsoft.Extensions.AI - Microsoft's unified AI abstraction (see the sketch after this list)
- LangChain - Comprehensive LLM framework with chains and agents
- Semantic Kernel - Microsoft's AI orchestration framework
- OpenAI - GPT models (requires API key)
- Azure.AI.OpenAI - Azure OpenAI services
- Anthropic - Claude models
- OllamaSharp - Local Ollama integration
- Google.AI.Generative - Google Gemini models
- Microsoft.SemanticKernel.Plugins.Memory - Vector databases and embeddings
- LangChain.Databases.Chroma - Vector database integration
- Microsoft.Extensions.AI.Abstractions - AI service abstractions
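Because the demos can be written against `IChatClient`, switching between these providers is mostly a matter of constructing a different client. The sketch below is illustrative only: the OpenAI adapter comes from the `Microsoft.Extensions.AI.OpenAI` package (not listed above), and its `AsIChatClient()` extension name has varied across preview versions.

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;
using OpenAI.Chat;

// Local model via Ollama.
IChatClient ollama = new OllamaApiClient(new Uri("http://localhost:11434"), "gpt-oss:20b");

// Hosted model via the OpenAI SDK, adapted to the same IChatClient abstraction.
// Requires the Microsoft.Extensions.AI.OpenAI adapter package and an API key.
IChatClient openai = new ChatClient("gpt-4o-mini",
        Environment.GetEnvironmentVariable("OPENAI_API_KEY"))
    .AsIChatClient();

// The same helper works against either backend.
static async Task AskAsync(IChatClient client, string question) =>
    Console.WriteLine((await client.GetResponseAsync(question)).Text);

await AskAsync(ollama, "Hello from the local model!");
await AskAsync(openai, "Hello from the hosted model!");
```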
- RAG (Retrieval Augmented Generation) applications
- Multi-agent systems with different AI personalities
- Function calling and tool integration (see the sketch after this list)
- Vector database integration for semantic search
- Different prompt engineering techniques
- AI model comparison and benchmarking
- Streaming vs. non-streaming response handling
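As a concrete example of the function-calling idea, here is a hedged sketch using Microsoft.Extensions.AI's function-invocation middleware. The `GetWeather` helper and the model name are invented for illustration, and tool calls only work if the underlying model and provider support them.

```csharp
using System.ComponentModel;
using Microsoft.Extensions.AI;
using OllamaSharp;

// A .NET method exposed to the model as a tool.
[Description("Gets the current weather for a city.")]
static string GetWeather(string city) => $"It is sunny and 22 °C in {city}.";

// UseFunctionInvocation() executes tool calls requested by the model
// and feeds the results back until a final answer is produced.
IChatClient client = new ChatClientBuilder(
        new OllamaApiClient(new Uri("http://localhost:11434"), "llama3.1"))
    .UseFunctionInvocation()
    .Build();

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetWeather)],
};

var response = await client.GetResponseAsync("What is the weather in Lisbon?", options);
Console.WriteLine(response.Text);
```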
- Microsoft.Extensions.AI Documentation
- LangChain .NET Documentation
- Semantic Kernel Documentation
- Ollama Documentation
Feel free to add new demo projects following the established patterns. Each project should:
- Use .NET 9.0 as the target framework
- Include proper documentation in comments
- Follow the naming convention: `{Purpose}-{SequenceNumber}`
- Add corresponding VS Code tasks and launch configurations