This is an unofficial Julia wrapper for the Langdock API.
| Category | Endpoint | Description | Implemented |
|---|---|---|---|
| Completion API ||||
| | `POST /openai/{region}/v1/chat/completions` | OpenAI chat completions | ❌ |
| | `POST /anthropic/{region}/v1/messages` | Anthropic messages | ❌ |
| | `POST /mistral/{region}/v1/fim/completions` | Codestral completions | ❌ |
| Embedding API ||||
| | `POST /openai/{region}/v1/embeddings` | OpenAI embeddings | ✅ |
| Assistant API ||||
| | `POST /assistant/v1/chat/completions` | Assistant chat completions | ✅ |
| | `GET /assistant/v1/models` | List assistant models | ⚠️ |
| | `POST /attachment/v1/upload` | Upload attachment | ❌ |
| Knowledge Folder API ||||
| | `POST /knowledge/{folderId}` | Upload file | ❌ |
| | `PATCH /knowledge/{folderId}` | Update attachment | ❌ |
| | `GET /knowledge/{folderId}/list` | Retrieve files | ❌ |
| | `DELETE /knowledge/{folderId}/{attachmentId}` | Delete attachment | ❌ |
| | `POST /knowledge/search` | Search knowledge folder | ❌ |
Notes:
- ✅ Fully implemented and working
- ⚠️ Implemented but has issues (HTTP.jl compatibility issue; works with curl)
- ❌ Not yet implemented
```julia
using Langdock

# Set up provider
ENV["LANGDOCK_API_KEY"] = "your-api-key"
provider = LangdockProvider(api_key=ENV["LANGDOCK_API_KEY"], region="eu") # or get_default_provider()

# List available assistant models
models = list_assistant_models(provider) # ⚠️ currently fails due to an HTTP.jl issue on this endpoint

# Create assistant chat
messages = [Dict("role" => "user", "content" => "Hello!")]
response = create_assistant_chat(
    provider,
    messages,
    assistant_id="<your-assistant-id>"
)

# Generate embeddings
embeddings = create_openai_embeddings(
    provider,
    "Text to embed",
    model_id="text-embedding-ada-002"
)
```

To implement additional endpoints:
- Create a new file in `src/api/` for the endpoint category (if it does not already exist)
- Implement the function following the existing pattern:
  - Support both `api_key` and `provider` parameters
  - Use `langdock_request` for API calls
  - Add proper documentation with examples
- Export the function in `src/Langdock.jl`
- Add tests in the corresponding `test/api/` file
- Update this documentation
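A new endpoint function following the steps above might look like the sketch below. The keyword interface of `langdock_request` and the default `model_id` are assumptions for illustration, not the package's actual API; adapt them to the real helper's signature.

```julia
using Langdock  # assumes LangdockProvider and langdock_request are exported

"""
    create_openai_chat(provider, messages; model_id="gpt-4o")

Sketch of a wrapper for `POST /openai/{region}/v1/chat/completions`.
"""
function create_openai_chat(
    provider::LangdockProvider,
    messages::Vector{<:AbstractDict};
    model_id::AbstractString = "gpt-4o",
)
    body = Dict(
        "model" => model_id,
        "messages" => messages,
    )
    # Delegate auth headers, region routing, and JSON encoding to the shared
    # helper. NOTE: the exact `langdock_request` signature is an assumption.
    return langdock_request(
        provider,
        "POST",
        "/openai/$(provider.region)/v1/chat/completions";
        body = body,
    )
end
```

A matching test in `test/api/` would then exercise `create_openai_chat` against a mocked `langdock_request`, mirroring how the existing endpoints are tested.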
Priority endpoints to implement:
- OpenAI Chat Completions - Most commonly used endpoint
- Anthropic Messages - Support for Claude models
- Knowledge Folder operations - File management capabilities
- Mistral/Codestral - Code completion support