Add Functions Support, and new Models from Jun 13th Update #149
Conversation
This looks pretty promising! Thank you for sharing. I'm just trying it out and found something that looks like a typo to me: as I only add a single function, I would expect the class to be named "Function". Or am I maybe missing a point of the class? Either way, thanks again for adding this. I hope this gets merged here, as Microsoft is not supporting function calls in their new OpenAI package as of right now. |
What about chat endpoint requests? And function parameters description as JObject is not a good idea. |
I figured chat endpoint requests can come later as this is largely adding in the backend for function support. I can add implementation though if it's needed now.
Do you have any suggestions? I don't like it very much either. The model expects some sort of JSON object and fails if you send JSON serialized as a string. |
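To make the failure mode above concrete: the `parameters` field must serialize as a real JSON object (a JSON Schema), not as a string containing JSON. A minimal sketch with Newtonsoft's JObject (the property names follow OpenAI's documented function-calling format; the surrounding class names are not from this PR):

```csharp
using System;
using Newtonsoft.Json.Linq;

class ParametersExample
{
    static void Main()
    {
        // Build the schema as a real JSON object, not a serialized string:
        var parameters = JObject.FromObject(new
        {
            type = "object",
            properties = new
            {
                location = new { type = "string", description = "City and state, e.g. Boston, MA" },
                unit = new { type = "string", @enum = new[] { "celsius", "fahrenheit" } }
            },
            required = new[] { "location" }
        });

        // Embedding this JObject in the request body produces a nested object.
        // Sending parameters.ToString() as a string value instead would be
        // rejected by the model, which is the failure described above.
        Console.WriteLine(parameters["type"]); // object
    }
}
```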
I was thinking about a way to pass a type instead of the parameters; the
Function would take the type information from that type and generate a
JSON Schema from it. I saw that Newtonsoft already has classes to generate
JSON Schema but did not have the time to look into it. The nice thing about
passing a type would be that you can later use that type to deserialize
the function call from ChatGPT. This is not really thought through at the
moment. Just some brainstorming I did a few hours ago.
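The round trip brainstormed above can be sketched like this (the `WeatherQuery` type is purely illustrative; the schema-generation half is discussed later in the thread, so only the deserialization half is shown):

```csharp
using System;
using Newtonsoft.Json;

// Hypothetical parameters type: the same type would drive schema generation
// and later deserialize the model's function_call arguments.
class WeatherQuery
{
    public string Location { get; set; }
    public string Unit { get; set; }
}

class RoundTripSketch
{
    static void Main()
    {
        // function_call arguments as they come back from the model (a JSON string):
        string arguments = "{\"Location\": \"Boston, MA\", \"Unit\": \"celsius\"}";

        // The type used for schema generation also deserializes the call:
        var query = JsonConvert.DeserializeObject<WeatherQuery>(arguments);
        Console.WriteLine(query.Location); // Boston, MA
    }
}
```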
Could not do more testing atm. But this seems to work fine, at least till serialization:

Function.cs:

using Newtonsoft.Json.Schema;
using Newtonsoft.Json.Schema.Generation;
//...
[JsonProperty("parameters", Required = Required.Default)]
public JSchema Parameters { get; set; }
public Function(string name, string description, Type type)
{
this.Name = name;
this.Description = description;
this.Parameters = new JSchemaGenerator().Generate(type); // <--
}

Test:

private class TestClass
{
public string TestString { get; set; }
public int TestInt { get; set; }
public bool TestBool { get; set; }
}
[Test]
public void TestFunctionWithSchema(){
var fn = new Function("s","d",typeof(TestClass));
var str = JsonConvert.SerializeObject(fn);
} |
Unfortunately, it looks like Newtonsoft's JsonSchema licensing is AGPL and it only supports up to 1000 requests per hour without purchasing a license. However, it is worth exploring the idea of easily passing in the parameters schema using a Type. I previously used NJsonSchema for this but didn't like the idea of including outside packages that weren't already being used in the project. The benefit of using JObject is that we can easily convert other formats to it. The downside is that there isn't an easy way to natively generate the schema. I thought about including a basic schema generator but felt it was outside the scope of the PR and the project as a whole. |
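The "easily convert other formats" point above can be shown directly: different input shapes normalize to the same JObject (a small self-contained illustration, not code from the PR):

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

class ConvertExamples
{
    static void Main()
    {
        // From a dictionary:
        var fromDict = JObject.FromObject(new Dictionary<string, object>
        {
            ["type"] = "object",
            ["properties"] = new Dictionary<string, object>()
        });

        // From a raw JSON string:
        var fromString = JObject.Parse("{\"type\": \"object\", \"properties\": {}}");

        // Both shapes produce the same JSON object:
        Console.WriteLine(JToken.DeepEquals(fromDict, fromString)); // True
    }
}
```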
I agree on keeping external packages to a minimum. With some AI support I came up with a simple JSON Schema generator that might help here. I just did a first test with it, but as it's getting pretty late here I can't really concentrate very well :D Moved the code to a gist so I don't blow up the discussion here. If this looks like the way to go, I'm glad to provide a cleaned-up version with a bunch of tests. After some sleep :) |
Here is my solution:

public interface IJsonSchema
{
string Type { get; }
string? Description { get; set; }
}
public class JsonObjectSchema : IJsonSchema
{
[JsonPropertyName("type")]
public string Type { get; } = "object";
[JsonPropertyName("description")]
public string? Description { get; set; }
[JsonPropertyName("properties")]
public required Dictionary<string, IJsonSchema> Properties { get; set; }
[JsonPropertyName("required")]
public List<string>? Required { get; set; }
}
public class JsonArraySchema : IJsonSchema
{
[JsonPropertyName("type")]
public string Type { get; } = "array";
[JsonPropertyName("description")]
public string? Description { get; set; }
[JsonPropertyName("items")]
public required IJsonSchema Items { get; set; }
[JsonPropertyName("minItems")]
public int? MinItems { get; set; }
[JsonPropertyName("maxItems")]
public int? MaxItems { get; set; }
[JsonPropertyName("uniqueItems")]
public bool? UniqueItems { get; set; }
}
public class JsonStringSchema : IJsonSchema
{
[JsonPropertyName("type")]
public string Type { get; } = "string";
[JsonPropertyName("description")]
public string? Description { get; set; }
[JsonPropertyName("enum")]
public List<string>? Enum { get; set; }
[JsonPropertyName("pattern")]
public string? Pattern { get; set; }
[JsonPropertyName("minLength")]
public int? MinLength { get; set; }
[JsonPropertyName("maxLength")]
public int? MaxLength { get; set; }
}
public class JsonNumberSchema : IJsonSchema
{
[JsonPropertyName("type")]
public string Type { get; } = "number";
[JsonPropertyName("description")]
public string? Description { get; set; }
[JsonPropertyName("minimum")]
public double? Minimum { get; set; }
[JsonPropertyName("maximum")]
public double? Maximum { get; set; }
[JsonPropertyName("multipleOf")]
public double? MultipleOf { get; set; }
[JsonPropertyName("exclusiveMaximum")]
public double? ExclusiveMaximum { get; set; }
[JsonPropertyName("exclusiveMinimum")]
public double? ExclusiveMinimum { get; set; }
}
public class JsonBooleanSchema : IJsonSchema
{
[JsonPropertyName("type")]
public string Type { get; } = "boolean";
[JsonPropertyName("description")]
public string? Description { get; set; }
}
public class JsonSchemaConverter : JsonConverter<IJsonSchema>
{
public override IJsonSchema? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
throw new NotImplementedException();
}
public override void Write(Utf8JsonWriter writer, IJsonSchema value, JsonSerializerOptions options)
{
JsonSerializer.Serialize(writer, value, value.GetType(), options);
}
}

Usage:

var mySchema = new JsonObjectSchema
{
Properties = new Dictionary<string, IJsonSchema>
{
["name"] = new JsonStringSchema
{
Description = "The name of the person",
Enum = new List<string> { "Alice", "Bob", "Charlie" },
MinLength = 1,
MaxLength = 100
},
["age"] = new JsonNumberSchema
{
Description = "The age of the person",
Minimum = 0,
Maximum = 150,
MultipleOf = 1
},
["hobbies"] = new JsonArraySchema
{
Items = new JsonStringSchema()
{
MaxLength = 30
},
MinItems = 0,
MaxItems = 100,
UniqueItems = true
},
["is_married"] = new JsonBooleanSchema
{
Description = "Marital status of the person"
}
},
Required = new List<string> { "name", "age" }
};
var options = new JsonSerializerOptions { WriteIndented = true, DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull };
options.Converters.Add(new JsonSchemaConverter());
var jsonString = JsonSerializer.Serialize(mySchema, options);
Console.WriteLine(jsonString);

Output:

{
"type": "object",
"properties": {
"name": {
"type": "string",
"description": "The name of the person",
"enum": [
"Alice",
"Bob",
"Charlie"
],
"minLength": 1,
"maxLength": 100
},
"age": {
"type": "number",
"description": "The age of the person",
"minimum": 0,
"maximum": 150,
"multipleOf": 1
},
"hobbies": {
"type": "array",
"items": {
"type": "string",
"maxLength": 30
},
"minItems": 0,
"maxItems": 100,
"uniqueItems": true
},
"is_married": {
"type": "boolean",
"description": "Marital status of the person"
}
},
"required": [
"name",
"age"
]
}

It can serialize, but not deserialize. But deserialization is not required. |
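For completeness, even though deserialization is not required here: the missing Read side could dispatch on the "type" discriminator. This is an untested sketch that assumes the schema classes defined above:

```csharp
// Untested sketch; relies on the JsonObjectSchema/JsonArraySchema/etc. classes above.
public override IJsonSchema? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    // Buffer the element, inspect its "type" property, then re-deserialize
    // into the matching concrete schema class.
    using var doc = JsonDocument.ParseValue(ref reader);
    string json = doc.RootElement.GetRawText();
    string? type = doc.RootElement.GetProperty("type").GetString();

    return type switch
    {
        "object"  => JsonSerializer.Deserialize<JsonObjectSchema>(json, options),
        "array"   => JsonSerializer.Deserialize<JsonArraySchema>(json, options),
        "string"  => JsonSerializer.Deserialize<JsonStringSchema>(json, options),
        "number"  => JsonSerializer.Deserialize<JsonNumberSchema>(json, options),
        "boolean" => JsonSerializer.Deserialize<JsonBooleanSchema>(json, options),
        _ => throw new JsonException($"Unknown schema type '{type}'")
    };
}
```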
I have done some adjustments to the parameters type and setter.
This avoids implementing our own JsonSchema, as the user will be able to provide their own, which is arguably going to be better than anything we need to implement and support in the future. If the user wants to use a schema library like Newtonsoft or NJsonSchema, or their own implementation, it can handle it; if the user passes in a Dictionary<string, object>, it can handle it; and if the user passes in a string formatted as JSON, it can handle it. @ClusterM I tested this using your implementation (adjusting it slightly for Newtonsoft support over System.Text.Json) and the original JsonObjectSchema and Parameters were identical. If the user doesn't provide the needed schema they will receive a 400 error from the OpenAI endpoint. Do we have any objections to this handling? |
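To illustrate that flexibility, all three input shapes would land in the same internal JObject. A sketch only; the exact Function constructor signature is an assumption, the PR's may differ:

```csharp
// Sketch; assumes a Function(name, description, parameters) constructor.
var schemaJson = "{\"type\": \"object\", \"properties\": {}}";

// 1) a raw JSON string:
var f1 = new Function("get_weather", "Gets the weather", schemaJson);

// 2) an already-parsed JObject:
var f2 = new Function("get_weather", "Gets the weather", JObject.Parse(schemaJson));

// 3) any serializable object, e.g. a dictionary or a schema-library type:
var f3 = new Function("get_weather", "Gets the weather",
    new Dictionary<string, object> { ["type"] = "object", ["properties"] = new Dictionary<string, object>() });

// The setter normalizes all three to the same JObject internally.
```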
Sounds reasonable to me. Using object instead of JObject makes it more accessible and makes the code using this package less dependent on Newtonsoft.
Jordan Hans ***@***.***> wrote on Sat., 17 June 2023, 18:06:

[JsonProperty("parameters", Required = Required.Default)]
public object Parameters
{
    get
    {
        return _parameters;
    }
    set
    {
        try
        {
            if (value is string jsonStringValue)
            {
                _parameters = JObject.Parse(jsonStringValue);
            }
            else if (value is JObject jObjectValue)
            {
                _parameters = jObjectValue;
            }
            else
            {
                var settings = new JsonSerializerSettings
                {
                    NullValueHandling = NullValueHandling.Ignore
                };
                var jsonString = JsonConvert.SerializeObject(value, settings);
                _parameters = JObject.Parse(jsonString);
            }
        }
        catch (JsonException e)
        {
            throw new ArgumentException("Could not convert the provided object into a JSON object. Make sure that the object is serializable and its structure matches the required schema.", e);
        }
    }
} |
@hansjm10 you might want to use |
I can't get this PR to work when streaming the chat completion. Anyone else having issues with this, or am I just doing something wrong? |
Mind sending what you have tried and your response from OpenAI? I am able to use
and the returned streamed results are correct |
In order for this PR to work as the example code does, you'll have to include this change in ChatMessage.cs:

[JsonProperty("content", NullValueHandling = NullValueHandling.Include)]
public string Content { get; set; }

This is because the function response message from OpenAI has a result with a content of null. If it is not included in the message list on subsequent requests, the API will give you an error:
"error": {
"message": "'content' is a required property - 'messages.1'",
"type": "invalid_request_error",
"param": null,
"code": null
}
Here is an updated test case that follows the workflow in the example code from OpenAI. See the complete Python example at https://platform.openai.com/docs/guides/gpt/function-calling

public async Task SummarizeFunctionResult()
{
try
{
var api = new OpenAI_API.OpenAIAPI();
var functionList = new List<Function>
{
BuildFunctionForTest()
};
var conversation = api.Chat.CreateConversation(new ChatRequest {
Model = Model.ChatGPTTurbo0613,
Functions = functionList
});
conversation.AppendUserInput("What is the weather like in Boston?");
var response = await conversation.GetResponseFromChatbotAsync();
Assert.IsNull(response);
var functionMessage = new ChatMessage
{
Role = ChatMessageRole.Function,
Name = "get_current_weather",
Content = "{\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"sunny\"}"
};
conversation.AppendMessage(functionMessage);
response = await conversation.GetResponseFromChatbotAsync();
Assert.AreEqual("The current weather in Boston is sunny with a temperature of 22 degrees Celsius.", response);
}
catch(NullReferenceException ex)
{
Console.WriteLine(ex.Message, ex.StackTrace);
Assert.False(true);
}
} |
Is there any example how to extract function name and arguments from GetResponseFromChatbotAsync?
|
The newest commit adds in new chat endpoints. You can use
and access function names and arguments like
|
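For anyone with the same question: based on the Function_Call accessor discussed elsewhere in this thread, extraction looks roughly like this. The member names are assumptions drawn from the PR discussion, not a verified API:

```csharp
// Rough sketch; property names may differ in the merged code.
var response = await conversation.GetResponseFromChatbotAsync();
if (response == null)  // null content usually signals a function call
{
    var call = conversation.MostRecentApiResult.Choices[0].Message.Function_Call;
    string name = call.Name;               // e.g. "get_current_weather"
    string argumentsJson = call.Arguments; // JSON string of the arguments
}
```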
Thanks for the quick reply. Open cases:
Example: |
Going to take some time later to think about this situation and see how the official documentation handles this. FWIW, you can easily check if the response is null for a function_call if you're expecting it.
As for streaming a function, that is a bit weird, since it's supposed to be for internal use for accessing another API. If you could come up with a use case I can look into it more. That being said, I'm not too familiar with the streaming side, if somebody else wants to look into it too and see if there is a good solution. |
I did some hacks on my side but would like to see a production-ready PR. Streaming is also working on my side (with hacks). Use case (chatbot) with streaming:
In both cases there should be a single CallResponseAsync method which returns a function or a message, because you don't know in advance whether a function will be called or not. |
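One possible shape for such a single-call API, purely illustrative; none of these names exist in the PR:

```csharp
// Illustrative sketch of a combined result type; not part of the PR.
public class ChatCallResult
{
    public string Message { get; set; }              // set when the model replied with text
    public Function_Call FunctionCall { get; set; }  // set when the model requested a function
    public bool IsFunctionCall => FunctionCall != null;
}

// Hypothetical usage:
// var result = await conversation.CallResponseAsync();
// if (result.IsFunctionCall) { /* run the function, append its output, call again */ }
// else { /* render result.Message */ }
```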
These are two different return types. If you are expecting the possibility of a function, you should be checking whether the resulting response is a function anyway. If you want more control over this you can use |
OpenAI_API/Chat/Conversation.cs (outdated):

public async Task<Function_Call> GetFunction_CallResponseAsync()
{
    return await GetFromChatResultAsync(message => message.Function_Call);
}
This seems like the wrong way to implement this, because it implies you know that the API is going to request a function call based on the input. The way it's implemented in the Python library is to check for an empty content return from GetResponseFromChatbotAsync and then get response["choices"][0]["function_call"] from the last response to check the named Function_Call.
Not sure why, but I'm not able to find any reference to functions in the openai/openai-python library; if you could link what you're going off of I would appreciate it. The cookbook utilizes JSON for its examples, which we shouldn't be working with as our end result.
If function_call is not null, that implies the existence of a function. The real question is whether the user or the wrapper should handle the function call.
If it is null we can just output the response as a string, like you said earlier. This is just rambling, but we could have the user pass in a list of Funcs and then invoke them based on the response (which in turn would finally call the summarize). This would circumvent needing to return function_call to the user directly.
That seems a bit advanced and could be outside the scope, but it's an interesting thought.
Alternatively, this is all handled just by using CreateChatCompletionAsync and letting the user manually check function_call + content.
I would go for the simplest implementation at this point. I see your point that there really isn't a reference implementation in the Python library; it's just relying on the dictionary. Since there are a dozen ways someone might want to implement handling the presence of function_call in the return, I think the way it works now will suffice.
@hansjm10 Can you also add the following method to Conversation.cs?

/// <summary>
/// Creates and appends a <see cref="ChatMessage"/> to the chat history with the Role of <see cref="ChatMessageRole.Function"/>. The function message is a response to a request from the system for output from a predefined function.
/// </summary>
/// <param name="functionName">The name of the function for which the content has been generated as the result</param>
/// <param name="content">The text content (usually JSON)</param>
public void AppendFunctionMessage(string functionName, string content) => AppendMessage(new ChatMessage(ChatMessageRole.Function, content) { Name = functionName }); |
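A usage sketch of the proposed helper, following the weather workflow from the test case earlier in the thread:

```csharp
// Sketch; mirrors the SummarizeFunctionResult test above.
conversation.AppendUserInput("What is the weather like in Boston?");
var response = await conversation.GetResponseFromChatbotAsync(); // null => model requested the function

// Run get_current_weather locally, then report its output back:
conversation.AppendFunctionMessage("get_current_weather",
    "{\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"sunny\"}");

response = await conversation.GetResponseFromChatbotAsync(); // model summarizes the result
```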
Some feedback on usability of the functions: not a fan of "GetResponseFromChatbotAsync" returning null when the action has choices. Would like to see another routine added, called "GetResultFromChatbotAsync", with the result, since the likelihood of using functions will only increase over time, and using the first message of Choice[0] feels hinky, requiring going back to the Conversation object to access "MostRecentApiResult".
Overall, I hope this gets merged sooner rather than later since it is functional |
@OkGoDoIt I think this is ready to merge |
This looks good. Some minor nitpicks, but those can be fixed up later. Will be great to get this in.
OpenAI_API/ChatFunctions/Function.cs (outdated):

public class Function
{
    /// <summary>
    /// The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Are we providing any validation on this? Can we match it against a regex before sending it off to OpenAI?
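A validation along the lines suggested above could look like this. A self-contained sketch, not code from the PR; the pattern encodes the constraint from the doc comment (a-z, A-Z, 0-9, underscores, dashes, max length 64):

```csharp
using System;
using System.Text.RegularExpressions;

class FunctionNameValidator
{
    // OpenAI's constraint: a-z, A-Z, 0-9, underscores and dashes, 1-64 characters.
    static readonly Regex NamePattern = new Regex(@"^[a-zA-Z0-9_-]{1,64}$");

    public static void Validate(string name)
    {
        if (name == null || !NamePattern.IsMatch(name))
            throw new ArgumentException(
                "Function name must be 1-64 characters of a-z, A-Z, 0-9, underscore or dash.",
                nameof(name));
    }

    static void Main()
    {
        Validate("get_current_weather"); // passes silently
        Console.WriteLine(NamePattern.IsMatch("bad name!")); // False
    }
}
```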
OpenAI_API/ChatFunctions/Function.cs (outdated):

try
{
    if (value is string jsonStringValue)
    {
        _parameters = JObject.Parse(jsonStringValue);
    }
    else if (value is JObject jObjectValue)
    {
        _parameters = jObjectValue;
    }
    else
    {
        var settings = new JsonSerializerSettings
        {
            NullValueHandling = NullValueHandling.Ignore
        };
        _parameters = JObject.FromObject(value, JsonSerializer.Create(settings));
    }
}
catch (JsonException e)
{
    throw new ArgumentException("Could not convert the provided object into a JSON object. Make sure that the object is serializable and its structure matches the required schema.", e);
}
Might be worth placing this in a helper method for clarity/simplicity.
Got this error running in Unity:
|
StreamChatEnumerableAsync works in a strange way with functions. Is there any working example? I have found only one way to use it
|
Hey everyone, I'm sorry for the delay on my end. I've been swamped with my day job and I'm behind on triaging PRs here. I will do my best to get to this within the next few days. Thanks for your understanding. |
Efficient implementation of function calling in conjunction with streaming: lofcz@52b7884#diff-8bd7aaf5181bcc44301c5b518a91bee8f7e4591461c1c19dc8a4a55e74354e07R383 @hansjm10 I think this event-based approach and abstracting buffering away from the end-user presents a better experience for API consumers. |
@OkGoDoIt can you provide us with an additional update on this? |
@hansjm10 I've fixed the issue where function_call couldn't be used properly during re-streaming. Now function_call can be used normally in all scenarios (detail). Also, I've added example code for using stream transmission. It's located in test/ChatEndpointTest.cs, within the method named SummarizeFunctionStreamResult. |
Looks good and I'm ready to merge it but just revert the changes to the proj files. |
You could cherrypick the two commits that don't mess with the proj files as well |
There's a lot of weird intending issues in a lot of the files as well |
I think @ErikDombi means "indenting issues" as in whitespace, and formatting is all over the place |
Is there any status on this? Eager to try out this functionality |
I have this and the token is printed empty; can someone help me?
.......
|
Looks like PR got killed. Can we get it back? |
did it get merged? |
@EvoTechMike, no, I haven't merged any PR's in a while. I implemented some of this myself but I have not yet implemented the function calling. |
Function calling is the only reason I can't use this beautiful package. Please implement it. |
@OkGoDoIt If you don't have time to implement or approve features yourself, why don't you let your contributors do the job? Even if a community implementation doesn't reach your quality standards and gets replaced by your own later on, it would still be much better progress. Right now we have a feature gap which grows from day to day. I understand that you have your own personal goals, but on a project that became so famous there is a requirement to be up to date and feature complete. So please rethink your future project management. |
In the interim, one of the forks supporting Functions, Assistants, and other new features can be used: https://github.com/RageAgainstThePixel/OpenAI-DotNet - supports Unity, .NET Standard & .NET Core, frequently updated, great test coverage. |
Thanks |
@lofcz Out of curiosity, how did you find these forks? When I go to https://github.com/OkGoDoIt/OpenAI-API-dotnet/forks there aren't any forks with more than 1 star. edit: d'oh! I see one of them is yours. Still wondering why they don't show up as forks though |
Both RageAgainstThePixel and I hard-forked some time ago. I maintained a soft fork for about half a year, but then the changes diverged too much from upstream, hence the hard fork. |
I suggest archiving this repository if the author no longer maintains it. It is funny that such a feature is not merged after 8 months. |
This commit adds in function support as described in the latest June 13th update. It also adds the new models which support functions. This closes #146