
Conversation

hansjm10

This commit adds function support as described in OpenAI's June 13th update. It also adds the new models which support functions. This closes #146.

@hansjm10 hansjm10 changed the title from "Add Functions Support, and new Models from Jun 6th Update" to "Add Functions Support, and new Models from Jun 13th Update" on Jun 16, 2023
@tkoenig89

This looks pretty promising! Thank you for sharing.

I'm just trying it out and found something that looks like a typo to me:

(screenshot)

As I only add a single function, I would expect the class to be named "Function". Or am I missing the point of the class?

Either way, thanks again for adding this. I hope this gets merged here, as Microsoft is not supporting function calls in their new OpenAI package as of right now.

@ClusterM

What about chat endpoint requests? Also, using JObject for the function parameters description is not a good idea.

@hansjm10
Author

hansjm10 commented Jun 16, 2023

What about chat endpoint requests?

I figured chat endpoint requests could come later, as this PR is largely adding the backend for function support. I can add the implementation now, though, if it's needed.

using JObject for the function parameters description is not a good idea.

Do you have any suggestions? I don't like it very much either. The model expects an actual JSON object and fails if you send JSON serialized as a string.

@tkoenig89

tkoenig89 commented Jun 16, 2023 via email

@tkoenig89

tkoenig89 commented Jun 16, 2023

I could not do more testing at the moment, but this seems to work fine, at least up to serialization:

Function.cs

using Newtonsoft.Json.Schema;
using Newtonsoft.Json.Schema.Generation;
//...
[JsonProperty("parameters", Required = Required.Default)]
public JSchema Parameters { get; set; }

public Function(string name, string description, Type type)
{
    this.Name = name;
    this.Description = description;
    this.Parameters = new JSchemaGenerator().Generate(type); // <--
}

Test

  private class TestClass
  {
    public string TestString { get; set; }
    public int TestInt { get; set; }
    public bool TestBool { get; set; }
  }
  
  [Test]
  public void TestFunctionWithSchema(){
    var fn = new Function("s","d",typeof(TestClass));
    var str = JsonConvert.SerializeObject(fn);
  }

@hansjm10
Author

Unfortunately, it looks like Newtonsoft's JSON Schema package is AGPL-licensed and only supports up to 1,000 requests per hour without a purchased license. However, it is worth exploring the idea of easily passing in the parameters schema using a Type. I previously used NJsonSchema for this but didn't like the idea of including outside packages that weren't already being used in the project.

The benefit of using JObject is that we can easily convert other formats to it. The downside is that there is no easy way to natively generate the schema. I thought about including a basic schema generator but felt it was outside the scope of the PR and the project as a whole.
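As a rough sketch of that conversion flexibility (illustrative only, not code from this PR), a Newtonsoft JObject can be built from several input shapes:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

class JObjectDemo
{
    static void Main()
    {
        // From a raw JSON string.
        var fromString = JObject.Parse(@"{ ""type"": ""object"", ""properties"": {} }");

        // From a plain dictionary describing the same schema shape.
        var fromDictionary = JObject.FromObject(new Dictionary<string, object>
        {
            ["type"] = "object",
            ["properties"] = new Dictionary<string, object>()
        });
    }
}
```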

@tkoenig89

I agree on keeping external packages to a minimum. With some AI support I came up with a simple JSON Schema generator that might help here. I've only done a first test with it, but as it's getting pretty late here I can't really concentrate very well :D

I moved the code to a gist so I don't blow up the discussion here:
https://gist.github.com/tkoenig89/e35d6ffc2979746476893fe00234ab30

If this looks like a way to go, I'm glad to provide a cleaned-up version with a bunch of tests. After some sleep :)

@ClusterM

ClusterM commented Jun 17, 2023

Here is my solution:

using System.Text.Json;
using System.Text.Json.Serialization;

public interface IJsonSchema
{
    string Type { get; }
    string? Description { get; set; }
}

public class JsonObjectSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "object";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("properties")]
    public required Dictionary<string, IJsonSchema> Properties { get; set; }

    [JsonPropertyName("required")]
    public List<string>? Required { get; set; }
}

public class JsonArraySchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "array";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("items")]
    public required IJsonSchema Items { get; set; }

    [JsonPropertyName("minItems")]
    public int? MinItems { get; set; }

    [JsonPropertyName("maxItems")]
    public int? MaxItems { get; set; }

    [JsonPropertyName("uniqueItems")]
    public bool? UniqueItems { get; set; }
}

public class JsonStringSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "string";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("enum")]
    public List<string>? Enum { get; set; }

    [JsonPropertyName("pattern")]
    public string? Pattern { get; set; }

    [JsonPropertyName("minLength")]
    public int? MinLength { get; set; }

    [JsonPropertyName("maxLength")]
    public int? MaxLength { get; set; }
}

public class JsonNumberSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "number";

    [JsonPropertyName("description")]
    public string? Description { get; set; }

    [JsonPropertyName("minimum")]
    public double? Minimum { get; set; }

    [JsonPropertyName("maximum")]
    public double? Maximum { get; set; }

    [JsonPropertyName("multipleOf")]
    public double? MultipleOf { get; set; }

    [JsonPropertyName("exclusiveMaximum")]
    public double? ExclusiveMaximum { get; set; }

    [JsonPropertyName("exclusiveMinimum")]
    public double? ExclusiveMinimum { get; set; }
}

public class JsonBooleanSchema : IJsonSchema
{
    [JsonPropertyName("type")]
    public string Type { get; } = "boolean";

    [JsonPropertyName("description")]
    public string? Description { get; set; }
}

public class JsonSchemaConverter : JsonConverter<IJsonSchema>
{
    public override IJsonSchema? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        throw new NotImplementedException();
    }

    public override void Write(Utf8JsonWriter writer, IJsonSchema value, JsonSerializerOptions options)
    {
        JsonSerializer.Serialize(writer, value, value.GetType(), options);
    }
}

Usage:

var mySchema = new JsonObjectSchema
{
    Properties = new Dictionary<string, IJsonSchema>
    {
        ["name"] = new JsonStringSchema
        {
            Description = "The name of the person",
            Enum = new List<string> { "Alice", "Bob", "Charlie" },
            MinLength = 1,
            MaxLength = 100
        },
        ["age"] = new JsonNumberSchema
        {
            Description = "The age of the person",
            Minimum = 0,
            Maximum = 150,
            MultipleOf = 1
        },
        ["hobbies"] = new JsonArraySchema
        {
            Items = new JsonStringSchema()
            {
                MaxLength = 30
            },
            MinItems = 0,
            MaxItems = 100,
            UniqueItems = true
        },
        ["is_married"] = new JsonBooleanSchema
        {
            Description = "Marital status of the person"
        }
    },
    Required = new List<string> { "name", "age" }
};

var options = new JsonSerializerOptions { WriteIndented = true, DefaultIgnoreCondition = System.Text.Json.Serialization.JsonIgnoreCondition.WhenWritingNull };
options.Converters.Add(new JsonSchemaConverter());
var jsonString = JsonSerializer.Serialize(mySchema, options);

Console.WriteLine(jsonString);

Output:

{
  "type": "object",
  "properties": {
    "name": {
      "type": "string",
      "description": "The name of the person",
      "enum": [
        "Alice",
        "Bob",
        "Charlie"
      ],
      "minLength": 1,
      "maxLength": 100
    },
    "age": {
      "type": "number",
      "description": "The age of the person",
      "minimum": 0,
      "maximum": 150,
      "multipleOf": 1
    },
    "hobbies": {
      "type": "array",
      "items": {
        "type": "string",
        "maxLength": 30
      },
      "minItems": 0,
      "maxItems": 100,
      "uniqueItems": true
    },
    "is_married": {
      "type": "boolean",
      "description": "Marital status of the person"
    }
  },
  "required": [
    "name",
    "age"
  ]
}

It can serialize, but not deserialize. But deserialization is not required.

@hansjm10
Author

I have made some adjustments to the parameters type and setter:

[JsonProperty("parameters", Required = Required.Default)]
public object Parameters
{
    get
    {
        return _parameters;
    }
    set
    {
        try
        {
            if (value is string jsonStringValue)
            {
                _parameters = JObject.Parse(jsonStringValue);
            }
            else if (value is JObject jObjectValue)
            {
                _parameters = jObjectValue;
            }
            else
            {
                var settings = new JsonSerializerSettings
                {
                    NullValueHandling = NullValueHandling.Ignore
                };
                var jsonString = JsonConvert.SerializeObject(value, settings);
                _parameters = JObject.Parse(jsonString);
            }
        }
        catch (JsonException e)
        {
            throw new ArgumentException("Could not convert the provided object into a JSON object. Make sure that the object is serializable and its structure matches the required schema.", e);
        }
    }
}

This avoids implementing our own JSON Schema types, as the user can provide their own, which is arguably going to be better than anything we would need to implement and support in the future. If the user wants to use a schema library such as Newtonsoft's or NJsonSchema, or their own implementation, it can handle it; if the user passes in a Dictionary&lt;string, object&gt;, it can handle it; and if the user passes in a string formatted as JSON, it can handle that too.

@ClusterM I tested this using your implementation (adjusting it slightly to use Newtonsoft instead of System.Text.Json) and the original JsonObjectSchema and Parameters were identical.

If the user doesn't provide the needed schema they will receive a 400 error from the OpenAI endpoint.

Do we have any objections to this handling?
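For illustration, here is a sketch of how callers might exercise that setter. This is hypothetical usage, not code from the PR, and it assumes the Function type accepts object initializers:

```csharp
var schemaJson = @"{ ""type"": ""object"", ""properties"": { ""location"": { ""type"": ""string"" } } }";

// 1) JSON passed as a string
var f1 = new Function { Parameters = schemaJson };

// 2) an already-parsed JObject
var f2 = new Function { Parameters = JObject.Parse(schemaJson) };

// 3) any serializable object, e.g. a Dictionary<string, object>
var f3 = new Function
{
    Parameters = new Dictionary<string, object> { ["type"] = "object" }
};
```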

@tkoenig89

tkoenig89 commented Jun 17, 2023 via email

@tkoenig89

@hansjm10 You might want to use JObject.FromObject(value) instead of serializing to a string and parsing it back as a JObject. At least, that seemed to be the simplest solution when I tested the original version of the PR.
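The suggested change would look roughly like this inside the setter (sketch only, not the PR's final code):

```csharp
// Instead of round-tripping through a string:
//   var jsonString = JsonConvert.SerializeObject(value, settings);
//   _parameters = JObject.Parse(jsonString);
// build the JObject directly from the object graph:
var settings = new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore };
_parameters = JObject.FromObject(value, JsonSerializer.Create(settings));
```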

@ErikDombi

I can't get this PR to work when streaming the chat completion. Anyone else having issues with this, or am I just doing something wrong?

@hansjm10
Author

@ErikDombi I can't get this PR to work when streaming the chat completion. Anyone else having issues with this, or am I just doing something wrong?

Mind sending what you have tried and your response from OpenAI? I am able to use

IAsyncEnumerable<string> asyncEnumerable = conversation.StreamResponseEnumerableFromChatbotAsync();

await foreach (var res in asyncEnumerable)
{
    Console.WriteLine(res);
}

and the returned streamed results are correct.


@brondavies brondavies left a comment


In order for this PR to work as the example code does, you'll have to include this change in ChatMessage.cs:

[JsonProperty("content", NullValueHandling = NullValueHandling.Include)]
public string Content { get; set; }

This is because the function response message from OpenAI comes back with a content of null. If that message is not included in the message list on subsequent requests, the API will give you an error:

"error": {
    "message": "'content' is a required property - 'messages.1'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }

@brondavies

brondavies commented Jun 19, 2023

Here is an updated test case that follows the workflow in the example code from OpenAI. See the complete Python example at https://platform.openai.com/docs/guides/gpt/function-calling

public async Task SummarizeFunctionResult()
{
    try
    {
        var api = new OpenAI_API.OpenAIAPI();
        var functionList = new List<Function>
        {
            BuildFunctionForTest()
        };
        var conversation = api.Chat.CreateConversation(new ChatRequest { 
            Model = Model.ChatGPTTurbo0613,
            Functions = functionList
        });
        conversation.AppendUserInput("What is the weather like in Boston?");

        var response = await conversation.GetResponseFromChatbotAsync();

        Assert.IsNull(response);

        var functionMessage = new ChatMessage
        {
            Role = ChatMessageRole.Function,
            Name = "get_current_weather",
            Content = "{\"temperature\": \"22\", \"unit\": \"celsius\", \"description\": \"sunny\"}"
        };
        conversation.AppendMessage(functionMessage);
        response = await conversation.GetResponseFromChatbotAsync();
        
        Assert.AreEqual("The current weather in Boston is sunny with a temperature of 22 degrees Celsius.", response);
    }
    catch(NullReferenceException ex)
    {
        Console.WriteLine($"{ex.Message}\n{ex.StackTrace}");
        Assert.Fail(ex.Message);
    }
}

@62316e

62316e commented Jun 21, 2023

Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?

conversation.AppendUserInput("What is the weather like in Boston?"); returns null.

@hansjm10
Author

hansjm10 commented Jun 21, 2023

Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?

conversation.AppendUserInput("What is the weather like in Boston?"); returns null.

The newest commit adds new chat endpoints. You can use

var response = await conversation.GetFunction_CallResponseAsync()

and access function names and arguments like

var name = response.Name;

var arguments = response.Arguments;

@62316e

62316e commented Jun 21, 2023

Is there any example of how to extract the function name and arguments from GetResponseFromChatbotAsync?
conversation.AppendUserInput("What is the weather like in Boston?"); returns null.

The newest commit adds in new chat endpoints. You can use

var response = await GetFunction_CallResponseAsync()

and access function names and arguments like

var name = response.Name;

var arguments = response.Arguments;

Thanks for the quick reply.

Open cases:

  • CallResponseAsync should return either a message or a function, since you don't always know in advance whether a function needs to be called
  • Streaming functions are not working

Example:
User: Hello,
Bot: Hi, I'm a bot and your assistant.
User: What's the weather in Boston?
Bot: $function$

@hansjm10
Author

CallResponseAsync should return either a message or a function, since you don't always know in advance whether a function needs to be called

Going to take some time later to think about this situation and see how the official documentation handles it. FWIW, you can easily check whether the response is null for a function_call if you're expecting one.

var response = await conversation.GetFunction_CallResponseAsync();
// The user input didn't trigger a function call.
if (response == null)
{
    Console.WriteLine(conversation.Messages.Last().Content);
}

As for streaming a function, that is a bit weird, since it's supposed to be for internal use for accessing another API. If you can come up with a use case, I can look into it more. That being said, I'm not too familiar with the streaming side, so somebody else may want to look into it too and see if there is a good solution.

@62316e

62316e commented Jun 21, 2023

CallResponseAsync should return either a message or a function, since you don't always know in advance whether a function needs to be called

Going to take some time later to think about this situation and see how the official documentation handles it. FWIW, you can easily check whether the response is null for a function_call if you're expecting one.

var response = await conversation.GetFunction_CallResponseAsync();
// The user input didn't trigger a function call.
if (response == null)
{
    Console.WriteLine(conversation.Messages.Last().Content);
}

As for streaming a function, that is a bit weird, since it's supposed to be for internal use for accessing another API. If you can come up with a use case, I can look into it more. That being said, I'm not too familiar with the streaming side, so somebody else may want to look into it too and see if there is a good solution.

I did some hacks on my side but would like to see a production-ready PR. Streaming is also working on my side (with hacks).

Use case (chatbot) with streaming:

  • User: "tell me a joke" <- No function call is required here, and the system can stream the response directly to the user
  • User: "what's the weather" <- The system detects the function call, makes a second call (also with streaming), and the stream is returned to the user.

In both cases there should be a single CallResponseAsync method which returns a function or a message, because you don't know in advance whether a function will be called.

@hansjm10
Author

hansjm10 commented Jun 21, 2023

In both cases there should be a single CallResponseAsync method which returns a function or a message, because you don't know in advance whether a function will be called.

These are two different return types. If you are expecting the possibility of a function, you should be checking whether the response is a function anyway. If you want more control over this, you can use CreateChatCompletionAsync, which returns the ChatResult; that might be what you are looking for.
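A sketch of that lower-level approach (the property names Choices, Message, and Function_Call follow this PR's naming and may differ after merge):

```csharp
// Request a completion and inspect the result yourself.
var result = await api.Chat.CreateChatCompletionAsync(request);
var message = result.Choices[0].Message;

if (message.Function_Call != null)
{
    // The model wants a function executed.
    Console.WriteLine($"{message.Function_Call.Name}({message.Function_Call.Arguments})");
}
else
{
    // Plain text reply; stream or print it directly.
    Console.WriteLine(message.Content);
}
```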

Comment on lines 151 to 154
public async Task<Function_Call> GetFunction_CallResponseAsync()
{
    return await GetFromChatResultAsync(message => message.Function_Call);
}


This seems like the wrong way to implement this, because it implies you know the API is going to request a function call based on the input. The way it's handled in the Python library is to check for an empty content returned from GetResponseFromChatbotAsync and then read response["choices"][0]["function_call"] from the last response to check the named function call.

Author

@hansjm10 hansjm10 Jun 21, 2023


I'm not sure why, but I'm not able to find any reference to functions in the openai/openai-python library; if you could link what you're going off of, I would appreciate it. The cookbook uses raw JSON for its examples, which we shouldn't be working with as our end result.

If function_call is not null, that implies the existence of a function. The real question is whether the user should handle the function call or the wrapper should handle it.

If it is null, we can just output the response as a string like you said earlier. This is just rambling, but we could have the user pass in a list of Funcs and then invoke them based on the response (which in turn would finally call the summarize step). That would circumvent needing to return function_call to the user directly.

That seems a bit advanced and could be outside the scope, but it's an interesting thought.

Alternatively, this is all handled by using CreateChatCompletionAsync and letting the user manually check function_call and content.


I would go for the simplest implementation at this point. I see your point that there really isn't a reference implementation in the Python library; it's just relying on the dictionary. Since there are a dozen ways someone might want to handle the presence of function_call in the response, I think the way it works now will suffice.

@brondavies

@hansjm10 Can you also add the following method to Conversation.cs? I think it would make the solution more complete.

/// <summary>
/// Creates and appends a <see cref="ChatMessage"/> to the chat history with the Role of <see cref="ChatMessageRole.Function"/>. The function message is a response to a request from the system for output from a predefined function.
/// </summary>
/// <param name="functionName">The name of the function for which the content has been generated as the result</param>
/// <param name="content">The text content (usually JSON)</param>
public void AppendFunctionMessage(string functionName, string content) => AppendMessage(new ChatMessage(ChatMessageRole.Function, content) { Name = functionName });
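A usage sketch, assuming the proposed method above plus this PR's function-call flow (illustrative only):

```csharp
// After the model asks for get_current_weather, run the function locally
// and feed the JSON result back before requesting the final answer.
conversation.AppendFunctionMessage(
    "get_current_weather",
    "{\"temperature\": \"22\", \"unit\": \"celsius\"}");

var summary = await conversation.GetResponseFromChatbotAsync();
```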

@reitowo

reitowo commented Jun 25, 2023

Function_Call violates almost every naming convention in C#; why mix underscores with PascalCase?

@logikonline

logikonline commented Jun 27, 2023

Some feedback on usability of the functions:

I'm not a fan of "GetResponseFromChatbotAsync" returning null when the result has choices. I would like to see another routine added, "GetResultFromChatbotAsync", that returns the result, since the likelihood of using functions will only increase over time, and using the first message of Choice[0] feels hinky, requiring you to go back to the Conversation object to access "MostRecentApiResult".

public async Task<ChatResult> GetResultFromChatbotAsync()
{
    ChatRequest req = new ChatRequest(RequestParameters);
    req.Messages = _Messages.ToList();

    var res = await _endpoint.CreateChatCompletionAsync(req);
    MostRecentApiResult = res;
    return res;
}

Overall, I hope this gets merged sooner rather than later, since it is functional.

@brondavies

@OkGoDoIt I think this is ready to merge


@varon varon left a comment


This looks good. Some minor nitpicks, but those can be fixed up later. Will be great to get this in.

public class Function
{
/// <summary>
/// The name of the function to be called. Must be a-z, A-Z, 0-9, or contain underscores and dashes, with a maximum length of 64.

Are we providing any validation on this? Can we match it against a regex before sending it off to OpenAI?

Comment on lines 57 to 79
try
{
    if (value is string jsonStringValue)
    {
        _parameters = JObject.Parse(jsonStringValue);
    }
    else if (value is JObject jObjectValue)
    {
        _parameters = jObjectValue;
    }
    else
    {
        var settings = new JsonSerializerSettings
        {
            NullValueHandling = NullValueHandling.Ignore
        };
        _parameters = JObject.FromObject(value, JsonSerializer.Create(settings));
    }
}
catch (JsonException e)
{
    throw new ArgumentException("Could not convert the provided object into a JSON object. Make sure that the object is serializable and its structure matches the required schema.", e);
}

Might be worth placing this in a helper method for clarity/simplicity.

@varon

varon commented Jul 3, 2023

@OkGoDoIt ^

@ecsplendid

Got this error running in Unity:

Sender:System.Threading.Tasks.Task`1[System.Threading.Tasks.VoidTaskResult] Exception:System.AggregateException: A Task's exception(s) were not observed either by Waiting on the Task or accessing its Exception property. As a result, the unobserved exception was rethrown by the finalizer thread. (Error creating 'OpenAI_API.ChatFunctions.FunctionCallConverter'.) ---> Newtonsoft.Json.JsonException: Error creating 'OpenAI_API.ChatFunctions.FunctionCallConverter'. ---> Newtonsoft.Json.JsonException: No parameterless constructor defined for 'OpenAI_API.ChatFunctions.FunctionCallConverter'.
at Newtonsoft.Json.Serialization.JsonTypeReflector+<>c__DisplayClass22_0.<GetCreator>b__0 (System.Object[] parameters) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.JsonTypeReflector.GetJsonConverter (System.Object attributeProvider) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.SetPropertySettingsFromAttributes (Newtonsoft.Json.Serialization.JsonProperty property, System.Object attributeProvider, System.String name, System.Type declaringType, Newtonsoft.Json.MemberSerialization memberSerialization, System.Boolean& allowNonPublicAccess) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateProperty (System.Reflection.MemberInfo member, Newtonsoft.Json.MemberSerialization memberSerialization) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateProperties (System.Type type, Newtonsoft.Json.MemberSerialization memberSerialization) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.DefaultContractResolver.CreateObjectContract (System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at System.Collections.Concurrent.ConcurrentDictionary`2[TKey,TValue].GetOrAdd (TKey key, System.Func`2[T,TResult] valueFactory) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.Serialization.JsonSerializerInternalWriter.Serialize (Newtonsoft.Json.JsonWriter jsonWriter, System.Object value, System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.JsonSerializer.SerializeInternal (Newtonsoft.Json.JsonWriter jsonWriter, System.Object value, System.Type objectType) [0x00000] in <00000000000000000000000000000000>:0
at Newtonsoft.Json.JsonConvert.SerializeObjectInternal (System.Object value, System.Type type, Newtonsoft.Json.JsonSerializer jsonSerializer) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase.HttpRequestRaw (System.String url, System.Net.Http.HttpMethod verb, System.Object postData, System.Boolean streaming) [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[TResult].Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase.HttpRequestRaw (System.String url, System.Net.Http.HttpMethod verb, System.Object postData, System.Boolean streaming) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase+<HttpStreamingRequest>d__16`1[T].MoveNext () [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.EndpointBase+<HttpStreamingRequest>d__16`1[T].System.Collections.Generic.IAsyncEnumerator<T>.MoveNextAsync () [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.Chat.ChatEndpoint.StreamCompletionAsync (OpenAI_API.Chat.ChatRequest request, System.Action`2[T1,T2] resultHandler, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[TStateMachine] (TStateMachine& stateMachine) [0x00000] in <00000000000000000000000000000000>:0
at OpenAI_API.Chat.ChatEndpoint.StreamCompletionAsync (OpenAI_API.Chat.ChatRequest request, System.Action`2[T1,T2] resultHandler, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at Xrai.Util.Gpt.ExecuteGptQuery (System.String prompt, System.Action`2[T1,T2] resultHandler, System.Boolean ephemeral, System.String system_message, System.Threading.CancellationToken cancellationToken) [0x00000] in <00000000000000000000000000000000>:0
at CaptionHUD.FixedUpdate () [0x00000] in <00000000000000000000000000000000>:0
--- End of inner exception stack trace ---
--- End of stack trace from previous location where exception was thrown ---

@Whiteha

Whiteha commented Jul 6, 2023

StreamChatEnumerableAsync behaves strangely with functions. Is there any working example?

The only way I found to use it:

            var conversation = api.Chat.CreateConversation(new ChatRequest
            {
                Model = Model.GPT4_0613,
                Functions = functionsList,
                Temperature = 0.10,
                Messages = messages
            });

            try
            {
                string currentFunction = "";
                var functionCalls = new Dictionary<string, string>();
                await foreach (var result in api.Chat.StreamChatEnumerableAsync(conversation.RequestParameters).ConfigureAwait(false))
                {
                    if (result.Choices[0] != null && result.Choices[0].FinishReason == "function_call")
                    {
                        foreach (var call in functionCalls)
                        {
                            // parse json and call manually
                        }
                    }
                    else if (result.Choices[0].Delta.FunctionCall?.Arguments != null)
                    {
                        // Only first entrance has function Name
                        if (result.Choices[0].Delta.FunctionCall.Name != null)
                        {
                            currentFunction = result.Choices[0].Delta.FunctionCall.Name;
                            functionCalls.Add(currentFunction, "");
                        }
                        else
                        {
                            // collecting json with arguments
                            functionCalls[currentFunction] += result.Choices[0].Delta.FunctionCall.Arguments;
                        }
                    }

                    foreach (var choice in result.Choices.Where(choice => !string.IsNullOrWhiteSpace(choice.Delta?.Content)))
                    {
                        streamGetter(choice.Delta.Content);
                    }
                }
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            } 
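Filling in the `// parse json and call manually` step above: once a buffered arguments string is complete, it is plain JSON, so with Newtonsoft.Json (which this library already depends on) the dispatch can look roughly like the sketch below. Note the handler table and the `get_current_weather` function name are hypothetical, not part of the example above:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;

// Hypothetical dispatch table mapping function names to handlers.
var handlers = new Dictionary<string, Func<JObject, string>>
{
    ["get_current_weather"] = args => $"Weather in {args["location"]}: sunny"
};

foreach (var call in functionCalls)        // call.Key = name, call.Value = buffered JSON
{
    var args = JObject.Parse(call.Value);  // parse the accumulated arguments
    if (handlers.TryGetValue(call.Key, out var handler))
    {
        string result = handler(args);
        // append result as a "function"-role message, then re-query the model
    }
}
```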

@OkGoDoIt
Owner

OkGoDoIt commented Jul 7, 2023

Hey everyone, I'm sorry for the delay on my end. I've been swamped with my day job and I'm behind on triaging PR's here. I will do my best to get to this within the next few days. Thanks for your understanding 😅

@lofcz
Contributor

lofcz commented Jul 14, 2023

Efficient implementation of function calling in conjunction with streaming: lofcz@52b7884#diff-8bd7aaf5181bcc44301c5b518a91bee8f7e4591461c1c19dc8a4a55e74354e07R383

@hansjm10 I think this event-based approach and abstracting buffering away from the end-user presents a better experience for API consumers.
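For readers skimming past the link: the gist of an event-based approach is that the library, not the caller, accumulates the streamed function-call deltas and raises a single event once the arguments JSON is complete. A minimal sketch of the idea — the type and member names here are hypothetical, not the actual API in the linked commit:

```csharp
using System;
using System.Text;

// Hypothetical sketch: buffer streamed function-call fragments internally
// and surface one event per completed call, so callers never see deltas.
public class StreamingFunctionBuffer
{
    private string _name = "";
    private readonly StringBuilder _arguments = new StringBuilder();

    // Raised with (name, argumentsJson) once the stream reports
    // FinishReason == "function_call".
    public event Action<string, string> FunctionCallReady;

    // Feed each streamed delta; only the first chunk carries the name.
    public void Append(string name, string argumentsFragment)
    {
        if (name != null) _name = name;
        if (argumentsFragment != null) _arguments.Append(argumentsFragment);
    }

    public void Complete() => FunctionCallReady?.Invoke(_name, _arguments.ToString());
}
```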

@ffMathy

ffMathy commented Jul 28, 2023

@OkGoDoIt can you provide us with an additional update on this?

@iwaitu

iwaitu commented Aug 2, 2023

@hansjm10 I've fixed the issue where function_call couldn't be used properly during re-streaming. Now function_call works normally in all scenarios (detail). I've also added example code for using streaming; it's located in test/ChatEndpointTest.cs, in the method named SummarizeFunctionStreamResult.

@hansjm10
Author

hansjm10 commented Aug 3, 2023

@hansjm10 I've fixed the issue where function_call couldn't be used properly during re-streaming. Now function_call works normally in all scenarios (detail). I've also added example code for using streaming; it's located in test/ChatEndpointTest.cs, in the method named SummarizeFunctionStreamResult.

Looks good and I'm ready to merge it but just revert the changes to the proj files.

@ErikDombi

Looks good and I'm ready to merge it but just revert the changes to the proj files.

You could cherrypick the two commits that don't mess with the proj files as well

@ErikDombi

There's a lot of weird intending issues in a lot of the files as well

@brondavies

There's a lot of weird intending issues in a lot of the files as well

I think @ErikDombi means "indenting issues" as in whitespace, and formatting is all over the place 😄

@MathiasGronseth

Is there any status on this? Eager to try out this functionality

@DaliaDawod

DaliaDawod commented Aug 23, 2023

I have the following, and the token prints empty. Can someone help me?

var functionParameters = new
{
    type = "object",
    properties = new
    {
        scary = new
        {
            type = "string",
            description = "A scary version of the response to a user's query"
        },
        joyful = new
        {
            type = "string",
            description = "A joyful version of the response to a user's query"
        }
    },
    required = new[] { "scary", "joyful" }
};

var functions = new List<Function>
{
    new Function("responses", "ingest the emotion", functionParameters)
};

var functionCall = new FunctionCall
{
    Name = "responses",
    // Arguments = "scary"
};

// Send the entire chat to OpenAI to get the next message
await foreach (var token in api.Chat.StreamChatEnumerableAsync(new ChatRequest()
{
    Model = Model.ChatGPTTurbo,
    Temperature = 0.5,
    Messages = messages,
    Functions = functions,
    FunctionCall = functionCall
}))
{
    Console.WriteLine("token ", token);

.......
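A side note on the empty output, independent of the function-calling setup: `Console.WriteLine("token ", token)` resolves to the composite-format overload, and because the format string contains no `{0}` placeholder, the second argument is silently dropped, so only `token ` is printed. A small illustration:

```csharp
using System;

string token = "hello";                 // stands in for the streamed delta
Console.WriteLine("token ", token);     // prints just "token " — no {0} placeholder
Console.WriteLine("token {0}", token);  // prints "token hello"
Console.WriteLine($"token {token}");    // same, via string interpolation
```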

@EvoTechMike

Looks like PR got killed. Can we get it back?

@EvoTechMike

did it get merged?

@OkGoDoIt
Owner

OkGoDoIt commented Feb 7, 2024

@EvoTechMike, no, I haven't merged any PR's in a while. I implemented some of this myself but I have not yet implemented the function calling.

@lionbarz

lionbarz commented Feb 10, 2024

Function calling is the only reason I can't use this beautiful package. Please implement it.

@endeffects

@OkGoDoIt If you don't have time to implement or approve features yourself, why not let your contributors do the job? Even if a community implementation doesn't meet your quality standards and gets replaced by your own later on, it would still be far better progress. Right now we have a feature gap that grows from day to day. I understand that you have your own personal goals, but a project that has become this popular carries an expectation of staying up to date and feature complete. So please reconsider your approach to project management.

@lofcz
Contributor

lofcz commented Feb 10, 2024

In the interim, one of the forks supporting Functions, Assistants, and other new features can be used.

https://github.com/RageAgainstThePixel/OpenAI-DotNet - Supports Unity, .NET Standard & .NET Core; frequently updated, great test coverage.
https://github.com/lofcz/OpenAiNg - .NET Core only, high performance, customizability, supports locally hosted models, responses include raw HTTP request. A high-level API for using functions and streaming together is included.
(Ordered by ⭐️ at the time of writing)

@endeffects

Thanks

@EvoTechMike

EvoTechMike commented Feb 11, 2024

@lofcz Out of curiosity, how did you find these forks? When I go to https://github.com/OkGoDoIt/OpenAI-API-dotnet/forks there aren't any forks with more than 1 star.

edit: d'oh! I see one of them is yours. Still wondering why they don't show up as forks, though.

@lofcz
Contributor

lofcz commented Feb 11, 2024

Both RageAgainstThePixel and I hard-forked some time ago. I maintained a soft fork for about half a year, but then the changes diverged too much from upstream, hence the hard fork.

@reitowo

reitowo commented Feb 11, 2024

I suggest archiving this repository if the author no longer maintains it. It's remarkable that such a feature hasn't been merged after 8 months.


Development

Successfully merging this pull request may close these issues.

Add Function calling