
Internal error with protobufs and tool use #626


Closed
duccdev opened this issue Nov 11, 2024 · 3 comments · Fixed by Unity-google/generative-ai-python#7 or #684
Assignees
Labels
component:python sdk (Issue/PR related to Python SDK) · type:bug (Something isn't working)

Comments

duccdev commented Nov 11, 2024

Description of the bug:

I was playing around with tool use and streaming; this was my code:

import google.generativeai as genai
import config

genai.configure(api_key=config.API_KEY)


def set_brightness(value: float) -> None:
    """Controls the brightness of all house lights. `value` is a `float` between 0 (off) and 1 (max)."""
    print("Brightness changed:", value)


model = genai.GenerativeModel(
    "gemini-1.5-flash-002",
    tools=[set_brightness],
)

chat = model.start_chat()
while True:
    stream = chat.send_message(input("User: "), stream=True)
    print("Model: ", end="", flush=True)
    for chunk in stream:
        print(chunk.text, end="", flush=True)
    print()

I got this peculiar error when I prompted it like this:

❯ python3 main.py
User: hey can you turn off the lights
Model: Traceback (most recent call last):
  File "/home/ducc/Projects/sketchbooks/apithing/main.py", line 29, in <module>
    print(chunk.text, end="", flush=True)
          ^^^^^^^^^^
  File "/home/ducc/Projects/sketchbooks/apithing/.venv/lib/python3.13/site-packages/google/generativeai/types/generation_types.py", line 536, in text
    part_type = protos.Part.pb(part).whichOneof("data")
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: whichOneof. Did you mean: 'WhichOneof'?

I am currently using Python 3.13 with the latest version of the SDK.
This is not an issue on my end, as it seems to be a typo in the library.

Actual vs expected behavior:

Actual:

❯ python3 main.py
User: hey can you turn off the lights
Model: Traceback (most recent call last):
  File "/home/ducc/Projects/sketchbooks/apithing/main.py", line 29, in <module>
    print(chunk.text, end="", flush=True)
          ^^^^^^^^^^
  File "/home/ducc/Projects/sketchbooks/apithing/.venv/lib/python3.13/site-packages/google/generativeai/types/generation_types.py", line 536, in text
    part_type = protos.Part.pb(part).whichOneof("data")
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: whichOneof. Did you mean: 'WhichOneof'?

Expected (roughly):

❯ python3 main.py
User: hey can you turn off the lights
Brightness changed: 0.0
Model: Done!

Any other information you'd like to share?

Simply change the typo at types/generation_types.py, line 536, from .whichOneof to .WhichOneof.
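
For reference, the corrected line (line 536 of generation_types.py, as shown in the traceback above) would read roughly:

# google/generativeai/types/generation_types.py, line 536
# Before (raises AttributeError; whichOneof does not exist on protobuf messages):
part_type = protos.Part.pb(part).whichOneof("data")
# After (WhichOneof is the actual protobuf Message method):
part_type = protos.Part.pb(part).WhichOneof("data")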

@Gunand3043 self-assigned this Nov 12, 2024
@Gunand3043 added the type:bug, status:triaged, and component:python sdk labels Nov 12, 2024
@Gunand3043

Hi @duccdev

You need to set enable_automatic_function_calling to True to make it work; the SDK then calls set_brightness with the model's arguments and returns the final text response.

def set_brightness(value: float) -> None:
    """Controls the brightness of all house lights. `value` is a `float` between 0 (off) and 1 (max)."""
    print("Brightness changed:", value)


model = genai.GenerativeModel(
    "gemini-1.5-flash-002",
    tools=[set_brightness],
)

chat = model.start_chat(enable_automatic_function_calling=True)
while True:
    response = chat.send_message(input("User: "))
    print("Model: ", end="", flush=True)
    print(response.text, end="", flush=True)
    print()

MarkDaoust added a commit that referenced this issue Nov 12, 2024
Fixes: #626
@MarkDaoust MarkDaoust mentioned this issue Nov 12, 2024
@MarkDaoust
Collaborator

Yes. Thanks for reporting!

@MarkDaoust
Collaborator

Okay, the "whichOneof" error is fixed.

But this will still fail:

for chunk in stream:
    print(chunk.text, end="", flush=True)

When you start generating complex responses, .text doesn't know what to do with the new part types.
You need to iterate over the parts in each chunk, switch on their type, and handle FunctionCall objects yourself.
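
A minimal sketch of what that manual handling could look like, assuming the streaming chat loop from the original report (the WhichOneof("data") check mirrors the traceback above; the tool dispatch line is illustrative, not the SDK's API):

from google.generativeai import protos

stream = chat.send_message(input("User: "), stream=True)
print("Model: ", end="", flush=True)
for chunk in stream:
    for part in chunk.parts:
        # Switch on the part's oneof type instead of relying on .text.
        part_type = protos.Part.pb(part).WhichOneof("data")
        if part_type == "text":
            print(part.text, end="", flush=True)
        elif part_type == "function_call":
            fc = part.function_call
            # Dispatch to your own tool here, e.g. set_brightness(**fc.args),
            # then send the result back to the model as a FunctionResponse part.
            print(f"\n[function call] {fc.name}({dict(fc.args)})")
print()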

@github-actions github-actions bot removed the status:triaged Issue/PR triaged to the corresponding sub-team label Feb 20, 2025