Discovery Engine answer_query API returns LLM_ADDON_NOT_ENABLED despite console UI working #2677

@tomohirohiratsuka

Description


Environment details

  • OS type and version: Linux e91696405eac 6.10.14-linuxkit
  • Python version: Python 3.11.14
  • pip version: 24.0
  • google-cloud-discoveryengine version: 0.14.0

Steps to reproduce

  1. Enable LLM features in Google Cloud Console (Enterprise Edition + Generative responses)
  2. Create BigQuery data store with LLM exclusion set to False
  3. Call the answer_query API with the same configuration that works in the console UI
  4. The API returns an LLM_ADDON_NOT_ENABLED error, while the console UI works fine

Code example

from google.cloud import discoveryengine_v1beta as discoveryengine

# Based on official documentation: https://cloud.google.com/generative-ai-app-builder/docs/answer
def answer_query_sample(
    project_id: str,
    location: str,
    engine_id: str,
) -> discoveryengine.AnswerQueryResponse:
    client = discoveryengine.ConversationalSearchServiceClient()
    
    # The full resource name of the Search serving config
    serving_config = f"projects/{project_id}/locations/{location}/collections/default_collection/engines/{engine_id}/servingConfigs/default_serving_config"
    
    # Optional: Options for answer phase
    answer_generation_spec = discoveryengine.AnswerQueryRequest.AnswerGenerationSpec(
        ignore_adversarial_query=False,
        ignore_non_answer_seeking_query=False,
        ignore_low_relevant_content=False,
        model_spec=discoveryengine.AnswerQueryRequest.AnswerGenerationSpec.ModelSpec(
            model_version="gemini-2.0-flash-001/answer_gen/v1",
        ),
        prompt_spec=discoveryengine.AnswerQueryRequest.AnswerGenerationSpec.PromptSpec(
            preamble="Give a detailed answer.",
        ),
        include_citations=True,
        answer_language_code="en",
    )
    
    # Initialize request argument(s)
    request = discoveryengine.AnswerQueryRequest(
        serving_config=serving_config,
        query=discoveryengine.Query(text="What is Vertex AI Search?"),
        session=None,
        answer_generation_spec=answer_generation_spec,
        user_pseudo_id="user-pseudo-id",
    )
    
    # Make the request
    response = client.answer_query(request)
    return response

# Test execution
try:
    response = answer_query_sample(
        project_id="project-id",
        location="global",
        engine_id="data-store-id"
    )
    print("Success:", response)
except Exception as e:
    print("Error:", str(e))

Stack trace

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/google/cloud/discoveryengine_v1beta/services/conversational_search_service/client.py", line 1707, in answer_query
    response = rpc(
  File "/usr/local/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
    return wrapped_func(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
    raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.FailedPrecondition: 400 This feature is only available when Large Language Model add-on is enabled.

Additional context

  • Console UI works perfectly with answer generation
  • All LLM features are enabled in Google Cloud Console
  • Data store type: BigQuery (structured data)
  • Enterprise Edition and Generative responses are both enabled
  • Data store LLM exclusion is set to False
  • The issue appears to be a disconnect between console UI settings and API-level LLM enablement
  • The Discovery Engine API used here (v1beta) is still in preview, which might be related
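To narrow down the UI-vs-API disconnect, it may help to read back the engine's configuration through the EngineService and check which search tier and add-ons the API itself reports. The sketch below is a diagnostic aid, not part of the repro; it assumes the same placeholder project/location/engine IDs as the code example above, and that the `google-cloud-discoveryengine` client exposes `EngineServiceClient.get_engine` with `search_engine_config.search_tier` / `search_add_ons` on the returned engine (the import is deferred into the function so the string helper stays usable without the library installed).

```python
def build_engine_name(project_id: str, location: str, engine_id: str) -> str:
    """Build the full resource name for an engine (pure string helper)."""
    return (
        f"projects/{project_id}/locations/{location}/"
        f"collections/default_collection/engines/{engine_id}"
    )


def print_llm_addon_status(project_id: str, location: str, engine_id: str) -> None:
    """Fetch the engine and print the tier/add-ons the API actually sees.

    For answer generation to work, the config should report
    SEARCH_TIER_ENTERPRISE and include SEARCH_ADD_ON_LLM.
    """
    # Deferred import so the module can be loaded without the client library.
    from google.cloud import discoveryengine_v1beta as discoveryengine

    client = discoveryengine.EngineServiceClient()
    engine = client.get_engine(
        name=build_engine_name(project_id, location, engine_id)
    )
    config = engine.search_engine_config
    print("search_tier:", config.search_tier)
    print("search_add_ons:", list(config.search_add_ons))
```

If `SEARCH_ADD_ON_LLM` is missing from `search_add_ons` even though the console shows generative responses enabled, that would confirm the enablement state diverges at the API level.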
