
tools_strict option in OpenAIChatGenerator broken with ComponentTool #8912

Open
tstadel opened this issue Feb 24, 2025 · 0 comments · May be fixed by #8913
tstadel commented Feb 24, 2025

Describe the bug
When using a ComponentTool with tools_strict=True, the OpenAI API rejects the request, complaining that additionalProperties in the tool's JSON schema is not set to false.

Error message

BadRequestError                           Traceback (most recent call last)
Cell In[19], line 1
----> 1 result = generator.run(messages=chat_messages["prompt"], tools=tool_invoker.tools)
      2 result

File ~/.local/lib/python3.12/site-packages/haystack/components/generators/chat/openai.py:246, in OpenAIChatGenerator.run(self, messages, streaming_callback, generation_kwargs, tools, tools_strict)
    237 streaming_callback = streaming_callback or self.streaming_callback
    239 api_args = self._prepare_api_call(
    240     messages=messages,
    241     streaming_callback=streaming_callback,
   (...)
    244     tools_strict=tools_strict,
    245 )
--> 246 chat_completion: Union[Stream[ChatCompletionChunk], ChatCompletion] = self.client.chat.completions.create(
    247     **api_args
    248 )
    250 is_streaming = isinstance(chat_completion, Stream)
    251 assert is_streaming or streaming_callback is None

File ~/.local/lib/python3.12/site-packages/ddtrace/contrib/trace_utils.py:336, in with_traced_module.<locals>.with_mod.<locals>.wrapper(wrapped, instance, args, kwargs)
    334     log.debug("Pin not found for traced method %r", wrapped)
    335     return wrapped(*args, **kwargs)
--> 336 return func(mod, pin, wrapped, instance, args, kwargs)

File ~/.local/lib/python3.12/site-packages/ddtrace/contrib/internal/openai/patch.py:282, in _patched_endpoint.<locals>.patched_endpoint(openai, pin, func, instance, args, kwargs)
    280 resp, err = None, None
    281 try:
--> 282     resp = func(*args, **kwargs)
    283     return resp
    284 except Exception as e:

File ~/.local/lib/python3.12/site-packages/openai/_utils/_utils.py:279, in required_args.<locals>.inner.<locals>.wrapper(*args, **kwargs)
    277             msg = f"Missing required argument: {quote(missing[0])}"
    278     raise TypeError(msg)
--> 279 return func(*args, **kwargs)

File ~/.local/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py:879, in Completions.create(self, messages, model, audio, frequency_penalty, function_call, functions, logit_bias, logprobs, max_completion_tokens, max_tokens, metadata, modalities, n, parallel_tool_calls, prediction, presence_penalty, reasoning_effort, response_format, seed, service_tier, stop, store, stream, stream_options, temperature, tool_choice, tools, top_logprobs, top_p, user, extra_headers, extra_query, extra_body, timeout)
    837 @required_args(["messages", "model"], ["messages", "model", "stream"])
    838 def create(
    839     self,
   (...)
    876     timeout: float | httpx.Timeout | None | NotGiven = NOT_GIVEN,
    877 ) -> ChatCompletion | Stream[ChatCompletionChunk]:
    878     validate_response_format(response_format)
--> 879     return self._post(
    880         "/chat/completions",
    881         body=maybe_transform(
    882             {
    883                 "messages": messages,
    884                 "model": model,
    885                 "audio": audio,
    886                 "frequency_penalty": frequency_penalty,
    887                 "function_call": function_call,
    888                 "functions": functions,
    889                 "logit_bias": logit_bias,
    890                 "logprobs": logprobs,
    891                 "max_completion_tokens": max_completion_tokens,
    892                 "max_tokens": max_tokens,
    893                 "metadata": metadata,
    894                 "modalities": modalities,
    895                 "n": n,
    896                 "parallel_tool_calls": parallel_tool_calls,
    897                 "prediction": prediction,
    898                 "presence_penalty": presence_penalty,
    899                 "reasoning_effort": reasoning_effort,
    900                 "response_format": response_format,
    901                 "seed": seed,
    902                 "service_tier": service_tier,
    903                 "stop": stop,
    904                 "store": store,
    905                 "stream": stream,
    906                 "stream_options": stream_options,
    907                 "temperature": temperature,
    908                 "tool_choice": tool_choice,
    909                 "tools": tools,
    910                 "top_logprobs": top_logprobs,
    911                 "top_p": top_p,
    912                 "user": user,
    913             },
    914             completion_create_params.CompletionCreateParams,
    915         ),
    916         options=make_request_options(
    917             extra_headers=extra_headers, extra_query=extra_query, extra_body=extra_body, timeout=timeout
    918         ),
    919         cast_to=ChatCompletion,
    920         stream=stream or False,
    921         stream_cls=Stream[ChatCompletionChunk],
    922     )

File ~/.local/lib/python3.12/site-packages/openai/_base_client.py:1290, in SyncAPIClient.post(self, path, cast_to, body, options, files, stream, stream_cls)
   1276 def post(
   1277     self,
   1278     path: str,
   (...)
   1285     stream_cls: type[_StreamT] | None = None,
   1286 ) -> ResponseT | _StreamT:
   1287     opts = FinalRequestOptions.construct(
   1288         method="post", url=path, json_data=body, files=to_httpx_files(files), **options
   1289     )
-> 1290     return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))

File ~/.local/lib/python3.12/site-packages/openai/_base_client.py:967, in SyncAPIClient.request(self, cast_to, options, remaining_retries, stream, stream_cls)
    964 else:
    965     retries_taken = 0
--> 967 return self._request(
    968     cast_to=cast_to,
    969     options=options,
    970     stream=stream,
    971     stream_cls=stream_cls,
    972     retries_taken=retries_taken,
    973 )

File ~/.local/lib/python3.12/site-packages/openai/_base_client.py:1071, in SyncAPIClient._request(self, cast_to, options, retries_taken, stream, stream_cls)
   1068         err.response.read()
   1070     log.debug("Re-raising status error")
-> 1071     raise self._make_status_error_from_response(err.response) from None
   1073 return self._process_response(
   1074     cast_to=cast_to,
   1075     options=options,
   (...)
   1079     retries_taken=retries_taken,
   1080 )

BadRequestError: Error code: 400 - {'error': {'message': "Invalid schema for function 'web_search': In context=(), 'additionalProperties' is required to be supplied and to be false.", 'type': 'invalid_request_error', 'param': 'tools[0].function.parameters', 'code': 'invalid_function_parameters'}}
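The 400 above comes from OpenAI's strict function-calling mode: with strict tools, every object node in the parameters schema must declare "additionalProperties": false (and list all of its properties under required). The schema that ComponentTool serializes (see To Reproduce below) omits that key. A minimal sketch of the kind of recursive patch that would make such a schema strict-compliant (the helper make_strict is hypothetical, not part of Haystack):

```python
def make_strict(schema: dict) -> dict:
    """Return a copy of a JSON schema with 'additionalProperties': False
    added to every object node, as OpenAI's strict tool mode requires."""
    patched = dict(schema)
    if patched.get("type") == "object":
        patched["additionalProperties"] = False
        patched["properties"] = {
            name: make_strict(prop)
            for name, prop in patched.get("properties", {}).items()
        }
    elif patched.get("type") == "array" and isinstance(patched.get("items"), dict):
        patched["items"] = make_strict(patched["items"])
    return patched

# The schema ComponentTool generates for the web_search tool in this issue:
web_search_params = {
    "type": "object",
    "properties": {"query": {"type": "string", "description": "Search query."}},
    "required": ["query"],
}

strict_params = make_strict(web_search_params)
# strict_params now carries "additionalProperties": False at the top level,
# while the original schema dict is left unchanged.
```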

Expected behavior
tools_strict=True works with ComponentTool.


To Reproduce

from haystack.components.generators.chat.openai import OpenAIChatGenerator
from haystack.dataclasses.chat_message import ChatMessage

gen = OpenAIChatGenerator.from_dict({'type': 'haystack.components.generators.chat.openai.OpenAIChatGenerator',
 'init_parameters': {'model': 'gpt-4o',
  'streaming_callback': None,
  'api_base_url': None,
  'organization': None,
  'generation_kwargs': {},
  'api_key': {'type': 'env_var',
   'env_vars': ['OPENAI_API_KEY'],
   'strict': False},
  'timeout': None,
  'max_retries': None,
  'tools': [{'type': 'haystack.tools.component_tool.ComponentTool',
    'data': {'name': 'web_search',
     'description': 'Search the web for current information on any topic',
     'parameters': {'type': 'object',
      'properties': {'query': {'type': 'string',
        'description': 'Search query.'}},
      'required': ['query']},
     'component': {'type': 'haystack.components.websearch.serper_dev.SerperDevWebSearch',
      'init_parameters': {'top_k': 10,
       'allowed_domains': None,
       'search_params': {},
       'api_key': {'type': 'env_var',
        'env_vars': ['SERPERDEV_API_KEY'],
        'strict': False}}}}}],
  'tools_strict': True}})

gen.run([ChatMessage.from_user("How is the weather today in Berlin?")])
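Until a fix such as #8913 lands, one conceivable workaround is to inject the missing key into the serialized tool schema before handing the config to from_dict. A sketch on plain dicts mirroring the config above (that such a patched config round-trips cleanly through Haystack's deserialization is an assumption, not something verified here):

```python
# Trimmed version of the OpenAIChatGenerator config from the repro above.
config = {
    "init_parameters": {
        "tools_strict": True,
        "tools": [
            {
                "data": {
                    "name": "web_search",
                    "parameters": {
                        "type": "object",
                        "properties": {"query": {"type": "string"}},
                        "required": ["query"],
                    },
                }
            }
        ],
    }
}

# Add the key OpenAI strict mode insists on to each tool's top-level schema
# before the config is passed to OpenAIChatGenerator.from_dict(...).
for tool in config["init_parameters"]["tools"]:
    tool["data"]["parameters"].setdefault("additionalProperties", False)
```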

FAQ Check

System:

  • OS:
  • GPU/CPU:
  • Haystack version (commit or version number): 2.10.2
  • DocumentStore:
  • Reader:
  • Retriever:
@tstadel tstadel changed the title use_strict option in OpenAIChatGenerator broken use_strict option in OpenAIChatGenerator broken with ComponentTool Feb 24, 2025
@tstadel tstadel changed the title use_strict option in OpenAIChatGenerator broken with ComponentTool tools_strict option in OpenAIChatGenerator broken with ComponentTool Feb 24, 2025