Required parameters for function calling - asking user for missing information #9852
-
The user should be prompted for required parameters that are not specified in the request. However, we have noticed inconsistent behavior when not all parameters are provided: in many cases the model ends up filling in the missing information on its own. We have tried system instructions, instructions in the function/parameter descriptions, etc., but we haven't been able to get consistent behavior. What is the best way to handle this? Filters can be used, but they won't work for all scenarios, because the values the model chooses could be valid - the issue is that the information should ideally come from the user.
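To make the filter idea concrete, here is a minimal sketch of the kind of guard we have in mind. It uses plain OpenAI tool calling rather than our actual Semantic Kernel setup, and the `book_table` function, its parameters, and the substring check are all made up for illustration: intercept the tool call, and if a required argument was never actually stated by the user, hand the turn back to the user instead of executing.

```python
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical function with two required parameters (illustration only).
tools = [{
    "type": "function",
    "function": {
        "name": "book_table",
        "description": "Book a restaurant table. Ask the user for any missing detail; never guess.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City, exactly as stated by the user."},
                "date": {"type": "string", "description": "Date, exactly as stated by the user."},
            },
            "required": ["city", "date"],
        },
    },
}]

user_message = "Book me a table in Seattle"  # note: the user gave no date

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system",
         "content": "If a required argument was not stated by the user, ask for it instead of inventing a value."},
        {"role": "user", "content": user_message},
    ],
    tools=tools,
)

msg = response.choices[0].message
if msg.tool_calls:
    args = json.loads(msg.tool_calls[0].function.arguments)
    # Crude guard: a required argument that is absent, or whose value never
    # appears in the user's own words, is treated as invented by the model.
    suspect = [p for p in ("city", "date")
               if not args.get(p) or str(args[p]).lower() not in user_message.lower()]
    if suspect:
        print(f"Please provide: {', '.join(suspect)}")  # re-prompt the user
    else:
        print("Arguments look user-supplied, OK to execute:", args)
else:
    print(msg.content)  # the model chose to ask a follow-up question itself
```

The substring check is obviously too naive for things like dates, but that interception point is roughly where (as far as I understand) a Semantic Kernel auto function invocation filter would sit.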
Replies: 5 comments
-
@deepinderdeol - which model are you using for this?
-
@evchaki We have tried GPT-4o (the 2024-05-13 version) and older GPT-4 models. Recently I have also tried several of the open-source models with the Ollama connector - llama3.2, qwen2, etc. - but maybe these handle function calling differently...
-
@markwallace-microsoft - FYI
-
@deepinderdeol
One thing you can try is:
-
@markwallace-microsoft Thanks for the suggestion. Just curious: is it a known issue with function calling that, even when a parameter is marked as required, the LLM can ignore that and invent a value for the argument in some scenarios? Additional information: we have seen this issue mostly with functions that have multiple parameters. If there is only one parameter, the behavior is quite consistent and the user is prompted for the missing information. Example:
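(A made-up illustration of the pattern; these are not our real functions.) With a single-required-parameter schema like the first one below, the model reliably asks for the city when it is missing; with the two-required-parameter schema, it sometimes invents the second value even though every description says not to guess.

```python
# Hypothetical tool schemas, for illustration only.
single_param_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the weather. Ask the user for the city if it was not given; do not guess.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, as stated by the user."},
            },
            "required": ["city"],
        },
    },
}

multi_param_tool = {
    "type": "function",
    "function": {
        "name": "book_table",
        "description": "Book a restaurant table. Ask the user for any missing detail; do not guess.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City, as stated by the user. Ask if missing."},
                "date": {"type": "string", "description": "Date, as stated by the user. Ask if missing."},
            },
            "required": ["city", "date"],
        },
    },
}
```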