
Improve function calling (auto selection, parallel functions, parameters grammar) #1347

Closed

themrzmaster wants to merge 11 commits into abetlen:main from themrzmaster:main

Conversation

@themrzmaster (Contributor) commented Apr 17, 2024

Examples:
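The snippets below use a `client` that is not shown being constructed. A minimal sketch, assuming the OpenAI SDK pointed at the local llama-cpp-python server (which exposes an OpenAI-compatible API under `/v1`); the host, port, and placeholder key are assumptions for illustration:

```python
# Sketch: OpenAI SDK client against the local llama-cpp-python server.
# base_url and api_key are assumptions; the key is a placeholder that the
# local server is assumed not to validate.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="sk-no-key-required",  # placeholder, assumed unused locally
)
```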

```python
response = client.chat.completions.create(
    model="WizardLM-2-7B.Q8_0.gguf",
    messages=[
        {
            "role": "system",
            "content": "Always follow user requests",
        },
        {
            "role": "user",
            "content": "search for items with the term burguer and a maximum price of 10 dollars.",
        },
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "search_merchant",
                "description": "Search for merchants in the catalog based on the term",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "term": {
                            "type": "string",
                            "description": "Term to be searched for finding merchants.",
                        },
                    },
                    "required": ["term"],
                },
            },
        },
        {
            "type": "function",
            "function": {
                "name": "search_item",
                "description": "Search for items in the catalog based on various criteria.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "term": {
                            "type": "string",
                            "description": "Term to be searched for finding items, with removed accents.",
                        },
                        "item_price_to": {
                            "type": "integer",
                            "description": "Maximum price the user is willing to pay for an item, if specified.",
                        },
                        "merchant_delivery_fee_to": {
                            "type": "integer",
                            "description": "Maximum delivery fee the user is willing to pay, if specified.",
                        },
                        "merchant_payment_types": {
                            "type": "string",
                            "description": "Type of payment the user prefers, if specified.",
                            "enum": [
                                "Credit Card",
                                "Debit Card",
                                "Other",
                            ],
                        },
                    },
                    "required": ["term"],
                },
            },
        },
    ],
    tool_choice="auto",
)
```

gives me:
```
Choice(finish_reason='tool_calls', index=0, logprobs=None, message=ChatCompletionMessage(content=None, role='assistant', function_call=FunctionCall(arguments='{ "term": "burger", "item_price_to": 10 }', name='search_item'), tool_calls=[ChatCompletionMessageToolCall(id='call__0_search_item_cmpl-768bb07b-6b14-47b1-8d91-3efa3c4de6d3', function=Function(arguments='{ "term": "burger", "item_price_to": 10 }', name='search_item'), type='function')]))
```

multiple functions:

```python
messages = [
    {
        "role": "system",
        "content": "Always follow user requests",
    },
    {
        "role": "user",
        "content": "search for items with the term burguer and a maximum price of 10 dollars. at the same time look for merchant named Burger King",
    },
]
```

Using the same tools definition as above, this gives:
```
Choice(finish_reason='tool_calls', index=0, logprobs=None, message=ChatCompletionMessage(content=None, role='assistant', function_call=None, tool_calls=[ChatCompletionMessageToolCall(id='call__0_search_item_cmpl-f924aecb-64e8-4ba0-8fc4-e9a620700064', function=Function(arguments='{ "term": "burger", "item_price_to": 10, "merchant_delivery_fee_to": 0 } ', name='search_item'), type='function'), ChatCompletionMessageToolCall(id='call__1_search_merchant_cmpl-98995992-1197-48c0-8740-d67ee7e12255', function=Function(arguments='{ "term": "Burger King" } ', name='search_merchant'), type='function')]))
```
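On the client side, each entry in `message.tool_calls` can be dispatched by function name after parsing its JSON arguments. A minimal sketch; the handler functions and the shortened call ids are placeholders, not part of this PR:

```python
import json

# Placeholder handlers standing in for real catalog lookups (assumptions).
def search_item(term, item_price_to=None, merchant_delivery_fee_to=None,
                merchant_payment_types=None):
    return f"items matching {term!r}"

def search_merchant(term):
    return f"merchants matching {term!r}"

HANDLERS = {"search_item": search_item, "search_merchant": search_merchant}

def dispatch(tool_calls):
    """Run every tool call and key each result by its call id."""
    results = {}
    for call in tool_calls:
        handler = HANDLERS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        results[call["id"]] = handler(**args)
    return results

# Shape mirrors message.tool_calls in the response above (ids shortened).
calls = [
    {"id": "call_0", "function": {
        "name": "search_item",
        "arguments": '{ "term": "burger", "item_price_to": 10, "merchant_delivery_fee_to": 0 }'}},
    {"id": "call_1", "function": {
        "name": "search_merchant",
        "arguments": '{ "term": "Burger King" }'}},
]
print(dispatch(calls))
```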

I ran the server with:

```
python3 -m llama_cpp.server --model 'llm_models/WizardLM-2-7B.Q8_0.gguf' --n_gpu_layers 100 --chat_format vicuna-function-calling --n_ctx 1024
```

@abetlen (Owner) commented Apr 17, 2024

Hey @themrzmaster, this is awesome, thank you for the contribution! Do you mind splitting this up into two separate PRs, one for the grammar updates and one for the chat format changes?

@themrzmaster (Contributor, Author)

Hi @abetlen! Done:
#1351
#1350
