
Automatic Tool Execution

Introduction

Functionary provides a model that can make intelligent decisions about which functions/tools to use. However, it does not actually execute those functions/tools. To take this a step further, you can have the functions/tools executed automatically once Functionary decides to call them. In this guide, you will learn how to do that with chatlab. The code used in this tutorial is provided in this GitHub repository.

Prerequisite

  • An understanding of how to run the Functionary vLLM server and make API requests to it
  • Basic skills in Python programming and interacting with APIs

What you'll learn

  • How to use chatlab
  • How to configure and integrate chatlab directly with Functionary
  • How to get a model response grounded in function outputs end-to-end with Functionary and chatlab

What you'll need

  • A machine capable of running inference with a local Functionary-v1.4 model (at least 24GB GPU VRAM)
  • A machine with Functionary's dependencies installed

Setup and Requirements

Functionary

Start a Functionary vLLM server with the functionary-v1.4 model:

Python
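The exact command from the original guide is not reproduced here; the following is a minimal sketch based on the Functionary repository's vLLM server script, run from the repository root. The model identifier "meetkai/functionary-7b-v1.4" and the default address http://localhost:8000/v1 are assumptions; adjust them to your setup.

# Launch an OpenAI-compatible vLLM server for the v1.4 model (assumed script name and model id)
python3 server_vllm.py --model "meetkai/functionary-7b-v1.4" --max-model-len 4096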


The functionary-v1.4 model is trained with a 4K-token context window, so pass in --max-model-len of 4096.

Chatlab

Install the chatlab Python package. This tutorial uses version 1.3.0.

Shell
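For example:

pip install chatlab==1.3.0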


Note on Requirements

Please note that chatlab's Chat class currently does not support parallel function calling. As a result, this tutorial is compatible with Functionary v1.4 only and may not work correctly with Functionary v2.* models.

Define Python Function

Let's assume that you are a car dealer at the Functionary car dealership. You would like to create a chatbot that can help you or your customers quickly get the prices of the car models available at the dealership. You will create this Python function:

Python
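A sketch of such a function; the function name get_car_price matches what the rest of the guide refers to, while the specific car models and prices are illustrative placeholders.

def get_car_price(car_name: str):
    """Get the price of a given car model available at the dealership.

    :param car_name: name of the car model to look up
    """
    # Illustrative price list; replace with your dealership's real inventory
    car_price = {
        "rhino": {"price": "$20000"},
        "elephant": {"price": "$25000"},
    }
    # Case-insensitive lookup; fall back to "unknown" if the model is not listed
    for key, value in car_price.items():
        if key in car_name.lower():
            return {"price": value["price"]}
    return {"price": "unknown"}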


This function looks up a dictionary mapping car model names to their respective prices. It returns an "unknown" value if the car model name is not found.

Before you begin

Before you begin, let's imagine that packages like chatlab do not exist, and walk through what you would have to do manually.

Manually create LLM Function

Now, a customer approaches your car dealer chatbot asking about the price of the car model "Rhino". First, you would need to manually convert the Python function into the tool dictionary required by the OpenAI API:

Python
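A sketch of that definition, following the OpenAI tools schema; the description strings are illustrative, and older Functionary servers may instead expect the same schema under the legacy functions parameter.

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_car_price",
            "description": "Get the price of a given car model at the dealership",
            "parameters": {
                "type": "object",
                "properties": {
                    "car_name": {
                        "type": "string",
                        "description": "Name of the car model to look up",
                    }
                },
                "required": ["car_name"],
            },
        },
    }
]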


Then, you would perform inference with Functionary:

Python
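A sketch of the request using the openai Python client, assuming the server from the setup step is listening on http://localhost:8000/v1 (a local server accepts any placeholder API key):

from openai import OpenAI

# Point the OpenAI client at the local Functionary vLLM server (assumed address)
client = OpenAI(base_url="http://localhost:8000/v1", api_key="functionary")

messages = [
    {"role": "user", "content": "What is the price of the car named 'Rhino'?"}
]

response = client.chat.completions.create(
    model="meetkai/functionary-7b-v1.4",  # assumed model id; must match the served model
    messages=messages,
    tools=tools,
    tool_choice="auto",
)
print(response.choices[0].message)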


This yields the following response:

Python
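The exact output is not reproduced here; with the tools format, the assistant message comes back with a tool call of roughly this shape (the id and formatting are illustrative):

ChatCompletionMessage(
    role="assistant",
    content=None,
    tool_calls=[
        ChatCompletionMessageToolCall(
            id="call_abc123",  # illustrative id
            type="function",
            function=Function(name="get_car_price", arguments='{"car_name": "Rhino"}'),
        )
    ],
)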


Manually execute function

As we can see above, Functionary makes the correct decision to call the `get_car_price` function with `Rhino` as input. However, we need to manually execute this function and append its output to the conversation so that Functionary can generate an appropriate response back to the customer.

Python
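A sketch of that manual loop, continuing from the previous snippets; the message-role conventions follow the OpenAI tools format, so adjust them if your server uses the legacy function-call format.

import json

# Extract the tool call that Functionary decided to make
assistant_message = response.choices[0].message
tool_call = assistant_message.tool_calls[0]
arguments = json.loads(tool_call.function.arguments)

# Actually execute the Python function with the arguments Functionary provided
result = get_car_price(**arguments)

# Append the assistant's tool call and the tool's output to the conversation
messages.append(assistant_message)
messages.append(
    {
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": json.dumps(result),
    }
)

# Ask Functionary for a final answer grounded in the function output
final_response = client.chat.completions.create(
    model="meetkai/functionary-7b-v1.4",
    messages=messages,
    tools=tools,
)
print(final_response.choices[0].message.content)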


This yields the final response:

Python
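The exact wording is not reproduced here; the final assistant message is a natural-language answer grounded in the function output, roughly of this shape (the price shown is illustrative and depends on your price data):

ChatCompletionMessage(
    role="assistant",
    content="The price of the Rhino is $20,000.",  # illustrative content
    tool_calls=None,
)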


This simple example shows that Functionary can:

  • Intelligently decide on the correct function to use given the conversation
  • Analyze the function output and generate response grounded in the output

However, as you can see, this requires manually creating the function configuration and executing each called function until a model response is generated. This is where automatic tool execution becomes helpful.

Call real Python functions automatically

Now, we show how Functionary can be further enhanced with automatic execution of Python functions. To call the real Python function, get its result, and use that result in the response, you can use chatlab. The following example uses chatlab==1.3.0:

Python
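A sketch of the chatlab setup; it assumes chatlab 1.3.0's Chat class accepts model, base_url, and api_key arguments for pointing at a local OpenAI-compatible server, and that Python functions are registered with chat.register. Check the chatlab documentation for the exact constructor signature.

from chatlab import Chat

# Point chatlab at the local Functionary server (constructor arguments are assumptions)
chat = Chat(
    model="meetkai/functionary-7b-v1.4",
    base_url="http://localhost:8000/v1",
    api_key="functionary",
)

# Register the real Python function so chatlab can execute it automatically
chat.register(get_car_price)

# chatlab is async; in a notebook you can await the chat call directly
await chat("What is the price of the car named 'Rhino'?")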


The output will look like this:

Text


Now, Functionary will be called iteratively, and chatlab will automatically execute any function Functionary calls until no more functions need to be called and a model response is generated for the customer. This is all done with a single command:

Python
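That single command is the awaited chat call from the sketch above:

await chat("What is the price of the car named 'Rhino'?")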


Congratulations

Congratulations on completing this tutorial. We hope you have learned how to further harness the power of Functionary by combining it with automatic tool-execution libraries like chatlab. Feel free to try out the example notebooks in the GitHub repository and explore Functionary's function-calling capabilities with your own functions.

Summary

  • Performing inference using Functionary
  • Experiencing how Functionary calls functions and generates model responses
  • Learning how to execute functions called by Functionary automatically with chatlab