Guides
This guidebook is a collection of examples and guides for building simple real-world applications with the Functionary API.
To run these examples, you'll need a machine running the vLLM server with one of the local Functionary models. You will also need to install the official OpenAI Python package, which lets you make API calls to the vLLM server the same way you would call any OpenAI GPT model. Alternatively, you can call the server directly with cURL HTTP requests.
All the code examples are written in Python, though the concepts can be applied in any language.
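As a quick orientation before the guides, here is a minimal sketch of the setup described above. It assumes the vLLM server is listening at `http://localhost:8000/v1` and serving a Functionary model under the name `meetkai/functionary-small-v2.4`; adjust the base URL, model name, and tool definition to match your own deployment.

```python
from openai import OpenAI

# Point the OpenAI client at the local vLLM server instead of api.openai.com.
# The base_url, api_key, and model name below are placeholders for your setup;
# vLLM does not validate the API key by default.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="functionary",
)

response = client.chat.completions.create(
    model="meetkai/functionary-small-v2.4",  # whichever Functionary model you are serving
    messages=[{"role": "user", "content": "What is the weather in Istanbul?"}],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_current_weather",
                "description": "Get the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"}
                    },
                    "required": ["location"],
                },
            },
        }
    ],
)

# The assistant message may contain a tool call rather than plain text.
print(response.choices[0].message)
```

The guides that follow build on this same pattern, so once this call works against your server you should be able to run the rest of the examples with only minor changes.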