OpenAI Functions Agent Node
Use the OpenAI Functions Agent node to work with an OpenAI functions model. These are models that detect when a function should be called and respond with the inputs that should be passed to that function.
Refer to AI Agent for more information on the AI Agent node itself.
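To illustrate what function calling means, here is a minimal sketch that calls the OpenAI API directly with the official Python SDK. It is not part of the node configuration, and the get_order_status function, its parameters, and the model name are invented placeholders for the example.

```python
# Minimal sketch of the function-calling behaviour this agent relies on.
# The function definition below is hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",  # hypothetical function
        "description": "Look up the status of an order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Where is order 1234?"}],
    tools=tools,
)

# When the model decides a function should be called, it returns the
# function name and JSON arguments instead of a plain text answer.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    print(tool_calls[0].function.name)       # e.g. "get_order_status"
    print(tool_calls[0].function.arguments)  # e.g. '{"order_id": "1234"}'
```

The agent node wraps this loop for you: it sends the connected tools to the model, runs whichever function the model requests, and feeds the result back until the model produces a final answer.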
You can use this agent with the Chat Trigger node. Attach a memory sub-node so that users can have an ongoing conversation with multiple queries. Memory doesn't persist between sessions.
OpenAI Chat Model required
You must use the OpenAI Chat Model with this agent.
Node parameters
Configure the OpenAI Functions Agent using the following parameters.
Prompt
Select how you want the node to construct the prompt (also known as the user's query or input from the chat).
Choose from:
- Take from previous node automatically: If you select this option, the node expects an input from a previous node called `chatInput`.
- Define below: If you select this option, enter the text you want to use as the prompt. You can use expressions here for dynamic content, as in the example below.
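For example, a Define below prompt that reuses the incoming chat message could look like the following, assuming the standard expression syntax where `{{ ... }}` wraps an expression and `$json` refers to the incoming item:

```
Answer the user's question as briefly as possible: {{ $json.chatInput }}
```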
Require Specific Output Format
This parameter controls whether you want the node to require a specific output format. When turned on, Localmind Automate prompts you to connect an output parser to the node.
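For illustration, a structured output parser is typically configured with an example or schema describing the shape the agent's answer must take. The JSON Schema below is hypothetical:

```json
{
  "type": "object",
  "properties": {
    "answer": { "type": "string" },
    "sources": { "type": "array", "items": { "type": "string" } }
  },
  "required": ["answer"]
}
```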
Node options
Refine the OpenAI Functions Agent node's behavior with these options:
System Message
Enter a message to send to the agent before the conversation starts.
Use this option to guide the agent's decision-making.
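For example, a system message for a support workflow might read as follows; the wording and the function name are hypothetical:

```
You are a customer support assistant. When the user asks about an order,
call the get_order_status function before answering. Keep replies short
and never invent order details.
```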
Max Iterations
Enter the number of times the model should run to try to generate a good answer from the user's prompt. Defaults to 10.
Return Intermediate Steps
Select whether to include intermediate steps the agent took in the final output (turned on) or not (turned off).
This could be useful for further refining the agent's behavior based on the steps it took.
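When turned on, the intermediate steps appear in the node's output next to the final answer. The sketch below assumes the LangChain intermediate-steps format (tool, tool input, observation); treat the exact field names and values as an assumption:

```json
{
  "output": "Order 1234 is out for delivery.",
  "intermediateSteps": [
    {
      "action": {
        "tool": "get_order_status",
        "toolInput": { "order_id": "1234" }
      },
      "observation": "{ \"status\": \"out_for_delivery\" }"
    }
  ]
}
```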