To leverage AI capabilities within your Sparkflows workflows, you can use the Interactive LLM Agent node. This node enables direct interaction with a configured Large Language Model (LLM) as part of a workflow.
Step-by-Step Guide
- Add the Interactive LLM Agent Node
  - From the Node Library, drag and drop the Interactive LLM Agent onto the workflow canvas.
- Select GenAI Connection
  - Open the node configuration.
  - In the Select Connection field, choose your GenAI connection.
  - If the connection has already been configured via Admin Panel → Administration, it will automatically be available in this dropdown.
- Configure Metadata / Content Column
  - Specify the Metadata Column that contains the content you want to pass as input to the LLM.
  - This column typically holds text such as documents, descriptions, logs, or other unstructured content the model should process.
  - During execution, the selected column’s values are sent to the model along with the prompt.
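Conceptually, the node pairs each row’s metadata-column value with your prompt before calling the model. The sketch below illustrates that pairing only; the function name, template layout, and `content` column name are assumptions for illustration, not Sparkflows internals.

```python
# Illustrative sketch only — Sparkflows performs this step internally.
# The "content" column name and the prompt/content layout are assumptions.

def build_llm_inputs(rows, prompt, content_column="content"):
    """Pair the user prompt with each row's content-column value."""
    inputs = []
    for row in rows:
        # The selected column's value accompanies the prompt for every row.
        inputs.append(f"{prompt}\n\nContent:\n{row[content_column]}")
    return inputs

rows = [
    {"id": 1, "content": "Server log: connection timeout at 02:14 UTC."},
    {"id": 2, "content": "Release notes: added dark mode and bug fixes."},
]
inputs = build_llm_inputs(rows, "Summarize the following text in one sentence.")
```

Each element of `inputs` is one model request: the shared instruction followed by that row’s content, so the LLM processes every row independently.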
- Define the Prompt
  - In the Prompt section, enter the instruction or query you want the LLM to act on.
  - The prompt can reference the content from the metadata column to guide the model’s response.
  - Keep prompts clear and task-oriented to ensure consistent, accurate outputs.
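To make “clear and task-oriented” concrete, here is a contrast between a vague prompt and a sharper alternative. Both prompts are hypothetical examples, not values from the source documentation.

```python
# Hypothetical prompt examples — adapt to your own data and task.

# Vague: the model must guess the task, output format, and scope.
vague_prompt = "Tell me about this."

# Task-oriented: states the task, the allowed outputs, and the format,
# so responses stay consistent across rows.
clear_prompt = (
    "Classify the following support ticket as 'billing', 'technical', "
    "or 'other'. Respond with only the category name."
)
```

A constrained prompt like `clear_prompt` makes row-by-row outputs easier to parse and compare downstream in the workflow.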