
Overview

AI nodes form the intelligence layer of Splox workflows, enabling LLM-powered reasoning and tool-based actions.

  • LLM Node: Execute AI model completions with tool calling and memory
  • Tool Node: Execute operations, call APIs, and integrate with external services

LLM Node

Purpose: Execute AI model completions with tool calling, streaming, and memory.
The LLM node is the core of agentic workflows, enabling AI models to generate responses, make decisions, and use tools.
LLM node visual representation
Node Handles: The LLM node has specialized input and output handles for different workflow paths:
Left Side - Main Input
Receives data from previous nodes in the workflow. This is the primary execution trigger and data source for the LLM. Accepts:
  • Workflow context
  • User input/messages
  • Previous node outputs
  • Template variables
Key Features:
  • Multi-Provider Support: OpenAI, Anthropic, OpenRouter, custom providers
  • Model Selection: Choose from hundreds of models
  • System Prompts: Define AI behavior with template variables
  • Streaming: Real-time response generation
  • Temperature Control: Adjust creativity vs. consistency
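As a rough sketch, the features above might combine in a node configuration like the following. The field names and the `render_prompt` helper are illustrative assumptions, not Splox's actual schema:

```python
# Hypothetical sketch of an LLM node configuration; field names
# are illustrative, not Splox's actual schema.
llm_node_config = {
    "provider": "openai",      # multi-provider: openai, anthropic, openrouter, custom
    "model": "gpt-4o",         # any model exposed by the provider
    "system_prompt": "You are a support agent for {{company_name}}.",  # template variable
    "temperature": 0.2,        # lower = more consistent, higher = more creative
    "streaming": True,         # stream tokens as they are generated
}

def render_prompt(template: str, variables: dict) -> str:
    """Fill {{name}} template variables in a system prompt."""
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", str(value))
    return template

print(render_prompt(llm_node_config["system_prompt"], {"company_name": "Acme"}))
# prints: You are a support agent for Acme.
```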
Example Use Cases:
  • Customer support agents with CRM tools
  • Research assistants with web search
  • Code generation with execution sandboxes
  • Content creation with multi-step refinement

Tool Node

Purpose: Execute operations, call APIs, run code, and integrate with external services.
Tool nodes enable LLMs to take actions in the real world.
Tool node visual representation
Node Handles:
Left Side - Tool Call Input
Receives tool call requests from LLM nodes or other workflow nodes. Accepts:
  • Tool call parameters from LLM
  • Direct invocation data
  • Workflow context variables
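A tool-call request arriving at this input might look roughly like the following. The shape is an assumption for illustration, not Splox's actual wire format:

```python
# Hypothetical shape of a tool-call request arriving at the tool
# node's left-side input (illustrative, not Splox's wire format).
tool_call = {
    "tool_name": "send_email",      # which tool the LLM selected
    "arguments": {                  # parameters produced by the LLM
        "to": "user@example.com",
        "subject": "Your ticket was created",
    },
    "context": {                    # workflow context variables
        "workflow_id": "wf_123",
        "node_id": "tool_1",
    },
}
assert set(tool_call) == {"tool_name", "arguments", "context"}
```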
Tool Types:
Execute platform-specific operations (API calls, database queries, etc.). Features:
  • OAuth integration support
  • Dynamic credential management
  • Input/output schema validation
  • Rate limiting and retry logic
Example: Send email, create ticket, query database
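The retry logic mentioned above can be sketched as simple exponential backoff. This is a generic illustration, not Splox's implementation:

```python
import time

def with_retries(call, max_attempts=3, base_delay=0.5):
    """Retry a flaky operation with exponential backoff.

    Generic sketch of the retry behavior described above,
    not Splox's actual implementation.
    """
    for attempt in range(max_attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise                      # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...

# Usage: a flaky call that fails twice, then succeeds on the third attempt.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

assert with_retries(flaky, base_delay=0.01) == "ok"
assert attempts["n"] == 3
```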

Execute custom JavaScript/Python code with user-defined logic. Features:
  • Full language support
  • Access to workflow context
  • Timeout protection
  • Error handling and logging
Example: Data transformation, custom validation, calculations
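A custom code tool for data transformation and validation might look like this. The `run(context)` entry point and the context shape are assumptions for illustration, not Splox's actual contract:

```python
# Hypothetical custom code tool: normalize raw lead records.
# The run(context) entry point and context shape are illustrative
# assumptions, not Splox's actual contract.
def run(context: dict) -> dict:
    leads = context["input"]["leads"]
    normalized = [
        {
            "email": lead["email"].strip().lower(),
            "name": lead.get("name", "").title(),
        }
        for lead in leads
        if "@" in lead.get("email", "")   # custom validation: drop invalid rows
    ]
    return {"leads": normalized, "dropped": len(leads) - len(normalized)}

result = run({"input": {"leads": [
    {"email": "  Ada@Example.COM ", "name": "ada lovelace"},
    {"email": "not-an-email", "name": "bad row"},
]}})
assert result == {
    "leads": [{"email": "ada@example.com", "name": "Ada Lovelace"}],
    "dropped": 1,
}
```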

Model Context Protocol servers for standardized tool interfaces. Features:
  • Server-side tool execution
  • Stateful connections
  • Resource access patterns
  • Prompt injection protection
Example: File system access, terminal execution, API wrappers

Trigger other workflows as sub-tasks. Features:
  • Specify target workflow and start node
  • Pass input data programmatically
  • Wait for completion or run async
  • Access sub-workflow outputs
Example: Multi-agent systems, modular workflows, parallel processing
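The parallel-processing use case can be sketched as fanning out one sub-workflow per input item. Here `run_workflow` is a hypothetical stand-in, not a Splox API:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for triggering a workflow as a sub-task;
# not a real Splox API.
def run_workflow(workflow_id: str, start_node: str, input_data: dict) -> dict:
    return {"workflow": workflow_id, "result": input_data["item"].upper()}

items = ["alpha", "beta", "gamma"]

# Parallel processing: run one sub-workflow per item and wait for all outputs.
with ThreadPoolExecutor() as pool:
    outputs = list(pool.map(
        lambda item: run_workflow("wf_process", "start", {"item": item}),
        items,
    ))

assert [o["result"] for o in outputs] == ["ALPHA", "BETA", "GAMMA"]
```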

Execute code in isolated E2B sandboxes. Features:
  • User-defined sandbox templates
  • Persistent sandbox instances
  • File system access
  • Package installation
  • Automatic inactivity pausing
Example: Code execution, data science pipelines, test environments

LLM Tool Integration

When connected to an LLM node, tools become automatically available to the AI model:

Tool Calling Behavior

Tool executes independently - no feedback loop
When an LLM node is in the main workflow (not in a subflow), tool execution is one-directional:
  1. LLM analyzes the request
  2. Decides which tool to use (if any)
  3. Calls the selected tool
  4. Tool executes and completes
  5. Workflow continues to next node (tool results don’t return to LLM)
The LLM doesn’t receive tool results back - the tool simply executes as a separate step in the workflow.
Example: LLM → Search Tool → (results go to next node) → Continue workflow
The ability to call tools multiple times is enabled by the subflow loop, not the LLM node itself. Outside subflows, tools execute independently without returning results to the LLM.
Inside a subflow, by contrast, the LLM decides which tools to use and when to use them, and can call tools multiple times before generating a final response.
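The two behaviors can be sketched side by side. This is a conceptual illustration with hypothetical helpers, not Splox engine code:

```python
# Conceptual sketch of the two tool-calling modes described above;
# all names are hypothetical, not Splox's engine code.

def search_tool(query: str) -> str:
    return f"results for {query!r}"

def main_workflow_step(llm_decision: dict) -> str:
    """Main workflow: one-directional. The tool runs, its output goes
    to the next node, and the LLM never sees the result."""
    tool_output = search_tool(llm_decision["arguments"]["query"])
    return tool_output  # handed to the next node, not back to the LLM

def subflow_loop(queries: list) -> str:
    """Subflow: the loop feeds tool results back, so the model can call
    tools repeatedly before producing a final answer."""
    transcript = []
    for q in queries:                  # each iteration = one tool call + feedback
        transcript.append(search_tool(q))
    return "final answer based on: " + "; ".join(transcript)
```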

What’s Next?