# Anthropic LLM
Runs prompts against Anthropic Claude models via the Salt LLM service. It supports dynamic prompt composition, model auto-discovery with graceful fallback, temperature control, and max token limits. Designed for general text generation tasks like summarization, Q&A, drafting, and analysis.

## Usage
Use this node when you need text responses from Anthropic models. Provide a system prompt to control tone and rules, a main prompt (which can include placeholders), and optionally connect up to four string inputs (e.g., retrieved knowledgebase content) to dynamically fill the prompt. Select a model from the dropdown; if the live model list isn't available, a curated fallback list is used. The node returns a single string with the model's response.
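The placeholder mechanics described above can be sketched as follows. This is a minimal illustration, not the node's actual implementation; `fill_prompt` is a hypothetical helper:

```python
import re

def fill_prompt(prompt: str, inputs: dict) -> str:
    """Replace {{input_1}}..{{input_4}} with connected input values.

    Unconnected placeholders are replaced with an empty string so the
    prompt still renders.
    """
    def substitute(match: re.Match) -> str:
        return str(inputs.get(match.group(1), ""))

    return re.sub(r"\{\{(input_[1-4])\}\}", substitute, prompt)

filled = fill_prompt(
    "Summarize the following research notes in 5 bullet points: {{input_1}}",
    {"input_1": "Notes on model auto-discovery and fallback."},
)
```

In this sketch, placeholders for inputs that are not connected resolve to empty strings rather than raising an error, which matches the optional nature of `input_1` through `input_4`.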
## Inputs
| Field | Required | Type | Description | Example | 
|---|---|---|---|---|
| model | True | STRING (select) | Anthropic model to use. The list is fetched from the LLM service; if unavailable, a fallback list of Claude models is provided. | claude-3-5-sonnet-20241022 | 
| system_prompt | True | STRING | Instructions and rules for the assistant’s behavior, tone, and format. | You are a helpful assistant. Answer concisely and cite sources when possible. | 
| prompt | True | DYNAMIC_STRING | User-facing request. Supports placeholders for connected inputs using {{input_1}} through {{input_4}}. | Summarize the following research notes in 5 bullet points: {{input_1}} | 
| temperature | True | FLOAT | Controls creativity and variability. Lower = more deterministic; higher = more diverse. | 0.5 | 
| max_tokens | True | INT | Maximum tokens to generate. Influences response length. | 1024 | 
| input_1 | False | STRING | Optional contextual input. Refer to it in the prompt as {{input_1}}. | Long-form notes or retrieved context text | 
| input_2 | False | STRING | Optional contextual input. Refer to it in the prompt as {{input_2}}. | Additional document text | 
| input_3 | False | STRING | Optional contextual input. Refer to it in the prompt as {{input_3}}. | Knowledgebase snippet | 
| input_4 | False | STRING | Optional contextual input. Refer to it in the prompt as {{input_4}}. | User profile or constraints | 
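Taken together, the inputs above map naturally onto an Anthropic Messages API style request. The Salt LLM service's actual wire format is not documented here, so the payload shape below (and the `build_request` helper) is an assumption modeled on Anthropic's public API:

```python
def build_request(model: str, system_prompt: str, prompt: str,
                  temperature: float, max_tokens: int) -> dict:
    """Assemble a Messages-API-style payload from the node's inputs.

    Illustrative only: the Salt LLM service may wrap or transform this
    before it reaches Anthropic.
    """
    return {
        "model": model,
        "system": system_prompt,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

payload = build_request(
    "claude-3-5-sonnet-20241022",
    "You are a helpful assistant. Answer concisely and cite sources when possible.",
    "Summarize the following research notes in 5 bullet points: {{input_1}}",
    0.5,
    1024,
)
```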
## Outputs
| Field | Type | Description | Example | 
|---|---|---|---|
| Output | STRING | The generated text from the selected Anthropic model. | Here is a concise, 5-bullet summary of the provided notes... | 
## Important Notes
- Model selection: The node queries the live Anthropic model list; if unavailable, it falls back to known Claude models (e.g., claude-3-7-sonnet-20250219, claude-3-5-sonnet-20241022, claude-3-haiku-20240307).
- Dynamic placeholders: Use {{input_1}} to {{input_4}} inside the prompt to inject optional inputs.
- Response length: max_tokens caps the completion length; the service may clamp values that exceed the selected model's limit.
- Timeouts: The node uses a request timeout (default 90s). Long-running requests may need a higher timeout configured upstream.
- Required configuration: The Salt LLM service must be configured with valid Anthropic API credentials for requests to succeed.
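The auto-discovery-with-fallback behavior noted above can be sketched like this. The `fetch_live_models` callable is a hypothetical stand-in for the LLM service query, and the fallback list mirrors the models named in the notes:

```python
# Curated fallback, used when the live model list cannot be fetched.
FALLBACK_MODELS = [
    "claude-3-7-sonnet-20250219",
    "claude-3-5-sonnet-20241022",
    "claude-3-haiku-20240307",
]

def list_models(fetch_live_models) -> list:
    """Return live model IDs, falling back to the curated list on failure
    or on an empty response."""
    try:
        models = fetch_live_models()
    except Exception:
        return FALLBACK_MODELS
    return models if models else FALLBACK_MODELS
```

The key design point is graceful degradation: a service outage changes which models appear in the dropdown but never prevents the node from loading.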
## Troubleshooting
- Model ID not found: Select a model from the dropdown. If you pasted a name, ensure it matches an available or fallback model.
- Empty or very short output: Increase max_tokens or reduce temperature; verify your prompt and system prompt are not empty and that placeholders are populated.
- Request timed out: Reduce prompt/context size, lower max_tokens, or adjust the service timeout configuration.
- Inaccurate or off-topic responses: Refine the system_prompt with clearer rules and add more specific examples/context via input_1 through input_4.
- Service error from LLM: Verify Anthropic credentials and connectivity in the Salt LLM service; try another model from the list.