JSON List Processor¶
Processes each item in a JSON array concurrently using user-provided Python logic. You supply a JSON list and a short processing script that receives each item as input_data; the node runs items in parallel with a configurable worker count and per-item timeout. It returns the aggregated results as a JSON string along with an execution log, a success count, and the total item count.

Usage¶
Use this node when you have a JSON array (e.g., an API response or a precomputed list) and need to transform or analyze each element in parallel. Typical workflow: provide the JSON list, write simple Python logic that reads input_data and assigns a result, set max_workers and timeout_seconds, then run. The node aggregates per-item outputs into a JSON array and logs per-item execution details.
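For example, a run over two small records could use inputs like the pair below. The field names are illustrative, and assigning the per-item output to a variable named result is an assumption based on the field descriptions in this page:

```python
# json_list (a string containing a JSON array):
# [{"id": 1, "title": "First"}, {"id": 2, "title": "Second"}]

# python_code (runs once per item; the node supplies input_data):
result = {
    "post_id": input_data["id"],                 # read fields from the current item
    "title_upper": input_data["title"].upper(),  # any per-item transformation
    "processed": True,
}
```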
Inputs¶
| Field | Required | Type | Description | Example | 
|---|---|---|---|---|
| json_list | True | STRING | A JSON-formatted list (array) of items to process. Each element is passed as input_data to your Python logic. | [{"id":1,"title":"First"},{"id":2,"title":"Second"}] | 
| python_code | True | STRING | Python logic that runs for each element. Use the variable input_data (and optionally input_index, all_inputs) to read the current item and set a result (e.g., a dict or value) to be collected. | Use input_data to compute and assign a result value for each item. | 
| max_workers | True | INT | Maximum number of concurrent workers used to process items in parallel. | 4 | 
| timeout_seconds | True | FLOAT | Per-item execution timeout in seconds. Items exceeding this timeout are marked as timed out. | 30.0 | 
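To make the meaning of max_workers and timeout_seconds concrete, here is a minimal standalone sketch of the same semantics using concurrent.futures. This is purely illustrative and is not the node's internal code:

```python
# Illustrative semantics only: max_workers caps concurrency,
# timeout_seconds bounds how long we wait on each item.
import concurrent.futures
import json

items = json.loads('[{"id": 1}, {"id": 2}]')
max_workers = 4
timeout_seconds = 30.0

def process(item):
    # stand-in for the user-provided python_code
    return {"post_id": item["id"], "processed": True}

results, success_count = [], 0
with concurrent.futures.ThreadPoolExecutor(max_workers=max_workers) as pool:
    futures = [pool.submit(process, item) for item in items]
    for index, future in enumerate(futures):
        try:
            results.append(future.result(timeout=timeout_seconds))
            success_count += 1
        except concurrent.futures.TimeoutError:
            print(f"Item {index}: TIMEOUT")  # timed-out items count as failures
        except Exception as exc:
            print(f"Item {index}: ERROR {exc}")

print(json.dumps(results), f"Success: {success_count}, Total: {len(items)}")
```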
Outputs¶
| Field | Type | Description | Example | 
|---|---|---|---|
| results_json | STRING | Aggregated results as a JSON string array. Each element corresponds to the output from processing one input item. | [{"post_id":1,"processed":true},{"post_id":2,"processed":true}] | 
| execution_log | STRING | Multiline log summarizing per-item execution outcomes, including success messages, warnings, and timeouts. | Item 0: SUCCESS \| Item 1: SUCCESS \| Processed 2 items. Success: 2, Failed: 0 | 
| success_count | INT | Number of items processed successfully. | 2 | 
| total_items | INT | Total number of items provided in the input JSON list. | 2 | 
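Because results_json is a JSON string rather than a parsed object, downstream code typically calls json.loads on it first. A minimal sketch using the example values above (variable names are illustrative):

```python
import json

# Example output values, mirroring the table above
results_json = '[{"post_id": 1, "processed": true}, {"post_id": 2, "processed": true}]'
success_count, total_items = 2, 2

results = json.loads(results_json)   # parse the JSON string into a Python list
if success_count < total_items:
    print(f"{total_items - success_count} item(s) failed; check execution_log")
for entry in results:
    print(entry["post_id"], entry["processed"])
```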
Important Notes¶
- Safety checks: The node validates code and rejects potentially dangerous patterns (e.g., attempts to run system commands or import sensitive modules).
- Execution variables: Your code can reference input_data, input_index, all_inputs, and logger; a short sketch follows this list.
- Per-item timeout: timeout_seconds applies to each item; timed-out items are logged and counted as failures.
- Parallelism: max_workers controls concurrency; higher values can improve throughput but increase resource usage.
- Serialization: Results are returned as a JSON string. Non-serializable objects are converted to strings when necessary.
- Empty or invalid input: An empty list returns no results; invalid JSON input returns an error in the log with zero successes.
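The following per-item sketch uses the documented variables and keeps the output JSON-friendly. It assumes the collected value is assigned to result, that all_inputs is the full input list, and that logger behaves like a standard logging.Logger:

```python
from datetime import datetime, timezone

# input_data, input_index, all_inputs, and logger are supplied by the node
logger.info("processing item %s of %s", input_index + 1, len(all_inputs))

result = {
    "index": input_index,
    "source": input_data,
    # datetime objects are not JSON-serializable, so store an ISO string instead
    "processed_at": datetime.now(timezone.utc).isoformat(),
}
```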
Troubleshooting¶
- Invalid JSON list: If you see 'Invalid JSON format' or 'Input must be a JSON list/array', ensure the json_list is a valid JSON array (starts with [ and contains properly formatted items); a quick pre-check is sketched after this list.
- Code rejected as unsafe: If you get 'Invalid Python code provided', remove disallowed patterns (e.g., system imports, exec/eval) and keep logic focused on transforming input_data.
- Syntax errors in code: Fix Python syntax issues (missing colons, indentation, etc.) before running.
- Per-item timeout: If logs show 'TIMEOUT', increase timeout_seconds or reduce the complexity of the per-item logic.
- Result not JSON-serializable: If the log warns about serialization, ensure your result is a JSON-friendly type (dict, list, string, number, boolean, or null).
- Unexpected None results: Ensure your code sets a result value for each item using the available variables.
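If you repeatedly hit the invalid-JSON errors, a quick standalone check of the string you plan to wire into json_list can save a run. A minimal sketch (the sample string is illustrative):

```python
import json

candidate = '[{"id": 1, "title": "First"}, {"id": 2, "title": "Second"}]'

try:
    parsed = json.loads(candidate)
except json.JSONDecodeError as exc:
    raise ValueError(f"json_list is not valid JSON: {exc}")
if not isinstance(parsed, list):
    raise ValueError("json_list must be a JSON array (it should start with '[')")
print(f"OK: {len(parsed)} item(s)")
```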