JSON List Processor¶
Processes each element of a JSON list concurrently using user-provided Python logic. It validates code for safety, executes with a per-item timeout, captures outputs/warnings, and aggregates results into a JSON string along with an execution log. Designed for scalable per-item transformations with configurable worker count.

Usage¶
Use this node when you have a JSON array (e.g., from an API) and you need to run the same Python transformation on each item in parallel. Typical workflow: fetch or build a JSON list, pass it here with concise Python code that uses 'input_data' to produce a result per item, then consume the aggregated 'results_json' and review 'execution_log'.
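Outside the node, the per-item contract can be sketched like this. Here `transform` stands in for the body you would supply as `python_code` (the function name and fields like `title_upper` are illustrative, not part of the node's API): it reads `input_data` and returns a JSON-serializable dict per item.

```python
import json

# 'transform' plays the role of the body supplied in 'python_code':
# it receives the current item, its 0-based index, and the full list.
def transform(input_data, input_index, all_inputs):
    return {
        "post_id": input_data["id"],
        "title_upper": input_data["title"].upper(),
        "input_index": input_index,  # handy for re-sorting downstream
    }

items = json.loads('[{"id":1,"title":"Example"},{"id":2,"title":"Another"}]')
results_json = json.dumps([transform(item, i, items) for i, item in enumerate(items)])
```

Returning `input_index` in each result makes it easy to restore the original order downstream, since results are collected as items complete.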
Inputs¶
| Field | Required | Type | Description | Example |
|---|---|---|---|---|
| json_list | True | STRING | A JSON-encoded array of items to process. Must be valid JSON and an array type. | [{"id":1,"title":"Example"},{"id":2,"title":"Another"}] |
| python_code | True | STRING | Python code executed for each item. Use 'input_data' (current item), 'input_index' (0-based), 'all_inputs' (the full list), and 'logger' for logging. Must return a value (e.g., dict) per item. | Python code that reads 'input_data' and returns a transformed dictionary |
| max_workers | True | INT | Maximum number of concurrent workers used to process items in parallel. | 4 |
| timeout_seconds | True | FLOAT | Timeout for each item execution (seconds). Items exceeding this timeout are marked as timed out. | 30.0 |
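To show how `max_workers` and `timeout_seconds` fit together, here is a hypothetical sketch of the fan-out loop; the node's actual executor and timeout enforcement may differ in detail, but the shape (thread pool, per-item callable, aggregated log) is the same.

```python
import json
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_items(items, fn, max_workers=4, timeout_seconds=30.0):
    """Simplified stand-in for the node: fan items out to a worker pool."""
    results, log = [], []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(fn, item, i, items): i for i, item in enumerate(items)}
        # Overall wait is bounded by timeout_seconds x number of items.
        for fut in as_completed(futures, timeout=timeout_seconds * len(items)):
            i = futures[fut]
            try:
                results.append(fut.result())
                log.append(f"Item {i}: SUCCESS")
            except Exception as exc:
                log.append(f"Item {i}: ERROR: {exc}")
    ok = sum(1 for line in log if line.endswith("SUCCESS"))
    log.append(f"Processed {len(items)} items. Success: {ok}, Failed: {len(items) - ok}")
    return json.dumps(results, default=str), " | ".join(log)

results_json, execution_log = run_items(
    [1, 2, 3], lambda x, i, all_inputs: {"value": 2 * x}, max_workers=2, timeout_seconds=5.0
)
```

Note that `as_completed` yields futures in completion order, which is why results may not match the input order.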
Outputs¶
| Field | Type | Description | Example |
|---|---|---|---|
| results_json | STRING | JSON-encoded array of results returned by your code for each processed item. Non-serializable results are converted to strings with a warning noted in the log. | [{"post_id":1,"processed":true},{"post_id":2,"processed":true}] |
| execution_log | STRING | Concatenated log lines for each item (success, warnings, errors, or timeouts) plus a summary. | Item 0: SUCCESS \| Item 1: SUCCESS \| Processed 2 items. Success: 2, Failed: 0 |
| success_count | INT | Number of items that executed successfully. | 2 |
| total_items | INT | Total number of items in the input list. | 2 |
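A typical downstream consumer parses `results_json` and cross-checks it against the count outputs. This sketch uses the example values from the table above; your actual results will differ.

```python
import json

# Parse the aggregated results and derive the same counts the node reports.
results_json = '[{"post_id":1,"processed":true},{"post_id":2,"processed":true}]'
results = json.loads(results_json)
success_count = sum(1 for r in results if r.get("processed"))
total_items = len(results)
```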
Important Notes¶
- Code safety restrictions apply: dangerous patterns such as imports of system modules, exec/eval, and file I/O are blocked. Code containing a blocked pattern or a syntax error fails validation.
- The node runs items in parallel. Result ordering follows completion order, not necessarily the original input order.
- The timeout applies per item; in the worst case, collecting all results can take up to timeout_seconds × number_of_items.
- If results contain non-JSON-serializable types, they are converted to strings and a warning is appended to the execution log.
- Provide a valid JSON array in 'json_list'. JSON objects, bare strings, and other non-array values are rejected.
- Available variables inside your code: 'input_data', 'input_index', 'all_inputs', and 'logger'. Ensure your code returns a result for each item.
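The serialization fallback noted above behaves like `json.dumps(..., default=str)`: values with no JSON representation, such as a `datetime`, are coerced to strings (this sketch mirrors that fallback; the node additionally appends a warning to the execution log).

```python
import json
from datetime import datetime

# A datetime is not JSON-serializable; 'default=str' coerces it to its
# string form, analogous to the node's fallback for such values.
result = {"id": 1, "created": datetime(2024, 1, 1, 12, 0)}
coerced = json.dumps(result, default=str)
```

Returning plain dict/list/str/int/float/bool values avoids this coercion entirely.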
Troubleshooting¶
- Error: Invalid JSON format — Ensure 'json_list' is valid JSON and is an array. Validate the JSON with a linter if needed.
- Error: Invalid Python code provided — Remove restricted operations (e.g., imports of os/sys/subprocess, exec/eval, file I/O) and fix syntax errors. Keep code minimal and self-contained.
- Items timing out — Increase 'timeout_seconds' or reduce workload per item. Also consider lowering 'max_workers' if the environment is resource-constrained.
- Empty list provided — Supply a non-empty JSON array; otherwise, the node returns empty results with an informational log.
- Unexpected non-serializable results — Ensure your code returns JSON-serializable values (e.g., dict/list/str/int/float/bool) to avoid string coercion and warnings.
- Results appear out of order — This is expected due to parallel execution. If order matters, include 'input_index' in your result and sort downstream.
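If each result carries its `input_index` (as suggested above), restoring the original order downstream is a one-line sort:

```python
import json

# Results arrive in completion order; sort by the embedded index to
# recover the original input order.
results = json.loads(
    '[{"input_index":2,"v":"c"},{"input_index":0,"v":"a"},{"input_index":1,"v":"b"}]'
)
ordered = sorted(results, key=lambda r: r["input_index"])
```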