JSON List Processor¶
Processes each item in a JSON array using user-provided Python logic, executing items concurrently with a configurable worker pool. Captures execution output and errors per item, aggregates results into a JSON string, and returns a detailed execution log, success count, and total items.
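
The node's exact internals are not documented here, but the behavior described above maps onto a standard fan-out/collect pattern over a thread pool. The sketch below is illustrative only: `process_item` stands in for your python_code, and the log format is simplified.

```python
# Illustrative sketch of the fan-out/collect pattern -- not the node's actual code.
import json
from concurrent.futures import ThreadPoolExecutor

def process_item(item):
    # Stand-in for the user-provided python_code.
    return {"id": item["id"], "processed": True}

def run(json_list: str, max_workers: int = 4, timeout_seconds: float = 30.0):
    items = json.loads(json_list)  # must parse to a JSON array
    results, log, success_count = [], [], 0
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(process_item, item) for item in items]
        for i, future in enumerate(futures):
            try:
                results.append(future.result(timeout=timeout_seconds))
                log.append(f"Item {i}: SUCCESS")
                success_count += 1
            except Exception as exc:  # per-item error or timeout
                results.append(None)
                log.append(f"Item {i}: FAILED ({exc})")
    return json.dumps(results), "\n".join(log), success_count, len(items)
```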

Usage¶
Use this node when you have a JSON list and need to apply the same transformation to each element in parallel. Typical uses are data-cleaning or enrichment workflows where each list item (e.g., records from an API) must be processed with custom logic and summarized into a single result set.
Inputs¶
| Field | Required | Type | Description | Example |
|---|---|---|---|---|
| json_list | True | STRING | A JSON-encoded array to process. Each array element is passed to your Python code as input_data. | [{"id":1,"title":"Item A"},{"id":2,"title":"Item B"}] |
| python_code | True | STRING | Python logic applied to each item. Access the current item via input_data and return a value for aggregation. Variables available: input_data (current item), input_index (0-based index), all_inputs (full list), logger. | A short script that reads fields from input_data and returns a dictionary with computed values (see the sketch after this table). |
| max_workers | True | INT | Maximum number of concurrent worker threads used to process items in parallel. | 4 |
| timeout_seconds | True | FLOAT | Per-item timeout in seconds. If a single item exceeds this duration, it is marked as a timeout and processing continues for others. | 30.0 |
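
To make the python_code example concrete, here is a short script of the kind that field expects, matching the json_list example above. It uses only the documented variables (input_data, input_index, logger), which the node injects, so no imports are needed, and assigns its output to result, one of the names the node recognizes (see Important Notes). The logger call assumes logger behaves like a standard logging.Logger.

```python
# Runs once per array element; input_data holds the current item.
logger.info(f"Processing item {input_index}: {input_data.get('title')}")

result = {
    "id": input_data.get("id"),
    "title_upper": str(input_data.get("title", "")).upper(),
    "processed": True,
}
```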
Outputs¶
| Field | Type | Description | Example |
|---|---|---|---|
| results_json | STRING | A JSON-encoded array of results returned by your Python logic for each input item. Items that failed or timed out may appear as null or be omitted, depending on your code. | [{"post_id":1,"processed":true},{"post_id":2,"processed":true}] |
| execution_log | STRING | Multi-line text log summarizing the outcome for each item, including any captured stdout/stderr, errors, or timeouts. | Item 0: SUCCESS \| Item 1: TIMEOUT after 30.0 seconds |
| success_count | INT | Number of items that completed successfully according to the node's execution status. | 3 |
| total_items | INT | Total number of items provided in the input json_list. | 5 |
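
If you post-process these outputs in another script, the sketch below shows one way to do it, assuming only the output shapes documented above; the literal values are placeholders for whatever your workflow wires in.

```python
import json

# Placeholder values standing in for the node's outputs.
results_json = '[{"post_id": 1, "processed": true}, null]'
execution_log = "Item 0: SUCCESS\nItem 1: TIMEOUT after 30.0 seconds"
success_count, total_items = 1, 2

results = json.loads(results_json)
failed_indices = [i for i, r in enumerate(results) if r is None]

if success_count < total_items:
    print(f"{total_items - success_count} item(s) did not succeed; see the log:")
    print(execution_log)
print("Indices with null results:", failed_indices)
```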
Important Notes¶
- Execution variables: Your code can use input_data, input_index, all_inputs, and logger.
- Result handling: The node collects the value your code returns. If you set a variable named result or output, it will be used; otherwise the node attempts to use the last expression's value (see the sketch after this list).
- Safety checks: Code containing high-risk operations (e.g., importing os/sys/subprocess/shutil; using exec, eval, import, or compile; opening files; or prompting for input) is rejected.
- Timeouts: timeout_seconds applies per item. Timed-out items are reported in the log and counted as failures.
- Serialization: Results are converted to JSON. Non-serializable values are converted to strings, with a warning appended to the log.
- Concurrency: Processing uses a thread pool up to max_workers. Choose a value appropriate for your workload and environment.
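
The result-handling and serialization notes above are easiest to satisfy by assigning an explicit result and keeping it to basic JSON types. A minimal sketch, again using only the injected variables (the created_at field is a hypothetical example of a value that may not be JSON-serializable):

```python
# Assign `result` explicitly so the node does not have to infer the return
# value from the last expression, and keep every field JSON-serializable.
raw_created = input_data.get("created_at")  # hypothetical datetime-like field

result = {
    "index": input_index,
    "id": input_data.get("id"),
    # Coerce anything non-serializable to a string yourself rather than
    # relying on the node's fallback string conversion and log warning.
    "created_at": str(raw_created) if raw_created is not None else None,
}
```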
Troubleshooting¶
- Invalid JSON input: If json_list cannot be parsed or is not an array, the node returns an error. Ensure it's a valid JSON array string (a validation sketch follows this list).
- Rejected Python code: If your code uses restricted operations or has syntax errors, it will be rejected. Remove risky imports/functions and fix syntax.
- Item timeouts: Increase timeout_seconds if legitimate tasks need more time, or optimize your code for faster execution.
- Empty results: If results_json contains nulls or fewer items than total_items, review your code to ensure it returns a value for each input_data.
- Serialization errors: If complex objects are produced, convert them to basic types (dict/list/str/number) before returning to ensure clean JSON output.
- Low success_count: Check execution_log for per-item errors or warnings. Validate that your code handles all expected input shapes.
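
Several of the issues above can be caught before the node runs by validating json_list in whatever code builds it. A minimal sketch, assuming you assemble the string in Python:

```python
import json

def validate_json_list(raw: str) -> list:
    """Raise early if the string is not a JSON array the node can process."""
    data = json.loads(raw)  # raises ValueError on malformed JSON
    if not isinstance(data, list):
        raise ValueError(f"Expected a JSON array, got {type(data).__name__}")
    return data

items = validate_json_list('[{"id": 1, "title": "Item A"}, {"id": 2, "title": "Item B"}]')
print(f"OK: {len(items)} item(s) ready for the node")
```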