PostgreSQL Visual Query Builder¶
Builds and executes PostgreSQL SELECT queries through a visual configuration, including multi-table JOINs. It generates SQL from inputs like selected columns, JOIN definitions, filters, grouping, ordering, and pagination, and can return results in text/JSON, HTML, XLSX, PDF, or all formats.

Usage¶
Use this node when you want to construct complex PostgreSQL queries without writing SQL directly. It fits into workflows where you first supply a PostgreSQL connection URI, then configure the main table, joins, filters, and output format. The node executes the generated SQL against your database connector service and returns formatted results and optional exports for reporting or downstream processing.
Inputs¶
| Field | Required | Type | Description | Example |
|---|---|---|---|---|
| credentials_path | True | STRING | Database connection URI for PostgreSQL. This must be a valid database-style URI; it is parsed to route the request to the PostgreSQL data connector service. | postgresql:// |
| timeout | True | INT | Request timeout in seconds for the database service call. | 60 |
| main_table | True | STRING | Primary table to query from. | users |
| main_table_schema | True | STRING | Schema of the main table. | public |
| selected_columns | True | STRING | Columns to include in SELECT (comma-separated). Use '*' for all columns. | users.id, users.name, orders.total_amount |
| join_config | True | STRING | JSON array describing JOINs. Each join object must include: type (INNER, LEFT, RIGHT, FULL, CROSS), table, optional schema (defaults to 'public'), and on condition (not used for CROSS joins). | [{"type": "INNER", "table": "orders", "schema": "public", "on": "users.id = orders.user_id"}, {"type": "LEFT", "table": "payments", "schema": "public", "on": "orders.id = payments.order_id"}] |
| where_conditions | True | STRING | Raw SQL WHERE clause conditions (without the 'WHERE' keyword). Leave empty for no filter. | users.signup_date >= '2024-01-01' AND users.status = 'active' |
| group_by_columns | True | STRING | Columns to GROUP BY (comma-separated). Leave empty for no grouping. | users.id, users.name |
| having_conditions | True | STRING | Raw SQL HAVING clause conditions (without the 'HAVING' keyword); used with GROUP BY. | COUNT(orders.id) > 2 |
| order_by_columns | True | STRING | Columns to ORDER BY (comma-separated). Append DESC for descending. | users.created_at DESC, users.id ASC |
| limit_count | True | INT | Maximum number of rows to return. Must be >= 1. | 100 |
| offset_count | True | INT | Number of rows to skip before starting to return rows. Use 0 for no offset. | 0 |
| output_format | True | ["text", "html", "xlsx", "pdf", "all"] | Controls how results are returned. 'text' returns human-readable text plus JSON; 'html' returns an HTML table; 'xlsx' returns an Excel file as base64; 'pdf' returns a PDF as base64; 'all' returns all formats. | text |
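The join_config value is easiest to get right by building it as a native list and serializing it, rather than hand-writing JSON. A minimal sketch (table and column names are illustrative, taken from the examples above):

```python
import json

# Build the join_config value as a Python list, then serialize it to JSON.
# The keys (type, table, schema, on) match the join_config field description.
joins = [
    {"type": "INNER", "table": "orders", "schema": "public",
     "on": "users.id = orders.user_id"},
    {"type": "LEFT", "table": "payments", "schema": "public",
     "on": "orders.id = payments.order_id"},
    {"type": "CROSS", "table": "regions"},  # CROSS joins take no "on" condition
]
join_config = json.dumps(joins)
print(join_config)
```

Serializing with `json.dumps` avoids the malformed-JSON errors described under Troubleshooting.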
Outputs¶
| Field | Type | Description | Example |
|---|---|---|---|
| result | STRING | Human-readable summary of query results. When output_format is html/xlsx/pdf, this may be empty. | Visual Query: public.users Query Results (100 rows): users.id: 1 \| users.name: Alice \| orders.total_amount: 120.50 ... |
| json_result | STRING | Raw JSON response from the database service including rows and metadata. | {"data": [{"users.id": 1, "users.name": "Alice", "orders.total_amount": 120.5}], "row_count": 1} |
| html_table | STRING | HTML document containing a styled table of results. Returned when output_format is 'html' or 'all'. Empty for other formats. | ... |
| xlsx_data | STRING | Base64-encoded XLSX content of results. Returned when output_format is 'xlsx' or 'all'. Empty for other formats. | |
| pdf_data | STRING | Base64-encoded PDF content of results. Returned when output_format is 'pdf' or 'all'. Empty for other formats. | |
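Downstream steps can parse the json_result string directly. A minimal sketch, using a payload shaped like the sample in the table above:

```python
import json

# Example json_result payload, shaped like the sample output above.
json_result = ('{"data": [{"users.id": 1, "users.name": "Alice", '
               '"orders.total_amount": 120.5}], "row_count": 1}')

payload = json.loads(json_result)
for row in payload["data"]:
    print(row["users.name"], row["orders.total_amount"])
print("rows:", payload["row_count"])
```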
Important Notes¶
- Credentials URI required: credentials_path must be a valid database URI (e.g., postgresql://user:password@host:port/database?schema=public).
- JOIN configuration JSON: join_config must be a JSON array of objects with fields: type, table, on; schema defaults to 'public'. Valid join types: INNER, LEFT, RIGHT, FULL, CROSS (CROSS has no ON condition).
- Direct SQL fragments: where_conditions, group_by_columns, having_conditions, and order_by_columns are inserted as-is into the SQL. Ensure they are syntactically correct and safe.
- Row limits: limit_count must be >= 1; offset_count must be >= 0.
- Output behavior: output_format controls which outputs are populated. 'text' returns only result and json_result; 'html', 'xlsx', and 'pdf' each populate their corresponding field and leave the others empty; 'all' populates every format.
- Empty data handling: If no rows are returned, export fields (HTML/XLSX/PDF) may be empty strings.
- Service dependency: The node forwards requests to the configured PostgreSQL data connector service endpoint; ensure endpoints are properly configured in the environment.
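Since the filter and ordering inputs are inserted as-is, it helps to picture where each one lands in the generated statement. The sketch below shows the general SELECT clause order these inputs map to; it is an illustration of standard SQL assembly, not the node's actual implementation:

```python
def build_select(main_table, main_table_schema, selected_columns,
                 joins, where="", group_by="", having="", order_by="",
                 limit=100, offset=0):
    """Sketch of the clause order a generated SELECT follows."""
    sql = [f"SELECT {selected_columns}",
           f"FROM {main_table_schema}.{main_table}"]
    for j in joins:
        schema = j.get("schema", "public")
        if j["type"] == "CROSS":
            sql.append(f"CROSS JOIN {schema}.{j['table']}")
        else:
            sql.append(f"{j['type']} JOIN {schema}.{j['table']} ON {j['on']}")
    if where:
        sql.append(f"WHERE {where}")
    if group_by:
        sql.append(f"GROUP BY {group_by}")
        if having:
            sql.append(f"HAVING {having}")
    if order_by:
        sql.append(f"ORDER BY {order_by}")
    sql.append(f"LIMIT {limit} OFFSET {offset}")
    return "\n".join(sql)

print(build_select("users", "public", "users.id, users.name",
                   [{"type": "INNER", "table": "orders",
                     "on": "users.id = orders.user_id"}],
                   where="users.status = 'active'"))
```

Each raw-SQL input slots into exactly one clause, which is why a typo in any of them surfaces as a service-side query error rather than a validation error.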
Troubleshooting¶
- Invalid credentials URI: If you see errors about credentials or service URL, verify credentials_path is a properly formatted PostgreSQL URI and that the service endpoint is configured.
- Invalid join_config JSON: Errors like 'Invalid JSON' or 'Join configuration must be a list' indicate malformed JSON. Provide a valid JSON array of join objects.
- Missing join fields: Errors such as 'missing required field: type/table/on' mean one of these keys is absent in a join object.
- Unsupported join type: If you receive 'Invalid join type', change the type to one of: INNER, LEFT, RIGHT, FULL, CROSS.
- SQL syntax errors: If the service reports query errors, check selected_columns, where_conditions, group_by_columns, having_conditions, and order_by_columns for correctness.
- No data returned: If the result is empty, confirm your filters and joins are correct and that the tables contain data.
- Timeouts: Increase the timeout input if large queries or slow networks cause timeouts.
- Empty exports: HTML/XLSX/PDF fields can be empty when the result set is empty. Verify data or choose 'text' to inspect the JSON.
- PDF/XLSX cannot open: Ensure you decode the base64 string before saving to a file with the correct extension.
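The decode step above can be sketched as follows; the base64 string here is a stand-in payload, and the output filename is hypothetical:

```python
import base64
from pathlib import Path

# xlsx_data / pdf_data outputs are base64 text; decode them before saving
# with the matching extension, or the file will not open.
xlsx_data = base64.b64encode(b"demo spreadsheet bytes").decode("ascii")  # stand-in payload

Path("results.xlsx").write_bytes(base64.b64decode(xlsx_data))
```

Writing the base64 string itself to disk (without decoding) is the usual cause of files that spreadsheet or PDF viewers refuse to open.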