Table Write

Overview

The Table Write Node in UPTIQ AI Workbench allows developers to store, update, and manage structured data within an agent’s persistent storage layer. Unlike traditional databases, UPTIQ’s Table concept provides a simplified yet effective way to maintain structured data that remains accessible across workflows.

This node is crucial for workflows requiring data persistence, such as tracking transactions, maintaining user records, logging workflow actions, and managing application statuses.

Refer to the Tables section for guidance on creating tables in UPTIQ before using them with the Table Write Node.

Configurations

  • Table: Select the table where the operation will be performed, e.g., Transactions.

  • Operation: Choose the type of database action (Insert Many, Update, or Delete).

  • Filter (for Update and Delete): Define a JSON filter to identify which records need modification or removal.

  • Data (for Insert Many and Update): Provide the new or updated data in JSON format.
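
To see how these fields fit together, the sketch below mirrors the loan-approval example used later on this page. The JSON wrapper and key names are illustrative only, not the node's actual schema; in the Workbench each value is entered in its own configuration field.

{
  // Illustrative sketch only; not the node's actual schema
  "table": "LoanApplications",
  "operation": "Update",
  "filter": { "status": "pending review" },
  "data": { "status": "approved" }
}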

Operations & How They Work

  1. Insert Many (Bulk Insert)

    • Adds multiple records at once to the selected table.

    • Example Data:

      [
        { "transactionId": "T123", "status": "completed", "amount": 500 },
        { "transactionId": "T124", "status": "completed", "amount": 1000 }
      ]
  2. Update (Modify Existing Records)

    • Updates specific records that match a defined filter.

    • Example Filter:

      { "status": "pending review" }
    • Example Data (New Values for Matching Records):

      { "status": "approved" }
  3. Delete (Remove Records)

    • Deletes records based on a filter condition.

    • Example Filter:

      { "status": "rejected" }

Response Format

After execution, the node provides a structured response confirming the operation results, such as:

{
  "inserted": 2,
  "updated": 1,
  "deleted": 3
}

This allows subsequent workflow nodes to act on the results dynamically.

Example Use-Cases

Use-Case 1: Tracking Loan Application Status

A financial institution’s loan processing workflow needs to update loan statuses after review.

  • Configuration:

    • Table: LoanApplications

    • Operation: Update

    • Filter: { "status": "pending review" }

    • Data: { "status": "approved" }

  • Expected Outcome:

    • All pending review applications will be marked as approved.

    • The response will indicate the number of records updated.
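
Following the Response Format described above, a successful run might return a confirmation along these lines (the count is purely illustrative; it depends on how many pending-review records exist at run time):

{
  "updated": 12   // illustrative count
}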


Use-Case 2: Recording Transaction Logs

A workflow captures payment transactions and needs to persist them in a table for future reference.

  • Configuration:

    • Table: Transactions

    • Operation: Insert Many

    • Data:

      [
        { "transactionId": "T567", "status": "completed", "amount": 1500 },
        { "transactionId": "T568", "status": "pending", "amount": 700 }
      ]
  • Expected Outcome:

    • New transactions are stored in the table, ensuring future workflows can access them.
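
Per the Response Format section, inserting the two records above might return a confirmation such as the following sketch:

{
  "inserted": 2
}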


Use-Case 3: Cleaning Up Rejected Applications

A workflow runs periodically to delete rejected loan applications that are older than 30 days.

  • Configuration:

    • Table: LoanApplications

    • Operation: Delete

    • Filter: { "status": "rejected" }

  • Expected Outcome:

    • All records with status "rejected" are removed, reducing storage usage.
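
Note that the filter above matches on status alone, so it removes every rejected application regardless of age. Enforcing the 30-day cutoff mentioned in this use case would need an additional date condition in the filter. The sketch below assumes the table has a createdAt field and that filters accept a MongoDB-style comparison operator; neither is confirmed on this page, so treat it purely as an illustration. The cutoff timestamp would typically be computed by an upstream node.

{
  "status": "rejected",
  "createdAt": { "$lt": "<timestamp 30 days ago>" }   // assumed operator syntax; placeholder cutoff
}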

Key Takeaways for Developers

✅ Enables Persistent Data Storage – Maintain structured data across workflows without relying on an external database.

✅ Supports Bulk Inserts & Updates – Efficiently write multiple records in one operation, improving workflow performance.

✅ Works with Conditional Filters – Modify or delete records based on dynamic conditions.

✅ Ideal for Transaction Logs, Application Tracking, and Record Management – Best suited for workflows that require data persistence and structured storage.

By leveraging the Table Write Node, developers can build workflows with structured, persistent data handling, ensuring that business processes retain historical data, manage transactions, and optimize workflow efficiency. 🚀
