
Cached Documents

Cached documents are Fabaio's working memory: units in a context caching system that store intermediate results, summaries, tool outputs, and configuration data that Fabaio needs to reason over during conversations. This system lets Fabaio maintain context, avoid redundant queries, and provide transparent visibility into how decisions are made.

What are Cached Documents?

Cached documents are context storage units that Fabaio creates, updates, or references during conversation execution. They represent the agent's working memory and provide a way to:

  • Store Intermediate Results - Save data retrieved from tools, APIs, or queries for reuse
  • Maintain Context - Preserve conversation context across multiple interactions
  • Reduce Token Consumption - Cache frequently accessed data to avoid redundant LLM calls
  • Enable Transparency - Provide visibility into what data the agent used to make decisions
  • Support Debugging - Allow users to trace agent decisions back to source data

Cached documents are essential for understanding how Fabaio processes information and reaches conclusions during complex multi-step operations.

How Cached Documents Work

Creation

Cached documents are automatically created when:

  1. Tool Outputs - When tools return data that may be needed later
  2. Large Data Sets - When query results exceed token limits
  3. Frequently Accessed Data - When the same data is referenced multiple times
  4. Intermediate Results - When processing multi-step operations
  5. User Requests - When a user explicitly asks for data to be cached
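The creation triggers above can be sketched as a simple decision function. This is an illustrative sketch only: the threshold values and field names (`token_count`, `access_count`, `user_requested_cache`, `is_intermediate`) are assumptions, not Fabaio's internal criteria.

```python
# Hypothetical sketch of the five creation triggers; real thresholds
# and field names in Fabaio are internal and may differ.

TOKEN_LIMIT = 4000       # assumed cutoff for "large data sets"
REUSE_THRESHOLD = 2      # assumed count for "frequently accessed data"

def should_cache(result: dict) -> bool:
    """Return True when a tool output matches one of the creation triggers."""
    if result.get("user_requested_cache"):                  # 5. user request
        return True
    if result.get("token_count", 0) > TOKEN_LIMIT:          # 2. large data set
        return True
    if result.get("access_count", 0) >= REUSE_THRESHOLD:    # 3. frequent access
        return True
    return result.get("is_intermediate", False)             # 1/4. tool output in a multi-step operation

print(should_cache({"token_count": 12000}))  # True: exceeds the token limit
```

A small tool output that is used once and never requested for caching falls through every branch and is simply passed back in context without creating a document.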

Storage

Cached documents are stored in the context cache system:

  • Persistent Storage - Documents persist across conversation sessions
  • Indexed - Documents are indexed for quick retrieval

Retrieval

Fabaio retrieves cached documents when:

  • Context Needed - When previous data is relevant to current queries
  • Token Optimization - When cached data can replace expensive queries
  • User Request - When users explicitly request cached document access
  • Cross-Reference - When linking related information across conversations
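The retrieval behavior amounts to a cache-first lookup: serve cached data when it is relevant, and fall back to the expensive query otherwise. The sketch below assumes a plain dict-backed cache and a hypothetical `run_expensive_query` helper; Fabaio's actual index and lookup API are not documented here.

```python
# Cache-first retrieval sketch: a dict stands in for the indexed
# context cache, and run_expensive_query is a hypothetical tool call.

cache: dict[str, str] = {}

def run_expensive_query(query: str) -> str:
    """Stand-in for a costly tool, API, or LLM call."""
    return f"result for {query}"

def fetch(query: str) -> str:
    """Return a cached answer when available, otherwise run the query."""
    if query in cache:              # token optimization: reuse prior data
        return cache[query]
    result = run_expensive_query(query)
    cache[query] = result           # store for later cross-referencing
    return result

print(fetch("q1"))  # miss: runs the query, prints "result for q1"
print(fetch("q1"))  # hit: served from the cache, same output, no query
```

The second call costs no tokens or API usage, which is exactly the savings described under Token Optimization.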

Lifecycle

Cached documents follow a lifecycle:

  1. Creation - Document is created with initial data
  2. Updates - Document may be updated as new information arrives
  3. Reference - Document is referenced during conversation execution
  4. Retention - Documents are retained based on retention policies
  5. Expiration - Documents expire based on configured TTL (Time To Live)
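Step 5 of the lifecycle can be illustrated with a minimal TTL check. This is a sketch under the assumption of a per-document TTL in seconds measured against a monotonic clock; the actual retention and expiration policies are configured at the organization level.

```python
# Minimal TTL-expiry sketch for lifecycle step 5; the real retention
# policy engine in Fabaio is not shown in this documentation.
import time

class CachedDocument:
    def __init__(self, content: str, ttl_seconds: float):
        self.content = content
        self.created_at = time.monotonic()  # creation (step 1)
        self.ttl = ttl_seconds

    def is_expired(self) -> bool:
        """True once the document has outlived its TTL (step 5)."""
        return time.monotonic() - self.created_at > self.ttl

doc = CachedDocument("intermediate result", ttl_seconds=3600)
print(doc.is_expired())  # False: well within the one-hour TTL
```

Updates (step 2) would typically also refresh `created_at` or extend the TTL, depending on the retention policy in force.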

Viewing Cached Documents

In Conversations

Cached documents are visible in conversation details:

  1. Conversation View - Open any conversation to see cached documents
  2. Cache Documents Tab - Navigate to the "Cache Documents" tab
  3. Document List - View all documents created, updated, or referenced
  4. Document Details - Expand documents to see full content

Document Information

Each cached document displays:

  • Document ID - Unique identifier for the document
  • Creation Time - When the document was created
  • Last Updated - When the document was last modified
  • Content - The actual data stored in the document
  • Metadata - Tags, source, and other metadata
  • References - Where and how the document was used
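The fields listed above map naturally onto a record type. The sketch below uses assumed field names (`document_id`, `created_at`, and so on) to mirror the display, not Fabaio's actual storage schema.

```python
# Illustrative record mirroring the displayed document fields;
# names and types are assumptions, not Fabaio's real schema.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CachedDocumentRecord:
    document_id: str                                    # unique identifier
    created_at: datetime                                # creation time
    last_updated: datetime                              # last modification
    content: str                                        # the stored data
    metadata: dict = field(default_factory=dict)        # tags, source, etc.
    references: list = field(default_factory=list)      # where it was used
```

A viewer rendering the "Cache Documents" tab would show one such record per document, expanding `content` and `references` on demand.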

Benefits of Cached Documents

Performance Optimization

  • Reduced Latency - Faster responses by avoiding redundant queries
  • Token Savings - Lower token consumption by reusing cached data
  • Cost Reduction - Reduced API calls and LLM usage costs

Transparency and Auditability

  • Full Visibility - See exactly what data influenced decisions
  • Traceability - Trace conclusions back to source data
  • Debugging - Understand why agents made specific decisions
  • Compliance - Maintain audit trails for regulatory requirements

Context Preservation

  • Multi-Turn Conversations - Maintain context across conversation turns
  • Cross-Conversation - Reference data from previous conversations
  • Working Memory - Preserve intermediate results for complex operations

Collaboration

  • Shared Context - Share cached documents with team members
  • Knowledge Reuse - Reuse cached data across different conversations
  • Team Learning - Learn from cached documents in shared conversations

Cached Documents in Shared Conversations

When viewing shared conversations, cached documents provide:

  • Full Transparency - See all data used in the original conversation
  • Validation - Verify conclusions by reviewing source data
  • Learning - Understand how agents process information
  • Debugging - Identify issues by examining cached content

Even in read-only shared conversations, cached documents are fully accessible for review and analysis.


Note

Cached documents are a powerful feature for understanding how Fabaio works and making AI operations transparent and auditable. They enable users to trace decisions back to source data, optimize performance, and maintain context across complex operations. All cached documents are subject to your organization's data retention and privacy policies.