storage, minimal code change as possible in the client side #971

@MervinPraison

Description

@claude

Add support for more storage backends, with as minimal a code change as possible on the client side.

💡 Proposed PraisonAI Agents Implementation

1. MongoDB Storage

from praisonaiagents import Agent
from praisonaiagents.memory import Memory

MongoDB backend for memory

memory = Memory(
    provider="mongodb",
    config={
        "url": "mongodb://localhost:27017/",
        "database": "praisonai",
        "collection": "agent_memory",
        # Optional advanced settings
        "indexes": ["user_id", "agent_id", "timestamp"],
        "ttl_field": "expires_at"  # For automatic expiration
    }
)

agent = Agent(
    name="DataAnalyst",
    memory=memory
)

Alternative: direct storage usage

from praisonaiagents.storage import MongoDBStorage

storage = MongoDBStorage(
    url="mongodb://localhost:27017/praisonai",
    collection="sessions"
)
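
For reference, a minimal sketch of what such a MongoDBStorage backend could look like internally, built on pymongo (the class shape and method names are assumptions, not existing PraisonAI code):

# Hypothetical sketch only; built directly on pymongo, not existing PraisonAI code
from pymongo import MongoClient

class MongoDBStorage:
    def __init__(self, url: str, collection: str, database: str = "praisonai"):
        self.client = MongoClient(url)
        self.collection = self.client[database][collection]

    def write(self, key: str, data: dict) -> None:
        # Upsert so repeated writes to the same key overwrite the stored document
        self.collection.replace_one({"_id": key}, {"_id": key, **data}, upsert=True)

    def read(self, key: str) -> dict | None:
        return self.collection.find_one({"_id": key})

    def delete(self, key: str) -> None:
        self.collection.delete_one({"_id": key})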

2. PostgreSQL Storage

from praisonaiagents.memory import Memory

PostgreSQL with full SQL capabilities

memory = Memory(
    provider="postgresql",
    config={
        "url": "postgresql://user:password@localhost/dbname",
        "schema": "praisonai",  # Optional schema
        "table_prefix": "agent_",
        # Advanced features
        "use_jsonb": True,  # For flexible data
        "connection_pool_size": 10
    }
)

Or configuration-based

memory = Memory.from_config({
    "provider": "postgresql",
    "connection_string": "postgresql://localhost/praisonai"
})
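
To illustrate what the PostgreSQL backend might do under the hood, here is a rough sketch using asyncpg with a JSONB column (the table name, schema, and function names are assumptions):

# Hypothetical asyncpg-based sketch, not existing PraisonAI code
import json
import asyncpg

async def init_pg(url: str) -> asyncpg.Pool:
    pool = await asyncpg.create_pool(url, min_size=2, max_size=10)
    async with pool.acquire() as conn:
        # JSONB keeps the stored payload flexible, matching the "use_jsonb" option above
        await conn.execute(
            "CREATE TABLE IF NOT EXISTS agent_memory ("
            "key TEXT PRIMARY KEY, data JSONB NOT NULL)"
        )
    return pool

async def write(pool: asyncpg.Pool, key: str, data: dict) -> None:
    await pool.execute(
        "INSERT INTO agent_memory (key, data) VALUES ($1, $2::jsonb) "
        "ON CONFLICT (key) DO UPDATE SET data = EXCLUDED.data",
        key, json.dumps(data),
    )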

3. DynamoDB Storage

from praisonaiagents.memory import Memory

AWS DynamoDB for serverless

memory = Memory(
    provider="dynamodb",
    config={
        "table_name": "praisonai-memory",
        "region": "us-east-1",
        # AWS credentials (optional if using IAM roles)
        "aws_access_key_id": "...",
        "aws_secret_access_key": "...",
        # Performance settings
        "read_capacity": 5,
        "write_capacity": 5,
        "enable_streams": True
    }
)

With AWS profile

memory = Memory(
    provider="dynamodb",
    config={
        "table_name": "praisonai-memory",
        "aws_profile": "production"
    }
)
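
A rough sketch of how a DynamoDB backend could sit on boto3 (the class and its key layout are hypothetical):

# Hypothetical boto3-based sketch, not existing PraisonAI code
import boto3

class DynamoDBStorage:
    def __init__(self, table_name: str, region: str = "us-east-1", profile: str | None = None):
        session = boto3.Session(profile_name=profile, region_name=region)
        self.table = session.resource("dynamodb").Table(table_name)

    def write(self, key: str, data: dict) -> None:
        # "pk" is an assumed partition-key attribute name
        self.table.put_item(Item={"pk": key, **data})

    def read(self, key: str) -> dict | None:
        return self.table.get_item(Key={"pk": key}).get("Item")

    def delete(self, key: str) -> None:
        self.table.delete_item(Key={"pk": key})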

4. Redis Caching Layer

from praisonaiagents.memory import Memory

Redis as primary storage

memory = Memory(
    provider="redis",
    config={
        "host": "localhost",
        "port": 6379,
        "password": "secure_password",
        "ssl": True,
        "db": 0,
        # Caching features
        "default_ttl": 3600,  # 1 hour
        "key_prefix": "praisonai:",
        "compression": "gzip"  # Optional compression
    }
)

Or as a cache layer over another storage backend

memory = Memory(
    provider="rag",  # Primary storage
    cache={
        "provider": "redis",
        "host": "localhost",
        "ttl": 300  # 5 minute cache
    }
)
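
To make the cache-layer idea concrete, here is a minimal read-through cache sketch built on redis-py, assuming the wrapped backend exposes read()/write() (the CachedStorage class is hypothetical):

# Hypothetical read-through cache sketch using redis-py, not existing PraisonAI code
import json
import redis

class CachedStorage:
    def __init__(self, backend, host: str = "localhost", port: int = 6379, ttl: int = 300):
        self.backend = backend  # Any storage object with read()/write()
        self.cache = redis.Redis(host=host, port=port)
        self.ttl = ttl

    def read(self, key: str) -> dict | None:
        cached = self.cache.get(key)
        if cached is not None:
            return json.loads(cached)
        data = self.backend.read(key)
        if data is not None:
            # Populate the cache with a TTL so stale entries expire on their own
            self.cache.setex(key, self.ttl, json.dumps(data))
        return data

    def write(self, key: str, data: dict) -> None:
        self.backend.write(key, data)
        self.cache.setex(key, self.ttl, json.dumps(data))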

5. Cloud Storage Integration

from praisonaiagents.memory import Memory

Google Cloud Storage

memory = Memory(
    provider="gcs",
    config={
        "bucket": "praisonai-sessions",
        "project": "my-project",
        "prefix": "agents/",
        "credentials_path": "/path/to/credentials.json"
    }
)

Amazon S3

memory = Memory(
    provider="s3",
    config={
        "bucket": "praisonai-sessions",
        "region": "us-east-1",
        "prefix": "agents/",
        "storage_class": "STANDARD_IA"  # For infrequent access
    }
)

Azure Blob Storage

memory = Memory(
    provider="azure",
    config={
        "container": "praisonai-sessions",
        "connection_string": "DefaultEndpointsProtocol=https;...",
        "prefix": "agents/"
    }
)
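
For the object-store backends, each session would most likely be serialized as a JSON object under the configured prefix; a rough S3 sketch with boto3 (the class name and key layout are assumptions):

# Hypothetical S3-backed sketch using boto3, not existing PraisonAI code
import json
import boto3

class S3Storage:
    def __init__(self, bucket: str, prefix: str = "agents/", region: str = "us-east-1"):
        self.s3 = boto3.client("s3", region_name=region)
        self.bucket = bucket
        self.prefix = prefix

    def write(self, key: str, data: dict) -> None:
        # One JSON object per session key, e.g. agents/<key>.json
        self.s3.put_object(
            Bucket=self.bucket,
            Key=f"{self.prefix}{key}.json",
            Body=json.dumps(data).encode("utf-8"),
        )

    def read(self, key: str) -> dict:
        obj = self.s3.get_object(Bucket=self.bucket, Key=f"{self.prefix}{key}.json")
        return json.loads(obj["Body"].read())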

📋 Implementation Recommendations

1. Unified Storage Interface

Create a base storage interface that all backends implement:

praisonaiagents/storage/base.py

from abc import ABC, abstractmethod

class BaseStorage(ABC):
    @abstractmethod
    async def read(self, key: str) -> dict:
        pass

    @abstractmethod
    async def write(self, key: str, data: dict) -> None:
        pass

    @abstractmethod
    async def delete(self, key: str) -> None:
        pass

    @abstractmethod
    async def search(self, query: dict) -> list:
        pass
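
As a quick sanity check of the interface, a trivial in-memory backend that satisfies it (purely illustrative, not part of the proposal itself):

# Illustrative in-memory backend implementing the proposed BaseStorage interface
class InMemoryStorage(BaseStorage):
    def __init__(self):
        self._data: dict[str, dict] = {}

    async def read(self, key: str) -> dict:
        return self._data.get(key, {})

    async def write(self, key: str, data: dict) -> None:
        self._data[key] = data

    async def delete(self, key: str) -> None:
        self._data.pop(key, None)

    async def search(self, query: dict) -> list:
        # Naive match: return every record containing all key/value pairs from the query
        return [
            record for record in self._data.values()
            if all(record.get(k) == v for k, v in query.items())
        ]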

2. Configuration-First Approach

Example configuration file

storage_config = {
    "default": "postgresql",
    "backends": {
        "postgresql": {
            "url": "${DATABASE_URL}",
            "pool_size": 10
        },
        "redis": {
            "url": "${REDIS_URL}",
            "ttl": 3600
        },
        "mongodb": {
            "url": "${MONGODB_URL}",
            "database": "praisonai"
        }
    },
    "cache": {
        "enabled": True,
        "backend": "redis"
    }
}
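
The ${VAR} placeholders imply environment-variable substitution when the configuration is loaded; a small sketch of how that resolution could work (the resolve_env helper is hypothetical):

# Hypothetical helper that expands ${VAR} placeholders from the environment
import os
import re

_ENV_PATTERN = re.compile(r"\$\{([^}]+)\}")

def resolve_env(value):
    if isinstance(value, str):
        # Replace each ${VAR} with os.environ["VAR"], or "" if the variable is unset
        return _ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), ""), value)
    if isinstance(value, dict):
        return {k: resolve_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve_env(v) for v in value]
    return value

resolved_config = resolve_env(storage_config)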

3. Multi-Storage Support

from praisonaiagents import Agent, Memory

Use different storage for different purposes

agent = Agent(
    name="MultiStorageAgent",
    # Long-term memory in PostgreSQL
    memory=Memory(provider="postgresql"),
    # Fast cache in Redis
    cache=Memory(provider="redis", config={"ttl": 300}),
    # Documents in MongoDB
    knowledge=Memory(provider="mongodb")
)

4. Storage Migration Tools

from praisonaiagents.storage import migrate_storage

Migrate from SQLite to PostgreSQL

migrate_storage(
    source={"provider": "sqlite", "path": ".praison/memory.db"},
    destination={"provider": "postgresql", "url": "postgresql://..."},
    batch_size=1000
)
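
Internally, such a helper would presumably stream records between the two backends in batches; a rough sketch against the proposed BaseStorage interface (list_keys() is an assumed helper, not part of the interface above):

# Hypothetical migration sketch; assumes the source backend can enumerate its keys
async def migrate(source, destination, batch_size: int = 1000) -> None:
    batch = []
    for key in source.list_keys():  # list_keys() is an assumed helper method
        batch.append((key, await source.read(key)))
        if len(batch) >= batch_size:
            for k, data in batch:
                await destination.write(k, data)
            batch.clear()
    # Flush any remaining records
    for k, data in batch:
        await destination.write(k, data)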

5. Connection Pooling & Performance

from praisonaiagents.memory import Memory

Automatic connection pooling

memory = Memory(
    provider="postgresql",
    config={
        "url": "postgresql://...",
        "pool": {
            "min_size": 2,
            "max_size": 10,
            "timeout": 30,
            "retry_attempts": 3
        }
    }
)
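
Those pool settings map fairly directly onto asyncpg.create_pool, with the retry count handled by a simple wrapper; the mapping below is an assumption about the eventual implementation, not existing code:

# Hypothetical mapping of the pool settings above onto asyncpg.create_pool
import asyncio
import asyncpg

async def create_pg_pool(url: str, pool_cfg: dict) -> asyncpg.Pool:
    for attempt in range(pool_cfg.get("retry_attempts", 3)):
        try:
            return await asyncpg.create_pool(
                url,
                min_size=pool_cfg.get("min_size", 2),
                max_size=pool_cfg.get("max_size", 10),
                command_timeout=pool_cfg.get("timeout", 30),
            )
        except OSError:
            # Back off briefly before retrying the connection
            await asyncio.sleep(2 ** attempt)
    raise ConnectionError("Could not connect to PostgreSQL after retries")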

6. Backward Compatibility

Ensure new storage options work with existing code:

from praisonaiagents import Agent, Memory

Old way (still works)

agent = Agent(name="Assistant", memory=True)

New way with storage selection

agent = Agent(
    name="Assistant",
    memory=Memory(provider="mongodb")
)

Auto-detection based on config

agent = Agent(name="Assistant", memory="auto")  # Uses default from config
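
One way the Agent constructor could keep all three forms working is to normalize the memory argument internally; a hypothetical sketch (the _resolve_memory helper and DEFAULT_PROVIDER name are assumptions):

# Hypothetical normalization of the `memory` argument inside Agent.__init__
DEFAULT_PROVIDER = "rag"  # Assumed current default backend

def _resolve_memory(memory, storage_config: dict | None = None):
    if isinstance(memory, Memory):
        return memory  # New style: pass the configured Memory through unchanged
    if memory is True:
        return Memory(provider=DEFAULT_PROVIDER)  # Old boolean flag keeps working
    if memory == "auto":
        provider = (storage_config or {}).get("default", DEFAULT_PROVIDER)
        return Memory(provider=provider)  # Pick the default backend from the config
    return None  # memory=False / None disables memory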
