documentation

docs/architecture/README.md (new file, 41 lines)
# Architecture Documentation

This section contains system architecture and design documentation for the TCP Dashboard platform.

## 📋 Contents

### System Architecture

- **[Architecture Overview](architecture.md)** - *High-level system architecture and component design*
  - Core system components and interactions
  - Data flow and processing pipelines
  - Service architecture and deployment patterns
  - Technology stack and infrastructure

### Product Requirements

- **[Crypto Bot PRD](crypto-bot-prd.md)** - *Product Requirements Document for the crypto trading bot platform*
  - Platform vision and objectives
  - Feature specifications and requirements
  - User personas and use cases
  - Technical requirements and constraints
  - Implementation roadmap and milestones

## 🏗️ System Overview

The TCP Dashboard follows a modular, microservices-inspired architecture designed for:

- **Scalability**: Horizontal scaling of individual components
- **Reliability**: Fault tolerance and auto-recovery mechanisms
- **Maintainability**: Clear separation of concerns and modular design
- **Extensibility**: Easy addition of new exchanges, strategies, and features

## 🔗 Related Documentation

- **[Components Documentation](../components/)** - Technical implementation details
- **[Setup Guide](../guides/setup.md)** - System setup and configuration
- **[Reference Documentation](../reference/)** - API specifications and technical references

---

*For the complete documentation index, see the [main documentation README](../README.md).*
docs/architecture/architecture.md (new file, 212 lines)
## Architecture Components

### 1. Data Collector

**Responsibility**: OHLCV data collection and aggregation from exchanges

```python
class DataCollector:
    def __init__(self):
        self.providers = {}          # Registry of data providers
        self.store_raw_data = False  # Optional raw data storage

    def register_provider(self, name: str, provider: DataProvider):
        """Register a new data provider"""

    def start_collection(self, symbols: List[str], timeframes: List[str]):
        """Start collecting OHLCV data for specified symbols and timeframes"""

    def process_raw_trades(self, raw_trades: List[dict]) -> dict:
        """Aggregate raw trades into OHLCV candles"""

    def store_ohlcv_data(self, ohlcv_data: dict):
        """Store OHLCV data in PostgreSQL market_data table"""

    def send_market_update(self, symbol: str, ohlcv_data: dict):
        """Send Redis signal with OHLCV update to active bots"""

    def store_raw_data_optional(self, raw_data: dict):
        """Optionally store raw data for detailed backtesting"""
```
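To make the provider registry concrete, here is a minimal runnable sketch of `register_provider`. This is an illustration only: the `DataProvider` interface and the `OKXProvider` class are assumptions, since the document leaves them undefined.

```python
from typing import Protocol


class DataProvider(Protocol):
    """Assumed minimal provider interface (not the documented API)."""
    def subscribe(self, symbols: list[str]) -> None: ...


class OKXProvider:
    """Hypothetical OKX adapter; a real one would open a WebSocket."""
    def __init__(self):
        self.subscribed: list[str] = []

    def subscribe(self, symbols: list[str]) -> None:
        self.subscribed.extend(symbols)


class DataCollector:
    def __init__(self):
        self.providers: dict[str, DataProvider] = {}

    def register_provider(self, name: str, provider: DataProvider) -> None:
        # Reject duplicate names so bots always resolve a unique provider
        if name in self.providers:
            raise ValueError(f"Provider already registered: {name}")
        self.providers[name] = provider


collector = DataCollector()
collector.register_provider("okx", OKXProvider())
collector.providers["okx"].subscribe(["BTC-USDT"])
```
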
### 2. Strategy Engine

**Responsibility**: Unified interface for all trading strategies

```python
class BaseStrategy:
    def __init__(self, parameters: dict):
        self.parameters = parameters

    def process_data(self, data: pd.DataFrame) -> Signal:
        """Process market data and generate signals"""
        raise NotImplementedError

    def get_indicators(self) -> dict:
        """Return calculated indicators for plotting"""
        return {}
```
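As an illustration, a concrete strategy subclasses `BaseStrategy`. The sketch below is a hypothetical EMA-crossover implementation using the parameters from `ema_crossover.json`; the `Signal` shape is an assumption (a named tuple with `action` and `confidence`), since the document does not pin it down, and a minimal `BaseStrategy` is repeated so the snippet is self-contained.

```python
from collections import namedtuple

import pandas as pd

# Assumed signal shape; the real Signal type may differ.
Signal = namedtuple("Signal", ["action", "confidence"])


class BaseStrategy:
    def __init__(self, parameters: dict):
        self.parameters = parameters


class EMACrossoverStrategy(BaseStrategy):
    """Buy when the fast EMA is above the slow EMA, sell when below."""

    def process_data(self, data: pd.DataFrame) -> Signal:
        fast = data["close"].ewm(span=self.parameters["fast_period"]).mean()
        slow = data["close"].ewm(span=self.parameters["slow_period"]).mean()
        spread = fast.iloc[-1] - slow.iloc[-1]
        if spread > 0:
            return Signal("buy", min(spread / slow.iloc[-1], 1.0))
        elif spread < 0:
            return Signal("sell", min(-spread / slow.iloc[-1], 1.0))
        return Signal("hold", 0.0)


# Usage on a synthetic, steadily rising close series
strategy = EMACrossoverStrategy({"fast_period": 12, "slow_period": 26})
candles = pd.DataFrame({"close": [100.0 + i for i in range(50)]})
signal = strategy.process_data(candles)
```
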
### 3. Bot Manager

**Responsibility**: Orchestrate bot execution and state management

```python
class BotManager:
    def __init__(self):
        self.active_bots = {}
        self.config_path = "config/bots/"

    def load_bot_config(self, bot_id: int) -> dict:
        """Load bot configuration from JSON file"""

    def start_bot(self, bot_id: int):
        """Start a bot instance with crash recovery monitoring"""

    def stop_bot(self, bot_id: int):
        """Stop a bot instance and update database status"""

    def process_signal(self, bot_id: int, signal: Signal):
        """Process signal and make virtual trading decision"""

    def update_bot_heartbeat(self, bot_id: int):
        """Update bot heartbeat in database for monitoring"""

    def restart_crashed_bots(self):
        """Monitor and restart crashed bots (max 3 attempts/hour)"""

    def restore_active_bots_on_startup(self):
        """Restore active bot states after application restart"""
```
## Communication Architecture

### Redis Pub/Sub Patterns

```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}"    # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}"   # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}"     # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events"    # Global notifications
```
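For concreteness, a message on one of these channels might be formed as follows. This is a hedged sketch: the payload fields and the use of JSON serialization are assumptions, and the actual publish call (e.g. via `redis-py`) is omitted so the snippet stays self-contained.

```python
import json

MARKET_DATA_CHANNEL = "market:{symbol}"


def build_market_update(symbol: str, ohlcv: dict) -> tuple[str, str]:
    """Return the (channel, payload) pair for a market data update."""
    channel = MARKET_DATA_CHANNEL.format(symbol=symbol)
    payload = json.dumps({"symbol": symbol, **ohlcv})
    return channel, payload


channel, payload = build_market_update(
    "BTC-USDT",
    {"timestamp": "2025-05-30T09:05:00Z", "open": 100.0, "high": 105.0,
     "low": 99.0, "close": 104.0, "volume": 12.5},
)
# A publisher would then call redis_client.publish(channel, payload).
```
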
## Time Aggregation Strategy

### Candlestick Alignment

- **Use RIGHT-ALIGNED timestamps** (industry standard)
- A 5-minute candle with timestamp 09:05:00 represents data from 09:00:01 to 09:05:00
- Timestamp = close time of the candle
- Aligns with major exchanges (Binance, OKX, Coinbase)

### Aggregation Logic

```python
def aggregate_to_timeframe(ticks: List[dict], timeframe: str) -> Iterator[dict]:
    """
    Aggregate tick data to the specified timeframe.
    timeframe: '1m', '5m', '15m', '1h', '4h', '1d'
    """
    # Convert timeframe to seconds
    interval_seconds = parse_timeframe(timeframe)

    # Group ticks by time intervals (right-aligned)
    for group in group_by_interval(ticks, interval_seconds):
        candle = {
            'timestamp': group.end_time,   # Right-aligned
            'open': group.first_price,
            'high': group.max_price,
            'low': group.min_price,
            'close': group.last_price,
            'volume': group.total_volume
        }
        yield candle
```
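`parse_timeframe` and `group_by_interval` are left undefined above. A plausible `parse_timeframe` helper covering the listed timeframes could look like this (an assumption for illustration, not the documented API):

```python
def parse_timeframe(timeframe: str) -> int:
    """Convert a timeframe string ('1m', '5m', '1h', ...) to seconds."""
    units = {"m": 60, "h": 3600, "d": 86400}
    value, unit = int(timeframe[:-1]), timeframe[-1]
    if unit not in units:
        raise ValueError(f"Unsupported timeframe: {timeframe}")
    return value * units[unit]


# The supported timeframes map to interval lengths in seconds
intervals = {tf: parse_timeframe(tf) for tf in ("1m", "5m", "15m", "1h", "4h", "1d")}
```
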
## Backtesting Strategy

### Vectorized Processing Approach

```python
from typing import List

import pandas as pd

def backtest_strategy_simple(strategy, market_data: pd.DataFrame, initial_balance: float = 10000):
    """
    Simple backtesting loop for pandas-based strategies

    Parameters:
    - strategy: Strategy instance with process_data method
    - market_data: DataFrame with OHLCV data
    - initial_balance: Starting portfolio value

    Returns:
    - Portfolio performance metrics and trade history
    """

    signals = []
    portfolio_value = []
    current_balance = initial_balance
    position = 0

    # Feed the strategy an expanding window of candles; the strategy itself
    # can use vectorized pandas operations internally
    for i, (idx, row) in enumerate(market_data.iterrows()):
        # Get signal from strategy (positional slice, independent of index labels)
        signal = strategy.process_data(market_data.iloc[:i + 1])

        # Simulate trade execution (fee applied on the sell side only in this model)
        if signal.action == 'buy' and position == 0:
            position = current_balance / row['close']
            current_balance = 0

        elif signal.action == 'sell' and position > 0:
            current_balance = position * row['close'] * 0.999  # 0.1% fee
            position = 0

        # Track portfolio value
        total_value = current_balance + (position * row['close'])
        portfolio_value.append(total_value)
        signals.append(signal)

    return {
        'final_value': portfolio_value[-1],
        'total_return': (portfolio_value[-1] / initial_balance - 1) * 100,
        'signals': signals,
        'portfolio_progression': portfolio_value
    }

def calculate_performance_metrics(portfolio_values: List[float]) -> dict:
    """Calculate standard performance metrics"""
    returns = pd.Series(portfolio_values).pct_change().dropna()

    return {
        'sharpe_ratio': returns.mean() / returns.std() if returns.std() > 0 else 0,
        'max_drawdown': (pd.Series(portfolio_values).cummax() - pd.Series(portfolio_values)).max(),
        'win_rate': (returns > 0).mean(),   # share of periods with a positive return
        'total_periods': len(returns)       # number of return observations, not trades
    }
```
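To sanity-check the fee arithmetic in the execution branch above, one buy/sell round trip can be replayed in isolation (fee charged on the sell only, as in the simplified model):

```python
initial_balance = 10000.0
balance, position = initial_balance, 0.0

# Buy the full balance at 100 (no fee on entry in this simplified model)
buy_price = 100.0
position = balance / buy_price           # 100 units
balance = 0.0

# Sell everything at 110 with a 0.1% fee on the proceeds
sell_price = 110.0
balance = position * sell_price * 0.999  # 10989.0
position = 0.0

total_return = (balance / initial_balance - 1) * 100  # 9.89%
```
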
### Optimization Techniques

1. **Vectorized Operations**: Use pandas for bulk data processing
2. **Efficient Indexing**: Pre-calculate indicators where possible
3. **Memory Management**: Process data in chunks for large datasets
4. **Simple Parallelization**: Run multiple strategy tests independently
## Key Design Principles

1. **OHLCV-First Data Strategy**: Primary focus on aggregated candle data, optional raw data storage
2. **Signal Tracking**: All trading signals recorded in database for analysis and debugging
3. **JSON Configuration**: Strategy parameters and bot configs in JSON for rapid testing
4. **Real-time State Management**: Bot states updated via Redis and PostgreSQL for monitoring
5. **Crash Recovery**: Automatic bot restart and application state recovery
6. **Virtual Trading**: Simulation-first approach with fee modeling
7. **Simplified Architecture**: Monolithic design with clear component boundaries for future scaling
## Database Architecture

### Core Tables

- **market_data**: OHLCV candles for bot operations and backtesting (primary table)
- **bots**: Bot instances with JSON config references and status tracking
- **signals**: Trading decisions with confidence scores and indicator values
- **trades**: Virtual trade execution records with P&L tracking
- **bot_performance**: Portfolio snapshots for performance visualization

### Optional Tables

- **raw_trades**: Raw tick data for advanced backtesting (partitioned by month)

### Data Access Patterns

- **Real-time**: Bots read recent OHLCV data via indexes on (symbol, timeframe, timestamp)
- **Historical**: Dashboard queries aggregated performance data for charts
- **Backtesting**: Sequential access to historical OHLCV data by date range
docs/architecture/crypto-bot-prd.md (new file, 608 lines)
# Simplified Crypto Trading Bot Platform: Product Requirements Document (PRD)

**Version:** 1.0
**Date:** May 30, 2025
**Author:** Vasily
**Status:** Draft
## Executive Summary

This PRD outlines the development of a simplified crypto trading bot platform that enables strategy testing, development, and execution without the complexity of microservices and advanced monitoring. The goal is to create a functional system within 1-2 weeks that supports immediate strategy experimentation while establishing a foundation that can scale in the future. The platform addresses key requirements including data collection, strategy execution, visualization, and backtesting capabilities in a monolithic architecture optimized for internal use.

## Current Requirements & Constraints

- **Speed to Deployment**: System must be functional within 1-2 weeks
- **Scale**: Support for 5-10 concurrent trading bots
- **Architecture**: Monolithic application instead of microservices
- **User Access**: Internal use only initially (no multi-user authentication)
- **Infrastructure**: Simplified deployment without Kubernetes/Docker Swarm
- **Monitoring**: Basic logging for modules

## System Architecture

### High-Level Architecture

The platform will follow a monolithic architecture pattern to enable rapid development while providing clear separation between components.

### Data Flow Architecture
```
OKX Exchange API (WebSocket)
        ↓
Data Collector → OHLCV Aggregator → PostgreSQL (market_data)
        ↓                   ↓
[Optional] Raw Trade Storage     Redis Pub/Sub → Strategy Engine (JSON configs)
        ↓                                ↓
Files/Database (raw_trades)      Signal Generation → Bot Manager
                                         ↓
                    PostgreSQL (signals, trades, bot_performance)
                                         ↓
                    Dashboard (REST API) ← PostgreSQL (historical data)
                                         ↑
                    Real-time Updates ← Redis Channels
```
**Data Processing Priority**:

1. **Real-time**: Raw data → OHLCV candles → Redis → Bots (primary flow)
2. **Historical**: OHLCV data from PostgreSQL for backtesting and charts
3. **Advanced Analysis**: Raw trade data (if stored) for detailed backtesting

### Redis Channel Design

```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}"    # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}"   # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}"     # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events"    # Global notifications
```
### Configuration Strategy

**PostgreSQL for**: Market data, bot instances, trades, signals, performance metrics
**JSON files for**: Strategy parameters, bot configurations (rapid testing and parameter tuning)

```json
// config/strategies/ema_crossover.json
{
  "strategy_name": "EMA_Crossover",
  "parameters": {
    "fast_period": 12,
    "slow_period": 26,
    "risk_percentage": 0.02
  }
}

// config/bots/bot_001.json
{
  "bot_id": "bot_001",
  "strategy_file": "ema_crossover.json",
  "symbol": "BTC-USDT",
  "virtual_balance": 10000,
  "enabled": true
}
```
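Since a bot config references its strategy file, loading a bot means resolving both documents. The sketch below is a hypothetical loader (the `load_bot` function and the directory layout in the demo are illustrative, not the actual implementation):

```python
import json
import tempfile
from pathlib import Path


def load_bot(bot_file: Path, strategies_dir: Path) -> dict:
    """Load a bot config and inline the referenced strategy parameters."""
    bot = json.loads(bot_file.read_text())
    strategy = json.loads((strategies_dir / bot["strategy_file"]).read_text())
    bot["strategy"] = strategy
    return bot


# Demo: write the sample configs above to a temporary directory and load them
root = Path(tempfile.mkdtemp())
(root / "strategies").mkdir()
(root / "strategies" / "ema_crossover.json").write_text(json.dumps({
    "strategy_name": "EMA_Crossover",
    "parameters": {"fast_period": 12, "slow_period": 26, "risk_percentage": 0.02},
}))
bot_file = root / "bot_001.json"
bot_file.write_text(json.dumps({
    "bot_id": "bot_001", "strategy_file": "ema_crossover.json",
    "symbol": "BTC-USDT", "virtual_balance": 10000, "enabled": True,
}))

bot = load_bot(bot_file, root / "strategies")
```
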
### Error Handling Strategy

**Bot Crash Recovery**:
- Monitor bot processes every 30 seconds
- Auto-restart crashed bots if status = 'active'
- Log all crashes with stack traces
- Maximum 3 restart attempts per hour
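The 3-attempts-per-hour cap can be enforced with a sliding window of restart timestamps per bot. A minimal sketch (class and method names are illustrative, not the actual implementation):

```python
import time
from collections import defaultdict, deque
from typing import Optional


class RestartLimiter:
    """Allow at most `max_attempts` restarts per bot within `window` seconds."""

    def __init__(self, max_attempts: int = 3, window: float = 3600.0):
        self.max_attempts = max_attempts
        self.window = window
        self.attempts = defaultdict(deque)

    def allow_restart(self, bot_id: int, now: Optional[float] = None) -> bool:
        now = time.monotonic() if now is None else now
        attempts = self.attempts[bot_id]
        # Drop attempts that fell out of the sliding window
        while attempts and now - attempts[0] >= self.window:
            attempts.popleft()
        if len(attempts) >= self.max_attempts:
            return False
        attempts.append(now)
        return True


limiter = RestartLimiter()
```
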
**Exchange Connection Issues**:
- Retry with exponential backoff (1s, 2s, 4s, 8s, max 60s)
- Switch to backup WebSocket connection if available
- Log connection quality metrics
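The backoff schedule above (1s, 2s, 4s, 8s, capped at 60s) reduces to one small helper; a sketch, with illustrative names:

```python
def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Delay in seconds before retry `attempt` (0-based): base * 2**attempt, capped."""
    return min(base * (2 ** attempt), cap)


# First retries: 1s, 2s, 4s, 8s, ... then capped at 60s
delays = [backoff_delay(n) for n in range(8)]
```
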
**Database Errors**:
- Continue operation with in-memory cache for up to 5 minutes
- Queue operations for retry when connection restored
- Alert on prolonged database disconnection

**Application Restart Recovery**:
- Read bot states from database on startup
- Restore active bots to 'active' status
- Resume data collection for all monitored symbols
### Component Details and Functional Requirements

1. **Data Collection Module**
   - Connect to exchange APIs (OKX initially) via WebSocket
   - Aggregate real-time trades into OHLCV candles (1m, 5m, 15m, 1h, 4h, 1d)
   - Store OHLCV data in PostgreSQL for bot operations and backtesting
   - Send real-time candle updates through Redis
   - Optional: Store raw trade data for advanced backtesting

**FR-001: Unified Data Provider Interface**
- Support multiple exchanges through standardized adapters
- Real-time OHLCV aggregation with WebSocket connections
- Primary focus on candle data; raw data storage optional
- Data validation and error handling mechanisms

**FR-002: Market Data Processing**
- OHLCV aggregation with configurable timeframes (1m base, higher timeframes derived)
- Technical indicator calculation (SMA, EMA, RSI, MACD, Bollinger Bands) on OHLCV data
- Data normalization across different exchanges
- Time alignment following exchange standards (right-aligned candles)

2. **Strategy Engine**
   - Provide a unified interface for all trading strategies
   - Support multiple strategy types with a common parameter structure
   - Generate trading signals based on market data
   - Log strategy performance and signals
   - Implement each strategy as a class

**FR-003: Strategy Framework**
- Base strategy class with standardized interface
- Support for multiple strategy types
- Parameter configuration and optimization tools (JSON for the parameters)
- Signal generation with confidence scoring

**FR-004: Signal Processing**
- Real-time signal calculation and validation
- Signal persistence for analysis and debugging
- Multi-timeframe analysis capabilities
- Custom indicator development support

3. **Bot Manager**
   - Create and manage up to 10 concurrent trading bots
   - Configure bot parameters and associated strategies
   - Start/stop individual bots
   - Track bot status and performance

**FR-005: Bot Lifecycle Management**
- Bot creation with strategy and parameter selection
- Start/stop/pause functionality with state persistence
- Configuration management
- Resource allocation and monitoring (in future)

**FR-006: Portfolio Management**
- Position tracking and balance management
- Risk management controls (stop-loss, take-profit, position sizing)
- Multi-bot coordination and conflict resolution (in future)
- Real-time portfolio valuation (in future)

4. **Trading Execution**
   - Simulate or execute trades based on configuration
   - Store trade information in the database

**FR-007: Order Management**
- Order placement with multiple order types (market, limit, stop)
- Order tracking and status monitoring (in future)
- Execution confirmation and reconciliation (in future)
- Fee calculation and tracking (in future)

**FR-008: Risk Controls**
- Pre-trade risk validation
- Position limits and exposure controls (in future)
- Emergency stop mechanisms (in future)
- Compliance monitoring and reporting (in future)

5. **Database (PostgreSQL)**
   - Store market data, bot configurations, and trading history
   - Optimized schema for time-series data without complexity
   - Support for data querying and aggregation

   **Database (JSON)**
   - Store strategy parameters and bot configuration in JSON initially, for simplicity of editing and testing

6. **Backtesting Engine**
   - Run simulations on historical data using vectorized operations for speed
   - Calculate performance metrics
   - Support multiple timeframes and strategy parameter testing
   - Generate comparison reports between strategies

**FR-009: Historical Simulation**
- Strategy backtesting on historical market data
- Performance metric calculation (Sharpe ratio, drawdown, win rate, total return)
- Parameter optimization through grid search (limited combinations for speed) (in future)
- Side-by-side strategy comparison with statistical significance

**FR-010: Simulation Engine**
- Vectorized signal calculation using pandas operations
- Realistic fee modeling (0.1% per trade for OKX)
- Look-ahead bias prevention with proper timestamp handling
- Configurable test periods (1 day to 24 months)

7. **Dashboard & Visualization**
   - Display real-time market data and bot status
   - Show portfolio value progression over time
   - Visualize trade history with buy/sell markers on price charts
   - Provide a simple bot control interface (start/stop/configure)

**FR-011: Dashboard Interface**
- Real-time bot monitoring with status indicators
- Portfolio performance charts (total value, cash vs crypto allocation)
- Trade history table with P&L per trade
- Simple bot configuration forms for JSON parameter editing

**FR-012: Data Visualization**
- Interactive price charts with strategy signal overlays
- Portfolio value progression charts
- Performance comparison tables (multiple bots side-by-side)
- Fee tracking and total cost analysis
### Non-Functional Requirements

1. Performance Requirements

**NFR-001: Latency**
- Market data processing: <100ms from exchange to database
- Signal generation: <500ms for standard strategies
- API response time: <200ms for 95% of requests
- Dashboard updates: <2 seconds for real-time data

**NFR-002: Scalability**
- Database queries scalable to 1M+ records per table
- Horizontal scaling capability for all services (in future)

2. Reliability Requirements

**NFR-003: Availability**
- System uptime: 99.5% excluding planned maintenance
- Data collection: 99.9% uptime during market hours
- Automatic failover for critical services
- Graceful degradation during partial outages

**NFR-004: Data Integrity**
- Zero data loss for executed trades
- Transactional consistency for all financial operations
- Regular database backups with point-in-time recovery
- Data validation and error correction mechanisms

3. Security Requirements

**NFR-005: Authentication & Authorization** (in future)

**NFR-006: Data Protection**
- End-to-end encryption for sensitive data (in future)
- Secure storage of API keys and credentials
- Regular security audits and penetration testing (in future)
- Compliance with financial data protection regulations (in future)
## Technical Implementation

### Database Schema

The database schema separates frequently-accessed OHLCV data from raw tick data to optimize performance and storage.
```sql
-- OHLCV Market Data (primary table for bot operations)
CREATE TABLE market_data (
    id SERIAL PRIMARY KEY,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL, -- 1m, 5m, 15m, 1h, 4h, 1d
    timestamp TIMESTAMPTZ NOT NULL,
    open DECIMAL(18,8) NOT NULL,
    high DECIMAL(18,8) NOT NULL,
    low DECIMAL(18,8) NOT NULL,
    close DECIMAL(18,8) NOT NULL,
    volume DECIMAL(18,8) NOT NULL,
    trades_count INTEGER, -- number of trades in this candle
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(exchange, symbol, timeframe, timestamp)
);
CREATE INDEX idx_market_data_lookup ON market_data(symbol, timeframe, timestamp);
-- Descending index for "most recent candles" queries; a partial index on a
-- rolling NOW()-based window is not allowed (the predicate must be immutable)
CREATE INDEX idx_market_data_recent ON market_data(timestamp DESC);

-- Raw Trade Data (optional, for detailed backtesting only)
CREATE TABLE raw_trades (
    id SERIAL,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL,
    type VARCHAR(10) NOT NULL, -- trade, order, balance, tick, books
    data JSONB NOT NULL, -- response from the exchange
    created_at TIMESTAMPTZ DEFAULT NOW(),
    PRIMARY KEY (id, timestamp) -- the partition key must be part of the PK
) PARTITION BY RANGE (timestamp);
CREATE INDEX idx_raw_trades_symbol_time ON raw_trades(symbol, timestamp);

-- Monthly partitions for raw data (if using raw data)
-- CREATE TABLE raw_trades_y2024m01 PARTITION OF raw_trades
--     FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- Bot Management (simplified)
CREATE TABLE bots (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    strategy_name VARCHAR(50) NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,
    status VARCHAR(20) NOT NULL DEFAULT 'inactive', -- active, inactive, error
    config_file VARCHAR(200), -- path to JSON config
    virtual_balance DECIMAL(18,8) DEFAULT 10000,
    current_balance DECIMAL(18,8) DEFAULT 10000,
    last_heartbeat TIMESTAMPTZ,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Trading Signals (for analysis and debugging)
CREATE TABLE signals (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    signal_type VARCHAR(10) NOT NULL, -- buy, sell, hold
    price DECIMAL(18,8),
    confidence DECIMAL(5,4),
    indicators JSONB, -- technical indicator values
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_signals_bot_time ON signals(bot_id, timestamp);

-- Trade Execution Records
CREATE TABLE trades (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    signal_id INTEGER REFERENCES signals(id),
    timestamp TIMESTAMPTZ NOT NULL,
    side VARCHAR(5) NOT NULL, -- buy, sell
    price DECIMAL(18,8) NOT NULL,
    quantity DECIMAL(18,8) NOT NULL,
    fees DECIMAL(18,8) DEFAULT 0,
    pnl DECIMAL(18,8), -- profit/loss for this trade
    balance_after DECIMAL(18,8), -- portfolio balance after trade
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_trades_bot_time ON trades(bot_id, timestamp);

-- Performance Snapshots (for plotting portfolio over time)
CREATE TABLE bot_performance (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    total_value DECIMAL(18,8) NOT NULL, -- current portfolio value
    cash_balance DECIMAL(18,8) NOT NULL,
    crypto_balance DECIMAL(18,8) NOT NULL,
    total_trades INTEGER DEFAULT 0,
    winning_trades INTEGER DEFAULT 0,
    total_fees DECIMAL(18,8) DEFAULT 0,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_bot_performance_bot_time ON bot_performance(bot_id, timestamp);
```
**Data Storage Strategy**:

- **OHLCV Data**: Primary source for bot operations, kept indefinitely, optimized indexes
- **Raw Trade Data**: Optional table, only if detailed backtesting is needed, can be partitioned monthly
- **Alternative for Raw Data**: Store in compressed files (Parquet/CSV) instead of the database for cost efficiency

**MVP Approach**: Start with OHLCV data only, add raw data storage later if advanced backtesting requires it.
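The file-based alternative for raw data can be as simple as one compressed file per day. A hedged sketch using gzip-compressed CSV via pandas (Parquet would work the same way through `DataFrame.to_parquet`, given `pyarrow`); the paths, file naming, and `dump_raw_trades` helper are illustrative:

```python
import tempfile
from pathlib import Path

import pandas as pd


def dump_raw_trades(trades: pd.DataFrame, out_dir: Path, day: str) -> Path:
    """Write one day of raw trades as a gzip-compressed CSV file."""
    path = out_dir / f"raw_trades_{day}.csv.gz"
    trades.to_csv(path, index=False, compression="gzip")
    return path


out_dir = Path(tempfile.mkdtemp())
trades = pd.DataFrame({
    "timestamp": [1717056000000, 1717056001000],  # epoch ms
    "symbol": ["BTC-USDT", "BTC-USDT"],
    "price": [68000.5, 68001.0],
    "amount": [0.01, 0.02],
})
path = dump_raw_trades(trades, out_dir, "2024-05-30")
restored = pd.read_csv(path)  # gzip is inferred from the extension
```
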
### Technology Stack

The platform will be built using the following technologies:

- **Backend Framework**: Python 3.10+ with Dash (includes built-in Flask server for REST API endpoints)
- **Database**: PostgreSQL 14+ (with TimescaleDB extension for time-series optimization)
- **Real-time Messaging**: Redis (for pub/sub messaging between components)
- **Frontend**: Dash with Plotly (for visualization and control interface) and Mantine UI components
- **Configuration**: JSON files for strategy parameters and bot configurations
- **Deployment**: Docker container setup for development and production
### API Design

**Dash Callbacks**: Real-time updates and user interactions
**REST Endpoints**: Historical data queries for backtesting and analysis

```python
# Built-in Flask routes for historical data (handler bodies omitted)
@app.server.route('/api/bot/<bot_id>/trades')
def bot_trades(bot_id): ...

@app.server.route('/api/market/<symbol>/history')
def market_history(symbol): ...

@app.server.route('/api/backtest/results/<test_id>')
def backtest_results(test_id): ...
```
### Data Flow

The data flow follows a simple pattern to ensure efficient processing:

1. **Market Data Collection**:
   - Collector fetches data from exchange APIs
   - Raw data is optionally stored in PostgreSQL
   - Processed data (e.g., OHLCV candles) is calculated and stored
   - Real-time updates are published to Redis channels

2. **Signal Generation**:
   - Bots subscribe to relevant data channels and generate signals based on the strategy
   - Signals are stored in the database and published to Redis

3. **Trade Execution**:
   - Bot manager receives signals from strategies
   - Validates signals against bot parameters and portfolio
   - Simulates or executes trades based on configuration
   - Stores trade information in the database

4. **Visualization**:
   - Dashboard subscribes to real-time data and trading updates
   - Queries historical data for charts and performance metrics
   - Provides an interface for bot management and configuration
## Development Roadmap

### Phase 1: Foundation (Days 1-5)

**Objective**: Establish core system components and data flow

1. **Day 1-2**: Database Setup and Data Collection
   - Set up PostgreSQL with initial schema
   - Implement OKX API connector
   - Create data storage and processing logic

2. **Day 3-4**: Strategy Engine and Bot Manager
   - Develop strategy interface and 1-2 example strategies
   - Create bot manager with basic controls
   - Implement Redis for real-time messaging

3. **Day 5**: Basic Visualization
   - Set up Dash/Plotly for simple charts
   - Create basic dashboard layout
   - Connect to real-time data sources
   - Create mockup strategies and bots

### Phase 2: Core Functionality (Days 6-10)

**Objective**: Complete essential features for strategy testing

1. **Day 6-7**: Backtesting Engine
   - Get historical data from the database or file (already available for BTC/USDT in CSV format)
   - Create performance calculation metrics
   - Develop strategy comparison tools

2. **Day 8-9**: Trading Logic
   - Implement virtual trading capability
   - Create trade execution logic
   - Develop portfolio tracking

3. **Day 10**: Dashboard Enhancement
   - Improve visualization components
   - Add bot control interface
   - Implement real-time performance monitoring

### Phase 3: Refinement (Days 11-14)

**Objective**: Polish the system and prepare for ongoing use

1. **Day 11-12**: Testing and Debugging
   - Comprehensive system testing
   - Fix identified issues
   - Performance optimization

2. **Day 13-14**: Documentation and Deployment
   - Create user documentation
   - Prepare deployment process
   - Set up basic monitoring
## Technical Considerations

### Scalability Path

While the initial system is designed as a monolithic application for rapid development, several considerations ensure future scalability:

1. **Module Separation**: Clear boundaries between components enable future extraction into microservices
2. **Database Design**: Schema supports partitioning and sharding for larger data volumes
3. **Message Queue**: Redis implementation paves the way for more robust messaging (Kafka/RabbitMQ)
4. **API-First Design**: Internal components communicate through well-defined interfaces

### Time Aggregation

Special attention is given to time aggregation to ensure consistency with exchanges:
```python
|
||||
def aggregate_candles(trades, timeframe, alignment='right'):
|
||||
"""
|
||||
Aggregate trade data into OHLCV candles with consistent timestamp alignment.
|
||||
|
||||
Parameters:
|
||||
- trades: List of trade dictionaries with timestamp and price
|
||||
- timeframe: String representing the timeframe (e.g., '1m', '5m', '1h')
|
||||
- alignment: String indicating timestamp alignment ('right' or 'left')
|
||||
|
||||
Returns:
|
||||
- Dictionary with OHLCV data
|
||||
"""
|
||||
# Convert timeframe to pandas offset
|
||||
if timeframe.endswith('m'):
|
||||
offset = pd.Timedelta(minutes=int(timeframe[:-1]))
|
||||
elif timeframe.endswith('h'):
|
||||
offset = pd.Timedelta(hours=int(timeframe[:-1]))
|
||||
elif timeframe.endswith('d'):
|
||||
offset = pd.Timedelta(days=int(timeframe[:-1]))
|
||||
|
||||
# Create DataFrame from trades
|
||||
df = pd.DataFrame(trades)
|
||||
|
||||
# Convert timestamps to pandas datetime
|
||||
df['timestamp'] = pd.to_datetime(df['timestamp'], unit='ms')
|
||||
|
||||
# Floor timestamps to timeframe
|
||||
if alignment == 'right':
|
||||
df['candle_time'] = df['timestamp'].dt.floor(offset)
|
||||
else:
|
||||
df['candle_time'] = df['timestamp'].dt.ceil(offset) - offset
|
||||
|
||||
# Aggregate to OHLCV
|
||||
candles = df.groupby('candle_time').agg({
|
||||
'price': ['first', 'max', 'min', 'last'],
|
||||
'amount': 'sum'
|
||||
}).reset_index()
|
||||
|
||||
# Rename columns
|
||||
candles.columns = ['timestamp', 'open', 'high', 'low', 'close', 'volume']
|
||||
|
||||
return candles
|
||||
```
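
The floor-based alignment can be double-checked with plain integer arithmetic on millisecond timestamps, independent of pandas:

```python
def candle_open_ms(ts_ms, timeframe_ms):
    """Floor a millisecond timestamp to the open time of its candle."""
    return ts_ms - (ts_ms % timeframe_ms)


FIVE_MIN_MS = 5 * 60 * 1000

# A trade 3 minutes past the epoch belongs to the 00:00-00:05 candle
print(candle_open_ms(3 * 60 * 1000, FIVE_MIN_MS))                # 0
# A trade at exactly 00:05 opens the next candle
print(candle_open_ms(FIVE_MIN_MS, FIVE_MIN_MS) == FIVE_MIN_MS)   # True
```

Spot-checking candle boundaries this way against the exchange's published candles is a quick consistency test for the aggregation pipeline.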
### Performance Optimization

For the initial release, several performance optimizations are implemented:

1. **Database Indexing**: Proper indexes on timestamp and symbol fields
2. **Query Optimization**: Prepared statements and efficient query patterns
3. **Connection Pooling**: Database connection management to prevent leaks
4. **Data Aggregation**: Pre-calculation of common time intervals
5. **Memory Management**: Proper cleanup of data objects after processing

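
A minimal sketch of the indexing and prepared-statement points, using SQLite for illustration (the real schema and column names may differ):

```python
import sqlite3

# Illustrative schema; the production table may look different.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trades (
        symbol    TEXT    NOT NULL,
        timestamp INTEGER NOT NULL,  -- milliseconds since epoch
        price     REAL    NOT NULL,
        amount    REAL    NOT NULL
    )
""")

# Composite index matching the dominant query pattern:
#   WHERE symbol = ? AND timestamp BETWEEN ? AND ?
conn.execute("CREATE INDEX idx_trades_symbol_ts ON trades (symbol, timestamp)")

# Parameterized query (sqlite3 prepares and caches the statement)
rows = conn.execute(
    "SELECT price, amount FROM trades WHERE symbol = ? AND timestamp BETWEEN ? AND ?",
    ("BTC/USDT", 0, 10_000),
).fetchall()
print(rows)  # [] -- the table is empty
```

Putting `symbol` first in the index lets equality on symbol narrow the range scan on timestamp, which suits the time-window queries the dashboard issues.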
## User Interface

The initial user interface focuses on functionality over aesthetics, providing essential controls and visualizations in a minimalistic design.

1. **Market Data View**
   - Real-time price charts for monitored symbols
   - Order book visualization
   - Recent trades list

2. **Bot Management**
   - Create/configure bot interface
   - Start/stop controls
   - Status indicators

3. **Strategy Dashboard**
   - Strategy selection and configuration
   - Signal visualization
   - Performance metrics

4. **Backtesting Interface**
   - Historical data selection
   - Strategy parameter configuration
   - Results visualization

## Risk Management & Mitigation

### Technical Risks

**Risk:** Exchange API rate limiting affecting data collection

**Mitigation:** Implement intelligent rate limiting, multiple API keys, and fallback data sources

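
Client-side rate limiting of this kind is commonly implemented as a token bucket; a minimal sketch (the rate and capacity values are illustrative, not OKX's actual limits):

```python
import time


class TokenBucket:
    """Client-side rate limiter: at most `rate` requests/second, bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.updated = time.monotonic()

    def acquire(self):
        """Block until a request slot is available, then consume one token."""
        while True:
            now = time.monotonic()
            # Refill tokens for the elapsed interval, capped at capacity
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.updated) * self.rate)
            self.updated = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            # Sleep just long enough for one token to accrue
            time.sleep((1 - self.tokens) / self.rate)


bucket = TokenBucket(rate=10, capacity=2)  # 10 req/s, burst of 2
for _ in range(4):
    bucket.acquire()  # the third and fourth calls block briefly
```

Wrapping each exchange client's HTTP calls in `acquire()` keeps the collector inside the exchange's published limits without scattering sleeps through the code.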
**Risk:** Database performance degradation with large datasets

**Mitigation:** Implement data partitioning, archival strategies, and query optimization (in future)

**Risk:** System downtime during market volatility

**Mitigation:** Design redundant systems, implement circuit breakers, and emergency procedures (in future)

### Business Risks

**Risk:** Regulatory changes affecting crypto trading

**Mitigation:** Implement compliance monitoring, maintain regulatory awareness, design for adaptability

**Risk:** Competition from established trading platforms

**Mitigation:** Focus on unique value propositions, rapid feature development, strong user experience

### User Risks

**Risk:** User losses due to platform errors

**Mitigation:** Comprehensive testing, simulation modes, risk warnings, and liability disclaimers

## Future Expansion

While keeping the initial implementation simple, the design accommodates future enhancements:

1. **Authentication System**: Add multi-user support with role-based access
2. **Advanced Strategies**: Support for machine learning and AI-based strategies
3. **Multi-Exchange Support**: Expand beyond OKX to other exchanges
4. **Microservices Migration**: Extract components into separate services
5. **Advanced Monitoring**: Integration with Prometheus/Grafana
6. **Cloud Deployment**: Support for AWS/GCP/Azure deployment

## Success Metrics

The platform's success will be measured by these key metrics:

1. **Development Timeline**: Complete core functionality within 14 days
2. **System Stability**: Maintain 99% uptime during internal testing; the system should monitor itself and restart failed modules (or the whole system) as needed
3. **Strategy Testing**: Successfully backtest at least 3 different strategies
4. **Bot Performance**: Run at least 2 bots concurrently for 72+ hours

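
The self-monitoring requirement in metric 2 can be sketched as a supervisor loop that restarts a crashed module with limited retries (the module and restart policy here are illustrative):

```python
import time


def supervise(start_module, max_restarts=3, backoff=0.01):
    """Run a module, restarting it on failure up to max_restarts times."""
    restarts = 0
    while True:
        try:
            return start_module()
        except Exception as exc:
            restarts += 1
            if restarts > max_restarts:
                raise RuntimeError("module keeps crashing") from exc
            print(f"module crashed ({exc}); restart {restarts}/{max_restarts}")
            time.sleep(backoff)  # simple fixed backoff before restarting


attempts = {"n": 0}


def flaky_collector():
    """Stand-in for a data-collection module that fails twice, then runs."""
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("websocket dropped")
    return "running"


print(supervise(flaky_collector))  # "running" after two restarts
```

In production this loop would wrap each long-running module (collector, bot runner, dashboard backend) in its own supervisor, optionally with exponential backoff and alerting on repeated failures.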