PRD — commit de6ddbf1d8 (parent e7314bb238)

docs/architecture.md (new file, 165 lines)

## Architecture Components

### 1. Data Collector

**Responsibility**: Unified data collection from multiple exchanges

```python
class DataCollector:
    def __init__(self):
        self.providers = {}  # Registry of data providers

    def register_provider(self, name: str, provider: DataProvider):
        """Register a new data provider"""

    def start_collection(self, symbols: List[str]):
        """Start collecting data for specified symbols"""

    def process_raw_data(self, raw_data: dict):
        """Process raw data into OHLCV format"""

    def send_signal_to_bots(self, processed_data: dict):
        """Send Redis signal to active bots"""
```

### 2. Strategy Engine

**Responsibility**: Unified interface for all trading strategies

```python
class BaseStrategy:
    def __init__(self, parameters: dict):
        self.parameters = parameters

    def process_data(self, data: pd.DataFrame) -> Signal:
        """Process market data and generate signals"""
        raise NotImplementedError

    def get_indicators(self) -> dict:
        """Return calculated indicators for plotting"""
        return {}
```

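To make the interface concrete, here is a minimal strategy sketch. The `Signal` stand-in, the parameter names, and the crossover rule are illustrative assumptions, not part of the specification; a real implementation would subclass the `BaseStrategy` above.

```python
import pandas as pd

class Signal:
    """Minimal stand-in for the platform's Signal type (assumption)."""
    def __init__(self, action: str, confidence: float):
        self.action = action
        self.confidence = confidence

class EmaCrossoverStrategy:
    """Example strategy following the BaseStrategy interface sketched above."""
    def __init__(self, parameters: dict):
        self.parameters = parameters
        self._indicators = {}

    def process_data(self, data: pd.DataFrame) -> Signal:
        # Fast/slow EMAs over the close price; signal only on a fresh crossover
        fast = data['close'].ewm(span=self.parameters['fast_period'], adjust=False).mean()
        slow = data['close'].ewm(span=self.parameters['slow_period'], adjust=False).mean()
        self._indicators = {'ema_fast': fast, 'ema_slow': slow}
        if fast.iloc[-1] > slow.iloc[-1] and fast.iloc[-2] <= slow.iloc[-2]:
            return Signal('buy', 1.0)
        if fast.iloc[-1] < slow.iloc[-1] and fast.iloc[-2] >= slow.iloc[-2]:
            return Signal('sell', 1.0)
        return Signal('hold', 0.5)

    def get_indicators(self) -> dict:
        return self._indicators
```
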
### 3. Bot Manager

**Responsibility**: Orchestrate bot execution and state management

```python
class BotManager:
    def __init__(self):
        self.active_bots = {}

    def start_bot(self, bot_id: int):
        """Start a bot instance"""

    def stop_bot(self, bot_id: int):
        """Stop a bot instance"""

    def process_signal(self, bot_id: int, signal: Signal):
        """Process signal and make trading decision"""

    def update_bot_state(self, bot_id: int, state: dict):
        """Update bot state in database"""
```

## Communication Architecture

### Redis Pub/Sub Patterns

```python
# Real-time market data
MARKET_DATA_CHANNEL = "market_data:{symbol}"

# Bot-specific signals
BOT_SIGNAL_CHANNEL = "bot_signals:{bot_id}"

# Trade updates
TRADE_UPDATE_CHANNEL = "trade_updates:{bot_id}"

# System events
SYSTEM_EVENT_CHANNEL = "system_events"
```

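The placeholders in these channel names are filled per symbol or bot. A small helper keeps the formatting consistent; the `publish_json` wrapper below is a sketch assuming a `redis-py`-style client with a `publish(channel, message)` method.

```python
import json

MARKET_DATA_CHANNEL = "market_data:{symbol}"
BOT_SIGNAL_CHANNEL = "bot_signals:{bot_id}"

def channel_for(template: str, **kwargs) -> str:
    """Fill a channel template, e.g. 'market_data:{symbol}' -> 'market_data:BTC-USDT'."""
    return template.format(**kwargs)

def publish_json(client, template: str, payload: dict, **kwargs) -> None:
    """Publish a JSON-encoded payload so any subscriber can decode it."""
    client.publish(channel_for(template, **kwargs), json.dumps(payload))
```
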
### WebSocket Communication

```python
# Frontend real-time updates
WS_BOT_STATUS = "/ws/bot/{bot_id}/status"
WS_MARKET_DATA = "/ws/market/{symbol}"
WS_PORTFOLIO = "/ws/portfolio/{bot_id}"
```

## Time Aggregation Strategy

### Candlestick Alignment

- **Use RIGHT-ALIGNED timestamps** (industry standard)
- A 5-minute candle with timestamp 09:05:00 represents data from 09:00:01 to 09:05:00
- Timestamp = close time of the candle
- Aligns with major exchanges (Binance, OKX, Coinbase)

### Aggregation Logic

```python
from typing import Iterator, List

def aggregate_to_timeframe(ticks: List[dict], timeframe: str) -> Iterator[dict]:
    """
    Aggregate tick data to the specified timeframe.

    timeframe: '1m', '5m', '15m', '1h', '4h', '1d'
    """
    # Convert timeframe to seconds
    interval_seconds = parse_timeframe(timeframe)

    # Group ticks by time intervals (right-aligned)
    for group in group_by_interval(ticks, interval_seconds):
        candle = {
            'timestamp': group.end_time,  # Right-aligned: close time of the candle
            'open': group.first_price,
            'high': group.max_price,
            'low': group.min_price,
            'close': group.last_price,
            'volume': group.total_volume,
        }
        yield candle
```

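The helper `parse_timeframe` referenced above is left undefined in the sketch; one plausible implementation (an assumption, covering only the timeframes listed in the docstring) is:

```python
def parse_timeframe(timeframe: str) -> int:
    """Convert a timeframe string such as '5m', '1h', or '1d' into seconds."""
    units = {'m': 60, 'h': 3600, 'd': 86400}
    value, unit = timeframe[:-1], timeframe[-1]
    if not value.isdigit() or unit not in units:
        raise ValueError(f"unsupported timeframe: {timeframe!r}")
    return int(value) * units[unit]
```
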
## Backtesting Optimization

### Parallel Processing Strategy

```python
import numba
import numpy as np
from joblib import Parallel, delayed

@numba.jit(nopython=True)
def calculate_signals_vectorized(prices, parameters):
    """Vectorized signal calculation compiled with Numba"""
    signals = np.zeros(prices.shape[0])
    # High-performance, strategy-specific signal calculation goes here
    return signals

def backtest_strategy_batch(data_batch, strategy_params):
    """Backtest a batch of data in parallel"""
    # Process batch of signals
    signals = calculate_signals_vectorized(data_batch, strategy_params)

    # Simulate trades incrementally
    portfolio = simulate_trades(signals, data_batch)
    return portfolio

# Parallel backtesting
def run_parallel_backtest(data, strategy_params, n_jobs=4):
    data_batches = split_data_into_batches(data, n_jobs)

    results = Parallel(n_jobs=n_jobs)(
        delayed(backtest_strategy_batch)(batch, strategy_params)
        for batch in data_batches
    )

    return combine_results(results)
```

### Optimization Techniques

1. **Vectorized Operations**: Use NumPy/Pandas for bulk calculations
2. **Numba JIT**: Compile critical loops for C-like performance
3. **Batch Processing**: Process signals in batches, simulate trades incrementally
4. **Memory Management**: Use efficient data structures (arrays vs lists)
5. **Parallel Execution**: Utilize multiple CPU cores for independent calculations

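As a small illustration of technique 1, crossover detection can be computed over entire arrays at once instead of looping bar by bar (the function name and the +1/-1/0 signal encoding are illustrative):

```python
import numpy as np

def crossover_signals(fast: np.ndarray, slow: np.ndarray) -> np.ndarray:
    """Return +1 where fast crosses above slow, -1 where it crosses below, else 0."""
    above = (fast > slow).astype(np.int8)
    # diff marks the bars where the relationship flips; prepend keeps the length
    return np.diff(above, prepend=above[0])
```
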
## Key Design Principles

1. **Data Separation**: Raw and processed data stored separately for audit trail
2. **Signal Tracking**: All signals recorded (executed or not) for analysis
3. **Real-time State**: Bot states updated in real-time for monitoring
4. **Audit Trail**: Complete record of all trading activities
5. **Scalability**: Architecture supports multiple bots and strategies
6. **Modularity**: Clear separation between data collection, strategy execution, and trading
7. **Fault Tolerance**: Redis for reliable message delivery, database transactions for consistency

docs/crypto-bot-prd.md (new file, 608 lines)

# Simplified Crypto Trading Bot Platform: Product Requirements Document (PRD)

**Version:** 1.0
**Date:** May 30, 2025
**Author:** Vasily
**Status:** Draft

## Executive Summary

This PRD outlines the development of a simplified crypto trading bot platform that enables strategy testing, development, and execution without the complexity of microservices and advanced monitoring. The goal is to deliver a functional system within 1-2 weeks that supports strategy testing while establishing a foundation that can scale in the future. The platform addresses data collection, strategy execution, visualization, and backtesting in a monolithic architecture optimized for internal use.

## Current Requirements & Constraints

- **Speed to Deployment**: System must be functional within 1-2 weeks
- **Scale**: Support for 5-10 concurrent trading bots
- **Architecture**: Monolithic application instead of microservices
- **User Access**: Internal use only initially (no multi-user authentication)
- **Infrastructure**: Simplified deployment without Kubernetes/Docker Swarm
- **Monitoring**: Basic logging for modules

## System Architecture

### High-Level Architecture

The platform will follow a monolithic architecture pattern to enable rapid development while providing clear separation between components:

### Data Flow Architecture

```
OKX Exchange API (WebSocket)
            ↓
Data Collector → OHLCV Aggregator → PostgreSQL (market_data)
            ↓                ↓
[Optional] Raw Trade Storage    Redis Pub/Sub → Strategy Engine (JSON configs)
            ↓                               ↓
Files/Database (raw_trades)     Signal Generation → Bot Manager
                                            ↓
                    PostgreSQL (signals, trades, bot_performance)
                                            ↓
                    Dashboard (REST API) ← PostgreSQL (historical data)
                                            ↑
                            Real-time Updates ← Redis Channels
```

**Data Processing Priority**:
1. **Real-time**: Raw data → OHLCV candles → Redis → Bots (primary flow)
2. **Historical**: OHLCV data from PostgreSQL for backtesting and charts
3. **Advanced Analysis**: Raw trade data (if stored) for detailed backtesting

### Redis Channel Design

```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}"     # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}"    # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}"      # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events"     # Global notifications
```

### Configuration Strategy

**PostgreSQL for**: Market data, bot instances, trades, signals, performance metrics
**JSON files for**: Strategy parameters and bot configurations (rapid testing and parameter tuning)

```json
// config/strategies/ema_crossover.json
{
  "strategy_name": "EMA_Crossover",
  "parameters": {
    "fast_period": 12,
    "slow_period": 26,
    "risk_percentage": 0.02
  }
}

// config/bots/bot_001.json
{
  "bot_id": "bot_001",
  "strategy_file": "ema_crossover.json",
  "symbol": "BTC-USDT",
  "virtual_balance": 10000,
  "enabled": true
}
```

### Error Handling Strategy

**Bot Crash Recovery**:
- Monitor bot processes every 30 seconds
- Auto-restart crashed bots if status = 'active'
- Log all crashes with stack traces
- Maximum 3 restart attempts per hour

**Exchange Connection Issues**:
- Retry with exponential backoff (1s, 2s, 4s, 8s, max 60s)
- Switch to a backup WebSocket connection if available
- Log connection quality metrics

**Database Errors**:
- Continue operating from an in-memory cache for up to 5 minutes
- Queue operations for retry once the connection is restored
- Alert on prolonged database disconnection

**Application Restart Recovery**:
- Read bot states from the database on startup
- Restore active bots to 'active' status
- Resume data collection for all monitored symbols

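The reconnect schedule described above (1s, 2s, 4s, 8s, capped at 60s) reduces to a one-line delay function; this is a sketch, with the function name chosen for illustration:

```python
def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Seconds to wait before reconnect attempt `attempt` (0-based): 1, 2, 4, 8, ... capped at 60."""
    return min(base * (2 ** attempt), cap)
```
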
### Component Details and Functional Requirements

1. **Data Collection Module**
   - Connect to exchange APIs (OKX initially) via WebSocket
   - Aggregate real-time trades into OHLCV candles (1m, 5m, 15m, 1h, 4h, 1d)
   - Store OHLCV data in PostgreSQL for bot operations and backtesting
   - Send real-time candle updates through Redis
   - Optional: Store raw trade data for advanced backtesting

**FR-001: Unified Data Provider Interface**
- Support multiple exchanges through standardized adapters
- Real-time OHLCV aggregation with WebSocket connections
- Primary focus on candle data; raw data storage optional
- Data validation and error handling mechanisms

**FR-002: Market Data Processing**
- OHLCV aggregation with configurable timeframes (1m base, higher timeframes derived)
- Technical indicator calculation (SMA, EMA, RSI, MACD, Bollinger Bands) on OHLCV data
- Data normalization across different exchanges
- Time alignment following exchange standards (right-aligned candles)

2. **Strategy Engine**
   - Provide unified interface for all trading strategies
   - Support multiple strategy types with common parameter structure
   - Generate trading signals based on market data
   - Log strategy performance and signals
   - Strategies implemented as classes

**FR-003: Strategy Framework**
- Base strategy class with standardized interface
- Support for multiple strategy types
- Parameter configuration and optimization tools (JSON for the parameters)
- Signal generation with confidence scoring

**FR-004: Signal Processing**
- Real-time signal calculation and validation
- Signal persistence for analysis and debugging
- Multi-timeframe analysis capabilities
- Custom indicator development support

3. **Bot Manager**
   - Create and manage up to 10 concurrent trading bots
   - Configure bot parameters and associated strategies
   - Start/stop individual bots
   - Track bot status and performance

**FR-005: Bot Lifecycle Management**
- Bot creation with strategy and parameter selection
- Start/stop/pause functionality with state persistence
- Configuration management
- Resource allocation and monitoring (in future)

**FR-006: Portfolio Management**
- Position tracking and balance management
- Risk management controls (stop-loss, take-profit, position sizing)
- Multi-bot coordination and conflict resolution (in future)
- Real-time portfolio valuation (in future)

4. **Trading Execution**
   - Simulate or execute trades based on configuration
   - Store trade information in the database

**FR-007: Order Management**
- Order placement with multiple order types (market, limit, stop)
- Order tracking and status monitoring (in future)
- Execution confirmation and reconciliation (in future)
- Fee calculation and tracking (in future)

**FR-008: Risk Controls**
- Pre-trade risk validation
- Position limits and exposure controls (in future)
- Emergency stop mechanisms (in future)
- Compliance monitoring and reporting (in future)

5. **Database (PostgreSQL)**
   - Store market data, bot configurations, and trading history
   - Optimized schema for time-series data without unnecessary complexity
   - Support for data querying and aggregation

   **Database (JSON)**
   - Store strategy parameters and bot configuration in JSON initially, for simplicity of editing and testing

6. **Backtesting Engine**
   - Run simulations on historical data using vectorized operations for speed
   - Calculate performance metrics
   - Support multiple timeframes and strategy parameter testing
   - Generate comparison reports between strategies

**FR-009: Historical Simulation**
- Strategy backtesting on historical market data
- Performance metric calculation (Sharpe ratio, drawdown, win rate, total return)
- Parameter optimization through grid search (limited combinations for speed) (in future)
- Side-by-side strategy comparison with statistical significance

**FR-010: Simulation Engine**
- Vectorized signal calculation using pandas operations
- Realistic fee modeling (0.1% per trade for OKX)
- Look-ahead bias prevention with proper timestamp handling
- Configurable test periods (1 day to 24 months)

7. **Dashboard & Visualization**
   - Display real-time market data and bot status
   - Show portfolio value progression over time
   - Visualize trade history with buy/sell markers on price charts
   - Provide a simple bot control interface (start/stop/configure)

**FR-011: Dashboard Interface**
- Real-time bot monitoring with status indicators
- Portfolio performance charts (total value, cash vs crypto allocation)
- Trade history table with P&L per trade
- Simple bot configuration forms for JSON parameter editing

**FR-012: Data Visualization**
- Interactive price charts with strategy signal overlays
- Portfolio value progression charts
- Performance comparison tables (multiple bots side-by-side)
- Fee tracking and total cost analysis

### Non-Functional Requirements

1. Performance Requirements

**NFR-001: Latency**
- Market data processing: <100ms from exchange to database
- Signal generation: <500ms for standard strategies
- API response time: <200ms for 95% of requests
- Dashboard updates: <2 seconds for real-time data

**NFR-002: Scalability**
- Database queries scalable to 1M+ records per table
- Horizontal scaling capability for all services (in future)

2. Reliability Requirements

**NFR-003: Availability**
- System uptime: 99.5% excluding planned maintenance
- Data collection: 99.9% uptime during market hours
- Automatic failover for critical services
- Graceful degradation during partial outages

**NFR-004: Data Integrity**
- Zero data loss for executed trades
- Transactional consistency for all financial operations
- Regular database backups with point-in-time recovery
- Data validation and error correction mechanisms

3. Security Requirements

**NFR-005: Authentication & Authorization** (in future)

**NFR-006: Data Protection**
- End-to-end encryption for sensitive data (in future)
- Secure storage of API keys and credentials
- Regular security audits and penetration testing (in future)
- Compliance with financial data protection regulations (in future)

## Technical Implementation

### Database Schema

The database schema separates frequently accessed OHLCV data from raw tick data to optimize performance and storage.

```sql
-- OHLCV Market Data (primary table for bot operations)
CREATE TABLE market_data (
    id SERIAL PRIMARY KEY,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,   -- 1m, 5m, 15m, 1h, 4h, 1d
    timestamp TIMESTAMPTZ NOT NULL,
    open DECIMAL(18,8) NOT NULL,
    high DECIMAL(18,8) NOT NULL,
    low DECIMAL(18,8) NOT NULL,
    close DECIMAL(18,8) NOT NULL,
    volume DECIMAL(18,8) NOT NULL,
    trades_count INTEGER,            -- number of trades in this candle
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(exchange, symbol, timeframe, timestamp)
);
CREATE INDEX idx_market_data_lookup ON market_data(symbol, timeframe, timestamp);
-- Note: a partial index predicate cannot use NOW() (not IMMUTABLE),
-- so the "recent data" index is a plain descending index instead
CREATE INDEX idx_market_data_recent ON market_data(timestamp DESC);

-- Raw Trade Data (optional, for detailed backtesting only)
CREATE TABLE raw_trades (
    id SERIAL,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL,
    type VARCHAR(10) NOT NULL,   -- trade, order, balance, tick, books
    data JSONB NOT NULL,         -- response from the exchange
    created_at TIMESTAMPTZ DEFAULT NOW(),
    PRIMARY KEY (id, timestamp)  -- the PK of a partitioned table must include the partition key
) PARTITION BY RANGE (timestamp);
CREATE INDEX idx_raw_trades_symbol_time ON raw_trades(symbol, timestamp);

-- Monthly partitions for raw data (if using raw data)
-- CREATE TABLE raw_trades_y2024m01 PARTITION OF raw_trades
--     FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- Bot Management (simplified)
CREATE TABLE bots (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    strategy_name VARCHAR(50) NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,
    status VARCHAR(20) NOT NULL DEFAULT 'inactive',  -- active, inactive, error
    config_file VARCHAR(200),                        -- path to JSON config
    virtual_balance DECIMAL(18,8) DEFAULT 10000,
    current_balance DECIMAL(18,8) DEFAULT 10000,
    last_heartbeat TIMESTAMPTZ,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Trading Signals (for analysis and debugging)
CREATE TABLE signals (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    signal_type VARCHAR(10) NOT NULL,  -- buy, sell, hold
    price DECIMAL(18,8),
    confidence DECIMAL(5,4),
    indicators JSONB,                  -- technical indicator values
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_signals_bot_time ON signals(bot_id, timestamp);

-- Trade Execution Records
CREATE TABLE trades (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    signal_id INTEGER REFERENCES signals(id),
    timestamp TIMESTAMPTZ NOT NULL,
    side VARCHAR(5) NOT NULL,       -- buy, sell
    price DECIMAL(18,8) NOT NULL,
    quantity DECIMAL(18,8) NOT NULL,
    fees DECIMAL(18,8) DEFAULT 0,
    pnl DECIMAL(18,8),              -- profit/loss for this trade
    balance_after DECIMAL(18,8),    -- portfolio balance after trade
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_trades_bot_time ON trades(bot_id, timestamp);

-- Performance Snapshots (for plotting portfolio over time)
CREATE TABLE bot_performance (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    total_value DECIMAL(18,8) NOT NULL,  -- current portfolio value
    cash_balance DECIMAL(18,8) NOT NULL,
    crypto_balance DECIMAL(18,8) NOT NULL,
    total_trades INTEGER DEFAULT 0,
    winning_trades INTEGER DEFAULT 0,
    total_fees DECIMAL(18,8) DEFAULT 0,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_bot_performance_bot_time ON bot_performance(bot_id, timestamp);
```

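The `UNIQUE(exchange, symbol, timeframe, timestamp)` constraint makes candle writes idempotent: when a candle is re-aggregated, an upsert can fold the new values into the existing row instead of failing. A sketch of the statement (psycopg2-style named placeholders and the merge rule are assumptions):

```python
# Upsert so that re-processing a candle updates the row instead of raising a
# unique-violation error; high/low are merged, close/volume take the new values
UPSERT_CANDLE_SQL = """
INSERT INTO market_data
    (exchange, symbol, timeframe, timestamp, open, high, low, close, volume)
VALUES
    (%(exchange)s, %(symbol)s, %(timeframe)s, %(timestamp)s,
     %(open)s, %(high)s, %(low)s, %(close)s, %(volume)s)
ON CONFLICT (exchange, symbol, timeframe, timestamp)
DO UPDATE SET
    high = GREATEST(market_data.high, EXCLUDED.high),
    low = LEAST(market_data.low, EXCLUDED.low),
    close = EXCLUDED.close,
    volume = EXCLUDED.volume
"""
```
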
**Data Storage Strategy**:
- **OHLCV Data**: Primary source for bot operations, kept indefinitely, optimized indexes
- **Raw Trade Data**: Optional table, only if detailed backtesting is needed; can be partitioned monthly
- **Alternative for Raw Data**: Store in compressed files (Parquet/CSV) instead of the database for cost efficiency

**MVP Approach**: Start with OHLCV data only; add raw data storage later if advanced backtesting requires it.

### Technology Stack

The platform will be built using the following technologies:

- **Backend Framework**: Python 3.10+ with Dash (includes a built-in Flask server for REST API endpoints)
- **Database**: PostgreSQL 14+ (with TimescaleDB extension for time-series optimization)
- **Real-time Messaging**: Redis (for pub/sub messaging between components)
- **Frontend**: Dash with Plotly (for visualization and control interface) and Mantine UI components
- **Configuration**: JSON files for strategy parameters and bot configurations
- **Deployment**: Docker container setup for development and production

### API Design

**Dash Callbacks**: Real-time updates and user interactions
**REST Endpoints**: Historical data queries for backtesting and analysis

```python
# Built-in Flask routes for historical data
@app.server.route('/api/bot/<bot_id>/trades')
@app.server.route('/api/market/<symbol>/history')
@app.server.route('/api/backtest/results/<test_id>')
```

### Data Flow

The data flow follows a simple pattern to ensure efficient processing:

1. **Market Data Collection**:
   - Collector fetches data from exchange APIs
   - Raw data is stored in PostgreSQL
   - Processed data (e.g., OHLCV candles) is calculated and stored
   - Real-time updates are published to Redis channels

2. **Signal Generation**:
   - Bots subscribe to relevant data channels and generate signals based on their strategy
   - Signals are stored in the database and published to Redis

3. **Trade Execution**:
   - Bot manager receives signals from strategies
   - Validates signals against bot parameters and portfolio
   - Simulates or executes trades based on configuration
   - Stores trade information in the database

4. **Visualization**:
   - Dashboard subscribes to real-time data and trading updates
   - Queries historical data for charts and performance metrics
   - Provides an interface for bot management and configuration

## Development Roadmap

### Phase 1: Foundation (Days 1-5)

**Objective**: Establish core system components and data flow

1. **Days 1-2**: Database Setup and Data Collection
   - Set up PostgreSQL with initial schema
   - Implement OKX API connector
   - Create data storage and processing logic

2. **Days 3-4**: Strategy Engine and Bot Manager
   - Develop strategy interface and 1-2 example strategies
   - Create bot manager with basic controls
   - Implement Redis for real-time messaging

3. **Day 5**: Basic Visualization
   - Set up Dash/Plotly for simple charts
   - Create basic dashboard layout
   - Connect to real-time data sources
   - Create mockup strategies and bots

### Phase 2: Core Functionality (Days 6-10)

**Objective**: Complete essential features for strategy testing

1. **Days 6-7**: Backtesting Engine
   - Load historical data from the database or files (BTC/USDT already available in CSV format)
   - Create performance calculation metrics
   - Develop strategy comparison tools

2. **Days 8-9**: Trading Logic
   - Implement virtual trading capability
   - Create trade execution logic
   - Develop portfolio tracking

3. **Day 10**: Dashboard Enhancement
   - Improve visualization components
   - Add bot control interface
   - Implement real-time performance monitoring

### Phase 3: Refinement (Days 11-14)

**Objective**: Polish system and prepare for ongoing use

1. **Days 11-12**: Testing and Debugging
   - Comprehensive system testing
   - Fix identified issues
   - Performance optimization

2. **Days 13-14**: Documentation and Deployment
   - Create user documentation
   - Prepare deployment process
   - Set up basic monitoring

## Technical Considerations

### Scalability Path

While the initial system is designed as a monolithic application for rapid development, several considerations ensure future scalability:

1. **Module Separation**: Clear boundaries between components enable future extraction into microservices
2. **Database Design**: Schema supports partitioning and sharding for larger data volumes
3. **Message Queue**: Redis implementation paves the way for more robust messaging (Kafka/RabbitMQ)
4. **API-First Design**: Internal components communicate through well-defined interfaces

### Time Aggregation

Special attention is given to time aggregation to ensure consistency with exchanges:

```python
import pandas as pd

def aggregate_candles(trades, timeframe, alignment='right'):
    """
    Aggregate trade data into OHLCV candles with consistent timestamp alignment.

    Parameters:
    - trades: List of trade dictionaries with timestamp, price, and amount
    - timeframe: String representing the timeframe (e.g., '1m', '5m', '1h')
    - alignment: Timestamp alignment ('right' = close time, 'left' = open time)

    Returns:
    - DataFrame with OHLCV columns, one row per candle
    """
    # Convert timeframe to a pandas offset
    if timeframe.endswith('m'):
        offset = pd.Timedelta(minutes=int(timeframe[:-1]))
    elif timeframe.endswith('h'):
        offset = pd.Timedelta(hours=int(timeframe[:-1]))
    elif timeframe.endswith('d'):
        offset = pd.Timedelta(days=int(timeframe[:-1]))
    else:
        raise ValueError(f"Unsupported timeframe: {timeframe}")

    # Create DataFrame from trades
    df = pd.DataFrame(trades)

    # Convert timestamps to pandas datetime
    df['timestamp'] = pd.to_datetime(df['timestamp'], unit='ms')

    # Bucket timestamps: right alignment labels a candle with its close time
    # (ceil), left alignment with its open time (floor)
    if alignment == 'right':
        df['candle_time'] = df['timestamp'].dt.ceil(offset)
    else:
        df['candle_time'] = df['timestamp'].dt.floor(offset)

    # Aggregate to OHLCV
    candles = df.groupby('candle_time').agg({
        'price': ['first', 'max', 'min', 'last'],
        'amount': 'sum'
    }).reset_index()

    # Flatten the MultiIndex columns
    candles.columns = ['timestamp', 'open', 'high', 'low', 'close', 'volume']

    return candles
```

### Performance Optimization

For the initial release, several performance optimizations are implemented:

1. **Database Indexing**: Proper indexes on timestamp and symbol fields
2. **Query Optimization**: Prepared statements and efficient query patterns
3. **Connection Pooling**: Database connection management to prevent leaks
4. **Data Aggregation**: Pre-calculation of common time intervals
5. **Memory Management**: Proper cleanup of data objects after processing

## User Interface

The initial user interface favors functionality over aesthetics, providing essential controls and visualizations with a minimalistic design.

1. **Market Data View**
   - Real-time price charts for monitored symbols
   - Order book visualization
   - Recent trades list

2. **Bot Management**
   - Create/configure bot interface
   - Start/stop controls
   - Status indicators

3. **Strategy Dashboard**
   - Strategy selection and configuration
   - Signal visualization
   - Performance metrics

4. **Backtesting Interface**
   - Historical data selection
   - Strategy parameter configuration
   - Results visualization

## Risk Management & Mitigation

### Technical Risks
**Risk:** Exchange API rate limiting affecting data collection
**Mitigation:** Implement intelligent rate limiting, multiple API keys, and fallback data sources

**Risk:** Database performance degradation with large datasets
**Mitigation:** Implement data partitioning, archival strategies, and query optimization (in future)

**Risk:** System downtime during market volatility
**Mitigation:** Design redundant systems, implement circuit breakers and emergency procedures (in future)

### Business Risks
**Risk:** Regulatory changes affecting crypto trading
**Mitigation:** Implement compliance monitoring, maintain regulatory awareness, design for adaptability

**Risk:** Competition from established trading platforms
**Mitigation:** Focus on unique value propositions, rapid feature development, and a strong user experience

### User Risks
**Risk:** User losses due to platform errors
**Mitigation:** Comprehensive testing, simulation modes, risk warnings, and liability disclaimers

## Future Expansion

While keeping the initial implementation simple, the design accommodates future enhancements:

1. **Authentication System**: Add multi-user support with role-based access
2. **Advanced Strategies**: Support for machine learning and AI-based strategies
3. **Multi-Exchange Support**: Expand beyond OKX to other exchanges
4. **Microservices Migration**: Extract components into separate services
5. **Advanced Monitoring**: Integration with Prometheus/Grafana
6. **Cloud Deployment**: Support for AWS/GCP/Azure deployment

## Success Metrics

The platform's success will be measured by these key metrics:

1. **Development Timeline**: Complete core functionality within 14 days
2. **System Stability**: Maintain 99% uptime during internal testing; the system should monitor itself and restart the whole application or individual modules as needed
3. **Strategy Testing**: Successfully backtest at least 3 different strategies
4. **Bot Performance**: Run at least 2 bots concurrently for 72+ hours
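The self-restart behavior in metric 2 can be approximated with a small supervisor loop. This is a sketch under simplified assumptions (a real implementation would also log failures and use exponential backoff); the crashing child command is purely illustrative:

```python
import subprocess
import sys
import time

def supervise(cmd: list[str], max_restarts: int = 3, backoff: float = 0.1) -> int:
    """Run `cmd`, restarting it on non-zero exit up to `max_restarts` times.
    Returns the number of restarts performed."""
    restarts = 0
    while True:
        result = subprocess.run(cmd)
        if result.returncode == 0 or restarts >= max_restarts:
            return restarts
        restarts += 1
        time.sleep(backoff)  # simple fixed backoff before restarting

# Illustration: a child process that always crashes gets restarted max_restarts times
restarts = supervise([sys.executable, "-c", "raise SystemExit(1)"], max_restarts=2)
```

The same loop can wrap an individual module (e.g. the data collector) instead of the whole application.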
# Dependency Management Guide

This guide explains how to manage Python dependencies in the Crypto Trading Bot Dashboard project.

## Local Development

### Adding New Dependencies

#### 1. Core Dependencies (Required for Runtime)

To add a new core dependency:

```bash
# Method 1: Add directly to pyproject.toml
# Edit pyproject.toml and add to the dependencies list:
# "new-package>=1.0.0",

# Method 2: Use UV to add and update pyproject.toml
uv add "new-package>=1.0.0"

# Sync to install
uv sync
```

#### 2. Development Dependencies (Testing, Linting, etc.)

```bash
# Add development-only dependency
uv add --dev "new-dev-package>=1.0.0"

# Or edit pyproject.toml under [project.optional-dependencies.dev]
# Then run:
uv sync --dev
```

### Installing Dependencies

```bash
# Install all dependencies
uv sync

# Install with development dependencies
uv sync --dev

# Install only production dependencies
uv sync --no-dev
```

### Updating Dependencies

```bash
# Update all dependencies to latest compatible versions
uv sync --upgrade

# Update a specific package
uv sync --upgrade-package "package-name"
```

## Docker Environment

### Current Approach

The project uses a **volume-based development** approach where:

- Dependencies are installed in the local environment using UV
- Docker containers provide only infrastructure services (PostgreSQL, Redis)
- The Python application runs locally with hot reload

### Adding Dependencies for Docker-based Development

If you want to run the entire application in Docker:

#### 1. Create a Dockerfile

```dockerfile
FROM python:3.10-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    postgresql-client \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Install UV
RUN pip install uv

# Copy dependency files
COPY pyproject.toml ./
COPY README.md ./

# Install dependencies
RUN uv sync --no-dev

# Copy application code
COPY . .

# Expose port
EXPOSE 8050

# Run application
CMD ["uv", "run", "python", "main.py"]
```

#### 2. Add Application Service to docker-compose.yml

```yaml
services:
  app:
    build: .
    container_name: dashboard_app
    ports:
      - "8050:8050"
    volumes:
      - .:/app
      - uv_cache:/root/.cache/uv
    environment:
      - DATABASE_URL=postgresql://dashboard:dashboard123@postgres:5432/dashboard
      - REDIS_URL=redis://redis:6379
    depends_on:
      - postgres
      - redis
    networks:
      - dashboard-network
    restart: unless-stopped

volumes:
  uv_cache:
```

#### 3. Development Workflow with Docker

```bash
# Build and start all services
docker-compose up --build

# Add a new dependency:
# 1. Edit pyproject.toml
# 2. Rebuild the container
docker-compose build app
docker-compose up -d app

# Tip: mount the local UV cache (as in the compose file above) for faster rebuilds
```

## Hot Reload Development

### Method 1: Local Development (Recommended)

Run services in Docker and the application locally with hot reload:

```bash
# Start infrastructure
python scripts/dev.py start

# Run app with hot reload
uv run python scripts/dev.py dev-server
```

### Method 2: Docker with Volume Mounts

If using Docker for the app, mount the source code:

```yaml
volumes:
  - .:/app              # Mount source code
  - /app/__pycache__    # Exclude cache
```

## Best Practices

### 1. Version Pinning

```toml
# Good: Specify a minimum version with an upper compatibility bound
"requests>=2.31.0,<3.0.0"

# Acceptable: Minimum version constraint
"pandas>=2.1.0"

# Avoid: Exact pinning (except for critical deps)
"somepackage==1.2.3"  # Only if necessary
```

### 2. Dependency Categories

```toml
[project]
dependencies = [
    # Core web framework
    "dash>=2.14.0",

    # Database
    "sqlalchemy>=2.0.0",
    "psycopg2-binary>=2.9.0",

    # ... group related dependencies
]
```

### 3. Security Updates

```bash
# Check for security vulnerabilities
pip-audit

# Update a specific vulnerable package
uv sync --upgrade-package "vulnerable-package"
```

## Troubleshooting

### Common Issues

1. **Dependency Conflicts**
   ```bash
   # Clear UV cache and reinstall
   uv cache clean
   uv sync --refresh
   ```

2. **PostgreSQL Connection Issues**
   ```bash
   # Ensure psycopg2-binary is installed
   uv add "psycopg2-binary>=2.9.0"
   ```

3. **Docker Build Failures**
   ```bash
   # Clean the Docker build cache
   docker system prune --volumes
   docker-compose build --no-cache
   ```

### Debugging Dependencies

```bash
# Show installed packages
uv pip list

# Show details for a specific package (including its dependencies)
uv pip show <package-name>

# Check for conflicts
uv pip check
```

## Migration from requirements.txt

If you have an existing `requirements.txt`:

```bash
# Convert to pyproject.toml
uv add -r requirements.txt

# Or manually copy dependencies to pyproject.toml
# Then remove requirements.txt
```
# Development Environment Setup

This guide will help you set up the Crypto Trading Bot Dashboard development environment.

## Prerequisites

- Python 3.10+
- Docker Desktop (for Windows/Mac) or Docker Engine (for Linux)
- UV package manager
- Git

## Quick Start

### 1. Initial Setup

```bash
# Install dependencies (including dev tools)
uv sync --dev

# Set up environment and start services
python scripts/dev.py setup
```

### 2. Start Services

```bash
# Start PostgreSQL and Redis services
python scripts/dev.py start
```

### 3. Configure API Keys

Copy `env.template` to `.env` and update the OKX API credentials:

```bash
# Copy template (Windows)
copy env.template .env

# Copy template (Unix)
cp env.template .env

# Edit the .env file with your actual OKX API credentials:
# OKX_API_KEY=your_actual_api_key
# OKX_SECRET_KEY=your_actual_secret_key
# OKX_PASSPHRASE=your_actual_passphrase
```

### 4. Verify Setup

```bash
# Run setup verification tests
uv run python tests/test_setup.py
```

### 5. Start Dashboard with Hot Reload

```bash
# Start with hot reload (recommended for development)
python scripts/dev.py dev-server

# Or start without hot reload
python scripts/dev.py run
```

## Development Commands

### Using the dev.py script:

```bash
# Show available commands
python scripts/dev.py

# Set up environment and install dependencies
python scripts/dev.py setup

# Start all services (Docker)
python scripts/dev.py start

# Stop all services
python scripts/dev.py stop

# Restart services
python scripts/dev.py restart

# Check service status
python scripts/dev.py status

# Install/update dependencies
python scripts/dev.py install

# Run development server with hot reload
python scripts/dev.py dev-server

# Run application without hot reload
python scripts/dev.py run
```

### Direct Docker commands:

```bash
# Start services in background
docker-compose up -d

# View service logs
docker-compose logs -f

# Stop services
docker-compose down

# Rebuild and restart
docker-compose up -d --build
```

## Hot Reload Development

The development server includes hot reload functionality that automatically restarts the application when Python files change.

### Features:

- 🔥 **Auto-restart** on file changes
- 👀 **Watches multiple directories** (config, database, components, etc.)
- 🚀 **Fast restart** with debouncing (1-second delay)
- 🛑 **Graceful shutdown** with Ctrl+C

### Usage:

```bash
# Start hot reload server
python scripts/dev.py dev-server

# The server will watch these directories:
# - . (root)
# - config/
# - database/
# - components/
# - data/
# - strategies/
# - trader/
```

## Data Persistence

### Database Persistence

✅ **PostgreSQL data persists** across container restarts

- Volume: `postgres_data` mounted to `/var/lib/postgresql/data`
- Data survives `docker-compose down` and `docker-compose up`

### Redis Persistence

✅ **Redis data persists** with AOF (Append-Only File)

- Volume: `redis_data` mounted to `/data`
- AOF sync every second for durability
- Data survives container restarts

### Removing Persistent Data

```bash
# Stop services and remove volumes (CAUTION: this deletes all data)
docker-compose down -v

# Or remove specific volumes
docker volume rm dashboard_postgres_data
docker volume rm dashboard_redis_data
```

## Dependency Management

### Adding New Dependencies

```bash
# Add runtime dependency
uv add "package-name>=1.0.0"

# Add development dependency
uv add --dev "dev-package>=1.0.0"

# Install all dependencies
uv sync --dev
```

### Key Dependencies Included:

- **Web Framework**: Dash, Plotly
- **Database**: SQLAlchemy, psycopg2-binary, Alembic
- **Data Processing**: pandas, numpy
- **Configuration**: pydantic, python-dotenv
- **Development**: watchdog (hot reload), pytest, black, mypy

See `docs/dependency-management.md` for a detailed dependency management guide.

## Directory Structure

```
Dashboard/
├── config/                  # Configuration files
│   ├── settings.py          # Application settings
│   └── bot_configs/         # Bot configuration files
├── database/                # Database related files
│   └── init/                # Database initialization scripts
├── scripts/                 # Development scripts
│   ├── dev.py               # Main development script
│   ├── setup.sh             # Setup script (Unix)
│   ├── start.sh             # Start script (Unix)
│   └── stop.sh              # Stop script (Unix)
├── tests/                   # Test files
│   └── test_setup.py        # Setup verification tests
├── docs/                    # Documentation
│   ├── setup.md             # This file
│   └── dependency-management.md  # Dependency guide
├── docker-compose.yml       # Docker services configuration
├── env.template             # Environment variables template
├── pyproject.toml           # Dependencies and project config
└── main.py                  # Main application entry point
```

## Services

### PostgreSQL Database

- **Host**: localhost:5432
- **Database**: dashboard
- **User**: dashboard
- **Password**: dashboard123 (development only)
- **Persistence**: ✅ Data persists across restarts

### Redis Cache

- **Host**: localhost:6379
- **No password** (development only)
- **Persistence**: ✅ AOF enabled, data persists across restarts

## Environment Variables

Key environment variables (see `env.template` for the full list):

- `DATABASE_URL` - PostgreSQL connection string
- `OKX_API_KEY` - OKX API key
- `OKX_SECRET_KEY` - OKX secret key
- `OKX_PASSPHRASE` - OKX passphrase
- `OKX_SANDBOX` - Use OKX sandbox (true/false)
- `DEBUG` - Enable debug mode
- `LOG_LEVEL` - Logging level (DEBUG, INFO, WARNING, ERROR)

## Troubleshooting

### Docker Issues

1. **Docker not running**: Start Docker Desktop/Engine
2. **Port conflicts**: Check if ports 5432 or 6379 are already in use
3. **Permission issues**: On Linux, add your user to the docker group
4. **Data persistence issues**: Check if volumes are properly mounted

### Database Connection Issues

1. **Connection refused**: Ensure the PostgreSQL container is running
2. **Authentication failed**: Check credentials in the `.env` file
3. **Database doesn't exist**: Run the setup script again
4. **Data loss**: Check if the volume is mounted correctly

### Dependency Issues

1. **Import errors**: Run `uv sync --dev` to install dependencies
2. **Version conflicts**: Check `pyproject.toml` for compatibility
3. **Hot reload not working**: Ensure `watchdog` is installed

### Hot Reload Issues

1. **Changes not detected**: Check if the files are in watched directories
2. **Rapid restarts**: Built-in 1-second debouncing should prevent this
3. **Process not stopping**: Use Ctrl+C to shut down gracefully

## Performance Tips

1. **Use an SSD**: Store Docker volumes on an SSD for better database performance
2. **Increase Docker memory**: Allocate more RAM to Docker Desktop
3. **Hot reload**: Use `dev-server` for faster development cycles
4. **Dependency caching**: UV caches dependencies for faster installs

## Next Steps

After successful setup:

1. **Phase 1.0**: Database Infrastructure Setup
2. **Phase 2.0**: Bot Management System Development
3. **Phase 3.0**: OKX Integration and Data Pipeline
4. **Phase 4.0**: Dashboard UI and Visualization
5. **Phase 5.0**: Backtesting System Implementation

See `tasks/tasks-prd-crypto-bot-dashboard.md` for the detailed task list.
# Simplified Crypto Trading Bot Platform: Product Requirements Document

## Executive Summary

This simplified PRD addresses the need for a rapid-deployment crypto trading bot platform designed for internal testing and strategy development. The platform eliminates microservices complexity in favor of a monolithic architecture that can be functional within 1-2 weeks while supporting approximately 10 concurrent bots. The system focuses on core functionality including data collection, strategy execution, backtesting, and visualization without requiring advanced monitoring or orchestration tools.

## System Architecture Overview

The platform follows a streamlined monolithic design that consolidates all components within a single application boundary. This approach enables rapid development while maintaining clear separation between functional modules for future scalability. The architecture consists of six core components working together: a Data Collection Module for exchange connectivity, a Strategy Engine for unified signal generation, a Bot Manager for concurrent bot orchestration, a PostgreSQL database for data persistence, a Backtesting Engine for historical simulation, and a Dashboard for visualization and control.

## Simplified Technical Stack

### Core Technologies

The platform utilizes a Python-based technology stack optimized for rapid development. The backend employs Python 3.10+ with FastAPI or Flask for API services, PostgreSQL 14+ with the TimescaleDB extension for time-series optimization, and Redis for real-time pub/sub messaging. The frontend leverages Dash with Plotly for interactive visualization and bot control interfaces.

### Database Design

The database schema emphasizes simplicity while supporting essential trading operations. Core tables include raw_market_data for exchange data storage, candles for OHLCV aggregation, strategies for algorithm definitions, bots for instance management, signals for trading decisions, trades for execution records, and bot_portfolio for performance tracking.
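A few of the tables above can be sketched as SQLAlchemy models. The column names below are illustrative assumptions, not a finalized schema:

```python
from sqlalchemy import Column, DateTime, Float, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Strategy(Base):
    """Algorithm definitions."""
    __tablename__ = "strategies"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)

class Bot(Base):
    """Bot instance management."""
    __tablename__ = "bots"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    strategy_id = Column(Integer, ForeignKey("strategies.id"))
    status = Column(String, default="stopped")  # running / stopped

class Candle(Base):
    """OHLCV aggregation, indexed for time-series queries."""
    __tablename__ = "candles"
    id = Column(Integer, primary_key=True)
    symbol = Column(String, nullable=False, index=True)
    timestamp = Column(DateTime, nullable=False, index=True)  # right-aligned close time
    open = Column(Float)
    high = Column(Float)
    low = Column(Float)
    close = Column(Float)
    volume = Column(Float)

# The same metadata works against PostgreSQL; in-memory SQLite is used here
# only as a stand-in so the sketch is self-contained.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```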
## Development Methodology

### Two-Week Implementation Timeline

The development follows a structured three-phase approach designed for rapid deployment. Phase 1 (Days 1-5) establishes foundational components including database setup, data collection implementation, and basic visualization. Phase 2 (Days 6-10) completes core functionality with backtesting engine development, trading logic implementation, and dashboard enhancement. Phase 3 (Days 11-14) focuses on system refinement, comprehensive testing, and deployment preparation.

### Strategy Implementation Example

The platform supports multiple trading strategies through a unified interface design. A simple moving average crossover strategy demonstrates the system's capability to generate buy and sell signals based on technical indicators. This example strategy shows how the system processes market data, calculates moving averages, generates trading signals, and tracks portfolio performance over time. The visualization includes price movements, moving average lines, signal markers, and portfolio value progression.
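The crossover idea can be illustrated with a short pandas sketch (a toy example, not the platform's actual strategy module; window sizes are arbitrary):

```python
import pandas as pd

def ma_crossover_signals(close: pd.Series, fast: int = 3, slow: int = 5) -> pd.Series:
    """Return +1 (buy) when the fast MA crosses above the slow MA,
    -1 (sell) on the opposite cross, and 0 otherwise."""
    fast_ma = close.rolling(fast).mean()
    slow_ma = close.rolling(slow).mean()
    above = (fast_ma > slow_ma).astype(int)
    cross = above.diff().fillna(0)   # +1 on an upward cross, -1 on a downward cross
    return cross.astype(int)

# Flat, then rising, then falling prices produce one buy and one sell
prices = pd.Series([10, 10, 10, 10, 10, 12, 14, 16, 14, 12, 10, 8])
signals = ma_crossover_signals(prices)
```

In the real system the resulting signals would be persisted to the signals table and forwarded to the bot manager.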
## Backtesting and Performance Analysis

### Strategy Validation Framework

The backtesting engine enables comprehensive strategy testing using historical market data. The system calculates key performance metrics including total returns, Sharpe ratios, maximum drawdown, and win/loss ratios to evaluate strategy effectiveness.
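Two of those metrics can be computed directly from an equity curve; this is a minimal sketch (risk-free rate assumed to be zero, sample standard deviation used):

```python
import math

def max_drawdown(equity: list[float]) -> float:
    """Largest peak-to-trough decline, as a fraction of the peak."""
    peak, worst = equity[0], 0.0
    for value in equity:
        peak = max(peak, value)
        worst = max(worst, (peak - value) / peak)
    return worst

def sharpe_ratio(returns: list[float], periods_per_year: int = 365) -> float:
    """Annualized Sharpe ratio of per-period returns (risk-free rate assumed 0)."""
    mean = sum(returns) / len(returns)
    var = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
    return mean / math.sqrt(var) * math.sqrt(periods_per_year)

equity = [100, 110, 105, 120, 90, 100]
returns = [equity[i + 1] / equity[i] - 1 for i in range(len(equity) - 1)]
dd = max_drawdown(equity)   # the 120 -> 90 drop is a 25% drawdown
```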
### Portfolio Management

The platform tracks portfolio allocation and performance throughout strategy execution. Real-time monitoring capabilities show the distribution between cryptocurrency holdings and cash reserves.

## Simplified Data Flow

### Real-Time Processing

The data collection module connects to exchange APIs to retrieve market information including order books, trades, and candlestick data. Raw data is stored in PostgreSQL while processed information is published through Redis channels for real-time distribution to active trading bots.
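The publish step above might look roughly like the following sketch. The message format and channel naming are assumptions for illustration, not a defined protocol:

```python
import json

def make_candle_message(symbol: str, candle: dict) -> str:
    """Serialize a processed candle for publication on a Redis channel.
    The field layout here is illustrative only."""
    return json.dumps({
        "symbol": symbol,
        "ts": candle["ts"],
        "ohlcv": [candle[k] for k in ("open", "high", "low", "close", "volume")],
    })

message = make_candle_message("BTC-USDT", {
    "ts": "2025-01-01T09:05:00+00:00",
    "open": 42000.0, "high": 42100.0, "low": 41900.0, "close": 42050.0, "volume": 12.5,
})

# Publishing with redis-py would then look roughly like:
#   import redis
#   r = redis.Redis.from_url("redis://localhost:6379")
#   r.publish("candles:BTC-USDT", message)
```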
### Signal Generation and Execution

Strategies subscribe to relevant data streams and generate trading signals based on configured algorithms. The bot manager validates signals against portfolio constraints and executes simulated or live trades according to bot configurations.

## Future Scalability Considerations

### Microservices Migration Path

While implementing a monolithic architecture for rapid deployment, the system design maintains clear component boundaries that facilitate future extraction into microservices. API-first design principles ensure internal components communicate through well-defined interfaces that can be externalized as needed.

### Authentication and Multi-User Support

The current single-user design can be extended to support multiple users through role-based access control implementation. The database schema accommodates user management tables and permission structures without requiring significant architectural changes.

### Advanced Monitoring Integration

The simplified monitoring approach can be enhanced with Prometheus and Grafana integration when scaling requirements justify the additional complexity. Current basic monitoring provides foundation metrics that can be extended to comprehensive observability systems.

## Technical Implementation Details

### Time Series Data Management

The platform implements proper time aggregation aligned with exchange standards to ensure accurate candle formation. Timestamp alignment follows a right-aligned methodology where the 5-minute candle covering 09:00:00-09:05:00 receives the 09:05:00 timestamp.
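The right-aligned convention can be sketched as a small helper that maps any timestamp to the close time of its candle (a half-open [start, end) bucket is assumed):

```python
from datetime import datetime, timedelta, timezone

def align_right(ts: datetime, interval_minutes: int = 5) -> datetime:
    """Right-aligned candle timestamp: a trade at 09:02:30 belongs to the
    5-minute candle stamped 09:05:00 (covering 09:00:00-09:05:00)."""
    interval = timedelta(minutes=interval_minutes)
    epoch = datetime(1970, 1, 1, tzinfo=ts.tzinfo)
    # Floor the timestamp to the start of its interval, then stamp with the close
    floor = epoch + ((ts - epoch) // interval) * interval
    return floor + interval

stamp = align_right(datetime(2025, 1, 1, 9, 2, 30, tzinfo=timezone.utc))
```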
### Performance Optimization

Database indexing on timestamp and symbol fields ensures efficient time-series queries. Connection pooling prevents database connection leaks while prepared statements optimize query execution. Memory management includes proper cleanup of data objects after processing to maintain system stability.
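With SQLAlchemy, the pooling described above is configured on the engine. The pool sizes below are placeholder values, and in-memory SQLite stands in for the production PostgreSQL URL so the sketch is self-contained:

```python
from sqlalchemy import create_engine, text
from sqlalchemy.pool import QueuePool

engine = create_engine(
    "sqlite://",                # swap for the PostgreSQL DATABASE_URL in production
    poolclass=QueuePool,        # the default pool for PostgreSQL, shown explicitly
    pool_size=5,                # persistent connections kept open (placeholder)
    max_overflow=10,            # extra connections allowed under load (placeholder)
    pool_pre_ping=True,         # detect and replace stale connections
)

# Connections are checked out of, and returned to, the pool automatically
with engine.connect() as conn:
    value = conn.execute(text("SELECT 1")).scalar()
```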
## Success Metrics and Validation

### Development Milestones

Platform success is measured through specific deliverables including core functionality completion within 14 days, system stability maintenance at 99% uptime during internal testing, successful backtesting of at least 3 different strategies, and concurrent operation of 2+ bots for 72+ hours.

### Strategy Testing Capabilities

The system enables comprehensive strategy validation through historical simulation, real-time testing with virtual portfolios, and performance comparison across multiple algorithms. Backtesting results provide insights into strategy effectiveness before live deployment.

## Conclusion

This simplified crypto trading bot platform balances rapid development requirements with future scalability needs. The monolithic architecture enables deployment within 1-2 weeks while maintaining architectural flexibility for future enhancements. Clear component separation, comprehensive database design, and strategic technology choices create a foundation that supports both immediate testing objectives and long-term platform evolution.

The platform's focus on essential functionality without unnecessary complexity ensures teams can begin strategy testing quickly while building toward more sophisticated implementations as requirements expand. This approach maximizes development velocity while preserving options for future architectural evolution and feature enhancement.
# Product Requirements Document: Crypto Trading Bot Dashboard

## Introduction/Overview

Create a simple control dashboard for managing and monitoring multiple cryptocurrency trading bots simultaneously. The system will allow testing different strategies in parallel using real OKX market data and virtual trading simulation. The focus is on rapid implementation to enable strategy testing within days rather than weeks.

**Core Problem**: Currently, testing multiple trading strategies requires manual coordination and lacks real-time monitoring capabilities. There's no unified way to compare strategy performance or manage multiple bots running simultaneously.

## Goals

1. **Enable Parallel Strategy Testing**: Run up to 5 different trading bots simultaneously with different strategies
2. **Real-time Monitoring**: Visualize bot performance, trading decisions, and market data in real-time
3. **Quick Strategy Validation**: Reduce the time from strategy implementation to performance assessment
4. **Historical Analysis**: Enable backtesting with previously collected market data
5. **Operational Control**: Simple start/stop functionality for individual bots

## User Stories

1. **As a strategy developer**, I want to start multiple bots with different strategies so that I can compare their performance over the same time period.

2. **As a trader**, I want to see real-time price charts and bot decisions so that I can understand how my strategies are performing.

3. **As an analyst**, I want to view historical performance metrics so that I can evaluate strategy effectiveness over different market conditions.

4. **As a system operator**, I want to stop underperforming bots so that I can prevent further virtual losses.

5. **As a researcher**, I want to run backtests on historical data so that I can validate strategies before live testing.

## Functional Requirements

### Core Bot Management

1. **Bot Lifecycle Control**: System must allow starting and stopping individual bots via web interface
2. **Multi-Bot Support**: System must support running up to 5 bots simultaneously
3. **Bot Configuration**: System must read bot configurations from JSON/YAML files in a configs directory
4. **Status Monitoring**: System must display current status (running/stopped) for each configured bot

### Data Management

5. **Market Data Integration**: System must connect to the existing OKX data feed and display real-time price information
6. **Trading Decision Storage**: System must record all bot trading decisions (buy/sell signals, amounts, timestamps) to the database
7. **Performance Tracking**: System must calculate and store key metrics (balance, profit/loss, number of trades) for each bot
8. **Data Persistence**: System must use PostgreSQL with separate tables for market data and bot decisions

### User Interface

9. **Dashboard Layout**: System must provide a single-page dashboard showing all bot information
10. **Price Visualization**: System must display candlestick charts with bot buy/sell markers overlaid
11. **Bot Switching**: System must allow switching the chart view between different active bots
12. **Performance Metrics**: System must show current balance, total profit/loss, trade count, and last trade time for each bot
13. **Real-time Updates**: System must refresh data at least every 2 seconds

### Backtesting

14. **Historical Mode**: System must allow running bots against historical market data
15. **Time Range Selection**: System must provide a date range picker for backtest periods
16. **Accelerated Testing**: System must support running backtests faster than real-time
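One way to satisfy requirement 16 is to replay historical candles with inter-candle gaps divided by a speed factor; this is a sketch of the idea, not the backtesting engine itself:

```python
import time
from datetime import datetime, timedelta

def replay(candles, speed: float = 60.0):
    """Yield historical candles, sleeping the real gap between timestamps
    divided by `speed` (speed=60 means one minute of history per second)."""
    prev = None
    for candle in candles:
        if prev is not None:
            gap = (candle["ts"] - prev).total_seconds() / speed
            time.sleep(max(gap, 0))
        prev = candle["ts"]
        yield candle

start = datetime(2025, 1, 1, 9, 0)
history = [{"ts": start + timedelta(minutes=5 * i), "close": 100 + i} for i in range(3)]
replayed = list(replay(history, speed=1e9))   # huge speed factor: effectively instant
```

A speed of 1.0 would reproduce real-time pacing, which is useful for rehearsing live behavior.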
## Non-Goals (Out of Scope)

- **Multi-exchange Support**: Only OKX integration for MVP
- **Real Money Trading**: Virtual trading simulation only
- **Advanced UI/UX**: Basic functional interface, not polished design
- **User Authentication**: Single-user system for MVP
- **Strategy Editor**: Strategy creation/editing via code only, not UI
- **Advanced Analytics**: Complex statistical analysis beyond basic P&L
- **Mobile Interface**: Desktop web interface only
- **High-frequency Trading**: Strategies with sub-second requirements not supported

## Technical Considerations

### Technology Stack

- **Backend**: Python with existing OKX, strategy, and trader modules
- **Frontend**: Plotly Dash for rapid development and Python integration
- **Database**: PostgreSQL with SQLAlchemy
- **Communication**: Direct database polling (no WebSockets for MVP)

### Architecture Simplicity

- **Monolithic Design**: Single Python application with all components
- **File-based Configuration**: JSON files for bot settings
- **Polling Updates**: 2-second refresh cycle acceptable for MVP
- **Process Management**: Simple threading or multiprocessing for bot execution
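The threading option above can be sketched as a start/stop wrapper per bot; the class and its polling loop are illustrative only:

```python
import threading
import time

class BotThread:
    """Minimal start/stop wrapper: each bot runs its polling loop in a thread
    and exits cooperatively when its stop event is set."""

    def __init__(self, bot_id: str, poll_interval: float = 0.01):
        self.bot_id = bot_id
        self.poll_interval = poll_interval
        self.ticks = 0
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self.ticks += 1                  # placeholder for "fetch signal, maybe trade"
            self._stop.wait(self.poll_interval)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()

bot = BotThread("ema_crossover_01")
bot.start()
time.sleep(0.05)
bot.stop()
```

Stopping one `BotThread` does not affect the others, which matches the acceptance criteria below.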
### Integration Requirements

- **Existing Module Refactoring**: May need to modify current strategy and trader modules for unified signal processing
- **Database Schema**: Design a simple schema for bot decisions and performance metrics
- **Error Handling**: Basic error logging and bot restart capabilities

## Success Metrics

1. **Functionality**: Successfully run 3+ bots simultaneously for 24+ hours without crashes
2. **Data Completeness**: Capture 100% of trading decisions and market data during bot operation
3. **Performance Visibility**: Display real-time bot performance with <5 second update latency
4. **Backtesting Capability**: Run historical tests covering 1+ weeks of data in <10 minutes
5. **Operational Control**: Start/stop bots with <10 second response time

## Design Considerations

### Dashboard Layout

```
+------------------+-------------------+
|   Bot Controls   |   Active Charts   |
| [Bot1] [Start]   |                   |
| [Bot2] [Stop]    |  Price/Strategy   |
| [Bot3] [Start]   |      Chart        |
+------------------+-------------------+
|         Performance Metrics          |
| Bot1: +$50  Bot2: -$20  Bot3: +$35   |
+--------------------------------------+
```

### Configuration Example

```json
{
  "bot_id": "ema_crossover_01",
  "strategy": "EMA_Crossover",
  "parameters": {
    "fast_period": 12,
    "slow_period": 26,
    "symbol": "BTC-USDT"
  },
  "virtual_balance": 10000
}
```
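A configuration like the one above can be loaded and minimally validated before a bot starts; the required keys and checks below are a sketch, not a finalized schema:

```python
import json

REQUIRED_KEYS = {"bot_id", "strategy", "parameters", "virtual_balance"}

def load_bot_config(raw: str) -> dict:
    """Parse and minimally validate a bot configuration document."""
    config = json.loads(raw)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise ValueError(f"config missing keys: {sorted(missing)}")
    if config["virtual_balance"] <= 0:
        raise ValueError("virtual_balance must be positive")
    return config

config = load_bot_config("""
{
  "bot_id": "ema_crossover_01",
  "strategy": "EMA_Crossover",
  "parameters": {"fast_period": 12, "slow_period": 26, "symbol": "BTC-USDT"},
  "virtual_balance": 10000
}
""")
```

In practice each file under `config/bot_configs/` would be read from disk and validated this way on startup.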
|
||||
|
||||
## Implementation Phases
|
||||
|
||||
### Phase 1 (Week 1-2): Core Infrastructure
|
||||
- Set up basic Dash application
|
||||
- Integrate existing OKX data feed
|
||||
- Create bot manager for start/stop functionality
|
||||
- Basic database schema for trading decisions
|
||||
|
||||
### Phase 2 (Week 3): Visualization
|
||||
- Implement price charts with Plotly
|
||||
- Add bot decision overlays
|
||||
- Create performance metrics dashboard
|
||||
- Bot switching functionality
|
||||
|
||||
### Phase 3 (Week 4): Testing & Refinement
|
||||
- Add backtesting capability
|
||||
- Implement error handling and logging
|
||||
- Performance optimization
|
||||
- User acceptance testing

## Open Questions

1. **Strategy Module Integration**: What modifications are needed to the current strategy modules for unified signal processing?
2. **Database Migration**: Start with PostgreSQL.
3. **Bot Resource Management**: How should we handle memory/CPU limits for individual bots?
4. **Configuration Management**: Bots should support hot reloading; by saving which bots are active or paused, the system can restore its last state on startup.
5. **Error Recovery**: If a bot that is marked active crashes, the system should restart it automatically.
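
Questions 4 and 5 share one mechanism: persist each bot's desired state, then have a supervisor pass restart anything that should be running but isn't. A minimal sketch, assuming desired state is kept as a `bot_id → "active" | "paused"` mapping (`supervise` is a hypothetical helper, not existing code):

```python
def supervise(desired: dict, running: set) -> list:
    """Return the bot ids that should be (re)started.

    desired maps bot_id -> "active" | "paused"; running holds the ids of
    bots that currently have a live worker. Covers both startup recovery
    (running is empty) and crash recovery (one id has dropped out).
    """
    return [bot_id for bot_id, state in desired.items()
            if state == "active" and bot_id not in running]
```

The same function serves both cases: on system start `running` is empty, so every persisted active bot is restored; during operation, a crashed bot simply disappears from `running` and gets restarted on the next pass.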

## Acceptance Criteria

- [ ] Start 5 different bots simultaneously via web interface
- [ ] View real-time price charts with buy/sell decision markers
- [ ] Switch between different bot views in the dashboard
- [ ] See current virtual balance and P&L for each bot
- [ ] Stop/start individual bots without affecting others
- [ ] Run backtest on 1 week of historical data
- [ ] System operates continuously for 48+ hours without manual intervention
- [ ] All trading decisions logged to database with timestamps
## Relevant Files

- `app.py` - Main Dash application entry point and layout definition
- `bot_manager.py` - Core bot lifecycle management and orchestration
- `database/models.py` - SQLAlchemy models for bots, trades, and market data
- `database/connection.py` - Database connection and session management
- `data/okx_integration.py` - OKX API connection and real-time data feed
- `strategies/` - Directory containing strategy modules (existing, may need refactoring)
- `trader/` - Directory containing virtual trading logic (existing, may need refactoring)
- `components/dashboard.py` - Dash dashboard components and layout
- `components/charts.py` - Plotly chart components for price and performance visualization
- `backtesting/engine.py` - Backtesting execution engine
- `config/bot_configs/` - Directory for JSON bot configuration files
- `utils/logging.py` - Logging configuration and utilities
- `requirements.txt` - Python dependencies using UV package manager

### Notes

- Use Docker for development and the database
- Use UV for package management as specified in project requirements
- PostgreSQL with SQLAlchemy for database persistence
- Plotly Dash for rapid UI development
- Bot configurations stored as JSON files in the config directory
- System should support hot-reloading of bot states

## Tasks

- [ ] 0.0 Dev environment and Docker setup
  - [ ] 0.1 Create Docker Compose file with PostgreSQL service
  - [ ] 0.2 Set up UV package management with pyproject.toml dependencies
  - [ ] 0.3 Create .env file template for database and API configuration
  - [ ] 0.4 Add development scripts for starting/stopping services
  - [ ] 0.5 Test database connection and basic container orchestration

- [ ] 1.0 Database Infrastructure Setup
  - [ ] 1.1 Design PostgreSQL schema for bots, trades, market_data, and bot_states tables
  - [ ] 1.2 Create SQLAlchemy models in `database/models.py` for all entities
  - [ ] 1.3 Implement database connection management in `database/connection.py`
  - [ ] 1.4 Create Alembic migration scripts for initial schema
  - [ ] 1.5 Add database utility functions for common queries
  - [ ] 1.6 Implement bot state persistence for hot-reloading capability
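
One possible shape for the schema in task 1.1, sketched with the stdlib `sqlite3` module for portability; the real system would target PostgreSQL through SQLAlchemy models instead, and the table/column names here are assumptions, not a finalized design.

```python
import sqlite3

SCHEMA = """
CREATE TABLE bots (
    id INTEGER PRIMARY KEY,
    bot_id TEXT UNIQUE NOT NULL,
    strategy TEXT NOT NULL,
    state TEXT NOT NULL DEFAULT 'paused'   -- persisted for hot-reload (1.6)
);
CREATE TABLE trades (
    id INTEGER PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    side TEXT NOT NULL,                    -- 'buy' or 'sell'
    price REAL NOT NULL,
    quantity REAL NOT NULL,
    created_at TEXT NOT NULL               -- decision timestamp
);
CREATE TABLE market_data (
    id INTEGER PRIMARY KEY,
    symbol TEXT NOT NULL,
    ts TEXT NOT NULL,
    open REAL, high REAL, low REAL, close REAL, volume REAL
);
"""

def init_db(path: str = ":memory:") -> sqlite3.Connection:
    """Create a fresh database with the sketched schema."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Keeping `state` on the `bots` row is what makes the restart-recovery behavior in the Open Questions section possible: the manager reads it back on boot.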

- [ ] 2.0 Bot Management System Development
  - [ ] 2.1 Create `bot_manager.py` with BotManager class for lifecycle control
  - [ ] 2.2 Implement bot configuration loading from JSON files in `config/bot_configs/`
  - [ ] 2.3 Add start/stop functionality for individual bots using threading/multiprocessing
  - [ ] 2.4 Create bot status tracking and monitoring system
  - [ ] 2.5 Implement error handling and automatic bot restart on crash
  - [ ] 2.6 Add bot state persistence to database for system restart recovery
  - [ ] 2.7 Create unified signal processing interface for strategy integration
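
For task 2.3, the threading variant could look like the sketch below: each bot gets a daemon thread and is stopped cooperatively via an `Event`. This is an assumption about how the `BotManager` skeleton might be fleshed out, and the work loop body is a placeholder.

```python
import threading

class ThreadedBotManager:
    def __init__(self):
        self.active_bots = {}  # bot_id -> (thread, stop_event)

    def start_bot(self, bot_id: int):
        """Start a bot instance in its own daemon thread."""
        stop = threading.Event()
        thread = threading.Thread(target=self._run, args=(bot_id, stop),
                                  daemon=True)
        self.active_bots[bot_id] = (thread, stop)
        thread.start()

    def stop_bot(self, bot_id: int):
        """Ask the bot's loop to exit, then wait for the thread to finish."""
        thread, stop = self.active_bots.pop(bot_id)
        stop.set()
        thread.join()

    def _run(self, bot_id: int, stop: threading.Event):
        # Placeholder work loop: wake every 50 ms until asked to stop.
        while not stop.wait(timeout=0.05):
            pass  # poll signals, update state, place virtual trades, etc.
```

Cooperative shutdown via an `Event` (rather than killing threads) is what lets one bot stop cleanly without affecting the others, as the acceptance criteria require.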

- [ ] 3.0 OKX Integration and Data Pipeline
  - [ ] 3.1 Create `data/okx_integration.py` with OKX API client
  - [ ] 3.2 Implement real-time WebSocket connection for market data
  - [ ] 3.3 Add market data normalization and validation
  - [ ] 3.4 Create data storage pipeline to PostgreSQL with proper indexing
  - [ ] 3.5 Implement data feed monitoring and reconnection logic
  - [ ] 3.6 Add historical data retrieval for backtesting
  - [ ] 3.7 Test data pipeline with multiple cryptocurrency pairs
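
For the reconnection logic in task 3.5, a common policy is exponential backoff with jitter, capped at a maximum delay. The WebSocket client itself is out of scope here; this sketch only decides how long to sleep before reconnect attempt `n` (`backoff_delay` is an illustrative helper, not an existing function).

```python
import random

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0,
                  rng=None) -> float:
    """Seconds to wait before reconnect attempt `attempt` (0-based).

    "Full jitter" variant: uniform in [0, min(cap, base * 2**attempt)],
    so simultaneous clients don't reconnect in lockstep.
    """
    rng = rng or random.Random()
    upper = min(cap, base * (2 ** attempt))
    return rng.uniform(0, upper)
```

The cap keeps a long outage from producing multi-minute waits, and the jitter avoids a thundering herd against the exchange when the feed comes back.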

- [ ] 4.0 Dashboard UI and Visualization
  - [ ] 4.1 Set up basic Dash application structure in `app.py`
  - [ ] 4.2 Create dashboard layout with bot controls and chart areas
  - [ ] 4.3 Implement bot control panel in `components/dashboard.py`
  - [ ] 4.4 Build candlestick charts with buy/sell markers in `components/charts.py`
  - [ ] 4.5 Add performance metrics display for each bot (balance, P&L, trade count, trade time)
  - [ ] 4.6 Implement bot switching functionality for chart views
  - [ ] 4.7 Add real-time data updates with 2-second refresh cycle
  - [ ] 4.8 Create responsive layout that works on different screen sizes

- [ ] 5.0 Backtesting System Implementation
  - [ ] 5.1 Create `backtesting/engine.py` with backtesting framework
  - [ ] 5.2 Implement historical data loading and time range selection
  - [ ] 5.3 Add accelerated testing capability (faster than real-time)
  - [ ] 5.4 Create backtesting results storage and comparison system
  - [ ] 5.5 Integrate backtesting with dashboard for result visualization
  - [ ] 5.6 Add date range picker component for backtest periods
  - [ ] 5.7 Test backtesting with sample strategies on 1+ week datasets
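
The core of the engine in task 5.1 is just a replay loop: feed historical closes through a strategy callback faster than real time and track the virtual balance. A bare-bones sketch under simplifying assumptions (all-in/all-out sizing, no fees or slippage; `run_backtest` is an illustrative name):

```python
def run_backtest(closes, strategy, starting_balance=10000.0):
    """closes: list of floats; strategy: fn(history) -> 'buy'|'sell'|'hold'."""
    balance, position = starting_balance, 0.0
    for i, price in enumerate(closes):
        signal = strategy(closes[: i + 1])  # strategy sees history up to now
        if signal == "buy" and position == 0.0:
            position, balance = balance / price, 0.0
        elif signal == "sell" and position > 0.0:
            balance, position = position * price, 0.0
    if position > 0.0:                      # mark any open position to market
        balance = position * closes[-1]
    return balance
```

Because the loop never sleeps, a week of candles replays in milliseconds, which is the "accelerated testing" of task 5.3; passing only the prefix of history to the strategy prevents lookahead bias.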