Vasily.onl 2025-05-30 16:17:50 +08:00
parent 416f9c565c
commit 06df67c756
2 changed files with 187 additions and 350 deletions


@ -12,7 +12,7 @@ This PRD outlines the development of a simplified crypto trading bot platform th
## Current Requirements & Constraints
- **Speed to Deployment**: System must be functional within 1-2 weeks
- **Scale**: Support for 5-10 concurrent trading bots
- **Architecture**: Monolithic application instead of microservices
- **User Access**: Internal use only initially (no multi-user authentication)
- **Infrastructure**: Simplified deployment without Kubernetes/Docker Swarm
@ -24,24 +24,106 @@ This PRD outlines the development of a simplified crypto trading bot platform th
The platform will follow a monolithic architecture pattern to enable rapid development while providing clear separation between components:
### Data Flow Architecture
```
OKX Exchange API (WebSocket)
            ↓
Data Collector → OHLCV Aggregator → PostgreSQL (market_data)
            ↓                  ↓
[Optional] Raw Trade Storage   Redis Pub/Sub → Strategy Engine (JSON configs)
            ↓                  ↓
Files/Database (raw_trades)    Signal Generation → Bot Manager
                                     ↓
                     PostgreSQL (signals, trades, bot_performance)

Dashboard (REST API) ← PostgreSQL (historical data)
Real-time Updates    ← Redis Channels
```
**Data Processing Priority**:
1. **Real-time**: Raw data → OHLCV candles → Redis → Bots (primary flow)
2. **Historical**: OHLCV data from PostgreSQL for backtesting and charts
3. **Advanced Analysis**: Raw trade data (if stored) for detailed backtesting
### Redis Channel Design
```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}" # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}" # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}" # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events" # Global notifications
```
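For illustration, a minimal redis-py sketch of this channel layout (the payload fields and the hand-off at the end are assumptions, not a fixed contract):
```python
import json
import redis  # redis-py

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def publish_candle(symbol: str, candle: dict) -> None:
    """Publish a finished OHLCV candle on the per-symbol market channel."""
    r.publish(f"market:{symbol}", json.dumps(candle))

def listen(bot_id: str, symbol: str) -> None:
    """Subscribe a bot to its market data and lifecycle channels."""
    pubsub = r.pubsub()
    pubsub.subscribe(f"market:{symbol}", f"status:{bot_id}")
    for message in pubsub.listen():
        if message["type"] == "message":
            payload = json.loads(message["data"])
            print(message["channel"], payload)  # hand off to the strategy engine here
```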
### Configuration Strategy
**PostgreSQL for**: Market data, bot instances, trades, signals, performance metrics
**JSON files for**: Strategy parameters, bot configurations (rapid testing and parameter tuning)
```json
// config/strategies/ema_crossover.json
{
  "strategy_name": "EMA_Crossover",
  "parameters": {
    "fast_period": 12,
    "slow_period": 26,
    "risk_percentage": 0.02
  }
}

// config/bots/bot_001.json
{
  "bot_id": "bot_001",
  "strategy_file": "ema_crossover.json",
  "symbol": "BTC-USDT",
  "virtual_balance": 10000,
  "enabled": true
}
```
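A sketch of how a bot could resolve these two files at startup (the paths follow the examples above; inlining the strategy parameters into the bot config is an assumption):
```python
import json
from pathlib import Path

CONFIG_DIR = Path("config")

def load_bot_config(bot_file: str) -> dict:
    """Load a bot config and inline the strategy file it references."""
    bot = json.loads((CONFIG_DIR / "bots" / bot_file).read_text())
    strategy_path = CONFIG_DIR / "strategies" / bot["strategy_file"]
    bot["strategy"] = json.loads(strategy_path.read_text())
    return bot

config = load_bot_config("bot_001.json")
fast = config["strategy"]["parameters"]["fast_period"]  # 12
```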
### Error Handling Strategy
**Bot Crash Recovery**:
- Monitor bot processes every 30 seconds
- Auto-restart crashed bots if status = 'active'
- Log all crashes with stack traces
- Maximum 3 restart attempts per hour
**Exchange Connection Issues**:
- Retry with exponential backoff (1s, 2s, 4s, 8s, max 60s); see the sketch at the end of this section
- Switch to backup WebSocket connection if available
- Log connection quality metrics
**Database Errors**:
- Continue operation with in-memory cache for up to 5 minutes
- Queue operations for retry when connection restored
- Alert on prolonged database disconnection
**Application Restart Recovery**:
- Read bot states from database on startup
- Restore active bots to 'active' status
- Resume data collection for all monitored symbols
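A minimal sketch of the reconnect policy above (the `connect_ws` callable is a placeholder for the actual OKX WebSocket client):
```python
import time

def connect_with_backoff(connect_ws, base: float = 1.0, cap: float = 60.0):
    """Retry a connection with exponential backoff: 1s, 2s, 4s, 8s, ... capped at 60s."""
    delay = base
    while True:
        try:
            return connect_ws()  # placeholder for the real WebSocket connect call
        except ConnectionError as exc:
            print(f"connect failed ({exc}); retrying in {delay:.0f}s")
            time.sleep(delay)
            delay = min(delay * 2, cap)
```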
### Component Details and Functional Requirements
1. **Data Collection Module**
- Time Series Data Management (similar to market standard)
- Connect to exchange APIs (OKX initially) via WebSocket
- Aggregate real-time trades into OHLCV candles (1m, 5m, 15m, 1h, 4h, 1d)
- Store OHLCV data in PostgreSQL for bot operations and backtesting
- Send real-time candle updates through Redis
- Optional: Store raw trade data for advanced backtesting
**FR-001: Unified Data Provider Interface**
- Support multiple exchanges through standardized adapters
- Real-time data collection with WebSocket connections
- Real-time OHLCV aggregation with WebSocket connections
- Primary focus on candle data, raw data storage optional
- Data validation and error handling mechanisms
**FR-002: Market Data Processing**
- OHLCV aggregation with configurable timeframes (1m base, higher timeframes derived; see the resampling sketch below)
- Technical indicator calculation (SMA, EMA, RSI, MACD, Bollinger Bands) on OHLCV data
- Data normalization across different exchanges
- Time alignment following exchange standards (right-aligned candles)
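A sketch of deriving higher timeframes from the 1m base with pandas (column names are assumptions matching the OHLCV fields; `label="right"`/`closed="right"` gives the right-aligned candles mentioned above):
```python
import pandas as pd

def resample_ohlcv(df_1m: pd.DataFrame, timeframe: str = "5min") -> pd.DataFrame:
    """Aggregate 1m candles (indexed by a DatetimeIndex) into a higher timeframe."""
    return df_1m.resample(timeframe, label="right", closed="right").agg(
        {"open": "first", "high": "max", "low": "min",
         "close": "last", "volume": "sum"}
    )
```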
@ -106,43 +188,40 @@ The platform will follow a monolithic architecture pattern to enable rapid devel
- Store strategy parameters and bot configuration in JSON at the beginning, for simplicity of editing and testing
5. **Backtesting Engine**
- Run simulations on historical data using vectorized operations for speed
- Calculate performance metrics
- Compare multiple strategies
- Visualize backtest results
- Support multiple timeframes and strategy parameter testing
- Generate comparison reports between strategies
**FR-009: Historical Simulation**
- Strategy backtesting on historical market data
- Performance metric calculation (Sharpe ratio, drawdown, win rate, total return)
- Parameter optimization through grid search (limited combinations for speed) (in future)
- Side-by-side strategy comparison with statistical significance
**FR-010: Simulation Engine**
- Vectorized signal calculation using pandas operations
- Realistic fee modeling (0.1% per trade for OKX)
- Look-ahead bias prevention with proper timestamp handling
- Configurable test periods (1 day to 24 months)
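A sketch of FR-010's vectorized approach, using the EMA crossover as the example strategy (long/flat only; the 0.1% fee matches the OKX assumption above):
```python
import pandas as pd

FEE = 0.001  # 0.1% per trade (OKX fee assumption)

def backtest_ema_crossover(close: pd.Series, fast: int = 12, slow: int = 26) -> pd.Series:
    """Vectorized long/flat EMA-crossover backtest; returns the equity curve."""
    fast_ema = close.ewm(span=fast, adjust=False).mean()
    slow_ema = close.ewm(span=slow, adjust=False).mean()
    # Shift the signal by one bar so a candle cannot trade on itself (no look-ahead)
    position = (fast_ema > slow_ema).astype(int).shift(1).fillna(0)
    returns = close.pct_change().fillna(0) * position
    fees = position.diff().abs().fillna(0) * FEE  # fee charged on each entry/exit
    return (1 + returns - fees).cumprod()
```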
6. **Dashboard & Visualization**
- Display real-time market data and bot status
- Show portfolio value progression over time
- Visualize trade history with buy/sell markers on price charts
- Provide simple bot control interface (start/stop/configure)
**FR-011: Dashboard Interface**
- Real-time bot monitoring with status indicators
- Portfolio performance charts (total value, cash vs crypto allocation)
- Trade history table with P&L per trade
- Simple bot configuration forms for JSON parameter editing
**FR-012: Data Visualization**
- Interactive price charts with strategy signal overlays
- Portfolio value progression charts
- Performance comparison tables (multiple bots side-by-side)
- Fee tracking and total cost analysis
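A sketch of the trade-marker overlay using Plotly (bundled with Dash); `candles` and `trades` are assumed DataFrames shaped like the market_data and trades tables below:
```python
import plotly.graph_objects as go

def price_chart(candles, trades) -> go.Figure:
    """Candlestick chart with buy/sell markers overlaid."""
    fig = go.Figure(go.Candlestick(
        x=candles["timestamp"],
        open=candles["open"], high=candles["high"],
        low=candles["low"], close=candles["close"],
    ))
    for side, symbol in (("buy", "triangle-up"), ("sell", "triangle-down")):
        subset = trades[trades["side"] == side]
        fig.add_scatter(x=subset["timestamp"], y=subset["price"],
                        mode="markers", marker_symbol=symbol, name=side)
    return fig
```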
### Non-Functional Requirements
@ -183,116 +262,133 @@ The platform will follow a monolithic architecture pattern to enable rapid devel
### Database Schema
The database schema separates frequently-accessed OHLCV data from raw tick data to optimize performance and storage.
```sql
-- OHLCV Market Data (primary table for bot operations)
CREATE TABLE market_data (
    id SERIAL PRIMARY KEY,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL, -- 1m, 5m, 15m, 1h, 4h, 1d
    timestamp TIMESTAMPTZ NOT NULL,
    open DECIMAL(18,8) NOT NULL,
    high DECIMAL(18,8) NOT NULL,
    low DECIMAL(18,8) NOT NULL,
    close DECIMAL(18,8) NOT NULL,
    volume DECIMAL(18,8) NOT NULL,
    trades_count INTEGER, -- number of trades in this candle
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(exchange, symbol, timeframe, timestamp)
);
CREATE INDEX idx_market_data_lookup ON market_data(symbol, timeframe, timestamp);
-- Note: a partial index predicate cannot use NOW() (not immutable),
-- so recent-data queries rely on a plain descending index instead
CREATE INDEX idx_market_data_recent ON market_data(timestamp DESC);
-- Raw Trade Data (optional, for detailed backtesting only)
CREATE TABLE raw_trades (
    id BIGSERIAL,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL,
    type VARCHAR(10) NOT NULL, -- trade, order, balance, tick, books
    data JSONB NOT NULL, -- response from the exchange
    created_at TIMESTAMPTZ DEFAULT NOW(),
    PRIMARY KEY (id, timestamp) -- partition key must be part of the primary key
) PARTITION BY RANGE (timestamp);
CREATE INDEX idx_raw_trades_symbol_time ON raw_trades(symbol, timestamp);

-- Monthly partitions for raw data (if using raw data)
-- CREATE TABLE raw_trades_y2024m01 PARTITION OF raw_trades
--     FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');
-- Bot Management (simplified)
CREATE TABLE bots (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    strategy_name VARCHAR(50) NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,
    parameters JSONB NOT NULL,
    status VARCHAR(20) NOT NULL DEFAULT 'inactive', -- active, inactive, error
    config_file VARCHAR(200), -- path to JSON config
    virtual_balance DECIMAL(18,8) DEFAULT 10000,
    current_balance DECIMAL(18,8) DEFAULT 10000,
    last_heartbeat TIMESTAMPTZ,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- Trading Signals (for analysis and debugging)
CREATE TABLE signals (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    signal_type VARCHAR(10) NOT NULL, -- buy, sell, hold
    price DECIMAL(18,8),
    confidence DECIMAL(5,4),
    indicators JSONB, -- technical indicator values
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_signals_bot_time ON signals(bot_id, timestamp);
-- Trade Execution Records
CREATE TABLE trades (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    signal_id INTEGER REFERENCES signals(id),
    timestamp TIMESTAMPTZ NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    order_type VARCHAR(20) NOT NULL,
    side VARCHAR(5) NOT NULL, -- buy, sell
    price DECIMAL(18,8) NOT NULL,
    quantity DECIMAL(18,8) NOT NULL,
    status VARCHAR(20) NOT NULL,
    fees DECIMAL(18,8) DEFAULT 0,
    pnl DECIMAL(18,8), -- profit/loss for this trade
    balance_after DECIMAL(18,8), -- portfolio balance after trade
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_trades_bot_time ON trades(bot_id, timestamp);
-- Performance Snapshots (for plotting portfolio over time)
CREATE TABLE bot_performance (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    total_value DECIMAL(18,8) NOT NULL, -- current portfolio value
    cash_balance DECIMAL(18,8) NOT NULL,
    crypto_balance DECIMAL(18,8) NOT NULL,
    total_trades INTEGER DEFAULT 0,
    winning_trades INTEGER DEFAULT 0,
    total_fees DECIMAL(18,8) DEFAULT 0,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_bot_performance_bot_time ON bot_performance(bot_id, timestamp);
```
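For ingestion, the UNIQUE constraint on market_data supports idempotent upserts while a candle is still forming; a psycopg2 sketch (the connection itself is assumed):
```python
UPSERT_SQL = """
    INSERT INTO market_data (exchange, symbol, timeframe, timestamp,
                             open, high, low, close, volume, trades_count)
    VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
    ON CONFLICT (exchange, symbol, timeframe, timestamp) DO UPDATE SET
        high = GREATEST(market_data.high, EXCLUDED.high),
        low = LEAST(market_data.low, EXCLUDED.low),
        close = EXCLUDED.close,
        volume = EXCLUDED.volume,
        trades_count = EXCLUDED.trades_count
"""

def upsert_candle(conn, row: tuple) -> None:
    """Insert a candle, or refresh it in place if the bar is still open."""
    with conn.cursor() as cur:
        cur.execute(UPSERT_SQL, row)
    conn.commit()
```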
**Data Storage Strategy**:
- **OHLCV Data**: Primary source for bot operations, kept indefinitely, optimized indexes
- **Raw Trade Data**: Optional table, only if detailed backtesting needed, can be partitioned monthly
- **Alternative for Raw Data**: Store in compressed files (Parquet/CSV) instead of database for cost efficiency (sketched below)
**MVP Approach**: Start with OHLCV data only, add raw data storage later if advanced backtesting requires it.
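If the file-based alternative is chosen, raw trades could be written out as daily Parquet files (a sketch; the directory layout is an assumption and pyarrow is required):
```python
from pathlib import Path
import pandas as pd

RAW_DIR = Path("data/raw_trades")

def flush_raw_trades(symbol: str, day: str, trades: list[dict]) -> None:
    """Write one day's raw trades for a symbol to a compressed Parquet file."""
    out = RAW_DIR / symbol / f"{day}.parquet"
    out.parent.mkdir(parents=True, exist_ok=True)
    pd.DataFrame(trades).to_parquet(out, compression="zstd")  # needs pyarrow
```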
### Technology Stack
The platform will be built using the following technologies:
- **Backend Framework**: Python 3.10+ with Dash (includes built-in Flask server for REST API endpoints)
- **Database**: PostgreSQL 14+ (with TimescaleDB extension for time-series optimization)
- **Real-time Messaging**: Redis (for pub/sub messaging between components)
- **Frontend**: Dash with Plotly (for visualization and control interface) and Mantine UI components
- **Configuration**: JSON files for strategy parameters and bot configurations
- **Deployment**: Docker container setup for development and production
### API Design
**Dash Callbacks**: Real-time updates and user interactions
**REST Endpoints**: Historical data queries for backtesting and analysis
```python
# Built-in Flask routes for historical data
@app.server.route('/api/bot/<bot_id>/trades')
@app.server.route('/api/market/<symbol>/history')
@app.server.route('/api/backtest/results/<test_id>')
```
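One of these endpoints fleshed out as a sketch (`query` is an assumed thin wrapper around the database connection):
```python
from flask import jsonify

@app.server.route('/api/bot/<bot_id>/trades')
def bot_trades(bot_id):
    """Return a bot's trade history as JSON for charts and analysis."""
    rows = query(  # assumed helper returning dict-like rows
        "SELECT timestamp, side, price, quantity, fees, pnl "
        "FROM trades WHERE bot_id = %s ORDER BY timestamp",
        (bot_id,),
    )
    return jsonify([dict(r) for r in rows])
```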
### Data Flow


@ -1,259 +0,0 @@
# Dependency Management Guide
This guide explains how to manage Python dependencies in the Crypto Trading Bot Dashboard project.
## Local Development
### Adding New Dependencies
#### 1. Core Dependencies (Required for Runtime)
To add a new core dependency:
```bash
# Method 1: Add directly to pyproject.toml
# Edit pyproject.toml and add to the dependencies list:
# "new-package>=1.0.0",
# Method 2: Use UV to add and update pyproject.toml
uv add "new-package>=1.0.0"
# Sync to install
uv sync
```
#### 2. Development Dependencies (Testing, Linting, etc.)
```bash
# Add development-only dependency
uv add --dev "new-dev-package>=1.0.0"
# Or edit pyproject.toml under [project.optional-dependencies.dev]
# Then run:
uv sync --dev
```
### Installing Dependencies
```bash
# Install all dependencies
uv sync
# Install with development dependencies
uv sync --dev
# Install only production dependencies
uv sync --no-dev
```
### Updating Dependencies
```bash
# Update all dependencies to latest compatible versions
uv sync --upgrade
# Update specific package
uv sync --upgrade-package "package-name"
```
## Docker Environment
### Current Approach
The project uses a **volume-based development** approach where:
- Dependencies are installed in the local environment using UV
- Docker containers provide only infrastructure services (PostgreSQL, Redis)
- The Python application runs locally with hot reload
### Adding Dependencies for Docker-based Development
If you want to run the entire application in Docker:
#### 1. Create a Dockerfile
```dockerfile
FROM python:3.10-slim
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y \
    postgresql-client \
    curl \
    && rm -rf /var/lib/apt/lists/*
# Install UV
RUN pip install uv
# Copy dependency files
COPY pyproject.toml ./
COPY README.md ./
# Install dependencies
RUN uv sync --no-dev
# Copy application code
COPY . .
# Expose port
EXPOSE 8050
# Run application
CMD ["uv", "run", "python", "main.py"]
```
#### 2. Add Application Service to docker-compose.yml
```yaml
services:
  app:
    build: .
    container_name: dashboard_app
    ports:
      - "8050:8050"
    volumes:
      - .:/app
      - uv_cache:/root/.cache/uv
    environment:
      - DATABASE_URL=postgresql://dashboard:dashboard123@postgres:5432/dashboard
      - REDIS_URL=redis://redis:6379
    depends_on:
      - postgres
      - redis
    networks:
      - dashboard-network
    restart: unless-stopped

volumes:
  uv_cache:
```
#### 3. Development Workflow with Docker
```bash
# Build and start all services
docker-compose up --build
# Add new dependency
# 1. Edit pyproject.toml
# 2. Rebuild container
docker-compose build app
docker-compose up -d app
# Or use dev dependencies mount
# Mount local UV cache for faster rebuilds
```
## Hot Reload Development
### Method 1: Local Development (Recommended)
Run services in Docker, application locally with hot reload:
```bash
# Start infrastructure
python scripts/dev.py start
# Run app with hot reload
uv run python scripts/dev.py dev-server
```
### Method 2: Docker with Volume Mounts
If using Docker for the app, mount source code:
```yaml
volumes:
  - .:/app            # Mount source code
  - /app/__pycache__  # Exclude cache
```
## Best Practices
### 1. Version Pinning
```toml
# Good: Specify minimum version with compatibility
"requests>=2.31.0,<3.0.0"
# Acceptable: Major version constraint
"pandas>=2.1.0"
# Avoid: Exact pinning (except for critical deps)
"somepackage==1.2.3" # Only if necessary
```
### 2. Dependency Categories
```toml
[project]
dependencies = [
    # Core web framework
    "dash>=2.14.0",
    # Database
    "sqlalchemy>=2.0.0",
    "psycopg2-binary>=2.9.0",
    # ... group related dependencies
]
```
### 3. Security Updates
```bash
# Check for security vulnerabilities
pip-audit
# Update specific vulnerable package
uv sync --upgrade-package "vulnerable-package"
```
## Troubleshooting
### Common Issues
1. **Dependency Conflicts**
```bash
# Clear UV cache and reinstall
uv cache clean
uv sync --refresh
```
2. **PostgreSQL Connection Issues**
```bash
# Ensure psycopg2-binary is installed
uv add "psycopg2-binary>=2.9.0"
```
3. **Docker Build Failures**
```bash
# Clean docker build cache
docker system prune --volumes
docker-compose build --no-cache
```
### Debugging Dependencies
```bash
# Show installed packages
uv pip list
# Show package details
uv pip show <package-name>
# Show dependency tree
uv pip tree
# Check for conflicts
uv pip check
```
## Migration from requirements.txt
If you have an existing `requirements.txt`:
```bash
# Convert to pyproject.toml
uv add -r requirements.txt
# Or manually copy dependencies to pyproject.toml
# Then remove requirements.txt
```