PRD
This commit is contained in: parent 416f9c565c · commit 06df67c756

@@ -12,7 +12,7 @@

## Current Requirements & Constraints

- **Speed to Deployment**: System must be functional within 1-2 weeks
- **Scale**: Support for 5-10 concurrent trading bots
- **Architecture**: Monolithic application instead of microservices
- **User Access**: Internal use only initially (no multi-user authentication)
- **Infrastructure**: Simplified deployment without Kubernetes/Docker Swarm

@@ -24,24 +24,106 @@

The platform will follow a monolithic architecture pattern to enable rapid development while providing clear separation between components:

### Data Flow Architecture

```
OKX Exchange API (WebSocket)
        ↓
Data Collector → OHLCV Aggregator → PostgreSQL (market_data)
        ↓                               ↓
[Optional] Raw Trade Storage      Redis Pub/Sub → Strategy Engine (JSON configs)
        ↓                               ↓
Files/Database (raw_trades)       Signal Generation → Bot Manager
                                        ↓
                  PostgreSQL (signals, trades, bot_performance)
                                        ↓
                  Dashboard (REST API) ← PostgreSQL (historical data)
                                        ↑
                  Real-time Updates ← Redis Channels
```

**Data Processing Priority**:

1. **Real-time**: Raw data → OHLCV candles → Redis → Bots (primary flow)
2. **Historical**: OHLCV data from PostgreSQL for backtesting and charts
3. **Advanced Analysis**: Raw trade data (if stored) for detailed backtesting

### Redis Channel Design

```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}"     # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}"    # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}"      # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events"     # Global notifications
```

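As a rough illustration of how these channels are meant to be used, the sketch below shows the Data Collector publishing finished candles and the Strategy Engine consuming them with redis-py. The helper names and the candle payload are assumptions for illustration, not part of this PRD.

```python
# Sketch only: publisher (Data Collector) and subscriber (Strategy Engine)
# for the market data channel defined above.
import json
import redis

MARKET_DATA_CHANNEL = "market:{symbol}"  # as defined above

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def publish_candle(symbol: str, candle: dict) -> None:
    """Data Collector side: broadcast a completed OHLCV candle."""
    r.publish(MARKET_DATA_CHANNEL.format(symbol=symbol), json.dumps(candle))

def listen_for_candles(symbol: str) -> None:
    """Strategy Engine side: block on new candles for one symbol."""
    pubsub = r.pubsub()
    pubsub.subscribe(MARKET_DATA_CHANNEL.format(symbol=symbol))
    for message in pubsub.listen():
        if message["type"] != "message":
            continue  # skip subscribe confirmations
        candle = json.loads(message["data"])
        # hand the candle to signal generation here
        print(candle)
```
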
### Configuration Strategy

**PostgreSQL for**: Market data, bot instances, trades, signals, performance metrics
**JSON files for**: Strategy parameters, bot configurations (rapid testing and parameter tuning)

```json
// config/strategies/ema_crossover.json
{
  "strategy_name": "EMA_Crossover",
  "parameters": {
    "fast_period": 12,
    "slow_period": 26,
    "risk_percentage": 0.02
  }
}

// config/bots/bot_001.json
{
  "bot_id": "bot_001",
  "strategy_file": "ema_crossover.json",
  "symbol": "BTC-USDT",
  "virtual_balance": 10000,
  "enabled": true
}
```

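A minimal sketch of how a bot could be wired up from these files at startup is shown below; the `config/` layout follows the examples above, while the loader itself (function name, merging strategy parameters into the bot config) is an assumption.

```python
# Illustrative loader for the JSON configs shown above; only the file layout
# and field names come from the examples, the rest is assumed.
import json
from pathlib import Path

CONFIG_DIR = Path("config")

def load_bot_config(bot_id: str) -> dict:
    """Read a bot config and attach its referenced strategy parameters."""
    bot_cfg = json.loads((CONFIG_DIR / "bots" / f"{bot_id}.json").read_text())
    strategy_path = CONFIG_DIR / "strategies" / bot_cfg["strategy_file"]
    bot_cfg["strategy"] = json.loads(strategy_path.read_text())
    return bot_cfg

if __name__ == "__main__":
    cfg = load_bot_config("bot_001")
    print(cfg["symbol"], cfg["strategy"]["parameters"]["fast_period"])  # BTC-USDT 12
```
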
### Error Handling Strategy

**Bot Crash Recovery**:
- Monitor bot processes every 30 seconds
- Auto-restart crashed bots if status = 'active'
- Log all crashes with stack traces
- Maximum 3 restart attempts per hour

**Exchange Connection Issues**:
- Retry with exponential backoff (1s, 2s, 4s, 8s, max 60s)
- Switch to backup WebSocket connection if available
- Log connection quality metrics

**Database Errors**:
- Continue operation with in-memory cache for up to 5 minutes
- Queue operations for retry when connection restored
- Alert on prolonged database disconnection

**Application Restart Recovery**:
- Read bot states from database on startup
- Restore active bots to 'active' status
- Resume data collection for all monitored symbols

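The reconnect rule above ("1s, 2s, 4s, 8s, max 60s") could be implemented roughly as follows; `connect_ws` is a placeholder for the real OKX WebSocket client, so treat this as a sketch of the policy rather than the actual collector code.

```python
# Rough sketch of the retry policy described under "Exchange Connection Issues".
import logging
import time

logger = logging.getLogger("data_collector")

def connect_with_backoff(connect_ws, max_delay: float = 60.0) -> None:
    """Retry the WebSocket connection with delays of 1s, 2s, 4s, ... up to max_delay."""
    delay = 1.0
    while True:
        try:
            connect_ws()   # blocks for as long as the connection stays healthy
            delay = 1.0    # reset the backoff after a successful session
        except ConnectionError as exc:
            logger.warning("connection lost (%s), retrying in %.0fs", exc, delay)
            time.sleep(delay)
            delay = min(delay * 2, max_delay)
```
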
### Component Details and Functional Requirements

1. **Data Collection Module**
   - Connect to exchange APIs (OKX initially) via WebSocket
   - Aggregate real-time trades into OHLCV candles (1m, 5m, 15m, 1h, 4h, 1d)
   - Store OHLCV data in PostgreSQL for bot operations and backtesting
   - Send real-time candle updates through Redis
   - Optional: Store raw trade data for advanced backtesting

**FR-001: Unified Data Provider Interface**
- Support multiple exchanges through standardized adapters
- Real-time OHLCV aggregation with WebSocket connections
- Primary focus on candle data, raw data storage optional
- Data validation and error handling mechanisms

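One way to express the "standardized adapters" requirement is an abstract base class that each exchange integration implements; the method set below is an illustrative assumption, not a finalized interface.

```python
# Hypothetical adapter interface for FR-001.
from abc import ABC, abstractmethod
from typing import Iterator

class ExchangeAdapter(ABC):
    """Common surface every exchange integration (OKX first) must provide."""

    @abstractmethod
    def stream_trades(self, symbol: str) -> Iterator[dict]:
        """Yield raw trade events from the exchange WebSocket."""

    @abstractmethod
    def normalize_trade(self, raw: dict) -> dict:
        """Map an exchange-specific payload to the internal trade format."""

    @abstractmethod
    def validate_trade(self, trade: dict) -> bool:
        """Reject malformed or out-of-order data before aggregation."""
```
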
**FR-002: Market Data Processing**
- OHLCV aggregation with configurable timeframes (1m base, higher timeframes derived)
- Technical indicator calculation (SMA, EMA, RSI, MACD, Bollinger Bands) on OHLCV data
- Data normalization across different exchanges
- Time alignment following exchange standards (right-aligned candles)

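To make FR-002 concrete, the sketch below derives a higher timeframe from 1m candles and attaches two of the listed indicators using pandas; the DataFrame layout, resampling rule, and indicator periods are assumptions for illustration.

```python
# Sketch of FR-002 on a DataFrame of 1m candles indexed by timestamp with
# columns open/high/low/close/volume.
import pandas as pd

def resample_ohlcv(df_1m: pd.DataFrame, rule: str = "5min") -> pd.DataFrame:
    """Derive a higher, right-aligned timeframe from 1m candles."""
    return df_1m.resample(rule, label="right", closed="right").agg(
        {"open": "first", "high": "max", "low": "min", "close": "last", "volume": "sum"}
    )

def add_indicators(df: pd.DataFrame) -> pd.DataFrame:
    """Attach a simple and an exponential moving average to the candles."""
    out = df.copy()
    out["sma_20"] = out["close"].rolling(window=20).mean()
    out["ema_12"] = out["close"].ewm(span=12, adjust=False).mean()
    return out
```
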

@@ -106,43 +188,40 @@

- Store strategy parameters and bot configuration in JSON in the beginning for simplicity of editing and testing

5. **Backtesting Engine**
   - Run simulations on historical data using vectorized operations for speed
   - Calculate performance metrics
   - Support multiple timeframes and strategy parameter testing
   - Generate comparison reports between strategies

**FR-009: Historical Simulation**
- Strategy backtesting on historical market data
- Performance metric calculation (Sharpe ratio, drawdown, win rate, total return)
- Parameter optimization through grid search (limited combinations for speed) (in future)
- Side-by-side strategy comparison with statistical significance

**FR-010: Simulation Engine**
- Vectorized signal calculation using pandas operations
- Realistic fee modeling (0.1% per trade for OKX)
- Look-ahead bias prevention with proper timestamp handling
- Configurable test periods (1 day to 24 months)

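A compressed sketch of the vectorized approach FR-010 describes is shown below: EMA crossover signals computed column-wise, positions shifted by one candle to avoid look-ahead, and a flat 0.1% fee per entry or exit. The function shape and the returned metric are assumptions, not the final engine.

```python
# Minimal vectorized backtest in the spirit of FR-010.
import pandas as pd

def backtest_ema_crossover(df: pd.DataFrame, fast: int = 12, slow: int = 26,
                           fee_rate: float = 0.001) -> pd.Series:
    """Return the cumulative strategy return for close prices in df['close']."""
    fast_ema = df["close"].ewm(span=fast, adjust=False).mean()
    slow_ema = df["close"].ewm(span=slow, adjust=False).mean()
    position = (fast_ema > slow_ema).astype(int)   # 1 = long, 0 = flat
    position = position.shift(1).fillna(0)         # act on the next candle (no look-ahead)
    returns = df["close"].pct_change().fillna(0)
    trades = position.diff().abs().fillna(0)       # 1 whenever we enter or exit
    strategy_returns = position * returns - trades * fee_rate
    return (1 + strategy_returns).cumprod() - 1
```
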
6. **Dashboard & Visualization**
   - Display real-time market data and bot status
   - Show portfolio value progression over time
   - Visualize trade history with buy/sell markers on price charts
   - Provide simple bot control interface (start/stop/configure)

**FR-011: Dashboard Interface**
- Real-time bot monitoring with status indicators
- Portfolio performance charts (total value, cash vs crypto allocation)
- Trade history table with P&L per trade
- Simple bot configuration forms for JSON parameter editing

**FR-012: Data Visualization**
- Interactive price charts with strategy signal overlays
- Portfolio value progression charts
- Performance comparison tables (multiple bots side-by-side)
- Fee tracking and total cost analysis

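For FR-011/FR-012, a Dash page would typically poll recent data and redraw a figure through a callback, roughly as sketched below; the component IDs, refresh interval, and the data-fetching stub are assumptions (the real page would also use the Mantine UI components mentioned in the technology stack).

```python
# Sketch of a Dash refresh loop for the portfolio chart.
from dash import Dash, dcc, html, Input, Output
import plotly.graph_objects as go

app = Dash(__name__)
app.layout = html.Div([
    dcc.Graph(id="portfolio-chart"),
    dcc.Interval(id="refresh", interval=5_000),  # poll every 5 seconds
])

def fetch_performance(bot_id: int) -> list[tuple]:
    """Placeholder: the real app would query the bot_performance table."""
    return [("2024-01-01", 10000.0), ("2024-01-02", 10150.0)]

@app.callback(Output("portfolio-chart", "figure"), Input("refresh", "n_intervals"))
def update_portfolio_chart(_):
    rows = fetch_performance(bot_id=1)
    fig = go.Figure(go.Scatter(x=[r[0] for r in rows], y=[r[1] for r in rows], mode="lines"))
    fig.update_layout(title="Portfolio value over time")
    return fig
```
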
7. **Web API Service**
   - REST API for frontend interactions

### Non-Functional Requirements

@@ -183,116 +262,133 @@

### Database Schema

The database schema separates frequently-accessed OHLCV data from raw tick data to optimize performance and storage.

```sql
-- OHLCV Market Data (primary table for bot operations)
CREATE TABLE market_data (
    id SERIAL PRIMARY KEY,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,  -- 1m, 5m, 15m, 1h, 4h, 1d
    timestamp TIMESTAMPTZ NOT NULL,
    open DECIMAL(18,8) NOT NULL,
    high DECIMAL(18,8) NOT NULL,
    low DECIMAL(18,8) NOT NULL,
    close DECIMAL(18,8) NOT NULL,
    volume DECIMAL(18,8) NOT NULL,
    trades_count INTEGER,  -- number of trades in this candle
    created_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(exchange, symbol, timeframe, timestamp)
);
CREATE INDEX idx_market_data_lookup ON market_data(symbol, timeframe, timestamp);
CREATE INDEX idx_market_data_recent ON market_data(timestamp DESC) WHERE timestamp > NOW() - INTERVAL '7 days';

-- Raw Trade Data (optional, for detailed backtesting only)
CREATE TABLE raw_trades (
    id SERIAL PRIMARY KEY,
    exchange VARCHAR(50) NOT NULL DEFAULT 'okx',
    symbol VARCHAR(20) NOT NULL,
    timestamp TIMESTAMPTZ NOT NULL,
    type VARCHAR(10) NOT NULL,  -- trade, order, balance, tick, books
    data JSONB NOT NULL,        -- response from the exchange
    created_at TIMESTAMPTZ DEFAULT NOW()
) PARTITION BY RANGE (timestamp);
CREATE INDEX idx_raw_trades_symbol_time ON raw_trades(symbol, timestamp);

-- Monthly partitions for raw data (if using raw data)
-- CREATE TABLE raw_trades_y2024m01 PARTITION OF raw_trades
--     FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

-- Bot Management (simplified)
CREATE TABLE bots (
    id SERIAL PRIMARY KEY,
    name VARCHAR(100) NOT NULL,
    strategy_name VARCHAR(50) NOT NULL,
    symbol VARCHAR(20) NOT NULL,
    timeframe VARCHAR(5) NOT NULL,
    status VARCHAR(20) NOT NULL DEFAULT 'inactive',  -- active, inactive, error
    config_file VARCHAR(200),                        -- path to JSON config
    virtual_balance DECIMAL(18,8) DEFAULT 10000,
    current_balance DECIMAL(18,8) DEFAULT 10000,
    last_heartbeat TIMESTAMPTZ,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW()
);

-- Trading Signals (for analysis and debugging)
CREATE TABLE signals (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    signal_type VARCHAR(10) NOT NULL,  -- buy, sell, hold
    price DECIMAL(18,8),
    confidence DECIMAL(5,4),
    indicators JSONB,  -- technical indicator values
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_signals_bot_time ON signals(bot_id, timestamp);

-- Trade Execution Records
CREATE TABLE trades (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    signal_id INTEGER REFERENCES signals(id),
    timestamp TIMESTAMPTZ NOT NULL,
    side VARCHAR(5) NOT NULL,  -- buy, sell
    price DECIMAL(18,8) NOT NULL,
    quantity DECIMAL(18,8) NOT NULL,
    fees DECIMAL(18,8) DEFAULT 0,
    pnl DECIMAL(18,8),            -- profit/loss for this trade
    balance_after DECIMAL(18,8),  -- portfolio balance after trade
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_trades_bot_time ON trades(bot_id, timestamp);

-- Performance Snapshots (for plotting portfolio over time)
CREATE TABLE bot_performance (
    id SERIAL PRIMARY KEY,
    bot_id INTEGER REFERENCES bots(id),
    timestamp TIMESTAMPTZ NOT NULL,
    total_value DECIMAL(18,8) NOT NULL,  -- current portfolio value
    cash_balance DECIMAL(18,8) NOT NULL,
    crypto_balance DECIMAL(18,8) NOT NULL,
    total_trades INTEGER DEFAULT 0,
    winning_trades INTEGER DEFAULT 0,
    total_fees DECIMAL(18,8) DEFAULT 0,
    created_at TIMESTAMPTZ DEFAULT NOW()
);
CREATE INDEX idx_bot_performance_bot_time ON bot_performance(bot_id, timestamp);
```

**Data Storage Strategy**:
- **OHLCV Data**: Primary source for bot operations, kept indefinitely, optimized indexes
- **Raw Trade Data**: Optional table, only if detailed backtesting is needed; can be partitioned monthly
- **Alternative for Raw Data**: Store in compressed files (Parquet/CSV) instead of the database for cost efficiency

**MVP Approach**: Start with OHLCV data only; add raw data storage later if advanced backtesting requires it.

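As a usage note for the schema above, a bot would typically pull its recent candles with a parameterized query along these lines; the connection string and helper name are assumptions.

```python
# Illustrative read path against the market_data table defined above.
import psycopg2

def fetch_recent_candles(symbol: str, timeframe: str, limit: int = 200):
    """Return the newest `limit` candles, oldest first, for one symbol/timeframe."""
    conn = psycopg2.connect("postgresql://dashboard:dashboard123@localhost:5432/dashboard")
    try:
        with conn.cursor() as cur:
            cur.execute(
                """
                SELECT timestamp, open, high, low, close, volume
                FROM market_data
                WHERE symbol = %s AND timeframe = %s
                ORDER BY timestamp DESC
                LIMIT %s
                """,
                (symbol, timeframe, limit),
            )
            return list(reversed(cur.fetchall()))
    finally:
        conn.close()
```
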
### Technology Stack

The platform will be built using the following technologies:

- **Backend Framework**: Python 3.10+ with Dash (includes built-in Flask server for REST API endpoints)
- **Database**: PostgreSQL 14+ (with TimescaleDB extension for time-series optimization)
- **Real-time Messaging**: Redis (for pub/sub messaging between components)
- **Frontend**: Dash with Plotly (for visualization and control interface) and Mantine UI components
- **Configuration**: JSON files for strategy parameters and bot configurations
- **Deployment**: Docker container setup for development and production

### API Design

**Dash Callbacks**: Real-time updates and user interactions
**REST Endpoints**: Historical data queries for backtesting and analysis

```python
# Built-in Flask routes for historical data
@app.server.route('/api/bot/<bot_id>/trades')
@app.server.route('/api/market/<symbol>/history')
@app.server.route('/api/backtest/results/<test_id>')
```

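Filled in, one of the routes above might look like the following on the Dash app's built-in Flask server; the `fetch_trades` stub and the response shape are assumptions.

```python
# Sketch of the /api/bot/<bot_id>/trades endpoint.
from dash import Dash
from flask import jsonify

app = Dash(__name__)

def fetch_trades(bot_id: int) -> list[dict]:
    """Placeholder: the real implementation reads from the trades table."""
    return [{"timestamp": "2024-01-01T00:00:00Z", "side": "buy",
             "price": 42000.0, "quantity": 0.01, "pnl": None}]

@app.server.route('/api/bot/<int:bot_id>/trades')
def get_bot_trades(bot_id: int):
    """Return the trade history for one bot as JSON."""
    return jsonify(fetch_trades(bot_id))
```
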
### Data Flow

@@ -1,259 +0,0 @@

# Dependency Management Guide

This guide explains how to manage Python dependencies in the Crypto Trading Bot Dashboard project.

## Local Development

### Adding New Dependencies

#### 1. Core Dependencies (Required for Runtime)

To add a new core dependency:

```bash
# Method 1: Add directly to pyproject.toml
# Edit pyproject.toml and add to the dependencies list:
# "new-package>=1.0.0",

# Method 2: Use UV to add and update pyproject.toml
uv add "new-package>=1.0.0"

# Sync to install
uv sync
```

#### 2. Development Dependencies (Testing, Linting, etc.)

```bash
# Add development-only dependency
uv add --dev "new-dev-package>=1.0.0"

# Or edit pyproject.toml under [project.optional-dependencies.dev]
# Then run:
uv sync --dev
```

### Installing Dependencies

```bash
# Install all dependencies
uv sync

# Install with development dependencies
uv sync --dev

# Install only production dependencies
uv sync --no-dev
```

### Updating Dependencies

```bash
# Update all dependencies to latest compatible versions
uv sync --upgrade

# Update specific package
uv sync --upgrade-package "package-name"
```

## Docker Environment

### Current Approach

The project uses a **volume-based development** approach where:
- Dependencies are installed in the local environment using UV
- Docker containers provide only infrastructure services (PostgreSQL, Redis)
- The Python application runs locally with hot reload

### Adding Dependencies for Docker-based Development

If you want to run the entire application in Docker:

#### 1. Create a Dockerfile

```dockerfile
FROM python:3.10-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    postgresql-client \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Install UV
RUN pip install uv

# Copy dependency files
COPY pyproject.toml ./
COPY README.md ./

# Install dependencies
RUN uv sync --no-dev

# Copy application code
COPY . .

# Expose port
EXPOSE 8050

# Run application
CMD ["uv", "run", "python", "main.py"]
```

#### 2. Add Application Service to docker-compose.yml

```yaml
services:
  app:
    build: .
    container_name: dashboard_app
    ports:
      - "8050:8050"
    volumes:
      - .:/app
      - uv_cache:/root/.cache/uv
    environment:
      - DATABASE_URL=postgresql://dashboard:dashboard123@postgres:5432/dashboard
      - REDIS_URL=redis://redis:6379
    depends_on:
      - postgres
      - redis
    networks:
      - dashboard-network
    restart: unless-stopped

volumes:
  uv_cache:
```

#### 3. Development Workflow with Docker

```bash
# Build and start all services
docker-compose up --build

# Add new dependency
# 1. Edit pyproject.toml
# 2. Rebuild container
docker-compose build app
docker-compose up -d app

# Or use dev dependencies mount
# Mount local UV cache for faster rebuilds
```

## Hot Reload Development

### Method 1: Local Development (Recommended)

Run services in Docker and the application locally with hot reload:

```bash
# Start infrastructure
python scripts/dev.py start

# Run app with hot reload
uv run python scripts/dev.py dev-server
```

### Method 2: Docker with Volume Mounts

If using Docker for the app, mount the source code:

```yaml
volumes:
  - .:/app              # Mount source code
  - /app/__pycache__    # Exclude cache
```

## Best Practices

### 1. Version Pinning

```toml
# Good: Specify minimum version with compatibility
"requests>=2.31.0,<3.0.0"

# Acceptable: Major version constraint
"pandas>=2.1.0"

# Avoid: Exact pinning (except for critical deps)
"somepackage==1.2.3"  # Only if necessary
```

### 2. Dependency Categories

```toml
[project]
dependencies = [
    # Core web framework
    "dash>=2.14.0",

    # Database
    "sqlalchemy>=2.0.0",
    "psycopg2-binary>=2.9.0",

    # ... group related dependencies
]
```

### 3. Security Updates

```bash
# Check for security vulnerabilities
pip-audit

# Update specific vulnerable package
uv sync --upgrade-package "vulnerable-package"
```

## Troubleshooting

### Common Issues

1. **Dependency Conflicts**

   ```bash
   # Clear UV cache and reinstall
   uv cache clean
   uv sync --refresh
   ```

2. **PostgreSQL Connection Issues**

   ```bash
   # Ensure psycopg2-binary is installed
   uv add "psycopg2-binary>=2.9.0"
   ```

3. **Docker Build Failures**

   ```bash
   # Clean docker build cache
   docker system prune --volumes
   docker-compose build --no-cache
   ```

### Debugging Dependencies

```bash
# Show installed packages
uv pip list

# Show package details, including its requirements
uv pip show <package-name>

# Check for conflicts
uv pip check
```

## Migration from requirements.txt

If you have an existing `requirements.txt`:

```bash
# Convert to pyproject.toml
uv add -r requirements.txt

# Or manually copy dependencies to pyproject.toml
# Then remove requirements.txt
```