This commit is contained in:
Vasily.onl 2025-05-30 16:43:09 +08:00
parent de6ddbf1d8
commit 4e46c82ff1
3 changed files with 175 additions and 163 deletions

README.md

@ -1,118 +1,83 @@
# Crypto Trading Bot Platform
A simplified crypto trading bot platform for strategy testing and development. Test multiple trading strategies in parallel using real OKX market data with virtual trading simulation.
## Overview
This platform enables rapid strategy testing within 1-2 weeks of development. Built with a monolithic architecture for simplicity, it supports 5-10 concurrent trading bots with real-time monitoring and performance tracking.
## Key Features
- **Multi-Bot Management**: Run 5-10 trading bots simultaneously with different strategies
- **Real-time Monitoring**: Live OHLCV charts with bot trading signals overlay
- **Virtual Trading**: Simulation-first approach with realistic fee modeling
- **JSON Configuration**: Easy strategy parameter testing without code changes
- **Backtesting Engine**: Test strategies on historical market data
- **Crash Recovery**: Automatic bot restart and state restoration
## Tech Stack
- **Framework**: Python 3.10+ with Dash (unified frontend/backend)
- **Database**: PostgreSQL with optimized OHLCV data storage
- **Real-time**: Redis pub/sub for live updates
- **Package Management**: UV
- **Development**: Docker for consistent environment
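The Docker setup would typically mean a small Compose file for the backing services; a minimal sketch consistent with the stack above (image tags, service names, and credentials are illustrative assumptions, not the project's actual file):

```yaml
services:
  postgres:
    image: timescale/timescaledb:latest-pg14   # PostgreSQL 14+ with TimescaleDB
    environment:
      POSTGRES_DB: trading_bots                # illustrative name
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```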
## Quick Start
### Prerequisites
- Python 3.10+
- Docker and Docker Compose
- UV package manager
### Setup
```bash
python scripts/dev.py setup # Setup environment
python scripts/dev.py start # Start services
python scripts/dev.py dev-server # Start with hot reload
```
## Project Structure
```
Dashboard/
├── app.py # Main Dash application
├── bot_manager.py # Bot lifecycle management
├── database/ # PostgreSQL models and connection
├── data/ # OKX API integration
├── components/ # Dashboard UI components
├── strategies/ # Trading strategy modules
├── config/bot_configs/ # JSON bot configurations
└── docs/ # Project documentation
```
## Documentation
- **[Product Requirements](docs/crypto-bot-prd.md)** - Complete system specifications and requirements
- **[Technical Architecture](docs/architecture.md)** - Implementation details and component design
- **[Platform Overview](docs/specification.md)** - Human-readable system overview
## Configuration Example
Bot configurations use simple JSON files for rapid testing:
```json
{
  "bot_id": "ema_crossover_01",
  "strategy_file": "ema_crossover.json",
  "symbol": "BTC-USDT",
  "virtual_balance": 10000,
  "enabled": true
}
```
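A bot manager might load and validate these files along the following lines (a sketch only; the function name, path handling, and required-field check are assumptions mirroring the example above):

```python
import json
from pathlib import Path

# Fields every bot config is expected to carry (mirrors the example above)
REQUIRED_FIELDS = {"bot_id", "strategy_file", "symbol", "virtual_balance", "enabled"}

def load_bot_config(path: str) -> dict:
    """Load a bot config JSON file and check that the expected fields are present."""
    config = json.loads(Path(path).read_text())
    missing = REQUIRED_FIELDS - config.keys()
    if missing:
        raise ValueError(f"config {path} missing fields: {sorted(missing)}")
    return config
```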
## Development Timeline
**Target**: Functional system within 1-2 weeks
- **Phase 1** (Days 1-5): Database, data collection, basic visualization
- **Phase 2** (Days 6-10): Bot management, backtesting, trading logic
- **Phase 3** (Days 11-14): Testing, optimization, deployment
## Contributing
1. Review [architecture documentation](docs/architecture.md) for technical approach
2. Check [task list](tasks/tasks-prd-crypto-bot-dashboard.md) for available work
3. Follow project coding standards and use UV for dependencies
4. Update documentation when adding features


@ -1,23 +1,30 @@
## Architecture Components
### 1. Data Collector
**Responsibility**: OHLCV data collection and aggregation from exchanges
```python
class DataCollector:
    def __init__(self):
        self.providers = {}          # Registry of data providers
        self.store_raw_data = False  # Optional raw data storage

    def register_provider(self, name: str, provider: DataProvider):
        """Register a new data provider"""

    def start_collection(self, symbols: List[str], timeframes: List[str]):
        """Start collecting OHLCV data for specified symbols and timeframes"""

    def process_raw_trades(self, raw_trades: List[dict]) -> dict:
        """Aggregate raw trades into OHLCV candles"""

    def store_ohlcv_data(self, ohlcv_data: dict):
        """Store OHLCV data in PostgreSQL market_data table"""

    def send_market_update(self, symbol: str, ohlcv_data: dict):
        """Send Redis signal with OHLCV update to active bots"""

    def store_raw_data_optional(self, raw_data: dict):
        """Optionally store raw data for detailed backtesting"""
```
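`register_provider` takes a `DataProvider`, whose interface isn't shown in this hunk. One plausible minimal shape, sketched as a structural protocol (the method names and signatures are assumptions, not the project's actual interface):

```python
from typing import List, Protocol, runtime_checkable

@runtime_checkable
class DataProvider(Protocol):
    """Minimal structural interface an exchange data provider might expose."""

    def subscribe_trades(self, symbols: List[str]) -> None:
        """Open a stream of raw trades for the given symbols."""
        ...

    def fetch_ohlcv(self, symbol: str, timeframe: str, limit: int) -> List[dict]:
        """Fetch recent historical candles for backfill."""
        ...
```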
### 2. Strategy Engine
@ -42,43 +49,39 @@ class BaseStrategy:
```python
class BotManager:
    def __init__(self):
        self.active_bots = {}
        self.config_path = "config/bots/"

    def load_bot_config(self, bot_id: int) -> dict:
        """Load bot configuration from JSON file"""

    def start_bot(self, bot_id: int):
        """Start a bot instance with crash recovery monitoring"""

    def stop_bot(self, bot_id: int):
        """Stop a bot instance and update database status"""

    def process_signal(self, bot_id: int, signal: Signal):
        """Process signal and make virtual trading decision"""

    def update_bot_heartbeat(self, bot_id: int):
        """Update bot heartbeat in database for monitoring"""

    def restart_crashed_bots(self):
        """Monitor and restart crashed bots (max 3 attempts/hour)"""

    def restore_active_bots_on_startup(self):
        """Restore active bot states after application restart"""
```
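The "max 3 attempts/hour" restart policy can be sketched as a sliding-window throttle; the class name and window bookkeeping here are illustrative, not the actual implementation:

```python
import time
from collections import defaultdict, deque

class CrashRecoveryThrottle:
    """Allow at most max_attempts restarts per rolling window, per bot."""

    def __init__(self, max_attempts: int = 3, window_seconds: int = 3600):
        self.max_attempts = max_attempts
        self.window = window_seconds
        self.attempts = defaultdict(deque)  # bot_id -> timestamps of past restarts

    def should_restart(self, bot_id: int, now: float = None) -> bool:
        now = time.time() if now is None else now
        history = self.attempts[bot_id]
        # Drop restart attempts that fell out of the rolling window
        while history and now - history[0] > self.window:
            history.popleft()
        if len(history) >= self.max_attempts:
            return False  # budget exhausted, leave the bot stopped for now
        history.append(now)
        return True
```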
## Communication Architecture
### Redis Pub/Sub Patterns
```python
# Real-time market data distribution
MARKET_DATA_CHANNEL = "market:{symbol}" # OHLCV updates
BOT_SIGNALS_CHANNEL = "signals:{bot_id}" # Trading decisions
BOT_STATUS_CHANNEL = "status:{bot_id}" # Bot lifecycle events
SYSTEM_EVENTS_CHANNEL = "system:events" # Global notifications
```
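A concrete message on the `market:{symbol}` channel might be built like this (the payload shape and helper are assumptions; the actual publish call via redis-py is shown only as a comment since it needs a running Redis):

```python
import json

MARKET_DATA_CHANNEL = "market:{symbol}"

def build_market_update(symbol: str, ohlcv: dict) -> tuple:
    """Build the (channel, payload) pair for publishing one closed OHLCV candle."""
    channel = MARKET_DATA_CHANNEL.format(symbol=symbol)
    payload = json.dumps({"symbol": symbol, **ohlcv})
    return channel, payload

# Publishing (assumes a running Redis server and the redis-py client):
#   r = redis.Redis()
#   r.publish(*build_market_update("BTC-USDT", candle))
```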
## Time Aggregation Strategy
@ -112,54 +115,98 @@ def aggregate_to_timeframe(ticks: List[dict], timeframe: str) -> dict:
```python
        yield candle
```
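Only the tail of `aggregate_to_timeframe` survives in this hunk. A self-contained sketch of one plausible implementation, bucketing trades by epoch time (the tick field names `timestamp`/`price`/`size` and the timeframe table are assumptions):

```python
from typing import Iterator, List

TIMEFRAME_SECONDS = {"1m": 60, "5m": 300, "1h": 3600}

def aggregate_to_timeframe(ticks: List[dict], timeframe: str) -> Iterator[dict]:
    """Group raw trades (sorted by timestamp) into OHLCV candles for one timeframe."""
    bucket = TIMEFRAME_SECONDS[timeframe]
    candle = None
    for tick in ticks:
        # Align the trade onto its candle's start time
        start = tick["timestamp"] - tick["timestamp"] % bucket
        if candle is None or candle["timestamp"] != start:
            if candle is not None:
                yield candle  # previous candle is closed
            candle = {"timestamp": start, "open": tick["price"], "high": tick["price"],
                      "low": tick["price"], "close": tick["price"], "volume": 0.0}
        candle["high"] = max(candle["high"], tick["price"])
        candle["low"] = min(candle["low"], tick["price"])
        candle["close"] = tick["price"]
        candle["volume"] += tick["size"]
    if candle is not None:
        yield candle
```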
## Backtesting Strategy
### Vectorized Processing Approach
```python
from typing import List

import pandas as pd

def backtest_strategy_simple(strategy, market_data: pd.DataFrame, initial_balance: float = 10000):
    """
    Simple vectorized backtesting using pandas operations

    Parameters:
    - strategy: Strategy instance with process_data method
    - market_data: DataFrame with OHLCV data
    - initial_balance: Starting portfolio value

    Returns:
    - Portfolio performance metrics and trade history
    """
    signals = []
    portfolio_value = []
    current_balance = initial_balance
    position = 0

    # Walk the candles in order, handing the strategy the expanding history.
    # Positional slicing via enumerate keeps this correct for non-RangeIndex data.
    for i, (_, row) in enumerate(market_data.iterrows()):
        signal = strategy.process_data(market_data.iloc[:i + 1])

        # Simulate trade execution (0.1% fee modeled on sells)
        if signal.action == 'buy' and position == 0:
            position = current_balance / row['close']
            current_balance = 0
        elif signal.action == 'sell' and position > 0:
            current_balance = position * row['close'] * 0.999  # 0.1% fee
            position = 0

        # Track portfolio value
        total_value = current_balance + (position * row['close'])
        portfolio_value.append(total_value)
        signals.append(signal)

    return {
        'final_value': portfolio_value[-1],
        'total_return': (portfolio_value[-1] / initial_balance - 1) * 100,
        'signals': signals,
        'portfolio_progression': portfolio_value
    }

def calculate_performance_metrics(portfolio_values: List[float]) -> dict:
    """Calculate standard performance metrics"""
    series = pd.Series(portfolio_values)
    returns = series.pct_change().dropna()

    return {
        'sharpe_ratio': returns.mean() / returns.std() if returns.std() > 0 else 0,
        'max_drawdown': (series.cummax() - series).max(),
        'win_rate': (returns > 0).mean(),
        # Candle-level return count, an approximation of trade activity
        'total_trades': len(returns)
    }
```
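For `backtest_strategy_simple` to work, a strategy only needs a `process_data` method returning an object with an `action` attribute. A toy moving-average crossover satisfying that contract (here `SimpleNamespace` stands in for the platform's Signal type, which is an assumption):

```python
from types import SimpleNamespace
import pandas as pd

class SMACrossStrategy:
    """Toy strategy matching the process_data contract used by the backtester."""

    def __init__(self, fast: int = 2, slow: int = 3):
        self.fast, self.slow = fast, slow

    def process_data(self, data: pd.DataFrame):
        if len(data) < self.slow:
            return SimpleNamespace(action='hold')  # not enough history yet
        fast_ma = data['close'].tail(self.fast).mean()
        slow_ma = data['close'].tail(self.slow).mean()
        if fast_ma > slow_ma:
            return SimpleNamespace(action='buy')
        if fast_ma < slow_ma:
            return SimpleNamespace(action='sell')
        return SimpleNamespace(action='hold')
```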
### Optimization Techniques
1. **Vectorized Operations**: Use pandas for bulk data processing
2. **Efficient Indexing**: Pre-calculate indicators where possible
3. **Memory Management**: Process data in chunks for large datasets
4. **Simple Parallelization**: Run multiple strategy tests independently
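As an illustration of point 1, an EMA crossover can be computed for an entire price series in a few vectorized pandas calls, instead of looping candle by candle (a sketch; the +1/-1 signal encoding is an assumption):

```python
import pandas as pd

def ema_crossover_signals(close: pd.Series, fast: int = 12, slow: int = 26) -> pd.Series:
    """Vectorized EMA crossover: +1 where the fast EMA crosses above the slow EMA,
    -1 where it crosses below, 0 elsewhere."""
    fast_ema = close.ewm(span=fast, adjust=False).mean()
    slow_ema = close.ewm(span=slow, adjust=False).mean()
    # 1 while fast is above slow, 0 otherwise; its diff marks the crossings
    state = (fast_ema > slow_ema).astype(int)
    return state.diff().fillna(0)
```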
## Key Design Principles
1. **OHLCV-First Data Strategy**: Primary focus on aggregated candle data, optional raw data storage
2. **Signal Tracking**: All trading signals recorded in database for analysis and debugging
3. **JSON Configuration**: Strategy parameters and bot configs in JSON for rapid testing
4. **Real-time State Management**: Bot states updated via Redis and PostgreSQL for monitoring
5. **Crash Recovery**: Automatic bot restart and application state recovery
6. **Virtual Trading**: Simulation-first approach with fee modeling
7. **Simplified Architecture**: Monolithic design with clear component boundaries for future scaling
## Database Architecture
### Core Tables
- **market_data**: OHLCV candles for bot operations and backtesting (primary table)
- **bots**: Bot instances with JSON config references and status tracking
- **signals**: Trading decisions with confidence scores and indicator values
- **trades**: Virtual trade execution records with P&L tracking
- **bot_performance**: Portfolio snapshots for performance visualization
### Optional Tables
- **raw_trades**: Raw tick data for advanced backtesting (partitioned by month)
### Data Access Patterns
- **Real-time**: Bots read recent OHLCV data via indexes on (symbol, timeframe, timestamp)
- **Historical**: Dashboard queries aggregated performance data for charts
- **Backtesting**: Sequential access to historical OHLCV data by date range


@ -2,7 +2,7 @@
## Executive Summary
This simplified PRD addresses the need for a rapid-deployment crypto trading bot platform designed for internal testing and strategy development. The platform eliminates microservices complexity in favor of a monolithic architecture that can be functional within 1-2 weeks while supporting 5-10 concurrent bots. The system focuses on core functionality including data collection, strategy execution, backtesting, and visualization without requiring advanced monitoring or orchestration tools.
## System Architecture Overview
@ -12,11 +12,11 @@ The platform follows a streamlined monolithic design that consolidates all compo
### Core Technologies
The platform utilizes a Python-based technology stack optimized for rapid development. The backend employs Python 3.10+ with Dash framework (including built-in Flask server for REST APIs), PostgreSQL 14+ with TimescaleDB extension for time-series optimization, and Redis for real-time pub/sub messaging. The frontend leverages Dash with Plotly for interactive visualization and bot control interfaces, providing a unified full-stack solution.
### Database Design
The database schema emphasizes simplicity while supporting essential trading operations. The core approach separates frequently-accessed OHLCV market data from optional raw tick data for optimal performance. Core tables include market_data for OHLCV candles used by bots, bots for instance management with JSON configuration references, signals for trading decisions, trades for execution records, and bot_performance for portfolio tracking. Raw trade data storage is optional and can be implemented later for advanced backtesting scenarios.
## Development Methodology
@ -26,7 +26,7 @@ The development follows a structured three-phase approach designed for rapid dep
### Strategy Implementation Example
The platform supports multiple trading strategies through a unified interface design. Strategy parameters are stored in JSON files, making it easy to test different configurations without rebuilding code. A simple moving average crossover strategy demonstrates the system's capability to generate buy and sell signals based on technical indicators. This example strategy shows how the system processes market data, calculates moving averages, generates trading signals, and tracks portfolio performance over time. The visualization includes price movements, moving average lines, signal markers, and portfolio value progression.
## Backtesting and Performance Analysis
@ -42,11 +42,11 @@ The platform tracks portfolio allocation and performance throughout strategy exe
### Real-Time Processing
The data collection module connects to exchange APIs (starting with OKX) to retrieve market information via WebSocket connections. Instead of storing all raw tick data, the system focuses on aggregating trades into OHLCV candles (1-minute, 5-minute, hourly, etc.) which are stored in PostgreSQL. Processed OHLCV data is published through Redis channels for real-time distribution to active trading bots. Raw trade data can optionally be stored for advanced backtesting scenarios.
### Signal Generation and Execution
Trading strategies subscribe to relevant OHLCV data streams and generate trading signals based on configured algorithms stored in JSON files for easy parameter testing. The bot manager validates signals against portfolio constraints and executes simulated trades with realistic fee modeling. The system includes automatic crash recovery - bots are monitored and restarted if they fail, and the application can restore active bot states after system restarts.
## Future Scalability Considerations
@ -76,7 +76,7 @@ Database indexing on timestamp and symbol fields ensures efficient time-series q
### Development Milestones
Platform success is measured through specific deliverables including core functionality completion within 14 days, system stability maintenance at 99% uptime during internal testing, successful backtesting of at least 3 different strategies, and concurrent operation of 5+ bots for 72+ hours to demonstrate the platform's scalability within its target range.
### Strategy Testing Capabilities
@ -84,6 +84,6 @@ The system enables comprehensive strategy validation through historical simulati
## Conclusion
This simplified crypto trading bot platform balances rapid development requirements with future scalability needs. The monolithic architecture enables deployment within 1-2 weeks while maintaining architectural flexibility for future enhancements. The OHLCV-focused data approach optimizes performance by avoiding unnecessary raw data storage, while JSON-based configuration files enable rapid strategy parameter testing without code changes.
Clear component separation, streamlined database design, and strategic technology choices create a foundation that supports both immediate testing objectives and long-term platform evolution. The platform's focus on essential functionality without unnecessary complexity ensures teams can begin strategy testing quickly while building toward more sophisticated implementations as requirements expand. This approach maximizes development velocity while preserving options for future architectural evolution and feature enhancement.