# Crypto Trading Bot Dashboard - Setup Guide
This guide will help you set up the Crypto Trading Bot Dashboard on a new machine from scratch.
## Prerequisites
### Required Software
1. **Python 3.12+**
- Download from [python.org](https://python.org)
- Ensure Python is added to PATH
2. **UV Package Manager**
```powershell
# Windows (PowerShell)
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
```
3. **Docker Desktop**
- Download from [docker.com](https://docker.com)
- Ensure Docker is running before proceeding
4. **Git**
- Download from [git-scm.com](https://git-scm.com)
### System Requirements
- **RAM**: Minimum 4GB, Recommended 8GB+
- **Storage**: At least 2GB free space
- **OS**: Windows 10/11, macOS 10.15+, or Linux
## Project Setup
### 1. Clone the Repository
```bash
git clone <repository-url>
cd TCPDashboard
```
### 2. Environment Configuration
Create the environment file from the template:
```powershell
# Windows
Copy-Item env.template .env
# macOS/Linux
cp env.template .env
```
**Important**:
- The `.env` file is **REQUIRED** - the application will not work without it
- The `.env` file contains secure passwords for database and Redis
- **Never commit the `.env` file to version control**
- All credentials must be loaded from environment variables - no hardcoded passwords exist in the codebase
Current configuration in `.env`:
```env
POSTGRES_PORT=5434
POSTGRES_PASSWORD=your_secure_password_here
REDIS_PASSWORD=your_redis_password_here
```
### 3. Configure Custom Ports (Optional)
The default configuration maps PostgreSQL to port `5434` so it does not conflict with any other PostgreSQL instances you may already have running. You can change these ports in your `.env` file.
## Database Setup
### 1. Start Database Services
Start PostgreSQL with TimescaleDB and Redis using Docker Compose:
```powershell
docker-compose up -d
```
This will:
- Create a PostgreSQL database with TimescaleDB extension on port `5434`
- Create a Redis instance on port `6379`
- Set up persistent volumes for data storage
- Configure password authentication
- **Automatically initialize the database schema** using the clean schema (without TimescaleDB hypertables for simpler setup)
### 2. Verify Services are Running
Check container status:
```powershell
docker-compose ps
```
Expected output:
```
NAME                 IMAGE                               COMMAND                  SERVICE    CREATED         STATUS                   PORTS
dashboard_postgres   timescale/timescaledb:latest-pg15   "docker-entrypoint.s…"   postgres   X minutes ago   Up X minutes (healthy)   0.0.0.0:5434->5432/tcp
dashboard_redis      redis:7-alpine                      "docker-entrypoint.s…"   redis      X minutes ago   Up X minutes (healthy)   0.0.0.0:6379->6379/tcp
```
### 3. Database Migration System
The project uses **Alembic** for database schema versioning and migrations. This allows for safe, trackable database schema changes.
#### Understanding Migration vs Direct Schema
The project supports two approaches for database setup:
1. **Direct Schema (Default)**: Uses `database/init/schema_clean.sql` for automatic Docker initialization
2. **Migration System**: Uses Alembic for versioned schema changes and updates
#### Migration Commands
**Check migration status:**
```powershell
uv run alembic current
```
**View migration history:**
```powershell
uv run alembic history --verbose
```
**Upgrade to latest migration:**
```powershell
uv run alembic upgrade head
```
**Downgrade to previous migration:**
```powershell
uv run alembic downgrade -1
```
**Create new migration (for development):**
```powershell
# Auto-generate migration from model changes
uv run alembic revision --autogenerate -m "Description of changes"
# Create empty migration for custom changes
uv run alembic revision -m "Description of changes"
```
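For orientation, a generated migration lands in `database/migrations/versions/` as a small Python module. The sketch below only shows the general shape; the revision ID, table, and columns are made-up examples and are not part of this project's schema:
```python
"""add example table

Revision ID: 1a2b3c4d5e6f  (illustrative only; Alembic generates real IDs)
"""
import sqlalchemy as sa
from alembic import op

# Revision identifiers Alembic uses to order migrations
revision = "1a2b3c4d5e6f"
down_revision = None


def upgrade() -> None:
    # Applied by `uv run alembic upgrade head`
    op.create_table(
        "example_table",
        sa.Column("id", sa.Integer, primary_key=True),
        sa.Column("name", sa.String(length=64), nullable=False),
    )


def downgrade() -> None:
    # Applied by `uv run alembic downgrade -1`
    op.drop_table("example_table")
```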
#### Migration Files Location
- **Configuration**: `alembic.ini`
- **Environment**: `database/migrations/env.py`
- **Versions**: `database/migrations/versions/`
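`env.py` is what gives Alembic its database connection. Below is a minimal sketch of the idea, assuming the URL is assembled from the same `.env` variables used elsewhere in this guide; the helper name `get_database_url` is illustrative, and the repository's actual file is more complete (offline mode, model metadata, etc.):
```python
# Simplified sketch of database/migrations/env.py -- not the repository's exact file
import os

from alembic import context
from dotenv import load_dotenv
from sqlalchemy import create_engine

load_dotenv()  # pick up POSTGRES_* values from .env


def get_database_url() -> str:
    """Build the PostgreSQL URL from the same variables Docker Compose uses."""
    return (
        f"postgresql://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
        f"@{os.getenv('POSTGRES_HOST')}:{os.getenv('POSTGRES_PORT')}/{os.getenv('POSTGRES_DB')}"
    )


def run_migrations_online() -> None:
    """Connect to the live database and apply pending migrations."""
    engine = create_engine(get_database_url())
    with engine.connect() as connection:
        context.configure(connection=connection, target_metadata=None)
        with context.begin_transaction():
            context.run_migrations()


run_migrations_online()
```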
#### When to Use Migrations
**Use Direct Schema (recommended for new setups):**
- Fresh installations
- Development environments
- When you want automatic schema setup with Docker
**Use Migrations (recommended for updates):**
- Updating existing databases
- Production schema changes
- When you need to track schema history
- Rolling back database changes
#### Migration Best Practices
1. **Always backup before migrations in production**
2. **Test migrations on a copy of production data first**
3. **Review auto-generated migrations before applying**
4. **Use descriptive migration messages**
5. **Never edit migration files after they've been applied**
### 4. Verify Database Schema
The database schema is automatically initialized when containers start. You can verify it worked:
```powershell
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
```
Expected output should show tables: `bots`, `bot_performance`, `market_data`, `raw_trades`, `signals`, `supported_exchanges`, `supported_timeframes`, `trades`
### 5. Test Database Initialization Script (Optional)
You can also test the database initialization using the Python script:
```powershell
uv run .\scripts\init_database.py
```
This script will:
- Load environment variables from `.env` file
- Test database connection
- Create all tables using SQLAlchemy models
- Verify all expected tables exist
- Show connection pool status
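For orientation, the core of such a script can be sketched as follows. This is a simplified illustration of the steps listed above, not the repository's actual code; the `database.models` import path mentioned in the comment is an assumption:
```python
# Simplified sketch of the idea behind scripts/init_database.py
import os

from dotenv import load_dotenv
from sqlalchemy import create_engine, inspect, text

load_dotenv()  # 1. Load POSTGRES_* settings from .env

url = (
    f"postgresql://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
    f"@{os.getenv('POSTGRES_HOST')}:{os.getenv('POSTGRES_PORT')}/{os.getenv('POSTGRES_DB')}"
)
engine = create_engine(url)

# 2. Test the connection before touching the schema
with engine.connect() as conn:
    conn.execute(text("SELECT 1"))

# 3. The real script uses the SQLAlchemy models' metadata to create tables, e.g.
#    from database.models import Base; Base.metadata.create_all(engine)
#    (module path assumed for illustration)

# 4. Verify which tables now exist
print(sorted(inspect(engine).get_table_names()))

# 5. Show connection pool status
print(engine.pool.status())
```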
## Application Setup
### 1. Install Python Dependencies
```powershell
uv sync
```
This will:
- Create a virtual environment in `.venv/`
- Install all required dependencies
- Set up the project for development
### 2. Activate Virtual Environment
```powershell
# Preferred: run commands through UV without activating (works on all platforms)
uv run <command>
# Or activate manually on Windows
.venv\Scripts\Activate.ps1
# Or activate manually on macOS/Linux
source .venv/bin/activate
```
### 3. Verify Database Schema (Optional)
The database schema is automatically initialized when Docker containers start. You can verify it's working:
```powershell
# Check if all tables exist
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public' ORDER BY table_name;"
# Verify sample data was inserted
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT * FROM supported_timeframes;"
```
## Running the Application
### 1. Start the Dashboard
```powershell
uv run python main.py
```
### 2. Access the Application
Open your browser and navigate to:
- **Local**: http://localhost:8050
- **Network**: `http://<your-machine-ip>:8050` (the app binds to `0.0.0.0`, so it can be reached from other machines on your network)
## Configuration
### Environment Variables
Key configuration options in `.env`:
```env
# Database Configuration
POSTGRES_HOST=localhost
POSTGRES_PORT=5434
POSTGRES_DB=dashboard
POSTGRES_USER=dashboard
POSTGRES_PASSWORD=your_secure_password_here
# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_redis_password_here
# Application Configuration
DASH_HOST=0.0.0.0
DASH_PORT=8050
DEBUG=true
# OKX API Configuration (for real trading)
OKX_API_KEY=your_okx_api_key_here
OKX_SECRET_KEY=your_okx_secret_key_here
OKX_PASSPHRASE=your_okx_passphrase_here
OKX_SANDBOX=true
```
### Port Configuration
If you need to change ports due to conflicts:
1. **PostgreSQL Port**: Update `POSTGRES_PORT` in `.env` and the port mapping in `docker-compose.yml`
2. **Redis Port**: Update `REDIS_PORT` in `.env` and `docker-compose.yml`
3. **Dashboard Port**: Update `DASH_PORT` in `.env`
## Development Workflow
### 1. Daily Development Setup
```powershell
# Start databases
docker-compose up -d
# Start development server
uv run python main.py
```
### 2. Stop Services
```powershell
# Stop application: Ctrl+C in terminal
# Stop databases
docker-compose down
```
### 3. Reset Database (if needed)
```powershell
# WARNING: This will delete all data
docker-compose down -v
docker-compose up -d
```
## Testing
### Run Unit Tests
```powershell
# Run all tests
uv run pytest
# Run specific test file
uv run pytest tests/test_database.py
# Run with coverage
uv run pytest --cov=. --cov-report=html
```
### Test Database Connection
Create a quick test script:
```python
# test_connection.py
import os

import psycopg2
import redis
from dotenv import load_dotenv

load_dotenv()

# Test PostgreSQL
try:
    conn = psycopg2.connect(
        host=os.getenv('POSTGRES_HOST'),
        port=os.getenv('POSTGRES_PORT'),
        database=os.getenv('POSTGRES_DB'),
        user=os.getenv('POSTGRES_USER'),
        password=os.getenv('POSTGRES_PASSWORD')
    )
    print("✅ PostgreSQL connection successful!")
    conn.close()
except Exception as e:
    print(f"❌ PostgreSQL connection failed: {e}")

# Test Redis
try:
    r = redis.Redis(
        host=os.getenv('REDIS_HOST'),
        port=int(os.getenv('REDIS_PORT')),
        password=os.getenv('REDIS_PASSWORD')
    )
    r.ping()
    print("✅ Redis connection successful!")
except Exception as e:
    print(f"❌ Redis connection failed: {e}")
```
Run test:
```powershell
uv run python test_connection.py
```
## Troubleshooting
### Common Issues
#### 1. Port Already in Use
**Error**: `Port 5434 is already allocated`
**Solution**:
- Change `POSTGRES_PORT` in `.env` to a different port (e.g., 5435)
- Update `docker-compose.yml` port mapping accordingly
- Restart containers: `docker-compose down && docker-compose up -d`
#### 2. Docker Permission Issues
**Error**: `permission denied while trying to connect to the Docker daemon`
**Solution**:
- Ensure Docker Desktop is running
- On Linux: Add user to docker group: `sudo usermod -aG docker $USER`
- Restart terminal/session
#### 3. Database Connection Failed
**Error**: `password authentication failed`
**Solution**:
- Ensure `.env` password matches `docker-compose.yml`
- Reset database: `docker-compose down -v && docker-compose up -d`
- Wait for database initialization (30-60 seconds)
#### 4. Database Schema Not Created
**Error**: Tables don't exist or `\dt` shows no tables
**Solution**:
```powershell
# Check initialization logs
docker-compose logs postgres
# Use the Python initialization script to create/verify schema
uv run .\scripts\init_database.py
# Verify tables were created
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
```
#### 5. Application Dependencies Issues
**Error**: Package installation failures
**Solution**:
```powershell
# Clear UV cache
uv cache clean
# Reinstall dependencies (remove the virtual environment first)
Remove-Item -Recurse -Force .venv   # Windows; use `rm -rf .venv` on macOS/Linux
uv sync
```
#### 6. Migration Issues
**Error**: `alembic.util.exc.CommandError: Target database is not up to date`
**Solution**:
```powershell
# Check current migration status
uv run alembic current
# Upgrade to latest migration
uv run alembic upgrade head
# If migrations are out of sync, stamp current version
uv run alembic stamp head
```
**Error**: `ModuleNotFoundError: No module named 'database'`
**Solution**:
- Ensure you're running commands from the project root directory
- Verify the virtual environment is activated: `uv run <command>`
**Error**: Migration revision conflicts
**Solution**:
```powershell
# Check migration history
uv run alembic history --verbose
# Merge conflicting migrations
uv run alembic merge -m "Merge conflicting revisions" <revision1> <revision2>
```
**Error**: Database already has tables but no migration history
**Solution**:
```powershell
# Mark current schema as the initial migration
uv run alembic stamp head
# Or start fresh with migrations
docker-compose down -v
docker-compose up -d
uv run alembic upgrade head
```
### Log Files
View service logs:
```powershell
# All services
docker-compose logs
# Specific service
docker-compose logs postgres
docker-compose logs redis
# Follow logs in real-time
docker-compose logs -f
```
### Database Management
#### Backup Database
```powershell
docker exec dashboard_postgres pg_dump -U dashboard dashboard > backup.sql
```
#### Restore Database
```powershell
docker exec -i dashboard_postgres psql -U dashboard dashboard < backup.sql
```
#### Access Database CLI
```powershell
docker exec -it dashboard_postgres psql -U dashboard -d dashboard
```
#### Access Redis CLI
```powershell
docker exec -it dashboard_redis redis-cli -a $env:REDIS_PASSWORD
```
## Security Notes
1. **Never commit `.env` file** to version control
2. **Change default passwords** in production environments
3. **Use strong passwords** for production deployments
4. **Enable SSL/TLS** for production database connections
5. **Restrict network access** in production environments
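A simple way to generate strong values for `POSTGRES_PASSWORD` and `REDIS_PASSWORD` is Python's standard `secrets` module:
```python
# Print a random, URL-safe secret suitable for the .env credentials
import secrets

print(secrets.token_urlsafe(32))
```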
## Support
If you encounter issues not covered in this guide:
1. Check the [project documentation](../README.md)
2. Review [GitHub issues](link-to-issues)
3. Contact the development team
---
**Last Updated**: 2025-05-30
**Version**: 1.0
**Tested On**: Windows 11, Docker Desktop 4.x