Crypto Trading Bot Dashboard - Setup Guide
This guide will help you set up the Crypto Trading Bot Dashboard on a new machine from scratch.
Prerequisites
Required Software
- Python 3.12+
  - Download from python.org
  - Ensure Python is added to PATH
- UV Package Manager
  # Windows (PowerShell)
  powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
  # macOS/Linux
  curl -LsSf https://astral.sh/uv/install.sh | sh
- Docker Desktop
  - Download from docker.com
  - Ensure Docker is running before proceeding
- Git
  - Download from git-scm.com
System Requirements
- RAM: Minimum 4GB, Recommended 8GB+
- Storage: At least 2GB free space
- OS: Windows 10/11, macOS 10.15+, or Linux
Project Setup
1. Clone the Repository
git clone <repository-url>
cd TCPDashboard
2. Environment Configuration
Create the environment file from the template:
# Windows
Copy-Item env.template .env
# macOS/Linux
cp env.template .env
Important:
- The .env file is REQUIRED - the application will not work without it
- The .env file contains secure passwords for database and Redis
- Never commit the .env file to version control
- All credentials must be loaded from environment variables - no hardcoded passwords exist in the codebase
Current configuration in .env:
POSTGRES_PORT=5434
POSTGRES_PASSWORD=your_secure_password_here
REDIS_PASSWORD=your_redis_password_here
3. Configure Custom Ports (Optional)
The default configuration maps PostgreSQL to port 5434 (instead of the standard 5432) to avoid conflicts with any other PostgreSQL instances you may already be running. You can change these ports in your .env file.
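If you're not sure whether the default ports are already taken on your machine, a quick check before editing .env can save a restart cycle. This is a throwaway helper using only Python's standard library; the file name check_ports.py and the port list are arbitrary, not part of the project.
# check_ports.py - hypothetical helper, not part of the project
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        return sock.connect_ex((host, port)) == 0

for port in (5432, 5434, 6379, 8050):
    status = "in use" if port_in_use(port) else "free"
    print(f"Port {port}: {status}")
Run it with uv run python check_ports.py and pick a free value for POSTGRES_PORT if 5434 is reported as in use.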
Database Setup
1. Start Database Services
Start PostgreSQL with TimescaleDB and Redis using Docker Compose:
docker-compose up -d
This will:
- Create a PostgreSQL database with TimescaleDB extension on port 5434
- Create a Redis instance on port 6379
- Set up persistent volumes for data storage
- Configure password authentication
- Automatically initialize the database schema using the clean schema (without TimescaleDB hypertables for simpler setup)
2. Verify Services are Running
Check container status:
docker-compose ps
Expected output:
NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
dashboard_postgres timescale/timescaledb:latest-pg15 "docker-entrypoint.s…" postgres X minutes ago Up X minutes (healthy) 0.0.0.0:5434->5432/tcp
dashboard_redis redis:7-alpine "docker-entrypoint.s…" redis X minutes ago Up X minutes (healthy) 0.0.0.0:6379->6379/tcp
3. Database Migration System
The project uses Alembic for database schema versioning and migrations. This allows for safe, trackable database schema changes.
Understanding Migration vs Direct Schema
The project supports two approaches for database setup:
- Direct Schema (Default): Uses database/init/schema_clean.sql for automatic Docker initialization
- Migration System: Uses Alembic for versioned schema changes and updates
Migration Commands
Check migration status:
uv run alembic current
View migration history:
uv run alembic history --verbose
Upgrade to latest migration:
uv run alembic upgrade head
Downgrade to previous migration:
uv run alembic downgrade -1
Create new migration (for development):
# Auto-generate migration from model changes
uv run alembic revision --autogenerate -m "Description of changes"
# Create empty migration for custom changes
uv run alembic revision -m "Description of changes"
Migration Files Location
- Configuration: alembic.ini
- Environment: database/migrations/env.py
- Versions: database/migrations/versions/ (an example version file is sketched below)
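For orientation, a version file created by alembic revision generally has the shape sketched below. The revision IDs, file name, and the notes column on the bots table are purely illustrative - this is not one of the project's actual migrations.
# database/migrations/versions/abcd1234ef56_add_notes_to_bots.py (illustrative example)
"""Add notes column to bots

Revision ID: abcd1234ef56
Revises: 1234abcd5678
"""
import sqlalchemy as sa
from alembic import op

# revision identifiers, used by Alembic (placeholder IDs)
revision = "abcd1234ef56"
down_revision = "1234abcd5678"
branch_labels = None
depends_on = None

def upgrade() -> None:
    # Apply the schema change
    op.add_column("bots", sa.Column("notes", sa.Text(), nullable=True))

def downgrade() -> None:
    # Revert the schema change
    op.drop_column("bots", "notes")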
When to Use Migrations
Use Direct Schema (recommended for new setups):
- Fresh installations
- Development environments
- When you want automatic schema setup with Docker
Use Migrations (recommended for updates):
- Updating existing databases
- Production schema changes
- When you need to track schema history
- Rolling back database changes
Migration Best Practices
- Always backup before migrations in production
- Test migrations on a copy of production data first
- Review auto-generated migrations before applying
- Use descriptive migration messages
- Never edit migration files after they've been applied
4. Verify Database Schema
The database schema is automatically initialized when containers start. You can verify it worked:
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
Expected output should show tables: bots, bot_performance, market_data, raw_trades, signals, supported_exchanges, supported_timeframes, trades
5. Test Database Initialization Script (Optional)
You can also test the database initialization using the Python script:
uv run .\scripts\init_database.py
This script will (a minimal sketch follows the list):
- Load environment variables from the .env file
- Test database connection
- Create all tables using SQLAlchemy models
- Verify all expected tables exist
- Show connection pool status
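For reference, the flow those steps describe boils down to something like the sketch below. The database.models import is an assumption about the project layout; the real scripts/init_database.py may be organized differently.
# init_db_sketch.py - minimal sketch of the initialization flow (hypothetical)
import os

from dotenv import load_dotenv
from sqlalchemy import create_engine, inspect

# Assumption: the project's SQLAlchemy models share a declarative Base here
from database.models import Base

load_dotenv()  # pull POSTGRES_* values from .env

url = (
    f"postgresql://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
    f"@{os.getenv('POSTGRES_HOST', 'localhost')}:{os.getenv('POSTGRES_PORT', '5434')}"
    f"/{os.getenv('POSTGRES_DB')}"
)
engine = create_engine(url)

Base.metadata.create_all(engine)  # create any tables that don't exist yet
print("Tables present:", sorted(inspect(engine).get_table_names()))
engine.dispose()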
Application Setup
1. Install Python Dependencies
uv sync
This will:
- Create a virtual environment in .venv/
- Install all required dependencies
- Set up the project for development
2. Activate Virtual Environment
# Windows
uv run <command>
# Or activate manually
.venv\Scripts\Activate.ps1
# macOS/Linux
source .venv/bin/activate
3. Verify Database Schema (Optional)
The database schema is automatically initialized when Docker containers start. You can verify it's working:
# Check if all tables exist
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public' ORDER BY table_name;"
# Verify sample data was inserted
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT * FROM supported_timeframes;"
Running the Application
1. Start the Dashboard
uv run python main.py
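The real main.py wires up layouts, callbacks, and database access, but conceptually the entry point just reads the DASH_* settings (see Configuration below) and starts the web server. A hypothetical minimal skeleton, not the project's actual code:
# main_sketch.py - hypothetical skeleton, not the project's actual main.py
import os

from dotenv import load_dotenv
from dash import Dash, html

load_dotenv()

app = Dash(__name__)
app.layout = html.Div("Dashboard placeholder")

if __name__ == "__main__":
    app.run(
        host=os.getenv("DASH_HOST", "0.0.0.0"),
        port=int(os.getenv("DASH_PORT", "8050")),
        debug=os.getenv("DEBUG", "false").lower() == "true",
    )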
2. Access the Application
Open your browser and navigate to:
- Local: http://localhost:8050
- Network: http://<your-machine-ip>:8050 (the server binds to 0.0.0.0, so other machines on your network can connect)
Configuration
Environment Variables
Key configuration options in .env:
# Database Configuration
POSTGRES_HOST=localhost
POSTGRES_PORT=5434
POSTGRES_DB=dashboard
POSTGRES_USER=dashboard
POSTGRES_PASSWORD=your_secure_password_here
# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_redis_password_here
# Application Configuration
DASH_HOST=0.0.0.0
DASH_PORT=8050
DEBUG=true
# OKX API Configuration (for real trading)
OKX_API_KEY=your_okx_api_key_here
OKX_SECRET_KEY=your_okx_secret_key_here
OKX_PASSPHRASE=your_okx_passphrase_here
OKX_SANDBOX=true
Port Configuration
If you need to change ports due to conflicts:
- PostgreSQL Port: Update POSTGRES_PORT in .env and the port mapping in docker-compose.yml
- Redis Port: Update REDIS_PORT in .env and docker-compose.yml
- Dashboard Port: Update DASH_PORT in .env
Development Workflow
1. Daily Development Setup
# Start databases
docker-compose up -d
# Start development server
uv run python main.py
2. Stop Services
# Stop application: Ctrl+C in terminal
# Stop databases
docker-compose down
3. Reset Database (if needed)
# WARNING: This will delete all data
docker-compose down -v
docker-compose up -d
Testing
Run Unit Tests
# Run all tests
uv run pytest
# Run specific test file
uv run pytest tests/test_database.py
# Run with coverage
uv run pytest --cov=. --cov-report=html
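If you want an extra smoke test against the running containers, a small pytest file along these lines works. It is a hypothetical example (not one of the project's existing tests) and assumes the Docker services are up and .env is present.
# tests/test_smoke_db.py - hypothetical smoke test against the running database
import os

import pytest
from dotenv import load_dotenv
from sqlalchemy import create_engine, inspect, text

load_dotenv()

@pytest.fixture(scope="module")
def engine():
    url = (
        f"postgresql://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
        f"@{os.getenv('POSTGRES_HOST', 'localhost')}:{os.getenv('POSTGRES_PORT', '5434')}"
        f"/{os.getenv('POSTGRES_DB')}"
    )
    eng = create_engine(url)
    yield eng
    eng.dispose()

def test_expected_tables_exist(engine):
    # Tables listed in the schema verification step of this guide
    tables = set(inspect(engine).get_table_names())
    assert {"bots", "trades", "signals"}.issubset(tables)

def test_supported_timeframes_seeded(engine):
    # Sample data is inserted during Docker initialization
    with engine.connect() as conn:
        count = conn.execute(text("SELECT COUNT(*) FROM supported_timeframes")).scalar()
    assert count > 0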
Test Database Connection
Create a quick test script:
# test_connection.py
from database.connection import DatabaseManager

# Load environment variables
from dotenv import load_dotenv
load_dotenv()

# Test Database
db = DatabaseManager()
db.initialize()
if db.test_connection():
    print("✅ Database connection successful!")
db.close()

# Test Redis
from database.redis_manager import get_sync_redis_manager
try:
    redis_manager = get_sync_redis_manager()
    redis_manager.initialize()
    print("✅ Redis connection successful!")
except Exception as e:
    print(f"❌ Redis connection failed: {e}")
Run test:
uv run python test_connection.py
Troubleshooting
Common Issues
1. Port Already in Use
Error: Port 5434 is already allocated
Solution:
- Change POSTGRES_PORT in .env to a different port (e.g., 5435)
- Update the docker-compose.yml port mapping accordingly
- Restart containers: docker-compose down && docker-compose up -d
2. Docker Permission Issues
Error: permission denied while trying to connect to the Docker daemon
Solution:
- Ensure Docker Desktop is running
- On Linux: Add your user to the docker group: sudo usermod -aG docker $USER
- Restart your terminal/session
3. Database Connection Failed
Error: password authentication failed
Solution:
- Ensure the .env password matches docker-compose.yml
- Reset the database: docker-compose down -v && docker-compose up -d
- Wait for database initialization (30-60 seconds)
4. Database Schema Not Created
Error: Tables don't exist or \dt shows no tables
Solution:
# Check initialization logs
docker-compose logs postgres
# Use the Python initialization script to create/verify schema
uv run .\scripts\init_database.py
# Verify tables were created
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
5. Application Dependencies Issues
Error: Package installation failures
Solution:
# Clear UV cache
uv cache clean
# Reinstall dependencies
rm -rf .venv
uv sync
6. Migration Issues
Error: alembic.util.exc.CommandError: Target database is not up to date
Solution:
# Check current migration status
uv run alembic current
# Upgrade to latest migration
uv run alembic upgrade head
# If migrations are out of sync, stamp current version
uv run alembic stamp head
Error: ModuleNotFoundError: No module named 'database'
Solution:
- Ensure you're running commands from the project root directory
- Make sure the project's virtual environment is in use - running commands via uv run <command> handles this automatically
Error: Migration revision conflicts
Solution:
# Check migration history
uv run alembic history --verbose
# Merge conflicting migrations
uv run alembic merge -m "Merge conflicting revisions" <revision1> <revision2>
Error: Database already has tables but no migration history
Solution:
# Mark current schema as the initial migration
uv run alembic stamp head
# Or start fresh with migrations
docker-compose down -v
docker-compose up -d
uv run alembic upgrade head
Log Files
View service logs:
# All services
docker-compose logs
# Specific service
docker-compose logs postgres
docker-compose logs redis
# Follow logs in real-time
docker-compose logs -f
Database Management
Backup Database
docker exec dashboard_postgres pg_dump -U dashboard dashboard > backup.sql
Restore Database
docker exec -i dashboard_postgres psql -U dashboard dashboard < backup.sql
Access Database CLI
docker exec -it dashboard_postgres psql -U dashboard -d dashboard
Access Redis CLI
# Windows (PowerShell)
docker exec -it dashboard_redis redis-cli -a $env:REDIS_PASSWORD
# macOS/Linux
docker exec -it dashboard_redis redis-cli -a "$REDIS_PASSWORD"
Security Notes
- Never commit the .env file to version control
- Change default passwords in production environments
- Use strong passwords for production deployments
- Enable SSL/TLS for production database connections (a connection sketch follows below)
- Restrict network access in production environments
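As a starting point for the SSL/TLS item above: with SQLAlchemy and the psycopg2 driver, SSL can be requested via connect_args. The exact sslmode and certificate setup depend on your production database, so treat this only as a sketch.
# ssl_connection_sketch.py - illustrative; adjust sslmode/certificates to your environment
import os
from sqlalchemy import create_engine

engine = create_engine(
    f"postgresql://{os.getenv('POSTGRES_USER')}:{os.getenv('POSTGRES_PASSWORD')}"
    f"@{os.getenv('POSTGRES_HOST')}:{os.getenv('POSTGRES_PORT', '5432')}"
    f"/{os.getenv('POSTGRES_DB')}",
    # psycopg2 option; use "verify-full" with a CA certificate for stricter checks
    connect_args={"sslmode": "require"},
)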
Support
If you encounter issues not covered in this guide:
- Check the project documentation
- Review GitHub issues
- Contact the development team
Last Updated: 2025-05-30
Version: 1.0
Tested On: Windows 11, Docker Desktop 4.x