<!-- docs/guides/README.md -->
# Guides Documentation

This section contains user guides, tutorials, and setup instructions for the TCP Dashboard platform.

## 📋 Contents

### Setup & Installation

- **[Setup Guide](setup.md)** - *Comprehensive setup instructions for new machines and environments*
  - Environment configuration and prerequisites
  - Database setup with Docker and PostgreSQL
  - Development workflow and best practices
  - Production deployment guidelines
  - Troubleshooting common setup issues

### Quick Start Guides

#### For Developers

```bash
# Quick setup for development
git clone <repository>
cd TCPDashboard
uv sync
cp .env.example .env
docker-compose up -d
uv run python scripts/init_database.py
```

#### For Users

```python
# Quick data collection setup (run inside an async function / event loop)
from data.exchanges import create_okx_collector
from data.base_collector import DataType

collector = create_okx_collector(
    symbol='BTC-USDT',
    data_types=[DataType.TRADE]
)
await collector.start()
```

## 🚀 Tutorial Series

### Getting Started

1. **[Environment Setup](setup.md#environment-setup)** - Setting up your development environment
2. **[First Data Collector](setup.md#first-collector)** - Creating your first data collector
3. **[Database Integration](setup.md#database-setup)** - Connecting to the database
4. **[Adding Monitoring](setup.md#monitoring)** - Setting up logging and monitoring

### Advanced Topics

1. **[Multi-Exchange Setup](setup.md#multi-exchange)** - Collecting from multiple exchanges
2. **[Production Deployment](setup.md#production)** - Deploying to production
3. **[Performance Optimization](setup.md#optimization)** - Optimizing for high throughput
4. **[Custom Integrations](setup.md#custom)** - Building custom data sources

## 🛠️ Development Workflow

### Daily Development

```bash
# Start development environment
docker-compose up -d

# Install new dependencies
uv add package-name

# Run tests
uv run pytest

# Check code quality
uv run black .
uv run isort .
```

### Code Organization

- **`data/`**: Data collection and processing
- **`database/`**: Database models and utilities
- **`utils/`**: Shared utilities and logging
- **`tests/`**: Test suite
- **`docs/`**: Documentation
- **`config/`**: Configuration files

### Best Practices

1. **Follow existing patterns**: Use established code patterns
2. **Write tests first**: TDD approach for new features
3. **Document changes**: Update docs with code changes
4. **Use type hints**: Full type annotation coverage
5. **Handle errors**: Robust error handling throughout
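
Point 4 can be made concrete with a short, fully annotated function. This is an illustrative sketch only; `summarize_trades` and the trade-record shape are hypothetical, not part of the platform's API:

```python
from typing import Any

def summarize_trades(trades: list[dict[str, Any]]) -> dict[str, float]:
    """Aggregate trade records (hypothetical shape) into summary statistics."""
    if not trades:
        return {"count": 0.0, "total_size": 0.0}
    total = sum(float(t["size"]) for t in trades)
    return {"count": float(len(trades)), "total_size": total}

print(summarize_trades([{"size": "0.5"}, {"size": "1.5"}]))
# → {'count': 2.0, 'total_size': 2.0}
```

With hints like these in place, a type checker such as mypy can catch shape mismatches before runtime.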

## 🔧 Configuration Management

### Environment Variables

Key environment variables to configure:

```bash
# Database
DATABASE_URL=postgresql://user:pass@localhost:5432/tcp_dashboard

# Logging
LOG_LEVEL=INFO
LOG_CLEANUP=true

# Data Collection
DEFAULT_HEALTH_CHECK_INTERVAL=30
AUTO_RESTART=true
```

### Configuration Files

The platform uses JSON configuration files:

- **`config/okx_config.json`**: OKX exchange settings
- **`config/database_config.json`**: Database configuration
- **`config/logging_config.json`**: Logging settings
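
Reading such a file is a one-liner with the standard `json` module. The keys below are invented for illustration; the real ones are defined by the platform:

```python
import json

# Stand-in for the contents of config/okx_config.json (keys are hypothetical)
raw = '{"symbol": "BTC-USDT", "data_types": ["trade"], "sandbox": true}'

config = json.loads(raw)
print(config["symbol"])  # → BTC-USDT
```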

### Security Best Practices

- **Never commit secrets**: Use `.env` files for sensitive data
- **Validate inputs**: Comprehensive input validation
- **Use HTTPS**: Secure connections in production
- **Regular updates**: Keep dependencies updated
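
Keeping secrets in the environment only works if missing values fail loudly. A minimal, stdlib-only sketch of the idea, using variable names from this guide:

```python
import os

REQUIRED_VARS = ["DATABASE_URL", "LOG_LEVEL"]

def missing_env_vars(required: list[str]) -> list[str]:
    """Return the names of required variables that are unset or empty."""
    return [name for name in required if not os.getenv(name)]

missing = missing_env_vars(REQUIRED_VARS)
if missing:
    print("Missing required environment variables:", ", ".join(missing))
```

Call this at startup so a misconfigured deployment aborts immediately instead of failing halfway through.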

## 📊 Monitoring & Observability

### Health Monitoring

The platform includes comprehensive health monitoring:

```python
# Check system health
from data.collector_manager import CollectorManager

manager = CollectorManager()
status = manager.get_status()

print(f"Running collectors: {status['statistics']['running_collectors']}")
print(f"Failed collectors: {status['statistics']['failed_collectors']}")
```

### Logging

Structured logging across all components:

```python
from utils.logger import get_logger

logger = get_logger("my_component")
logger.info("Component started", extra={"component": "my_component"})
```

### Performance Metrics

Built-in performance tracking:

- **Message rates**: Real-time data processing rates
- **Error rates**: System health and stability
- **Resource usage**: Memory and CPU utilization
- **Uptime**: Component availability metrics
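
The idea behind a message-rate metric can be sketched with a plain counter and a monotonic clock. This is an illustration of the concept, not the platform's built-in tracker:

```python
import time

class RateTracker:
    """Count events and report an average per-second rate."""

    def __init__(self) -> None:
        self.start = time.monotonic()
        self.count = 0

    def record(self, n: int = 1) -> None:
        self.count += n

    def rate(self) -> float:
        elapsed = time.monotonic() - self.start
        return self.count / elapsed if elapsed > 0 else 0.0

tracker = RateTracker()
for _ in range(1000):
    tracker.record()
print(f"processed at {tracker.rate():.0f} msg/s")
```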

## 🧪 Testing

### Running Tests

```bash
# Run all tests
uv run pytest

# Run specific test files
uv run pytest tests/test_base_collector.py

# Run with coverage
uv run pytest --cov=data --cov-report=html

# Run integration tests
uv run pytest tests/integration/
```

### Test Organization

- **Unit tests**: Individual component testing
- **Integration tests**: Cross-component functionality
- **Performance tests**: Load and stress testing
- **End-to-end tests**: Full system workflows

### Writing Tests

Follow these patterns when writing tests:

```python
import pytest

from data.exchanges import create_okx_collector

@pytest.mark.asyncio
async def test_okx_collector():
    collector = create_okx_collector('BTC-USDT')
    assert collector is not None

    # Test lifecycle
    await collector.start()
    status = collector.get_status()
    assert status['status'] == 'running'

    await collector.stop()
```

## 🚀 Deployment

### Development Deployment

For local development:

```bash
# Start services
docker-compose up -d

# Initialize database
uv run python scripts/init_database.py

# Start data collection
uv run python scripts/start_collectors.py
```

### Production Deployment

For production environments:

```bash
# Use production docker-compose
docker-compose -f docker-compose.prod.yml up -d

# Set production environment
export ENV=production
export LOG_LEVEL=INFO

# Start with monitoring
uv run python scripts/production_start.py
```

### Docker Deployment

Using Docker containers:

```dockerfile
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt

COPY . .
CMD ["python", "-m", "scripts.production_start"]
```

## 🔗 Related Documentation

- **[Components Documentation](../components/)** - Technical component details
- **[Architecture Overview](../architecture/)** - System design
- **[Exchange Documentation](../exchanges/)** - Exchange integrations
- **[API Reference](../reference/)** - Technical specifications

## 📞 Support & Troubleshooting

### Common Issues

1. **Database Connection Errors**
   - Check Docker services: `docker-compose ps`
   - Verify environment variables in `.env`
   - Test connection: `uv run python scripts/test_db_connection.py`

2. **Collector Failures**
   - Check logs: `tail -f logs/collector_error.log`
   - Verify configuration: Review `config/*.json` files
   - Test manually: `uv run python scripts/test_okx_collector.py`

3. **Performance Issues**
   - Monitor resource usage: `docker stats`
   - Check message rates: Collector status endpoints
   - Optimize configuration: Adjust health check intervals

### Getting Help

1. **Check Documentation**: Review relevant section documentation
2. **Review Logs**: System logs in `./logs/` directory
3. **Test Components**: Use built-in test scripts
4. **Check Status**: Use status and health check methods

### Debug Mode

Enable detailed debugging:

```bash
export LOG_LEVEL=DEBUG
uv run python your_script.py

# Check detailed logs
tail -f logs/*_debug.log
```

---

*For the complete documentation index, see the [main documentation README](../README.md).*

<!-- docs/guides/setup.md -->
# Crypto Trading Bot Dashboard - Setup Guide

This guide will help you set up the Crypto Trading Bot Dashboard on a new machine from scratch.

## Prerequisites

### Required Software

1. **Python 3.12+**
   - Download from [python.org](https://python.org)
   - Ensure Python is added to PATH

2. **UV Package Manager**

   ```powershell
   # Windows (PowerShell)
   powershell -c "irm https://astral.sh/uv/install.ps1 | iex"

   # macOS/Linux
   curl -LsSf https://astral.sh/uv/install.sh | sh
   ```

3. **Docker Desktop**
   - Download from [docker.com](https://docker.com)
   - Ensure Docker is running before proceeding

4. **Git**
   - Download from [git-scm.com](https://git-scm.com)

### System Requirements

- **RAM**: Minimum 4GB, Recommended 8GB+
- **Storage**: At least 2GB free space
- **OS**: Windows 10/11, macOS 10.15+, or Linux

## Project Setup

### 1. Clone the Repository

```bash
git clone <repository-url>
cd TCPDashboard
```

### 2. Environment Configuration

Create the environment file from the template:

```powershell
# Windows
Copy-Item env.template .env

# macOS/Linux
cp env.template .env
```

**Important**:

- The `.env` file is **REQUIRED** - the application will not work without it
- The `.env` file contains secure passwords for database and Redis
- **Never commit the `.env` file to version control**
- All credentials must be loaded from environment variables - no hardcoded passwords exist in the codebase

Current configuration in `.env`:

```env
POSTGRES_PORT=5434
POSTGRES_PASSWORD=your_secure_password_here
REDIS_PASSWORD=your_redis_password_here
```

### 3. Configure Custom Ports (Optional)

The default configuration uses port `5434` to avoid conflicts with other PostgreSQL instances you may have running. You can change these ports in your `.env` file.

## Database Setup

### 1. Start Database Services

Start PostgreSQL with TimescaleDB and Redis using Docker Compose:

```powershell
docker-compose up -d
```

This will:

- Create a PostgreSQL database with TimescaleDB extension on port `5434`
- Create a Redis instance on port `6379`
- Set up persistent volumes for data storage
- Configure password authentication
- **Automatically initialize the database schema** using the clean schema (without TimescaleDB hypertables, for simpler setup)

### 2. Verify Services are Running

Check container status:

```powershell
docker-compose ps
```

Expected output:

```
NAME                 IMAGE                               COMMAND                  SERVICE    CREATED         STATUS                   PORTS
dashboard_postgres   timescale/timescaledb:latest-pg15   "docker-entrypoint.s…"   postgres   X minutes ago   Up X minutes (healthy)   0.0.0.0:5434->5432/tcp
dashboard_redis      redis:7-alpine                      "docker-entrypoint.s…"   redis      X minutes ago   Up X minutes (healthy)   0.0.0.0:6379->6379/tcp
```

### 3. Database Migration System

The project uses **Alembic** for database schema versioning and migrations. This allows for safe, trackable database schema changes.

#### Understanding Migration vs Direct Schema

The project supports two approaches for database setup:

1. **Direct Schema (Default)**: Uses `database/init/schema_clean.sql` for automatic Docker initialization
2. **Migration System**: Uses Alembic for versioned schema changes and updates

#### Migration Commands

**Check migration status:**

```powershell
uv run alembic current
```

**View migration history:**

```powershell
uv run alembic history --verbose
```

**Upgrade to latest migration:**

```powershell
uv run alembic upgrade head
```

**Downgrade to previous migration:**

```powershell
uv run alembic downgrade -1
```

**Create new migration (for development):**

```powershell
# Auto-generate migration from model changes
uv run alembic revision --autogenerate -m "Description of changes"

# Create empty migration for custom changes
uv run alembic revision -m "Description of changes"
```

#### Migration Files Location

- **Configuration**: `alembic.ini`
- **Environment**: `database/migrations/env.py`
- **Versions**: `database/migrations/versions/`

#### When to Use Migrations

**Use Direct Schema (recommended for new setups):**

- Fresh installations
- Development environments
- When you want automatic schema setup with Docker

**Use Migrations (recommended for updates):**

- Updating existing databases
- Production schema changes
- When you need to track schema history
- Rolling back database changes

#### Migration Best Practices

1. **Always back up before migrations in production**
2. **Test migrations on a copy of production data first**
3. **Review auto-generated migrations before applying**
4. **Use descriptive migration messages**
5. **Never edit migration files after they've been applied**

### 4. Verify Database Schema

The database schema is automatically initialized when containers start. You can verify it worked:

```powershell
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
```

Expected output should show these tables: `bots`, `bot_performance`, `market_data`, `raw_trades`, `signals`, `supported_exchanges`, `supported_timeframes`, `trades`

### 5. Test Database Initialization Script (Optional)

You can also test the database initialization using the Python script:

```powershell
uv run .\scripts\init_database.py
```

This script will:

- Load environment variables from the `.env` file
- Test the database connection
- Create all tables using SQLAlchemy models
- Verify all expected tables exist
- Show connection pool status

## Application Setup

### 1. Install Python Dependencies

```powershell
uv sync
```

This will:

- Create a virtual environment in `.venv/`
- Install all required dependencies
- Set up the project for development

### 2. Activate Virtual Environment

```powershell
# Windows
uv run <command>

# Or activate manually
.venv\Scripts\Activate.ps1

# macOS/Linux
source .venv/bin/activate
```

### 3. Verify Database Schema (Optional)

The database schema is automatically initialized when the Docker containers start. You can verify it's working:

```powershell
# Check if all tables exist
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT table_name FROM information_schema.tables WHERE table_schema = 'public' ORDER BY table_name;"

# Verify sample data was inserted
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "SELECT * FROM supported_timeframes;"
```

## Running the Application

### 1. Start the Dashboard

```powershell
uv run python main.py
```

### 2. Access the Application

Open your browser and navigate to:

- **Local**: http://localhost:8050
- **Network**: http://0.0.0.0:8050 (if accessible from other machines)

## Configuration

### Environment Variables

Key configuration options in `.env`:

```env
# Database Configuration
POSTGRES_HOST=localhost
POSTGRES_PORT=5434
POSTGRES_DB=dashboard
POSTGRES_USER=dashboard
POSTGRES_PASSWORD=your_secure_password_here

# Redis Configuration
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=your_redis_password_here

# Application Configuration
DASH_HOST=0.0.0.0
DASH_PORT=8050
DEBUG=true

# OKX API Configuration (for real trading)
OKX_API_KEY=your_okx_api_key_here
OKX_SECRET_KEY=your_okx_secret_key_here
OKX_PASSPHRASE=your_okx_passphrase_here
OKX_SANDBOX=true
```
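
The discrete `POSTGRES_*` variables above can be combined into a single SQLAlchemy-style connection URL. A small sketch of the idea (the helper name is ours, not the project's, and the fallback defaults mirror this guide):

```python
import os

def database_url() -> str:
    """Build a postgresql:// URL from the POSTGRES_* variables above."""
    user = os.getenv("POSTGRES_USER", "dashboard")
    password = os.getenv("POSTGRES_PASSWORD", "")
    host = os.getenv("POSTGRES_HOST", "localhost")
    port = os.getenv("POSTGRES_PORT", "5434")
    db = os.getenv("POSTGRES_DB", "dashboard")
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"

print(database_url())
```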

### Port Configuration

If you need to change ports due to conflicts:

1. **PostgreSQL Port**: Update `POSTGRES_PORT` in `.env` and the port mapping in `docker-compose.yml`
2. **Redis Port**: Update `REDIS_PORT` in `.env` and `docker-compose.yml`
3. **Dashboard Port**: Update `DASH_PORT` in `.env`
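
Before editing ports, you can check what is actually free on your machine. A small stdlib sketch probing the defaults used in this guide:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

# Probe the default ports from this guide
for name, port in [("postgres", 5434), ("redis", 6379), ("dashboard", 8050)]:
    print(f"{name} ({port}): {'in use' if port_in_use(port) else 'free'}")
```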

## Development Workflow

### 1. Daily Development Setup

```powershell
# Start databases
docker-compose up -d

# Start development server
uv run python main.py
```

### 2. Stop Services

```powershell
# Stop application: Ctrl+C in terminal

# Stop databases
docker-compose down
```

### 3. Reset Database (if needed)

```powershell
# WARNING: This will delete all data
docker-compose down -v
docker-compose up -d
```

## Testing

### Run Unit Tests

```powershell
# Run all tests
uv run pytest

# Run specific test file
uv run pytest tests/test_database.py

# Run with coverage
uv run pytest --cov=. --cov-report=html
```

### Test Database Connection

Create a quick test script:

```python
# test_connection.py
import os

import psycopg2
import redis
from dotenv import load_dotenv

load_dotenv()

# Test PostgreSQL
try:
    conn = psycopg2.connect(
        host=os.getenv('POSTGRES_HOST'),
        port=os.getenv('POSTGRES_PORT'),
        database=os.getenv('POSTGRES_DB'),
        user=os.getenv('POSTGRES_USER'),
        password=os.getenv('POSTGRES_PASSWORD')
    )
    print("✅ PostgreSQL connection successful!")
    conn.close()
except Exception as e:
    print(f"❌ PostgreSQL connection failed: {e}")

# Test Redis
try:
    r = redis.Redis(
        host=os.getenv('REDIS_HOST'),
        port=int(os.getenv('REDIS_PORT')),
        password=os.getenv('REDIS_PASSWORD')
    )
    r.ping()
    print("✅ Redis connection successful!")
except Exception as e:
    print(f"❌ Redis connection failed: {e}")
```

Run the test:

```powershell
uv run python test_connection.py
```

## Troubleshooting

### Common Issues

#### 1. Port Already in Use

**Error**: `Port 5434 is already allocated`

**Solution**:

- Change `POSTGRES_PORT` in `.env` to a different port (e.g., 5435)
- Update the `docker-compose.yml` port mapping accordingly
- Restart containers: `docker-compose down && docker-compose up -d`

#### 2. Docker Permission Issues

**Error**: `permission denied while trying to connect to the Docker daemon`

**Solution**:

- Ensure Docker Desktop is running
- On Linux: add your user to the docker group: `sudo usermod -aG docker $USER`
- Restart your terminal/session

#### 3. Database Connection Failed

**Error**: `password authentication failed`

**Solution**:

- Ensure the `.env` password matches `docker-compose.yml`
- Reset the database: `docker-compose down -v && docker-compose up -d`
- Wait for database initialization (30-60 seconds)

#### 4. Database Schema Not Created

**Error**: Tables don't exist or `\dt` shows no tables

**Solution**:

```powershell
# Check initialization logs
docker-compose logs postgres

# Use the Python initialization script to create/verify schema
uv run .\scripts\init_database.py

# Verify tables were created
docker exec dashboard_postgres psql -U dashboard -d dashboard -c "\dt"
```

#### 5. Application Dependencies Issues

**Error**: Package installation failures

**Solution**:

```powershell
# Clear UV cache
uv cache clean

# Reinstall dependencies (macOS/Linux: rm -rf .venv)
Remove-Item -Recurse -Force .venv
uv sync
```

#### 6. Migration Issues

**Error**: `alembic.util.exc.CommandError: Target database is not up to date`

**Solution**:

```powershell
# Check current migration status
uv run alembic current

# Upgrade to latest migration
uv run alembic upgrade head

# If migrations are out of sync, stamp current version
uv run alembic stamp head
```

**Error**: `ModuleNotFoundError: No module named 'database'`

**Solution**:

- Ensure you're running commands from the project root directory
- Verify the virtual environment is activated: `uv run <command>`

**Error**: Migration revision conflicts

**Solution**:

```powershell
# Check migration history
uv run alembic history --verbose

# Merge conflicting migrations
uv run alembic merge -m "Merge conflicting revisions" <revision1> <revision2>
```

**Error**: Database already has tables but no migration history

**Solution**:

```powershell
# Mark current schema as the initial migration
uv run alembic stamp head

# Or start fresh with migrations
docker-compose down -v
docker-compose up -d
uv run alembic upgrade head
```

### Log Files

View service logs:

```powershell
# All services
docker-compose logs

# Specific service
docker-compose logs postgres
docker-compose logs redis

# Follow logs in real-time
docker-compose logs -f
```

### Database Management

#### Backup Database

```powershell
docker exec dashboard_postgres pg_dump -U dashboard dashboard > backup.sql
```

#### Restore Database

```powershell
# PowerShell has no `<` input redirection, so pipe the file in
# (bash equivalent: docker exec -i dashboard_postgres psql -U dashboard dashboard < backup.sql)
Get-Content backup.sql | docker exec -i dashboard_postgres psql -U dashboard dashboard
```

#### Access Database CLI

```powershell
docker exec -it dashboard_postgres psql -U dashboard -d dashboard
```

#### Access Redis CLI

```powershell
docker exec -it dashboard_redis redis-cli -a $env:REDIS_PASSWORD
```

## Security Notes

1. **Never commit the `.env` file** to version control
2. **Change default passwords** in production environments
3. **Use strong passwords** for production deployments
4. **Enable SSL/TLS** for production database connections
5. **Restrict network access** in production environments
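
For point 3, the standard library can generate passwords with plenty of entropy; a minimal sketch:

```python
import secrets

def generate_password(n_bytes: int = 32) -> str:
    """Return a URL-safe random string carrying n_bytes of entropy."""
    return secrets.token_urlsafe(n_bytes)

print(generate_password())
```

Paste the output into `POSTGRES_PASSWORD` / `REDIS_PASSWORD` in your `.env` file.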

## Support

If you encounter issues not covered in this guide:

1. Check the [project documentation](../README.md)
2. Review [GitHub issues](link-to-issues)
3. Contact the development team

---

**Last Updated**: 2025-05-30
**Version**: 1.0
**Tested On**: Windows 11, Docker Desktop 4.x