Project Context: Simplified Crypto Trading Bot Platform

This document provides a comprehensive overview of the project's architecture, technology stack, conventions, and current implementation status, following the guidelines in context-management.md.

1. Architecture Overview

The platform is a monolithic application built with Python, designed for rapid development and internal testing of crypto trading strategies. The architecture is modular, with clear separation between components to facilitate future migration to microservices if needed.

Core Components

  • Data Collection Service: A standalone, multi-process service responsible for collecting real-time market data from exchanges (currently OKX). It uses a robust BaseDataCollector abstraction and specific exchange implementations (e.g., OKXCollector). Data is processed, aggregated into OHLCV candles, and stored.
  • Database: PostgreSQL with the TimescaleDB extension (though currently using a "clean" schema without hypertables for simplicity). It stores market data, bot configurations, trading signals, and performance metrics. SQLAlchemy is used as the ORM.
  • Real-time Messaging: Redis is used for pub/sub messaging, intended for real-time data distribution between components (though its use in the dashboard is currently deferred).
  • Dashboard & API: A Dash application serves as the main user interface for visualization, bot management, and system monitoring. The underlying Flask server can be extended for REST APIs.
  • Strategy Engine & Bot Manager: (Not yet implemented) This component will be responsible for executing trading logic, managing bot lifecycles, and tracking virtual portfolios.
  • Backtesting Engine: (Not yet implemented) This will provide capabilities to test strategies against historical data.

Data Flow

  1. The DataCollectionService connects to the OKX WebSocket API.
  2. Raw trade data is received and processed by OKXDataProcessor.
  3. Trades are aggregated into OHLCV candles (1m, 5m, etc.).
  4. Both raw trade data and processed OHLCV candles are stored in the PostgreSQL database.
  5. (Future) The Strategy Engine will consume OHLCV data to generate trading signals.
  6. The Dashboard reads data from the database to provide visualizations and system health monitoring.
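Step 3 above, aggregating raw trades into OHLCV candles, can be sketched with pandas (a simplified sketch only; the actual OKXDataProcessor is not shown, and the column names here are assumptions):

```python
import pandas as pd

# Hypothetical raw trades: timestamp index, price, size (column names are assumptions)
trades = pd.DataFrame(
    {
        "ts": pd.to_datetime(
            ["2024-01-01 00:00:05", "2024-01-01 00:00:30", "2024-01-01 00:01:10"]
        ),
        "price": [100.0, 101.5, 99.0],
        "size": [0.5, 0.2, 1.0],
    }
).set_index("ts")

# Resample trade prices into 1-minute OHLC bars and sum sizes into volume
ohlcv = trades["price"].resample("1min").ohlc()
ohlcv["volume"] = trades["size"].resample("1min").sum()

print(ohlcv)
```

The same call with `"5min"` yields the 5-minute candles; the real service presumably maintains several resolutions in parallel.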

2. Technology Stack

  • Backend: Python 3.10+
  • Web Framework: Dash with Dash Bootstrap Components for the frontend UI.
  • Database: PostgreSQL 14+. SQLAlchemy for ORM. Alembic for migrations.
  • Messaging: Redis for pub/sub.
  • Data & Numerics: pandas for data manipulation (especially in backtesting).
  • Package Management: uv
  • Containerization: Docker and Docker Compose for setting up the development environment (PostgreSQL, Redis, etc.).

3. Coding Conventions

  • Modular Design: Code is organized into modules with a clear purpose (e.g., data, database, dashboard). See architecture.md for more details.
  • Naming Conventions:
    • Classes: PascalCase (e.g., MarketData, BaseDataCollector).
    • Functions & Methods: snake_case (e.g., get_system_health_layout, connect).
    • Variables & Attributes: snake_case (e.g., exchange_name, _ws_client).
    • Constants: UPPER_SNAKE_CASE (e.g., MAX_RECONNECT_ATTEMPTS).
    • Modules: snake_case.py (e.g., collector_manager.py).
    • Private Attributes/Methods: Use a single leading underscore _ (e.g., _process_message). Avoid double leading underscores unless name mangling is specifically required.
  • File Organization & Code Structure:
    • Directory Structure: Top-level directories separate major concerns (data, database, dashboard, strategies). Sub-packages should be used for further organization (e.g., data/exchanges/okx).
    • Module Structure: Within a Python module (.py file), the preferred order is:
      1. Module-level docstring explaining its purpose.
      2. Imports (see pattern below).
      3. Module-level constants (ALL_CAPS).
      4. Custom exception classes.
      5. Data classes or simple data structures.
      6. Helper functions (if any, typically private _helper()).
      7. Main business logic classes.
    • __init__.py: Use __init__.py files to define a package's public API and simplify imports for consumers of the package.
  • Import/Export Patterns:
    • Grouping: Imports should be grouped in the following order, with a blank line between each group:
      1. Standard library imports (e.g., asyncio, datetime).
      2. Third-party library imports (e.g., dash, sqlalchemy).
      3. Local application imports (e.g., from utils.logger import get_logger).
    • Style: Use absolute imports (from data.base_collector import ...) over relative imports (from ..base_collector import ...) for better readability and to avoid ambiguity.
    • Exports: To create a clean public API for a package, import the desired classes/functions into the __init__.py file. This allows users to import directly from the package (e.g., from data.exchanges import ExchangeFactory) instead of from the specific submodule.
  • Abstract Base Classes: Used to define common interfaces, as seen in data/base_collector.py.
  • Configuration: Bot and strategy parameters are managed via JSON files in config/. Centralized application settings are handled by config/settings.py.
  • Logging: A unified logging system is available in utils/logger.py and should be used across all components for consistent output.
  • Type Hinting: Mandatory for all function signatures (parameters and return values) for clarity and static analysis.
  • Error Handling: Custom, specific exceptions should be defined (e.g., DataCollectorError). Use try...except blocks to handle potential failures gracefully and provide informative error messages.
  • Database Access: A DatabaseManager in database/connection.py provides a centralized way to handle database sessions and connections. All database operations should ideally go through an operations/repository layer.
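Several of the conventions above (module layout order, import grouping, custom exceptions, abstract base classes, type hints, private helpers) can be illustrated in a single sketch module. The names echo those mentioned in this document (BaseDataCollector, DataCollectorError, _process_message, MAX_RECONNECT_ATTEMPTS), but the bodies are hypothetical, not the project's actual code:

```python
"""Sketch of a collector module following the layout conventions above."""

# 1. Standard library imports (third-party and local groups would follow,
#    each separated by a blank line)
import abc
from dataclasses import dataclass

# 3. Module-level constants
MAX_RECONNECT_ATTEMPTS = 5


# 4. Custom exception classes
class DataCollectorError(Exception):
    """Raised when a collector fails irrecoverably."""


# 5. Data classes or simple data structures
@dataclass
class Trade:
    symbol: str
    price: float
    size: float


# 7. Main business logic classes
class BaseDataCollector(abc.ABC):
    """Common interface that all exchange collectors implement."""

    def __init__(self, exchange_name: str) -> None:
        self.exchange_name = exchange_name
        self._connected = False  # private attribute: single leading underscore

    @abc.abstractmethod
    def connect(self) -> None:
        """Open the exchange connection."""

    def _process_message(self, raw: dict) -> Trade:
        """Private helper turning a raw exchange message into a Trade."""
        try:
            return Trade(raw["symbol"], float(raw["price"]), float(raw["size"]))
        except (KeyError, ValueError) as exc:
            raise DataCollectorError(f"malformed message: {raw!r}") from exc
```

A concrete collector such as OKXCollector would subclass BaseDataCollector and implement connect(); consumers would import it from the package's __init__.py rather than the submodule.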

4. Current Implementation Status

Completed Features

  • Database Foundation: The database schema is fully defined in database/models.py and database/schema_clean.sql, with all necessary tables, indexes, and relationships. Database connection management is robust.
  • Data Collection System: A robust, asynchronous data collection service is in place. It supports OKX, handles WebSocket connections, processes data, aggregates OHLCV candles, and stores data reliably. It features health monitoring and automatic restarts.
  • Basic Dashboard: A functional dashboard exists.
    • System Health Monitoring: A comprehensive page shows the real-time status of the data collection service, database, Redis, and system performance (CPU/memory).
    • Data Visualization: Price charts with technical indicator overlays are implemented.
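The automatic-restart behaviour mentioned above can be sketched as an exponential backoff loop. This is a hypothetical illustration of the pattern, not the service's actual retry policy; the function name and the injectable `sleep` parameter are assumptions made for testability:

```python
import time

MAX_RECONNECT_ATTEMPTS = 5
BASE_DELAY_SECONDS = 1.0


def run_with_restarts(collector_run, sleep=time.sleep) -> int:
    """Run a collector callable, restarting with exponential backoff on failure.

    Returns the number of attempts used; re-raises after the final attempt.
    """
    for attempt in range(1, MAX_RECONNECT_ATTEMPTS + 1):
        try:
            collector_run()
            return attempt  # clean exit, no restart needed
        except Exception:
            if attempt == MAX_RECONNECT_ATTEMPTS:
                raise
            # Delay doubles each attempt: 1s, 2s, 4s, ...
            sleep(BASE_DELAY_SECONDS * 2 ** (attempt - 1))
    return MAX_RECONNECT_ATTEMPTS
```

In the real service the equivalent loop would wrap the WebSocket connect/consume coroutine and reset the attempt counter after a period of healthy operation.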

Work in Progress / To-Do

The core business logic of the application is yet to be implemented. The main remaining tasks are:

  • Strategy Engine and Bot Management (Task Group 4.0):
    • Designing the base strategy interface.
    • Implementing bot lifecycle management (create, run, stop).
    • Signal generation and virtual portfolio tracking.
  • Advanced Dashboard Features (Task Group 5.0):
    • Building the UI for managing bots and configuring strategies.
  • Backtesting Engine (Task Group 6.0):
    • Implementing the engine to test strategies on historical data.
  • Real-Time Trading Simulation (Task Group 7.0):
    • Executing virtual trades based on signals.

The project has a solid foundation. The next phase of development should focus on implementing the trading logic and user-facing bot management features.