Add initial implementation of the Orderflow Backtest System with OBI and CVD metrics integration, including core modules for storage, strategies, and visualization. Introduced persistent metrics storage in SQLite, optimized memory usage, and enhanced documentation.

This commit is contained in:
Simon Moisy 2025-08-26 17:22:07 +08:00
parent 63f723820a
commit fa6df78c1e
52 changed files with 7039 additions and 1 deletions

View File

@ -0,0 +1,61 @@
---
description: Global development standards and AI interaction principles
globs:
alwaysApply: true
---
# Rule: Always Apply - Global Development Standards
## AI Interaction Principles
### Step-by-Step Development
- **NEVER** generate large blocks of code without explanation
- **ALWAYS** ask "provide your plan in a concise bullet list and wait for my confirmation before proceeding"
- Break complex tasks into smaller, manageable pieces (≤250 lines per file, ≤50 lines per function)
- Explain your reasoning step-by-step before writing code
- Wait for explicit approval before moving to the next sub-task
### Context Awareness
- **ALWAYS** reference existing code patterns and data structures before suggesting new approaches
- Ask about existing conventions before implementing new functionality
- Preserve established architectural decisions unless explicitly asked to change them
- Maintain consistency with existing naming conventions and code style
## Code Quality Standards
### File and Function Limits
- **Maximum file size**: 250 lines
- **Maximum function size**: 50 lines
- **Maximum complexity**: If a function does more than one main thing, break it down
- **Naming**: Use clear, descriptive names that explain purpose
### Documentation Requirements
- **Every public function** must have a docstring explaining purpose, parameters, and return value
- **Every class** must have a class-level docstring
- **Complex logic** must have inline comments explaining the "why", not just the "what"
- **API endpoints** must be documented with request/response examples
### Error Handling
- **ALWAYS** include proper error handling for external dependencies
- **NEVER** use bare except clauses
- Provide meaningful error messages that help with debugging
- Log errors appropriately for the application context
## Security and Best Practices
- **NEVER** hardcode credentials, API keys, or sensitive data
- **ALWAYS** validate user inputs
- Use parameterized queries for database operations
- Follow the principle of least privilege
- Implement proper authentication and authorization
## Testing Requirements
- **Every implementation** should have corresponding unit tests
- **Every API endpoint** should have integration tests
- Test files should be placed alongside the code they test
- Use descriptive test names that explain what is being tested
## Response Format
- Be concise and avoid unnecessary repetition
- Focus on actionable information
- Provide examples when explaining complex concepts
- Ask clarifying questions when requirements are ambiguous

View File

@ -0,0 +1,237 @@
---
description: Modular design principles and architecture guidelines for scalable development
globs:
alwaysApply: false
---
# Rule: Architecture and Modular Design
## Goal
Maintain a clean, modular architecture that scales effectively and prevents the complexity issues that arise in AI-assisted development.
## Core Architecture Principles
### 1. Modular Design
- **Single Responsibility**: Each module has one clear purpose
- **Loose Coupling**: Modules depend on interfaces, not implementations
- **High Cohesion**: Related functionality is grouped together
- **Clear Boundaries**: Module interfaces are well-defined and stable
### 2. Size Constraints
- **Files**: Maximum 250 lines per file
- **Functions**: Maximum 50 lines per function
- **Classes**: Maximum 300 lines per class
- **Modules**: Maximum 10 public functions/classes per module
### 3. Dependency Management
- **Layer Dependencies**: Higher layers depend on lower layers only
- **No Circular Dependencies**: Modules cannot depend on each other cyclically
- **Interface Segregation**: Depend on specific interfaces, not broad ones
- **Dependency Injection**: Pass dependencies rather than creating them internally
## Modular Architecture Patterns
### Layer Structure
```
src/
├── presentation/ # UI, API endpoints, CLI interfaces
├── application/ # Business logic, use cases, workflows
├── domain/ # Core business entities and rules
├── infrastructure/ # Database, external APIs, file systems
└── shared/ # Common utilities, constants, types
```
### Module Organization
```
module_name/
├── __init__.py # Public interface exports
├── core.py # Main module logic
├── types.py # Type definitions and interfaces
├── utils.py # Module-specific utilities
├── tests/ # Module tests
└── README.md # Module documentation
```
## Design Patterns for AI Development
### 1. Repository Pattern
Separate data access from business logic:
```python
# Domain interface
class UserRepository:
def get_by_id(self, user_id: str) -> User: ...
def save(self, user: User) -> None: ...
# Infrastructure implementation
class SqlUserRepository(UserRepository):
def get_by_id(self, user_id: str) -> User:
# Database-specific implementation
pass
```
### 2. Service Pattern
Encapsulate business logic in focused services:
```python
class UserService:
def __init__(self, user_repo: UserRepository):
self._user_repo = user_repo
def create_user(self, data: UserData) -> User:
# Validation and business logic
# Single responsibility: user creation
pass
```
### 3. Factory Pattern
Create complex objects with clear interfaces:
```python
class DatabaseFactory:
@staticmethod
def create_connection(config: DatabaseConfig) -> Connection:
# Handle different database types
# Encapsulate connection complexity
pass
```
## Architecture Decision Guidelines
### When to Create New Modules
Create a new module when:
- **Functionality** exceeds size constraints (250 lines)
- **Responsibility** is distinct from existing modules
- **Dependencies** would create circular references
- **Reusability** would benefit other parts of the system
- **Testing** requires isolated test environments
### When to Split Existing Modules
Split modules when:
- **File size** exceeds 250 lines
- **Multiple responsibilities** are evident
- **Testing** becomes difficult due to complexity
- **Dependencies** become too numerous
- **Change frequency** differs significantly between parts
### Module Interface Design
```python
# Good: Clear, focused interface
class PaymentProcessor:
def process_payment(self, amount: Money, method: PaymentMethod) -> PaymentResult:
"""Process a single payment transaction."""
pass
# Bad: Unfocused, kitchen-sink interface
class PaymentManager:
def process_payment(self, ...): pass
def validate_card(self, ...): pass
def send_receipt(self, ...): pass
def update_inventory(self, ...): pass # Wrong responsibility!
```
## Architecture Validation
### Architecture Review Checklist
- [ ] **Dependencies flow in one direction** (no cycles)
- [ ] **Layers are respected** (presentation doesn't call infrastructure directly)
- [ ] **Modules have single responsibility**
- [ ] **Interfaces are stable** and well-defined
- [ ] **Size constraints** are maintained
- [ ] **Testing** is straightforward for each module
### Red Flags
- **God Objects**: Classes/modules that do too many things
- **Circular Dependencies**: Modules that depend on each other
- **Deep Inheritance**: More than 3 levels of inheritance
- **Large Interfaces**: Interfaces with more than 7 methods
- **Tight Coupling**: Modules that know too much about each other's internals
## Refactoring Guidelines
### When to Refactor
- Module exceeds size constraints
- Code duplication across modules
- Difficult to test individual components
- New features require changing multiple unrelated modules
- Performance bottlenecks due to poor separation
### Refactoring Process
1. **Identify** the specific architectural problem
2. **Design** the target architecture
3. **Create tests** to verify current behavior
4. **Implement changes** incrementally
5. **Validate** that tests still pass
6. **Update documentation** to reflect changes
### Safe Refactoring Practices
- **One change at a time**: Don't mix refactoring with new features
- **Tests first**: Ensure comprehensive test coverage before refactoring
- **Incremental changes**: Small steps with verification at each stage
- **Backward compatibility**: Maintain existing interfaces during transition
- **Documentation updates**: Keep architecture documentation current
## Architecture Documentation
### Architecture Decision Records (ADRs)
Document significant decisions in `./docs/decisions/`:
```markdown
# ADR-003: Service Layer Architecture
## Status
Accepted
## Context
As the application grows, business logic is scattered across controllers and models.
## Decision
Implement a service layer to encapsulate business logic.
## Consequences
**Positive:**
- Clear separation of concerns
- Easier testing of business logic
- Better reusability across different interfaces
**Negative:**
- Additional abstraction layer
- More files to maintain
```
### Module Documentation Template
```markdown
# Module: [Name]
## Purpose
What this module does and why it exists.
## Dependencies
- **Imports from**: List of modules this depends on
- **Used by**: List of modules that depend on this one
- **External**: Third-party dependencies
## Public Interface
```python
# Key functions and classes exposed by this module
```
## Architecture Notes
- Design patterns used
- Important architectural decisions
- Known limitations or constraints
```
## Migration Strategies
### Legacy Code Integration
- **Strangler Fig Pattern**: Gradually replace old code with new modules
- **Adapter Pattern**: Create interfaces to integrate old and new code
- **Facade Pattern**: Simplify complex legacy interfaces
### Gradual Modernization
1. **Identify boundaries** in existing code
2. **Extract modules** one at a time
3. **Create interfaces** for each extracted module
4. **Test thoroughly** at each step
5. **Update documentation** continuously

View File

@ -0,0 +1,123 @@
---
description: AI-generated code review checklist and quality assurance guidelines
globs:
alwaysApply: false
---
# Rule: Code Review and Quality Assurance
## Goal
Establish systematic review processes for AI-generated code to maintain quality, security, and maintainability standards.
## AI Code Review Checklist
### Pre-Implementation Review
Before accepting any AI-generated code:
1. **Understand the Code**
- [ ] Can you explain what the code does in your own words?
- [ ] Do you understand each function and its purpose?
- [ ] Are there any "magic" values or unexplained logic?
- [ ] Does the code solve the actual problem stated?
2. **Architecture Alignment**
- [ ] Does the code follow established project patterns?
- [ ] Is it consistent with existing data structures?
- [ ] Does it integrate cleanly with existing components?
- [ ] Are new dependencies justified and necessary?
3. **Code Quality**
- [ ] Are functions smaller than 50 lines?
- [ ] Are files smaller than 250 lines?
- [ ] Are variable and function names descriptive?
- [ ] Is the code DRY (Don't Repeat Yourself)?
### Security Review
- [ ] **Input Validation**: All user inputs are validated and sanitized
- [ ] **Authentication**: Proper authentication checks are in place
- [ ] **Authorization**: Access controls are implemented correctly
- [ ] **Data Protection**: Sensitive data is handled securely
- [ ] **SQL Injection**: Database queries use parameterized statements
- [ ] **XSS Prevention**: Output is properly escaped
- [ ] **Error Handling**: Errors don't leak sensitive information
### Integration Review
- [ ] **Existing Functionality**: New code doesn't break existing features
- [ ] **Data Consistency**: Database changes maintain referential integrity
- [ ] **API Compatibility**: Changes don't break existing API contracts
- [ ] **Performance Impact**: New code doesn't introduce performance bottlenecks
- [ ] **Testing Coverage**: Appropriate tests are included
## Review Process
### Step 1: Initial Code Analysis
1. **Read through the entire generated code** before running it
2. **Identify patterns** that don't match existing codebase
3. **Check dependencies** - are new packages really needed?
4. **Verify logic flow** - does the algorithm make sense?
### Step 2: Security and Error Handling Review
1. **Trace data flow** from input to output
2. **Identify potential failure points** and verify error handling
3. **Check for security vulnerabilities** using the security checklist
4. **Verify proper logging** and monitoring implementation
### Step 3: Integration Testing
1. **Test with existing code** to ensure compatibility
2. **Run existing test suite** to verify no regressions
3. **Test edge cases** and error conditions
4. **Verify performance** under realistic conditions
## Common AI Code Issues to Watch For
### Overcomplication Patterns
- **Unnecessary abstractions**: AI creating complex patterns for simple tasks
- **Over-engineering**: Solutions that are more complex than needed
- **Redundant code**: AI recreating existing functionality
- **Inappropriate design patterns**: Using patterns that don't fit the use case
### Context Loss Indicators
- **Inconsistent naming**: Different conventions from existing code
- **Wrong data structures**: Using different patterns than established
- **Ignored existing functions**: Reimplementing existing functionality
- **Architectural misalignment**: Code that doesn't fit the overall design
### Technical Debt Indicators
- **Magic numbers**: Hardcoded values without explanation
- **Poor error messages**: Generic or unhelpful error handling
- **Missing documentation**: Code without adequate comments
- **Tight coupling**: Components that are too interdependent
## Quality Gates
### Mandatory Reviews
All AI-generated code must pass these gates before acceptance:
1. **Security Review**: No security vulnerabilities detected
2. **Integration Review**: Integrates cleanly with existing code
3. **Performance Review**: Meets performance requirements
4. **Maintainability Review**: Code can be easily modified by team members
5. **Documentation Review**: Adequate documentation is provided
### Acceptance Criteria
- [ ] Code is understandable by any team member
- [ ] Integration requires minimal changes to existing code
- [ ] Security review passes all checks
- [ ] Performance meets established benchmarks
- [ ] Documentation is complete and accurate
## Rejection Criteria
Reject AI-generated code if:
- Security vulnerabilities are present
- Code is too complex for the problem being solved
- Integration requires major refactoring of existing code
- Code duplicates existing functionality without justification
- Documentation is missing or inadequate
## Review Documentation
For each review, document:
- Issues found and how they were resolved
- Performance impact assessment
- Security concerns and mitigations
- Integration challenges and solutions
- Recommendations for future similar tasks

View File

@ -0,0 +1,93 @@
---
description: Context management for maintaining codebase awareness and preventing context drift
globs:
alwaysApply: false
---
# Rule: Context Management
## Goal
Maintain comprehensive project context to prevent context drift and ensure AI-generated code integrates seamlessly with existing codebase patterns and architecture.
## Context Documentation Requirements
### PRD.md file documentation
1. **Project Overview**
- Business objectives and goals
- Target users and use cases
- Key success metrics
### CONTEXT.md File Structure
Every project must maintain a `CONTEXT.md` file in the root directory with:
1. **Architecture Overview**
- High-level system architecture
- Key design patterns used
- Database schema overview
- API structure and conventions
2. **Technology Stack**
- Programming languages and versions
- Frameworks and libraries
- Database systems
- Development and deployment tools
3. **Coding Conventions**
- Naming conventions
- File organization patterns
- Code structure preferences
- Import/export patterns
4. **Current Implementation Status**
- Completed features
- Work in progress
- Known technical debt
- Planned improvements
## Context Maintenance Protocol
### Before Every Coding Session
1. **Review CONTEXT.md and PRD.md** to understand current project state
2. **Scan recent changes** in git history to understand latest patterns
3. **Identify existing patterns** for similar functionality before implementing new features
4. **Ask for clarification** if existing patterns are unclear or conflicting
### During Development
1. **Reference existing code** when explaining implementation approaches
2. **Maintain consistency** with established patterns and conventions
3. **Update CONTEXT.md** when making architectural decisions
4. **Document deviations** from established patterns with reasoning
### Context Preservation Strategies
- **Incremental development**: Build on existing patterns rather than creating new ones
- **Pattern consistency**: Use established data structures and function signatures
- **Integration awareness**: Consider how new code affects existing functionality
- **Dependency management**: Understand existing dependencies before adding new ones
## Context Prompting Best Practices
### Effective Context Sharing
- Include relevant sections of CONTEXT.md in prompts for complex tasks
- Reference specific existing files when asking for similar functionality
- Provide examples of existing patterns when requesting new implementations
- Share recent git commit messages to understand latest changes
### Context Window Optimization
- Prioritize most relevant context for current task
- Use @filename references to include specific files
- Break large contexts into focused, task-specific chunks
- Update context references as project evolves
## Red Flags - Context Loss Indicators
- AI suggests patterns that conflict with existing code
- New implementations ignore established conventions
- Proposed solutions don't integrate with existing architecture
- Code suggestions require significant refactoring of existing functionality
## Recovery Protocol
When context loss is detected:
1. **Stop development** and review CONTEXT.md
2. **Analyze existing codebase** for established patterns
3. **Update context documentation** with missing information
4. **Restart task** with proper context provided
5. **Test integration** with existing code before proceeding

View File

@ -0,0 +1,67 @@
---
description: Creating PRD for a project or specific task/function
globs:
alwaysApply: false
---
---
description: Creating PRD for a project or specific task/function
globs:
alwaysApply: false
---
# Rule: Generating a Product Requirements Document (PRD)
## Goal
To guide an AI assistant in creating a detailed Product Requirements Document (PRD) in Markdown format, based on an initial user prompt. The PRD should be clear, actionable, and suitable for a junior developer to understand and implement the feature.
## Process
1. **Receive Initial Prompt:** The user provides a brief description or request for a new feature or functionality.
2. **Ask Clarifying Questions:** Before writing the PRD, the AI *must* ask clarifying questions to gather sufficient detail. The goal is to understand the "what" and "why" of the feature, not necessarily the "how" (which the developer will figure out).
3. **Generate PRD:** Based on the initial prompt and the user's answers to the clarifying questions, generate a PRD using the structure outlined below.
4. **Save PRD:** Save the generated document as `prd-[feature-name].md` inside the `/tasks` directory.
## Clarifying Questions (Examples)
The AI should adapt its questions based on the prompt, but here are some common areas to explore:
* **Problem/Goal:** "What problem does this feature solve for the user?" or "What is the main goal we want to achieve with this feature?"
* **Target User:** "Who is the primary user of this feature?"
* **Core Functionality:** "Can you describe the key actions a user should be able to perform with this feature?"
* **User Stories:** "Could you provide a few user stories? (e.g., As a [type of user], I want to [perform an action] so that [benefit].)"
* **Acceptance Criteria:** "How will we know when this feature is successfully implemented? What are the key success criteria?"
* **Scope/Boundaries:** "Are there any specific things this feature *should not* do (non-goals)?"
* **Data Requirements:** "What kind of data does this feature need to display or manipulate?"
* **Design/UI:** "Are there any existing design mockups or UI guidelines to follow?" or "Can you describe the desired look and feel?"
* **Edge Cases:** "Are there any potential edge cases or error conditions we should consider?"
## PRD Structure
The generated PRD should include the following sections:
1. **Introduction/Overview:** Briefly describe the feature and the problem it solves. State the goal.
2. **Goals:** List the specific, measurable objectives for this feature.
3. **User Stories:** Detail the user narratives describing feature usage and benefits.
4. **Functional Requirements:** List the specific functionalities the feature must have. Use clear, concise language (e.g., "The system must allow users to upload a profile picture."). Number these requirements.
5. **Non-Goals (Out of Scope):** Clearly state what this feature will *not* include to manage scope.
6. **Design Considerations (Optional):** Link to mockups, describe UI/UX requirements, or mention relevant components/styles if applicable.
7. **Technical Considerations (Optional):** Mention any known technical constraints, dependencies, or suggestions (e.g., "Should integrate with the existing Auth module").
8. **Success Metrics:** How will the success of this feature be measured? (e.g., "Increase user engagement by 10%", "Reduce support tickets related to X").
9. **Open Questions:** List any remaining questions or areas needing further clarification.
## Target Audience
Assume the primary reader of the PRD is a **junior developer**. Therefore, requirements should be explicit, unambiguous, and avoid jargon where possible. Provide enough detail for them to understand the feature's purpose and core logic.
## Output
* **Format:** Markdown (`.md`)
* **Location:** `/tasks/`
* **Filename:** `prd-[feature-name].md`
## Final instructions
1. Do NOT start implmenting the PRD
2. Make sure to ask the user clarifying questions
3. Take the user's answers to the clarifying questions and improve the PRD

View File

@ -0,0 +1,244 @@
---
description: Documentation standards for code, architecture, and development decisions
globs:
alwaysApply: false
---
# Rule: Documentation Standards
## Goal
Maintain comprehensive, up-to-date documentation that supports development, onboarding, and long-term maintenance of the codebase.
## Documentation Hierarchy
### 1. Project Level Documentation (in ./docs/)
- **README.md**: Project overview, setup instructions, basic usage
- **CONTEXT.md**: Current project state, architecture decisions, patterns
- **CHANGELOG.md**: Version history and significant changes
- **CONTRIBUTING.md**: Development guidelines and processes
- **API.md**: API endpoints, request/response formats, authentication
### 2. Module Level Documentation (in ./docs/modules/)
- **[module-name].md**: Purpose, public interfaces, usage examples
- **dependencies.md**: External dependencies and their purposes
- **architecture.md**: Module relationships and data flow
### 3. Code Level Documentation
- **Docstrings**: Function and class documentation
- **Inline comments**: Complex logic explanations
- **Type hints**: Clear parameter and return types
- **README files**: Directory-specific instructions
## Documentation Standards
### Code Documentation
```python
def process_user_data(user_id: str, data: dict) -> UserResult:
"""
Process and validate user data before storage.
Args:
user_id: Unique identifier for the user
data: Dictionary containing user information to process
Returns:
UserResult: Processed user data with validation status
Raises:
ValidationError: When user data fails validation
DatabaseError: When storage operation fails
Example:
>>> result = process_user_data("123", {"name": "John", "email": "john@example.com"})
>>> print(result.status)
'valid'
"""
```
### API Documentation Format
```markdown
### POST /api/users
Create a new user account.
**Request:**
```json
{
"name": "string (required)",
"email": "string (required, valid email)",
"age": "number (optional, min: 13)"
}
```
**Response (201):**
```json
{
"id": "uuid",
"name": "string",
"email": "string",
"created_at": "iso_datetime"
}
```
**Errors:**
- 400: Invalid input data
- 409: Email already exists
```
### Architecture Decision Records (ADRs)
Document significant architecture decisions in `./docs/decisions/`:
```markdown
# ADR-001: Database Choice - PostgreSQL
## Status
Accepted
## Context
We need to choose a database for storing user data and application state.
## Decision
We will use PostgreSQL as our primary database.
## Consequences
**Positive:**
- ACID compliance ensures data integrity
- Rich query capabilities with SQL
- Good performance for our expected load
**Negative:**
- More complex setup than simpler alternatives
- Requires SQL knowledge from team members
## Alternatives Considered
- MongoDB: Rejected due to consistency requirements
- SQLite: Rejected due to scalability needs
```
## Documentation Maintenance
### When to Update Documentation
#### Always Update:
- **API changes**: Any modification to public interfaces
- **Architecture changes**: New patterns, data structures, or workflows
- **Configuration changes**: Environment variables, deployment settings
- **Dependencies**: Adding, removing, or upgrading packages
- **Business logic changes**: Core functionality modifications
#### Update Weekly:
- **CONTEXT.md**: Current development status and priorities
- **Known issues**: Bug reports and workarounds
- **Performance notes**: Bottlenecks and optimization opportunities
#### Update per Release:
- **CHANGELOG.md**: User-facing changes and improvements
- **Version documentation**: Breaking changes and migration guides
- **Examples and tutorials**: Keep sample code current
### Documentation Quality Checklist
#### Completeness
- [ ] Purpose and scope clearly explained
- [ ] All public interfaces documented
- [ ] Examples provided for complex usage
- [ ] Error conditions and handling described
- [ ] Dependencies and requirements listed
#### Accuracy
- [ ] Code examples are tested and working
- [ ] Links point to correct locations
- [ ] Version numbers are current
- [ ] Screenshots reflect current UI
#### Clarity
- [ ] Written for the intended audience
- [ ] Technical jargon is explained
- [ ] Step-by-step instructions are clear
- [ ] Visual aids used where helpful
## Documentation Automation
### Auto-Generated Documentation
- **API docs**: Generate from code annotations
- **Type documentation**: Extract from type hints
- **Module dependencies**: Auto-update from imports
- **Test coverage**: Include coverage reports
### Documentation Testing
```python
# Test that code examples in documentation work
def test_documentation_examples():
"""Verify code examples in docs actually work."""
# Test examples from README.md
# Test API examples from docs/API.md
# Test configuration examples
```
## Documentation Templates
### New Module Documentation Template
```markdown
# Module: [Name]
## Purpose
Brief description of what this module does and why it exists.
## Public Interface
### Functions
- `function_name(params)`: Description and example
### Classes
- `ClassName`: Purpose and basic usage
## Usage Examples
```python
# Basic usage example
```
## Dependencies
- Internal: List of internal modules this depends on
- External: List of external packages required
## Testing
How to run tests for this module.
## Known Issues
Current limitations or bugs.
```
### API Endpoint Template
```markdown
### [METHOD] /api/endpoint
Brief description of what this endpoint does.
**Authentication:** Required/Optional
**Rate Limiting:** X requests per minute
**Request:**
- Headers required
- Body schema
- Query parameters
**Response:**
- Success response format
- Error response format
- Status codes
**Example:**
Working request/response example
```
## Review and Maintenance Process
### Documentation Review
- Include documentation updates in code reviews
- Verify examples still work with code changes
- Check for broken links and outdated information
- Ensure consistency with current implementation
### Regular Audits
- Monthly review of documentation accuracy
- Quarterly assessment of documentation completeness
- Annual review of documentation structure and organization

View File

@ -0,0 +1,207 @@
---
description: Enhanced task list management with quality gates and iterative workflow integration
globs:
alwaysApply: false
---
# Rule: Enhanced Task List Management
## Goal
Manage task lists with integrated quality gates and iterative workflow to prevent context loss and ensure sustainable development.
## Task Implementation Protocol
### Pre-Implementation Check
Before starting any sub-task:
- [ ] **Context Review**: Have you reviewed CONTEXT.md and relevant documentation?
- [ ] **Pattern Identification**: Do you understand existing patterns to follow?
- [ ] **Integration Planning**: Do you know how this will integrate with existing code?
- [ ] **Size Validation**: Is this task small enough (≤50 lines, ≤250 lines per file)?
### Implementation Process
1. **One sub-task at a time**: Do **NOT** start the next subtask until you ask the user for permission and they say "yes" or "y"
2. **Step-by-step execution**:
- Plan the approach in bullet points
- Wait for approval
- Implement the specific sub-task
- Test the implementation
- Update documentation if needed
3. **Quality validation**: Run through the code review checklist before marking complete
### Completion Protocol
When you finish a **subtask**:
1. **Immediate marking**: Change `[ ]` to `[x]`
2. **Quality check**: Verify the implementation meets quality standards
3. **Integration test**: Ensure new code works with existing functionality
4. **Documentation update**: Update relevant files if needed
5. **Parent task check**: If **all** subtasks underneath a parent task are now `[x]`, also mark the **parent task** as completed
6. **Stop and wait**: Get user approval before proceeding to next sub-task
## Enhanced Task List Structure
### Task File Header
```markdown
# Task List: [Feature Name]
**Source PRD**: `prd-[feature-name].md`
**Status**: In Progress / Complete / Blocked
**Context Last Updated**: [Date]
**Architecture Review**: Required / Complete / N/A
## Quick Links
- [Context Documentation](./CONTEXT.md)
- [Architecture Guidelines](./docs/architecture.md)
- [Related Files](#relevant-files)
```
### Task Format with Quality Gates
```markdown
- [ ] 1.0 Parent Task Title
- **Quality Gate**: Architecture review required
- **Dependencies**: List any dependencies
- [ ] 1.1 [Sub-task description 1.1]
- **Size estimate**: [Small/Medium/Large]
- **Pattern reference**: [Reference to existing pattern]
- **Test requirements**: [Unit/Integration/Both]
- [ ] 1.2 [Sub-task description 1.2]
- **Integration points**: [List affected components]
- **Risk level**: [Low/Medium/High]
```
## Relevant Files Management
### Enhanced File Tracking
```markdown
## Relevant Files
### Implementation Files
- `path/to/file1.ts` - Brief description of purpose and role
- **Status**: Created / Modified / Needs Review
- **Last Modified**: [Date]
- **Review Status**: Pending / Approved / Needs Changes
### Test Files
- `path/to/file1.test.ts` - Unit tests for file1.ts
- **Coverage**: [Percentage or status]
- **Last Run**: [Date and result]
### Documentation Files
- `docs/module-name.md` - Module documentation
- **Status**: Up to date / Needs update / Missing
- **Last Updated**: [Date]
### Configuration Files
- `config/setting.json` - Configuration changes
- **Environment**: [Dev/Staging/Prod affected]
- **Backup**: [Location of backup]
```
## Task List Maintenance
### During Development
1. **Regular updates**: Update task status after each significant change
2. **File tracking**: Add new files as they are created or modified
3. **Dependency tracking**: Note when new dependencies between tasks emerge
4. **Risk assessment**: Flag tasks that become more complex than anticipated
### Quality Checkpoints
At 25%, 50%, 75%, and 100% completion:
- [ ] **Architecture alignment**: Code follows established patterns
- [ ] **Performance impact**: No significant performance degradation
- [ ] **Security review**: No security vulnerabilities introduced
- [ ] **Documentation current**: All changes are documented
### Weekly Review Process
1. **Completion assessment**: What percentage of tasks are actually complete?
2. **Quality assessment**: Are completed tasks meeting quality standards?
3. **Process assessment**: Is the iterative workflow being followed?
4. **Risk assessment**: Are there emerging risks or blockers?
## Task Status Indicators
### Status Levels
- `[ ]` **Not Started**: Task not yet begun
- `[~]` **In Progress**: Currently being worked on
- `[?]` **Blocked**: Waiting for dependencies or decisions
- `[!]` **Needs Review**: Implementation complete but needs quality review
- `[x]` **Complete**: Finished and quality approved
### Quality Indicators
- ✅ **Quality Approved**: Passed all quality gates
- ⚠️ **Quality Concerns**: Has issues but functional
- ❌ **Quality Failed**: Needs rework before approval
- 🔄 **Under Review**: Currently being reviewed
### Integration Status
- 🔗 **Integrated**: Successfully integrated with existing code
- 🔧 **Integration Issues**: Problems with existing code integration
- ⏳ **Integration Pending**: Ready for integration testing
## Emergency Procedures
### When Tasks Become Too Complex
If a sub-task grows beyond expected scope:
1. **Stop implementation** immediately
2. **Document current state** and what was discovered
3. **Break down** the task into smaller pieces
4. **Update task list** with new sub-tasks
5. **Get approval** for the new breakdown before proceeding
### When Context is Lost
If AI seems to lose track of project patterns:
1. **Pause development**
2. **Review CONTEXT.md** and recent changes
3. **Update context documentation** with current state
4. **Restart** with explicit pattern references
5. **Reduce task size** until context is re-established
### When Quality Gates Fail
If implementation doesn't meet quality standards:
1. **Mark task** with `[!]` status
2. **Document specific issues** found
3. **Create remediation tasks** if needed
4. **Don't proceed** until quality issues are resolved
## AI Instructions Integration
### Context Awareness Commands
```markdown
**Before starting any task, run these checks:**
1. @CONTEXT.md - Review current project state
2. @architecture.md - Understand design principles
3. @code-review.md - Know quality standards
4. Look at existing similar code for patterns
```
### Quality Validation Commands
```markdown
**After completing any sub-task:**
1. Run code review checklist
2. Test integration with existing code
3. Update documentation if needed
4. Mark task complete only after quality approval
```
### Workflow Commands
```markdown
**For each development session:**
1. Review incomplete tasks and their status
2. Identify next logical sub-task to work on
3. Check dependencies and blockers
4. Follow iterative workflow process
5. Update task list with progress and findings
```
## Success Metrics
### Daily Success Indicators
- Tasks are completed according to quality standards
- No sub-tasks are started without completing previous ones
- File tracking remains accurate and current
- Integration issues are caught early
### Weekly Success Indicators
- Overall task completion rate is sustainable
- Quality issues are decreasing over time
- Context loss incidents are rare
- Team confidence in codebase remains high

View File

@ -0,0 +1,70 @@
---
description: Generate a task list or TODO for a user requirement or implementation.
globs:
alwaysApply: false
---
---
description:
globs:
alwaysApply: false
---
# Rule: Generating a Task List from a PRD
## Goal
To guide an AI assistant in creating a detailed, step-by-step task list in Markdown format based on an existing Product Requirements Document (PRD). The task list should guide a developer through implementation.
## Output
- **Format:** Markdown (`.md`)
- **Location:** `/tasks/`
- **Filename:** `tasks-[prd-file-name].md` (e.g., `tasks-prd-user-profile-editing.md`)
## Process
1. **Receive PRD Reference:** The user points the AI to a specific PRD file
2. **Analyze PRD:** The AI reads and analyzes the functional requirements, user stories, and other sections of the specified PRD.
3. **Phase 1: Generate Parent Tasks:** Based on the PRD analysis, create the file and generate the main, high-level tasks required to implement the feature. Use your judgement on how many high-level tasks to use. It's likely to be about 5. Present these tasks to the user in the specified format (without sub-tasks yet). Inform the user: "I have generated the high-level tasks based on the PRD. Ready to generate the sub-tasks? Respond with 'Go' to proceed."
4. **Wait for Confirmation:** Pause and wait for the user to respond with "Go".
5. **Phase 2: Generate Sub-Tasks:** Once the user confirms, break down each parent task into smaller, actionable sub-tasks necessary to complete the parent task. Ensure sub-tasks logically follow from the parent task and cover the implementation details implied by the PRD.
6. **Identify Relevant Files:** Based on the tasks and PRD, identify potential files that will need to be created or modified. List these under the `Relevant Files` section, including corresponding test files if applicable.
7. **Generate Final Output:** Combine the parent tasks, sub-tasks, relevant files, and notes into the final Markdown structure.
8. **Save Task List:** Save the generated document in the `/tasks/` directory with the filename `tasks-[prd-file-name].md`, where `[prd-file-name]` matches the base name of the input PRD file (e.g., if the input was `prd-user-profile-editing.md`, the output is `tasks-prd-user-profile-editing.md`).
## Output Format
The generated task list _must_ follow this structure:
```markdown
## Relevant Files
- `path/to/potential/file1.ts` - Brief description of why this file is relevant (e.g., Contains the main component for this feature).
- `path/to/file1.test.ts` - Unit tests for `file1.ts`.
- `path/to/another/file.tsx` - Brief description (e.g., API route handler for data submission).
- `path/to/another/file.test.tsx` - Unit tests for `another/file.tsx`.
- `lib/utils/helpers.ts` - Brief description (e.g., Utility functions needed for calculations).
- `lib/utils/helpers.test.ts` - Unit tests for `helpers.ts`.
### Notes
- Unit tests should typically be placed alongside the code files they are testing (e.g., `MyComponent.tsx` and `MyComponent.test.tsx` in the same directory).
- Use `npx jest [optional/path/to/test/file]` to run tests. Running without a path executes all tests found by the Jest configuration.
## Tasks
- [ ] 1.0 Parent Task Title
- [ ] 1.1 [Sub-task description 1.1]
- [ ] 1.2 [Sub-task description 1.2]
- [ ] 2.0 Parent Task Title
- [ ] 2.1 [Sub-task description 2.1]
- [ ] 3.0 Parent Task Title (may not require sub-tasks if purely structural or configuration)
```
## Interaction Model
The process explicitly requires a pause after generating parent tasks to get user confirmation ("Go") before proceeding to generate the detailed sub-tasks. This ensures the high-level plan aligns with user expectations before diving into details.
## Target Audience
Assume the primary reader of the task list is a **junior developer** who will implement the feature.

View File

@ -0,0 +1,236 @@
---
description: Iterative development workflow for AI-assisted coding
globs:
alwaysApply: false
---
# Rule: Iterative Development Workflow
## Goal
Establish a structured, iterative development process that prevents the chaos and complexity that can arise from uncontrolled AI-assisted development.
## Development Phases
### Phase 1: Planning and Design
**Before writing any code:**
1. **Understand the Requirement**
- Break down the task into specific, measurable objectives
- Identify existing code patterns that should be followed
- List dependencies and integration points
- Define acceptance criteria
2. **Design Review**
- Propose approach in bullet points
- Wait for explicit approval before proceeding
- Consider how the solution fits existing architecture
- Identify potential risks and mitigation strategies
### Phase 2: Incremental Implementation
**One small piece at a time:**
1. **Micro-Tasks** (≤ 50 lines each)
- Implement one function or small class at a time
- Test immediately after implementation
- Ensure integration with existing code
- Document decisions and patterns used
2. **Validation Checkpoints**
- After each micro-task, verify it works correctly
- Check that it follows established patterns
- Confirm it integrates cleanly with existing code
- Get approval before moving to next micro-task
### Phase 3: Integration and Testing
**Ensuring system coherence:**
1. **Integration Testing**
- Test new code with existing functionality
- Verify no regressions in existing features
- Check performance impact
- Validate error handling
2. **Documentation Update**
- Update relevant documentation
- Record any new patterns or decisions
- Update context files if architecture changed
## Iterative Prompting Strategy
### Step 1: Context Setting
```
Before implementing [feature], help me understand:
1. What existing patterns should I follow?
2. What existing functions/classes are relevant?
3. How should this integrate with [specific existing component]?
4. What are the potential architectural impacts?
```
### Step 2: Plan Creation
```
Based on the context, create a detailed plan for implementing [feature]:
1. Break it into micro-tasks (≤50 lines each)
2. Identify dependencies and order of implementation
3. Specify integration points with existing code
4. List potential risks and mitigation strategies
Wait for my approval before implementing.
```
### Step 3: Incremental Implementation
```
Implement only the first micro-task: [specific task]
- Use existing patterns from [reference file/function]
- Keep it under 50 lines
- Include error handling
- Add appropriate tests
- Explain your implementation choices
Stop after this task and wait for approval.
```
## Quality Gates
### Before Each Implementation
- [ ] **Purpose is clear**: Can explain what this piece does and why
- [ ] **Pattern is established**: Following existing code patterns
- [ ] **Size is manageable**: Implementation is small enough to understand completely
- [ ] **Integration is planned**: Know how it connects to existing code
### After Each Implementation
- [ ] **Code is understood**: Can explain every line of implemented code
- [ ] **Tests pass**: All existing and new tests are passing
- [ ] **Integration works**: New code works with existing functionality
- [ ] **Documentation updated**: Changes are reflected in relevant documentation
### Before Moving to Next Task
- [ ] **Current task complete**: All acceptance criteria met
- [ ] **No regressions**: Existing functionality still works
- [ ] **Clean state**: No temporary code or debugging artifacts
- [ ] **Approval received**: Explicit go-ahead for next task
- [ ] **Documentaion updated**: If relevant changes to module was made.
## Anti-Patterns to Avoid
### Large Block Implementation
**Don't:**
```
Implement the entire user management system with authentication,
CRUD operations, and email notifications.
```
**Do:**
```
First, implement just the User model with basic fields.
Stop there and let me review before continuing.
```
### Context Loss
**Don't:**
```
Create a new authentication system.
```
**Do:**
```
Looking at the existing auth patterns in auth.py, implement
password validation following the same structure as the
existing email validation function.
```
### Over-Engineering
**Don't:**
```
Build a flexible, extensible user management framework that
can handle any future requirements.
```
**Do:**
```
Implement user creation functionality that matches the existing
pattern in customer.py, focusing only on the current requirements.
```
## Progress Tracking
### Task Status Indicators
- 🔄 **In Planning**: Requirements gathering and design
- ⏳ **In Progress**: Currently implementing
- ✅ **Complete**: Implemented, tested, and integrated
- 🚫 **Blocked**: Waiting for decisions or dependencies
- 🔧 **Needs Refactor**: Working but needs improvement
### Weekly Review Process
1. **Progress Assessment**
- What was completed this week?
- What challenges were encountered?
- How well did the iterative process work?
2. **Process Adjustment**
- Were task sizes appropriate?
- Did context management work effectively?
- What improvements can be made?
3. **Architecture Review**
- Is the code remaining maintainable?
- Are patterns staying consistent?
- Is technical debt accumulating?
## Emergency Procedures
### When Things Go Wrong
If development becomes chaotic or problematic:
1. **Stop Development**
- Don't continue adding to the problem
- Take time to assess the situation
- Don't rush to "fix" with more AI-generated code
2. **Assess the Situation**
- What specific problems exist?
- How far has the code diverged from established patterns?
- What parts are still working correctly?
3. **Recovery Process**
- Roll back to last known good state
- Update context documentation with lessons learned
- Restart with smaller, more focused tasks
- Get explicit approval for each step of recovery
### Context Recovery
When AI seems to lose track of project patterns:
1. **Context Refresh**
- Review and update CONTEXT.md
- Include examples of current code patterns
- Clarify architectural decisions
2. **Pattern Re-establishment**
- Show AI examples of existing, working code
- Explicitly state patterns to follow
- Start with very small, pattern-matching tasks
3. **Gradual Re-engagement**
- Begin with simple, low-risk tasks
- Verify pattern adherence at each step
- Gradually increase task complexity as consistency returns
## Success Metrics
### Short-term (Daily)
- Code is understandable and well-integrated
- No major regressions introduced
- Development velocity feels sustainable
- Team confidence in codebase remains high
### Medium-term (Weekly)
- Technical debt is not accumulating
- New features integrate cleanly
- Development patterns remain consistent
- Documentation stays current
### Long-term (Monthly)
- Codebase remains maintainable as it grows
- New team members can understand and contribute
- AI assistance enhances rather than hinders development
- Architecture remains clean and purposeful

24
.cursor/rules/project.mdc Normal file
View File

@ -0,0 +1,24 @@
---
description:
globs:
alwaysApply: true
---
# Rule: Project specific rules
## Goal
Unify the project structure and interraction with tools and console
### System tools
- **ALWAYS** use UV for package management
- **ALWAYS** use Arch linux compatible command for terminal
### Coding patterns
- **ALWYAS** check the arguments and methods before use to avoid errors with wrong parameters or names
- If in doubt, check [CONTEXT.md](mdc:CONTEXT.md) file and [architecture.md](mdc:docs/architecture.md)
- **PREFER** ORM pattern for databases with SQLAclhemy.
- **DO NOT USE** emoji in code and comments
### Testing
- Use UV for test in format *uv run pytest [filename]*

View File

@ -0,0 +1,237 @@
---
description: Code refactoring and technical debt management for AI-assisted development
globs:
alwaysApply: false
---
# Rule: Code Refactoring and Technical Debt Management
## Goal
Guide AI in systematic code refactoring to improve maintainability, reduce complexity, and prevent technical debt accumulation in AI-assisted development projects.
## When to Apply This Rule
- Code complexity has increased beyond manageable levels
- Duplicate code patterns are detected
- Performance issues are identified
- New features are difficult to integrate
- Code review reveals maintainability concerns
- Weekly technical debt assessment indicates refactoring needs
## Pre-Refactoring Assessment
Before starting any refactoring, the AI MUST:
1. **Context Analysis:**
- Review existing `CONTEXT.md` for architectural decisions
- Analyze current code patterns and conventions
- Identify all files that will be affected (search the codebase for use)
- Check for existing tests that verify current behavior
2. **Scope Definition:**
- Clearly define what will and will not be changed
- Identify the specific refactoring pattern to apply
- Estimate the blast radius of changes
- Plan rollback strategy if needed
3. **Documentation Review:**
- Check `./docs/` for relevant module documentation
- Review any existing architectural diagrams
- Identify dependencies and integration points
- Note any known constraints or limitations
## Refactoring Process
### Phase 1: Planning and Safety
1. **Create Refactoring Plan:**
- Document the current state and desired end state
- Break refactoring into small, atomic steps
- Identify tests that must pass throughout the process
- Plan verification steps for each change
2. **Establish Safety Net:**
- Ensure comprehensive test coverage exists
- If tests are missing, create them BEFORE refactoring
- Document current behavior that must be preserved
- Create backup of current implementation approach
3. **Get Approval:**
- Present the refactoring plan to the user
- Wait for explicit "Go" or "Proceed" confirmation
- Do NOT start refactoring without approval
### Phase 2: Incremental Implementation
4. **One Change at a Time:**
- Implement ONE refactoring step per iteration
- Run tests after each step to ensure nothing breaks
- Update documentation if interfaces change
- Mark progress in the refactoring plan
5. **Verification Protocol:**
- Run all relevant tests after each change
- Verify functionality works as expected
- Check performance hasn't degraded
- Ensure no new linting or type errors
6. **User Checkpoint:**
- After each significant step, pause for user review
- Present what was changed and current status
- Wait for approval before continuing
- Address any concerns before proceeding
### Phase 3: Completion and Documentation
7. **Final Verification:**
- Run full test suite to ensure nothing is broken
- Verify all original functionality is preserved
- Check that new code follows project conventions
- Confirm performance is maintained or improved
8. **Documentation Update:**
- Update `CONTEXT.md` with new patterns/decisions
- Update module documentation in `./docs/`
- Document any new conventions established
- Note lessons learned for future refactoring
## Common Refactoring Patterns
### Extract Method/Function
```
WHEN: Functions/methods exceed 50 lines or have multiple responsibilities
HOW:
1. Identify logical groupings within the function
2. Extract each group into a well-named helper function
3. Ensure each function has a single responsibility
4. Verify tests still pass
```
### Extract Module/Class
```
WHEN: Files exceed 250 lines or handle multiple concerns
HOW:
1. Identify cohesive functionality groups
2. Create new files for each group
3. Move related functions/classes together
4. Update imports and dependencies
5. Verify module boundaries are clean
```
### Eliminate Duplication
```
WHEN: Similar code appears in multiple places
HOW:
1. Identify the common pattern or functionality
2. Extract to a shared utility function or module
3. Update all usage sites to use the shared code
4. Ensure the abstraction is not over-engineered
```
### Improve Data Structures
```
WHEN: Complex nested objects or unclear data flow
HOW:
1. Define clear interfaces/types for data structures
2. Create transformation functions between different representations
3. Ensure data flow is unidirectional where possible
4. Add validation at boundaries
```
### Reduce Coupling
```
WHEN: Modules are tightly interconnected
HOW:
1. Identify dependencies between modules
2. Extract interfaces for external dependencies
3. Use dependency injection where appropriate
4. Ensure modules can be tested in isolation
```
## Quality Gates
Every refactoring must pass these gates:
### Technical Quality
- [ ] All existing tests pass
- [ ] No new linting errors introduced
- [ ] Code follows established project conventions
- [ ] No performance regression detected
- [ ] File sizes remain under 250 lines
- [ ] Function sizes remain under 50 lines
### Maintainability
- [ ] Code is more readable than before
- [ ] Duplicated code has been reduced
- [ ] Module responsibilities are clearer
- [ ] Dependencies are explicit and minimal
- [ ] Error handling is consistent
### Documentation
- [ ] Public interfaces are documented
- [ ] Complex logic has explanatory comments
- [ ] Architectural decisions are recorded
- [ ] Examples are provided where helpful
## AI Instructions for Refactoring
1. **Always ask for permission** before starting any refactoring work
2. **Start with tests** - ensure comprehensive coverage before changing code
3. **Work incrementally** - make small changes and verify each step
4. **Preserve behavior** - functionality must remain exactly the same
5. **Update documentation** - keep all docs current with changes
6. **Follow conventions** - maintain consistency with existing codebase
7. **Stop and ask** if any step fails or produces unexpected results
8. **Explain changes** - clearly communicate what was changed and why
## Anti-Patterns to Avoid
### Over-Engineering
- Don't create abstractions for code that isn't duplicated
- Avoid complex inheritance hierarchies
- Don't optimize prematurely
### Breaking Changes
- Never change public APIs without explicit approval
- Don't remove functionality, even if it seems unused
- Avoid changing behavior "while we're here"
### Scope Creep
- Stick to the defined refactoring scope
- Don't add new features during refactoring
- Resist the urge to "improve" unrelated code
## Success Metrics
Track these metrics to ensure refactoring effectiveness:
### Code Quality
- Reduced cyclomatic complexity
- Lower code duplication percentage
- Improved test coverage
- Fewer linting violations
### Developer Experience
- Faster time to understand code
- Easier integration of new features
- Reduced bug introduction rate
- Higher developer confidence in changes
### Maintainability
- Clearer module boundaries
- More predictable behavior
- Easier debugging and troubleshooting
- Better performance characteristics
## Output Files
When refactoring is complete, update:
- `refactoring-log-[date].md` - Document what was changed and why
- `CONTEXT.md` - Update with new patterns and decisions
- `./docs/` - Update relevant module documentation
- Task lists - Mark refactoring tasks as complete
## Final Verification
Before marking refactoring complete:
1. Run full test suite and verify all tests pass
2. Check that code follows all project conventions
3. Verify documentation is up to date
4. Confirm user is satisfied with the results
5. Record lessons learned for future refactoring efforts

View File

@ -0,0 +1,44 @@
---
description: TODO list task implementation
globs:
alwaysApply: false
---
---
description:
globs:
alwaysApply: false
---
# Task List Management
Guidelines for managing task lists in markdown files to track progress on completing a PRD
## Task Implementation
- **One sub-task at a time:** Do **NOT** start the next subtask until you ask the user for permission and they say “yes” or "y"
- **Completion protocol:**
1. When you finish a **subtask**, immediately mark it as completed by changing `[ ]` to `[x]`.
2. If **all** subtasks underneath a parent task are now `[x]`, also mark the **parent task** as completed.
- Stop after each subtask and wait for the users goahead.
## Task List Maintenance
1. **Update the task list as you work:**
- Mark tasks and subtasks as completed (`[x]`) per the protocol above.
- Add new tasks as they emerge.
2. **Maintain the “Relevant Files” section:**
- List every file created or modified.
- Give each file a oneline description of its purpose.
## AI Instructions
When working with task lists, the AI must:
1. Regularly update the task list file after finishing any significant work.
2. Follow the completion protocol:
- Mark each finished **subtask** `[x]`.
- Mark the **parent task** `[x]` once **all** its subtasks are `[x]`.
3. Add newly discovered tasks.
4. Keep “Relevant Files” accurate and up to date.
5. Before starting work, check which subtask is next.
6. After implementing a subtask, update the file and then pause for user approval.

1
.python-version Normal file
View File

@ -0,0 +1 @@
3.12

23
.vscode/launch.json vendored Normal file
View File

@ -0,0 +1,23 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
{
"name": "Python Debugger: Default backtest (uv venv)",
"type": "debugpy",
"request": "launch",
"python": "${workspaceFolder}/.venv/bin/python",
"program": "${workspaceFolder}/main.py",
"cwd": "${workspaceFolder}",
"console": "integratedTerminal",
"args": [
"BTC-USDT",
"2025-07-01",
"2025-07-07"
]
}
]
}

7
.vscode/settings.json vendored Normal file
View File

@ -0,0 +1,7 @@
{
"python.testing.pytestArgs": [
"."
],
"python.testing.unittestEnabled": false,
"python.testing.pytestEnabled": true
}

136
README.md
View File

@ -1,2 +1,136 @@
# orderflow_backtest
# Orderflow Backtest System
A high-performance orderbook reconstruction and metrics analysis system for cryptocurrency trading data. Calculates Order Book Imbalance (OBI) and Cumulative Volume Delta (CVD) metrics with per-snapshot granularity.
## Features
- **Orderbook Reconstruction**: Rebuild complete orderbooks from SQLite database files
- **OBI Metrics**: Calculate Order Book Imbalance `(Vb - Va) / (Vb + Va)` per snapshot
- **CVD Metrics**: Track Cumulative Volume Delta with incremental calculation and reset functionality
- **Memory Optimization**: >70% memory reduction through persistent metrics storage
- **Real-time Visualization**: OHLC candlesticks with OBI/CVD curves beneath volume graphs
- **Batch Processing**: High-performance processing of large datasets (months to years of data)
## Quick Start
### Prerequisites
- Python 3.12+
- UV package manager
- SQLite database files with orderbook and trades data
### Installation
```bash
# Install dependencies
uv sync
# Run tests to verify installation
uv run pytest
```
### Basic Usage
```bash
# Process BTC-USDT data from July 1-31, 2025
uv run python main.py BTC-USDT 2025-07-01 2025-08-01
```
## Architecture
### Core Components
- **`models.py`**: Data models (`OrderbookLevel`, `Trade`, `BookSnapshot`, `Book`, `Metric`, `MetricCalculator`)
- **`storage.py`**: Orchestrates orderbook reconstruction and metrics calculation
- **`strategies.py`**: Trading strategy framework with metrics analysis capabilities
- **`visualizer.py`**: Multi-subplot visualization (OHLC, Volume, OBI, CVD)
- **`main.py`**: CLI application entry point
### Data Layer
- **`repositories/sqlite_repository.py`**: Read-only SQLite data access
- **`repositories/sqlite_metrics_repository.py`**: Write-enabled metrics storage and retrieval
- **`parsers/orderbook_parser.py`**: Orderbook text parsing with price caching
### Testing
- **`tests/`**: Comprehensive unit and integration tests
- **Coverage**: 27 tests across 6 test files
- **Run tests**: `uv run pytest`
## Data Flow
1. **Data Loading**: SQLite databases → Repository → Raw orderbook/trades data
2. **Processing**: Storage → MetricCalculator → OBI/CVD calculation per snapshot
3. **Persistence**: Calculated metrics stored in database for future analysis
4. **Analysis**: Strategy loads stored metrics for trading signal generation
5. **Visualization**: Charts display OHLC, volume, OBI, and CVD with shared time axis
## Database Schema
### Input Tables (Required)
```sql
-- Orderbook snapshots
CREATE TABLE book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL, -- JSON array of [price, size, liquidation_count, order_count]
asks TEXT NOT NULL, -- JSON array of [price, size, liquidation_count, order_count]
timestamp INTEGER NOT NULL -- Unix timestamp
);
-- Trade executions
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL, -- "buy" or "sell"
timestamp INTEGER NOT NULL -- Unix timestamp
);
```
### Output Table (Auto-created)
```sql
-- Calculated metrics
CREATE TABLE metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
obi REAL NOT NULL, -- Order Book Imbalance [-1, 1]
cvd REAL NOT NULL, -- Cumulative Volume Delta
best_bid REAL, -- Best bid price
best_ask REAL, -- Best ask price
FOREIGN KEY (snapshot_id) REFERENCES book(id)
);
```
## Performance
- **Memory Usage**: >70% reduction vs. keeping full snapshot history
- **Processing Speed**: Batch processing with optimized SQLite queries
- **Scalability**: Handles months to years of high-frequency data
- **Storage Efficiency**: Metrics table <20% overhead vs. source data
## Development
### Setup
```bash
# Install development dependencies
uv add --dev pytest
# Run linting
uv run pytest --linting
# Run specific test modules
uv run pytest tests/test_storage_metrics.py -v
```
### Project Structure
```
orderflow_backtest/
├── docs/ # Documentation
├── models.py # Core data structures
├── storage.py # Data processing orchestrator
├── strategies.py # Trading strategy framework
├── visualizer.py # Chart rendering
├── main.py # CLI application
├── repositories/ # Data access layer
├── parsers/ # Data parsing utilities
└── tests/ # Test suite
```
For detailed documentation, see [./docs/README.md](./docs/README.md).

689
docs/API.md Normal file
View File

@ -0,0 +1,689 @@
# API Documentation
## Overview
This document provides comprehensive API documentation for the Orderflow Backtest System, including public interfaces, data models, and usage examples.
## Core Data Models
### OrderbookLevel
Represents a single price level in the orderbook.
```python
@dataclass(slots=True)
class OrderbookLevel:
price: float # Price level
size: float # Total size at this price
liquidation_count: int # Number of liquidations
order_count: int # Number of resting orders
```
**Example:**
```python
level = OrderbookLevel(
price=50000.0,
size=10.5,
liquidation_count=0,
order_count=3
)
```
### Trade
Represents a single trade execution.
```python
@dataclass(slots=True)
class Trade:
id: int # Unique trade identifier
trade_id: float # Exchange trade ID
price: float # Execution price
size: float # Trade size
side: str # "buy" or "sell"
timestamp: int # Unix timestamp
```
**Example:**
```python
trade = Trade(
id=1,
trade_id=123456.0,
price=50000.0,
size=0.5,
side="buy",
timestamp=1640995200
)
```
### BookSnapshot
Complete orderbook state at a specific timestamp.
```python
@dataclass
class BookSnapshot:
id: int # Snapshot identifier
timestamp: int # Unix timestamp
bids: Dict[float, OrderbookLevel] # Bid side levels
asks: Dict[float, OrderbookLevel] # Ask side levels
trades: List[Trade] # Associated trades
```
**Example:**
```python
snapshot = BookSnapshot(
id=1,
timestamp=1640995200,
bids={
50000.0: OrderbookLevel(50000.0, 10.0, 0, 1),
49999.0: OrderbookLevel(49999.0, 5.0, 0, 1)
},
asks={
50001.0: OrderbookLevel(50001.0, 3.0, 0, 1),
50002.0: OrderbookLevel(50002.0, 2.0, 0, 1)
},
trades=[]
)
```
### Metric
Calculated financial metrics for a snapshot.
```python
@dataclass(slots=True)
class Metric:
snapshot_id: int # Reference to source snapshot
timestamp: int # Unix timestamp
obi: float # Order Book Imbalance [-1, 1]
cvd: float # Cumulative Volume Delta
best_bid: float | None # Best bid price
best_ask: float | None # Best ask price
```
**Example:**
```python
metric = Metric(
snapshot_id=1,
timestamp=1640995200,
obi=0.333,
cvd=150.5,
best_bid=50000.0,
best_ask=50001.0
)
```
## MetricCalculator API
Static class providing financial metric calculations.
### calculate_obi()
```python
@staticmethod
def calculate_obi(snapshot: BookSnapshot) -> float:
"""
Calculate Order Book Imbalance.
Formula: OBI = (Vb - Va) / (Vb + Va)
Args:
snapshot: BookSnapshot with bids and asks
Returns:
float: OBI value between -1 and 1
Example:
>>> obi = MetricCalculator.calculate_obi(snapshot)
>>> print(f"OBI: {obi:.3f}")
OBI: 0.333
"""
```
### calculate_volume_delta()
```python
@staticmethod
def calculate_volume_delta(trades: List[Trade]) -> float:
"""
Calculate Volume Delta for trades.
Formula: VD = Buy Volume - Sell Volume
Args:
trades: List of Trade objects
Returns:
float: Net volume delta
Example:
>>> vd = MetricCalculator.calculate_volume_delta(trades)
>>> print(f"Volume Delta: {vd}")
Volume Delta: 7.5
"""
```
### calculate_cvd()
```python
@staticmethod
def calculate_cvd(previous_cvd: float, volume_delta: float) -> float:
"""
Calculate Cumulative Volume Delta.
Formula: CVD_t = CVD_{t-1} + VD_t
Args:
previous_cvd: Previous CVD value
volume_delta: Current volume delta
Returns:
float: New CVD value
Example:
>>> cvd = MetricCalculator.calculate_cvd(100.0, 7.5)
>>> print(f"CVD: {cvd}")
CVD: 107.5
"""
```
### get_best_bid_ask()
```python
@staticmethod
def get_best_bid_ask(snapshot: BookSnapshot) -> tuple[float | None, float | None]:
"""
Extract best bid and ask prices.
Args:
snapshot: BookSnapshot with bids and asks
Returns:
tuple: (best_bid, best_ask) or (None, None)
Example:
>>> best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
>>> print(f"Spread: {best_ask - best_bid}")
Spread: 1.0
"""
```
## Repository APIs
### SQLiteOrderflowRepository
Read-only repository for orderbook and trades data.
#### connect()
```python
def connect(self) -> sqlite3.Connection:
"""
Create optimized SQLite connection.
Returns:
sqlite3.Connection: Configured database connection
Example:
>>> repo = SQLiteOrderflowRepository(db_path)
>>> with repo.connect() as conn:
... # Use connection
"""
```
#### load_trades_by_timestamp()
```python
def load_trades_by_timestamp(self, conn: sqlite3.Connection) -> Dict[int, List[Trade]]:
"""
Load all trades grouped by timestamp.
Args:
conn: Active database connection
Returns:
Dict[int, List[Trade]]: Trades grouped by timestamp
Example:
>>> trades_by_ts = repo.load_trades_by_timestamp(conn)
>>> trades_at_1000 = trades_by_ts.get(1000, [])
"""
```
#### iterate_book_rows()
```python
def iterate_book_rows(self, conn: sqlite3.Connection) -> Iterator[Tuple[int, str, str, int]]:
"""
Memory-efficient iteration over orderbook rows.
Args:
conn: Active database connection
Yields:
Tuple[int, str, str, int]: (id, bids_text, asks_text, timestamp)
Example:
>>> for row_id, bids, asks, ts in repo.iterate_book_rows(conn):
... # Process row
"""
```
### SQLiteMetricsRepository
Write-enabled repository for metrics storage and retrieval.
#### create_metrics_table()
```python
def create_metrics_table(self, conn: sqlite3.Connection) -> None:
"""
Create metrics table with indexes.
Args:
conn: Active database connection
Raises:
sqlite3.Error: If table creation fails
Example:
>>> repo.create_metrics_table(conn)
>>> # Metrics table now available
"""
```
#### insert_metrics_batch()
```python
def insert_metrics_batch(self, conn: sqlite3.Connection, metrics: List[Metric]) -> None:
"""
Insert metrics in batch for performance.
Args:
conn: Active database connection
metrics: List of Metric objects to insert
Example:
>>> metrics = [Metric(...), Metric(...)]
>>> repo.insert_metrics_batch(conn, metrics)
>>> conn.commit()
"""
```
#### load_metrics_by_timerange()
```python
def load_metrics_by_timerange(
self,
conn: sqlite3.Connection,
start_timestamp: int,
end_timestamp: int
) -> List[Metric]:
"""
Load metrics within time range.
Args:
conn: Active database connection
start_timestamp: Start time (inclusive)
end_timestamp: End time (inclusive)
Returns:
List[Metric]: Metrics ordered by timestamp
Example:
>>> metrics = repo.load_metrics_by_timerange(conn, 1000, 2000)
>>> print(f"Loaded {len(metrics)} metrics")
"""
```
## Storage API
### Storage
High-level data processing orchestrator.
#### __init__()
```python
def __init__(self, instrument: str) -> None:
"""
Initialize storage for specific instrument.
Args:
instrument: Trading pair identifier (e.g., "BTC-USDT")
Example:
>>> storage = Storage("BTC-USDT")
"""
```
#### build_booktick_from_db()
```python
def build_booktick_from_db(self, db_path: Path, db_date: datetime) -> None:
"""
Process database and calculate metrics.
This is the main processing pipeline that:
1. Loads orderbook and trades data
2. Calculates OBI and CVD metrics per snapshot
3. Stores metrics in database
4. Populates book with snapshots
Args:
db_path: Path to SQLite database file
db_date: Date for this database (informational)
Example:
>>> storage.build_booktick_from_db(Path("data.db"), datetime.now())
>>> print(f"Processed {len(storage.book.snapshots)} snapshots")
"""
```
## Strategy API
### DefaultStrategy
Trading strategy with metrics analysis capabilities.
#### __init__()
```python
def __init__(self, instrument: str) -> None:
"""
Initialize strategy for instrument.
Args:
instrument: Trading pair identifier
Example:
>>> strategy = DefaultStrategy("BTC-USDT")
"""
```
#### set_db_path()
```python
def set_db_path(self, db_path: Path) -> None:
"""
Configure database path for metrics access.
Args:
db_path: Path to database with metrics
Example:
>>> strategy.set_db_path(Path("data.db"))
"""
```
#### load_stored_metrics()
```python
def load_stored_metrics(self, start_timestamp: int, end_timestamp: int) -> List[Metric]:
"""
Load stored metrics for analysis.
Args:
start_timestamp: Start of time range
end_timestamp: End of time range
Returns:
List[Metric]: Metrics for specified range
Example:
>>> metrics = strategy.load_stored_metrics(1000, 2000)
>>> latest_obi = metrics[-1].obi
"""
```
#### get_metrics_summary()
```python
def get_metrics_summary(self, metrics: List[Metric]) -> dict:
"""
Generate statistical summary of metrics.
Args:
metrics: List of metrics to analyze
Returns:
dict: Statistical summary with keys:
- obi_min, obi_max, obi_avg
- cvd_start, cvd_end, cvd_change
- total_snapshots
Example:
>>> summary = strategy.get_metrics_summary(metrics)
>>> print(f"OBI range: {summary['obi_min']:.3f} to {summary['obi_max']:.3f}")
"""
```
## Visualizer API
### Visualizer
Multi-chart visualization system.
#### __init__()
```python
def __init__(self, window_seconds: int = 60, max_bars: int = 200) -> None:
"""
Initialize visualizer with chart parameters.
Args:
window_seconds: OHLC aggregation window
max_bars: Maximum bars to display
Example:
>>> visualizer = Visualizer(window_seconds=300, max_bars=1000)
"""
```
#### set_db_path()
```python
def set_db_path(self, db_path: Path) -> None:
"""
Configure database path for metrics loading.
Args:
db_path: Path to database with metrics
Example:
>>> visualizer.set_db_path(Path("data.db"))
"""
```
#### update_from_book()
```python
def update_from_book(self, book: Book) -> None:
"""
Update charts with book data and stored metrics.
Creates 4-subplot layout:
1. OHLC candlesticks
2. Volume bars
3. OBI line chart
4. CVD line chart
Args:
book: Book with snapshots for OHLC calculation
Example:
>>> visualizer.update_from_book(storage.book)
>>> # Charts updated with latest data
"""
```
#### show()
```python
def show() -> None:
"""
Display interactive chart window.
Example:
>>> visualizer.show()
>>> # Interactive Qt5 window opens
"""
```
## Database Schema
### Input Tables (Required)
These tables must exist in the SQLite database files:
#### book table
```sql
CREATE TABLE book (
id INTEGER PRIMARY KEY,
instrument TEXT,
bids TEXT NOT NULL, -- JSON array: [[price, size, liq_count, order_count], ...]
asks TEXT NOT NULL, -- JSON array: [[price, size, liq_count, order_count], ...]
timestamp TEXT NOT NULL
);
```
#### trades table
```sql
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
instrument TEXT,
trade_id TEXT,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL, -- "buy" or "sell"
timestamp TEXT NOT NULL
);
```
### Output Table (Auto-created)
This table is automatically created by the system:
#### metrics table
```sql
CREATE TABLE metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER NOT NULL,
timestamp TEXT NOT NULL,
obi REAL NOT NULL, -- Order Book Imbalance [-1, 1]
cvd REAL NOT NULL, -- Cumulative Volume Delta
best_bid REAL, -- Best bid price
best_ask REAL, -- Best ask price
FOREIGN KEY (snapshot_id) REFERENCES book(id)
);
-- Performance indexes
CREATE INDEX idx_metrics_timestamp ON metrics(timestamp);
CREATE INDEX idx_metrics_snapshot_id ON metrics(snapshot_id);
```
## Usage Examples
### Complete Processing Workflow
```python
from pathlib import Path
from datetime import datetime
from storage import Storage
from strategies import DefaultStrategy
from visualizer import Visualizer
# Initialize components
storage = Storage("BTC-USDT")
strategy = DefaultStrategy("BTC-USDT")
visualizer = Visualizer(window_seconds=60, max_bars=500)
# Process database
db_path = Path("data/BTC-USDT-25-06-09.db")
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
# Build book and calculate metrics
storage.build_booktick_from_db(db_path, datetime.now())
# Analyze metrics
strategy.on_booktick(storage.book)
# Update visualization
visualizer.update_from_book(storage.book)
visualizer.show()
```
### Metrics Analysis
```python
# Load and analyze stored metrics
strategy = DefaultStrategy("BTC-USDT")
strategy.set_db_path(Path("data.db"))
# Get metrics for specific time range
metrics = strategy.load_stored_metrics(1640995200, 1640998800)
# Analyze metrics
summary = strategy.get_metrics_summary(metrics)
print(f"OBI Range: {summary['obi_min']:.3f} to {summary['obi_max']:.3f}")
print(f"CVD Change: {summary['cvd_change']:.1f}")
# Find significant imbalances
significant_obi = [m for m in metrics if abs(m.obi) > 0.2]
print(f"Found {len(significant_obi)} snapshots with >20% imbalance")
```
### Custom Metric Calculations
```python
from models import MetricCalculator
# Calculate metrics for single snapshot
obi = MetricCalculator.calculate_obi(snapshot)
best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
# Calculate CVD over time
cvd = 0.0
for trades in trades_by_timestamp.values():
volume_delta = MetricCalculator.calculate_volume_delta(trades)
cvd = MetricCalculator.calculate_cvd(cvd, volume_delta)
print(f"CVD: {cvd:.1f}")
```
## Error Handling
### Common Error Scenarios
#### Database Connection Issues
```python
try:
repo = SQLiteMetricsRepository(db_path)
with repo.connect() as conn:
metrics = repo.load_metrics_by_timerange(conn, start, end)
except sqlite3.Error as e:
logging.error(f"Database error: {e}")
metrics = [] # Fallback to empty list
```
#### Missing Metrics Table
```python
repo = SQLiteMetricsRepository(db_path)
with repo.connect() as conn:
if not repo.table_exists(conn, "metrics"):
repo.create_metrics_table(conn)
logging.info("Created metrics table")
```
#### Empty Data Handling
```python
# All methods handle empty data gracefully
obi = MetricCalculator.calculate_obi(empty_snapshot) # Returns 0.0
vd = MetricCalculator.calculate_volume_delta([]) # Returns 0.0
summary = strategy.get_metrics_summary([]) # Returns {}
```
---
This API documentation provides complete coverage of the public interfaces for the Orderflow Backtest System. For implementation details and architecture information, see the additional documentation in the `docs/` directory.

143
docs/CHANGELOG.md Normal file
View File

@ -0,0 +1,143 @@
# Changelog
All notable changes to the Orderflow Backtest System are documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [2.0.0] - 2024-Current
### Added
- **OBI Metrics Calculation**: Order Book Imbalance calculation with formula `(Vb - Va) / (Vb + Va)`
- **CVD Metrics Calculation**: Cumulative Volume Delta with incremental calculation and reset functionality
- **Persistent Metrics Storage**: SQLite-based storage for calculated metrics to avoid recalculation
- **Memory Optimization**: >70% reduction in peak memory usage through streaming processing
- **Enhanced Visualization**: Multi-subplot charts with OHLC, Volume, OBI, and CVD displays
- **Metrics Repository**: `SQLiteMetricsRepository` for write-enabled database operations
- **MetricCalculator Class**: Static methods for financial metrics computation
- **Batch Processing**: High-performance batch inserts (1000 records per operation)
- **Time-Range Queries**: Efficient metrics retrieval for specified time periods
- **Strategy Enhancement**: Metrics analysis capabilities in `DefaultStrategy`
- **Comprehensive Testing**: 27 tests across 6 test files with full integration coverage
### Changed
- **Storage Architecture**: Modified `Storage.build_booktick_from_db()` to integrate metrics calculation
- **Visualization Separation**: Moved visualization from strategy to main application for better separation of concerns
- **Strategy Interface**: Simplified `DefaultStrategy` constructor (removed `enable_visualization` parameter)
- **Main Application Flow**: Enhanced orchestration with per-database visualization updates
- **Database Schema**: Auto-creation of metrics table with proper indexes and foreign key constraints
- **Memory Management**: Stream processing instead of keeping full snapshot history
### Improved
- **Performance**: Batch database operations and optimized SQLite PRAGMAs
- **Scalability**: Support for months to years of high-frequency trading data
- **Code Quality**: All functions <50 lines, all files <250 lines
- **Documentation**: Comprehensive module and API documentation
- **Error Handling**: Graceful degradation and comprehensive logging
- **Type Safety**: Full type annotations throughout codebase
### Technical Details
- **New Tables**: `metrics` table with indexes on timestamp and snapshot_id
- **New Models**: `Metric` dataclass for calculated values
- **Processing Pipeline**: Snapshot → Calculate → Store → Discard workflow
- **Query Interface**: Time-range based metrics retrieval
- **Visualization Layout**: 4-subplot layout with shared time axis
## [1.0.0] - Previous Version
### Features
- **Orderbook Reconstruction**: Build complete orderbooks from SQLite database files
- **Data Models**: Core structures for `OrderbookLevel`, `Trade`, `BookSnapshot`, `Book`
- **SQLite Repository**: Read-only data access for orderbook and trades data
- **Orderbook Parser**: Text parsing with price caching optimization
- **Storage Orchestration**: High-level facade for book building
- **Basic Visualization**: OHLC candlestick charts with Qt5Agg backend
- **Strategy Framework**: Basic strategy pattern with `DefaultStrategy`
- **CLI Interface**: Command-line application for date range processing
- **Test Suite**: Unit and integration tests
### Architecture
- **Repository Pattern**: Clean separation of data access logic
- **Dataclass Models**: Lightweight, type-safe data structures
- **Parser Optimization**: Price caching for performance
- **Modular Design**: Clear separation between components
---
## Migration Guide
### Upgrading from v1.0.0 to v2.0.0
#### Code Changes Required
1. **Strategy Constructor**
```python
# Before (v1.0.0)
strategy = DefaultStrategy("BTC-USDT", enable_visualization=True)
# After (v2.0.0)
strategy = DefaultStrategy("BTC-USDT")
visualizer = Visualizer(window_seconds=60, max_bars=500)
```
2. **Main Application Flow**
```python
# Before (v1.0.0)
strategy = DefaultStrategy(instrument, enable_visualization=True)
storage.build_booktick_from_db(db_path, db_date)
strategy.on_booktick(storage.book)
# After (v2.0.0)
strategy = DefaultStrategy(instrument)
visualizer = Visualizer(window_seconds=60, max_bars=500)
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
storage.build_booktick_from_db(db_path, db_date)
strategy.on_booktick(storage.book)
visualizer.update_from_book(storage.book)
```
#### Database Migration
- **Automatic**: Metrics table created automatically on first run
- **No Data Loss**: Existing orderbook and trades data unchanged
- **Schema Addition**: New `metrics` table with indexes added to existing databases
#### Benefits of Upgrading
- **Memory Efficiency**: >70% reduction in memory usage
- **Performance**: Faster processing through persistent metrics storage
- **Enhanced Analysis**: Access to OBI and CVD financial indicators
- **Better Visualization**: Multi-chart display with synchronized time axis
- **Improved Architecture**: Cleaner separation of concerns
#### Testing Migration
```bash
# Verify upgrade compatibility
uv run pytest tests/test_main_integration.py -v
# Test new metrics functionality
uv run pytest tests/test_storage_metrics.py -v
# Validate visualization separation
uv run pytest tests/test_main_visualization.py -v
```
---
## Development Notes
### Performance Improvements
- **v2.0.0**: >70% memory reduction, batch processing, persistent storage
- **v1.0.0**: In-memory processing, real-time calculations
### Architecture Evolution
- **v2.0.0**: Streaming processing with metrics storage, separated visualization
- **v1.0.0**: Full snapshot retention, integrated visualization in strategies
### Testing Coverage
- **v2.0.0**: 27 tests across 6 files, integration and unit coverage
- **v1.0.0**: Basic unit tests for core components
---
*For detailed technical documentation, see [docs/](../docs/) directory.*

163
docs/CONTEXT.md Normal file
View File

@ -0,0 +1,163 @@
# Project Context
## Current State
The Orderflow Backtest System has successfully implemented a comprehensive OBI (Order Book Imbalance) and CVD (Cumulative Volume Delta) metrics calculation and visualization system. The project is in a production-ready state with full feature completion.
## Recent Achievements
### ✅ Completed Features (Latest Implementation)
- **Metrics Calculation Engine**: Complete OBI and CVD calculation with per-snapshot granularity
- **Persistent Storage**: Metrics stored in SQLite database to avoid recalculation
- **Memory Optimization**: >70% memory usage reduction through efficient data management
- **Visualization System**: Multi-subplot charts (OHLC, Volume, OBI, CVD) with shared time axis
- **Strategy Framework**: Enhanced trading strategy system with metrics analysis
- **Clean Architecture**: Proper separation of concerns between data, analysis, and visualization
### 📊 System Metrics
- **Performance**: Batch processing of 1000 records per operation
- **Memory**: >70% reduction in peak memory usage
- **Test Coverage**: 27 comprehensive tests across 6 test files
- **Code Quality**: All functions <50 lines, all files <250 lines
## Architecture Decisions
### Key Design Patterns
1. **Repository Pattern**: Clean separation between data access and business logic
2. **Dataclass Models**: Lightweight, type-safe data structures with slots optimization
3. **Batch Processing**: High-performance database operations for large datasets
4. **Separation of Concerns**: Strategy, Storage, and Visualization as independent components
### Technology Stack
- **Language**: Python 3.12+ with type hints
- **Database**: SQLite with optimized PRAGMAs for performance
- **Package Management**: UV for fast dependency resolution
- **Testing**: Pytest with comprehensive unit and integration tests
- **Visualization**: Matplotlib with Qt5Agg backend
## Current Development Priorities
### ✅ Completed (Production Ready)
1. **Core Metrics System**: OBI and CVD calculation infrastructure
2. **Database Integration**: Persistent storage and retrieval system
3. **Visualization Framework**: Multi-chart display with proper time alignment
4. **Memory Optimization**: Efficient processing of large datasets
5. **Code Quality**: Comprehensive testing and documentation
### 🔄 Maintenance Phase
- **Documentation**: Comprehensive docs completed
- **Testing**: Full test coverage maintained
- **Performance**: Monitoring and optimization as needed
- **Bug Fixes**: Address any issues discovered in production use
## Known Patterns and Conventions
### Code Style
- **Functions**: Maximum 50 lines, single responsibility
- **Files**: Maximum 250 lines, clear module boundaries
- **Naming**: Descriptive names, no abbreviations except domain terms (OBI, CVD)
- **Error Handling**: Comprehensive try-catch with logging, graceful degradation
### Database Patterns
- **Parameterized Queries**: All SQL uses proper parameterization for security
- **Batch Operations**: Process records in batches of 1000 for performance
- **Indexing**: Strategic indexes on timestamp and foreign key columns
- **Transactions**: Proper transaction boundaries for data consistency
### Testing Patterns
- **Unit Tests**: Each module has comprehensive unit test coverage
- **Integration Tests**: End-to-end workflow testing
- **Mock Objects**: External dependencies mocked for isolated testing
- **Test Data**: Temporary databases with realistic test data
## Integration Points
### External Dependencies
- **SQLite**: Primary data storage (read and write operations)
- **Matplotlib**: Chart rendering and visualization
- **Qt5Agg**: GUI backend for interactive charts
- **Pytest**: Testing framework
### Internal Module Dependencies
```
main.py → storage.py → repositories/ → models.py
→ strategies.py → models.py
→ visualizer.py → repositories/
```
## Performance Characteristics
### Optimizations Implemented
- **Memory Management**: Metrics storage instead of full snapshot retention
- **Database Performance**: Optimized SQLite PRAGMAs and batch processing
- **Query Efficiency**: Indexed queries with proper WHERE clauses
- **Cache Usage**: Price caching in orderbook parser for repeated calculations
### Scalability Notes
- **Dataset Size**: Tested with 600K+ snapshots and 300K+ trades per day
- **Time Range**: Supports months to years of historical data
- **Processing Speed**: ~1000 rows/second with full metrics calculation
- **Storage Overhead**: Metrics table adds <20% to original database size
## Security Considerations
### Implemented Safeguards
- **SQL Injection Prevention**: All queries use parameterized statements
- **Input Validation**: Database paths and table names validated
- **Error Information**: No sensitive data exposed in error messages
- **Access Control**: Database file permissions respected
## Future Considerations
### Potential Enhancements
- **Real-time Processing**: Streaming data support for live trading
- **Additional Metrics**: Volume Profile, Delta Flow, Liquidity metrics
- **Export Capabilities**: CSV/JSON export for external analysis
- **Interactive Charts**: Enhanced user interaction with visualization
- **Configuration System**: Configurable batch sizes and processing parameters
### Scalability Options
- **Database Upgrade**: PostgreSQL for larger datasets if needed
- **Parallel Processing**: Multi-threading for CPU-intensive calculations
- **Caching Layer**: Redis for frequently accessed metrics
- **API Interface**: REST API for external system integration
## Development Environment
### Requirements
- Python 3.12+
- UV package manager
- SQLite database files with required schema
- Qt5 for visualization (Linux/macOS)
### Setup Commands
```bash
# Install dependencies
uv sync
# Run full test suite
uv run pytest
# Process sample data
uv run python main.py BTC-USDT 2025-07-01 2025-08-01
```
## Documentation Status
### ✅ Complete Documentation
- README.md with comprehensive overview
- Module-level documentation for all components
- API documentation with examples
- Architecture decision records
- Code-level documentation with docstrings
### 📊 Quality Metrics
- **Code Coverage**: 27 tests across 6 test files
- **Documentation Coverage**: All public interfaces documented
- **Example Coverage**: Working examples for all major features
- **Error Documentation**: All error conditions documented
---
*Last Updated: Current as of OBI/CVD metrics system completion*
*Next Review: As needed for maintenance or feature additions*

306
docs/CONTRIBUTING.md Normal file
View File

@ -0,0 +1,306 @@
# Contributing to Orderflow Backtest System
## Development Guidelines
Thank you for your interest in contributing to the Orderflow Backtest System. This document outlines the development process, coding standards, and best practices for maintaining code quality.
## Development Environment Setup
### Prerequisites
- **Python**: 3.12 or higher
- **Package Manager**: UV (recommended) or pip
- **Database**: SQLite 3.x
- **GUI**: Qt5 for visualization (Linux/macOS)
### Installation
```bash
# Clone the repository
git clone <repository-url>
cd orderflow_backtest
# Install dependencies
uv sync
# Install development dependencies
uv add --dev pytest coverage mypy
# Verify installation
uv run pytest
```
### Development Tools
```bash
# Run tests
uv run pytest
# Run tests with coverage
uv run pytest --cov=. --cov-report=html
# Run type checking
uv run mypy .
# Run specific test module
uv run pytest tests/test_storage_metrics.py -v
```
## Code Standards
### Function and File Size Limits
- **Functions**: Maximum 50 lines
- **Files**: Maximum 250 lines
- **Classes**: Single responsibility, clear purpose
- **Methods**: One main function per method
### Naming Conventions
```python
# Good examples
def calculate_order_book_imbalance(snapshot: BookSnapshot) -> float:
def load_metrics_by_timerange(start: int, end: int) -> List[Metric]:
class MetricCalculator:
class SQLiteMetricsRepository:
# Avoid abbreviations except domain terms
# Good: OBI, CVD (standard financial terms)
# Avoid: calc_obi, proc_data, mgr
```
### Type Annotations
```python
# Required for all public interfaces
def process_trades(trades: List[Trade]) -> Dict[int, float]:
"""Process trades and return volume by timestamp."""
class Storage:
def __init__(self, instrument: str) -> None:
self.instrument = instrument
```
### Documentation Standards
```python
def calculate_metrics(snapshot: BookSnapshot, trades: List[Trade]) -> Metric:
"""
Calculate OBI and CVD metrics for a snapshot.
Args:
snapshot: Orderbook state at specific timestamp
trades: List of trades executed at this timestamp
Returns:
Metric: Calculated OBI, CVD, and best bid/ask values
Raises:
ValueError: If snapshot contains invalid data
Example:
>>> snapshot = BookSnapshot(...)
>>> trades = [Trade(...), ...]
>>> metric = calculate_metrics(snapshot, trades)
>>> print(f"OBI: {metric.obi:.3f}")
OBI: 0.333
"""
```
## Architecture Principles
### Separation of Concerns
- **Storage**: Data processing and persistence only
- **Strategy**: Trading analysis and signal generation only
- **Visualizer**: Chart rendering and display only
- **Main**: Application orchestration and flow control
### Repository Pattern
```python
# Good: Clean interface
class SQLiteMetricsRepository:
def load_metrics_by_timerange(self, conn: Connection, start: int, end: int) -> List[Metric]:
# Implementation details hidden
# Avoid: Direct SQL in business logic
def analyze_strategy(db_path: Path):
# Don't do this
conn = sqlite3.connect(db_path)
cursor = conn.execute("SELECT * FROM metrics WHERE ...")
```
### Error Handling
```python
# Required pattern
try:
result = risky_operation()
return process_result(result)
except SpecificException as e:
logging.error(f"Operation failed: {e}")
return default_value
except Exception as e:
logging.error(f"Unexpected error in operation: {e}")
raise
```
## Testing Requirements
### Test Coverage
- **Unit Tests**: All public methods must have unit tests
- **Integration Tests**: End-to-end workflow testing required
- **Edge Cases**: Handle empty data, boundary conditions, error scenarios
### Test Structure
```python
def test_feature_description():
"""Test that feature behaves correctly under normal conditions."""
# Arrange
test_data = create_test_data()
# Act
result = function_under_test(test_data)
# Assert
assert result.expected_property == expected_value
assert len(result.collection) == expected_count
```
### Test Data Management
```python
# Use temporary files for database tests
def test_database_operation():
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Test implementation
pass
finally:
db_path.unlink(missing_ok=True)
```
## Database Development
### Schema Changes
1. **Create Migration**: Document schema changes in ADR format
2. **Backward Compatibility**: Ensure existing databases continue to work
3. **Auto-Migration**: Implement automatic schema updates where possible
4. **Performance**: Add appropriate indexes for new queries
### Query Patterns
```python
# Good: Parameterized queries
cursor.execute(
"SELECT obi, cvd FROM metrics WHERE timestamp >= ? AND timestamp <= ?",
(start_timestamp, end_timestamp)
)
# Bad: String formatting (security risk)
query = f"SELECT * FROM metrics WHERE timestamp = {timestamp}"
```
### Performance Guidelines
- **Batch Operations**: Process in batches of 1000 records
- **Indexes**: Add indexes for frequently queried columns
- **Transactions**: Use transactions for multi-record operations
- **Connection Management**: Caller manages connection lifecycle
## Performance Requirements
### Memory Management
- **Target**: >70% memory reduction vs. full snapshot retention
- **Measurement**: Profile memory usage with large datasets
- **Optimization**: Stream processing, batch operations, minimal object retention
### Processing Speed
- **Target**: >500 snapshots/second processing rate
- **Measurement**: Benchmark with realistic datasets
- **Optimization**: Database batching, efficient algorithms, minimal I/O
### Storage Efficiency
- **Target**: <25% storage overhead for metrics
- **Measurement**: Compare metrics table size to source data
- **Optimization**: Efficient data types, minimal redundancy
## Submission Process
### Before Submitting
1. **Run Tests**: Ensure all tests pass
```bash
uv run pytest
```
2. **Check Type Hints**: Verify type annotations
```bash
uv run mypy .
```
3. **Test Coverage**: Ensure adequate test coverage
```bash
uv run pytest --cov=. --cov-report=term-missing
```
4. **Documentation**: Update relevant documentation files
### Pull Request Guidelines
- **Description**: Clear description of changes and motivation
- **Testing**: Include tests for new functionality
- **Documentation**: Update docs for API changes
- **Breaking Changes**: Document any breaking changes
- **Performance**: Include performance impact analysis for significant changes
### Code Review Checklist
- [ ] Follows function/file size limits
- [ ] Has comprehensive test coverage
- [ ] Includes proper error handling
- [ ] Uses type annotations consistently
- [ ] Maintains backward compatibility
- [ ] Updates relevant documentation
- [ ] No security vulnerabilities (SQL injection, etc.)
- [ ] Performance impact analyzed
## Documentation Maintenance
### When to Update Documentation
- **API Changes**: Any modification to public interfaces
- **Architecture Changes**: New patterns, data structures, or workflows
- **Performance Changes**: Significant performance improvements or regressions
- **Feature Additions**: New capabilities or metrics
### Documentation Types
- **Code Comments**: Complex algorithms and business logic
- **Docstrings**: All public functions and classes
- **Module Documentation**: Purpose and usage examples
- **Architecture Documentation**: System design and component relationships
## Getting Help
### Resources
- **Architecture Overview**: `docs/architecture.md`
- **API Documentation**: `docs/API.md`
- **Module Documentation**: `docs/modules/`
- **Decision Records**: `docs/decisions/`
### Communication
- **Issues**: Use GitHub issues for bug reports and feature requests
- **Discussions**: Use GitHub discussions for questions and design discussions
- **Code Review**: Comment on pull requests for specific code feedback
---
## Development Workflow
### Feature Development
1. **Create Branch**: Feature-specific branch from main
2. **Develop**: Follow coding standards and test requirements
3. **Test**: Comprehensive testing including edge cases
4. **Document**: Update relevant documentation
5. **Review**: Submit pull request for code review
6. **Merge**: Merge after approval and CI success
### Bug Fixes
1. **Reproduce**: Create test that reproduces the bug
2. **Fix**: Implement minimal fix addressing root cause
3. **Verify**: Ensure fix resolves issue without regressions
4. **Test**: Add regression test to prevent future occurrences
### Performance Improvements
1. **Benchmark**: Establish baseline performance metrics
2. **Optimize**: Implement performance improvements
3. **Measure**: Verify performance gains with benchmarks
4. **Document**: Update performance characteristics in docs
Thank you for contributing to the Orderflow Backtest System! Your contributions help make this a better tool for cryptocurrency trading analysis.

51
docs/README.md Normal file
View File

@ -0,0 +1,51 @@
# Orderflow Backtest System Documentation
## Overview
This directory contains comprehensive documentation for the Orderflow Backtest System, a high-performance cryptocurrency trading data analysis platform.
## Documentation Structure
### 📚 Main Documentation
- **[CONTEXT.md](./CONTEXT.md)**: Current project state, architecture decisions, and development patterns
- **[architecture.md](./architecture.md)**: System architecture, component relationships, and data flow
- **[API.md](./API.md)**: Public interfaces, classes, and function documentation
### 📦 Module Documentation
- **[modules/metrics.md](./modules/metrics.md)**: OBI and CVD calculation system
- **[modules/storage.md](./modules/storage.md)**: Data processing and persistence layer
- **[modules/visualization.md](./modules/visualization.md)**: Chart rendering and display system
- **[modules/repositories.md](./modules/repositories.md)**: Database access and operations
### 🏗️ Architecture Decisions
- **[decisions/ADR-001-metrics-storage.md](./decisions/ADR-001-metrics-storage.md)**: Persistent metrics storage decision
- **[decisions/ADR-002-visualization-separation.md](./decisions/ADR-002-visualization-separation.md)**: Separation of concerns for visualization
### 📋 Development Guides
- **[CONTRIBUTING.md](./CONTRIBUTING.md)**: Development workflow and contribution guidelines
- **[CHANGELOG.md](./CHANGELOG.md)**: Version history and changes
## Quick Navigation
| Topic | Documentation |
|-------|---------------|
| **Getting Started** | [README.md](../README.md) |
| **System Architecture** | [architecture.md](./architecture.md) |
| **Metrics Calculation** | [modules/metrics.md](./modules/metrics.md) |
| **Database Schema** | [API.md](./API.md#database-schema) |
| **Development Setup** | [CONTRIBUTING.md](./CONTRIBUTING.md) |
| **API Reference** | [API.md](./API.md) |
## Documentation Standards
This documentation follows the project's documentation standards defined in `.cursor/rules/documentation.mdc`. All documentation includes:
- Clear purpose and scope
- Code examples with working implementations
- API documentation with request/response formats
- Error handling and edge cases
- Dependencies and requirements
## Maintenance
Documentation is updated with every significant code change and reviewed during the development process. See [CONTRIBUTING.md](./CONTRIBUTING.md) for details on documentation maintenance procedures.

307
docs/architecture.md Normal file
View File

@ -0,0 +1,307 @@
# System Architecture
## Overview
The Orderflow Backtest System is designed as a modular, high-performance data processing pipeline for cryptocurrency trading analysis. The architecture emphasizes separation of concerns, efficient memory usage, and scalable processing of large datasets.
## High-Level Architecture
```
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ Data Sources │ │ Processing │ │ Presentation │
│ │ │ │ │ │
│ ┌─────────────┐ │ │ ┌──────────────┐ │ │ ┌─────────────┐ │
│ │SQLite Files │─┼────┼→│ Storage │─┼────┼→│ Visualizer │ │
│ │- orderbook │ │ │ │- Orchestrator│ │ │ │- OHLC Charts│ │
│ │- trades │ │ │ │- Calculator │ │ │ │- OBI/CVD │ │
│ └─────────────┘ │ │ └──────────────┘ │ │ └─────────────┘ │
│ │ │ │ │ │ ▲ │
└─────────────────┘ │ ┌─────────────┐ │ │ ┌─────────────┐ │
│ │ Strategy │──┼────┼→│ Reports │ │
│ │- Analysis │ │ │ │- Metrics │ │
│ │- Alerts │ │ │ │- Summaries │ │
│ └─────────────┘ │ │ └─────────────┘ │
└──────────────────┘ └─────────────────┘
```
## Component Architecture
### Data Layer
#### Models (`models.py`)
**Purpose**: Core data structures and calculation logic
```python
# Core data models
OrderbookLevel # Single price level (price, size, order_count, liquidation_count)
Trade # Individual trade execution (price, size, side, timestamp)
BookSnapshot # Complete orderbook state at timestamp
Book # Container for snapshot sequence
Metric # Calculated OBI/CVD values
# Calculation engine
MetricCalculator # Static methods for OBI/CVD computation
```
**Relationships**:
- `Book` contains multiple `BookSnapshot` instances
- `BookSnapshot` contains dictionaries of `OrderbookLevel` and lists of `Trade`
- `Metric` stores calculated values for each `BookSnapshot`
- `MetricCalculator` operates on snapshots to produce metrics
#### Repositories (`repositories/`)
**Purpose**: Database access and persistence layer
```python
# Read-only base repository
SQLiteOrderflowRepository:
- connect() # Optimized SQLite connection
- load_trades_by_timestamp() # Efficient trade loading
- iterate_book_rows() # Memory-efficient snapshot streaming
- count_rows() # Performance monitoring
# Write-enabled metrics repository
SQLiteMetricsRepository:
- create_metrics_table() # Schema creation
- insert_metrics_batch() # High-performance batch inserts
- load_metrics_by_timerange() # Time-range queries
- table_exists() # Schema validation
```
**Design Patterns**:
- **Repository Pattern**: Clean separation between data access and business logic
- **Batch Processing**: Process 1000 records per database operation
- **Connection Management**: Caller manages connection lifecycle
- **Performance Optimization**: SQLite PRAGMAs for high-speed operations
### Processing Layer
#### Storage (`storage.py`)
**Purpose**: Orchestrates data loading, processing, and metrics calculation
```python
class Storage:
- build_booktick_from_db() # Main processing pipeline
- _create_snapshots_and_metrics() # Per-snapshot processing
- _snapshot_from_row() # Individual snapshot creation
```
**Processing Pipeline**:
1. **Initialize**: Create metrics repository and table if needed
2. **Load Trades**: Group trades by timestamp for efficient access
3. **Stream Processing**: Process snapshots one-by-one to minimize memory
4. **Calculate Metrics**: OBI and CVD calculation per snapshot
5. **Batch Persistence**: Store metrics in batches of 1000
6. **Memory Management**: Discard full snapshots after metric extraction
#### Strategy Framework (`strategies.py`)
**Purpose**: Trading analysis and signal generation
```python
class DefaultStrategy:
- set_db_path() # Configure database access
- compute_OBI() # Real-time OBI calculation (fallback)
- load_stored_metrics() # Retrieve persisted metrics
- get_metrics_summary() # Statistical analysis
- on_booktick() # Main analysis entry point
```
**Analysis Capabilities**:
- **Stored Metrics**: Primary analysis using persisted data
- **Real-time Fallback**: Live calculation for compatibility
- **Statistical Summaries**: Min/max/average OBI, CVD changes
- **Alert System**: Configurable thresholds for significant imbalances
### Presentation Layer
#### Visualization (`visualizer.py`)
**Purpose**: Multi-chart rendering and display
```python
class Visualizer:
- set_db_path() # Configure metrics access
- update_from_book() # Main rendering pipeline
- _load_stored_metrics() # Retrieve metrics for chart range
- _draw() # Multi-subplot rendering
- show() # Display interactive charts
```
**Chart Layout**:
```
┌─────────────────────────────────────┐
│ OHLC Candlesticks │ ← Price action
├─────────────────────────────────────┤
│ Volume Bars │ ← Trading volume
├─────────────────────────────────────┤
│ OBI Line Chart │ ← Order book imbalance
├─────────────────────────────────────┤
│ CVD Line Chart │ ← Cumulative volume delta
└─────────────────────────────────────┘
```
**Features**:
- **Shared Time Axis**: Synchronized X-axis across all subplots
- **Auto-scaling**: Y-axis optimization for each metric type
- **Performance**: Efficient rendering of large datasets
- **Interactive**: Qt5Agg backend for zooming and panning
## Data Flow
### Processing Flow
```
1. SQLite DB → Repository → Raw Data
2. Raw Data → Storage → BookSnapshot
3. BookSnapshot → MetricCalculator → OBI/CVD
4. Metrics → Repository → Database Storage
5. Stored Metrics → Strategy → Analysis
6. Stored Metrics → Visualizer → Charts
```
### Memory Management Flow
```
Traditional: DB → All Snapshots in Memory → Analysis (High Memory)
Optimized: DB → Process Snapshot → Calculate Metrics → Store → Discard (Low Memory)
```
## Database Schema
### Input Schema (Required)
```sql
-- Orderbook snapshots
CREATE TABLE book (
id INTEGER PRIMARY KEY,
instrument TEXT,
bids TEXT, -- JSON: [[price, size, liq_count, order_count], ...]
asks TEXT, -- JSON: [[price, size, liq_count, order_count], ...]
timestamp TEXT
);
-- Trade executions
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
instrument TEXT,
trade_id TEXT,
price REAL,
size REAL,
side TEXT, -- "buy" or "sell"
timestamp TEXT
);
```
### Output Schema (Auto-created)
```sql
-- Calculated metrics
CREATE TABLE metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER,
timestamp TEXT,
obi REAL, -- Order Book Imbalance [-1, 1]
cvd REAL, -- Cumulative Volume Delta
best_bid REAL,
best_ask REAL,
FOREIGN KEY (snapshot_id) REFERENCES book(id)
);
-- Performance indexes
CREATE INDEX idx_metrics_timestamp ON metrics(timestamp);
CREATE INDEX idx_metrics_snapshot_id ON metrics(snapshot_id);
```
## Performance Characteristics
### Memory Optimization
- **Before**: Store all snapshots in memory (~1GB for 600K snapshots)
- **After**: Store only metrics data (~300MB for same dataset)
- **Reduction**: >70% memory usage decrease
### Processing Performance
- **Batch Size**: 1000 records per database operation
- **Processing Speed**: ~1000 snapshots/second on modern hardware
- **Database Overhead**: <20% storage increase for metrics table
- **Query Performance**: Sub-second retrieval for typical time ranges
### Scalability Limits
- **Single File**: 1M+ snapshots per database file
- **Time Range**: Months to years of historical data
- **Memory Peak**: <2GB for year-long datasets
- **Disk Space**: Original size + 20% for metrics
## Integration Points
### External Interfaces
```python
# Main application entry point
main.py:
- CLI argument parsing
- Database file discovery
- Component orchestration
- Progress monitoring
# Plugin interfaces
Strategy.on_booktick(book: Book) # Strategy integration point
Visualizer.update_from_book(book) # Visualization integration
```
### Internal Interfaces
```python
# Repository interfaces
Repository.connect() → Connection
Repository.load_data() → TypedData
Repository.store_data(data) → None
# Calculator interfaces
MetricCalculator.calculate_obi(snapshot) → float
MetricCalculator.calculate_cvd(prev_cvd, trades) → float
```
## Security Considerations
### Data Protection
- **SQL Injection**: All queries use parameterized statements
- **File Access**: Validates database file paths and permissions
- **Error Handling**: No sensitive data in error messages
- **Input Validation**: Sanitizes all external inputs
### Access Control
- **Database**: Respects file system permissions
- **Memory**: No sensitive data persistence beyond processing
- **Logging**: Configurable log levels without data exposure
## Configuration Management
### Performance Tuning
```python
# Storage configuration
BATCH_SIZE = 1000 # Records per database operation
LOG_FREQUENCY = 20 # Progress reports per processing run
# SQLite optimization
PRAGMA journal_mode = OFF # Maximum write performance
PRAGMA synchronous = OFF # Disable synchronous writes
PRAGMA cache_size = 100000 # Large memory cache
```
### Visualization Settings
```python
# Chart configuration
WINDOW_SECONDS = 60 # OHLC aggregation window
MAX_BARS = 500 # Maximum bars displayed
FIGURE_SIZE = (12, 10) # Chart dimensions
```
## Error Handling Strategy
### Graceful Degradation
- **Database Errors**: Continue with reduced functionality
- **Calculation Errors**: Skip problematic snapshots with logging
- **Visualization Errors**: Display available data, note issues
- **Memory Pressure**: Adjust batch sizes automatically
### Recovery Mechanisms
- **Partial Processing**: Resume from last successful batch
- **Data Validation**: Verify metrics calculations before storage
- **Rollback Support**: Transaction boundaries for data consistency
---
This architecture provides a robust, scalable foundation for high-frequency trading data analysis while maintaining clean separation of concerns and efficient resource utilization.

View File

@ -0,0 +1,120 @@
# ADR-001: Persistent Metrics Storage
## Status
Accepted
## Context
The original orderflow backtest system kept all orderbook snapshots in memory during processing, leading to excessive memory usage (>1GB for typical datasets). With the addition of OBI and CVD metrics calculation, we needed to decide how to handle the computed metrics and manage memory efficiently.
## Decision
We will implement persistent storage of calculated metrics in the SQLite database with the following approach:
1. **Metrics Table**: Create a dedicated `metrics` table to store OBI, CVD, and related data
2. **Streaming Processing**: Process snapshots one-by-one, calculate metrics, store results, then discard snapshots
3. **Batch Operations**: Use batch inserts (1000 records) for optimal database performance
4. **Query Interface**: Provide time-range queries for metrics retrieval and analysis
## Consequences
### Positive
- **Memory Reduction**: >70% reduction in peak memory usage during processing
- **Avoid Recalculation**: Metrics calculated once and reused for multiple analysis runs
- **Scalability**: Can process months/years of data without memory constraints
- **Performance**: Batch database operations provide high throughput
- **Persistence**: Metrics survive between application runs
- **Analysis Ready**: Stored metrics enable complex time-series analysis
### Negative
- **Storage Overhead**: Metrics table adds ~20% to database size
- **Complexity**: Additional database schema and management code
- **Dependencies**: Tighter coupling between processing and database layer
- **Migration**: Existing databases need schema updates for metrics table
## Alternatives Considered
### Option 1: Keep All Snapshots in Memory
**Rejected**: Unsustainable memory usage for large datasets. Would limit analysis to small time ranges.
### Option 2: Calculate Metrics On-Demand
**Rejected**: Recalculating metrics for every analysis run is computationally expensive and time-consuming.
### Option 3: External Metrics Database
**Rejected**: Adds deployment complexity. SQLite co-location provides better performance and simpler management.
### Option 4: Compressed In-Memory Cache
**Rejected**: Still faces fundamental memory scaling issues. Compression/decompression adds CPU overhead.
## Implementation Details
### Database Schema
```sql
CREATE TABLE metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER NOT NULL,
timestamp TEXT NOT NULL,
obi REAL NOT NULL,
cvd REAL NOT NULL,
best_bid REAL,
best_ask REAL,
FOREIGN KEY (snapshot_id) REFERENCES book(id)
);
CREATE INDEX idx_metrics_timestamp ON metrics(timestamp);
CREATE INDEX idx_metrics_snapshot_id ON metrics(snapshot_id);
```
### Processing Pipeline
1. Create metrics table if not exists
2. Stream through orderbook snapshots
3. For each snapshot:
- Calculate OBI and CVD metrics
- Batch store metrics (1000 records per commit)
- Discard snapshot from memory
4. Provide query interface for time-range retrieval
### Memory Management
- **Before**: Store all snapshots → Calculate on demand → High memory usage
- **After**: Stream snapshots → Calculate immediately → Store metrics → Low memory usage
## Migration Strategy
### Backward Compatibility
- Existing databases continue to work without metrics table
- System auto-creates metrics table on first processing run
- Fallback to real-time calculation if metrics unavailable
### Performance Impact
- **Processing Time**: Slight increase due to database writes (~10%)
- **Query Performance**: Significant improvement for repeated analysis
- **Overall**: Net positive performance for typical usage patterns
## Monitoring and Validation
### Success Metrics
- **Memory Usage**: Target >70% reduction in peak memory usage
- **Processing Speed**: Maintain >500 snapshots/second processing rate
- **Storage Efficiency**: Metrics table <25% of total database size
- **Query Performance**: <1 second retrieval for typical time ranges
### Validation Methods
- Memory profiling during large dataset processing
- Performance benchmarks vs. original system
- Storage overhead analysis across different dataset sizes
- Query performance testing with various time ranges
## Future Considerations
### Potential Enhancements
- **Compression**: Consider compression for metrics storage if overhead becomes significant
- **Partitioning**: Time-based partitioning for very large datasets
- **Caching**: In-memory cache for frequently accessed metrics
- **Export**: Direct export capabilities for external analysis tools
### Scalability Options
- **Database Upgrade**: PostgreSQL if SQLite becomes limiting factor
- **Parallel Processing**: Multi-threaded metrics calculation
- **Distributed Storage**: For institutional-scale datasets
---
This decision provides a solid foundation for efficient, scalable metrics processing while maintaining simplicity and performance characteristics suitable for the target use cases.

View File

@ -0,0 +1,217 @@
# ADR-002: Separation of Visualization from Strategy
## Status
Accepted
## Context
The original system embedded visualization functionality within the `DefaultStrategy` class, creating tight coupling between trading analysis logic and chart rendering. This design had several issues:
1. **Mixed Responsibilities**: Strategy classes handled both trading logic and GUI operations
2. **Testing Complexity**: Strategy tests required mocking GUI components
3. **Deployment Flexibility**: Strategies couldn't run in headless environments
4. **Timing Control**: Visualization timing was tied to strategy execution rather than application flow
The user specifically requested to display visualizations after processing each database file, requiring better control over visualization timing.
## Decision
We will separate visualization from strategy components with the following architecture:
1. **Remove Visualization from Strategy**: Strategy classes focus solely on trading analysis
2. **Main Application Control**: `main.py` orchestrates visualization timing and updates
3. **Independent Configuration**: Strategy and Visualizer get database paths independently
4. **Clean Interfaces**: No direct dependencies between strategy and visualization components
## Consequences
### Positive
- **Single Responsibility**: Strategy focuses on trading logic, Visualizer on charts
- **Better Testability**: Strategy tests run without GUI dependencies
- **Flexible Deployment**: Strategies can run in headless/server environments
- **Timing Control**: Visualization updates precisely when needed (after each DB)
- **Maintainability**: Changes to visualization don't affect strategy logic
- **Performance**: No GUI overhead during strategy analysis
### Negative
- **Increased Complexity**: Main application handles more orchestration logic
- **Coordination Required**: Must ensure strategy and visualizer get same database path
- **Breaking Change**: Existing strategy initialization code needs updates
## Alternatives Considered
### Option 1: Keep Visualization in Strategy
**Rejected**: Violates single responsibility principle. Makes testing difficult and deployment inflexible.
### Option 2: Observer Pattern
**Rejected**: Adds unnecessary complexity for this use case. Direct control in main.py is simpler and more explicit.
### Option 3: Visualization Service
**Rejected**: Over-engineering for current requirements. May be considered for future multi-strategy scenarios.
## Implementation Details
### Before (Coupled Design)
```python
class DefaultStrategy:
def __init__(self, instrument: str, enable_visualization: bool = True):
self.visualizer = Visualizer(...) if enable_visualization else None
def on_booktick(self, book: Book):
# Trading analysis
# ...
# Visualization update
if self.visualizer:
self.visualizer.update_from_book(book)
```
### After (Separated Design)
```python
# Strategy focuses on analysis only
class DefaultStrategy:
def __init__(self, instrument: str):
# No visualization dependencies
def on_booktick(self, book: Book):
# Pure trading analysis
# No visualization code
# Main application orchestrates both
def main():
strategy = DefaultStrategy(instrument)
visualizer = Visualizer(...)
for db_path in db_paths:
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
# Process data
storage.build_booktick_from_db(db_path, db_date)
# Analysis
strategy.on_booktick(storage.book)
# Visualization (controlled timing)
visualizer.update_from_book(storage.book)
# Final display
visualizer.show()
```
### Interface Changes
#### Strategy Interface (Simplified)
```python
class DefaultStrategy:
def __init__(self, instrument: str) # Removed visualization param
def set_db_path(self, db_path: Path) -> None # No visualizer.set_db_path()
def on_booktick(self, book: Book) -> None # No visualization calls
```
#### Main Application (Enhanced)
```python
def main():
# Separate initialization
strategy = DefaultStrategy(instrument)
visualizer = Visualizer(window_seconds=60, max_bars=500)
# Independent configuration
for db_path in db_paths:
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
# Controlled execution
strategy.on_booktick(storage.book) # Analysis
visualizer.update_from_book(storage.book) # Visualization
```
## Migration Strategy
### Code Changes Required
1. **Strategy Classes**: Remove visualization initialization and calls
2. **Main Application**: Add visualizer creation and orchestration
3. **Tests**: Update strategy tests to remove visualization mocking
4. **Configuration**: Remove visualization parameters from strategy constructors
### Backward Compatibility
- **API Breaking**: Strategy constructor signature changes
- **Functionality Preserved**: All visualization features remain available
- **Test Updates**: Strategy tests become simpler (no GUI mocking needed)
### Migration Steps
1. Update `DefaultStrategy` to remove visualization dependencies
2. Modify `main.py` to create and manage `Visualizer` instance
3. Update all strategy constructor calls to remove `enable_visualization`
4. Update tests to reflect new interfaces
5. Verify visualization timing meets requirements
## Benefits Achieved
### Clean Architecture
- **Strategy**: Pure trading analysis logic
- **Visualizer**: Pure chart rendering logic
- **Main**: Application flow and component coordination
### Improved Testing
```python
# Before: Complex mocking required
def test_strategy():
with patch('visualizer.Visualizer') as mock_viz:
strategy = DefaultStrategy("BTC", enable_visualization=True)
# Complex mock setup...
# After: Simple, direct testing
def test_strategy():
strategy = DefaultStrategy("BTC")
# Direct testing of analysis logic
```
### Flexible Deployment
```python
# Headless server deployment
strategy = DefaultStrategy("BTC")
# No GUI dependencies, can run anywhere
# Development with visualization
strategy = DefaultStrategy("BTC")
visualizer = Visualizer(...)
# Full GUI functionality when needed
```
### Precise Timing Control
```python
# Visualization updates exactly when requested
for db_file in database_files:
process_database(db_file) # Data processing
strategy.analyze(book) # Trading analysis
visualizer.update_from_book(book) # Chart update after each DB
```
## Monitoring and Validation
### Success Criteria
- **Test Simplification**: Strategy tests run without GUI mocking
- **Timing Accuracy**: Visualization updates after each database as requested
- **Performance**: No GUI overhead during pure analysis operations
- **Maintainability**: Visualization changes don't affect strategy code
### Validation Methods
- Run strategy tests in headless environment
- Verify visualization timing matches requirements
- Performance comparison of analysis-only vs. GUI operations
- Code complexity metrics for strategy vs. visualization modules
## Future Considerations
### Potential Enhancements
- **Multiple Visualizers**: Support different chart types or windows
- **Visualization Plugins**: Pluggable chart renderers for different outputs
- **Remote Visualization**: Web-based charts for server deployments
- **Batch Visualization**: Process multiple databases before chart updates
### Extensibility
- **Strategy Plugins**: Easy to add strategies without visualization concerns
- **Visualization Backends**: Swap chart libraries without affecting strategies
- **Analysis Pipeline**: Clear separation enables complex analysis workflows
---
This separation provides a clean, maintainable architecture that supports the requested visualization timing while improving code quality and testability.

302
docs/modules/metrics.md Normal file
View File

@ -0,0 +1,302 @@
# Module: Metrics Calculation System
## Purpose
The metrics calculation system provides high-performance computation of Order Book Imbalance (OBI) and Cumulative Volume Delta (CVD) indicators for cryptocurrency trading analysis. It processes orderbook snapshots and trade data to generate financial metrics with per-snapshot granularity.
## Public Interface
### Classes
#### `Metric` (dataclass)
Represents calculated metrics for a single orderbook snapshot.
```python
@dataclass(slots=True)
class Metric:
snapshot_id: int # Reference to source snapshot
timestamp: int # Unix timestamp
obi: float # Order Book Imbalance [-1, 1]
cvd: float # Cumulative Volume Delta
best_bid: float | None # Best bid price
best_ask: float | None # Best ask price
```
#### `MetricCalculator` (static class)
Provides calculation methods for financial metrics.
```python
class MetricCalculator:
@staticmethod
def calculate_obi(snapshot: BookSnapshot) -> float
@staticmethod
def calculate_volume_delta(trades: List[Trade]) -> float
@staticmethod
def calculate_cvd(previous_cvd: float, volume_delta: float) -> float
@staticmethod
def get_best_bid_ask(snapshot: BookSnapshot) -> tuple[float | None, float | None]
```
### Functions
#### Order Book Imbalance (OBI) Calculation
```python
def calculate_obi(snapshot: BookSnapshot) -> float:
"""
Calculate Order Book Imbalance using the standard formula.
Formula: OBI = (Vb - Va) / (Vb + Va)
Where:
Vb = Total volume on bid side
Va = Total volume on ask side
Args:
snapshot: BookSnapshot containing bids and asks data
Returns:
float: OBI value between -1 and 1, or 0.0 if no volume
Example:
>>> snapshot = BookSnapshot(bids={50000.0: OrderbookLevel(...)}, ...)
>>> obi = MetricCalculator.calculate_obi(snapshot)
>>> print(f"OBI: {obi:.3f}")
OBI: 0.333
"""
```
#### Volume Delta Calculation
```python
def calculate_volume_delta(trades: List[Trade]) -> float:
"""
Calculate Volume Delta for a list of trades.
Volume Delta = Buy Volume - Sell Volume
- Buy trades (side = "buy"): positive contribution
- Sell trades (side = "sell"): negative contribution
Args:
trades: List of Trade objects for specific timestamp
Returns:
float: Net volume delta (positive = buy pressure, negative = sell pressure)
Example:
>>> trades = [
... Trade(side="buy", size=10.0, ...),
... Trade(side="sell", size=3.0, ...)
... ]
>>> vd = MetricCalculator.calculate_volume_delta(trades)
>>> print(f"Volume Delta: {vd}")
Volume Delta: 7.0
"""
```
#### Cumulative Volume Delta (CVD) Calculation
```python
def calculate_cvd(previous_cvd: float, volume_delta: float) -> float:
"""
Calculate Cumulative Volume Delta with incremental support.
Formula: CVD_t = CVD_{t-1} + Volume_Delta_t
Args:
previous_cvd: Previous CVD value (use 0.0 for reset)
volume_delta: Current volume delta to add
Returns:
float: New cumulative volume delta value
Example:
>>> cvd = 0.0 # Starting value
>>> cvd = MetricCalculator.calculate_cvd(cvd, 10.0) # First trade
>>> cvd = MetricCalculator.calculate_cvd(cvd, -5.0) # Second trade
>>> print(f"CVD: {cvd}")
CVD: 5.0
"""
```
## Usage Examples
### Basic OBI Calculation
```python
from models import MetricCalculator, BookSnapshot, OrderbookLevel
# Create sample orderbook snapshot
snapshot = BookSnapshot(
id=1,
timestamp=1640995200,
bids={
50000.0: OrderbookLevel(price=50000.0, size=10.0, liquidation_count=0, order_count=1),
49999.0: OrderbookLevel(price=49999.0, size=5.0, liquidation_count=0, order_count=1),
},
asks={
50001.0: OrderbookLevel(price=50001.0, size=3.0, liquidation_count=0, order_count=1),
50002.0: OrderbookLevel(price=50002.0, size=2.0, liquidation_count=0, order_count=1),
}
)
# Calculate OBI
obi = MetricCalculator.calculate_obi(snapshot)
print(f"OBI: {obi:.3f}") # Output: OBI: 0.500
# Explanation: (15 - 5) / (15 + 5) = 10/20 = 0.5
```
### CVD Calculation with Reset
```python
from models import MetricCalculator, Trade
# Simulate trading session
cvd = 0.0 # Reset CVD at session start
# Process trades for first timestamp
trades_t1 = [
Trade(id=1, trade_id=1.0, price=50000.0, size=8.0, side="buy", timestamp=1000),
Trade(id=2, trade_id=2.0, price=50001.0, size=3.0, side="sell", timestamp=1000),
]
vd_t1 = MetricCalculator.calculate_volume_delta(trades_t1) # 8.0 - 3.0 = 5.0
cvd = MetricCalculator.calculate_cvd(cvd, vd_t1) # 0.0 + 5.0 = 5.0
# Process trades for second timestamp
trades_t2 = [
Trade(id=3, trade_id=3.0, price=49999.0, size=2.0, side="buy", timestamp=1001),
Trade(id=4, trade_id=4.0, price=50000.0, size=7.0, side="sell", timestamp=1001),
]
vd_t2 = MetricCalculator.calculate_volume_delta(trades_t2) # 2.0 - 7.0 = -5.0
cvd = MetricCalculator.calculate_cvd(cvd, vd_t2) # 5.0 + (-5.0) = 0.0
print(f"Final CVD: {cvd}") # Output: Final CVD: 0.0
```
### Complete Metrics Processing
```python
from models import MetricCalculator, Metric
def process_snapshot_metrics(snapshot, trades, previous_cvd=0.0):
"""Process complete metrics for a single snapshot."""
# Calculate OBI
obi = MetricCalculator.calculate_obi(snapshot)
# Calculate volume delta and CVD
volume_delta = MetricCalculator.calculate_volume_delta(trades)
cvd = MetricCalculator.calculate_cvd(previous_cvd, volume_delta)
# Extract best bid/ask
best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
# Create metric record
metric = Metric(
snapshot_id=snapshot.id,
timestamp=snapshot.timestamp,
obi=obi,
cvd=cvd,
best_bid=best_bid,
best_ask=best_ask
)
return metric, cvd
# Usage in processing loop
current_cvd = 0.0
for snapshot, trades in snapshot_trade_pairs:
metric, current_cvd = process_snapshot_metrics(snapshot, trades, current_cvd)
# Store metric to database...
```
## Dependencies
### Internal
- `models.BookSnapshot`: Orderbook state data
- `models.Trade`: Individual trade execution data
- `models.OrderbookLevel`: Price level information
### External
- **Python Standard Library**: `typing` for type hints
- **No external packages required**
## Performance Characteristics
### Computational Complexity
- **OBI Calculation**: O(n) where n = number of price levels
- **Volume Delta**: O(m) where m = number of trades
- **CVD Calculation**: O(1) - simple addition
- **Best Bid/Ask**: O(n) for min/max operations
### Memory Usage
- **Static Methods**: No instance state, minimal memory overhead
- **Calculations**: Process data in-place without copying
- **Results**: Lightweight `Metric` objects with slots optimization
### Typical Performance
```python
# Benchmark results (approximate)
Snapshot with 50 price levels: ~0.1ms per OBI calculation
Timestamp with 20 trades: ~0.05ms per volume delta
CVD update: ~0.001ms per calculation
Complete metric processing: ~0.2ms per snapshot
```
## Error Handling
### Edge Cases Handled
```python
# Empty orderbook
empty_snapshot = BookSnapshot(bids={}, asks={})
obi = MetricCalculator.calculate_obi(empty_snapshot) # Returns 0.0
# No trades
empty_trades = []
vd = MetricCalculator.calculate_volume_delta(empty_trades) # Returns 0.0
# Zero volume scenario
zero_vol_snapshot = BookSnapshot(
bids={50000.0: OrderbookLevel(price=50000.0, size=0.0, ...)},
asks={50001.0: OrderbookLevel(price=50001.0, size=0.0, ...)}
)
obi = MetricCalculator.calculate_obi(zero_vol_snapshot) # Returns 0.0
```
### Validation
- **OBI Range**: Results automatically bounded to [-1, 1]
- **Division by Zero**: Handled gracefully with 0.0 return
- **Invalid Data**: Empty collections handled without errors
## Testing
### Test Coverage
- **Unit Tests**: `tests/test_metric_calculator.py`
- **Integration Tests**: Included in storage and strategy tests
- **Edge Cases**: Empty data, zero volume, boundary conditions
### Running Tests
```bash
# Run metric calculator tests specifically
uv run pytest tests/test_metric_calculator.py -v
# Run all tests with metrics
uv run pytest -k "metric" -v
# Performance tests
uv run pytest tests/test_metric_calculator.py::test_calculate_obi_performance
```
## Known Issues
### Current Limitations
- **Precision**: Floating-point arithmetic limitations for very small numbers
- **Scale**: No optimization for extremely large orderbooks (>10k levels)
- **Currency**: No multi-currency support (assumes single denomination)
### Planned Enhancements
- **Decimal Precision**: Consider `decimal.Decimal` for high-precision calculations
- **Vectorization**: NumPy integration for batch calculations
- **Additional Metrics**: Volume Profile, Liquidity metrics, Delta Flow
---
The metrics calculation system provides a robust foundation for financial analysis with clean interfaces, comprehensive error handling, and optimal performance for high-frequency trading data.

60
main.py Normal file
View File

@ -0,0 +1,60 @@
import logging
import typer
from pathlib import Path
from typing import List
from datetime import datetime, timezone
from storage import Storage
from strategies import DefaultStrategy
from visualizer import Visualizer
databases_path = Path("../data/OKX")
storage = None
def main(instrument: str = typer.Argument(..., help="Instrument to backtest, e.g. BTC-USDT"),
start_date: str = typer.Argument(..., help="Start date, e.g. 2025-07-01"),
end_date: str = typer.Argument(..., help="End date, e.g. 2025-08-01")):
start_date = datetime.strptime(start_date, "%Y-%m-%d").replace(tzinfo=timezone.utc)
end_date = datetime.strptime(end_date, "%Y-%m-%d").replace(tzinfo=timezone.utc)
db_paths = list(databases_path.glob(f"{instrument}*.db"))
db_paths.sort()
storage = Storage(instrument)
strategy = DefaultStrategy(instrument)
visualizer = Visualizer(window_seconds=60, max_bars=500)
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
for db_path in db_paths:
db_name = db_path.name.split(".")[0].split("-")[2:5]
db_date = datetime.strptime("".join(db_name), "%y%m%d").replace(tzinfo=timezone.utc)
if db_date < start_date or db_date >= end_date:
continue
logging.info(f"Processing database: {db_path.name}")
# Set database path for strategy and visualizer to access stored metrics
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
# Build snapshots and calculate metrics
storage.build_booktick_from_db(db_path, db_date)
logging.info(f"Processed {len(storage.book.snapshots)} snapshots with metrics")
# Strategy analyzes metrics from the database
strategy.on_booktick(storage.book)
# Update visualization after processing each database
logging.info(f"Updating visualization for {db_path.name}")
visualizer.update_from_book(storage.book)
# Show final visualization
logging.info("Processing complete. Displaying final visualization...")
if db_paths: # Ensure we have processed at least one database
visualizer.show()
if __name__ == "__main__":
typer.run(main)

192
models.py Normal file
View File

@ -0,0 +1,192 @@
"""Core data models for orderbook reconstruction and backtesting.
This module defines lightweight data structures for orderbook levels, trades,
book snapshots, and the in-memory `Book` container used by the `Storage` layer.
"""
from dataclasses import dataclass, field
from typing import Dict, List
@dataclass(slots=True)
class OrderbookLevel:
"""Represents a single price level on one side of the orderbook.
Attributes:
price: Price level for the orderbook entry.
size: Total size at this price level.
liquidation_count: Number of liquidations at this level.
order_count: Number of resting orders at this level.
"""
price: float
size: float
liquidation_count: int
order_count: int
@dataclass(slots=True)
class Trade:
"""Represents a single trade event."""
id: int
trade_id: float
price: float
size: float
side: str
timestamp: int
@dataclass(slots=True)
class Metric:
"""Represents calculated metrics for a snapshot."""
snapshot_id: int
timestamp: int
obi: float
cvd: float
best_bid: float | None = None
best_ask: float | None = None
@dataclass
class BookSnapshot:
"""In-memory representation of an orderbook state at a specific timestamp."""
id: int = 0
timestamp: int = 0
bids: Dict[float, OrderbookLevel] = field(default_factory=dict)
asks: Dict[float, OrderbookLevel] = field(default_factory=dict)
trades: List[Trade] = field(default_factory=list)
class Book:
"""Container for managing orderbook snapshots and their evolution over time."""
def __init__(self) -> None:
"""Initialize an empty book."""
self.snapshots: List[BookSnapshot] = []
self.first_timestamp = 0
self.last_timestamp = 0
def add_snapshot(self, snapshot: BookSnapshot) -> None:
"""Add a snapshot to the book's history and update time bounds."""
self.snapshots.append(snapshot)
if self.first_timestamp == 0 or snapshot.timestamp < self.first_timestamp:
self.first_timestamp = snapshot.timestamp
if snapshot.timestamp > self.last_timestamp:
self.last_timestamp = snapshot.timestamp
def create_snapshot(self, id: int, timestamp: int) -> BookSnapshot:
"""Create a new snapshot, add it to history, and return it.
Copies bids/asks/trades from the previous snapshot to maintain continuity.
"""
prev_snapshot = self.snapshots[-1] if self.snapshots else BookSnapshot()
snapshot = BookSnapshot(
id=id,
timestamp=timestamp,
bids={
k: OrderbookLevel(
price=v.price,
size=v.size,
liquidation_count=v.liquidation_count,
order_count=v.order_count,
)
for k, v in prev_snapshot.bids.items()
},
asks={
k: OrderbookLevel(
price=v.price,
size=v.size,
liquidation_count=v.liquidation_count,
order_count=v.order_count,
)
for k, v in prev_snapshot.asks.items()
},
trades=prev_snapshot.trades.copy() if prev_snapshot.trades else [],
)
self.add_snapshot(snapshot)
return snapshot
class MetricCalculator:
"""Calculator for OBI and CVD metrics from orderbook snapshots and trades."""
@staticmethod
def calculate_obi(snapshot: BookSnapshot) -> float:
"""Calculate Order Book Imbalance for a snapshot.
Formula: OBI = (Vb - Va) / (Vb + Va)
Where Vb = total bid volume, Va = total ask volume
Args:
snapshot: BookSnapshot containing bids and asks data.
Returns:
OBI value between -1 and 1, or 0.0 if no volume.
"""
# Calculate total bid volume
vb = sum(level.size for level in snapshot.bids.values())
# Calculate total ask volume
va = sum(level.size for level in snapshot.asks.values())
# Handle edge case where total volume is zero
if vb + va == 0:
return 0.0
# Calculate OBI using standard formula
obi = (vb - va) / (vb + va)
# Ensure result is within expected bounds [-1, 1]
return max(-1.0, min(1.0, obi))
@staticmethod
def get_best_bid_ask(snapshot: BookSnapshot) -> tuple[float | None, float | None]:
"""Extract best bid and ask prices from a snapshot.
Args:
snapshot: BookSnapshot containing bids and asks data.
Returns:
Tuple of (best_bid, best_ask) or (None, None) if no data.
"""
best_bid = max(snapshot.bids.keys()) if snapshot.bids else None
best_ask = min(snapshot.asks.keys()) if snapshot.asks else None
return best_bid, best_ask
@staticmethod
def calculate_volume_delta(trades: List[Trade]) -> float:
"""Calculate Volume Delta for a list of trades.
Volume Delta = Buy Volume - Sell Volume
Buy trades (side = "buy") contribute positive volume
Sell trades (side = "sell") contribute negative volume
Args:
trades: List of Trade objects for a specific timestamp.
Returns:
Volume delta value (can be positive, negative, or zero).
"""
buy_volume = sum(trade.size for trade in trades if trade.side == "buy")
sell_volume = sum(trade.size for trade in trades if trade.side == "sell")
return buy_volume - sell_volume
@staticmethod
def calculate_cvd(previous_cvd: float, volume_delta: float) -> float:
"""Calculate Cumulative Volume Delta.
CVD_t = CVD_{t-1} + Volume_Delta_t
Args:
previous_cvd: Previous CVD value (use 0.0 for reset or first calculation).
volume_delta: Current volume delta to add.
Returns:
New cumulative volume delta value.
"""
return previous_cvd + volume_delta

0
nonexistent.db Normal file
View File

3
parsers/__init__.py Normal file
View File

@ -0,0 +1,3 @@
"""Parsing utilities for transforming raw persisted data into domain models."""

View File

@ -0,0 +1,45 @@
from __future__ import annotations
from ast import literal_eval
from typing import Dict
import logging
from models import OrderbookLevel
class OrderbookParser:
"""Parser for orderbook side text into structured levels.
Maintains a price cache for memory efficiency and provides a method to
parse a side into a dictionary of price -> OrderbookLevel.
"""
def __init__(self, price_cache: dict[float, float] | None = None, debug: bool = False) -> None:
self._price_cache: dict[float, float] = price_cache or {}
self._debug = debug
def parse_side(self, text: str, side_dict: Dict[float, OrderbookLevel]) -> None:
"""Parse orderbook side data from text and populate the provided dictionary."""
if not text or text.strip() == "":
return
try:
arr = literal_eval(text)
for p, s, lc, oc in arr:
price = float(p)
size = float(s)
price = self._price_cache.get(price, price)
if size > 0:
side_dict[price] = OrderbookLevel(
price=price,
size=size,
liquidation_count=int(lc),
order_count=int(oc),
)
except Exception as e:
if self._debug:
logging.exception("Error parsing orderbook data")
logging.debug(f"Text sample: {text[:100]}...")
else:
logging.error(f"Failed to parse orderbook data: {type(e).__name__}")

138
prd-obi-cvd-metrics.md Normal file
View File

@ -0,0 +1,138 @@
# PRD: OBI and CVD Metrics Integration
## Introduction/Overview
This feature integrates Order Book Imbalance (OBI) and Cumulative Volume Delta (CVD) calculations into the orderflow backtest system. Currently, the system stores all snapshots in memory during processing, which consumes excessive memory. The goal is to compute OBI and CVD metrics during the `build_booktick_from_db` execution, store these metrics persistently in the database, and visualize them alongside OHLC and volume data.
## Goals
1. **Memory Optimization**: Reduce memory usage by storing only essential data (OBI/CVD metrics, best bid/ask) instead of full snapshot history
2. **Metric Calculation**: Implement per-snapshot OBI and CVD calculations with maximum granularity
3. **Persistent Storage**: Store calculated metrics in the database to avoid recalculation
4. **Enhanced Visualization**: Display OBI and CVD curves beneath volume graphs with shared time axis
5. **Incremental CVD**: Support incremental CVD calculation that can be reset at user-defined points
## User Stories
1. **As a trader**, I want to see OBI and CVD metrics calculated for each orderbook snapshot so that I can analyze market sentiment with maximum granularity.
2. **As a system user**, I want metrics to be stored persistently in the database so that I don't need to recalculate them when re-analyzing the same dataset.
3. **As a data analyst**, I want to visualize OBI and CVD curves alongside OHLC and volume data so that I can correlate price movements with orderbook imbalances and volume deltas.
4. **As a performance-conscious user**, I want the system to use less memory during processing so that I can analyze larger datasets (months to years of data).
5. **As a researcher**, I want incremental CVD calculation so that I can track cumulative volume changes from any chosen starting point in my analysis.
## Functional Requirements
### Database Schema Updates
1. **Create metrics table** with the following structure:
```sql
CREATE TABLE metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER,
timestamp TEXT,
obi REAL,
cvd REAL,
best_bid REAL,
best_ask REAL,
FOREIGN KEY (snapshot_id) REFERENCES book(id)
);
```
### OBI Calculation
2. **Calculate OBI per snapshot** using the formula: `OBI = (Vb - Va) / (Vb + Va)` where:
- Vb = total volume on bid side
- Va = total volume on ask side
3. **Handle edge cases** where Vb + Va = 0 by setting OBI = 0.0
4. **Store OBI values** in the metrics table for each processed snapshot
### CVD Calculation
5. **Calculate Volume Delta per timestamp** by summing all trades at each snapshot timestamp:
- Buy trades (side = "buy"): add to positive volume
- Sell trades (side = "sell"): add to negative volume
- VD = Buy Volume - Sell Volume
6. **Calculate Cumulative Volume Delta** as running sum: `CVD_t = CVD_{t-1} + VD_t`
7. **Support CVD reset functionality** to allow starting cumulative calculation from any point
8. **Handle snapshots with no trades** by carrying forward the previous CVD value
### Storage System Updates
9. **Modify Storage class** to integrate metric calculations during `build_booktick_from_db`
10. **Update Book model** to store only essential data: OBI/CVD time series and best bid/ask levels
11. **Remove full snapshot retention** from memory after metric calculation
12. **Add metric persistence** to SQLite database during processing
### Strategy Integration
13. **Enhance DefaultStrategy** to calculate both OBI and CVD metrics
14. **Return time-series data structures** compatible with visualization system
15. **Integrate metric calculation** into the existing `on_booktick` workflow
### Visualization Enhancements
16. **Add OBI and CVD plotting** to the visualizer beneath volume graphs
17. **Implement shared X-axis** for time alignment across OHLC, volume, OBI, and CVD charts
18. **Support 6-hour bar aggregation** as the initial time resolution
19. **Use standard line styling** for OBI and CVD curves
20. **Make time resolution configurable** for future flexibility
## Non-Goals (Out of Scope)
1. **Real-time streaming** - This feature focuses on historical data processing
2. **Advanced visualization features** - Complex styling, indicators, or interactive elements beyond basic line charts
3. **Alternative CVD calculation methods** - Only implementing the standard buy/sell volume delta approach
4. **Multi-threading optimization** - Simple sequential processing for initial implementation
5. **Data compression** - No advanced compression techniques for stored metrics
6. **Export functionality** - No CSV/JSON export of calculated metrics
## Technical Considerations
### Database Performance
- Add indexes on `metrics.timestamp` and `metrics.snapshot_id` for efficient querying
- Consider batch insertions for metric data to improve write performance
### Memory Management
- Process snapshots sequentially and discard after metric calculation
- Maintain only calculated time-series data in memory
- Keep best bid/ask data for potential future analysis needs
### Data Integrity
- Ensure metric calculations are atomic with snapshot processing
- Add foreign key constraints to maintain referential integrity
- Implement transaction boundaries for consistent data state
### Integration Points
- Modify `SQLiteOrderflowRepository` to support metrics table operations
- Update `Storage._create_snapshots_from_rows` to include metric calculation
- Extend `Visualizer` to handle additional metric data series
## Success Metrics
1. **Memory Usage Reduction**: Achieve at least 70% reduction in peak memory usage during processing
2. **Processing Speed**: Maintain or improve current processing speed (rows/sec) despite additional calculations
3. **Data Accuracy**: 100% correlation between manually calculated and stored OBI/CVD values for test datasets
4. **Visualization Quality**: Successfully display OBI and CVD curves with proper time alignment
5. **Storage Efficiency**: Metrics table size should be manageable relative to source data (< 20% overhead)
## Open Questions
1. **Index Strategy**: Should we add additional database indexes for time-range queries on metrics?
2. **CVD Starting Value**: Should CVD start from 0 for each database file, or allow continuation from previous sessions?
3. **Error Recovery**: How should the system handle partial metric calculations if processing is interrupted?
4. **Validation**: Do we need validation checks to ensure OBI values stay within [-1, 1] range?
5. **Performance Monitoring**: Should we add timing metrics to track calculation performance per snapshot?
## Implementation Priority
**Phase 1: Core Functionality**
- Database schema updates
- Basic OBI/CVD calculation
- Metric storage integration
**Phase 2: Memory Optimization**
- Remove full snapshot retention
- Implement essential data-only storage
**Phase 3: Visualization**
- Add metric plotting to visualizer
- Implement time axis alignment
- Support 6-hour bar aggregation

16
pyproject.toml Normal file
View File

@ -0,0 +1,16 @@
[project]
name = "orderflow-backtest"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
"matplotlib>=3.10.5",
"pyqt5>=5.15.11",
"typer>=0.16.1",
]
[dependency-groups]
dev = [
"pytest>=8.4.1",
]

7
repositories/__init__.py Normal file
View File

@ -0,0 +1,7 @@
"""Repository layer for data access implementations (e.g., SQLite).
This package contains concrete repositories used by the `Storage` orchestrator
to read persisted orderflow data.
"""

View File

@ -0,0 +1,132 @@
from __future__ import annotations
from pathlib import Path
import sqlite3
import logging
from typing import List, Dict, Tuple
from .sqlite_repository import SQLiteOrderflowRepository
from models import Metric
class SQLiteMetricsRepository(SQLiteOrderflowRepository):
"""Write-enabled repository for storing and loading metrics data alongside orderflow data."""
def create_metrics_table(self, conn: sqlite3.Connection) -> None:
"""Create the metrics table with proper indexes and foreign key constraints.
Args:
conn: Active SQLite database connection.
"""
try:
# Create metrics table following PRD schema
conn.execute("""
CREATE TABLE IF NOT EXISTS metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
snapshot_id INTEGER NOT NULL,
timestamp TEXT NOT NULL,
obi REAL NOT NULL,
cvd REAL NOT NULL,
best_bid REAL,
best_ask REAL,
FOREIGN KEY (snapshot_id) REFERENCES book(id)
)
""")
# Create indexes for efficient querying
conn.execute("CREATE INDEX IF NOT EXISTS idx_metrics_timestamp ON metrics(timestamp)")
conn.execute("CREATE INDEX IF NOT EXISTS idx_metrics_snapshot_id ON metrics(snapshot_id)")
conn.commit()
logging.info("Metrics table and indexes created successfully")
except sqlite3.Error as e:
logging.error(f"Error creating metrics table: {e}")
raise
def table_exists(self, conn: sqlite3.Connection, table_name: str) -> bool:
"""Check if a table exists in the database.
Args:
conn: Active SQLite database connection.
table_name: Name of the table to check.
Returns:
True if table exists, False otherwise.
"""
try:
cursor = conn.cursor()
cursor.execute(
"SELECT name FROM sqlite_master WHERE type='table' AND name=?",
(table_name,)
)
return cursor.fetchone() is not None
except sqlite3.Error as e:
logging.error(f"Error checking if table {table_name} exists: {e}")
return False
def insert_metrics_batch(self, conn: sqlite3.Connection, metrics: List[Metric]) -> None:
"""Insert multiple metrics in a single batch operation for performance.
Args:
conn: Active SQLite database connection.
metrics: List of Metric objects to insert.
"""
if not metrics:
return
try:
# Prepare batch data following existing batch pattern
batch_data = [
(m.snapshot_id, m.timestamp, m.obi, m.cvd, m.best_bid, m.best_ask)
for m in metrics
]
# Use executemany for batch insertion
conn.executemany(
"INSERT INTO metrics (snapshot_id, timestamp, obi, cvd, best_bid, best_ask) VALUES (?, ?, ?, ?, ?, ?)",
batch_data
)
logging.debug(f"Inserted {len(metrics)} metrics records")
except sqlite3.Error as e:
logging.error(f"Error inserting metrics batch: {e}")
raise
def load_metrics_by_timerange(self, conn: sqlite3.Connection, start_timestamp: int, end_timestamp: int) -> List[Metric]:
"""Load metrics within a specified timestamp range.
Args:
conn: Active SQLite database connection.
start_timestamp: Start of the time range (inclusive).
end_timestamp: End of the time range (inclusive).
Returns:
List of Metric objects ordered by timestamp.
"""
try:
cursor = conn.cursor()
cursor.execute(
"SELECT snapshot_id, timestamp, obi, cvd, best_bid, best_ask FROM metrics WHERE timestamp >= ? AND timestamp <= ? ORDER BY timestamp ASC",
(start_timestamp, end_timestamp)
)
metrics = []
for batch in iter(lambda: cursor.fetchmany(5000), []):
for snapshot_id, timestamp, obi, cvd, best_bid, best_ask in batch:
metric = Metric(
snapshot_id=int(snapshot_id),
timestamp=int(timestamp),
obi=float(obi),
cvd=float(cvd),
best_bid=float(best_bid) if best_bid is not None else None,
best_ask=float(best_ask) if best_ask is not None else None,
)
metrics.append(metric)
return metrics
except sqlite3.Error as e:
logging.error(f"Error loading metrics by timerange: {e}")
return []

View File

@ -0,0 +1,73 @@
from __future__ import annotations
from pathlib import Path
from typing import Dict, Iterator, List, Tuple
import sqlite3
import logging
from models import Trade
class SQLiteOrderflowRepository:
"""Read-only repository for loading orderflow data from a SQLite database."""
def __init__(self, db_path: Path) -> None:
self.db_path = db_path
def connect(self) -> sqlite3.Connection:
conn = sqlite3.connect(str(self.db_path))
conn.execute("PRAGMA journal_mode = OFF")
conn.execute("PRAGMA synchronous = OFF")
conn.execute("PRAGMA cache_size = 100000")
conn.execute("PRAGMA temp_store = MEMORY")
conn.execute("PRAGMA mmap_size = 30000000000")
return conn
def count_rows(self, conn: sqlite3.Connection, table: str) -> int:
allowed_tables = {"book", "trades"}
if table not in allowed_tables:
raise ValueError(f"Unsupported table name: {table}")
try:
row = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
return int(row[0]) if row and row[0] is not None else 0
except sqlite3.Error as e:
logging.error(f"Error counting rows in table {table}: {e}")
return 0
def load_trades_by_timestamp(self, conn: sqlite3.Connection) -> Dict[int, List[Trade]]:
trades_by_timestamp: Dict[int, List[Trade]] = {}
try:
cursor = conn.cursor()
cursor.execute(
"SELECT id, trade_id, price, size, side, timestamp FROM trades ORDER BY timestamp ASC"
)
for batch in iter(lambda: cursor.fetchmany(5000), []):
for id_, trade_id, price, size, side, ts in batch:
timestamp_int = int(ts)
trade = Trade(
id=id_,
trade_id=float(trade_id),
price=float(price),
size=float(size),
side=str(side),
timestamp=timestamp_int,
)
if timestamp_int not in trades_by_timestamp:
trades_by_timestamp[timestamp_int] = []
trades_by_timestamp[timestamp_int].append(trade)
return trades_by_timestamp
except sqlite3.Error as e:
logging.error(f"Error loading trades: {e}")
return {}
def iterate_book_rows(self, conn: sqlite3.Connection) -> Iterator[Tuple[int, str, str, int]]:
cursor = conn.cursor()
cursor.execute("SELECT id, bids, asks, timestamp FROM book ORDER BY timestamp ASC")
while True:
rows = cursor.fetchmany(5000)
if not rows:
break
for row in rows:
yield row # (id, bids, asks, timestamp)

207
storage.py Normal file
View File

@ -0,0 +1,207 @@
"""Storage utilities to reconstruct an in-memory orderbook from a SQLite DB.
This module defines lightweight data structures for orderbook levels, trades,
and a `Storage` facade that can hydrate a `Book` incrementally from rows stored
in a SQLite file produced by an external data collector.
"""
from pathlib import Path
from datetime import datetime
from typing import List, Dict, Optional, Iterator, Tuple
import time
import logging
from models import OrderbookLevel, Trade, BookSnapshot, Book, MetricCalculator, Metric
from repositories.sqlite_repository import SQLiteOrderflowRepository
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
from parsers.orderbook_parser import OrderbookParser
class Storage:
"""High-level facade to read historical orderflow into a `Book`.
Attributes:
instrument: Symbol/instrument name (e.g., "BTC-USDT").
book: In-memory orderbook that maintains the current state and tracks timestamps.
"""
def __init__(self, instrument: str) -> None:
self.instrument = instrument
self.book = Book()
# Pre-allocate memory for common price points
self._price_cache = {float(p/10): float(p/10) for p in range(1, 1000001, 5)}
# Debug flag
self._debug = False
self._parser = OrderbookParser(price_cache=self._price_cache, debug=self._debug)
def build_booktick_from_db(self, db_path: Path, db_date: datetime) -> None:
"""Hydrate the in-memory `book` from a SQLite database and calculate metrics.
Builds a Book instance with sequential snapshots and calculates OBI/CVD metrics.
Args:
db_path: Path to the SQLite database file.
db_date: Date associated with the database (currently informational).
"""
# Reset the book to start fresh
self.book = Book()
metrics_repo = SQLiteMetricsRepository(db_path)
with metrics_repo.connect() as conn:
# Create metrics table if it doesn't exist
if not metrics_repo.table_exists(conn, "metrics"):
metrics_repo.create_metrics_table(conn)
# Load trades grouped by timestamp
trades_by_timestamp = metrics_repo.load_trades_by_timestamp(conn)
# Check if we have any orderbook data
total_rows = metrics_repo.count_rows(conn, "book")
if total_rows == 0:
logging.info(f"No orderbook data found in {db_path}")
return
# Process orderbook data and calculate metrics
rows_iter = metrics_repo.iterate_book_rows(conn)
self._create_snapshots_and_metrics(rows_iter, trades_by_timestamp, total_rows, conn, metrics_repo)
# Log summary
logging.info(f"Processed {len(self.book.snapshots)} snapshots with metrics from {db_path}")
def _create_snapshots_and_metrics(self, rows_iter: Iterator[Tuple[int, str, str, int]], trades_by_timestamp: Dict[int, List[Trade]], total_rows: int, conn, metrics_repo: SQLiteMetricsRepository) -> None:
"""Create BookSnapshot instances and calculate metrics, storing them in database.
Args:
rows_iter: Iterator yielding (id, bids_text, asks_text, timestamp)
trades_by_timestamp: Dictionary mapping timestamps to lists of trades
total_rows: Total number of rows in the book table
conn: Database connection for storing metrics
metrics_repo: Repository instance for metrics operations
"""
# Initialize CVD tracking
current_cvd = 0.0
metrics_batch = []
batch_size = 1000 # Process metrics in batches for performance
# Set batch size and logging frequency
log_every = max(1, total_rows // 20)
processed = 0
start_time = time.time()
last_report_time = start_time
for row_id, bids_text, asks_text, timestamp in rows_iter:
snapshot = self._snapshot_from_row(row_id, bids_text, asks_text, timestamp, trades_by_timestamp)
if snapshot is not None:
# Calculate metrics for this snapshot
obi = MetricCalculator.calculate_obi(snapshot)
trades = trades_by_timestamp.get(int(timestamp), [])
volume_delta = MetricCalculator.calculate_volume_delta(trades)
current_cvd = MetricCalculator.calculate_cvd(current_cvd, volume_delta)
best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
# Create metric record
metric = Metric(
snapshot_id=row_id,
timestamp=int(timestamp),
obi=obi,
cvd=current_cvd,
best_bid=best_bid,
best_ask=best_ask
)
metrics_batch.append(metric)
# Add snapshot to book (for compatibility)
self.book.add_snapshot(snapshot)
# Insert metrics batch when it reaches batch_size
if len(metrics_batch) >= batch_size:
metrics_repo.insert_metrics_batch(conn, metrics_batch)
conn.commit()
metrics_batch = []
processed += 1
# Report progress
current_time = time.time()
if processed % log_every == 0 and current_time - last_report_time > 1.0:
logging.info(
f"{processed / total_rows * 100:.1f}% - OBI: {metrics_batch[-1].obi if metrics_batch else 'N/A':.3f} - "
f"CVD: {current_cvd:.1f} - {processed/(current_time-start_time):.1f} rows/sec"
)
last_report_time = current_time
# Insert remaining metrics
if metrics_batch:
metrics_repo.insert_metrics_batch(conn, metrics_batch)
conn.commit()
def _create_snapshots_from_rows(self, rows_iter: Iterator[Tuple[int, str, str, int]], trades_by_timestamp: Dict[int, List[Trade]], total_rows: int) -> None:
"""Create BookSnapshot instances from database rows and add them to the book.
Args:
rows_iter: Iterator yielding (id, bids_text, asks_text, timestamp)
trades_by_timestamp: Dictionary mapping timestamps to lists of trades
total_rows: Total number of rows in the book table
"""
# Get reference to the book
book = self.book
# Set batch size and logging frequency
log_every = max(1, total_rows // 20)
processed = 0
start_time = time.time()
last_report_time = start_time
for row_id, bids_text, asks_text, timestamp in rows_iter:
snapshot = self._snapshot_from_row(row_id, bids_text, asks_text, timestamp, trades_by_timestamp)
if snapshot is not None:
book.add_snapshot(snapshot)
processed += 1
# Report progress
current_time = time.time()
if processed % log_every == 0 and current_time - last_report_time > 1.0:
logging.info(
f"{processed / total_rows * 100:.1f}% - asks {len(self.book.snapshots[-1].asks) if self.book.snapshots else 0} - "
f"bids {len(self.book.snapshots[-1].bids) if self.book.snapshots else 0} - "
f"{processed/(current_time-start_time):.1f} rows/sec"
)
last_report_time = current_time
def _snapshot_from_row(
self,
row_id: int,
bids_text: str,
asks_text: str,
timestamp: int,
trades_by_timestamp: Dict[int, List[Trade]],
) -> Optional[BookSnapshot]:
"""Create a `BookSnapshot` from a single DB row and attached trades.
Returns None if the snapshot has no bids or asks after parsing.
"""
timestamp_int = int(timestamp)
snapshot = BookSnapshot(
id=row_id,
timestamp=timestamp_int,
bids={},
asks={},
trades=trades_by_timestamp.get(timestamp_int, []),
)
self._parser.parse_side(bids_text, snapshot.bids)
self._parser.parse_side(asks_text, snapshot.asks)
if snapshot.bids and snapshot.asks:
return snapshot
return None
def _parse_orderbook_side(self, text: str, side_dict: Dict[float, OrderbookLevel]) -> None:
"""Compatibility wrapper delegating to `OrderbookParser.parse_side`."""
self._parser.parse_side(text, side_dict)
# The following helper was previously used, kept here for reference
# and potential future extensions. It has been superseded by repository
# methods for data access and is intentionally not used.

104
strategies.py Normal file
View File

@ -0,0 +1,104 @@
import logging
from typing import Optional, Any, cast, List
from pathlib import Path
from storage import Book, BookSnapshot
from models import MetricCalculator, Metric
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
class DefaultStrategy:
"""Strategy that calculates and analyzes OBI and CVD metrics from stored data."""
def __init__(self, instrument: str):
self.instrument = instrument
self._db_path: Optional[Path] = None
def set_db_path(self, db_path: Path) -> None:
"""Set the database path for loading stored metrics."""
self._db_path = db_path
def compute_OBI(self, book: Book) -> List[float]:
"""Compute Order Book Imbalance using MetricCalculator.
Returns:
list: A list of OBI values, one for each snapshot in the book.
"""
if not book.snapshots:
return []
obi_values = []
for snapshot in book.snapshots:
obi = MetricCalculator.calculate_obi(snapshot)
obi_values.append(obi)
return obi_values
def load_stored_metrics(self, start_timestamp: int, end_timestamp: int) -> List[Metric]:
"""Load stored OBI and CVD metrics from database.
Args:
start_timestamp: Start of time range to load.
end_timestamp: End of time range to load.
Returns:
List of Metric objects with OBI and CVD data.
"""
if not self._db_path:
logging.warning("Database path not set, cannot load stored metrics")
return []
try:
metrics_repo = SQLiteMetricsRepository(self._db_path)
with metrics_repo.connect() as conn:
return metrics_repo.load_metrics_by_timerange(conn, start_timestamp, end_timestamp)
except Exception as e:
logging.error(f"Error loading stored metrics: {e}")
return []
def get_metrics_summary(self, metrics: List[Metric]) -> dict:
"""Get summary statistics for loaded metrics.
Args:
metrics: List of metric objects.
Returns:
Dictionary with summary statistics.
"""
if not metrics:
return {}
obi_values = [m.obi for m in metrics]
cvd_values = [m.cvd for m in metrics]
return {
"obi_min": min(obi_values),
"obi_max": max(obi_values),
"obi_avg": sum(obi_values) / len(obi_values),
"cvd_start": cvd_values[0],
"cvd_end": cvd_values[-1],
"cvd_change": cvd_values[-1] - cvd_values[0],
"total_snapshots": len(metrics)
}
def on_booktick(self, book: Book):
"""Hook called on each book tick; can load and analyze stored metrics."""
# Load stored metrics if database path is available
if self._db_path and book.first_timestamp and book.last_timestamp:
metrics = self.load_stored_metrics(book.first_timestamp, book.last_timestamp)
if metrics:
# Analyze stored metrics
summary = self.get_metrics_summary(metrics)
logging.info(f"Metrics summary: {summary}")
# Check for significant imbalances using stored OBI
latest_metric = metrics[-1]
if abs(latest_metric.obi) > 0.2: # 20% imbalance threshold
logging.info(f"Significant imbalance detected: OBI={latest_metric.obi:.3f}, CVD={latest_metric.cvd:.1f}")
else:
# Fallback to real-time calculation for compatibility
obi_values = self.compute_OBI(book)
if obi_values:
latest_obi = obi_values[-1]
if abs(latest_obi) > 0.2:
logging.info(f"Significant imbalance detected: {latest_obi:.3f}")

View File

@ -0,0 +1,66 @@
# Tasks: OBI and CVD Metrics Integration
Based on the PRD for integrating Order Book Imbalance (OBI) and Cumulative Volume Delta (CVD) calculations into the orderflow backtest system.
## Relevant Files
- `repositories/sqlite_repository.py` - Extend to support metrics table operations and batch insertions
- `repositories/test_metrics_repository.py` - Unit tests for metrics repository functionality
- `models.py` - Add new data models for metrics and update Book class
- `tests/test_models_metrics.py` - Unit tests for new metric models
- `storage.py` - Modify to integrate metric calculations during snapshot processing
- `tests/test_storage_metrics.py` - Unit tests for storage metric integration
- `strategies.py` - Enhance DefaultStrategy to calculate OBI and CVD metrics
- `tests/test_strategies_metrics.py` - Unit tests for strategy metric calculations
- `visualizer.py` - Extend to plot OBI and CVD curves beneath volume graphs
- `tests/test_visualizer_metrics.py` - Unit tests for metric visualization
- `parsers/metric_calculator.py` - New utility class for OBI and CVD calculations
- `tests/test_metric_calculator.py` - Unit tests for metric calculation logic
### Notes
- Unit tests should be placed alongside the code files they are testing
- Use `uv run pytest [optional/path/to/test/file]` to run tests following project standards
- Database schema changes require migration considerations for existing databases
## Tasks
- [ ] 1.0 Database Schema and Repository Updates
- [ ] 1.1 Create metrics table schema with proper indexes and foreign key constraints
- [ ] 1.2 Add metrics table creation method to SQLiteOrderflowRepository
- [ ] 1.3 Implement metrics insertion methods with batch support for performance
- [ ] 1.4 Add metrics querying methods (by timestamp range, snapshot_id)
- [ ] 1.5 Create database migration utility to add metrics table to existing databases
- [ ] 1.6 Add proper error handling and transaction management for metrics operations
- [ ] 2.0 Metric Calculation Engine
- [ ] 2.1 Create MetricCalculator class with OBI calculation method
- [ ] 2.2 Implement CVD calculation with incremental support and reset functionality
- [ ] 2.3 Add volume delta calculation for individual timestamps
- [ ] 2.4 Implement best bid/ask extraction from orderbook snapshots
- [ ] 2.5 Add edge case handling (empty orderbook, no trades, zero volume)
- [ ] 2.6 Create validation methods to ensure OBI values are within [-1, 1] range
- [ ] 3.0 Storage System Integration
- [ ] 3.1 Modify Storage.build_booktick_from_db to integrate metric calculations
- [ ] 3.2 Update _create_snapshots_from_rows to calculate and store metrics per snapshot
- [ ] 3.3 Implement memory optimization by removing full snapshot retention
- [ ] 3.4 Add metric persistence during snapshot processing
- [ ] 3.5 Update Book model to store only essential data (metrics + best bid/ask)
- [ ] 3.6 Add progress reporting for metric calculation during processing
- [ ] 4.0 Strategy Enhancement
- [ ] 4.1 Update DefaultStrategy to use MetricCalculator for OBI and CVD
- [ ] 4.2 Modify compute_OBI method to work with new metric calculation system
- [ ] 4.3 Add CVD computation method to DefaultStrategy
- [ ] 4.4 Return time-series data structures compatible with visualizer
- [ ] 4.5 Integrate metric calculation into on_booktick workflow
- [ ] 4.6 Add configuration options for CVD reset points and calculation parameters
- [ ] 5.0 Visualization Implementation
- [ ] 5.1 Extend Visualizer to load metrics data from database
- [ ] 5.2 Add OBI and CVD plotting methods beneath volume graphs
- [ ] 5.3 Implement shared X-axis time alignment across all charts (OHLC, volume, OBI, CVD)
- [ ] 5.4 Add 6-hour bar aggregation support for metrics visualization
- [ ] 5.5 Implement standard line styling for OBI and CVD curves
- [ ] 5.6 Make time resolution configurable for future flexibility

View File

@ -0,0 +1,93 @@
"""Tests for main.py integration with metrics system."""
import sys
import sqlite3
import tempfile
from pathlib import Path
from unittest.mock import patch, MagicMock
sys.path.append(str(Path(__file__).resolve().parents[1]))
# Mock typer to avoid import issues in tests
sys.modules['typer'] = MagicMock()
from storage import Storage
from strategies import DefaultStrategy
def test_strategy_database_integration():
"""Test that strategy gets database path set correctly in main workflow."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Create minimal test database
with sqlite3.connect(str(db_path)) as conn:
conn.execute("""
CREATE TABLE book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL,
asks TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
conn.execute("""
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
# Insert minimal test data
bids = "[(50000.0, 10.0, 0, 1)]"
asks = "[(50001.0, 5.0, 0, 1)]"
conn.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)",
(1, bids, asks, 1000))
conn.execute("INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(1, 1.0, 50000.0, 3.0, "buy", 1000))
conn.commit()
# Test the integration workflow
storage = Storage("BTC-USDT")
strategy = DefaultStrategy("BTC-USDT")
# This simulates the main.py workflow
strategy.set_db_path(db_path) # This is what main.py now does
storage.build_booktick_from_db(db_path, None) # This calculates and stores metrics
# Verify strategy can access stored metrics
assert strategy._db_path == db_path
# Verify metrics were stored by attempting to load them
metrics = strategy.load_stored_metrics(1000, 1000)
assert len(metrics) == 1
assert metrics[0].timestamp == 1000
# Verify strategy can be called (this is what main.py does)
strategy.on_booktick(storage.book) # Should use stored metrics
finally:
db_path.unlink(missing_ok=True)
def test_strategy_backwards_compatibility():
"""Test that strategy still works without database path (backwards compatibility)."""
storage = Storage("BTC-USDT")
strategy = DefaultStrategy("BTC-USDT")
# Don't set database path - should fall back to real-time calculation
# This ensures existing code that doesn't use metrics still works
# Create empty book
assert len(storage.book.snapshots) == 0
# Strategy should handle this gracefully
strategy.on_booktick(storage.book) # Should not crash
# Verify OBI calculation still works
obi_values = strategy.compute_OBI(storage.book)
assert obi_values == [] # Empty book should return empty list

View File

@ -0,0 +1,108 @@
"""Tests for main.py visualization workflow."""
import sys
import sqlite3
import tempfile
from pathlib import Path
from unittest.mock import patch, MagicMock
sys.path.append(str(Path(__file__).resolve().parents[1]))
# Mock typer to avoid import issues in tests
sys.modules['typer'] = MagicMock()
from storage import Storage
from strategies import DefaultStrategy
from visualizer import Visualizer
def test_main_workflow_separation():
"""Test that main.py workflow properly separates strategy and visualization."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Create minimal test database
with sqlite3.connect(str(db_path)) as conn:
conn.execute("""
CREATE TABLE book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL,
asks TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
conn.execute("""
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
# Insert minimal test data
bids = "[(50000.0, 10.0, 0, 1)]"
asks = "[(50001.0, 5.0, 0, 1)]"
conn.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)",
(1, bids, asks, 1000))
conn.execute("INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(1, 1.0, 50000.0, 3.0, "buy", 1000))
conn.commit()
# Test the new main.py workflow
storage = Storage("BTC-USDT")
strategy = DefaultStrategy("BTC-USDT") # No visualization parameter
# Mock visualizer to avoid GUI issues in tests
with patch('matplotlib.pyplot.subplots') as mock_subplots:
mock_fig = type('MockFig', (), {'canvas': type('MockCanvas', (), {'draw_idle': lambda: None})()})()
mock_axes = [type('MockAx', (), {'clear': lambda: None})() for _ in range(4)]
mock_subplots.return_value = (mock_fig, tuple(mock_axes))
visualizer = Visualizer(window_seconds=60, max_bars=500)
# This simulates the new main.py workflow
strategy.set_db_path(db_path)
visualizer.set_db_path(db_path)
storage.build_booktick_from_db(db_path, None)
# Strategy analyzes metrics (no visualization)
strategy.on_booktick(storage.book)
# Verify strategy has database path but no visualizer
assert strategy._db_path == db_path
assert not hasattr(strategy, 'visualizer') or strategy.visualizer is None
# Verify visualizer can access database
assert visualizer._db_path == db_path
# Verify visualizer can load metrics
metrics = visualizer._load_stored_metrics(1000, 1000)
assert len(metrics) == 1
# Test visualization update (should work independently)
with patch.object(visualizer, '_draw') as mock_draw:
visualizer.update_from_book(storage.book)
mock_draw.assert_called_once()
finally:
db_path.unlink(missing_ok=True)
def test_strategy_has_no_visualization_dependency():
"""Test that strategy no longer depends on visualization."""
strategy = DefaultStrategy("BTC-USDT")
# Strategy should not have visualizer attribute
assert not hasattr(strategy, 'visualizer') or strategy.visualizer is None
# Strategy should work without any visualization setup
from models import Book
book = Book()
# Should not raise any errors
strategy.on_booktick(book)

View File

@ -0,0 +1,142 @@
"""Tests for MetricCalculator OBI calculation and best bid/ask extraction."""
import sys
from pathlib import Path
sys.path.append(str(Path(__file__).resolve().parents[1]))
from models import MetricCalculator, BookSnapshot, OrderbookLevel, Trade
def test_calculate_obi_normal_case():
"""Test OBI calculation with normal bid and ask volumes."""
# Create test snapshot with more bid volume than ask volume
snapshot = BookSnapshot(
id=1,
timestamp=1000,
bids={
50000.0: OrderbookLevel(price=50000.0, size=10.0, liquidation_count=0, order_count=1),
49999.0: OrderbookLevel(price=49999.0, size=5.0, liquidation_count=0, order_count=1),
},
asks={
50001.0: OrderbookLevel(price=50001.0, size=3.0, liquidation_count=0, order_count=1),
50002.0: OrderbookLevel(price=50002.0, size=2.0, liquidation_count=0, order_count=1),
},
)
# Total bid volume = 15.0, total ask volume = 5.0
# OBI = (15 - 5) / (15 + 5) = 10 / 20 = 0.5
obi = MetricCalculator.calculate_obi(snapshot)
assert obi == 0.5
def test_calculate_obi_zero_volume():
"""Test OBI calculation when there's no volume."""
snapshot = BookSnapshot(id=1, timestamp=1000, bids={}, asks={})
obi = MetricCalculator.calculate_obi(snapshot)
assert obi == 0.0
def test_calculate_obi_ask_heavy():
"""Test OBI calculation with more ask volume than bid volume."""
snapshot = BookSnapshot(
id=1,
timestamp=1000,
bids={
50000.0: OrderbookLevel(price=50000.0, size=2.0, liquidation_count=0, order_count=1),
},
asks={
50001.0: OrderbookLevel(price=50001.0, size=8.0, liquidation_count=0, order_count=1),
},
)
# Total bid volume = 2.0, total ask volume = 8.0
# OBI = (2 - 8) / (2 + 8) = -6 / 10 = -0.6
obi = MetricCalculator.calculate_obi(snapshot)
assert obi == -0.6
def test_get_best_bid_ask_normal():
"""Test best bid/ask extraction with normal orderbook."""
snapshot = BookSnapshot(
id=1,
timestamp=1000,
bids={
50000.0: OrderbookLevel(price=50000.0, size=1.0, liquidation_count=0, order_count=1),
49999.0: OrderbookLevel(price=49999.0, size=1.0, liquidation_count=0, order_count=1),
49998.0: OrderbookLevel(price=49998.0, size=1.0, liquidation_count=0, order_count=1),
},
asks={
50001.0: OrderbookLevel(price=50001.0, size=1.0, liquidation_count=0, order_count=1),
50002.0: OrderbookLevel(price=50002.0, size=1.0, liquidation_count=0, order_count=1),
50003.0: OrderbookLevel(price=50003.0, size=1.0, liquidation_count=0, order_count=1),
},
)
best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
assert best_bid == 50000.0 # Highest bid price
assert best_ask == 50001.0 # Lowest ask price
def test_get_best_bid_ask_empty():
"""Test best bid/ask extraction with empty orderbook."""
snapshot = BookSnapshot(id=1, timestamp=1000, bids={}, asks={})
best_bid, best_ask = MetricCalculator.get_best_bid_ask(snapshot)
assert best_bid is None
assert best_ask is None
def test_calculate_volume_delta_buy_heavy():
"""Test volume delta calculation with more buy volume than sell volume."""
trades = [
Trade(id=1, trade_id=1.0, price=50000.0, size=10.0, side="buy", timestamp=1000),
Trade(id=2, trade_id=2.0, price=50001.0, size=5.0, side="buy", timestamp=1000),
Trade(id=3, trade_id=3.0, price=49999.0, size=3.0, side="sell", timestamp=1000),
]
# Buy volume = 15.0, Sell volume = 3.0
# Volume Delta = 15.0 - 3.0 = 12.0
vd = MetricCalculator.calculate_volume_delta(trades)
assert vd == 12.0
def test_calculate_volume_delta_sell_heavy():
"""Test volume delta calculation with more sell volume than buy volume."""
trades = [
Trade(id=1, trade_id=1.0, price=50000.0, size=2.0, side="buy", timestamp=1000),
Trade(id=2, trade_id=2.0, price=49999.0, size=8.0, side="sell", timestamp=1000),
]
# Buy volume = 2.0, Sell volume = 8.0
# Volume Delta = 2.0 - 8.0 = -6.0
vd = MetricCalculator.calculate_volume_delta(trades)
assert vd == -6.0
def test_calculate_volume_delta_no_trades():
"""Test volume delta calculation with no trades."""
trades = []
vd = MetricCalculator.calculate_volume_delta(trades)
assert vd == 0.0
def test_calculate_cvd_incremental():
"""Test incremental CVD calculation."""
# Start with zero CVD
cvd1 = MetricCalculator.calculate_cvd(0.0, 10.0)
assert cvd1 == 10.0
# Add more volume delta
cvd2 = MetricCalculator.calculate_cvd(cvd1, -5.0)
assert cvd2 == 5.0
# Continue accumulating
cvd3 = MetricCalculator.calculate_cvd(cvd2, 15.0)
assert cvd3 == 20.0
def test_calculate_cvd_reset_functionality():
"""Test CVD reset by starting from 0.0."""
# Simulate reset by passing 0.0 as previous CVD
cvd_after_reset = MetricCalculator.calculate_cvd(0.0, 25.0)
assert cvd_after_reset == 25.0

View File

@ -0,0 +1,126 @@
"""Tests for SQLiteMetricsRepository table creation and schema validation."""
import sys
import sqlite3
import tempfile
from pathlib import Path
sys.path.append(str(Path(__file__).resolve().parents[1]))
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
from models import Metric
def test_create_metrics_table():
"""Test that metrics table is created with proper schema and indexes."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
repo = SQLiteMetricsRepository(db_path)
with repo.connect() as conn:
# Create metrics table
repo.create_metrics_table(conn)
# Verify table exists
assert repo.table_exists(conn, "metrics")
# Verify table schema
cursor = conn.cursor()
cursor.execute("PRAGMA table_info(metrics)")
columns = cursor.fetchall()
# Check expected columns exist
column_names = [col[1] for col in columns]
expected_columns = ["id", "snapshot_id", "timestamp", "obi", "cvd", "best_bid", "best_ask"]
for col in expected_columns:
assert col in column_names, f"Column {col} missing from metrics table"
# Verify indexes exist
cursor.execute("PRAGMA index_list(metrics)")
indexes = cursor.fetchall()
index_names = [idx[1] for idx in indexes]
assert "idx_metrics_timestamp" in index_names
assert "idx_metrics_snapshot_id" in index_names
finally:
db_path.unlink(missing_ok=True)
def test_insert_metrics_batch():
"""Test batch insertion of metrics data."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
repo = SQLiteMetricsRepository(db_path)
with repo.connect() as conn:
# Create metrics table
repo.create_metrics_table(conn)
# Create test metrics
metrics = [
Metric(snapshot_id=1, timestamp=1000, obi=0.5, cvd=100.0, best_bid=50000.0, best_ask=50001.0),
Metric(snapshot_id=2, timestamp=1001, obi=-0.2, cvd=150.0, best_bid=50002.0, best_ask=50003.0),
Metric(snapshot_id=3, timestamp=1002, obi=0.0, cvd=125.0), # No best_bid/ask
]
# Insert batch
repo.insert_metrics_batch(conn, metrics)
conn.commit()
# Verify insertion
cursor = conn.cursor()
cursor.execute("SELECT COUNT(*) FROM metrics")
count = cursor.fetchone()[0]
assert count == 3
# Verify data integrity
cursor.execute("SELECT snapshot_id, timestamp, obi, cvd, best_bid, best_ask FROM metrics ORDER BY timestamp")
rows = cursor.fetchall()
assert rows[0] == (1, "1000", 0.5, 100.0, 50000.0, 50001.0)
assert rows[1] == (2, "1001", -0.2, 150.0, 50002.0, 50003.0)
assert rows[2] == (3, "1002", 0.0, 125.0, None, None)
finally:
db_path.unlink(missing_ok=True)
def test_load_metrics_by_timerange():
"""Test loading metrics within a timestamp range."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
repo = SQLiteMetricsRepository(db_path)
with repo.connect() as conn:
# Create metrics table and insert test data
repo.create_metrics_table(conn)
metrics = [
Metric(snapshot_id=1, timestamp=1000, obi=0.1, cvd=10.0, best_bid=50000.0, best_ask=50001.0),
Metric(snapshot_id=2, timestamp=1005, obi=0.2, cvd=20.0, best_bid=50002.0, best_ask=50003.0),
Metric(snapshot_id=3, timestamp=1010, obi=0.3, cvd=30.0, best_bid=50004.0, best_ask=50005.0),
Metric(snapshot_id=4, timestamp=1015, obi=0.4, cvd=40.0, best_bid=50006.0, best_ask=50007.0),
]
repo.insert_metrics_batch(conn, metrics)
conn.commit()
# Test timerange query - should get middle 2 records
loaded_metrics = repo.load_metrics_by_timerange(conn, 1003, 1012)
assert len(loaded_metrics) == 2
assert loaded_metrics[0].timestamp == 1005
assert loaded_metrics[0].obi == 0.2
assert loaded_metrics[1].timestamp == 1010
assert loaded_metrics[1].obi == 0.3
# Test edge cases
assert len(repo.load_metrics_by_timerange(conn, 2000, 3000)) == 0 # No data
assert len(repo.load_metrics_by_timerange(conn, 1000, 1000)) == 1 # Single record
finally:
db_path.unlink(missing_ok=True)

View File

@ -0,0 +1,17 @@
import sys
from pathlib import Path
sys.path.append(str(Path(__file__).resolve().parents[1]))
from parsers.orderbook_parser import OrderbookParser
def test_parse_side_malformed_text_does_not_raise():
parser = OrderbookParser(debug=False)
side = {}
# Malformed text that literal_eval cannot parse
bad_text = "[[100.0, 'missing tuple closing'"
# Should not raise; should simply log an error and leave side empty
parser.parse_side(bad_text, side)
assert side == {}

View File

@ -0,0 +1,53 @@
import sys
from pathlib import Path
import sqlite3
sys.path.append(str(Path(__file__).resolve().parents[1]))
from repositories.sqlite_repository import SQLiteOrderflowRepository
def test_iterate_book_rows_batches(tmp_path):
db_path = tmp_path / "iter.db"
with sqlite3.connect(str(db_path)) as conn:
c = conn.cursor()
c.execute(
"""
CREATE TABLE book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL,
asks TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
"""
)
c.execute(
"""
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
"""
)
# Insert 12 rows to ensure multiple fetchmany batches (repo uses 5000, but iteration still correct)
bids = str([(100.0, 1.0, 0, 1)])
asks = str([(101.0, 1.0, 0, 1)])
for i in range(12):
c.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)", (i + 1, bids, asks, 1000 + i))
conn.commit()
repo = SQLiteOrderflowRepository(db_path)
with repo.connect() as conn:
rows = list(repo.iterate_book_rows(conn))
assert len(rows) == 12
# Ensure ordering by timestamp ascending
timestamps = [r[3] for r in rows]
assert timestamps == sorted(timestamps)
# count_rows allowlist should work
assert repo.count_rows(conn, "book") == 12

View File

@ -0,0 +1,83 @@
from pathlib import Path
from datetime import datetime, timezone
import sqlite3
import sys
# Ensure project root is on sys.path for direct module imports
sys.path.append(str(Path(__file__).resolve().parents[1]))
from storage import Storage
def _init_db(path: Path) -> None:
with sqlite3.connect(str(path)) as conn:
c = conn.cursor()
c.execute(
"""
CREATE TABLE IF NOT EXISTS book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL,
asks TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
"""
)
c.execute(
"""
CREATE TABLE IF NOT EXISTS trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
"""
)
conn.commit()
def test_storage_builds_snapshots_and_attaches_trades(tmp_path):
db_path = tmp_path / "test.db"
_init_db(db_path)
ts = 1_725_000_000
# Insert one valid book row and one invalid (empty asks) that should be ignored
bids = str([(100.0, 1.0, 0, 1), (99.5, 2.0, 0, 1)])
asks = str([(100.5, 1.5, 0, 1), (101.0, 1.0, 0, 1)])
invalid_asks = str([])
with sqlite3.connect(str(db_path)) as conn:
c = conn.cursor()
c.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)", (1, bids, asks, ts))
c.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)", (2, bids, invalid_asks, ts + 1))
# Insert trades for ts
c.execute(
"INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(1, 1.0, 100.25, 0.5, "buy", ts),
)
c.execute(
"INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(2, 2.0, 100.75, 0.75, "sell", ts),
)
conn.commit()
storage = Storage("BTC-USDT")
db_date = datetime.fromtimestamp(ts, tz=timezone.utc)
storage.build_booktick_from_db(db_path, db_date)
# Only one snapshot should be included (the valid one with non-empty asks)
assert len(storage.book.snapshots) == 1
snap = storage.book.snapshots[0]
assert snap.timestamp == ts
assert len(snap.bids) == 2
assert len(snap.asks) == 2
# Trades should be attached for the same timestamp
assert len(snap.trades) == 2
sides = sorted(t.side for t in snap.trades)
assert sides == ["buy", "sell"]

View File

@ -0,0 +1,88 @@
"""Tests for Storage metrics integration."""
import sys
import sqlite3
import tempfile
from pathlib import Path
from datetime import datetime
sys.path.append(str(Path(__file__).resolve().parents[1]))
from storage import Storage
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
def test_storage_calculates_and_stores_metrics():
"""Test that Storage calculates and stores metrics during build_booktick_from_db."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Create test database with minimal data
with sqlite3.connect(str(db_path)) as conn:
# Create tables
conn.execute("""
CREATE TABLE book (
id INTEGER PRIMARY KEY,
bids TEXT NOT NULL,
asks TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
conn.execute("""
CREATE TABLE trades (
id INTEGER PRIMARY KEY,
trade_id REAL NOT NULL,
price REAL NOT NULL,
size REAL NOT NULL,
side TEXT NOT NULL,
timestamp INTEGER NOT NULL
)
""")
# Insert test data
bids = "[(50000.0, 10.0, 0, 1), (49999.0, 5.0, 0, 1)]" # 15.0 total bid volume
asks = "[(50001.0, 3.0, 0, 1), (50002.0, 2.0, 0, 1)]" # 5.0 total ask volume
conn.execute("INSERT INTO book (id, bids, asks, timestamp) VALUES (?, ?, ?, ?)",
(1, bids, asks, 1000))
# Add trades for CVD calculation
conn.execute("INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(1, 1.0, 50000.0, 8.0, "buy", 1000))
conn.execute("INSERT INTO trades (id, trade_id, price, size, side, timestamp) VALUES (?, ?, ?, ?, ?, ?)",
(2, 2.0, 50001.0, 3.0, "sell", 1000))
conn.commit()
# Test Storage metrics integration
storage = Storage("BTC-USDT")
storage.build_booktick_from_db(db_path, datetime.now())
# Verify metrics were calculated and stored
metrics_repo = SQLiteMetricsRepository(db_path)
with metrics_repo.connect() as conn:
# Check metrics table exists
assert metrics_repo.table_exists(conn, "metrics")
# Load calculated metrics
metrics = metrics_repo.load_metrics_by_timerange(conn, 1000, 1000)
assert len(metrics) == 1
metric = metrics[0]
# Verify OBI calculation: (15 - 5) / (15 + 5) = 0.5
assert abs(metric.obi - 0.5) < 0.001
# Verify CVD calculation: buy(8.0) - sell(3.0) = 5.0
assert abs(metric.cvd - 5.0) < 0.001
# Verify best bid/ask
assert metric.best_bid == 50000.0
assert metric.best_ask == 50001.0
# Verify book was also populated (backward compatibility)
assert len(storage.book.snapshots) == 1
finally:
db_path.unlink(missing_ok=True)

View File

@ -0,0 +1,48 @@
import sys
from pathlib import Path
import pytest
# Ensure project root is on sys.path for direct module imports
sys.path.append(str(Path(__file__).resolve().parents[1]))
from storage import Storage, OrderbookLevel
def test_parse_orderbook_side_happy_path():
storage = Storage("BTC-USDT")
text = str([
(100.0, 1.5, 0, 2),
(101.0, 2.25, 1, 3),
])
side = {}
storage._parse_orderbook_side(text, side)
assert 100.0 in side and 101.0 in side
level_100 = side[100.0]
level_101 = side[101.0]
assert isinstance(level_100, OrderbookLevel)
assert level_100.price == 100.0
assert level_100.size == 1.5
assert level_100.liquidation_count == 0
assert level_100.order_count == 2
assert level_101.size == 2.25
assert level_101.liquidation_count == 1
assert level_101.order_count == 3
def test_parse_orderbook_side_ignores_zero_size():
storage = Storage("BTC-USDT")
text = str([
(100.0, 0.0, 0, 0),
(101.0, 1.0, 0, 1),
])
side = {}
storage._parse_orderbook_side(text, side)
assert 100.0 not in side
assert 101.0 in side

View File

@ -0,0 +1,112 @@
"""Tests for DefaultStrategy metrics integration."""
import sys
import sqlite3
import tempfile
from pathlib import Path
sys.path.append(str(Path(__file__).resolve().parents[1]))
from strategies import DefaultStrategy
from models import Book, BookSnapshot, OrderbookLevel, Metric
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
def test_strategy_uses_metric_calculator():
"""Test that strategy uses MetricCalculator for OBI calculation."""
strategy = DefaultStrategy("BTC-USDT")
# Create test book with snapshots
book = Book()
snapshot = BookSnapshot(
id=1,
timestamp=1000,
bids={50000.0: OrderbookLevel(price=50000.0, size=10.0, liquidation_count=0, order_count=1)},
asks={50001.0: OrderbookLevel(price=50001.0, size=5.0, liquidation_count=0, order_count=1)},
)
book.add_snapshot(snapshot)
# Test OBI calculation
obi_values = strategy.compute_OBI(book)
assert len(obi_values) == 1
# OBI = (10 - 5) / (10 + 5) = 0.333...
assert abs(obi_values[0] - 0.333333) < 0.001
def test_strategy_loads_stored_metrics():
"""Test that strategy can load stored metrics from database."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Create test database with metrics
metrics_repo = SQLiteMetricsRepository(db_path)
with metrics_repo.connect() as conn:
metrics_repo.create_metrics_table(conn)
# Insert test metrics
test_metrics = [
Metric(snapshot_id=1, timestamp=1000, obi=0.1, cvd=10.0, best_bid=50000.0, best_ask=50001.0),
Metric(snapshot_id=2, timestamp=1001, obi=0.2, cvd=15.0, best_bid=50002.0, best_ask=50003.0),
Metric(snapshot_id=3, timestamp=1002, obi=0.3, cvd=20.0, best_bid=50004.0, best_ask=50005.0),
]
metrics_repo.insert_metrics_batch(conn, test_metrics)
conn.commit()
# Test strategy loading
strategy = DefaultStrategy("BTC-USDT")
strategy.set_db_path(db_path)
loaded_metrics = strategy.load_stored_metrics(1000, 1002)
assert len(loaded_metrics) == 3
assert loaded_metrics[0].obi == 0.1
assert loaded_metrics[0].cvd == 10.0
assert loaded_metrics[-1].obi == 0.3
assert loaded_metrics[-1].cvd == 20.0
finally:
db_path.unlink(missing_ok=True)
def test_strategy_metrics_summary():
"""Test that strategy generates correct metrics summary."""
strategy = DefaultStrategy("BTC-USDT")
# Create test metrics
metrics = [
Metric(snapshot_id=1, timestamp=1000, obi=0.1, cvd=10.0),
Metric(snapshot_id=2, timestamp=1001, obi=-0.2, cvd=5.0),
Metric(snapshot_id=3, timestamp=1002, obi=0.3, cvd=15.0),
]
summary = strategy.get_metrics_summary(metrics)
assert summary["obi_min"] == -0.2
assert summary["obi_max"] == 0.3
assert abs(summary["obi_avg"] - 0.0667) < 0.001 # (0.1 + (-0.2) + 0.3) / 3
assert summary["cvd_start"] == 10.0
assert summary["cvd_end"] == 15.0
assert summary["cvd_change"] == 5.0 # 15.0 - 10.0
assert summary["total_snapshots"] == 3
def test_strategy_empty_metrics():
"""Test strategy behavior with empty metrics."""
strategy = DefaultStrategy("BTC-USDT")
# Test with empty book
book = Book()
obi_values = strategy.compute_OBI(book)
assert obi_values == []
# Test with empty metrics
summary = strategy.get_metrics_summary([])
assert summary == {}
# Test loading from non-existent database
strategy.set_db_path(Path("nonexistent.db"))
metrics = strategy.load_stored_metrics(1000, 2000)
assert metrics == []

View File

@ -0,0 +1,112 @@
"""Tests for Visualizer metrics integration."""
import sys
import sqlite3
import tempfile
from pathlib import Path
from unittest.mock import patch
sys.path.append(str(Path(__file__).resolve().parents[1]))
from visualizer import Visualizer
from models import Book, BookSnapshot, OrderbookLevel, Metric
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
def test_visualizer_loads_metrics():
"""Test that visualizer can load stored metrics from database."""
with tempfile.NamedTemporaryFile(suffix=".db", delete=False) as tmp_file:
db_path = Path(tmp_file.name)
try:
# Create test database with metrics
metrics_repo = SQLiteMetricsRepository(db_path)
with metrics_repo.connect() as conn:
metrics_repo.create_metrics_table(conn)
# Insert test metrics
test_metrics = [
Metric(snapshot_id=1, timestamp=1000, obi=0.1, cvd=10.0, best_bid=50000.0, best_ask=50001.0),
Metric(snapshot_id=2, timestamp=1060, obi=0.2, cvd=15.0, best_bid=50002.0, best_ask=50003.0),
Metric(snapshot_id=3, timestamp=1120, obi=-0.1, cvd=12.0, best_bid=50004.0, best_ask=50005.0),
]
metrics_repo.insert_metrics_batch(conn, test_metrics)
conn.commit()
# Test visualizer
visualizer = Visualizer(window_seconds=60, max_bars=200)
visualizer.set_db_path(db_path)
# Load metrics directly to test the method
loaded_metrics = visualizer._load_stored_metrics(1000, 1120)
assert len(loaded_metrics) == 3
assert loaded_metrics[0].obi == 0.1
assert loaded_metrics[0].cvd == 10.0
assert loaded_metrics[1].obi == 0.2
assert loaded_metrics[2].obi == -0.1
finally:
db_path.unlink(missing_ok=True)
def test_visualizer_handles_no_database():
"""Test that visualizer handles gracefully when no database path is set."""
visualizer = Visualizer(window_seconds=60, max_bars=200)
# No database path set - should return empty list
metrics = visualizer._load_stored_metrics(1000, 2000)
assert metrics == []
def test_visualizer_handles_invalid_database():
"""Test that visualizer handles invalid database paths gracefully."""
visualizer = Visualizer(window_seconds=60, max_bars=200)
visualizer.set_db_path(Path("nonexistent.db"))
# Should handle error gracefully and return empty list
metrics = visualizer._load_stored_metrics(1000, 2000)
assert metrics == []
@patch('matplotlib.pyplot.subplots')
def test_visualizer_creates_four_subplots(mock_subplots):
"""Test that visualizer creates four subplots for OHLC, Volume, OBI, and CVD."""
# Mock the subplots creation
mock_fig = type('MockFig', (), {})()
mock_ax_ohlc = type('MockAx', (), {})()
mock_ax_volume = type('MockAx', (), {})()
mock_ax_obi = type('MockAx', (), {})()
mock_ax_cvd = type('MockAx', (), {})()
mock_subplots.return_value = (mock_fig, (mock_ax_ohlc, mock_ax_volume, mock_ax_obi, mock_ax_cvd))
# Create visualizer
visualizer = Visualizer(window_seconds=60, max_bars=200)
# Verify subplots were created correctly
mock_subplots.assert_called_once_with(4, 1, figsize=(12, 10), sharex=True)
assert visualizer.ax_ohlc == mock_ax_ohlc
assert visualizer.ax_volume == mock_ax_volume
assert visualizer.ax_obi == mock_ax_obi
assert visualizer.ax_cvd == mock_ax_cvd
def test_visualizer_update_from_book_with_empty_book():
"""Test that visualizer handles empty book gracefully."""
with patch('matplotlib.pyplot.subplots') as mock_subplots:
# Mock the subplots creation
mock_fig = type('MockFig', (), {'canvas': type('MockCanvas', (), {'draw_idle': lambda: None})()})()
mock_axes = [type('MockAx', (), {'clear': lambda: None})() for _ in range(4)]
mock_subplots.return_value = (mock_fig, tuple(mock_axes))
visualizer = Visualizer(window_seconds=60, max_bars=200)
# Test with empty book
book = Book()
# Should handle gracefully without errors
with patch('logging.warning') as mock_warning:
visualizer.update_from_book(book)
mock_warning.assert_called_once_with("Book has no snapshots to visualize")

612
uv.lock generated Normal file
View File

@ -0,0 +1,612 @@
version = 1
revision = 3
requires-python = ">=3.12"
[[package]]
name = "click"
version = "8.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
]
[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]
[[package]]
name = "contourpy"
version = "1.3.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "numpy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/58/01/1253e6698a07380cd31a736d248a3f2a50a7c88779a1813da27503cadc2a/contourpy-1.3.3.tar.gz", hash = "sha256:083e12155b210502d0bca491432bb04d56dc3432f95a979b429f2848c3dbe880", size = 13466174, upload-time = "2025-07-26T12:03:12.549Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" },
{ url = "https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" },
{ url = "https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" },
{ url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" },
{ url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" },
{ url = "https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" },
{ url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" },
{ url = "https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" },
{ url = "https://files.pythonhosted.org/packages/cf/8f/5847f44a7fddf859704217a99a23a4f6417b10e5ab1256a179264561540e/contourpy-1.3.3-cp312-cp312-win32.whl", hash = "sha256:023b44101dfe49d7d53932be418477dba359649246075c996866106da069af69", size = 185018, upload-time = "2025-07-26T12:01:35.64Z" },
{ url = "https://files.pythonhosted.org/packages/19/e8/6026ed58a64563186a9ee3f29f41261fd1828f527dd93d33b60feca63352/contourpy-1.3.3-cp312-cp312-win_amd64.whl", hash = "sha256:8153b8bfc11e1e4d75bcb0bff1db232f9e10b274e0929de9d608027e0d34ff8b", size = 226567, upload-time = "2025-07-26T12:01:36.804Z" },
{ url = "https://files.pythonhosted.org/packages/d1/e2/f05240d2c39a1ed228d8328a78b6f44cd695f7ef47beb3e684cf93604f86/contourpy-1.3.3-cp312-cp312-win_arm64.whl", hash = "sha256:07ce5ed73ecdc4a03ffe3e1b3e3c1166db35ae7584be76f65dbbe28a7791b0cc", size = 193655, upload-time = "2025-07-26T12:01:37.999Z" },
{ url = "https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" },
{ url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" },
{ url = "https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" },
{ url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" },
{ url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" },
{ url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" },
{ url = "https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" },
{ url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" },
{ url = "https://files.pythonhosted.org/packages/a6/2e/adc197a37443f934594112222ac1aa7dc9a98faf9c3842884df9a9d8751d/contourpy-1.3.3-cp313-cp313-win32.whl", hash = "sha256:b20c7c9a3bf701366556e1b1984ed2d0cedf999903c51311417cf5f591d8c78d", size = 185024, upload-time = "2025-07-26T12:01:53.245Z" },
{ url = "https://files.pythonhosted.org/packages/18/0b/0098c214843213759692cc638fce7de5c289200a830e5035d1791d7a2338/contourpy-1.3.3-cp313-cp313-win_amd64.whl", hash = "sha256:1cadd8b8969f060ba45ed7c1b714fe69185812ab43bd6b86a9123fe8f99c3263", size = 226578, upload-time = "2025-07-26T12:01:54.422Z" },
{ url = "https://files.pythonhosted.org/packages/8a/9a/2f6024a0c5995243cd63afdeb3651c984f0d2bc727fd98066d40e141ad73/contourpy-1.3.3-cp313-cp313-win_arm64.whl", hash = "sha256:fd914713266421b7536de2bfa8181aa8c699432b6763a0ea64195ebe28bff6a9", size = 193524, upload-time = "2025-07-26T12:01:55.73Z" },
{ url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" },
{ url = "https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" },
{ url = "https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", size = 326751, upload-time = "2025-07-26T12:02:00.343Z" },
{ url = "https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" },
{ url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" },
{ url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" },
{ url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" },
{ url = "https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" },
{ url = "https://files.pythonhosted.org/packages/b9/70/f308384a3ae9cd2209e0849f33c913f658d3326900d0ff5d378d6a1422d2/contourpy-1.3.3-cp313-cp313t-win32.whl", hash = "sha256:283edd842a01e3dcd435b1c5116798d661378d83d36d337b8dde1d16a5fc9ba3", size = 196157, upload-time = "2025-07-26T12:02:11.488Z" },
{ url = "https://files.pythonhosted.org/packages/b2/dd/880f890a6663b84d9e34a6f88cded89d78f0091e0045a284427cb6b18521/contourpy-1.3.3-cp313-cp313t-win_amd64.whl", hash = "sha256:87acf5963fc2b34825e5b6b048f40e3635dd547f590b04d2ab317c2619ef7ae8", size = 240570, upload-time = "2025-07-26T12:02:12.754Z" },
{ url = "https://files.pythonhosted.org/packages/80/99/2adc7d8ffead633234817ef8e9a87115c8a11927a94478f6bb3d3f4d4f7d/contourpy-1.3.3-cp313-cp313t-win_arm64.whl", hash = "sha256:3c30273eb2a55024ff31ba7d052dde990d7d8e5450f4bbb6e913558b3d6c2301", size = 199713, upload-time = "2025-07-26T12:02:14.4Z" },
{ url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = "2025-07-26T12:02:16.095Z" },
{ url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" },
{ url = "https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" },
{ url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" },
{ url = "https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" },
{ url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" },
{ url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" },
{ url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" },
{ url = "https://files.pythonhosted.org/packages/e2/e2/366af18a6d386f41132a48f033cbd2102e9b0cf6345d35ff0826cd984566/contourpy-1.3.3-cp314-cp314-win32.whl", hash = "sha256:66c8a43a4f7b8df8b71ee1840e4211a3c8d93b214b213f590e18a1beca458f7d", size = 189692, upload-time = "2025-07-26T12:02:30.128Z" },
{ url = "https://files.pythonhosted.org/packages/7d/c2/57f54b03d0f22d4044b8afb9ca0e184f8b1afd57b4f735c2fa70883dc601/contourpy-1.3.3-cp314-cp314-win_amd64.whl", hash = "sha256:cf9022ef053f2694e31d630feaacb21ea24224be1c3ad0520b13d844274614fd", size = 232424, upload-time = "2025-07-26T12:02:31.395Z" },
{ url = "https://files.pythonhosted.org/packages/18/79/a9416650df9b525737ab521aa181ccc42d56016d2123ddcb7b58e926a42c/contourpy-1.3.3-cp314-cp314-win_arm64.whl", hash = "sha256:95b181891b4c71de4bb404c6621e7e2390745f887f2a026b2d99e92c17892339", size = 198300, upload-time = "2025-07-26T12:02:32.956Z" },
{ url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" },
{ url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" },
{ url = "https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" },
{ url = "https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" },
{ url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" },
{ url = "https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" },
{ url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" },
{ url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" },
{ url = "https://files.pythonhosted.org/packages/93/8a/68a4ec5c55a2971213d29a9374913f7e9f18581945a7a31d1a39b5d2dfe5/contourpy-1.3.3-cp314-cp314t-win32.whl", hash = "sha256:e74a9a0f5e3fff48fb5a7f2fd2b9b70a3fe014a67522f79b7cca4c0c7e43c9ae", size = 202428, upload-time = "2025-07-26T12:02:48.691Z" },
{ url = "https://files.pythonhosted.org/packages/fa/96/fd9f641ffedc4fa3ace923af73b9d07e869496c9cc7a459103e6e978992f/contourpy-1.3.3-cp314-cp314t-win_amd64.whl", hash = "sha256:13b68d6a62db8eafaebb8039218921399baf6e47bf85006fd8529f2a08ef33fc", size = 250331, upload-time = "2025-07-26T12:02:50.137Z" },
{ url = "https://files.pythonhosted.org/packages/ae/8c/469afb6465b853afff216f9528ffda78a915ff880ed58813ba4faf4ba0b6/contourpy-1.3.3-cp314-cp314t-win_arm64.whl", hash = "sha256:b7448cb5a725bb1e35ce88771b86fba35ef418952474492cf7c764059933ff8b", size = 203831, upload-time = "2025-07-26T12:02:51.449Z" },
]
[[package]]
name = "cycler"
version = "0.12.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a9/95/a3dbbb5028f35eafb79008e7522a75244477d2838f38cbb722248dabc2a8/cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c", size = 7615, upload-time = "2023-10-07T05:32:18.335Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e7/05/c19819d5e3d95294a6f5947fb9b9629efb316b96de511b418c53d245aae6/cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30", size = 8321, upload-time = "2023-10-07T05:32:16.783Z" },
]
[[package]]
name = "fonttools"
version = "4.59.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/11/7f/29c9c3fe4246f6ad96fee52b88d0dc3a863c7563b0afc959e36d78b965dc/fonttools-4.59.1.tar.gz", hash = "sha256:74995b402ad09822a4c8002438e54940d9f1ecda898d2bb057729d7da983e4cb", size = 3534394, upload-time = "2025-08-14T16:28:14.266Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ac/fe/6e069cc4cb8881d164a9bd956e9df555bc62d3eb36f6282e43440200009c/fonttools-4.59.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:43ab814bbba5f02a93a152ee61a04182bb5809bd2bc3609f7822e12c53ae2c91", size = 2769172, upload-time = "2025-08-14T16:26:45.729Z" },
{ url = "https://files.pythonhosted.org/packages/b9/98/ec4e03f748fefa0dd72d9d95235aff6fef16601267f4a2340f0e16b9330f/fonttools-4.59.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4f04c3ffbfa0baafcbc550657cf83657034eb63304d27b05cff1653b448ccff6", size = 2337281, upload-time = "2025-08-14T16:26:47.921Z" },
{ url = "https://files.pythonhosted.org/packages/8b/b1/890360a7e3d04a30ba50b267aca2783f4c1364363797e892e78a4f036076/fonttools-4.59.1-cp312-cp312-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:d601b153e51a5a6221f0d4ec077b6bfc6ac35bfe6c19aeaa233d8990b2b71726", size = 4909215, upload-time = "2025-08-14T16:26:49.682Z" },
{ url = "https://files.pythonhosted.org/packages/8a/ec/2490599550d6c9c97a44c1e36ef4de52d6acf742359eaa385735e30c05c4/fonttools-4.59.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c735e385e30278c54f43a0d056736942023c9043f84ee1021eff9fd616d17693", size = 4951958, upload-time = "2025-08-14T16:26:51.616Z" },
{ url = "https://files.pythonhosted.org/packages/d1/40/bd053f6f7634234a9b9805ff8ae4f32df4f2168bee23cafd1271ba9915a9/fonttools-4.59.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1017413cdc8555dce7ee23720da490282ab7ec1cf022af90a241f33f9a49afc4", size = 4894738, upload-time = "2025-08-14T16:26:53.836Z" },
{ url = "https://files.pythonhosted.org/packages/ac/a1/3cd12a010d288325a7cfcf298a84825f0f9c29b01dee1baba64edfe89257/fonttools-4.59.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:5c6d8d773470a5107052874341ed3c487c16ecd179976d81afed89dea5cd7406", size = 5045983, upload-time = "2025-08-14T16:26:56.153Z" },
{ url = "https://files.pythonhosted.org/packages/a2/af/8a2c3f6619cc43cf87951405337cc8460d08a4e717bb05eaa94b335d11dc/fonttools-4.59.1-cp312-cp312-win32.whl", hash = "sha256:2a2d0d33307f6ad3a2086a95dd607c202ea8852fa9fb52af9b48811154d1428a", size = 2203407, upload-time = "2025-08-14T16:26:58.165Z" },
{ url = "https://files.pythonhosted.org/packages/8e/f2/a19b874ddbd3ebcf11d7e25188ef9ac3f68b9219c62263acb34aca8cde05/fonttools-4.59.1-cp312-cp312-win_amd64.whl", hash = "sha256:0b9e4fa7eaf046ed6ac470f6033d52c052481ff7a6e0a92373d14f556f298dc0", size = 2251561, upload-time = "2025-08-14T16:27:00.646Z" },
{ url = "https://files.pythonhosted.org/packages/19/5e/94a4d7f36c36e82f6a81e0064d148542e0ad3e6cf51fc5461ca128f3658d/fonttools-4.59.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:89d9957b54246c6251345297dddf77a84d2c19df96af30d2de24093bbdf0528b", size = 2760192, upload-time = "2025-08-14T16:27:03.024Z" },
{ url = "https://files.pythonhosted.org/packages/ee/a5/f50712fc33ef9d06953c660cefaf8c8fe4b8bc74fa21f44ee5e4f9739439/fonttools-4.59.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8156b11c0d5405810d216f53907bd0f8b982aa5f1e7e3127ab3be1a4062154ff", size = 2332694, upload-time = "2025-08-14T16:27:04.883Z" },
{ url = "https://files.pythonhosted.org/packages/e9/a2/5a9fc21c354bf8613215ce233ab0d933bd17d5ff4c29693636551adbc7b3/fonttools-4.59.1-cp313-cp313-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:8387876a8011caec52d327d5e5bca705d9399ec4b17afb8b431ec50d47c17d23", size = 4889254, upload-time = "2025-08-14T16:27:07.02Z" },
{ url = "https://files.pythonhosted.org/packages/2d/e5/54a6dc811eba018d022ca2e8bd6f2969291f9586ccf9a22a05fc55f91250/fonttools-4.59.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fb13823a74b3a9204a8ed76d3d6d5ec12e64cc5bc44914eb9ff1cdac04facd43", size = 4949109, upload-time = "2025-08-14T16:27:09.3Z" },
{ url = "https://files.pythonhosted.org/packages/db/15/b05c72a248a95bea0fd05fbd95acdf0742945942143fcf961343b7a3663a/fonttools-4.59.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e1ca10da138c300f768bb68e40e5b20b6ecfbd95f91aac4cc15010b6b9d65455", size = 4888428, upload-time = "2025-08-14T16:27:11.514Z" },
{ url = "https://files.pythonhosted.org/packages/63/71/c7d6840f858d695adc0c4371ec45e3fb1c8e060b276ba944e2800495aca4/fonttools-4.59.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2beb5bfc4887a3130f8625349605a3a45fe345655ce6031d1bac11017454b943", size = 5032668, upload-time = "2025-08-14T16:27:13.872Z" },
{ url = "https://files.pythonhosted.org/packages/90/54/57be4aca6f1312e2bc4d811200dd822325794e05bdb26eeff0976edca651/fonttools-4.59.1-cp313-cp313-win32.whl", hash = "sha256:419f16d750d78e6d704bfe97b48bba2f73b15c9418f817d0cb8a9ca87a5b94bf", size = 2201832, upload-time = "2025-08-14T16:27:16.126Z" },
{ url = "https://files.pythonhosted.org/packages/fc/1f/1899a6175a5f900ed8730a0d64f53ca1b596ed7609bfda033cf659114258/fonttools-4.59.1-cp313-cp313-win_amd64.whl", hash = "sha256:c536f8a852e8d3fa71dde1ec03892aee50be59f7154b533f0bf3c1174cfd5126", size = 2250673, upload-time = "2025-08-14T16:27:18.033Z" },
{ url = "https://files.pythonhosted.org/packages/15/07/f6ba82c22f118d9985c37fea65d8d715ca71300d78b6c6e90874dc59f11d/fonttools-4.59.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:d5c3bfdc9663f3d4b565f9cb3b8c1efb3e178186435b45105bde7328cfddd7fe", size = 2758606, upload-time = "2025-08-14T16:27:20.064Z" },
{ url = "https://files.pythonhosted.org/packages/3a/81/84aa3d0ce27b0112c28b67b637ff7a47cf401cf5fbfee6476e4bc9777580/fonttools-4.59.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:ea03f1da0d722fe3c2278a05957e6550175571a4894fbf9d178ceef4a3783d2b", size = 2330187, upload-time = "2025-08-14T16:27:22.42Z" },
{ url = "https://files.pythonhosted.org/packages/17/41/b3ba43f78afb321e2e50232c87304c8d0f5ab39b64389b8286cc39cdb824/fonttools-4.59.1-cp314-cp314-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:57a3708ca6bfccb790f585fa6d8f29432ec329618a09ff94c16bcb3c55994643", size = 4832020, upload-time = "2025-08-14T16:27:24.214Z" },
{ url = "https://files.pythonhosted.org/packages/67/b1/3af871c7fb325a68938e7ce544ca48bfd2c6bb7b357f3c8252933b29100a/fonttools-4.59.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:729367c91eb1ee84e61a733acc485065a00590618ca31c438e7dd4d600c01486", size = 4930687, upload-time = "2025-08-14T16:27:26.484Z" },
{ url = "https://files.pythonhosted.org/packages/c5/4f/299fc44646b30d9ef03ffaa78b109c7bd32121f0d8f10009ee73ac4514bc/fonttools-4.59.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:8f8ef66ac6db450193ed150e10b3b45dde7aded10c5d279968bc63368027f62b", size = 4875794, upload-time = "2025-08-14T16:27:28.887Z" },
{ url = "https://files.pythonhosted.org/packages/90/cf/a0a3d763ab58f5f81ceff104ddb662fd9da94248694862b9c6cbd509fdd5/fonttools-4.59.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:075f745d539a998cd92cb84c339a82e53e49114ec62aaea8307c80d3ad3aef3a", size = 4985780, upload-time = "2025-08-14T16:27:30.858Z" },
{ url = "https://files.pythonhosted.org/packages/72/c5/ba76511aaae143d89c29cd32ce30bafb61c477e8759a1590b8483f8065f8/fonttools-4.59.1-cp314-cp314-win32.whl", hash = "sha256:c2b0597522d4c5bb18aa5cf258746a2d4a90f25878cbe865e4d35526abd1b9fc", size = 2205610, upload-time = "2025-08-14T16:27:32.578Z" },
{ url = "https://files.pythonhosted.org/packages/a9/65/b250e69d6caf35bc65cddbf608be0662d741c248f2e7503ab01081fc267e/fonttools-4.59.1-cp314-cp314-win_amd64.whl", hash = "sha256:e9ad4ce044e3236f0814c906ccce8647046cc557539661e35211faadf76f283b", size = 2255376, upload-time = "2025-08-14T16:27:34.653Z" },
{ url = "https://files.pythonhosted.org/packages/11/f3/0bc63a23ac0f8175e23d82f85d6ee693fbd849de7ad739f0a3622182ad29/fonttools-4.59.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:652159e8214eb4856e8387ebcd6b6bd336ee258cbeb639c8be52005b122b9609", size = 2826546, upload-time = "2025-08-14T16:27:36.783Z" },
{ url = "https://files.pythonhosted.org/packages/e9/46/a3968205590e068fdf60e926be329a207782576cb584d3b7dcd2d2844957/fonttools-4.59.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:43d177cd0e847ea026fedd9f099dc917da136ed8792d142298a252836390c478", size = 2359771, upload-time = "2025-08-14T16:27:39.678Z" },
{ url = "https://files.pythonhosted.org/packages/b8/ff/d14b4c283879e8cb57862d9624a34fe6522b6fcdd46ccbfc58900958794a/fonttools-4.59.1-cp314-cp314t-manylinux1_x86_64.manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:e54437651e1440ee53a95e6ceb6ee440b67a3d348c76f45f4f48de1a5ecab019", size = 4831575, upload-time = "2025-08-14T16:27:41.885Z" },
{ url = "https://files.pythonhosted.org/packages/9c/04/a277d9a584a49d98ca12d3b2c6663bdf333ae97aaa83bd0cdabf7c5a6c84/fonttools-4.59.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6065fdec8ff44c32a483fd44abe5bcdb40dd5e2571a5034b555348f2b3a52cea", size = 5069962, upload-time = "2025-08-14T16:27:44.284Z" },
{ url = "https://files.pythonhosted.org/packages/16/6f/3d2ae69d96c4cdee6dfe7598ca5519a1514487700ca3d7c49c5a1ad65308/fonttools-4.59.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:42052b56d176f8b315fbc09259439c013c0cb2109df72447148aeda677599612", size = 4942926, upload-time = "2025-08-14T16:27:46.523Z" },
{ url = "https://files.pythonhosted.org/packages/0c/d3/c17379e0048d03ce26b38e4ab0e9a98280395b00529e093fe2d663ac0658/fonttools-4.59.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:bcd52eaa5c4c593ae9f447c1d13e7e4a00ca21d755645efa660b6999425b3c88", size = 4958678, upload-time = "2025-08-14T16:27:48.555Z" },
{ url = "https://files.pythonhosted.org/packages/8c/3f/c5543a1540abdfb4d375e3ebeb84de365ab9b153ec14cb7db05f537dd1e7/fonttools-4.59.1-cp314-cp314t-win32.whl", hash = "sha256:02e4fdf27c550dded10fe038a5981c29f81cb9bc649ff2eaa48e80dab8998f97", size = 2266706, upload-time = "2025-08-14T16:27:50.556Z" },
{ url = "https://files.pythonhosted.org/packages/3e/99/85bff6e674226bc8402f983e365f07e76d990e7220ba72bcc738fef52391/fonttools-4.59.1-cp314-cp314t-win_amd64.whl", hash = "sha256:412a5fd6345872a7c249dac5bcce380393f40c1c316ac07f447bc17d51900922", size = 2329994, upload-time = "2025-08-14T16:27:52.36Z" },
{ url = "https://files.pythonhosted.org/packages/0f/64/9d606e66d498917cd7a2ff24f558010d42d6fd4576d9dd57f0bd98333f5a/fonttools-4.59.1-py3-none-any.whl", hash = "sha256:647db657073672a8330608970a984d51573557f328030566521bc03415535042", size = 1130094, upload-time = "2025-08-14T16:28:12.048Z" },
]
[[package]]
name = "iniconfig"
version = "2.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
]
[[package]]
name = "kiwisolver"
version = "1.4.9"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/5c/3c/85844f1b0feb11ee581ac23fe5fce65cd049a200c1446708cc1b7f922875/kiwisolver-1.4.9.tar.gz", hash = "sha256:c3b22c26c6fd6811b0ae8363b95ca8ce4ea3c202d3d0975b2914310ceb1bcc4d", size = 97564, upload-time = "2025-08-10T21:27:49.279Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/86/c9/13573a747838aeb1c76e3267620daa054f4152444d1f3d1a2324b78255b5/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ac5a486ac389dddcc5bef4f365b6ae3ffff2c433324fb38dd35e3fab7c957999", size = 123686, upload-time = "2025-08-10T21:26:10.034Z" },
{ url = "https://files.pythonhosted.org/packages/51/ea/2ecf727927f103ffd1739271ca19c424d0e65ea473fbaeea1c014aea93f6/kiwisolver-1.4.9-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f2ba92255faa7309d06fe44c3a4a97efe1c8d640c2a79a5ef728b685762a6fd2", size = 66460, upload-time = "2025-08-10T21:26:11.083Z" },
{ url = "https://files.pythonhosted.org/packages/5b/5a/51f5464373ce2aeb5194508298a508b6f21d3867f499556263c64c621914/kiwisolver-1.4.9-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:4a2899935e724dd1074cb568ce7ac0dce28b2cd6ab539c8e001a8578eb106d14", size = 64952, upload-time = "2025-08-10T21:26:12.058Z" },
{ url = "https://files.pythonhosted.org/packages/70/90/6d240beb0f24b74371762873e9b7f499f1e02166a2d9c5801f4dbf8fa12e/kiwisolver-1.4.9-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f6008a4919fdbc0b0097089f67a1eb55d950ed7e90ce2cc3e640abadd2757a04", size = 1474756, upload-time = "2025-08-10T21:26:13.096Z" },
{ url = "https://files.pythonhosted.org/packages/12/42/f36816eaf465220f683fb711efdd1bbf7a7005a2473d0e4ed421389bd26c/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:67bb8b474b4181770f926f7b7d2f8c0248cbcb78b660fdd41a47054b28d2a752", size = 1276404, upload-time = "2025-08-10T21:26:14.457Z" },
{ url = "https://files.pythonhosted.org/packages/2e/64/bc2de94800adc830c476dce44e9b40fd0809cddeef1fde9fcf0f73da301f/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2327a4a30d3ee07d2fbe2e7933e8a37c591663b96ce42a00bc67461a87d7df77", size = 1294410, upload-time = "2025-08-10T21:26:15.73Z" },
{ url = "https://files.pythonhosted.org/packages/5f/42/2dc82330a70aa8e55b6d395b11018045e58d0bb00834502bf11509f79091/kiwisolver-1.4.9-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a08b491ec91b1d5053ac177afe5290adacf1f0f6307d771ccac5de30592d198", size = 1343631, upload-time = "2025-08-10T21:26:17.045Z" },
{ url = "https://files.pythonhosted.org/packages/22/fd/f4c67a6ed1aab149ec5a8a401c323cee7a1cbe364381bb6c9c0d564e0e20/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:d8fc5c867c22b828001b6a38d2eaeb88160bf5783c6cb4a5e440efc981ce286d", size = 2224963, upload-time = "2025-08-10T21:26:18.737Z" },
{ url = "https://files.pythonhosted.org/packages/45/aa/76720bd4cb3713314677d9ec94dcc21ced3f1baf4830adde5bb9b2430a5f/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:3b3115b2581ea35bb6d1f24a4c90af37e5d9b49dcff267eeed14c3893c5b86ab", size = 2321295, upload-time = "2025-08-10T21:26:20.11Z" },
{ url = "https://files.pythonhosted.org/packages/80/19/d3ec0d9ab711242f56ae0dc2fc5d70e298bb4a1f9dfab44c027668c673a1/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:858e4c22fb075920b96a291928cb7dea5644e94c0ee4fcd5af7e865655e4ccf2", size = 2487987, upload-time = "2025-08-10T21:26:21.49Z" },
{ url = "https://files.pythonhosted.org/packages/39/e9/61e4813b2c97e86b6fdbd4dd824bf72d28bcd8d4849b8084a357bc0dd64d/kiwisolver-1.4.9-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ed0fecd28cc62c54b262e3736f8bb2512d8dcfdc2bcf08be5f47f96bf405b145", size = 2291817, upload-time = "2025-08-10T21:26:22.812Z" },
{ url = "https://files.pythonhosted.org/packages/a0/41/85d82b0291db7504da3c2defe35c9a8a5c9803a730f297bd823d11d5fb77/kiwisolver-1.4.9-cp312-cp312-win_amd64.whl", hash = "sha256:f68208a520c3d86ea51acf688a3e3002615a7f0238002cccc17affecc86a8a54", size = 73895, upload-time = "2025-08-10T21:26:24.37Z" },
{ url = "https://files.pythonhosted.org/packages/e2/92/5f3068cf15ee5cb624a0c7596e67e2a0bb2adee33f71c379054a491d07da/kiwisolver-1.4.9-cp312-cp312-win_arm64.whl", hash = "sha256:2c1a4f57df73965f3f14df20b80ee29e6a7930a57d2d9e8491a25f676e197c60", size = 64992, upload-time = "2025-08-10T21:26:25.732Z" },
{ url = "https://files.pythonhosted.org/packages/31/c1/c2686cda909742ab66c7388e9a1a8521a59eb89f8bcfbee28fc980d07e24/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5d0432ccf1c7ab14f9949eec60c5d1f924f17c037e9f8b33352fa05799359b8", size = 123681, upload-time = "2025-08-10T21:26:26.725Z" },
{ url = "https://files.pythonhosted.org/packages/ca/f0/f44f50c9f5b1a1860261092e3bc91ecdc9acda848a8b8c6abfda4a24dd5c/kiwisolver-1.4.9-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efb3a45b35622bb6c16dbfab491a8f5a391fe0e9d45ef32f4df85658232ca0e2", size = 66464, upload-time = "2025-08-10T21:26:27.733Z" },
{ url = "https://files.pythonhosted.org/packages/2d/7a/9d90a151f558e29c3936b8a47ac770235f436f2120aca41a6d5f3d62ae8d/kiwisolver-1.4.9-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:1a12cf6398e8a0a001a059747a1cbf24705e18fe413bc22de7b3d15c67cffe3f", size = 64961, upload-time = "2025-08-10T21:26:28.729Z" },
{ url = "https://files.pythonhosted.org/packages/e9/e9/f218a2cb3a9ffbe324ca29a9e399fa2d2866d7f348ec3a88df87fc248fc5/kiwisolver-1.4.9-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b67e6efbf68e077dd71d1a6b37e43e1a99d0bff1a3d51867d45ee8908b931098", size = 1474607, upload-time = "2025-08-10T21:26:29.798Z" },
{ url = "https://files.pythonhosted.org/packages/d9/28/aac26d4c882f14de59041636292bc838db8961373825df23b8eeb807e198/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5656aa670507437af0207645273ccdfee4f14bacd7f7c67a4306d0dcaeaf6eed", size = 1276546, upload-time = "2025-08-10T21:26:31.401Z" },
{ url = "https://files.pythonhosted.org/packages/8b/ad/8bfc1c93d4cc565e5069162f610ba2f48ff39b7de4b5b8d93f69f30c4bed/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bfc08add558155345129c7803b3671cf195e6a56e7a12f3dde7c57d9b417f525", size = 1294482, upload-time = "2025-08-10T21:26:32.721Z" },
{ url = "https://files.pythonhosted.org/packages/da/f1/6aca55ff798901d8ce403206d00e033191f63d82dd708a186e0ed2067e9c/kiwisolver-1.4.9-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:40092754720b174e6ccf9e845d0d8c7d8e12c3d71e7fc35f55f3813e96376f78", size = 1343720, upload-time = "2025-08-10T21:26:34.032Z" },
{ url = "https://files.pythonhosted.org/packages/d1/91/eed031876c595c81d90d0f6fc681ece250e14bf6998c3d7c419466b523b7/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:497d05f29a1300d14e02e6441cf0f5ee81c1ff5a304b0d9fb77423974684e08b", size = 2224907, upload-time = "2025-08-10T21:26:35.824Z" },
{ url = "https://files.pythonhosted.org/packages/e9/ec/4d1925f2e49617b9cca9c34bfa11adefad49d00db038e692a559454dfb2e/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:bdd1a81a1860476eb41ac4bc1e07b3f07259e6d55bbf739b79c8aaedcf512799", size = 2321334, upload-time = "2025-08-10T21:26:37.534Z" },
{ url = "https://files.pythonhosted.org/packages/43/cb/450cd4499356f68802750c6ddc18647b8ea01ffa28f50d20598e0befe6e9/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:e6b93f13371d341afee3be9f7c5964e3fe61d5fa30f6a30eb49856935dfe4fc3", size = 2488313, upload-time = "2025-08-10T21:26:39.191Z" },
{ url = "https://files.pythonhosted.org/packages/71/67/fc76242bd99f885651128a5d4fa6083e5524694b7c88b489b1b55fdc491d/kiwisolver-1.4.9-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:d75aa530ccfaa593da12834b86a0724f58bff12706659baa9227c2ccaa06264c", size = 2291970, upload-time = "2025-08-10T21:26:40.828Z" },
{ url = "https://files.pythonhosted.org/packages/75/bd/f1a5d894000941739f2ae1b65a32892349423ad49c2e6d0771d0bad3fae4/kiwisolver-1.4.9-cp313-cp313-win_amd64.whl", hash = "sha256:dd0a578400839256df88c16abddf9ba14813ec5f21362e1fe65022e00c883d4d", size = 73894, upload-time = "2025-08-10T21:26:42.33Z" },
{ url = "https://files.pythonhosted.org/packages/95/38/dce480814d25b99a391abbddadc78f7c117c6da34be68ca8b02d5848b424/kiwisolver-1.4.9-cp313-cp313-win_arm64.whl", hash = "sha256:d4188e73af84ca82468f09cadc5ac4db578109e52acb4518d8154698d3a87ca2", size = 64995, upload-time = "2025-08-10T21:26:43.889Z" },
{ url = "https://files.pythonhosted.org/packages/e2/37/7d218ce5d92dadc5ebdd9070d903e0c7cf7edfe03f179433ac4d13ce659c/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:5a0f2724dfd4e3b3ac5a82436a8e6fd16baa7d507117e4279b660fe8ca38a3a1", size = 126510, upload-time = "2025-08-10T21:26:44.915Z" },
{ url = "https://files.pythonhosted.org/packages/23/b0/e85a2b48233daef4b648fb657ebbb6f8367696a2d9548a00b4ee0eb67803/kiwisolver-1.4.9-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:1b11d6a633e4ed84fc0ddafd4ebfd8ea49b3f25082c04ad12b8315c11d504dc1", size = 67903, upload-time = "2025-08-10T21:26:45.934Z" },
{ url = "https://files.pythonhosted.org/packages/44/98/f2425bc0113ad7de24da6bb4dae1343476e95e1d738be7c04d31a5d037fd/kiwisolver-1.4.9-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61874cdb0a36016354853593cffc38e56fc9ca5aa97d2c05d3dcf6922cd55a11", size = 66402, upload-time = "2025-08-10T21:26:47.101Z" },
{ url = "https://files.pythonhosted.org/packages/98/d8/594657886df9f34c4177cc353cc28ca7e6e5eb562d37ccc233bff43bbe2a/kiwisolver-1.4.9-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:60c439763a969a6af93b4881db0eed8fadf93ee98e18cbc35bc8da868d0c4f0c", size = 1582135, upload-time = "2025-08-10T21:26:48.665Z" },
{ url = "https://files.pythonhosted.org/packages/5c/c6/38a115b7170f8b306fc929e166340c24958347308ea3012c2b44e7e295db/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92a2f997387a1b79a75e7803aa7ded2cfbe2823852ccf1ba3bcf613b62ae3197", size = 1389409, upload-time = "2025-08-10T21:26:50.335Z" },
{ url = "https://files.pythonhosted.org/packages/bf/3b/e04883dace81f24a568bcee6eb3001da4ba05114afa622ec9b6fafdc1f5e/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31d512c812daea6d8b3be3b2bfcbeb091dbb09177706569bcfc6240dcf8b41c", size = 1401763, upload-time = "2025-08-10T21:26:51.867Z" },
{ url = "https://files.pythonhosted.org/packages/9f/80/20ace48e33408947af49d7d15c341eaee69e4e0304aab4b7660e234d6288/kiwisolver-1.4.9-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:52a15b0f35dad39862d376df10c5230155243a2c1a436e39eb55623ccbd68185", size = 1453643, upload-time = "2025-08-10T21:26:53.592Z" },
{ url = "https://files.pythonhosted.org/packages/64/31/6ce4380a4cd1f515bdda976a1e90e547ccd47b67a1546d63884463c92ca9/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a30fd6fdef1430fd9e1ba7b3398b5ee4e2887783917a687d86ba69985fb08748", size = 2330818, upload-time = "2025-08-10T21:26:55.051Z" },
{ url = "https://files.pythonhosted.org/packages/fa/e9/3f3fcba3bcc7432c795b82646306e822f3fd74df0ee81f0fa067a1f95668/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cc9617b46837c6468197b5945e196ee9ca43057bb7d9d1ae688101e4e1dddf64", size = 2419963, upload-time = "2025-08-10T21:26:56.421Z" },
{ url = "https://files.pythonhosted.org/packages/99/43/7320c50e4133575c66e9f7dadead35ab22d7c012a3b09bb35647792b2a6d/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:0ab74e19f6a2b027ea4f845a78827969af45ce790e6cb3e1ebab71bdf9f215ff", size = 2594639, upload-time = "2025-08-10T21:26:57.882Z" },
{ url = "https://files.pythonhosted.org/packages/65/d6/17ae4a270d4a987ef8a385b906d2bdfc9fce502d6dc0d3aea865b47f548c/kiwisolver-1.4.9-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dba5ee5d3981160c28d5490f0d1b7ed730c22470ff7f6cc26cfcfaacb9896a07", size = 2391741, upload-time = "2025-08-10T21:26:59.237Z" },
{ url = "https://files.pythonhosted.org/packages/2a/8f/8f6f491d595a9e5912971f3f863d81baddccc8a4d0c3749d6a0dd9ffc9df/kiwisolver-1.4.9-cp313-cp313t-win_arm64.whl", hash = "sha256:0749fd8f4218ad2e851e11cc4dc05c7cbc0cbc4267bdfdb31782e65aace4ee9c", size = 68646, upload-time = "2025-08-10T21:27:00.52Z" },
{ url = "https://files.pythonhosted.org/packages/6b/32/6cc0fbc9c54d06c2969faa9c1d29f5751a2e51809dd55c69055e62d9b426/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:9928fe1eb816d11ae170885a74d074f57af3a0d65777ca47e9aeb854a1fba386", size = 123806, upload-time = "2025-08-10T21:27:01.537Z" },
{ url = "https://files.pythonhosted.org/packages/b2/dd/2bfb1d4a4823d92e8cbb420fe024b8d2167f72079b3bb941207c42570bdf/kiwisolver-1.4.9-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d0005b053977e7b43388ddec89fa567f43d4f6d5c2c0affe57de5ebf290dc552", size = 66605, upload-time = "2025-08-10T21:27:03.335Z" },
{ url = "https://files.pythonhosted.org/packages/f7/69/00aafdb4e4509c2ca6064646cba9cd4b37933898f426756adb2cb92ebbed/kiwisolver-1.4.9-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2635d352d67458b66fd0667c14cb1d4145e9560d503219034a18a87e971ce4f3", size = 64925, upload-time = "2025-08-10T21:27:04.339Z" },
{ url = "https://files.pythonhosted.org/packages/43/dc/51acc6791aa14e5cb6d8a2e28cefb0dc2886d8862795449d021334c0df20/kiwisolver-1.4.9-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:767c23ad1c58c9e827b649a9ab7809fd5fd9db266a9cf02b0e926ddc2c680d58", size = 1472414, upload-time = "2025-08-10T21:27:05.437Z" },
{ url = "https://files.pythonhosted.org/packages/3d/bb/93fa64a81db304ac8a246f834d5094fae4b13baf53c839d6bb6e81177129/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:72d0eb9fba308b8311685c2268cf7d0a0639a6cd027d8128659f72bdd8a024b4", size = 1281272, upload-time = "2025-08-10T21:27:07.063Z" },
{ url = "https://files.pythonhosted.org/packages/70/e6/6df102916960fb8d05069d4bd92d6d9a8202d5a3e2444494e7cd50f65b7a/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f68e4f3eeca8fb22cc3d731f9715a13b652795ef657a13df1ad0c7dc0e9731df", size = 1298578, upload-time = "2025-08-10T21:27:08.452Z" },
{ url = "https://files.pythonhosted.org/packages/7c/47/e142aaa612f5343736b087864dbaebc53ea8831453fb47e7521fa8658f30/kiwisolver-1.4.9-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d84cd4061ae292d8ac367b2c3fa3aad11cb8625a95d135fe93f286f914f3f5a6", size = 1345607, upload-time = "2025-08-10T21:27:10.125Z" },
{ url = "https://files.pythonhosted.org/packages/54/89/d641a746194a0f4d1a3670fb900d0dbaa786fb98341056814bc3f058fa52/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:a60ea74330b91bd22a29638940d115df9dc00af5035a9a2a6ad9399ffb4ceca5", size = 2230150, upload-time = "2025-08-10T21:27:11.484Z" },
{ url = "https://files.pythonhosted.org/packages/aa/6b/5ee1207198febdf16ac11f78c5ae40861b809cbe0e6d2a8d5b0b3044b199/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:ce6a3a4e106cf35c2d9c4fa17c05ce0b180db622736845d4315519397a77beaf", size = 2325979, upload-time = "2025-08-10T21:27:12.917Z" },
{ url = "https://files.pythonhosted.org/packages/fc/ff/b269eefd90f4ae14dcc74973d5a0f6d28d3b9bb1afd8c0340513afe6b39a/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:77937e5e2a38a7b48eef0585114fe7930346993a88060d0bf886086d2aa49ef5", size = 2491456, upload-time = "2025-08-10T21:27:14.353Z" },
{ url = "https://files.pythonhosted.org/packages/fc/d4/10303190bd4d30de547534601e259a4fbf014eed94aae3e5521129215086/kiwisolver-1.4.9-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:24c175051354f4a28c5d6a31c93906dc653e2bf234e8a4bbfb964892078898ce", size = 2294621, upload-time = "2025-08-10T21:27:15.808Z" },
{ url = "https://files.pythonhosted.org/packages/28/e0/a9a90416fce5c0be25742729c2ea52105d62eda6c4be4d803c2a7be1fa50/kiwisolver-1.4.9-cp314-cp314-win_amd64.whl", hash = "sha256:0763515d4df10edf6d06a3c19734e2566368980d21ebec439f33f9eb936c07b7", size = 75417, upload-time = "2025-08-10T21:27:17.436Z" },
{ url = "https://files.pythonhosted.org/packages/1f/10/6949958215b7a9a264299a7db195564e87900f709db9245e4ebdd3c70779/kiwisolver-1.4.9-cp314-cp314-win_arm64.whl", hash = "sha256:0e4e2bf29574a6a7b7f6cb5fa69293b9f96c928949ac4a53ba3f525dffb87f9c", size = 66582, upload-time = "2025-08-10T21:27:18.436Z" },
{ url = "https://files.pythonhosted.org/packages/ec/79/60e53067903d3bc5469b369fe0dfc6b3482e2133e85dae9daa9527535991/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d976bbb382b202f71c67f77b0ac11244021cfa3f7dfd9e562eefcea2df711548", size = 126514, upload-time = "2025-08-10T21:27:19.465Z" },
{ url = "https://files.pythonhosted.org/packages/25/d1/4843d3e8d46b072c12a38c97c57fab4608d36e13fe47d47ee96b4d61ba6f/kiwisolver-1.4.9-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:2489e4e5d7ef9a1c300a5e0196e43d9c739f066ef23270607d45aba368b91f2d", size = 67905, upload-time = "2025-08-10T21:27:20.51Z" },
{ url = "https://files.pythonhosted.org/packages/8c/ae/29ffcbd239aea8b93108de1278271ae764dfc0d803a5693914975f200596/kiwisolver-1.4.9-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:e2ea9f7ab7fbf18fffb1b5434ce7c69a07582f7acc7717720f1d69f3e806f90c", size = 66399, upload-time = "2025-08-10T21:27:21.496Z" },
{ url = "https://files.pythonhosted.org/packages/a1/ae/d7ba902aa604152c2ceba5d352d7b62106bedbccc8e95c3934d94472bfa3/kiwisolver-1.4.9-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:b34e51affded8faee0dfdb705416153819d8ea9250bbbf7ea1b249bdeb5f1122", size = 1582197, upload-time = "2025-08-10T21:27:22.604Z" },
{ url = "https://files.pythonhosted.org/packages/f2/41/27c70d427eddb8bc7e4f16420a20fefc6f480312122a59a959fdfe0445ad/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d8aacd3d4b33b772542b2e01beb50187536967b514b00003bdda7589722d2a64", size = 1390125, upload-time = "2025-08-10T21:27:24.036Z" },
{ url = "https://files.pythonhosted.org/packages/41/42/b3799a12bafc76d962ad69083f8b43b12bf4fe78b097b12e105d75c9b8f1/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:7cf974dd4e35fa315563ac99d6287a1024e4dc2077b8a7d7cd3d2fb65d283134", size = 1402612, upload-time = "2025-08-10T21:27:25.773Z" },
{ url = "https://files.pythonhosted.org/packages/d2/b5/a210ea073ea1cfaca1bb5c55a62307d8252f531beb364e18aa1e0888b5a0/kiwisolver-1.4.9-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:85bd218b5ecfbee8c8a82e121802dcb519a86044c9c3b2e4aef02fa05c6da370", size = 1453990, upload-time = "2025-08-10T21:27:27.089Z" },
{ url = "https://files.pythonhosted.org/packages/5f/ce/a829eb8c033e977d7ea03ed32fb3c1781b4fa0433fbadfff29e39c676f32/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0856e241c2d3df4efef7c04a1e46b1936b6120c9bcf36dd216e3acd84bc4fb21", size = 2331601, upload-time = "2025-08-10T21:27:29.343Z" },
{ url = "https://files.pythonhosted.org/packages/e0/4b/b5e97eb142eb9cd0072dacfcdcd31b1c66dc7352b0f7c7255d339c0edf00/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9af39d6551f97d31a4deebeac6f45b156f9755ddc59c07b402c148f5dbb6482a", size = 2422041, upload-time = "2025-08-10T21:27:30.754Z" },
{ url = "https://files.pythonhosted.org/packages/40/be/8eb4cd53e1b85ba4edc3a9321666f12b83113a178845593307a3e7891f44/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:bb4ae2b57fc1d8cbd1cf7b1d9913803681ffa903e7488012be5b76dedf49297f", size = 2594897, upload-time = "2025-08-10T21:27:32.803Z" },
{ url = "https://files.pythonhosted.org/packages/99/dd/841e9a66c4715477ea0abc78da039832fbb09dac5c35c58dc4c41a407b8a/kiwisolver-1.4.9-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:aedff62918805fb62d43a4aa2ecd4482c380dc76cd31bd7c8878588a61bd0369", size = 2391835, upload-time = "2025-08-10T21:27:34.23Z" },
{ url = "https://files.pythonhosted.org/packages/0c/28/4b2e5c47a0da96896fdfdb006340ade064afa1e63675d01ea5ac222b6d52/kiwisolver-1.4.9-cp314-cp314t-win_amd64.whl", hash = "sha256:1fa333e8b2ce4d9660f2cda9c0e1b6bafcfb2457a9d259faa82289e73ec24891", size = 79988, upload-time = "2025-08-10T21:27:35.587Z" },
{ url = "https://files.pythonhosted.org/packages/80/be/3578e8afd18c88cdf9cb4cffde75a96d2be38c5a903f1ed0ceec061bd09e/kiwisolver-1.4.9-cp314-cp314t-win_arm64.whl", hash = "sha256:4a48a2ce79d65d363597ef7b567ce3d14d68783d2b2263d98db3d9477805ba32", size = 70260, upload-time = "2025-08-10T21:27:36.606Z" },
]
[[package]]
name = "markdown-it-py"
version = "4.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mdurl" },
]
sdist = { url = "https://files.pythonhosted.org/packages/5b/f5/4ec618ed16cc4f8fb3b701563655a69816155e79e24a17b651541804721d/markdown_it_py-4.0.0.tar.gz", hash = "sha256:cb0a2b4aa34f932c007117b194e945bd74e0ec24133ceb5bac59009cda1cb9f3", size = 73070, upload-time = "2025-08-11T12:57:52.854Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/94/54/e7d793b573f298e1c9013b8c4dade17d481164aa517d1d7148619c2cedbf/markdown_it_py-4.0.0-py3-none-any.whl", hash = "sha256:87327c59b172c5011896038353a81343b6754500a08cd7a4973bb48c6d578147", size = 87321, upload-time = "2025-08-11T12:57:51.923Z" },
]
[[package]]
name = "matplotlib"
version = "3.10.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "contourpy" },
{ name = "cycler" },
{ name = "fonttools" },
{ name = "kiwisolver" },
{ name = "numpy" },
{ name = "packaging" },
{ name = "pillow" },
{ name = "pyparsing" },
{ name = "python-dateutil" },
]
sdist = { url = "https://files.pythonhosted.org/packages/43/91/f2939bb60b7ebf12478b030e0d7f340247390f402b3b189616aad790c366/matplotlib-3.10.5.tar.gz", hash = "sha256:352ed6ccfb7998a00881692f38b4ca083c691d3e275b4145423704c34c909076", size = 34804044, upload-time = "2025-07-31T18:09:33.805Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/66/1e/c6f6bcd882d589410b475ca1fc22e34e34c82adff519caf18f3e6dd9d682/matplotlib-3.10.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:00b6feadc28a08bd3c65b2894f56cf3c94fc8f7adcbc6ab4516ae1e8ed8f62e2", size = 8253056, upload-time = "2025-07-31T18:08:05.385Z" },
{ url = "https://files.pythonhosted.org/packages/53/e6/d6f7d1b59413f233793dda14419776f5f443bcccb2dfc84b09f09fe05dbe/matplotlib-3.10.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ee98a5c5344dc7f48dc261b6ba5d9900c008fc12beb3fa6ebda81273602cc389", size = 8110131, upload-time = "2025-07-31T18:08:07.293Z" },
{ url = "https://files.pythonhosted.org/packages/66/2b/bed8a45e74957549197a2ac2e1259671cd80b55ed9e1fe2b5c94d88a9202/matplotlib-3.10.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:a17e57e33de901d221a07af32c08870ed4528db0b6059dce7d7e65c1122d4bea", size = 8669603, upload-time = "2025-07-31T18:08:09.064Z" },
{ url = "https://files.pythonhosted.org/packages/7e/a7/315e9435b10d057f5e52dfc603cd353167ae28bb1a4e033d41540c0067a4/matplotlib-3.10.5-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97b9d6443419085950ee4a5b1ee08c363e5c43d7176e55513479e53669e88468", size = 9508127, upload-time = "2025-07-31T18:08:10.845Z" },
{ url = "https://files.pythonhosted.org/packages/7f/d9/edcbb1f02ca99165365d2768d517898c22c6040187e2ae2ce7294437c413/matplotlib-3.10.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ceefe5d40807d29a66ae916c6a3915d60ef9f028ce1927b84e727be91d884369", size = 9566926, upload-time = "2025-07-31T18:08:13.186Z" },
{ url = "https://files.pythonhosted.org/packages/3b/d9/6dd924ad5616c97b7308e6320cf392c466237a82a2040381163b7500510a/matplotlib-3.10.5-cp312-cp312-win_amd64.whl", hash = "sha256:c04cba0f93d40e45b3c187c6c52c17f24535b27d545f757a2fffebc06c12b98b", size = 8107599, upload-time = "2025-07-31T18:08:15.116Z" },
{ url = "https://files.pythonhosted.org/packages/0e/f3/522dc319a50f7b0279fbe74f86f7a3506ce414bc23172098e8d2bdf21894/matplotlib-3.10.5-cp312-cp312-win_arm64.whl", hash = "sha256:a41bcb6e2c8e79dc99c5511ae6f7787d2fb52efd3d805fff06d5d4f667db16b2", size = 7978173, upload-time = "2025-07-31T18:08:21.518Z" },
{ url = "https://files.pythonhosted.org/packages/8d/05/4f3c1f396075f108515e45cb8d334aff011a922350e502a7472e24c52d77/matplotlib-3.10.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:354204db3f7d5caaa10e5de74549ef6a05a4550fdd1c8f831ab9bca81efd39ed", size = 8253586, upload-time = "2025-07-31T18:08:23.107Z" },
{ url = "https://files.pythonhosted.org/packages/2f/2c/e084415775aac7016c3719fe7006cdb462582c6c99ac142f27303c56e243/matplotlib-3.10.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:b072aac0c3ad563a2b3318124756cb6112157017f7431626600ecbe890df57a1", size = 8110715, upload-time = "2025-07-31T18:08:24.675Z" },
{ url = "https://files.pythonhosted.org/packages/52/1b/233e3094b749df16e3e6cd5a44849fd33852e692ad009cf7de00cf58ddf6/matplotlib-3.10.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d52fd5b684d541b5a51fb276b2b97b010c75bee9aa392f96b4a07aeb491e33c7", size = 8669397, upload-time = "2025-07-31T18:08:26.778Z" },
{ url = "https://files.pythonhosted.org/packages/e8/ec/03f9e003a798f907d9f772eed9b7c6a9775d5bd00648b643ebfb88e25414/matplotlib-3.10.5-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee7a09ae2f4676276f5a65bd9f2bd91b4f9fbaedf49f40267ce3f9b448de501f", size = 9508646, upload-time = "2025-07-31T18:08:28.848Z" },
{ url = "https://files.pythonhosted.org/packages/91/e7/c051a7a386680c28487bca27d23b02d84f63e3d2a9b4d2fc478e6a42e37e/matplotlib-3.10.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ba6c3c9c067b83481d647af88b4e441d532acdb5ef22178a14935b0b881188f4", size = 9567424, upload-time = "2025-07-31T18:08:30.726Z" },
{ url = "https://files.pythonhosted.org/packages/36/c2/24302e93ff431b8f4173ee1dd88976c8d80483cadbc5d3d777cef47b3a1c/matplotlib-3.10.5-cp313-cp313-win_amd64.whl", hash = "sha256:07442d2692c9bd1cceaa4afb4bbe5b57b98a7599de4dabfcca92d3eea70f9ebe", size = 8107809, upload-time = "2025-07-31T18:08:33.928Z" },
{ url = "https://files.pythonhosted.org/packages/0b/33/423ec6a668d375dad825197557ed8fbdb74d62b432c1ed8235465945475f/matplotlib-3.10.5-cp313-cp313-win_arm64.whl", hash = "sha256:48fe6d47380b68a37ccfcc94f009530e84d41f71f5dae7eda7c4a5a84aa0a674", size = 7978078, upload-time = "2025-07-31T18:08:36.764Z" },
{ url = "https://files.pythonhosted.org/packages/51/17/521fc16ec766455c7bb52cc046550cf7652f6765ca8650ff120aa2d197b6/matplotlib-3.10.5-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:3b80eb8621331449fc519541a7461987f10afa4f9cfd91afcd2276ebe19bd56c", size = 8295590, upload-time = "2025-07-31T18:08:38.521Z" },
{ url = "https://files.pythonhosted.org/packages/f8/12/23c28b2c21114c63999bae129fce7fd34515641c517ae48ce7b7dcd33458/matplotlib-3.10.5-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:47a388908e469d6ca2a6015858fa924e0e8a2345a37125948d8e93a91c47933e", size = 8158518, upload-time = "2025-07-31T18:08:40.195Z" },
{ url = "https://files.pythonhosted.org/packages/81/f8/aae4eb25e8e7190759f3cb91cbeaa344128159ac92bb6b409e24f8711f78/matplotlib-3.10.5-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8b6b49167d208358983ce26e43aa4196073b4702858670f2eb111f9a10652b4b", size = 8691815, upload-time = "2025-07-31T18:08:42.238Z" },
{ url = "https://files.pythonhosted.org/packages/d0/ba/450c39ebdd486bd33a359fc17365ade46c6a96bf637bbb0df7824de2886c/matplotlib-3.10.5-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8a8da0453a7fd8e3da114234ba70c5ba9ef0e98f190309ddfde0f089accd46ea", size = 9522814, upload-time = "2025-07-31T18:08:44.914Z" },
{ url = "https://files.pythonhosted.org/packages/89/11/9c66f6a990e27bb9aa023f7988d2d5809cb98aa39c09cbf20fba75a542ef/matplotlib-3.10.5-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:52c6573dfcb7726a9907b482cd5b92e6b5499b284ffacb04ffbfe06b3e568124", size = 9573917, upload-time = "2025-07-31T18:08:47.038Z" },
{ url = "https://files.pythonhosted.org/packages/b3/69/8b49394de92569419e5e05e82e83df9b749a0ff550d07631ea96ed2eb35a/matplotlib-3.10.5-cp313-cp313t-win_amd64.whl", hash = "sha256:a23193db2e9d64ece69cac0c8231849db7dd77ce59c7b89948cf9d0ce655a3ce", size = 8181034, upload-time = "2025-07-31T18:08:48.943Z" },
{ url = "https://files.pythonhosted.org/packages/47/23/82dc435bb98a2fc5c20dffcac8f0b083935ac28286413ed8835df40d0baa/matplotlib-3.10.5-cp313-cp313t-win_arm64.whl", hash = "sha256:56da3b102cf6da2776fef3e71cd96fcf22103a13594a18ac9a9b31314e0be154", size = 8023337, upload-time = "2025-07-31T18:08:50.791Z" },
{ url = "https://files.pythonhosted.org/packages/ac/e0/26b6cfde31f5383503ee45dcb7e691d45dadf0b3f54639332b59316a97f8/matplotlib-3.10.5-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:96ef8f5a3696f20f55597ffa91c28e2e73088df25c555f8d4754931515512715", size = 8253591, upload-time = "2025-07-31T18:08:53.254Z" },
{ url = "https://files.pythonhosted.org/packages/c1/89/98488c7ef7ea20ea659af7499628c240a608b337af4be2066d644cfd0a0f/matplotlib-3.10.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:77fab633e94b9da60512d4fa0213daeb76d5a7b05156840c4fd0399b4b818837", size = 8112566, upload-time = "2025-07-31T18:08:55.116Z" },
{ url = "https://files.pythonhosted.org/packages/52/67/42294dfedc82aea55e1a767daf3263aacfb5a125f44ba189e685bab41b6f/matplotlib-3.10.5-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:27f52634315e96b1debbfdc5c416592edcd9c4221bc2f520fd39c33db5d9f202", size = 9513281, upload-time = "2025-07-31T18:08:56.885Z" },
{ url = "https://files.pythonhosted.org/packages/e7/68/f258239e0cf34c2cbc816781c7ab6fca768452e6bf1119aedd2bd4a882a3/matplotlib-3.10.5-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:525f6e28c485c769d1f07935b660c864de41c37fd716bfa64158ea646f7084bb", size = 9780873, upload-time = "2025-07-31T18:08:59.241Z" },
{ url = "https://files.pythonhosted.org/packages/89/64/f4881554006bd12e4558bd66778bdd15d47b00a1f6c6e8b50f6208eda4b3/matplotlib-3.10.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:1f5f3ec4c191253c5f2b7c07096a142c6a1c024d9f738247bfc8e3f9643fc975", size = 9568954, upload-time = "2025-07-31T18:09:01.244Z" },
{ url = "https://files.pythonhosted.org/packages/06/f8/42779d39c3f757e1f012f2dda3319a89fb602bd2ef98ce8faf0281f4febd/matplotlib-3.10.5-cp314-cp314-win_amd64.whl", hash = "sha256:707f9c292c4cd4716f19ab8a1f93f26598222cd931e0cd98fbbb1c5994bf7667", size = 8237465, upload-time = "2025-07-31T18:09:03.206Z" },
{ url = "https://files.pythonhosted.org/packages/cf/f8/153fd06b5160f0cd27c8b9dd797fcc9fb56ac6a0ebf3c1f765b6b68d3c8a/matplotlib-3.10.5-cp314-cp314-win_arm64.whl", hash = "sha256:21a95b9bf408178d372814de7baacd61c712a62cae560b5e6f35d791776f6516", size = 8108898, upload-time = "2025-07-31T18:09:05.231Z" },
{ url = "https://files.pythonhosted.org/packages/9a/ee/c4b082a382a225fe0d2a73f1f57cf6f6f132308805b493a54c8641006238/matplotlib-3.10.5-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:a6b310f95e1102a8c7c817ef17b60ee5d1851b8c71b63d9286b66b177963039e", size = 8295636, upload-time = "2025-07-31T18:09:07.306Z" },
{ url = "https://files.pythonhosted.org/packages/30/73/2195fa2099718b21a20da82dfc753bf2af58d596b51aefe93e359dd5915a/matplotlib-3.10.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:94986a242747a0605cb3ff1cb98691c736f28a59f8ffe5175acaeb7397c49a5a", size = 8158575, upload-time = "2025-07-31T18:09:09.083Z" },
{ url = "https://files.pythonhosted.org/packages/f6/e9/a08cdb34618a91fa08f75e6738541da5cacde7c307cea18ff10f0d03fcff/matplotlib-3.10.5-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1ff10ea43288f0c8bab608a305dc6c918cc729d429c31dcbbecde3b9f4d5b569", size = 9522815, upload-time = "2025-07-31T18:09:11.191Z" },
{ url = "https://files.pythonhosted.org/packages/4e/bb/34d8b7e0d1bb6d06ef45db01dfa560d5a67b1c40c0b998ce9ccde934bb09/matplotlib-3.10.5-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f6adb644c9d040ffb0d3434e440490a66cf73dbfa118a6f79cd7568431f7a012", size = 9783514, upload-time = "2025-07-31T18:09:13.307Z" },
{ url = "https://files.pythonhosted.org/packages/12/09/d330d1e55dcca2e11b4d304cc5227f52e2512e46828d6249b88e0694176e/matplotlib-3.10.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:4fa40a8f98428f789a9dcacd625f59b7bc4e3ef6c8c7c80187a7a709475cf592", size = 9573932, upload-time = "2025-07-31T18:09:15.335Z" },
{ url = "https://files.pythonhosted.org/packages/eb/3b/f70258ac729aa004aca673800a53a2b0a26d49ca1df2eaa03289a1c40f81/matplotlib-3.10.5-cp314-cp314t-win_amd64.whl", hash = "sha256:95672a5d628b44207aab91ec20bf59c26da99de12b88f7e0b1fb0a84a86ff959", size = 8322003, upload-time = "2025-07-31T18:09:17.416Z" },
{ url = "https://files.pythonhosted.org/packages/5b/60/3601f8ce6d76a7c81c7f25a0e15fde0d6b66226dd187aa6d2838e6374161/matplotlib-3.10.5-cp314-cp314t-win_arm64.whl", hash = "sha256:2efaf97d72629e74252e0b5e3c46813e9eeaa94e011ecf8084a971a31a97f40b", size = 8153849, upload-time = "2025-07-31T18:09:19.673Z" },
]
[[package]]
name = "mdurl"
version = "0.1.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]
[[package]]
name = "numpy"
version = "2.3.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/37/7d/3fec4199c5ffb892bed55cff901e4f39a58c81df9c44c280499e92cad264/numpy-2.3.2.tar.gz", hash = "sha256:e0486a11ec30cdecb53f184d496d1c6a20786c81e55e41640270130056f8ee48", size = 20489306, upload-time = "2025-07-24T21:32:07.553Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/00/6d/745dd1c1c5c284d17725e5c802ca4d45cfc6803519d777f087b71c9f4069/numpy-2.3.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:bc3186bea41fae9d8e90c2b4fb5f0a1f5a690682da79b92574d63f56b529080b", size = 20956420, upload-time = "2025-07-24T20:28:18.002Z" },
{ url = "https://files.pythonhosted.org/packages/bc/96/e7b533ea5740641dd62b07a790af5d9d8fec36000b8e2d0472bd7574105f/numpy-2.3.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2f4f0215edb189048a3c03bd5b19345bdfa7b45a7a6f72ae5945d2a28272727f", size = 14184660, upload-time = "2025-07-24T20:28:39.522Z" },
{ url = "https://files.pythonhosted.org/packages/2b/53/102c6122db45a62aa20d1b18c9986f67e6b97e0d6fbc1ae13e3e4c84430c/numpy-2.3.2-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:8b1224a734cd509f70816455c3cffe13a4f599b1bf7130f913ba0e2c0b2006c0", size = 5113382, upload-time = "2025-07-24T20:28:48.544Z" },
{ url = "https://files.pythonhosted.org/packages/2b/21/376257efcbf63e624250717e82b4fae93d60178f09eb03ed766dbb48ec9c/numpy-2.3.2-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:3dcf02866b977a38ba3ec10215220609ab9667378a9e2150615673f3ffd6c73b", size = 6647258, upload-time = "2025-07-24T20:28:59.104Z" },
{ url = "https://files.pythonhosted.org/packages/91/ba/f4ebf257f08affa464fe6036e13f2bf9d4642a40228781dc1235da81be9f/numpy-2.3.2-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:572d5512df5470f50ada8d1972c5f1082d9a0b7aa5944db8084077570cf98370", size = 14281409, upload-time = "2025-07-24T20:40:30.298Z" },
{ url = "https://files.pythonhosted.org/packages/59/ef/f96536f1df42c668cbacb727a8c6da7afc9c05ece6d558927fb1722693e1/numpy-2.3.2-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8145dd6d10df13c559d1e4314df29695613575183fa2e2d11fac4c208c8a1f73", size = 16641317, upload-time = "2025-07-24T20:40:56.625Z" },
{ url = "https://files.pythonhosted.org/packages/f6/a7/af813a7b4f9a42f498dde8a4c6fcbff8100eed00182cc91dbaf095645f38/numpy-2.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:103ea7063fa624af04a791c39f97070bf93b96d7af7eb23530cd087dc8dbe9dc", size = 16056262, upload-time = "2025-07-24T20:41:20.797Z" },
{ url = "https://files.pythonhosted.org/packages/8b/5d/41c4ef8404caaa7f05ed1cfb06afe16a25895260eacbd29b4d84dff2920b/numpy-2.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fc927d7f289d14f5e037be917539620603294454130b6de200091e23d27dc9be", size = 18579342, upload-time = "2025-07-24T20:41:50.753Z" },
{ url = "https://files.pythonhosted.org/packages/a1/4f/9950e44c5a11636f4a3af6e825ec23003475cc9a466edb7a759ed3ea63bd/numpy-2.3.2-cp312-cp312-win32.whl", hash = "sha256:d95f59afe7f808c103be692175008bab926b59309ade3e6d25009e9a171f7036", size = 6320610, upload-time = "2025-07-24T20:42:01.551Z" },
{ url = "https://files.pythonhosted.org/packages/7c/2f/244643a5ce54a94f0a9a2ab578189c061e4a87c002e037b0829dd77293b6/numpy-2.3.2-cp312-cp312-win_amd64.whl", hash = "sha256:9e196ade2400c0c737d93465327d1ae7c06c7cb8a1756121ebf54b06ca183c7f", size = 12786292, upload-time = "2025-07-24T20:42:20.738Z" },
{ url = "https://files.pythonhosted.org/packages/54/cd/7b5f49d5d78db7badab22d8323c1b6ae458fbf86c4fdfa194ab3cd4eb39b/numpy-2.3.2-cp312-cp312-win_arm64.whl", hash = "sha256:ee807923782faaf60d0d7331f5e86da7d5e3079e28b291973c545476c2b00d07", size = 10194071, upload-time = "2025-07-24T20:42:36.657Z" },
{ url = "https://files.pythonhosted.org/packages/1c/c0/c6bb172c916b00700ed3bf71cb56175fd1f7dbecebf8353545d0b5519f6c/numpy-2.3.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c8d9727f5316a256425892b043736d63e89ed15bbfe6556c5ff4d9d4448ff3b3", size = 20949074, upload-time = "2025-07-24T20:43:07.813Z" },
{ url = "https://files.pythonhosted.org/packages/20/4e/c116466d22acaf4573e58421c956c6076dc526e24a6be0903219775d862e/numpy-2.3.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:efc81393f25f14d11c9d161e46e6ee348637c0a1e8a54bf9dedc472a3fae993b", size = 14177311, upload-time = "2025-07-24T20:43:29.335Z" },
{ url = "https://files.pythonhosted.org/packages/78/45/d4698c182895af189c463fc91d70805d455a227261d950e4e0f1310c2550/numpy-2.3.2-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:dd937f088a2df683cbb79dda9a772b62a3e5a8a7e76690612c2737f38c6ef1b6", size = 5106022, upload-time = "2025-07-24T20:43:37.999Z" },
{ url = "https://files.pythonhosted.org/packages/9f/76/3e6880fef4420179309dba72a8c11f6166c431cf6dee54c577af8906f914/numpy-2.3.2-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:11e58218c0c46c80509186e460d79fbdc9ca1eb8d8aee39d8f2dc768eb781089", size = 6640135, upload-time = "2025-07-24T20:43:49.28Z" },
{ url = "https://files.pythonhosted.org/packages/34/fa/87ff7f25b3c4ce9085a62554460b7db686fef1e0207e8977795c7b7d7ba1/numpy-2.3.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5ad4ebcb683a1f99f4f392cc522ee20a18b2bb12a2c1c42c3d48d5a1adc9d3d2", size = 14278147, upload-time = "2025-07-24T20:44:10.328Z" },
{ url = "https://files.pythonhosted.org/packages/1d/0f/571b2c7a3833ae419fe69ff7b479a78d313581785203cc70a8db90121b9a/numpy-2.3.2-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:938065908d1d869c7d75d8ec45f735a034771c6ea07088867f713d1cd3bbbe4f", size = 16635989, upload-time = "2025-07-24T20:44:34.88Z" },
{ url = "https://files.pythonhosted.org/packages/24/5a/84ae8dca9c9a4c592fe11340b36a86ffa9fd3e40513198daf8a97839345c/numpy-2.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:66459dccc65d8ec98cc7df61307b64bf9e08101f9598755d42d8ae65d9a7a6ee", size = 16053052, upload-time = "2025-07-24T20:44:58.872Z" },
{ url = "https://files.pythonhosted.org/packages/57/7c/e5725d99a9133b9813fcf148d3f858df98511686e853169dbaf63aec6097/numpy-2.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a7af9ed2aa9ec5950daf05bb11abc4076a108bd3c7db9aa7251d5f107079b6a6", size = 18577955, upload-time = "2025-07-24T20:45:26.714Z" },
{ url = "https://files.pythonhosted.org/packages/ae/11/7c546fcf42145f29b71e4d6f429e96d8d68e5a7ba1830b2e68d7418f0bbd/numpy-2.3.2-cp313-cp313-win32.whl", hash = "sha256:906a30249315f9c8e17b085cc5f87d3f369b35fedd0051d4a84686967bdbbd0b", size = 6311843, upload-time = "2025-07-24T20:49:24.444Z" },
{ url = "https://files.pythonhosted.org/packages/aa/6f/a428fd1cb7ed39b4280d057720fed5121b0d7754fd2a9768640160f5517b/numpy-2.3.2-cp313-cp313-win_amd64.whl", hash = "sha256:c63d95dc9d67b676e9108fe0d2182987ccb0f11933c1e8959f42fa0da8d4fa56", size = 12782876, upload-time = "2025-07-24T20:49:43.227Z" },
{ url = "https://files.pythonhosted.org/packages/65/85/4ea455c9040a12595fb6c43f2c217257c7b52dd0ba332c6a6c1d28b289fe/numpy-2.3.2-cp313-cp313-win_arm64.whl", hash = "sha256:b05a89f2fb84d21235f93de47129dd4f11c16f64c87c33f5e284e6a3a54e43f2", size = 10192786, upload-time = "2025-07-24T20:49:59.443Z" },
{ url = "https://files.pythonhosted.org/packages/80/23/8278f40282d10c3f258ec3ff1b103d4994bcad78b0cba9208317f6bb73da/numpy-2.3.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4e6ecfeddfa83b02318f4d84acf15fbdbf9ded18e46989a15a8b6995dfbf85ab", size = 21047395, upload-time = "2025-07-24T20:45:58.821Z" },
{ url = "https://files.pythonhosted.org/packages/1f/2d/624f2ce4a5df52628b4ccd16a4f9437b37c35f4f8a50d00e962aae6efd7a/numpy-2.3.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:508b0eada3eded10a3b55725b40806a4b855961040180028f52580c4729916a2", size = 14300374, upload-time = "2025-07-24T20:46:20.207Z" },
{ url = "https://files.pythonhosted.org/packages/f6/62/ff1e512cdbb829b80a6bd08318a58698867bca0ca2499d101b4af063ee97/numpy-2.3.2-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:754d6755d9a7588bdc6ac47dc4ee97867271b17cee39cb87aef079574366db0a", size = 5228864, upload-time = "2025-07-24T20:46:30.58Z" },
{ url = "https://files.pythonhosted.org/packages/7d/8e/74bc18078fff03192d4032cfa99d5a5ca937807136d6f5790ce07ca53515/numpy-2.3.2-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:a9f66e7d2b2d7712410d3bc5684149040ef5f19856f20277cd17ea83e5006286", size = 6737533, upload-time = "2025-07-24T20:46:46.111Z" },
{ url = "https://files.pythonhosted.org/packages/19/ea/0731efe2c9073ccca5698ef6a8c3667c4cf4eea53fcdcd0b50140aba03bc/numpy-2.3.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:de6ea4e5a65d5a90c7d286ddff2b87f3f4ad61faa3db8dabe936b34c2275b6f8", size = 14352007, upload-time = "2025-07-24T20:47:07.1Z" },
{ url = "https://files.pythonhosted.org/packages/cf/90/36be0865f16dfed20f4bc7f75235b963d5939707d4b591f086777412ff7b/numpy-2.3.2-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a3ef07ec8cbc8fc9e369c8dcd52019510c12da4de81367d8b20bc692aa07573a", size = 16701914, upload-time = "2025-07-24T20:47:32.459Z" },
{ url = "https://files.pythonhosted.org/packages/94/30/06cd055e24cb6c38e5989a9e747042b4e723535758e6153f11afea88c01b/numpy-2.3.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:27c9f90e7481275c7800dc9c24b7cc40ace3fdb970ae4d21eaff983a32f70c91", size = 16132708, upload-time = "2025-07-24T20:47:58.129Z" },
{ url = "https://files.pythonhosted.org/packages/9a/14/ecede608ea73e58267fd7cb78f42341b3b37ba576e778a1a06baffbe585c/numpy-2.3.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:07b62978075b67eee4065b166d000d457c82a1efe726cce608b9db9dd66a73a5", size = 18651678, upload-time = "2025-07-24T20:48:25.402Z" },
{ url = "https://files.pythonhosted.org/packages/40/f3/2fe6066b8d07c3685509bc24d56386534c008b462a488b7f503ba82b8923/numpy-2.3.2-cp313-cp313t-win32.whl", hash = "sha256:c771cfac34a4f2c0de8e8c97312d07d64fd8f8ed45bc9f5726a7e947270152b5", size = 6441832, upload-time = "2025-07-24T20:48:37.181Z" },
{ url = "https://files.pythonhosted.org/packages/0b/ba/0937d66d05204d8f28630c9c60bc3eda68824abde4cf756c4d6aad03b0c6/numpy-2.3.2-cp313-cp313t-win_amd64.whl", hash = "sha256:72dbebb2dcc8305c431b2836bcc66af967df91be793d63a24e3d9b741374c450", size = 12927049, upload-time = "2025-07-24T20:48:56.24Z" },
{ url = "https://files.pythonhosted.org/packages/e9/ed/13542dd59c104d5e654dfa2ac282c199ba64846a74c2c4bcdbc3a0f75df1/numpy-2.3.2-cp313-cp313t-win_arm64.whl", hash = "sha256:72c6df2267e926a6d5286b0a6d556ebe49eae261062059317837fda12ddf0c1a", size = 10262935, upload-time = "2025-07-24T20:49:13.136Z" },
{ url = "https://files.pythonhosted.org/packages/c9/7c/7659048aaf498f7611b783e000c7268fcc4dcf0ce21cd10aad7b2e8f9591/numpy-2.3.2-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:448a66d052d0cf14ce9865d159bfc403282c9bc7bb2a31b03cc18b651eca8b1a", size = 20950906, upload-time = "2025-07-24T20:50:30.346Z" },
{ url = "https://files.pythonhosted.org/packages/80/db/984bea9d4ddf7112a04cfdfb22b1050af5757864cfffe8e09e44b7f11a10/numpy-2.3.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:546aaf78e81b4081b2eba1d105c3b34064783027a06b3ab20b6eba21fb64132b", size = 14185607, upload-time = "2025-07-24T20:50:51.923Z" },
{ url = "https://files.pythonhosted.org/packages/e4/76/b3d6f414f4eca568f469ac112a3b510938d892bc5a6c190cb883af080b77/numpy-2.3.2-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:87c930d52f45df092f7578889711a0768094debf73cfcde105e2d66954358125", size = 5114110, upload-time = "2025-07-24T20:51:01.041Z" },
{ url = "https://files.pythonhosted.org/packages/9e/d2/6f5e6826abd6bca52392ed88fe44a4b52aacb60567ac3bc86c67834c3a56/numpy-2.3.2-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:8dc082ea901a62edb8f59713c6a7e28a85daddcb67454c839de57656478f5b19", size = 6642050, upload-time = "2025-07-24T20:51:11.64Z" },
{ url = "https://files.pythonhosted.org/packages/c4/43/f12b2ade99199e39c73ad182f103f9d9791f48d885c600c8e05927865baf/numpy-2.3.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:af58de8745f7fa9ca1c0c7c943616c6fe28e75d0c81f5c295810e3c83b5be92f", size = 14296292, upload-time = "2025-07-24T20:51:33.488Z" },
{ url = "https://files.pythonhosted.org/packages/5d/f9/77c07d94bf110a916b17210fac38680ed8734c236bfed9982fd8524a7b47/numpy-2.3.2-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed5527c4cf10f16c6d0b6bee1f89958bccb0ad2522c8cadc2efd318bcd545f5", size = 16638913, upload-time = "2025-07-24T20:51:58.517Z" },
{ url = "https://files.pythonhosted.org/packages/9b/d1/9d9f2c8ea399cc05cfff8a7437453bd4e7d894373a93cdc46361bbb49a7d/numpy-2.3.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:095737ed986e00393ec18ec0b21b47c22889ae4b0cd2d5e88342e08b01141f58", size = 16071180, upload-time = "2025-07-24T20:52:22.827Z" },
{ url = "https://files.pythonhosted.org/packages/4c/41/82e2c68aff2a0c9bf315e47d61951099fed65d8cb2c8d9dc388cb87e947e/numpy-2.3.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:b5e40e80299607f597e1a8a247ff8d71d79c5b52baa11cc1cce30aa92d2da6e0", size = 18576809, upload-time = "2025-07-24T20:52:51.015Z" },
{ url = "https://files.pythonhosted.org/packages/14/14/4b4fd3efb0837ed252d0f583c5c35a75121038a8c4e065f2c259be06d2d8/numpy-2.3.2-cp314-cp314-win32.whl", hash = "sha256:7d6e390423cc1f76e1b8108c9b6889d20a7a1f59d9a60cac4a050fa734d6c1e2", size = 6366410, upload-time = "2025-07-24T20:56:44.949Z" },
{ url = "https://files.pythonhosted.org/packages/11/9e/b4c24a6b8467b61aced5c8dc7dcfce23621baa2e17f661edb2444a418040/numpy-2.3.2-cp314-cp314-win_amd64.whl", hash = "sha256:b9d0878b21e3918d76d2209c924ebb272340da1fb51abc00f986c258cd5e957b", size = 12918821, upload-time = "2025-07-24T20:57:06.479Z" },
{ url = "https://files.pythonhosted.org/packages/0e/0f/0dc44007c70b1007c1cef86b06986a3812dd7106d8f946c09cfa75782556/numpy-2.3.2-cp314-cp314-win_arm64.whl", hash = "sha256:2738534837c6a1d0c39340a190177d7d66fdf432894f469728da901f8f6dc910", size = 10477303, upload-time = "2025-07-24T20:57:22.879Z" },
{ url = "https://files.pythonhosted.org/packages/8b/3e/075752b79140b78ddfc9c0a1634d234cfdbc6f9bbbfa6b7504e445ad7d19/numpy-2.3.2-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:4d002ecf7c9b53240be3bb69d80f86ddbd34078bae04d87be81c1f58466f264e", size = 21047524, upload-time = "2025-07-24T20:53:22.086Z" },
{ url = "https://files.pythonhosted.org/packages/fe/6d/60e8247564a72426570d0e0ea1151b95ce5bd2f1597bb878a18d32aec855/numpy-2.3.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:293b2192c6bcce487dbc6326de5853787f870aeb6c43f8f9c6496db5b1781e45", size = 14300519, upload-time = "2025-07-24T20:53:44.053Z" },
{ url = "https://files.pythonhosted.org/packages/4d/73/d8326c442cd428d47a067070c3ac6cc3b651a6e53613a1668342a12d4479/numpy-2.3.2-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:0a4f2021a6da53a0d580d6ef5db29947025ae8b35b3250141805ea9a32bbe86b", size = 5228972, upload-time = "2025-07-24T20:53:53.81Z" },
{ url = "https://files.pythonhosted.org/packages/34/2e/e71b2d6dad075271e7079db776196829019b90ce3ece5c69639e4f6fdc44/numpy-2.3.2-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:9c144440db4bf3bb6372d2c3e49834cc0ff7bb4c24975ab33e01199e645416f2", size = 6737439, upload-time = "2025-07-24T20:54:04.742Z" },
{ url = "https://files.pythonhosted.org/packages/15/b0/d004bcd56c2c5e0500ffc65385eb6d569ffd3363cb5e593ae742749b2daa/numpy-2.3.2-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f92d6c2a8535dc4fe4419562294ff957f83a16ebdec66df0805e473ffaad8bd0", size = 14352479, upload-time = "2025-07-24T20:54:25.819Z" },
{ url = "https://files.pythonhosted.org/packages/11/e3/285142fcff8721e0c99b51686426165059874c150ea9ab898e12a492e291/numpy-2.3.2-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cefc2219baa48e468e3db7e706305fcd0c095534a192a08f31e98d83a7d45fb0", size = 16702805, upload-time = "2025-07-24T20:54:50.814Z" },
{ url = "https://files.pythonhosted.org/packages/33/c3/33b56b0e47e604af2c7cd065edca892d180f5899599b76830652875249a3/numpy-2.3.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:76c3e9501ceb50b2ff3824c3589d5d1ab4ac857b0ee3f8f49629d0de55ecf7c2", size = 16133830, upload-time = "2025-07-24T20:55:17.306Z" },
{ url = "https://files.pythonhosted.org/packages/6e/ae/7b1476a1f4d6a48bc669b8deb09939c56dd2a439db1ab03017844374fb67/numpy-2.3.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:122bf5ed9a0221b3419672493878ba4967121514b1d7d4656a7580cd11dddcbf", size = 18652665, upload-time = "2025-07-24T20:55:46.665Z" },
{ url = "https://files.pythonhosted.org/packages/14/ba/5b5c9978c4bb161034148ade2de9db44ec316fab89ce8c400db0e0c81f86/numpy-2.3.2-cp314-cp314t-win32.whl", hash = "sha256:6f1ae3dcb840edccc45af496f312528c15b1f79ac318169d094e85e4bb35fdf1", size = 6514777, upload-time = "2025-07-24T20:55:57.66Z" },
{ url = "https://files.pythonhosted.org/packages/eb/46/3dbaf0ae7c17cdc46b9f662c56da2054887b8d9e737c1476f335c83d33db/numpy-2.3.2-cp314-cp314t-win_amd64.whl", hash = "sha256:087ffc25890d89a43536f75c5fe8770922008758e8eeeef61733957041ed2f9b", size = 13111856, upload-time = "2025-07-24T20:56:17.318Z" },
{ url = "https://files.pythonhosted.org/packages/c1/9e/1652778bce745a67b5fe05adde60ed362d38eb17d919a540e813d30f6874/numpy-2.3.2-cp314-cp314t-win_arm64.whl", hash = "sha256:092aeb3449833ea9c0bf0089d70c29ae480685dd2377ec9cdbbb620257f84631", size = 10544226, upload-time = "2025-07-24T20:56:34.509Z" },
]
[[package]]
name = "orderflow-backtest"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "matplotlib" },
{ name = "pyqt5" },
{ name = "typer" },
]
[package.dev-dependencies]
dev = [
{ name = "pytest" },
]
[package.metadata]
requires-dist = [
{ name = "matplotlib", specifier = ">=3.10.5" },
{ name = "pyqt5", specifier = ">=5.15.11" },
{ name = "typer", specifier = ">=0.16.1" },
]
[package.metadata.requires-dev]
dev = [{ name = "pytest", specifier = ">=8.4.1" }]
[[package]]
name = "packaging"
version = "25.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]
[[package]]
name = "pillow"
version = "11.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f3/0d/d0d6dea55cd152ce3d6767bb38a8fc10e33796ba4ba210cbab9354b6d238/pillow-11.3.0.tar.gz", hash = "sha256:3828ee7586cd0b2091b6209e5ad53e20d0649bbe87164a459d0676e035e8f523", size = 47113069, upload-time = "2025-07-01T09:16:30.666Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/40/fe/1bc9b3ee13f68487a99ac9529968035cca2f0a51ec36892060edcc51d06a/pillow-11.3.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:fdae223722da47b024b867c1ea0be64e0df702c5e0a60e27daad39bf960dd1e4", size = 5278800, upload-time = "2025-07-01T09:14:17.648Z" },
{ url = "https://files.pythonhosted.org/packages/2c/32/7e2ac19b5713657384cec55f89065fb306b06af008cfd87e572035b27119/pillow-11.3.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:921bd305b10e82b4d1f5e802b6850677f965d8394203d182f078873851dada69", size = 4686296, upload-time = "2025-07-01T09:14:19.828Z" },
{ url = "https://files.pythonhosted.org/packages/8e/1e/b9e12bbe6e4c2220effebc09ea0923a07a6da1e1f1bfbc8d7d29a01ce32b/pillow-11.3.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:eb76541cba2f958032d79d143b98a3a6b3ea87f0959bbe256c0b5e416599fd5d", size = 5871726, upload-time = "2025-07-03T13:10:04.448Z" },
{ url = "https://files.pythonhosted.org/packages/8d/33/e9200d2bd7ba00dc3ddb78df1198a6e80d7669cce6c2bdbeb2530a74ec58/pillow-11.3.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:67172f2944ebba3d4a7b54f2e95c786a3a50c21b88456329314caaa28cda70f6", size = 7644652, upload-time = "2025-07-03T13:10:10.391Z" },
{ url = "https://files.pythonhosted.org/packages/41/f1/6f2427a26fc683e00d985bc391bdd76d8dd4e92fac33d841127eb8fb2313/pillow-11.3.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97f07ed9f56a3b9b5f49d3661dc9607484e85c67e27f3e8be2c7d28ca032fec7", size = 5977787, upload-time = "2025-07-01T09:14:21.63Z" },
{ url = "https://files.pythonhosted.org/packages/e4/c9/06dd4a38974e24f932ff5f98ea3c546ce3f8c995d3f0985f8e5ba48bba19/pillow-11.3.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:676b2815362456b5b3216b4fd5bd89d362100dc6f4945154ff172e206a22c024", size = 6645236, upload-time = "2025-07-01T09:14:23.321Z" },
{ url = "https://files.pythonhosted.org/packages/40/e7/848f69fb79843b3d91241bad658e9c14f39a32f71a301bcd1d139416d1be/pillow-11.3.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3e184b2f26ff146363dd07bde8b711833d7b0202e27d13540bfe2e35a323a809", size = 6086950, upload-time = "2025-07-01T09:14:25.237Z" },
{ url = "https://files.pythonhosted.org/packages/0b/1a/7cff92e695a2a29ac1958c2a0fe4c0b2393b60aac13b04a4fe2735cad52d/pillow-11.3.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6be31e3fc9a621e071bc17bb7de63b85cbe0bfae91bb0363c893cbe67247780d", size = 6723358, upload-time = "2025-07-01T09:14:27.053Z" },
{ url = "https://files.pythonhosted.org/packages/26/7d/73699ad77895f69edff76b0f332acc3d497f22f5d75e5360f78cbcaff248/pillow-11.3.0-cp312-cp312-win32.whl", hash = "sha256:7b161756381f0918e05e7cb8a371fff367e807770f8fe92ecb20d905d0e1c149", size = 6275079, upload-time = "2025-07-01T09:14:30.104Z" },
{ url = "https://files.pythonhosted.org/packages/8c/ce/e7dfc873bdd9828f3b6e5c2bbb74e47a98ec23cc5c74fc4e54462f0d9204/pillow-11.3.0-cp312-cp312-win_amd64.whl", hash = "sha256:a6444696fce635783440b7f7a9fc24b3ad10a9ea3f0ab66c5905be1c19ccf17d", size = 6986324, upload-time = "2025-07-01T09:14:31.899Z" },
{ url = "https://files.pythonhosted.org/packages/16/8f/b13447d1bf0b1f7467ce7d86f6e6edf66c0ad7cf44cf5c87a37f9bed9936/pillow-11.3.0-cp312-cp312-win_arm64.whl", hash = "sha256:2aceea54f957dd4448264f9bf40875da0415c83eb85f55069d89c0ed436e3542", size = 2423067, upload-time = "2025-07-01T09:14:33.709Z" },
{ url = "https://files.pythonhosted.org/packages/1e/93/0952f2ed8db3a5a4c7a11f91965d6184ebc8cd7cbb7941a260d5f018cd2d/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:1c627742b539bba4309df89171356fcb3cc5a9178355b2727d1b74a6cf155fbd", size = 2128328, upload-time = "2025-07-01T09:14:35.276Z" },
{ url = "https://files.pythonhosted.org/packages/4b/e8/100c3d114b1a0bf4042f27e0f87d2f25e857e838034e98ca98fe7b8c0a9c/pillow-11.3.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:30b7c02f3899d10f13d7a48163c8969e4e653f8b43416d23d13d1bbfdc93b9f8", size = 2170652, upload-time = "2025-07-01T09:14:37.203Z" },
{ url = "https://files.pythonhosted.org/packages/aa/86/3f758a28a6e381758545f7cdb4942e1cb79abd271bea932998fc0db93cb6/pillow-11.3.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:7859a4cc7c9295f5838015d8cc0a9c215b77e43d07a25e460f35cf516df8626f", size = 2227443, upload-time = "2025-07-01T09:14:39.344Z" },
{ url = "https://files.pythonhosted.org/packages/01/f4/91d5b3ffa718df2f53b0dc109877993e511f4fd055d7e9508682e8aba092/pillow-11.3.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ec1ee50470b0d050984394423d96325b744d55c701a439d2bd66089bff963d3c", size = 5278474, upload-time = "2025-07-01T09:14:41.843Z" },
{ url = "https://files.pythonhosted.org/packages/f9/0e/37d7d3eca6c879fbd9dba21268427dffda1ab00d4eb05b32923d4fbe3b12/pillow-11.3.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7db51d222548ccfd274e4572fdbf3e810a5e66b00608862f947b163e613b67dd", size = 4686038, upload-time = "2025-07-01T09:14:44.008Z" },
{ url = "https://files.pythonhosted.org/packages/ff/b0/3426e5c7f6565e752d81221af9d3676fdbb4f352317ceafd42899aaf5d8a/pillow-11.3.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:2d6fcc902a24ac74495df63faad1884282239265c6839a0a6416d33faedfae7e", size = 5864407, upload-time = "2025-07-03T13:10:15.628Z" },
{ url = "https://files.pythonhosted.org/packages/fc/c1/c6c423134229f2a221ee53f838d4be9d82bab86f7e2f8e75e47b6bf6cd77/pillow-11.3.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f0f5d8f4a08090c6d6d578351a2b91acf519a54986c055af27e7a93feae6d3f1", size = 7639094, upload-time = "2025-07-03T13:10:21.857Z" },
{ url = "https://files.pythonhosted.org/packages/ba/c9/09e6746630fe6372c67c648ff9deae52a2bc20897d51fa293571977ceb5d/pillow-11.3.0-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c37d8ba9411d6003bba9e518db0db0c58a680ab9fe5179f040b0463644bc9805", size = 5973503, upload-time = "2025-07-01T09:14:45.698Z" },
{ url = "https://files.pythonhosted.org/packages/d5/1c/a2a29649c0b1983d3ef57ee87a66487fdeb45132df66ab30dd37f7dbe162/pillow-11.3.0-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:13f87d581e71d9189ab21fe0efb5a23e9f28552d5be6979e84001d3b8505abe8", size = 6642574, upload-time = "2025-07-01T09:14:47.415Z" },
{ url = "https://files.pythonhosted.org/packages/36/de/d5cc31cc4b055b6c6fd990e3e7f0f8aaf36229a2698501bcb0cdf67c7146/pillow-11.3.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:023f6d2d11784a465f09fd09a34b150ea4672e85fb3d05931d89f373ab14abb2", size = 6084060, upload-time = "2025-07-01T09:14:49.636Z" },
{ url = "https://files.pythonhosted.org/packages/d5/ea/502d938cbaeec836ac28a9b730193716f0114c41325db428e6b280513f09/pillow-11.3.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:45dfc51ac5975b938e9809451c51734124e73b04d0f0ac621649821a63852e7b", size = 6721407, upload-time = "2025-07-01T09:14:51.962Z" },
{ url = "https://files.pythonhosted.org/packages/45/9c/9c5e2a73f125f6cbc59cc7087c8f2d649a7ae453f83bd0362ff7c9e2aee2/pillow-11.3.0-cp313-cp313-win32.whl", hash = "sha256:a4d336baed65d50d37b88ca5b60c0fa9d81e3a87d4a7930d3880d1624d5b31f3", size = 6273841, upload-time = "2025-07-01T09:14:54.142Z" },
{ url = "https://files.pythonhosted.org/packages/23/85/397c73524e0cd212067e0c969aa245b01d50183439550d24d9f55781b776/pillow-11.3.0-cp313-cp313-win_amd64.whl", hash = "sha256:0bce5c4fd0921f99d2e858dc4d4d64193407e1b99478bc5cacecba2311abde51", size = 6978450, upload-time = "2025-07-01T09:14:56.436Z" },
{ url = "https://files.pythonhosted.org/packages/17/d2/622f4547f69cd173955194b78e4d19ca4935a1b0f03a302d655c9f6aae65/pillow-11.3.0-cp313-cp313-win_arm64.whl", hash = "sha256:1904e1264881f682f02b7f8167935cce37bc97db457f8e7849dc3a6a52b99580", size = 2423055, upload-time = "2025-07-01T09:14:58.072Z" },
{ url = "https://files.pythonhosted.org/packages/dd/80/a8a2ac21dda2e82480852978416cfacd439a4b490a501a288ecf4fe2532d/pillow-11.3.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4c834a3921375c48ee6b9624061076bc0a32a60b5532b322cc0ea64e639dd50e", size = 5281110, upload-time = "2025-07-01T09:14:59.79Z" },
{ url = "https://files.pythonhosted.org/packages/44/d6/b79754ca790f315918732e18f82a8146d33bcd7f4494380457ea89eb883d/pillow-11.3.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5e05688ccef30ea69b9317a9ead994b93975104a677a36a8ed8106be9260aa6d", size = 4689547, upload-time = "2025-07-01T09:15:01.648Z" },
{ url = "https://files.pythonhosted.org/packages/49/20/716b8717d331150cb00f7fdd78169c01e8e0c219732a78b0e59b6bdb2fd6/pillow-11.3.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:1019b04af07fc0163e2810167918cb5add8d74674b6267616021ab558dc98ced", size = 5901554, upload-time = "2025-07-03T13:10:27.018Z" },
{ url = "https://files.pythonhosted.org/packages/74/cf/a9f3a2514a65bb071075063a96f0a5cf949c2f2fce683c15ccc83b1c1cab/pillow-11.3.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f944255db153ebb2b19c51fe85dd99ef0ce494123f21b9db4877ffdfc5590c7c", size = 7669132, upload-time = "2025-07-03T13:10:33.01Z" },
{ url = "https://files.pythonhosted.org/packages/98/3c/da78805cbdbee9cb43efe8261dd7cc0b4b93f2ac79b676c03159e9db2187/pillow-11.3.0-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1f85acb69adf2aaee8b7da124efebbdb959a104db34d3a2cb0f3793dbae422a8", size = 6005001, upload-time = "2025-07-01T09:15:03.365Z" },
{ url = "https://files.pythonhosted.org/packages/6c/fa/ce044b91faecf30e635321351bba32bab5a7e034c60187fe9698191aef4f/pillow-11.3.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:05f6ecbeff5005399bb48d198f098a9b4b6bdf27b8487c7f38ca16eeb070cd59", size = 6668814, upload-time = "2025-07-01T09:15:05.655Z" },
{ url = "https://files.pythonhosted.org/packages/7b/51/90f9291406d09bf93686434f9183aba27b831c10c87746ff49f127ee80cb/pillow-11.3.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:a7bc6e6fd0395bc052f16b1a8670859964dbd7003bd0af2ff08342eb6e442cfe", size = 6113124, upload-time = "2025-07-01T09:15:07.358Z" },
{ url = "https://files.pythonhosted.org/packages/cd/5a/6fec59b1dfb619234f7636d4157d11fb4e196caeee220232a8d2ec48488d/pillow-11.3.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:83e1b0161c9d148125083a35c1c5a89db5b7054834fd4387499e06552035236c", size = 6747186, upload-time = "2025-07-01T09:15:09.317Z" },
{ url = "https://files.pythonhosted.org/packages/49/6b/00187a044f98255225f172de653941e61da37104a9ea60e4f6887717e2b5/pillow-11.3.0-cp313-cp313t-win32.whl", hash = "sha256:2a3117c06b8fb646639dce83694f2f9eac405472713fcb1ae887469c0d4f6788", size = 6277546, upload-time = "2025-07-01T09:15:11.311Z" },
{ url = "https://files.pythonhosted.org/packages/e8/5c/6caaba7e261c0d75bab23be79f1d06b5ad2a2ae49f028ccec801b0e853d6/pillow-11.3.0-cp313-cp313t-win_amd64.whl", hash = "sha256:857844335c95bea93fb39e0fa2726b4d9d758850b34075a7e3ff4f4fa3aa3b31", size = 6985102, upload-time = "2025-07-01T09:15:13.164Z" },
{ url = "https://files.pythonhosted.org/packages/f3/7e/b623008460c09a0cb38263c93b828c666493caee2eb34ff67f778b87e58c/pillow-11.3.0-cp313-cp313t-win_arm64.whl", hash = "sha256:8797edc41f3e8536ae4b10897ee2f637235c94f27404cac7297f7b607dd0716e", size = 2424803, upload-time = "2025-07-01T09:15:15.695Z" },
{ url = "https://files.pythonhosted.org/packages/73/f4/04905af42837292ed86cb1b1dabe03dce1edc008ef14c473c5c7e1443c5d/pillow-11.3.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d9da3df5f9ea2a89b81bb6087177fb1f4d1c7146d583a3fe5c672c0d94e55e12", size = 5278520, upload-time = "2025-07-01T09:15:17.429Z" },
{ url = "https://files.pythonhosted.org/packages/41/b0/33d79e377a336247df6348a54e6d2a2b85d644ca202555e3faa0cf811ecc/pillow-11.3.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0b275ff9b04df7b640c59ec5a3cb113eefd3795a8df80bac69646ef699c6981a", size = 4686116, upload-time = "2025-07-01T09:15:19.423Z" },
{ url = "https://files.pythonhosted.org/packages/49/2d/ed8bc0ab219ae8768f529597d9509d184fe8a6c4741a6864fea334d25f3f/pillow-11.3.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:0743841cabd3dba6a83f38a92672cccbd69af56e3e91777b0ee7f4dba4385632", size = 5864597, upload-time = "2025-07-03T13:10:38.404Z" },
{ url = "https://files.pythonhosted.org/packages/b5/3d/b932bb4225c80b58dfadaca9d42d08d0b7064d2d1791b6a237f87f661834/pillow-11.3.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:2465a69cf967b8b49ee1b96d76718cd98c4e925414ead59fdf75cf0fd07df673", size = 7638246, upload-time = "2025-07-03T13:10:44.987Z" },
{ url = "https://files.pythonhosted.org/packages/09/b5/0487044b7c096f1b48f0d7ad416472c02e0e4bf6919541b111efd3cae690/pillow-11.3.0-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:41742638139424703b4d01665b807c6468e23e699e8e90cffefe291c5832b027", size = 5973336, upload-time = "2025-07-01T09:15:21.237Z" },
{ url = "https://files.pythonhosted.org/packages/a8/2d/524f9318f6cbfcc79fbc004801ea6b607ec3f843977652fdee4857a7568b/pillow-11.3.0-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:93efb0b4de7e340d99057415c749175e24c8864302369e05914682ba642e5d77", size = 6642699, upload-time = "2025-07-01T09:15:23.186Z" },
{ url = "https://files.pythonhosted.org/packages/6f/d2/a9a4f280c6aefedce1e8f615baaa5474e0701d86dd6f1dede66726462bbd/pillow-11.3.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7966e38dcd0fa11ca390aed7c6f20454443581d758242023cf36fcb319b1a874", size = 6083789, upload-time = "2025-07-01T09:15:25.1Z" },
{ url = "https://files.pythonhosted.org/packages/fe/54/86b0cd9dbb683a9d5e960b66c7379e821a19be4ac5810e2e5a715c09a0c0/pillow-11.3.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:98a9afa7b9007c67ed84c57c9e0ad86a6000da96eaa638e4f8abe5b65ff83f0a", size = 6720386, upload-time = "2025-07-01T09:15:27.378Z" },
{ url = "https://files.pythonhosted.org/packages/e7/95/88efcaf384c3588e24259c4203b909cbe3e3c2d887af9e938c2022c9dd48/pillow-11.3.0-cp314-cp314-win32.whl", hash = "sha256:02a723e6bf909e7cea0dac1b0e0310be9d7650cd66222a5f1c571455c0a45214", size = 6370911, upload-time = "2025-07-01T09:15:29.294Z" },
{ url = "https://files.pythonhosted.org/packages/2e/cc/934e5820850ec5eb107e7b1a72dd278140731c669f396110ebc326f2a503/pillow-11.3.0-cp314-cp314-win_amd64.whl", hash = "sha256:a418486160228f64dd9e9efcd132679b7a02a5f22c982c78b6fc7dab3fefb635", size = 7117383, upload-time = "2025-07-01T09:15:31.128Z" },
{ url = "https://files.pythonhosted.org/packages/d6/e9/9c0a616a71da2a5d163aa37405e8aced9a906d574b4a214bede134e731bc/pillow-11.3.0-cp314-cp314-win_arm64.whl", hash = "sha256:155658efb5e044669c08896c0c44231c5e9abcaadbc5cd3648df2f7c0b96b9a6", size = 2511385, upload-time = "2025-07-01T09:15:33.328Z" },
{ url = "https://files.pythonhosted.org/packages/1a/33/c88376898aff369658b225262cd4f2659b13e8178e7534df9e6e1fa289f6/pillow-11.3.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:59a03cdf019efbfeeed910bf79c7c93255c3d54bc45898ac2a4140071b02b4ae", size = 5281129, upload-time = "2025-07-01T09:15:35.194Z" },
{ url = "https://files.pythonhosted.org/packages/1f/70/d376247fb36f1844b42910911c83a02d5544ebd2a8bad9efcc0f707ea774/pillow-11.3.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:f8a5827f84d973d8636e9dc5764af4f0cf2318d26744b3d902931701b0d46653", size = 4689580, upload-time = "2025-07-01T09:15:37.114Z" },
{ url = "https://files.pythonhosted.org/packages/eb/1c/537e930496149fbac69efd2fc4329035bbe2e5475b4165439e3be9cb183b/pillow-11.3.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ee92f2fd10f4adc4b43d07ec5e779932b4eb3dbfbc34790ada5a6669bc095aa6", size = 5902860, upload-time = "2025-07-03T13:10:50.248Z" },
{ url = "https://files.pythonhosted.org/packages/bd/57/80f53264954dcefeebcf9dae6e3eb1daea1b488f0be8b8fef12f79a3eb10/pillow-11.3.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c96d333dcf42d01f47b37e0979b6bd73ec91eae18614864622d9b87bbd5bbf36", size = 7670694, upload-time = "2025-07-03T13:10:56.432Z" },
{ url = "https://files.pythonhosted.org/packages/70/ff/4727d3b71a8578b4587d9c276e90efad2d6fe0335fd76742a6da08132e8c/pillow-11.3.0-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4c96f993ab8c98460cd0c001447bff6194403e8b1d7e149ade5f00594918128b", size = 6005888, upload-time = "2025-07-01T09:15:39.436Z" },
{ url = "https://files.pythonhosted.org/packages/05/ae/716592277934f85d3be51d7256f3636672d7b1abfafdc42cf3f8cbd4b4c8/pillow-11.3.0-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:41342b64afeba938edb034d122b2dda5db2139b9a4af999729ba8818e0056477", size = 6670330, upload-time = "2025-07-01T09:15:41.269Z" },
{ url = "https://files.pythonhosted.org/packages/e7/bb/7fe6cddcc8827b01b1a9766f5fdeb7418680744f9082035bdbabecf1d57f/pillow-11.3.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:068d9c39a2d1b358eb9f245ce7ab1b5c3246c7c8c7d9ba58cfa5b43146c06e50", size = 6114089, upload-time = "2025-07-01T09:15:43.13Z" },
{ url = "https://files.pythonhosted.org/packages/8b/f5/06bfaa444c8e80f1a8e4bff98da9c83b37b5be3b1deaa43d27a0db37ef84/pillow-11.3.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:a1bc6ba083b145187f648b667e05a2534ecc4b9f2784c2cbe3089e44868f2b9b", size = 6748206, upload-time = "2025-07-01T09:15:44.937Z" },
{ url = "https://files.pythonhosted.org/packages/f0/77/bc6f92a3e8e6e46c0ca78abfffec0037845800ea38c73483760362804c41/pillow-11.3.0-cp314-cp314t-win32.whl", hash = "sha256:118ca10c0d60b06d006be10a501fd6bbdfef559251ed31b794668ed569c87e12", size = 6377370, upload-time = "2025-07-01T09:15:46.673Z" },
{ url = "https://files.pythonhosted.org/packages/4a/82/3a721f7d69dca802befb8af08b7c79ebcab461007ce1c18bd91a5d5896f9/pillow-11.3.0-cp314-cp314t-win_amd64.whl", hash = "sha256:8924748b688aa210d79883357d102cd64690e56b923a186f35a82cbc10f997db", size = 7121500, upload-time = "2025-07-01T09:15:48.512Z" },
{ url = "https://files.pythonhosted.org/packages/89/c7/5572fa4a3f45740eaab6ae86fcdf7195b55beac1371ac8c619d880cfe948/pillow-11.3.0-cp314-cp314t-win_arm64.whl", hash = "sha256:79ea0d14d3ebad43ec77ad5272e6ff9bba5b679ef73375ea760261207fa8e0aa", size = 2512835, upload-time = "2025-07-01T09:15:50.399Z" },
]
[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]
[[package]]
name = "pygments"
version = "2.19.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, upload-time = "2025-06-21T13:39:07.939Z" },
]
[[package]]
name = "pyparsing"
version = "3.2.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/bb/22/f1129e69d94ffff626bdb5c835506b3a5b4f3d070f17ea295e12c2c6f60f/pyparsing-3.2.3.tar.gz", hash = "sha256:b9c13f1ab8b3b542f72e28f634bad4de758ab3ce4546e4301970ad6fa77c38be", size = 1088608, upload-time = "2025-03-25T05:01:28.114Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/05/e7/df2285f3d08fee213f2d041540fa4fc9ca6c2d44cf36d3a035bf2a8d2bcc/pyparsing-3.2.3-py3-none-any.whl", hash = "sha256:a749938e02d6fd0b59b356ca504a24982314bb090c383e3cf201c95ef7e2bfcf", size = 111120, upload-time = "2025-03-25T05:01:24.908Z" },
]
[[package]]
name = "pyqt5"
version = "5.15.11"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pyqt5-qt5" },
{ name = "pyqt5-sip" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0e/07/c9ed0bd428df6f87183fca565a79fee19fa7c88c7f00a7f011ab4379e77a/PyQt5-5.15.11.tar.gz", hash = "sha256:fda45743ebb4a27b4b1a51c6d8ef455c4c1b5d610c90d2934c7802b5c1557c52", size = 3216775, upload-time = "2024-07-19T08:39:57.756Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/11/64/42ec1b0bd72d87f87bde6ceb6869f444d91a2d601f2e67cd05febc0346a1/PyQt5-5.15.11-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:c8b03dd9380bb13c804f0bdb0f4956067f281785b5e12303d529f0462f9afdc2", size = 6579776, upload-time = "2024-07-19T08:39:19.775Z" },
{ url = "https://files.pythonhosted.org/packages/49/f5/3fb696f4683ea45d68b7e77302eff173493ac81e43d63adb60fa760b9f91/PyQt5-5.15.11-cp38-abi3-macosx_11_0_x86_64.whl", hash = "sha256:6cd75628f6e732b1ffcfe709ab833a0716c0445d7aec8046a48d5843352becb6", size = 7016415, upload-time = "2024-07-19T08:39:32.977Z" },
{ url = "https://files.pythonhosted.org/packages/b4/8c/4065950f9d013c4b2e588fe33cf04e564c2322842d84dbcbce5ba1dc28b0/PyQt5-5.15.11-cp38-abi3-manylinux_2_17_x86_64.whl", hash = "sha256:cd672a6738d1ae33ef7d9efa8e6cb0a1525ecf53ec86da80a9e1b6ec38c8d0f1", size = 8188103, upload-time = "2024-07-19T08:39:40.561Z" },
{ url = "https://files.pythonhosted.org/packages/f3/f0/ae5a5b4f9b826b29ea4be841b2f2d951bcf5ae1d802f3732b145b57c5355/PyQt5-5.15.11-cp38-abi3-win32.whl", hash = "sha256:76be0322ceda5deecd1708a8d628e698089a1cea80d1a49d242a6d579a40babd", size = 5433308, upload-time = "2024-07-19T08:39:46.932Z" },
{ url = "https://files.pythonhosted.org/packages/56/d5/68eb9f3d19ce65df01b6c7b7a577ad3bbc9ab3a5dd3491a4756e71838ec9/PyQt5-5.15.11-cp38-abi3-win_amd64.whl", hash = "sha256:bdde598a3bb95022131a5c9ea62e0a96bd6fb28932cc1619fd7ba211531b7517", size = 6865864, upload-time = "2024-07-19T08:39:53.572Z" },
]
[[package]]
name = "pyqt5-qt5"
version = "5.15.17"
source = { registry = "https://pypi.org/simple" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/d3/f9/accb06e76e23fb23053d48cc24fd78dec6ed14cb4d5cbadb0fd4a0c1b02e/PyQt5_Qt5-5.15.17-py3-none-macosx_10_13_x86_64.whl", hash = "sha256:d8b8094108e748b4bbd315737cfed81291d2d228de43278f0b8bd7d2b808d2b9", size = 39972275, upload-time = "2025-05-24T11:15:42.259Z" },
{ url = "https://files.pythonhosted.org/packages/87/1a/e1601ad6934cc489b8f1e967494f23958465cf1943712f054c5a306e9029/PyQt5_Qt5-5.15.17-py3-none-macosx_11_0_arm64.whl", hash = "sha256:b68628f9b8261156f91d2f72ebc8dfb28697c4b83549245d9a68195bd2d74f0c", size = 37135109, upload-time = "2025-05-24T11:15:59.786Z" },
{ url = "https://files.pythonhosted.org/packages/ac/e1/13d25a9ff2ac236a264b4603abaa39fa8bb9a7aa430519bb5f545c5b008d/PyQt5_Qt5-5.15.17-py3-none-manylinux2014_x86_64.whl", hash = "sha256:b018f75d1cc61146396fa5af14da1db77c5d6318030e5e366f09ffdf7bd358d8", size = 61112954, upload-time = "2025-05-24T11:16:26.036Z" },
]
[[package]]
name = "pyqt5-sip"
version = "12.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/01/79/086b50414bafa71df494398ad277d72e58229a3d1c1b1c766d12b14c2e6d/pyqt5_sip-12.17.0.tar.gz", hash = "sha256:682dadcdbd2239af9fdc0c0628e2776b820e128bec88b49b8d692fe682f90b4f", size = 104042, upload-time = "2025-02-02T17:13:11.268Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a3/e6/e51367c28d69b5a462f38987f6024e766fd8205f121fe2f4d8ba2a6886b9/PyQt5_sip-12.17.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:ea08341c8a5da00c81df0d689ecd4ee47a95e1ecad9e362581c92513f2068005", size = 124650, upload-time = "2025-02-02T17:12:50.595Z" },
{ url = "https://files.pythonhosted.org/packages/64/3b/e6d1f772b41d8445d6faf86cc9da65910484ebd9f7df83abc5d4955437d0/PyQt5_sip-12.17.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:4a92478d6808040fbe614bb61500fbb3f19f72714b99369ec28d26a7e3494115", size = 281893, upload-time = "2025-02-02T17:12:51.966Z" },
{ url = "https://files.pythonhosted.org/packages/ed/c5/d17fc2ddb9156a593710c88afd98abcf4055a2224b772f8bec2c6eea879c/PyQt5_sip-12.17.0-cp312-cp312-win32.whl", hash = "sha256:b0ff280b28813e9bfd3a4de99490739fc29b776dc48f1c849caca7239a10fc8b", size = 49438, upload-time = "2025-02-02T17:12:54.426Z" },
{ url = "https://files.pythonhosted.org/packages/fe/c5/1174988d52c732d07033cf9a5067142b01d76be7731c6394a64d5c3ef65c/PyQt5_sip-12.17.0-cp312-cp312-win_amd64.whl", hash = "sha256:54c31de7706d8a9a8c0fc3ea2c70468aba54b027d4974803f8eace9c22aad41c", size = 58017, upload-time = "2025-02-02T17:12:56.31Z" },
{ url = "https://files.pythonhosted.org/packages/fd/5d/f234e505af1a85189310521447ebc6052ebb697efded850d0f2b2555f7aa/PyQt5_sip-12.17.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c7a7ff355e369616b6bcb41d45b742327c104b2bf1674ec79b8d67f8f2fa9543", size = 124580, upload-time = "2025-02-02T17:12:58.158Z" },
{ url = "https://files.pythonhosted.org/packages/cd/cb/3b2050e9644d0021bdf25ddf7e4c3526e1edd0198879e76ba308e5d44faf/PyQt5_sip-12.17.0-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.whl", hash = "sha256:419b9027e92b0b707632c370cfc6dc1f3b43c6313242fc4db57a537029bd179c", size = 281563, upload-time = "2025-02-02T17:12:59.421Z" },
{ url = "https://files.pythonhosted.org/packages/51/61/b8ebde7e0b32d0de44c521a0ace31439885b0423d7d45d010a2f7d92808c/PyQt5_sip-12.17.0-cp313-cp313-win32.whl", hash = "sha256:351beab964a19f5671b2a3e816ecf4d3543a99a7e0650f88a947fea251a7589f", size = 49383, upload-time = "2025-02-02T17:13:00.597Z" },
{ url = "https://files.pythonhosted.org/packages/15/ed/ff94d6b2910e7627380cb1fc9a518ff966e6d78285c8e54c9422b68305db/PyQt5_sip-12.17.0-cp313-cp313-win_amd64.whl", hash = "sha256:672c209d05661fab8e17607c193bf43991d268a1eefbc2c4551fbf30fd8bb2ca", size = 58022, upload-time = "2025-02-02T17:13:01.738Z" },
]
[[package]]
name = "pytest"
version = "8.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/08/ba/45911d754e8eba3d5a841a5ce61a65a685ff1798421ac054f85aa8747dfb/pytest-8.4.1.tar.gz", hash = "sha256:7c67fd69174877359ed9371ec3af8a3d2b04741818c51e5e99cc1742251fa93c", size = 1517714, upload-time = "2025-06-18T05:48:06.109Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/29/16/c8a903f4c4dffe7a12843191437d7cd8e32751d5de349d45d3fe69544e87/pytest-8.4.1-py3-none-any.whl", hash = "sha256:539c70ba6fcead8e78eebbf1115e8b589e7565830d7d006a8723f19ac8a0afb7", size = 365474, upload-time = "2025-06-18T05:48:03.955Z" },
]
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "six" },
]
sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" },
]
[[package]]
name = "rich"
version = "14.1.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markdown-it-py" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/fe/75/af448d8e52bf1d8fa6a9d089ca6c07ff4453d86c65c145d0a300bb073b9b/rich-14.1.0.tar.gz", hash = "sha256:e497a48b844b0320d45007cdebfeaeed8db2a4f4bcf49f15e455cfc4af11eaa8", size = 224441, upload-time = "2025-07-25T07:32:58.125Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e3/30/3c4d035596d3cf444529e0b2953ad0466f6049528a879d27534700580395/rich-14.1.0-py3-none-any.whl", hash = "sha256:536f5f1785986d6dbdea3c75205c473f970777b4a0d6c6dd1b696aa05a3fa04f", size = 243368, upload-time = "2025-07-25T07:32:56.73Z" },
]
[[package]]
name = "shellingham"
version = "1.5.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" },
]
[[package]]
name = "six"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]
[[package]]
name = "typer"
version = "0.16.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "rich" },
{ name = "shellingham" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/43/78/d90f616bf5f88f8710ad067c1f8705bf7618059836ca084e5bb2a0855d75/typer-0.16.1.tar.gz", hash = "sha256:d358c65a464a7a90f338e3bb7ff0c74ac081449e53884b12ba658cbd72990614", size = 102836, upload-time = "2025-08-18T19:18:22.898Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/2d/76/06dbe78f39b2203d2a47d5facc5df5102d0561e2807396471b5f7c5a30a1/typer-0.16.1-py3-none-any.whl", hash = "sha256:90ee01cb02d9b8395ae21ee3368421faf21fa138cb2a541ed369c08cec5237c9", size = 46397, upload-time = "2025-08-18T19:18:21.663Z" },
]
[[package]]
name = "typing-extensions"
version = "4.14.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/98/5a/da40306b885cc8c09109dc2e1abd358d5684b1425678151cdaed4731c822/typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36", size = 107673, upload-time = "2025-07-04T13:28:34.16Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b5/00/d631e67a838026495268c2f6884f3711a15a9a2a96cd244fdaea53b823fb/typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76", size = 43906, upload-time = "2025-07-04T13:28:32.743Z" },
]

256
visualizer.py Normal file
View File

@ -0,0 +1,256 @@
# Set Qt5Agg as the default backend before importing pyplot
import os
import matplotlib
matplotlib.use('Qt5Agg')
import logging
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
from matplotlib.patches import Rectangle
from datetime import datetime, timezone
from collections import deque
from typing import Deque, Optional
from pathlib import Path
from storage import Book, BookSnapshot
from models import Metric
from repositories.sqlite_metrics_repository import SQLiteMetricsRepository
class Visualizer:
"""Render OHLC candles, volume, OBI and CVD charts from order book data.
Aggregates mid-prices into OHLC bars and displays OBI/CVD metrics beneath.
Uses Qt5Agg backend for interactive charts.
Public methods:
- update_from_book: process all snapshots from a Book and display charts
- set_db_path: set database path for loading stored metrics
- flush: finalize and draw the last in-progress bar
- show: display the Matplotlib window using Qt5Agg
"""
def __init__(self, window_seconds: int = 60, max_bars: int = 200) -> None:
# Create subplots: OHLC on top, Volume below, OBI and CVD at bottom
self.fig, (self.ax_ohlc, self.ax_volume, self.ax_obi, self.ax_cvd) = plt.subplots(4, 1, figsize=(12, 10), sharex=True)
self.window_seconds = int(max(1, window_seconds))
self.max_bars = int(max(1, max_bars))
self._db_path: Optional[Path] = None
# Bars buffer: list of tuples (start_ts, open, high, low, close)
self._bars: Deque[tuple[int, float, float, float, float, float]] = deque(maxlen=self.max_bars)
# Current in-progress bucket state
self._current_bucket_ts: Optional[int] = None
self._open: Optional[float] = None
self._high: Optional[float] = None
self._low: Optional[float] = None
self._close: Optional[float] = None
self._volume: float = 0.0
def _bucket_start(self, ts: int) -> int:
return int(ts) - (int(ts) % self.window_seconds)
def _normalize_ts_seconds(self, ts: int) -> int:
"""Return epoch seconds from possibly ms/us timestamps.
Heuristic based on magnitude:
- >1e14: microseconds divide by 1e6
- >1e11: milliseconds divide by 1e3
- else: seconds
"""
its = int(ts)
if its > 100_000_000_000_000: # > 1e14 → microseconds
return its // 1_000_000
if its > 100_000_000_000: # > 1e11 → milliseconds
return its // 1_000
return its
def set_db_path(self, db_path: Path) -> None:
"""Set the database path for loading stored metrics."""
self._db_path = db_path
def _load_stored_metrics(self, start_timestamp: int, end_timestamp: int) -> list[Metric]:
"""Load stored metrics from database for the given time range."""
if not self._db_path:
return []
try:
metrics_repo = SQLiteMetricsRepository(self._db_path)
with metrics_repo.connect() as conn:
return metrics_repo.load_metrics_by_timerange(conn, start_timestamp, end_timestamp)
except Exception as e:
logging.error(f"Error loading metrics for visualization: {e}")
return []
def _append_current_bar(self) -> None:
if self._current_bucket_ts is None or self._open is None:
return
self._bars.append(
(
self._current_bucket_ts,
float(self._open),
float(self._high if self._high is not None else self._open),
float(self._low if self._low is not None else self._open),
float(self._close if self._close is not None else self._open),
float(self._volume),
)
)
def _draw(self) -> None:
# Clear all subplots
self.ax_ohlc.clear()
self.ax_volume.clear()
self.ax_obi.clear()
self.ax_cvd.clear()
if not self._bars:
self.fig.canvas.draw_idle()
return
day_seconds = 24 * 60 * 60
width = self.window_seconds / day_seconds
# Draw OHLC candlesticks and extract volume data
volume_data = []
timestamps_ohlc = []
for start_ts, open_, high_, low_, close_, volume in self._bars:
# Collect volume data
dt = datetime.fromtimestamp(start_ts, tz=timezone.utc).replace(tzinfo=None)
x = mdates.date2num(dt)
volume_data.append((x, volume))
timestamps_ohlc.append(x)
# Wick
self.ax_ohlc.vlines(x + width / 2.0, low_, high_, color="black", linewidth=1.0)
# Body
lower = min(open_, close_)
height = max(1e-12, abs(close_ - open_))
color = "green" if close_ >= open_ else "red"
rect = Rectangle((x, lower), width, height, facecolor=color, edgecolor=color, linewidth=1.0)
self.ax_ohlc.add_patch(rect)
# Plot volume bars
if volume_data:
volumes_x = [v[0] for v in volume_data]
volumes_y = [v[1] for v in volume_data]
self.ax_volume.bar(volumes_x, volumes_y, width=width, alpha=0.7, color='blue', align='center')
# Draw metrics if available
if self._bars:
first_ts = self._bars[0][0]
last_ts = self._bars[-1][0]
metrics = self._load_stored_metrics(first_ts, last_ts + self.window_seconds)
if metrics:
# Prepare data for plotting
timestamps = [mdates.date2num(datetime.fromtimestamp(m.timestamp / 1000, tz=timezone.utc).replace(tzinfo=None)) for m in metrics]
obi_values = [m.obi for m in metrics]
cvd_values = [m.cvd for m in metrics]
# Plot OBI and CVD
self.ax_obi.plot(timestamps, obi_values, 'b-', linewidth=1, label='OBI')
self.ax_obi.axhline(y=0, color='gray', linestyle='--', alpha=0.5)
self.ax_cvd.plot(timestamps, cvd_values, 'r-', linewidth=1, label='CVD')
# Configure axes
self.ax_ohlc.set_title("Mid-price OHLC")
self.ax_ohlc.set_ylabel("Price")
self.ax_volume.set_title("Volume")
self.ax_volume.set_ylabel("Volume")
self.ax_obi.set_title("Order Book Imbalance (OBI)")
self.ax_obi.set_ylabel("OBI")
self.ax_obi.set_ylim(-1.1, 1.1)
self.ax_cvd.set_title("Cumulative Volume Delta (CVD)")
self.ax_cvd.set_ylabel("CVD")
self.ax_cvd.set_xlabel("Time (UTC)")
# Format time axis for bottom subplot only
self.ax_cvd.xaxis_date()
self.ax_cvd.xaxis.set_major_formatter(mdates.DateFormatter("%H:%M:%S"))
self.fig.tight_layout()
self.fig.canvas.draw_idle()
def update_from_book(self, book: Book) -> None:
"""Update the visualizer with all snapshots from the book.
Uses best bid/ask to compute mid-price; aggregates into OHLC bars.
Processes all snapshots in chronological order.
"""
if not book.snapshots:
logging.warning("Book has no snapshots to visualize")
return
# Reset state before processing all snapshots
self._bars.clear()
self._current_bucket_ts = None
self._open = self._high = self._low = self._close = None
self._volume = 0.0
logging.info(f"Visualizing {len(book.snapshots)} snapshots")
# Process all snapshots in chronological order
snapshot_count = 0
for snapshot in sorted(book.snapshots, key=lambda s: s.timestamp):
snapshot_count += 1
if not snapshot.bids or not snapshot.asks:
continue
try:
best_bid = max(snapshot.bids.keys())
best_ask = min(snapshot.asks.keys())
except (ValueError, TypeError):
continue
mid = (float(best_bid) + float(best_ask)) / 2.0
ts_raw = int(snapshot.timestamp)
ts = self._normalize_ts_seconds(ts_raw)
bucket_ts = self._bucket_start(ts)
# Calculate volume from trades in this snapshot
snapshot_volume = sum(trade.size for trade in snapshot.trades)
# New bucket: close and store previous bar
if self._current_bucket_ts is None:
self._current_bucket_ts = bucket_ts
self._open = self._high = self._low = self._close = mid
self._volume = snapshot_volume
elif bucket_ts != self._current_bucket_ts:
self._append_current_bar()
self._current_bucket_ts = bucket_ts
self._open = self._high = self._low = self._close = mid
self._volume = snapshot_volume
else:
# Update current bucket OHLC and accumulate volume
if self._high is None or mid > self._high:
self._high = mid
if self._low is None or mid < self._low:
self._low = mid
self._close = mid
self._volume += snapshot_volume
# Finalize the last bar
self._append_current_bar()
logging.info(f"Created {len(self._bars)} OHLC bars from {snapshot_count} valid snapshots")
# Draw all bars
self._draw()
def flush(self) -> None:
"""Finalize the in-progress bar and redraw."""
self._append_current_bar()
# Reset current state (optional: keep last bucket running)
self._current_bucket_ts = None
self._open = self._high = self._low = self._close = None
self._volume = 0.0
self._draw()
def show(self) -> None:
plt.show()

39
visualizer_test.py Normal file
View File

@ -0,0 +1,39 @@
"""Interactive demo for the Visualizer; run manually, not as a test."""
import random
from datetime import datetime
from visualizer import Visualizer
from storage import Book, BookSnapshot, OrderbookLevel
def demo_visualizer_creates_single_bar_on_flush() -> None:
vis = Visualizer(window_seconds=60, max_bars=10)
book = Book()
ts = datetime.now().timestamp()
snapshot = BookSnapshot(timestamp=int(ts))
for r in range(100):
snapshot.bids[100000 + random.random() * 100] = OrderbookLevel(
price=100000 + random.random() * 100,
size=1.0,
liquidation_count=0,
order_count=1,
)
snapshot.asks[100000 + random.random() * 100] = OrderbookLevel(
price=100000 + random.random() * 100,
size=1.0,
liquidation_count=0,
order_count=1,
)
book.add_snapshot(snapshot)
vis.update_from_book(book)
vis.flush()
vis.show()
if __name__ == "__main__":
demo_visualizer_creates_single_bar_on_flush()