Developer Onboarding Guide¶
Welcome to the Redhound development team. This guide will help you set up your development environment and understand the project workflow.
Prerequisites¶
Quick Setup Available
The quick-start script can automatically install required prerequisites. Manual installation is optional.
Required Prerequisites¶
| Tool | Description | Installation |
|---|---|---|
| make | Build automation tool (to run make quickstart) | Pre-installed on macOS/Linux; Windows users may need to install separately |
| Git | Version control and repository cloning | Auto-installed via Homebrew (macOS) or apt/dnf (Linux) if missing |
| Python 3.12.12 | Python runtime (exact version required) | Auto-installed via Homebrew (macOS) or apt/dnf (Linux) if missing |
| uv 0.9.13 | Package manager (version 0.9.13 or newer) | Auto-installed via official installer if missing |
The project pins the Python version in .python-version at the repository root; uv and pyenv use this when creating the virtual environment.
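As an illustration, tooling can read that pin directly from the file; a minimal sketch (the helper name is illustrative, not project code):

```python
from pathlib import Path

def pinned_python_version(repo_root: Path = Path(".")) -> str:
    """Read the pinned interpreter version from .python-version.

    Illustrative helper; uv and pyenv do this internally.
    """
    # The file contains a single line such as "3.12.12"
    return (repo_root / ".python-version").read_text(encoding="utf-8").strip()
```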
Optional Prerequisites¶
| Tool | Description | Minimum Version |
|---|---|---|
| Docker | Container runtime for local services | 20.10.0 (for BuildKit features) |
| Docker Compose | Service orchestration for PostgreSQL, Redis, Prometheus, Grafana | Included with Docker |
| Numba | JIT compiler for pandas-ta performance optimization | 0.60.0 (optional, install via uv pip install -e ".[performance]") |
| TA-Lib | Technical analysis library for candlestick pattern recognition | 0.6.4 (requires C library + Python wrapper) |
Installing TA-Lib¶
TA-Lib enables candlestick pattern recognition in technical analysis. It is an optional dependency that requires installing both the C library and the Python wrapper.
System Dependency
TA-Lib requires a C library to be installed on your system. If the library is missing, candlestick pattern recognition will be gracefully skipped, but we recommend installing it for full functionality.
macOS:
# Install C library via Homebrew
brew install ta-lib
# Install Python wrapper
uv pip install "TA-Lib>=0.6.4"
# or
uv sync --extra talib
Linux (Ubuntu/Debian):
# Install dependencies
sudo apt-get update
sudo apt-get install -y build-essential wget
# Download and install TA-Lib C library
wget http://prdownloads.sourceforge.net/ta-lib/ta-lib-0.4.0-src.tar.gz
tar -xzf ta-lib-0.4.0-src.tar.gz
cd ta-lib/
./configure --prefix=/usr
make
sudo make install
cd ..
rm -rf ta-lib ta-lib-0.4.0-src.tar.gz
# Install Python wrapper
uv pip install "TA-Lib>=0.6.4"
# or
uv sync --extra talib
Windows:
# Option 1: Use pre-built wheel (easiest)
# Download from: https://www.lfd.uci.edu/~gohlke/pythonlibs/#ta-lib
# Then install: uv pip install TA_Lib-0.6.4-cp312-cp312-win_amd64.whl
# Option 2: Build from source (requires Visual Studio)
# See: https://github.com/TA-Lib/ta-lib-python#windows
Verify Installation:
# Test TA-Lib import
python -c "import talib; print(talib.__version__)"
# Should print: 0.6.4 (or higher)
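The graceful skip described above is typically an optional-import guard; a minimal sketch (the flag and function names here are assumptions, not the project's actual symbols):

```python
# Optional-import guard: pattern recognition is skipped when TA-Lib is absent.
try:
    import talib  # C-backed technical analysis library
    TALIB_AVAILABLE = True
except ImportError:
    talib = None
    TALIB_AVAILABLE = False

def detect_patterns(open_, high, low, close):
    """Return candlestick patterns, or an empty result when TA-Lib is missing."""
    if not TALIB_AVAILABLE:
        return {}  # graceful skip, as described above
    return {"CDLDOJI": talib.CDLDOJI(open_, high, low, close)}
```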
Enable Candlestick Patterns:
After installing TA-Lib, enable pattern detection in your .env file:
# Enable candlestick pattern recognition
REDHOUND_TA_PATTERNS_ENABLED=true
# Optional: Customize patterns (defaults shown)
REDHOUND_TA_PATTERNS_LIST=CDLDOJI,CDLHAMMER,CDLENGULFING,CDLHARAMI
# Optional: Minimum pattern strength (-100 to 100, default: 0)
REDHOUND_TA_PATTERN_MIN_STRENGTH=0
# Optional: Lookback days for pattern reporting (default: 5)
REDHOUND_TA_PATTERN_LOOKBACK=5
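Settings like these are typically read from the environment with defaults matching the values shown; a minimal sketch (the helper name is an assumption, not the project's real settings loader):

```python
import os

def load_pattern_settings(env=os.environ):
    """Parse candlestick-pattern settings from environment variables.

    Hypothetical helper; the real loader lives in backend/config.
    """
    default_patterns = "CDLDOJI,CDLHAMMER,CDLENGULFING,CDLHARAMI"
    return {
        "enabled": env.get("REDHOUND_TA_PATTERNS_ENABLED", "false").lower() == "true",
        "patterns": [p.strip() for p in env.get("REDHOUND_TA_PATTERNS_LIST", default_patterns).split(",") if p.strip()],
        "min_strength": int(env.get("REDHOUND_TA_PATTERN_MIN_STRENGTH", "0")),
        "lookback": int(env.get("REDHOUND_TA_PATTERN_LOOKBACK", "5")),
    }
```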
See Technical Indicators documentation for more details on candlestick patterns.
Initial Setup¶
1. Clone the Repository¶
# Clone the repository
git clone https://github.com/redhound-labs/redhound.git
cd redhound
# Checkout the dev branch (main development branch)
git checkout dev
2. Quick-Start Script (Recommended)¶
Recommended Approach
The quick-start script automatically installs prerequisites and sets up the development environment in one command.
Run the quick-start script:
make quickstart
What the script does:
- Prerequisites Check - Verifies Python, uv, Git, Docker, Docker Compose
- Version Detection - Automatically reads required versions from Dockerfile and CI/CD config
- Version Verification - Checks installed versions match project requirements exactly
- Interactive Updates - Prompts to update Python, uv, or Docker if versions don't match requirements
- Auto-Installation - Installs missing required prerequisites (Python, uv, Git) with your permission
- Optional Installation - Offers to install Docker and Docker Compose (optional but recommended)
- Environment Setup - Creates virtual environment using exact Python version
- Dependencies - Installs base and development dependencies
- Configuration - Creates .env file from template
- Git Hooks - Installs pre-commit hooks
- Validation - Validates the complete setup
- Interactive Setup - Optionally configures API keys and mock mode interactively
Version Requirements
The script automatically reads required versions from the project's Dockerfile and CI/CD configuration, ensuring it always uses the correct versions even when they're updated.
Version Matching:
- Python: Requires an exact version match (e.g., 3.12.12). The script will prompt to update if a different version is installed.
- uv: Requires version 0.9.13 or newer; newer versions are accepted as they're backward compatible.
- Docker: Requires minimum version 20.10.0 (for BuildKit features). The script will prompt to update if an older version is installed.
If installed versions don't match requirements, the script will interactively prompt you to update them.
Platform Support:
| Platform | Installation Method |
|---|---|
| macOS | Homebrew |
| Linux | apt/dnf (requires sudo) |
| Windows | Manual installation instructions provided |
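The three matching rules above reduce to two kinds of version comparison, exact and minimum; a minimal sketch of that logic (function names are illustrative, not the script's actual implementation):

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Split a dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def python_ok(installed: str, required: str = "3.12.12") -> bool:
    # Python must match the pinned version exactly
    return parse_version(installed) == parse_version(required)

def uv_ok(installed: str, required: str = "0.9.13") -> bool:
    # uv accepts the pinned version or anything newer
    return parse_version(installed) >= parse_version(required)

def docker_ok(installed: str, minimum: str = "20.10.0") -> bool:
    # Docker only needs to meet a minimum version
    return parse_version(installed) >= parse_version(minimum)
```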
3. Manual Setup (Alternative)¶
Manual Setup
If you prefer to set up the environment manually, follow these steps:
Step-by-step manual setup:
# Ensure you have the exact Python version (3.12.12)
python3.12 --version # Should show Python 3.12.12
# Create virtual environment
uv venv --python python3.12
# Activate virtual environment
source .venv/bin/activate # macOS/Linux
# or
.venv\Scripts\activate # Windows
# Install dependencies
uv sync --locked --extra dev
# Create .env file
cp .env.example .env
# Install pre-commit hooks (runs on commit and on push)
uv run pre-commit install
uv run pre-commit install --hook-type pre-push
# Validate setup
make validate-setup
Pre-push runs the same quality checks as CI (lint, format, type-check, security, detect-secrets, and file-quality hooks). Push is blocked until they pass, so CI does not fail on those checks.
When using make install-dev or make setup, a post-sync patch is applied to the installed pandas_ta package to remove deprecated pd.options.mode.copy_on_write usage. This avoids the Pandas4Warning and keeps the project compatible with future pandas 4.x. To re-apply the patch after a manual uv sync, run make patch-deps.
Version Requirements
When setting up manually, ensure you have:
- Python 3.12.12 (exact version)
- uv 0.9.13 (or newer)
- Docker 20.10.0+ (if using Docker services)
4. Configure Environment Variables¶
Interactive Configuration
The quick-start script can configure API keys interactively. Alternatively, edit .env manually.
Edit .env file:
# Required for real LLM usage
OPENAI_API_KEY=your_openai_key_here
# Optional but recommended
ALPHA_VANTAGE_API_KEY=your_alpha_vantage_key_here
ANTHROPIC_API_KEY=your_anthropic_key_here
GOOGLE_API_KEY=your_google_key_here
# Development (enable mock mode to avoid API costs)
REDHOUND_MOCK_MODE=true
Security Warning
Never commit API keys to version control. The .env file is gitignored.
5. Verify Installation¶
Verify your setup is working correctly:
# Activate virtual environment and verify CLI
source .venv/bin/activate && redhound --help
# Validate development environment
make validate-setup
# Run tests in mock mode
REDHOUND_MOCK_MODE=true make test
Development Workflow¶
Branch Strategy¶
The project follows a Git Flow-inspired branching strategy:
| Branch Type | Purpose | Protection |
|---|---|---|
| main | Production-ready code | Protected |
| dev | Main development branch (default) | Default branch |
| feature/* | Feature development | - |
| bugfix/* | Bug fixes | - |
| hotfix/* | Urgent production fixes | - |
Creating a Feature Branch¶
# Ensure dev branch is up to date
git checkout dev
git pull origin dev
# Create feature branch
git checkout -b feature/your-feature-name
# Make changes and commit
git add .
git commit -m "feat: add new feature"
# Push to remote
git push origin feature/your-feature-name
Commit Message Convention¶
Follow the Conventional Commits specification:
Commit Types:
| Type | Description |
|---|---|
| feat | New feature |
| fix | Bug fix |
| docs | Documentation changes |
| style | Code style changes (formatting, no logic change) |
| refactor | Code refactoring |
| test | Adding or updating tests |
| chore | Maintenance tasks (dependencies, build, etc.) |
| perf | Performance improvements |
| ci | CI/CD changes |
Examples:
# Feature
git commit -m "feat(agents): add custom analyst type"
# Bug fix
git commit -m "fix(data): handle missing API key gracefully"
# Documentation
git commit -m "docs: update architecture diagram"
# Chore
git commit -m "chore(deps): update langchain to 1.1.3"
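Messages in this convention can be validated with a small regex covering the types in the table above (a sketch; whether the project enforces this with a hook is an assumption):

```python
import re

# Conventional Commits: type, optional (scope), colon, description
COMMIT_RE = re.compile(
    r"^(feat|fix|docs|style|refactor|test|chore|perf|ci)"  # type
    r"(\([a-z0-9_-]+\))?"                                  # optional scope
    r": .+"                                                # description
)

def is_conventional(message: str) -> bool:
    """Check whether the first line of a commit message follows the convention."""
    first = message.splitlines()[0] if message else ""
    return bool(COMMIT_RE.match(first))
```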
Pre-commit Hooks¶
Automatic Code Quality
Pre-commit hooks run automatically before each commit to enforce code quality.
Tools included:
| Tool | Purpose |
|---|---|
| ruff | Linting and formatting |
| pyright | Type checking |
| bandit | Security scanning |
| detect-secrets | Secret detection |
| pip-audit | Dependency vulnerability scanning |
Commands:
# Install hooks (done automatically by make quickstart)
uv run pre-commit install
# Run hooks manually on all files
uv run pre-commit run --all-files
# Skip hooks (use sparingly, only when necessary)
git commit --no-verify -m "commit message"
Running Tests¶
Mock Mode Recommended
Use mock mode for fast, cost-free testing during development.
Test commands:
# Run all tests in mock mode (fast, no API costs)
REDHOUND_MOCK_MODE=true make test
# Run specific test file
REDHOUND_MOCK_MODE=true make test TEST_ARGS='-k test_technical_analyst'
# Run tests with coverage
REDHOUND_MOCK_MODE=true make test-coverage
# View coverage report
open htmlcov/index.html # macOS
xdg-open htmlcov/index.html # Linux
Code Quality Checks¶
The project provides Makefile targets for all common development tasks. Use make help to see all available commands.
Using Makefile Commands¶
# View all available commands
make help
# Run all code quality checks
make quality
# Run individual checks
make lint # Run ruff linter (check only)
make format # Auto-format code with ruff
make format-check # Check formatting without fixing
make type-check # Run pyright type checker
make security # Run security scanners (bandit, pip-audit)
Manual Commands (Alternative)¶
If you prefer running tools directly:
# Run linter
uv run ruff check .
# Auto-fix linting issues
uv run ruff check --fix .
# Run formatter
uv run ruff format .
# Run type checker
npx pyright backend/ cli/
# Run security scanner
uv run bandit -r backend/ cli/ -ll
# Run all checks (as CI does)
uv run pre-commit run --all-files
Makefile Commands Reference¶
The project includes a comprehensive Makefile with targets for all common development tasks. All commands use uv for dependency management and respect the virtual environment.
Development Setup¶
make quickstart # Interactive quick-start script (installs prerequisites, sets up environment, configures API keys)
make validate-setup # Validate development environment setup
make validate-app # Run comprehensive application validation (Docker, environment, imports, etc.)
make clean-venv # Remove virtual environment
Database¶
make init-db # Create database tables (agent_memory, stock_profile, etc.). Run after first Postgres start or when using agent memory/CLI.
Code Quality¶
make lint # Run ruff linter (check only, no fixes)
make format # Auto-format code with ruff
make format-check # Check code formatting without fixing
make type-check # Run pyright type checker
make security # Run security scanners (bandit, pip-audit)
make quality # Run all code quality checks (lint, format-check, type-check, security)
make pre-commit-run # Run pre-commit hooks on all files
Testing¶
make test # Run pytest (set TEST_ARGS='-k pattern' to filter)
make test TEST_ARGS='-k test_technical_analyst' # Run specific tests
make test-coverage # Run pytest with coverage reports (HTML and XML)
Docker Commands¶
make up-build # Build images and start all services (first run or after code changes)
make up # Start all services (image must already be built)
make down # Stop all services
make docker-ps # List running Docker containers
make logs # View logs from all services (follow mode)
make docker-build # Build Docker image
make docker-rebuild # Rebuild Docker image without cache
make reset CONFIRM=true # Stop all services and remove volumes (WARNING: deletes data)
Documentation¶
# Build documentation
cd docs/mkdocs && uv run mkdocs build
# Serve documentation locally (http://127.0.0.1:8002)
cd docs/mkdocs && uv run mkdocs serve
# Clean build artifacts
rm -rf docs/mkdocs/site/
Dependency Management¶
make lock-sync # Update uv.lock to sync with pyproject.toml
make check-lock-sync # Check if uv.lock is in sync with pyproject.toml
Package/Build¶
Cleanup¶
make clean-cache # Clean Python cache files (__pycache__, .pyc)
make clean-coverage # Clean coverage reports and data
make clean # Clean all build artifacts (cache, coverage)
CI Validation¶
make ci-check # Run all CI checks locally
Issue Synchronization¶
make sync-issues-from-github # Sync issues from GitHub to markdown file
make sync-issues-check # Check if markdown file is in sync with GitHub
Dependency Management¶
The project uses uv for dependency management with a lockfile (uv.lock) for reproducible builds.
Adding Dependencies¶
# Add a new dependency
# 1. Edit pyproject.toml and add the package with version
# 2. Update lockfile
make lock-sync
# 3. Commit both files
git add pyproject.toml uv.lock
git commit -m "chore(deps): add new-package==1.0.0"
Updating Dependencies¶
# Update all dependencies
uv lock --upgrade
# Update specific package
uv lock --upgrade-package langchain
# Verify lockfile is in sync
make check-lock-sync
# Commit updated lockfile
git add uv.lock
git commit -m "chore(deps): update dependencies"
Lockfile Requirement
Always commit uv.lock with dependency changes. The lockfile ensures reproducible builds across environments.
Managing Dependabot PRs¶
Zero Manual Effort
Dependabot automatically creates PRs for dependency updates every Sunday. The system uses conditional direct merging for zero manual effort.
Fully Automatic Process:
- Dependabot creates PRs weekly (Sunday 09:00 UTC)
- CI automatically runs and updates uv.lock if needed
- Patch/minor updates get auto-approved instantly
- PRs merge automatically once CI passes (checked by scheduled workflow at 12:00 UTC daily)
- Major updates get labeled requires-review for manual review
If many open Dependabot PRs were never auto-merged: run Actions → Auto-merge Dependabot → Run workflow (leave PR number empty), or wait for the daily run at 12:00 UTC. The workflow will check CI status and merge PRs that pass all checks.
Private Repo Workaround
GitHub's native auto-merge feature (--auto flag) is not available for private repos without Pro. This workflow uses conditional direct merging as a workaround: it checks CI status via scheduled job and directly merges PRs that pass all checks.
What You Need to Do:
- Nothing for ~90% of updates (patch/minor automatically merge)
- Review PRs with the requires-review label (major updates, ~1-2/month)
- View on GitHub: https://github.com/redhound-labs/redhound/pulls?q=is:pr+is:open+label:requires-review
- Click the "Merge pull request" button after reviewing
That's it. No scripts, no commands, no manual tracking needed.
Mock Mode Development¶
Cost-Free Development
Use mock mode to develop without API costs.
Enable mock mode:
# Enable mock mode globally
export REDHOUND_MOCK_MODE=true
# Run CLI in mock mode
redhound
# Run tests in mock mode
REDHOUND_MOCK_MODE=true make test
See Mock Mode documentation for complete details.
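Mock mode usually works by swapping what the LLM factory returns; a minimal sketch of that switch, using hypothetical class and function names (the real implementations live in backend/utils/mock_llm.py and backend/data/llm_factory.py):

```python
import os

class MockLLM:
    """Deterministic stand-in for a real chat model (sketch, not the project's class)."""
    def invoke(self, prompt: str) -> str:
        return f"[mock response to: {prompt[:20]}]"

def create_llm():
    """Return a mock or real LLM depending on REDHOUND_MOCK_MODE (illustrative factory)."""
    if os.environ.get("REDHOUND_MOCK_MODE", "").lower() == "true":
        return MockLLM()
    raise RuntimeError("real LLM creation requires API keys (see .env)")
```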
Working with Issues¶
Issues are tracked on GitHub. Use gh to view and manage them (e.g., gh issue list, gh issue view <number>).
Project Structure¶
redhound/
├── cli/ # Command-line interface
│ ├── main.py # CLI entry point
│ ├── models.py # CLI data models
│ └── utils.py # CLI utilities
│
├── backend/ # Main application package
│ ├── agents/ # AI agents
│ │ ├── analysts/ # Technical, fundamentals, sentiment, news, market_context, sector, insider, options_flow, short_interest, earnings_revisions
│ │ ├── researchers/ # Bull, bear researchers
│ │ ├── managers/ # Research manager
│ │ └── risk_overlay.py # Deterministic risk overlay
│ │
│ ├── orchestration/ # Workflow orchestration (LangGraph)
│ │ ├── trading_graph.py # Main workflow graph
│ │ ├── conditional_logic.py # Routing logic
│ │ ├── signal_aggregator.py # Signal aggregation node
│ │ └── propagation.py # Workflow execution
│ │
│ ├── data/ # Market data layer
│ │ ├── interface.py # Unified data interface
│ │ ├── cache.py # Redis caching
│ │ ├── memory.py # pgvector vector memory
│ │ ├── llm_factory.py # LLM instance creation
│ │ ├── vendors/ # Data source implementations
│ │ ├── tools/ # LangChain tool wrappers
│ │ └── utils/ # Data processing utilities
│ │
│ ├── api/ # FastAPI endpoints
│ │ ├── app.py # FastAPI application
│ │ ├── health.py # Health check endpoint
│ │ └── metrics.py # Metrics endpoint
│ │
│ ├── config/ # Configuration
│ │ └── settings.py # Default configuration
│ │
│ ├── models/ # Shared domain models
│ │ └── agent_states.py # Agent state schemas
│ │
│ ├── services/ # Business logic services
│ │ ├── signal_service.py
│ │ ├── market_data_service.py
│ │ ├── analytics_service.py
│ │ ├── session_service.py
│ │ ├── scanner_service.py
│ │ ├── screener_engine.py
│ │ └── base.py
│ │
│ ├── utils/ # Shared utilities
│ │ ├── logging.py # Structured logging
│ │ ├── metrics.py # Prometheus metrics
│ │ ├── mock_llm.py # Mock LLM for testing
│ │ └── mock_memory.py # Mock memory for testing
│ │
│ ├── tui/ # Textual TUI implementation
│ │
│ └── database/ # Database layer (SQLAlchemy models, pgvector)
│ ├── models/ # ORM models by domain
│ │ ├── base.py # Base, mixins (Timestamp, SoftDelete)
│ │ ├── enums.py # StrEnum types (SignalAction, AgentType, etc.)
│ │ ├── signal.py
│ │ ├── session.py
│ │ ├── agent_analysis.py
│ │ ├── agent_memory.py
│ │ ├── debate.py
│ │ ├── stock_profile.py
│ │ └── cost.py
│ ├── types.py # Custom column types (e.g. Vector for pgvector)
│ └── utils/ # Vector operations (HNSW, similarity search)
│
├── tests/ # Test suite
│ ├── api/ # API tests
│ ├── database/ # Database model tests
│ │ └── models/ # Tests per model module
│ ├── integration/ # Integration tests
│ ├── orchestration/ # Orchestration tests
│ ├── performance/ # Performance tests
│ └── utils/ # Test utilities
│
├── docs/ # Documentation (MkDocs)
│ ├── content/ # Markdown documentation
│ └── mkdocs/ # MkDocs configuration and assets
│
├── docker/ # Docker configuration
│ ├── entrypoint.sh # Container entrypoint
│ ├── healthcheck.sh # Health check script
│ ├── grafana/ # Grafana provisioning
│ ├── prometheus/ # Prometheus configuration
│ └── postgres/ # PostgreSQL initialization
│
├── scripts/ # Utility scripts
│ ├── sync_issues.py # GitHub issues synchronization
│ ├── cleanup_artifacts.py # Cleanup temporary files
│ └── validate_app.py # Application validation
│
├── .github/ # GitHub configuration
│ ├── workflows/ # CI/CD workflows
│ └── actions/ # Reusable actions
│
├── pyproject.toml # Project metadata and dependencies
├── uv.lock # Dependency lockfile
├── Dockerfile # Container image definition
├── docker-compose.yml # Local services orchestration
├── Makefile # Development automation
└── README.md # Project overview
Common Development Tasks¶
Running the CLI¶
# Activate virtual environment
source .venv/bin/activate
# Run CLI in mock mode (no API costs)
REDHOUND_MOCK_MODE=true redhound
# Run CLI with real APIs
redhound
# Run CLI with specific ticker and date
redhound --ticker AAPL --date 2024-12-01
Viewing Logs¶
# Application logs (if file logging enabled)
tail -f backend/logs/trading.log
# Docker service logs
docker-compose logs -f app
docker-compose logs -f postgres
docker-compose logs -f redis
Accessing Monitoring¶
# Prometheus UI
open http://localhost:9090
# Grafana UI
open http://localhost:3000
# Default credentials: admin / admin
Debugging¶
# Run with debug logging
export REDHOUND_LOG_LEVEL=DEBUG
redhound
# Use Python debugger
python -m pdb -m cli.main
# Use ipdb for better debugging
uv pip install ipdb
# Add breakpoint in code: import ipdb; ipdb.set_trace()
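Beyond pdb and ipdb, Python 3.7+'s built-in breakpoint() can be routed to ipdb via the PYTHONBREAKPOINT environment variable; a small illustration (the function here is hypothetical):

```python
# Python 3.7+ ships a built-in breakpoint() hook; setting PYTHONBREAKPOINT
# routes it to ipdb without editing code:
#   export PYTHONBREAKPOINT=ipdb.set_trace
def compute_signal(price: float, threshold: float = 100.0) -> str:
    # breakpoint()  # uncomment to pause here under pdb/ipdb
    return "BUY" if price < threshold else "HOLD"
```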
Cleaning Up¶
# Remove virtual environment
make clean-venv
# Stop Docker services
make down
# Remove Docker volumes (WARNING: deletes data)
make reset CONFIRM=true
# Clean Python cache files
make clean-cache
# Clean all build artifacts
make clean
CI/CD Pipeline¶
The project uses GitHub Actions for continuous integration and deployment.
Pipeline Stages¶
| Stage | Description |
|---|---|
| Check Lock | Verify uv.lock is in sync with pyproject.toml |
| Lint | Run ruff linter and formatter |
| Type Check | Run pyright type checker |
| Security | Run bandit and pip-audit security scanners |
| Detect Secrets | Scan for committed secrets |
| Pre-commit | Run pre-commit hooks |
| Test | Run pytest test suite with coverage |
| Docker Build | Build and scan Docker image |
| Notification | Send Slack notification (if configured) |
Viewing CI Results¶
# View workflow runs
gh run list
# View specific run
gh run view <run-id>
# View logs for failed job
gh run view <run-id> --log-failed
Fixing CI Failures¶
Using Makefile (recommended):
# Run all CI checks locally
make ci-check
# Run individual checks
make check-lock-sync # Verify lock file sync
make quality # Run all quality checks
make test # Run tests
# Auto-fix formatting issues
make format
Manual commands:
# Run the same checks locally
uv run pre-commit run --all-files
# Fix linting issues
uv run ruff check --fix .
uv run ruff format .
# Fix type errors
npx pyright backend/ cli/
# Run tests
REDHOUND_MOCK_MODE=true make test
Documentation¶
Building Documentation Locally¶
# Install documentation dependencies
uv sync --locked --extra docs
# Serve documentation locally (port 8002)
cd docs/mkdocs
uv run mkdocs serve
# Open in browser
open http://127.0.0.1:8002
To build static documentation, run cd docs/mkdocs && uv run mkdocs build.
Updating Documentation¶
# Edit markdown files in docs/content/
vim docs/content/architecture.md
# Preview changes
cd docs/mkdocs && uv run mkdocs serve
# Commit changes
git add docs/content/architecture.md
git commit -m "docs: update architecture documentation"
Documentation Structure¶
Documentation is organized in the MkDocs navigation (see the sidebar when serving with mkdocs serve):
- Getting Started: Developer Onboarding, Configuration Reference
- System Architecture: Architecture Overview, Signal Aggregation, Technical Indicators, API Reference, MVP Dashboard Design
- Data & Database: Database Overview, Agent-Database Integration, pgvector Setup
- Development: Mock Mode, CLI Usage, Input Validation, Testing Guide, Error Handling
- Deployment & Operations: DevOps Operations, Docker Setup, CI/CD Pipeline, Monitoring & Metrics
- Planning: Product Roadmap
Getting Help¶
Internal Resources¶
- Documentation: https://redhound.pages.dev
- GitHub Issues: https://github.com/redhound-labs/redhound/issues
- Slack: https://teamredhound.slack.com
Team Contacts¶
For questions or code review requests, open a GitHub issue or tag the relevant team member in a pull request. See the GitHub repository for contributor information.
Common Issues¶
| Issue | Solution |
|---|---|
| Virtual environment broken or stale | make clean-venv && make install-dev |
| uv.lock out of sync with pyproject.toml | make lock-sync |
| Docker services in a bad state | make down && docker-compose down -v && make up |
| Missing API keys | Check with env \| grep API_KEY, or set REDHOUND_MOCK_MODE=true |
Best Practices¶
Code Quality¶
- Write clear, self-documenting code
- Add docstrings to all public functions and classes
- Follow PEP 8 style guidelines (enforced by ruff)
- Use type hints for function parameters and return values
- Keep functions small and focused (single responsibility)
- Avoid deep nesting (max 3-4 levels)
Testing¶
- Write tests for all new features
- Aim for >80% code coverage
- Use mock mode for fast, cost-free testing
- Test edge cases and error conditions
- Use descriptive test names: test_<what>_<condition>_<expected>
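As a concrete (hypothetical) example of that naming scheme:

```python
def test_signal_aggregator_empty_input_returns_hold():
    """test_<what>_<condition>_<expected>: aggregator, empty input, HOLD."""
    signals: list[str] = []
    # Hypothetical rule: with no analyst signals, the aggregator stays neutral
    result = "HOLD" if not signals else signals[0]
    assert result == "HOLD"

test_signal_aggregator_empty_input_returns_hold()
```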
Git Workflow¶
- Commit frequently with meaningful messages
- Keep commits atomic (one logical change per commit)
- Review your own changes before pushing
- Rebase feature branches on dev regularly
- Squash commits before merging (if requested)
Documentation¶
- Update documentation when changing behavior
- Add code comments for non-obvious logic
- Keep README and docs in sync with code
- Use diagrams for complex workflows
Data Access Patterns¶
✅ CORRECT: Use route_to_vendor for all market data access
from backend.data.interface import route_to_vendor
# Fetch stock data through the routing system
data = route_to_vendor("get_stock_data", "AAPL", "2024-01-01", "2024-12-31")
Benefits of using route_to_vendor:
- ✅ Data quality validation (empty data, NaN values, low quality scores)
- ✅ Automatic caching (performance optimization)
- ✅ Metrics collection (observability)
- ✅ Error handling and retry logic (reliability)
- ✅ Circuit breakers (fault tolerance)
❌ INCORRECT: Direct vendor imports
import fmp # BANNED in agent layer
ticker = fmp.get_ticker("AAPL") # Bypasses validation, caching, and metrics
data = ticker.history(start="2024-01-01", end="2024-12-31")
Linting Enforcement:
Direct vendor imports are prohibited by Ruff configuration outside the vendor layer:
- ✅ Allowed: backend/data/vendors/*.py (vendor implementation)
- ✅ Allowed: backend/data/utils/ (vendor utilities)
- ❌ Banned: backend/agents/**/*.py (all agent code)
- ❌ Banned: backend/orchestration/**/*.py (orchestration code)
- ❌ Banned: Other application code outside vendor layer
The linting rule will fail your build if you import vendor libraries directly outside allowed files.
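To make the routing pattern concrete, here is a minimal sketch of such a layer; apart from the route_to_vendor name, every identifier is an assumption, and the real implementation in backend/data/interface.py is far more complete (retries, circuit breakers, metrics):

```python
# Minimal sketch of a vendor-routing layer modeled on the description above.
VENDORS: dict = {}
_CACHE: dict = {}

def register_vendor(operation: str):
    """Decorator registering a vendor function under an operation name."""
    def wrap(fn):
        VENDORS[operation] = fn
        return fn
    return wrap

@register_vendor("get_stock_data")
def _get_stock_data(ticker: str, start: str, end: str) -> dict:
    # Hypothetical stand-in for a real vendor call
    return {"ticker": ticker, "start": start, "end": end, "rows": 252}

def route_to_vendor(operation: str, *args):
    """Route a data request through caching and data-quality validation."""
    if operation not in VENDORS:
        raise KeyError(f"unknown operation: {operation}")
    key = (operation, args)
    if key in _CACHE:              # automatic caching
        return _CACHE[key]
    result = VENDORS[operation](*args)
    if not result:                 # data-quality validation: reject empty payloads
        raise ValueError(f"{operation} returned empty data")
    _CACHE[key] = result
    return result
```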
Security¶
- Never commit API keys or secrets
- Use environment variables for sensitive data
- Review dependencies for vulnerabilities regularly
- Follow principle of least privilege
Next Steps¶
Now that your environment is set up:
- Explore the codebase: Start with backend/orchestration/trading_graph.py
- Run a test analysis: REDHOUND_MOCK_MODE=true redhound
- Read the architecture docs: System Overview
- Learn about mock mode: Mock Mode Guide
- Understand CI/CD: CI/CD Pipeline
- Pick an issue: Find a good first issue on GitHub
- Ask questions: Reach out on Slack if you need help
Welcome to the Team
You're all set! Start exploring and contributing to the project.