Multi-Project Python Development: The Complete Guide

Go beyond single packages! Learn how to structure, configure, and manage complex multi-project Python setups with proper dependency management, testing strategies, and deployment patterns.

In my last article, we covered how to properly install local Python projects with an editable install (pip install -e .). But what happens when you're not building a single package, but an entire ecosystem of interconnected packages?

Welcome to the world of multi-project Python development, where things get simultaneously more powerful and more complicated.

🏗️ The Architecture Decision: Flat vs Nested

When you start building larger applications, you'll hit a crossroads: how do you organize multiple packages that need to work together? You have two main approaches.

Flat Structure: The Monolithic Approach

my-app/
├── pyproject.toml
├── src/
│   ├── my_app/
│   │   ├── __init__.py
│   │   ├── api.py
│   │   ├── models.py
│   │   └── services.py
├── tests/
└── README.md

Configuration:

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src/my_app"]

[project]
name = "my-app"
version = "0.1.0"
dependencies = [
    "fastapi>=0.68.0",
    "pydantic>=2.0.0",
]
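
Inside a flat layout, everything imports through the single top-level package and ships as one unit. Here is a small illustrative sketch of what api.py might contain; the User model and create_user helper are hypothetical names, not part of any required API:

# src/my_app/api.py (illustrative sketch)
from fastapi import FastAPI

from my_app.models import User            # hypothetical Pydantic model in models.py
from my_app.services import create_user   # hypothetical helper in services.py

app = FastAPI()


@app.post("/users")
def add_user(user: User) -> User:
    # One package, one deployable: models, services, and API always ship together.
    return create_user(user)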

Nested Structure: The Ecosystem Approach

my-ecosystem/
├── app-api/
│   ├── pyproject.toml
│   ├── src/app_api/
│   └── tests/
├── app-models/
│   ├── pyproject.toml
│   ├── src/app_models/
│   └── tests/
├── app-services/
│   ├── pyproject.toml
│   ├── src/app_services/
│   └── tests/
└── README.md

Configuration (per package):

# app-models/pyproject.toml
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["src/app_models"]

[project]
name = "app-models"
version = "0.1.0"
dependencies = ["pydantic>=2.0.0"]

[tool.pytest.ini_options]
testpaths = ["tests", "src"]

🎯 When to Choose Each Approach

Choose Flat Structure When:

  • Single application with internal modules
  • Tightly coupled components that always change together
  • Simple deployment (everything or nothing)
  • Small team that owns the entire codebase
  • Internal tools that won't be published separately

Real-world example: A Django web application where models, views, and services are all part of the same deployable unit.

Choose Nested Structure When:

  • Independent packages that can be used separately
  • Different teams own different packages
  • Separate versioning and release cycles
  • Microservices architecture with shared libraries
  • Planning to publish packages to PyPI

Real-world example: the app-models / app-services / app-api setup used throughout this guide, with models, services, and API as separate packages that can be developed and versioned independently.

🔗 The Magic: Inter-Package Dependencies

Here's where nested structure shines. Each package declares its dependencies explicitly:

# app-services/pyproject.toml
[project]
name = "app-services"
version = "0.1.0"
dependencies = [
    "app-models>=0.1.0",  # Depends on models package
    "pydantic>=2.0.0",
]

# app-api/pyproject.toml  
[project]
name = "app-api"
version = "0.1.0"
dependencies = [
    "app-models>=0.1.0",    # Depends on models
    "app-services>=0.1.0",  # Depends on services
    "fastapi>=0.68.0",
    "uvicorn>=0.15.0",
]

This is powerful because:

  1. Explicit contracts - You know exactly what each package needs
  2. Version management - Can upgrade dependencies independently
  3. Testing isolation - Each package can be tested in isolation
  4. Reusability - Packages can be used in other projects
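
In practice, the dependency is just an ordinary absolute import. Here is a minimal sketch of what app-services could look like, assuming app-models exposes a Pydantic User model; the User and UserService names mirror the import checks later in this guide and are illustrative, not a required API:

# app-services/src/app_services/__init__.py (illustrative sketch)
from app_models import User  # resolves because app-models is a declared dependency


class UserService:
    """Thin service layer built on top of models from the app-models package."""

    def __init__(self) -> None:
        self._users: dict[int, User] = {}

    def create_user(self, user: User) -> User:
        # Real code would persist to a database; an in-memory dict keeps the sketch small.
        self._users[user.id] = user  # assumes the hypothetical User model has an id field
        return user

    def get_user(self, user_id: int) -> User | None:
        return self._users.get(user_id)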

🛠️ Development Workflow: The Complete Setup

Step 1: Environment Setup

# Create virtual environment
python -m venv .venv
source .venv/bin/activate  # Linux/Mac
# .venv\Scripts\activate   # Windows

Step 2: Install Packages in Dependency Order

# Install in dependency order (bottom-up)
pip install -e ./app-models
pip install -e ./app-services  
pip install -e ./app-api

Step 3: Verify Installation

# Check what's installed
pip list

# Verify package imports work
python -c "from app_models import User; print('Models work!')"
python -c "from app_services import UserService; print('Services work!')"
python -c "from app_api import app; print('API works!')"

Step 4: Development Dependencies

# Add to each package's pyproject.toml
[project.optional-dependencies]
dev = [
    "pytest>=7.0.0",
    "pytest-cov>=4.0.0",
    "black>=22.0.0",
    "mypy>=1.0.0",
]

# Install with: pip install -e "./package[dev]"
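
If you prefer to wire everything up in one shot, pip accepts multiple editable requirements in a single invocation. A minimal sketch, with the extras quoted so the shell doesn't try to glob the brackets:

# Install all three packages editable, each with its dev extras
pip install -e "./app-models[dev]" -e "./app-services[dev]" -e "./app-api[dev]"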

🧪 Testing Strategies for Multi-Project Setups

Option 1: Individual Package Testing

# Test each package separately
cd app-models && pytest
cd ../app-services && pytest
cd ../app-api && pytest

Option 2: Root-Level Testing with Test Paths

# In each package's pyproject.toml
[tool.pytest.ini_options]
testpaths = ["tests", "src"]
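
You can also drive every suite from the repository root by pointing pytest at the package directories. One caveat: pytest picks a single rootdir and config file per run, so per-package ini options may not all apply. A simple sketch:

# From the repository root: collect and run every package's tests in one invocation
pytest app-models app-services app-api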

Option 3: Integration Testing

Create a separate integration-tests/ directory that tests the full stack:

integration-tests/
├── test_full_workflow.py
├── test_api_integration.py
└── conftest.py
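
Here is a sketch of what test_full_workflow.py could contain, reusing the hypothetical User and UserService names from earlier; your real models and services will differ:

# integration-tests/test_full_workflow.py (illustrative sketch)
from app_models import User
from app_services import UserService


def test_user_round_trip():
    # Exercises app-models and app-services together, across package boundaries.
    service = UserService()
    user = User(id=1, name="Ada")  # assumes a simple Pydantic model with these fields
    created = service.create_user(user)
    assert service.get_user(created.id) == created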

🚀 Deployment Patterns

Pattern 1: All Together (Monolithic Deploy)

# Deploy all packages as a unit
pip install ./app-models ./app-services ./app-api

Pattern 2: Independent Package Deployment

# Deploy packages independently, pinned by version
# (version specifiers resolve against an index or built wheels, not local paths)
pip install app-models==1.2.0
pip install app-services==2.1.0
pip install app-api==3.0.0

Pattern 3: PyPI Publishing

# Build and publish each package
cd app-models && python -m build && twine upload dist/*
cd ../app-services && python -m build && twine upload dist/*
cd ../app-api && python -m build && twine upload dist/*

🔄 Advanced Configuration Patterns

Development Dependencies with Inheritance

# app-api/pyproject.toml
[project.optional-dependencies]
dev = [
    "app-models[dev]>=0.1.0",    # Inherit dev deps
    "app-services[dev]>=0.1.0",  # Inherit dev deps
    "httpx>=0.24.0",             # API testing
    "pytest>=7.0.0",
]

Local Package Dependencies (Advanced)

# For development without publishing to PyPI
[project]
dependencies = [
    "app-models @ file:///path/to/app-models",
    "app-services @ file:///path/to/app-services",
]

Conditional Dependencies

[project.optional-dependencies]
test = ["pytest>=7.0.0", "pytest-cov>=4.0.0"]
dev = ["my-package[test]", "black>=22.0.0", "mypy>=1.0.0"]
prod = ["gunicorn>=20.0"]
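
Each extra is then selected at install time; a quick sketch of how the groups above would be pulled in during development versus deployment:

# From inside the package directory
pip install -e ".[dev]"   # editable install plus test and lint/type tooling
pip install ".[prod]"     # regular install plus the production server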

📊 Real-World Example: The Full Multi-Project Setup

Let's break down the full example project structure:

Package Relationships

app-models (independent)
    ↑
app-services (depends on models)
    ↑
app-api (depends on both)

Installation Script

#!/bin/bash
# setup.sh - Automated setup script

set -e  # Exit on error

echo "Setting up multi-project environment..."

# Create virtual environment
python -m venv .venv
source .venv/bin/activate

# Install packages in dependency order
echo "Installing app-models..."
pip install -e ./app-models

echo "Installing app-services..."
pip install -e ./app-services

echo "Installing app-api..."
pip install -e ./app-api

echo "Installing development dependencies..."
pip install -e "./app-models[dev]"
pip install -e "./app-services[dev]"
pip install -e "./app-api[dev]"

echo "Running tests..."
pytest app-models/tests/
pytest app-services/tests/
pytest app-api/tests/

echo "Setup complete! 🎉"

Development Workflow

# Day-to-day development
1. Edit code in any package
2. Changes are immediately available (editable install)
3. Run tests for affected package: cd app-services && pytest
4. Test integration: cd app-api && pytest
5. API server auto-reloads: uvicorn app_api:app --reload

⚡ Performance and Productivity Benefits

Development Speed

  • Hot reloading - Edit any package, changes are instant
  • Parallel development - Teams can work on different packages
  • Incremental testing - Only test what changed
  • Faster iteration - No need to reinstall after changes

Code Organization

  • Clear boundaries - Each package has a single responsibility
  • Reduced coupling - Explicit dependencies prevent spaghetti code
  • Better testing - Each package can be tested in isolation
  • Easier onboarding - New developers can understand one package at a time

🎛️ CI/CD Considerations

Multi-Package Pipeline

# .github/workflows/ci.yml
name: CI
on: [push, pull_request]

jobs:
  test-models:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with: {python-version: '3.11'}
      - run: pip install -e "./app-models[dev]"
      - run: pytest app-models/

  test-services:
    needs: test-models
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with: {python-version: '3.11'}
      - run: pip install -e ./app-models -e "./app-services[dev]"
      - run: pytest app-services/

  test-api:
    needs: [test-models, test-services]
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with: {python-version: '3.11'}
      - run: pip install -e ./app-models -e ./app-services -e "./app-api[dev]"
      - run: pytest app-api/

🚨 Common Pitfalls and Solutions

Pitfall 1: Circular Dependencies

# BAD: app-models depends on app-services
# app-services depends on app-models

# Solution: extract the common code into a shared package
# app-shared (independent, used by both)
# app-models (depends on app-shared)
# app-services (depends on app-models and app-shared)
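
A minimal sketch of the fixed dependency graph in each pyproject.toml, using a hypothetical app-shared package for the code both sides need:

# app-shared/pyproject.toml - depends on nothing else in the ecosystem
[project]
name = "app-shared"
version = "0.1.0"

# app-models/pyproject.toml - depends only "downward"
[project]
name = "app-models"
version = "0.1.0"
dependencies = ["app-shared>=0.1.0"]

# app-services/pyproject.toml - depends on models and shared, never the reverse
[project]
name = "app-services"
version = "0.1.0"
dependencies = ["app-models>=0.1.0", "app-shared>=0.1.0"]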

Pitfall 2: Version Conflicts

# Problem: Different packages require different versions
# app-api needs requests==2.25.0
# app-services needs requests==2.28.0

# Solution: Use compatible version ranges
dependencies = ["requests>=2.28.0"]  # Both can use this

Pitfall 3: Import Path Confusion

# WRONG: relative imports can't cross package boundaries
from ..app_models import User  # raises ImportError once the packages are installed

# RIGHT: Package imports work everywhere
from app_models import User

🎯 Best Practices Summary

  1. Start simple - Begin with flat structure, split when needed
  2. Explicit dependencies - Always declare what each package needs
  3. Semantic versioning - Use proper version numbers for packages
  4. Independent testing - Each package should be testable in isolation
  5. Clear interfaces - Define APIs between packages
  6. Documentation - Document the relationships between packages
  7. Automation - Use scripts for setup and common operations

🎉 The Bottom Line

Multi-project Python development seems complex at first, but it's incredibly powerful for building maintainable, scalable applications. The key is understanding your architecture:

  • Flat structure for simple, single-team applications
  • Nested structure for complex, multi-team ecosystems

The example ecosystem above is exactly the kind of project where nested structure makes sense: clear separation of concerns, independent versioning, and the ability to reuse packages across different projects.

The setup might take a few extra minutes, but the payoff in code organization, team productivity, and long-term maintainability is enormous.


What's your experience with multi-project Python development? Have you tried these patterns? Drop a comment below with your thoughts and questions!
