DB-First Design

The Cosailor Template follows a Database-First approach: Alembic migrations are the source of truth for the data model, and automated code generation keeps the database schema, backend models, and frontend types consistent with each other.

Core Principles

1. Database Migrations as Source of Truth

Database schema changes start with Alembic migrations, which define the canonical data model:

  • Version Control: Schema changes are tracked in revision history
  • Reproducible: Same migrations create identical database structure
  • Rollback Support: Each migration includes downgrade functionality
  • Environment Parity: Same migrations run across development, staging, and production

2. Type-Safe Database Models

SQLAlchemy ORM models are defined with full type safety.

  • Type Safety & Validation

    • Compile-time checks via SQLAlchemy and TypeScript
    • Runtime validation through Pydantic DTOs
    • IDE features like auto-completion and refactoring
  • API Contract Management

    • OpenAPI spec defines frontend/backend contract
    • Automated request/response validation
    • Version tracking and compatibility checks
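Runtime validation through Pydantic DTOs can be illustrated with a minimal sketch (the fields here are a subset of the `UserDTO` defined later in this guide):

```python
from pydantic import BaseModel, ValidationError

class UserDTO(BaseModel):
    id: str
    email: str
    displayName: str

# Valid payloads pass through unchanged.
user = UserDTO(id="u1", email="a@example.com", displayName="Ada")

# Missing or mistyped fields are rejected at runtime with a structured error.
try:
    UserDTO(id="u2", email="b@example.com")  # displayName missing
except ValidationError as exc:
    print(f"rejected: {len(exc.errors())} error(s)")
```

The same model class therefore serves both as documentation of the API shape and as an enforcement point at the request/response boundary.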

3. Auto-Generated TypeScript Types

OpenAPI specifications automatically generate TypeScript types for frontend consumption.
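For a sense of what the generator produces, a generated type might look like the following (this is an illustrative sketch, not the actual generated file; the real shape comes from your OpenAPI spec):

```typescript
// Illustrative output of the OpenAPI type generator.
export interface UserDTO {
  id: string;
  email: string;
  displayName: string;
  status: string;
  createdAt: string; // datetimes arrive as ISO 8601 strings in JSON
  tenantId: string;
}

// The TypeScript compiler now rejects typos and missing fields at build time.
const user: UserDTO = {
  id: "u1",
  email: "a@example.com",
  displayName: "Ada",
  status: "active",
  createdAt: "2024-01-01T00:00:00Z",
  tenantId: "t1",
};
```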

Development Workflow


Full Stack Data Flow

1. Alembic Migration → Create database table schema
2. SQLAlchemy Models → Define Python ORM classes
3. Repository (repo.py) → Write database queries
4. Services (services.py) → Business logic + DTO assembly
5. DTOs (dto.py) → Pydantic models for API responses
6. Routes (routes/*.py) → FastAPI HTTP endpoints with auth
7. OpenAPI Spec → Auto-generate from FastAPI app
8. TypeScript SDK → Auto-generate from OpenAPI spec
9. Frontend Usage → Type-safe API calls in Next.js

Steps 7-9 are automated - write Python, get TypeScript types automatically!

Step 1: Create Database Migration

cd apps/core-api/src/db
alembic -c alembic.ini revision -m "add_users_table"

Step 2: Define Migration Schema

from alembic import op
import sqlalchemy as sa

# SCHEMA_NAME is the project's Postgres schema constant,
# defined elsewhere in the migration environment.

def upgrade() -> None:
    op.create_table(
        'users',
        sa.Column('id', sa.String(), primary_key=True),
        sa.Column('email', sa.String(255), nullable=False),
        sa.Column('display_name', sa.String(255), nullable=False),
        sa.Column('status', sa.Enum('active', 'inactive', name='user_status'), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
        sa.Column('tenant_id', sa.String(), sa.ForeignKey('tenants.id'), nullable=False),
        schema=SCHEMA_NAME,
    )

    op.create_index('idx_users_email', 'users', ['email'], unique=True, schema=SCHEMA_NAME)
    op.create_index('idx_users_tenant_id', 'users', ['tenant_id'], schema=SCHEMA_NAME)

def downgrade() -> None:
    op.drop_table('users', schema=SCHEMA_NAME)

Migration Best Practices

  1. Always Include Downgrade - Every upgrade migration must have a corresponding downgrade
  2. Use Transactions - Wrap migrations in database transactions for atomicity
  3. Index Strategy - Add appropriate indexes for query performance
  4. Foreign Key Constraints - Ensure referential integrity with proper foreign keys
  5. Schema Namespacing - Use schema prefixes to avoid naming conflicts
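The transaction point can be demonstrated with the stdlib sqlite3 driver (a stand-in here for the Alembic-managed connection): when a migration step fails partway through, rolling back the transaction leaves the schema untouched.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit; manage transactions explicitly
conn.execute("CREATE TABLE users (id TEXT PRIMARY KEY)")

conn.execute("BEGIN")
try:
    # Schema change and a simulated failure inside one transaction.
    conn.execute("ALTER TABLE users ADD COLUMN email TEXT")
    raise RuntimeError("simulated failure mid-migration")
except RuntimeError:
    conn.execute("ROLLBACK")

cols = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
print(cols)  # ['id']: the added column was rolled back with the transaction
```

Note that transactional DDL depends on the database: SQLite and PostgreSQL support it, while MySQL implicitly commits around most DDL statements.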

Step 3: Create SQLAlchemy Models

import enum
from datetime import datetime
from typing import List

from sqlalchemy import String, DateTime, Enum, ForeignKey
from sqlalchemy.orm import Mapped, mapped_column, relationship
from .base import Base

class UserStatus(enum.Enum):
    # Mirrors the 'user_status' enum created in the migration
    active = "active"
    inactive = "inactive"

class User(Base):
    __tablename__ = "users"
    __table_args__ = {'schema': SCHEMA_NAME}

    id: Mapped[str] = mapped_column(String, primary_key=True)
    email: Mapped[str] = mapped_column(String(255), nullable=False, unique=True)
    display_name: Mapped[str] = mapped_column(String(255), nullable=False)
    status: Mapped[UserStatus] = mapped_column(Enum(UserStatus), nullable=False)
    created_at: Mapped[datetime] = mapped_column(DateTime(timezone=True), nullable=False)

    # Relationships
    tenant_id: Mapped[str] = mapped_column(String, ForeignKey("tenants.id"), nullable=False)
    tenant: Mapped["Tenant"] = relationship("Tenant", back_populates="users")
    roles: Mapped[List["UserRole"]] = relationship("UserRole", back_populates="user")

Step 4: Create Repository Functions

from typing import List, Optional

from sqlalchemy import select
from sqlalchemy.engine import Connection

from .models import User  # adjust the import to the project layout

def get_user_by_id(conn: Connection, user_id: str) -> Optional[User]:
    stmt = select(User).where(User.id == user_id)
    return conn.execute(stmt).scalar_one_or_none()

def get_users_by_tenant(conn: Connection, tenant_id: str) -> List[User]:
    stmt = select(User).where(User.tenant_id == tenant_id)
    return list(conn.execute(stmt).scalars())
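The repository shape is easy to exercise end-to-end. This sketch swaps SQLAlchemy for the stdlib sqlite3 driver purely to keep the example self-contained; the contract is the same: one row or `None`, never an exception for a missing record.

```python
import sqlite3
from typing import Optional

def get_user_by_id(conn: sqlite3.Connection, user_id: str) -> Optional[tuple]:
    # Same contract as the SQLAlchemy version: a single row or None.
    return conn.execute(
        "SELECT id, email FROM users WHERE id = ?", (user_id,)
    ).fetchone()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id TEXT PRIMARY KEY, email TEXT NOT NULL)")
conn.execute("INSERT INTO users VALUES ('u1', 'a@example.com')")

print(get_user_by_id(conn, "u1"))      # ('u1', 'a@example.com')
print(get_user_by_id(conn, "missing")) # None
```

Keeping queries behind small functions like this means the service layer never touches SQL directly, which is what makes the layers above it straightforward to test.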

Step 5: Define Data Transfer Objects (DTOs)

from datetime import datetime
from typing import List

from pydantic import BaseModel

class UserDTO(BaseModel):
    id: str
    email: str
    displayName: str
    status: str
    createdAt: datetime
    tenantId: str

class UserWithRolesDTO(UserDTO):
    roles: List[RoleDTO]  # RoleDTO is defined alongside in dto.py
    managedUsers: List[UserDTO]
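Because DTOs are plain Pydantic models, they round-trip losslessly through JSON, which is what the generated TypeScript client relies on. A minimal sketch with a field subset:

```python
from pydantic import BaseModel

class UserDTO(BaseModel):
    id: str
    email: str
    displayName: str

original = UserDTO(id="u1", email="a@example.com", displayName="Ada")

# Serialize to a JSON string and parse it back; the result is
# field-for-field identical to the original model.
payload = original.json()
restored = UserDTO.parse_raw(payload)
print(restored == original)  # True
```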

Step 6: Create Service Functions

def build_user_dto(user: User) -> UserDTO:
    return UserDTO(
        id=user.id,
        email=user.email,
        displayName=user.display_name,
        status=user.status.value,
        createdAt=user.created_at,
        tenantId=user.tenant_id,
    )

def get_user_profile(conn: Connection, user_id: str) -> Optional[UserWithRolesDTO]:
    user = repo.get_user_by_id(conn, user_id)
    if not user:
        return None

    roles = repo.get_user_roles(conn, user_id)
    managed_users = repo.get_managed_users(conn, user_id)

    return UserWithRolesDTO(
        **build_user_dto(user).dict(),
        roles=[build_role_dto(role) for role in roles],
        managedUsers=[build_user_dto(u) for u in managed_users],
    )

Step 7: Create API Endpoints

@router.get("/users/{user_id}", response_model=UserWithRolesDTO)
def get_user_profile(
    user_id: str,
    auth: ServiceAuthContext = Depends(get_auth_context),
):
    check_can_read_user(user_id, auth)
    return services.get_user_profile(get_db_connection(), user_id)

Step 8: Generate TypeScript SDK

cd apps/core-api
python src/write_openapi.py

cd ../web
npm run generate:sdk

Step 9: Use in Frontend

import { CoreApiClient, UserWithRolesDTO } from '@instalily/core-sdk-ts';

const client = new CoreApiClient({
  baseUrl: process.env.NEXT_PUBLIC_CORE_API_URL,
  authToken: userToken,
});

const userProfile: UserWithRolesDTO = await client.users.get(userId);