Workflow Step: gemini → plan_architecture
This document outlines the detailed architecture plan for the microservice generated by the "Microservice Scaffolder" workflow and serves as the blueprint for the subsequent generation steps. It details the structural components, technology stack, and operational considerations for a standard generated microservice, aiming for a robust, scalable, and maintainable foundation that adheres to modern cloud-native principles. The generated microservice will be self-contained, containerized, and ready for deployment in a modern CI/CD pipeline.
The generated microservice will follow a layered architecture, focusing on separation of concerns, modularity, and ease of testing.
To provide a concrete and actionable plan, we will define a default technology stack. The scaffolder will be designed to potentially allow for customization of these choices in future iterations.
The default stack also covers observability: structured logging (the standard logging module with a JSON formatter) and a /metrics endpoint for metrics exposure. The microservice will be structured around the following key components:
The file and directory structure of the generated microservice will be logical and adhere to common best practices.
├── my_microservice/
│   ├── src/
│   │   ├── api/
│   │   │   ├── __init__.py
│   │   │   └── v1/
│   │   │       ├── __init__.py
│   │   │       ├── endpoints/      # API route definitions (e.g., users.py, items.py)
│   │   │       └── schemas/        # Pydantic models for request/response bodies
│   │   ├── core/
│   │   │   ├── __init__.py
│   │   │   ├── config.py           # Application configuration settings
│   │   │   ├── dependencies.py     # Common dependencies (e.g., database session)
│   │   │   ├── exceptions.py       # Custom exception classes
│   │   │   └── security.py         # Placeholder for auth utilities
│   │   ├── db/
│   │   │   ├── __init__.py
│   │   │   ├── base.py             # Base for SQLAlchemy models
│   │   │   ├── session.py          # Database session management
│   │   │   └── models/             # SQLAlchemy ORM models (e.g., user.py, item.py)
│   │   ├── services/               # Business logic (e.g., user_service.py, item_service.py)
│   │   ├── main.py                 # FastAPI application entry point
│   │   └── __init__.py
│   ├── tests/
│   │   ├── unit/
│   │   ├── integration/
│   │   └── e2e/
│   ├── migrations/                 # Alembic database migration scripts
│   ├── scripts/                    # Utility scripts (e.g., initial data load)
│   ├── .env.example                # Example environment variables
│   ├── Dockerfile                  # Docker build instructions
│   ├── docker-compose.yml          # Local development setup
│   ├── requirements.txt            # Python dependencies
│   ├── pyproject.toml              # Poetry/PDM configuration (optional)
│   ├── README.md                   # Project documentation
│   └── .gitignore
├── .github/
│   └── workflows/
│       └── ci-cd.yml               # GitHub Actions workflow
├── k8s/                            # Kubernetes deployment manifests
│   ├── templates/
│   │   ├── deployment.yaml
│   │   ├── service.yaml
│   │   ├── ingress.yaml
│   │   └── configmap.yaml
│   └── HelmChart.yaml              # Helm chart configuration
└── terraform/                      # Infrastructure as Code (e.g., AWS ECS/Fargate)
    ├── main.tf
    ├── variables.tf
    └── outputs.tf
Core (src/core/)
* config.py: Centralized configuration management using Pydantic Settings, loading from environment variables (.env file) with sensible defaults. This handles database credentials, API keys, service settings, etc.
* dependencies.py: Defines common FastAPI dependencies, such as database session providers, authentication checks, and dependency injection for services.
* exceptions.py: Custom exception classes for specific application errors, mapped to appropriate HTTP status codes.
* security.py: Placeholder for JWT handling, OAuth2, API key validation, etc.

API (src/api/v1/)
* endpoints/: Each file represents a logical group of API routes (e.g., users.py for user-related endpoints, items.py for item-related endpoints).
* schemas/: Contains Pydantic models for request bodies, response models, and input validation. This ensures strong typing and automatic documentation.

Database (src/db/)
* base.py: Defines the declarative base for SQLAlchemy models.
* session.py: Manages SQLAlchemy engine and session creation, providing a context-managed session for database operations.
* models/: Each file defines a SQLAlchemy ORM model representing a database table, including relationships and column definitions.

Services (src/services/)
* Business logic, one module per domain (e.g., user_service.py handles user-related business logic).

The microservice will be fully containerized to ensure portability and consistent environments across development, testing, and production.
Dockerfile:
* Multi-stage build for optimized image size (e.g., a separate build stage for dependencies and a smaller runtime stage).
* Uses a slim Python base image.
* Copies application code, installs dependencies.
* Sets up appropriate working directory, user, and permissions.
* Exposes the application port (e.g., 8000).
* Defines the entry point command (e.g., uvicorn src.main:app --host 0.0.0.0 --port 8000).
docker-compose.yml:
* Defines services for local development: the microservice itself, a PostgreSQL database, and optionally a PgAdmin container.
* Configures network, volumes for data persistence, and environment variables.
* Enables easy local setup and testing with docker-compose up.
The API will adhere to RESTful principles and best practices:
* Resource-oriented, plural-noun paths (e.g., /users, /items).
* Standard HTTP methods: GET for retrieval, POST for creation, PUT/PATCH for updates, and DELETE for removal.
* URL-based versioning (e.g., /api/v1/).

Database migrations are managed with Alembic:
* Initial migration script generated for the base models.
* Instructions for generating new migrations.
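For reference, the typical Alembic workflow looks like the following; the revision message is illustrative:

```shell
# Generate a new migration by diffing the ORM models against the database
alembic revision --autogenerate -m "add base models"

# Apply all pending migrations
alembic upgrade head

# Roll back the most recent migration
alembic downgrade -1
```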
A comprehensive testing suite is crucial for microservice reliability. The scaffolder will generate a structured test directory and example tests.
Unit Tests (tests/unit/):
* Focus on individual functions, methods, and classes in isolation.
* Mocks external dependencies (database, external APIs).
* Uses pytest.
Integration Tests (tests/integration/):
* Verify interactions between different components (e.g., an API endpoint calling a service, a service interacting with the database).
* May use an in-memory database or a dedicated test database container.
* Uses pytest with fixtures for setup/teardown.
End-to-End Tests (tests/e2e/) (Optional but Recommended):
* Test the entire system flow from the client perspective.
* Typically involves deploying the microservice and its dependencies to a test environment.
* Can use pytest with httpx or requests to interact with the API.
* Coverage: pytest-cov to measure test coverage.

A robust CI/CD pipeline will be configured using GitHub Actions to automate the build, test, and deployment process.
ci-cd.yml (GitHub Actions):
* Trigger: on push to main (or master) and on pull_request targeting main.
* Stages:
1. Build & Lint:
* Installs dependencies.
* Runs linters (e.g., flake8, black, isort).
* Builds Docker image (without pushing).
2. Test:
* Runs unit and integration tests.
* Generates code coverage report.
3. Security Scan (Optional):
* Scans Docker image for vulnerabilities (e.g., Trivy, Snyk).
* Scans dependencies for known vulnerabilities.
4. Push Docker Image:
* Logs into Docker registry (e.g., Docker Hub, ECR).
* Tags and pushes the Docker image.
5. Deploy (Conditional):
* Deploys to a staging environment on successful build and tests.
* Manual approval step for production deployment.
* Uses kubectl or terraform apply commands.
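A minimal sketch of the stages above as a GitHub Actions workflow. Job names, the image tag, and the registry/deploy details are illustrative assumptions, not the generated file:

```yaml
name: ci-cd
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - run: pip install -r requirements.txt
      - run: flake8 src tests          # Lint
      - run: pytest --cov=src          # Unit + integration tests with coverage
      - run: docker build -t my-service:${{ github.sha }} .

  push-image:
    needs: build-and-test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Registry login and push would go here (e.g., docker/login-action),
      # followed by a deploy job gated by a manual-approval environment.
```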
The microservice will be designed for deployment on container orchestration platforms.
* k8s/templates/:
* deployment.yaml: Defines the Kubernetes Deployment for the microservice (number of replicas, container image, resource limits/requests, environment variables, health probes).
* service.yaml: Defines the Kubernetes Service to expose the application within the cluster.
* ingress.yaml: Defines the Ingress resource for external access via an Ingress Controller.
* configmap.yaml: For non-sensitive configuration data, exposed to the container as environment variables or mounted files.
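As a sketch, the ConfigMap and its consumption in the Deployment could look like this; the resource name and keys are illustrative assumptions:

```yaml
# k8s/templates/configmap.yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: my-microservice-config
data:
  LOG_LEVEL: "info"
  API_V1_STR: "/api/v1"
---
# In deployment.yaml, the container can import all keys as environment
# variables via spec.template.spec.containers[0]:
#   envFrom:
#     - configMapRef:
#         name: my-microservice-config
```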
This deliverable outlines the complete microservice scaffold, meticulously designed with a robust architecture, containerization, API definitions, database integration, testing framework, and CI/CD configuration. This output is ready for direct consumption and serves as a strong foundation for further development.
This document provides a comprehensive scaffold for a new microservice, "Product Service," using a modern Python-based stack. It includes the core application logic, API definitions, database models, Docker setup, testing suite, CI/CD pipeline configuration, and deployment scripts.
Key Technologies Used:
* FastAPI with Uvicorn (async web framework and ASGI server)
* SQLAlchemy 2.x (async) with asyncpg and PostgreSQL
* Alembic for database migrations
* Pydantic / pydantic-settings for validation and configuration
* Poetry for dependency management
* Docker and docker-compose; GitHub Actions for CI/CD
* pytest, pytest-asyncio, and httpx for testing
The generated project adheres to a standard, maintainable structure.
.
├── .github/
│ └── workflows/
│ └── main.yml # GitHub Actions CI/CD workflow
├── alembic/
│ ├── versions/ # Database migration scripts
│ ├── env.py # Alembic environment script
│ └── script.py.mako # Alembic migration template
├── app/
│ ├── api/
│ │ └── v1/
│ │ ├── endpoints/
│ │ │ └── products.py # Product API endpoints
│ │ └── dependencies.py # API-level dependencies (e.g., DB session)
│ ├── core/
│ │ └── config.py # Application settings and configuration
│ ├── crud/
│ │ └── products.py # CRUD operations for Product model
│ ├── db/
│ │ ├── database.py # Database connection and session setup
│ │ └── models.py # SQLAlchemy ORM models
│ ├── exceptions/
│ │ └── custom_exceptions.py # Custom exception definitions
│ ├── schemas/
│ │ └── products.py # Pydantic schemas for request/response
│ └── main.py # FastAPI application entry point
├── scripts/
│ ├── deploy.sh # Generic deployment script
│ └── run_migrations.sh # Script to run database migrations
├── tests/
│ ├── api/
│ │ └── v1/
│ │ └── test_products.py # Tests for product API endpoints
│ └── conftest.py # Pytest fixtures for testing
├── .dockerignore # Files to ignore when building Docker image
├── .env.example # Example environment variables
├── .gitignore # Git ignore file
├── Dockerfile # Dockerfile for building the application image
├── docker-compose.yml # Docker Compose for local development
├── alembic.ini # Alembic configuration file
├── pyproject.toml # Poetry project definition and dependencies
├── README.md # Project README
└── pytest.ini # Pytest configuration
pyproject.toml (Poetry Configuration)

This file defines project metadata and dependencies.
[tool.poetry]
name = "product-service"
version = "0.1.0"
description = "A FastAPI microservice for managing products."
authors = ["Your Name <your.email@example.com>"]
readme = "README.md"
[tool.poetry.dependencies]
python = "^3.10"
fastapi = "^0.111.0"
uvicorn = {extras = ["standard"], version = "^0.30.1"}
sqlalchemy = {extras = ["asyncio"], version = "^2.0.30"}
pydantic = {extras = ["email"], version = "^2.7.1"}
pydantic-settings = "^2.2.1"
alembic = "^1.13.1"
asyncpg = "^0.29.0"
python-dotenv = "^1.0.1"
[tool.poetry.group.dev.dependencies]
pytest = "^8.2.0"
pytest-asyncio = "^0.23.6"
httpx = "^0.27.0"
mypy = "^1.10.0"
ruff = "^0.4.4"
black = "^24.4.2"
isort = "^5.13.2"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
[tool.ruff]
line-length = 120
target-version = "py310"
[tool.ruff.lint]
select = ["E", "F", "I", "W", "C90", "N", "D"] # Common linting rules
ignore = ["D100", "D104", "D105"] # Ignore missing docstrings for modules, packages, and magic methods
[tool.ruff.format]
quote-style = "double"
indent-style = "space"
skip-magic-trailing-comma = false
line-ending = "auto"
app/core/config.py (Application Settings)

Manages environment variables and application-wide settings using Pydantic Settings.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    """
    Application settings loaded from environment variables.
    pydantic-settings will automatically load from a .env file if present.
    """

    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    PROJECT_NAME: str = "Product Service"
    API_V1_STR: str = "/api/v1"

    # Defaults target the docker-compose services for local development;
    # override via environment variables or .env in real deployments.
    # (Declaring these fields without defaults would make Settings() raise
    # a validation error before any fallback code could run.)
    DATABASE_URL: str = "postgresql+asyncpg://user:password@db:5432/products_db"
    TEST_DATABASE_URL: str = "postgresql+asyncpg://user:password@localhost:5433/test_products_db"

    # Example: JWT settings (if authentication were fully implemented)
    SECRET_KEY: str = "YOUR_SUPER_SECRET_KEY"  # CHANGE THIS IN PRODUCTION
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 30

settings = Settings()
app/db/database.py (Database Connection)

Sets up the SQLAlchemy engine and session for asynchronous operations.
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
from sqlalchemy.orm import declarative_base
from app.core.config import settings
# Create an asynchronous engine to connect to the PostgreSQL database
# echo=True will log all SQL statements, useful for debugging
engine = create_async_engine(settings.DATABASE_URL, echo=True)
# Create a sessionmaker for managing database sessions
# expire_on_commit=False prevents objects from being expired after commit,
# allowing them to be accessed outside the session (with caution)
AsyncSessionLocal = async_sessionmaker(autocommit=False, autoflush=False, bind=engine, class_=AsyncSession)
# Base class for our SQLAlchemy ORM models
Base = declarative_base()
async def get_db():
    """
    Dependency function that provides an async database session.
    FastAPI consumes this generator, ensuring the session is properly
    closed after the request is processed: 'async with' handles the
    context management, so no explicit close is needed here.
    """
    async with AsyncSessionLocal() as session:
        yield session
app/db/models.py (SQLAlchemy ORM Models)

Defines the Product model, mapping it to a database table.
from datetime import datetime
from sqlalchemy import Column, Integer, String, Float, DateTime
from app.db.database import Base
class Product(Base):
    """
    SQLAlchemy ORM model for the 'products' table.
    Represents a product in the inventory.
    """

    __tablename__ = "products"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, unique=True, index=True, nullable=False)
    description = Column(String, nullable=True)
    price = Column(Float, nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow, nullable=False)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow, nullable=False)

    def __repr__(self):
        return f"<Product(id={self.id}, name='{self.name}', price={self.price})>"
app/schemas/products.py (Pydantic Schemas)

Defines Pydantic models for request body validation and response serialization.
from datetime import datetime
from pydantic import BaseModel, Field, PositiveFloat
from typing import Optional
class ProductBase(BaseModel):
    """
    Base schema for product data, used for common fields.
    """

    name: str = Field(..., min_length=3, max_length=100, description="Name of the product")
    description: Optional[str] = Field(None, max_length=500, description="Description of the product")
    price: PositiveFloat = Field(..., description="Price of the product (must be positive)")

class ProductCreate(ProductBase):
    """
    Schema for creating a new product. Inherits from ProductBase.
    """

class ProductUpdate(ProductBase):
    """
    Schema for updating an existing product. All fields are optional.
    """

    name: Optional[str] = Field(None, min_length=3, max_length=100, description="Name of the product")
    price: Optional[PositiveFloat] = Field(None, description="Price of the product (must be positive)")

class ProductInDB(ProductBase):
    """
    Schema for product data as stored in the database, including auto-generated fields.
    Used for API responses.
    """

    id: int = Field(..., description="Unique identifier of the product")
    created_at: datetime = Field(..., description="Timestamp when the product was created")
    updated_at: datetime = Field(..., description="Timestamp when the product was last updated")

    model_config = {"from_attributes": True}  # Enable ORM mode in Pydantic v2
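One reason ProductUpdate makes every field optional is to support partial (PATCH-style) updates: model_dump(exclude_unset=True) then yields only the fields the client actually sent. A minimal standalone sketch, where the model is a trimmed-down stand-in for the generated schema, not the schema itself:

```python
from typing import Optional
from pydantic import BaseModel, Field

class ProductUpdate(BaseModel):
    # Trimmed-down stand-in for the generated schema: all fields optional
    name: Optional[str] = Field(None, min_length=3, max_length=100)
    description: Optional[str] = Field(None, max_length=500)
    price: Optional[float] = Field(None, gt=0)

patch = ProductUpdate(price=19.99)

# Only the explicitly provided fields survive, so unset fields are not
# overwritten with None when applying the update to the ORM object.
changes = patch.model_dump(exclude_unset=True)
print(changes)  # {'price': 19.99}
```

A service layer can then loop over `changes.items()` and call `setattr` on the fetched ORM instance before committing.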
app/crud/products.py (CRUD Operations)

Encapsulates database interaction logic for the Product model.
from typing import List, Optional
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, update, delete
from app.db.models import Product
from app.schemas.products import ProductCreate, ProductUpdate
class CRUDProduct:
    """
    Performs Create, Read, Update, Delete (CRUD) operations on Product models.
    """

    async def get_product(self, db: AsyncSession, product_id: int) -> Optional[Product]:
        """
        Retrieves a single product by its ID.
        """
        stmt = select(Product).where(Product.id == product_id)
        result = await db.execute(stmt)
        return result.scalar_one_or_none()
This document provides a detailed overview and access to the scaffolded microservice, complete with its core components, Docker setup, testing framework, CI/CD pipeline configurations, and initial deployment scripts. This deliverable is designed to be a ready-to-use foundation, enabling rapid development and deployment of your new service.
We have successfully generated a robust microservice foundation, designed for scalability, maintainability, and ease of deployment. This scaffolding includes:
* docker-compose.yml for local development and production-like environments.

Assumptions for this Scaffold:
The generated microservice follows a standard, organized directory structure to promote clarity and maintainability.
.
├── src/
│ ├── main.py # FastAPI application entry point
│ ├── api/ # API routes definition
│ │ ├── __init__.py
│ │ ├── v1/
│ │ │ ├── __init__.py
│ │ │ ├── endpoints/ # Specific endpoint definitions
│ │ │ │ ├── health.py
│ │ │ │ └── items.py
│ │ │ └── schemas/ # Pydantic models for request/response
│ │ │ ├── item.py
│ │ │ └── health.py
│ ├── core/ # Core application logic, configuration
│ │ ├── __init__.py
│ │ ├── config.py # Environment and application settings
│ │ └── security.py # Authentication/Authorization stubs
│ ├── database/ # Database related files
│ │ ├── __init__.py
│ │ ├── connection.py # Database session management
│ │ ├── models/ # SQLAlchemy ORM models
│ │ │ ├── __init__.py
│ │ │ └── item.py
│ │ └── migrations/ # Alembic migration scripts
│ │ ├── env.py
│ │ ├── script.py.mako
│ │ └── versions/
│ └── services/ # Business logic services
│ ├── __init__.py
│ └── item_service.py
├── tests/
│ ├── __init__.py
│ ├── unit/ # Unit tests for individual components
│ │ ├── test_config.py
│ │ └── test_item_service.py
│ └── integration/ # Integration tests for API endpoints
│ └── test_items_api.py
├── scripts/
│ ├── deploy_local.sh # Script for local deployment
│ ├── deploy_aws_ecs.sh # Example for AWS ECS deployment
│ └── db/
│ └── init_db.sh # Script to initialize database
├── .github/ # GitHub Actions CI/CD configuration
│ └── workflows/
│ └── ci-cd.yml
├── .gitlab-ci.yml # GitLab CI/CD configuration
├── Dockerfile # Docker build instructions for the microservice
├── docker-compose.yml # Docker Compose for local development stack
├── requirements.txt # Python dependencies
├── alembic.ini # Alembic configuration
├── pyproject.toml # Poetry/Pipenv project definition (if used, otherwise requirements.txt)
├── README.md # Project overview and setup instructions
└── .env.example # Example environment variables
API Endpoints (src/api/v1/endpoints/)

Routes are grouped by API version (v1) and resource (e.g., items.py).

Example (src/api/v1/endpoints/items.py):
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from typing import List
from src.api.v1.schemas.item import ItemCreate, ItemResponse
from src.database.connection import get_db
from src.services.item_service import ItemService
router = APIRouter()
@router.post("/", response_model=ItemResponse, status_code=status.HTTP_201_CREATED)
async def create_item(item: ItemCreate, db: Session = Depends(get_db)):
    """
    Creates a new item.
    """
    db_item = ItemService.create_item(db, item)
    return db_item

@router.get("/", response_model=List[ItemResponse])
async def read_items(skip: int = 0, limit: int = 100, db: Session = Depends(get_db)):
    """
    Retrieves a list of items.
    """
    items = ItemService.get_items(db, skip=skip, limit=limit)
    return items

@router.get("/{item_id}", response_model=ItemResponse)
async def read_item(item_id: int, db: Session = Depends(get_db)):
    """
    Retrieves a single item by ID.
    """
    db_item = ItemService.get_item(db, item_id)
    if db_item is None:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Item not found")
    return db_item
Database Models (src/database/models/)

Example (src/database/models/item.py):
from sqlalchemy import Column, Integer, String, Boolean
from src.database.connection import Base
class Item(Base):
    __tablename__ = "items"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String, index=True)
    description = Column(String, nullable=True)
    is_active = Column(Boolean, default=True)

    def __repr__(self):
        return f"<Item(id={self.id}, name='{self.name}')>"
Configuration (src/core/config.py)

Pydantic's BaseSettings is used for environment variable management, ensuring type safety and easy loading.
from pydantic_settings import BaseSettings, SettingsConfigDict
class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env", extra="ignore")

    APP_NAME: str = "Microservice Scaffolder"
    APP_VERSION: str = "0.1.0"
    DEBUG_MODE: bool = False
    DATABASE_URL: str = "postgresql+psycopg2://user:password@db:5432/microservice_db"
    SECRET_KEY: str = "YOUR_SUPER_SECRET_KEY_HERE"  # CHANGE THIS IN PRODUCTION
    ALGORITHM: str = "HS256"
    ACCESS_TOKEN_EXPIRE_MINUTES: int = 30

settings = Settings()
Dockerfile (multi-stage build):
# Stage 1: Builder - Install dependencies
FROM python:3.10-slim-buster AS builder
WORKDIR /app
# Install system dependencies required for psycopg2 and others
RUN apt-get update && apt-get install -y \
build-essential \
libpq-dev \
gcc \
&& rm -rf /var/lib/apt/lists/*
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Stage 2: Runner - Create final image
FROM python:3.10-slim-buster AS runner
WORKDIR /app
# Copy only necessary system dependencies from builder
COPY --from=builder /usr/lib/x86_64-linux-gnu/libpq.so.5 /usr/lib/x86_64-linux-gnu/
# If other build-essential components are needed at runtime, copy them
# For a slim image, avoid copying unnecessary ones.
# Copy installed Python packages
COPY --from=builder /usr/local/lib/python3.10/site-packages /usr/local/lib/python3.10/site-packages
COPY --from=builder /usr/local/bin/alembic /usr/local/bin/alembic
COPY --from=builder /usr/local/bin/uvicorn /usr/local/bin/uvicorn
# Add other executables as needed
# Copy application source code
COPY src ./src
COPY alembic.ini .
# Copy the example env file for local defaults; use proper secrets management in production.
# (Dockerfile instructions do not support trailing comments, hence this separate line.)
COPY .env.example .env
# Expose the port the app runs on
EXPOSE 8000
# Command to run the application using Uvicorn
CMD ["uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "8000"]
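With the Dockerfile above, a build and a local smoke run could look like this; the image name and the connection string are illustrative:

```shell
# Build the image
docker build -t microservice:local .

# Run it, supplying configuration via environment variables
docker run --rm -p 8000:8000 \
  -e DATABASE_URL=postgresql+psycopg2://user:password@host.docker.internal:5432/microservice_db \
  microservice:local
```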
Local Development Stack (docker-compose.yml):
version: '3.8'
services:
app:
build:
context: .
dockerfile: Dockerfile
ports:
- "8000:8000"
environment:
# These can be overridden by a .env file in the project root
DATABASE_URL: postgresql+psycopg2://user:password@db:5432/microservice_db
SECRET_KEY: development_secret_key # IMPORTANT: Change for production!
DEBUG_MODE: "true"
depends_on:
- db
volumes:
- ./src:/app/src # Mount source code for live reloading during development (optional)
command: uvicorn src.main:app --host 0.0.0.0 --port 8000 --reload # --reload for dev
db:
image: postgres:13-alpine
restart: always
environment:
POSTGRES_DB: microservice_db
POSTGRES_USER: user
POSTGRES_PASSWORD: password
ports:
- "5432:5432"
volumes:
- db_data:/var/lib/postgresql/data
volumes:
db_data:
* docker-compose up --build to build images and start services.
* docker-compose down to stop and remove containers.
Testing (tests/)

The generated suite is organized into unit and integration tests.

Unit Tests (tests/unit/)

Example (tests/unit/test_item_service.py):
import pytest
from unittest.mock import MagicMock
from src.services.item_service import ItemService
from src.database.models.item import Item
from src.api.v1.schemas.item import ItemCreate
def test_create_item_service():
    mock_db = MagicMock()
    item_create = ItemCreate(name="Test Item", description="A test description")

    # Configure the mock session's add, commit, and refresh methods
    mock_db.add.return_value = None
    mock_db.commit.return_value = None
    mock_db.refresh.side_effect = lambda x: None  # Simulate refresh

    created_item = ItemService.create_item(mock_db, item_create)

    assert created_item.name == "Test Item"
    assert created_item.description == "A test description"
    assert created_item.is_active is True
    mock_db.add.assert_called_once()
    mock_db.commit.assert_called_once()
Integration Tests (tests/integration/)

API endpoints are exercised with FastAPI's TestClient and a temporary/mock database session.

Example (tests/integration/test_items_api.py):
from fastapi.testclient import TestClient
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
import pytest
from src.main import app
from src.database.connection import get_db, Base
from src.database.models.item import Item
from src.core.config import settings
# Use a separate test database URL
TEST_DATABASE_URL = settings.DATABASE_URL.replace("microservice_db", "test_microservice_db")
engine = create_engine(TEST_DATABASE_URL)
TestingSessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
@pytest.fixture(scope="module")
def test_db():
    # Create tables in the test database
    Base.metadata.create_all(bind=engine)
    yield
    # Drop tables after tests are done
    Base.metadata.drop_all(bind=engine)

@pytest.fixture(scope="function")
def session(test_db):
    connection = engine.connect()
    transaction = connection.begin()
    db = TestingSessionLocal(bind=connection)
    yield db
    db.close()
    transaction.rollback()  # Roll back all changes after each test
    connection.close()

@pytest.fixture(scope="function")
def client(session):
    # Override the get_db dependency so the app uses the test session
    def override_get_db():
        yield session

    app.dependency_overrides[get_db] = override_get_db
    with TestClient(app) as c:
        yield c
    app.dependency_overrides.clear()