This deliverable provides a complete, production-ready microservice scaffold: all necessary code, configurations, and scripts to jumpstart development, including testing, CI/CD, and deployment setup. The chosen stack is Python 3.9+ with FastAPI, PostgreSQL for the database, and Docker for containerization; GitHub Actions provides the CI/CD pipeline.
The generated microservice follows a clean, modular project structure:
microservice-scaffold/
├── .github/
│   └── workflows/
│       └── main.yml       # GitHub Actions CI/CD pipeline configuration
├── app/
│   ├── __init__.py        # Initializes the app package
│   ├── main.py            # FastAPI application entry point, API routes
│   ├── models.py          # SQLAlchemy database models
│   ├── schemas.py         # Pydantic schemas for request/response validation
│   ├── database.py        # Database connection and session management
│   └── crud.py            # CRUD operations for database interaction
├── tests/
│   ├── __init__.py
│   └── test_main.py       # Pytest unit/integration tests for API endpoints
├── .env.example           # Example environment variables
├── Dockerfile             # Docker build instructions for the application
├── docker-compose.yml     # Docker Compose for local development (app + PostgreSQL)
├── requirements.txt       # Python dependencies
├── deploy.sh              # Basic deployment script example
└── README.md              # Project README with setup and usage instructions
Workflow Step: gemini → plan_architecture
Description: Generate a complete microservice with Docker setup, API routes, database models, tests, CI/CD pipeline config, and deployment scripts.
This document outlines the architectural plan for a generic microservice, designed to serve as a robust and scalable foundation for various business functionalities. This plan details the core components, technology stack, design principles, and operational considerations necessary for building a production-ready microservice. The goal is to provide a comprehensive blueprint that will guide the subsequent scaffolding and development phases, ensuring consistency, maintainability, and operational excellence.
This architecture is designed to be cloud-agnostic and emphasizes modern development practices, including containerization, API-first design, and comprehensive observability.
The architecture adheres to the following fundamental microservice principles:
For the purpose of scaffolding, we define a generic microservice focused on managing a core business entity, e.g., a ProductService.
The service exposes CRUD operations for Product entities, including creation, retrieval, update, and deletion. Internal logic handles data persistence, business rules, and integration with potential downstream systems (if any). The core entity is Product (e.g., id, name, description, price, category, stock_quantity, created_at, updated_at). The microservice will be structured into distinct layers to promote separation of concerns:
The domain layer models a behavior-rich Product entity (with methods such as update_stock and apply_discount), while the application layer exposes use cases through a service facade (e.g., ProductCatalogService). To provide a modern, performant, and developer-friendly experience, the following technology stack is recommended for scaffolding:
* Web Framework: FastAPI. Rationale: high performance (ASGI), automatic OpenAPI documentation, Pydantic for data validation, type hints for robust code.
* Background Tasks: Rationale: for background processing, long-running tasks, or deferred operations.
* Database: PostgreSQL. Rationale: robust, open-source relational database, widely supported, ACID compliant.
* ORM: SQLAlchemy with asyncpg. Rationale: powerful and flexible ORM, supports asynchronous operations.
* Validation: Pydantic. Rationale: type-hint based data validation and serialization.
* Containerization: Docker. Rationale: standard for packaging applications and their dependencies.
* CI/CD: GitHub Actions. Rationale: popular, flexible, and integrated option for automated builds and deployments.
* Testing: pytest. Rationale: widely adopted, extensible, and easy-to-use testing framework.
* HTTP Client: httpx. Rationale: modern, async-first HTTP client.
For the ProductService, a relational model is appropriate, with tables like products and categories. The API exposes resource-oriented REST endpoints (/products, /products/{id}):
* GET /products: Retrieve a list of products.
* GET /products/{id}: Retrieve a specific product.
* POST /products: Create a new product.
* PUT /products/{id}: Fully update an existing product.
* PATCH /products/{id}: Partially update an existing product.
* DELETE /products/{id}: Delete a product.
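As an illustration, the endpoint semantics above can be sketched with a plain-Python in-memory store. This is a hypothetical stand-in for the real SQLAlchemy/PostgreSQL layer; names like InMemoryProductStore are invented for this example:

```python
import itertools
from dataclasses import dataclass

@dataclass
class Product:
    id: int
    name: str
    price: float
    stock_quantity: int = 0

    def update_stock(self, delta: int) -> None:
        # Business rule: stock can never go negative.
        if self.stock_quantity + delta < 0:
            raise ValueError("insufficient stock")
        self.stock_quantity += delta

    def apply_discount(self, percent: float) -> None:
        # Business rule: discount must be between 0 and 100 percent.
        if not 0 <= percent <= 100:
            raise ValueError("invalid discount")
        self.price = round(self.price * (1 - percent / 100), 2)

class InMemoryProductStore:
    """Hypothetical stand-in for the CRUD layer backed by PostgreSQL."""

    def __init__(self):
        self._items = {}
        self._ids = itertools.count(1)

    def create(self, name, price, stock_quantity=0):   # POST /products
        product = Product(next(self._ids), name, price, stock_quantity)
        self._items[product.id] = product
        return product

    def list(self):                                    # GET /products
        return list(self._items.values())

    def get(self, product_id):                         # GET /products/{id}
        return self._items.get(product_id)

    def delete(self, product_id):                      # DELETE /products/{id}
        return self._items.pop(product_id, None) is not None
```

In the scaffold itself, these operations live in app/crud.py and are wired to routes in app/main.py.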
API versioning is carried in the URL path (e.g., /v1/products) to allow for future changes without breaking existing clients. List endpoints support pagination (e.g., ?page=1&limit=10) and filtering (e.g., ?category=electronics).

docker-compose will be used to orchestrate the microservice and its dependent services (e.g., the PostgreSQL database) for local development.

Logging:
* Structured JSON logs emitted to stdout for collection by log aggregation systems (e.g., ELK stack, Splunk, Datadog).
* Logging levels (DEBUG, INFO, WARNING, ERROR, CRITICAL) used appropriately.
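A minimal sketch of structured JSON logging using only the standard library (real services often reach for a library such as python-json-logger instead):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Minimal structured-log formatter sketch for stdout collection."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
            "timestamp": self.formatTime(record),
        }
        return json.dumps(payload)

# Emit one JSON object per line so log aggregators can parse each event.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("product-service")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
```

Each call such as logger.warning("low stock") then produces a single machine-parsable JSON line.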
Metrics:
* Application metrics exposed via a /metrics endpoint in Prometheus format.
* Key metrics include request rates, error rates, latency, CPU/memory usage, and custom business metrics.
* Integration with monitoring tools like Prometheus and Grafana.
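To make the /metrics format concrete, here is a tiny counter rendered in the Prometheus text exposition format. This is a dependency-free sketch; in practice the prometheus_client library provides thread-safe metrics and a ready-made exposition endpoint:

```python
class Counter:
    """Tiny counter sketch; prometheus_client is the real implementation."""

    def __init__(self, name: str, help_text: str):
        self.name = name
        self.help_text = help_text
        self.value = 0.0

    def inc(self, amount: float = 1.0) -> None:
        self.value += amount

def render_metrics(counters) -> str:
    # Prometheus text format: HELP and TYPE comment lines, then samples.
    lines = []
    for c in counters:
        lines.append(f"# HELP {c.name} {c.help_text}")
        lines.append(f"# TYPE {c.name} counter")
        lines.append(f"{c.name} {c.value}")
    return "\n".join(lines) + "\n"
```

The /metrics route would simply return this string with content type text/plain.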
Tracing:
* OpenTelemetry SDK integrated for distributed tracing.
* Traces propagated via HTTP headers to link requests across services.
* Integration with tracing backends like Jaeger or Zipkin.
Health Checks:
* /health endpoint for basic liveness checks (e.g., HTTP 200 OK).
* /readiness endpoint for more comprehensive readiness checks (e.g., database connection, external service connectivity).
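The liveness/readiness distinction can be sketched framework-agnostically as plain functions returning a body and status code (the check names here are illustrative; in the scaffold these would be FastAPI routes):

```python
def health():
    # Liveness: the process is up and able to respond at all.
    return {"status": "ok"}, 200

def readiness(checks):
    # Readiness: run each dependency check (e.g., a database ping) and
    # report 503 until all of them pass, so orchestrators hold traffic.
    results = {name: bool(check()) for name, check in checks.items()}
    ready = all(results.values())
    body = {"status": "ready" if ready else "not_ready", "checks": results}
    return body, (200 if ready else 503)
```

Kubernetes liveness and readiness probes would point at these two endpoints respectively.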
A comprehensive testing strategy is crucial for ensuring code quality and reliability:
Unit Tests:
* Focus: Individual functions, methods, and classes in isolation.
* Framework: pytest.
* Coverage: High coverage for business logic and utility functions.
Integration Tests:
* Focus: Interactions between components (e.g., API layer with business logic, business logic with data access layer, database interactions).
* Framework: pytest with a dedicated test database (e.g., testcontainers for ephemeral databases).
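An integration test against an ephemeral database can be sketched with Python's built-in sqlite3 as a lightweight stand-in for a throwaway PostgreSQL container from testcontainers (the function names here are illustrative, not part of the scaffold):

```python
import sqlite3

def create_product(conn, name, price):
    # Data-access function under test: insert a row and return its id.
    cur = conn.execute(
        "INSERT INTO products (name, price) VALUES (?, ?)", (name, price)
    )
    conn.commit()
    return cur.lastrowid

def test_create_and_fetch_product():
    # Fresh database per test: in-memory SQLite here; with testcontainers
    # this would be an ephemeral PostgreSQL instance instead.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)"
    )
    product_id = create_product(conn, "Widget", 9.99)
    row = conn.execute(
        "SELECT name, price FROM products WHERE id = ?", (product_id,)
    ).fetchone()
    assert row == ("Widget", 9.99)
    conn.close()
```

Each test gets its own schema, so tests stay independent and order-insensitive.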
End-to-End (E2E) Tests:
* Focus: Verifying the entire service functionality from an external perspective, interacting with the exposed API.
* Framework: pytest with httpx or similar HTTP client.
Code Quality:
* Linting: ruff (or flake8, black) for code style and error checking.
* Type Checking: mypy for static type analysis.
A robust CI/CD pipeline will automate the build, test, and deployment processes:
Triggers:
* Push to main branch (for deployment to staging/production).
* Pull Request to main (for validation).
* Checkout code
```python
from fastapi import FastAPI, Depends, HTTPException, status
from sqlalchemy.orm import Session
from typing import List

from . import crud, models, schemas
from .database import engine, get_db, Base

# Create tables at startup (simple bootstrap; use Alembic migrations in production)
Base.metadata.create_all(bind=engine)

app = FastAPI(
    title="Product Microservice",
    description="A simple microservice for managing products with FastAPI and PostgreSQL.",
    version="1.0.0",
)
```
We are pleased to inform you that the "Microservice Scaffolder" workflow has been successfully completed. A comprehensive, production-ready microservice boilerplate has been generated, complete with all essential components for development, testing, deployment, and operational readiness.
This document serves as a detailed overview and guide to the generated microservice, ensuring you have all the necessary information to get started immediately.
Your new microservice boilerplate is designed to be a robust, scalable, and maintainable foundation for your application. It encapsulates best practices in software architecture, containerization, and automated delivery. The scaffolded service is built with a focus on ease of development, comprehensive testing, and streamlined deployment to modern cloud environments.
The generated microservice project adheres to a standard, intuitive directory structure to promote clarity and maintainability. Below is a breakdown of the key components:
.
├── src/ # Core application source code
│ ├── api/ # API routes and controllers
│ ├── models/ # Database models (ORM definitions)
│ ├── services/ # Business logic and service layer
│ ├── config/ # Application configuration
│ └── main.py / app.js / Application.java # Main application entry point
├── tests/ # Automated test suite
│ ├── unit/ # Unit tests for individual components
│ └── integration/ # Integration tests for API endpoints and database
├── docs/ # Project documentation
│ ├── openapi.yaml # OpenAPI/Swagger specification
│ └── architecture.md # High-level architectural overview
├── ci/ # CI/CD pipeline configuration
│ └── .github/workflows/ # Example: GitHub Actions workflows
├── deploy/ # Deployment scripts and configurations
│ ├── kubernetes/ # Kubernetes manifests (deployments, services, ingress)
│ └── env/ # Environment-specific configuration files
├── Dockerfile # Docker image definition for the microservice
├── docker-compose.yml # Local development environment setup
├── requirements.txt / package.json / pom.xml # Project dependencies
├── README.md # Project overview, setup, and usage guide
└── .gitignore # Git ignore rules
Application Source Code (src/):
* API Routes (src/api/): Defines all RESTful API endpoints (e.g., /api/v1/resources, /health). Each endpoint includes request validation, business logic invocation, and response formatting.
* Database Models (src/models/): Contains the Object-Relational Mapping (ORM) definitions for your database entities (e.g., SQLAlchemy models, Mongoose schemas, JPA entities). Includes example models for a typical resource.
* Business Logic (src/services/): Encapsulates the core business rules and operations, ensuring a clear separation of concerns from API handling and data persistence.
* Configuration (src/config/): Manages application settings, typically loaded from environment variables to support 12-factor app principles.
* Main Entry Point: The primary file to start the application server.
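The configuration component can be sketched with a small environment-backed settings object using only the standard library. This is an assumption-laden illustration (the scaffold may use pydantic settings instead, and the Settings fields shown are hypothetical):

```python
import os
from dataclasses import dataclass

@dataclass(frozen=True)
class Settings:
    """Hypothetical settings object; values come from the environment so
    the same container image runs unchanged in every deployment target."""
    database_url: str
    log_level: str
    port: int

def load_settings(env=os.environ) -> Settings:
    # 12-factor style: read everything from environment variables,
    # with development-friendly defaults.
    return Settings(
        database_url=env.get("DATABASE_URL", "postgresql://localhost/app"),
        log_level=env.get("LOG_LEVEL", "INFO"),
        port=int(env.get("PORT", "8000")),
    )
```

Passing env explicitly also makes the loader trivially unit-testable.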
Containerization (Dockerfile, docker-compose.yml, .dockerignore):
* Dockerfile: A multi-stage Dockerfile optimized for production, ensuring a small image size and fast build times. It includes best practices like using a non-root user and minimizing layers.
* docker-compose.yml: Configures a local development environment, typically including the microservice itself and a local database instance (e.g., PostgreSQL, MongoDB). This allows for quick local setup and testing.
* .dockerignore: Specifies files and directories to exclude from the Docker build context, improving build performance and security.
Documentation (docs/openapi.yaml, README.md):
* openapi.yaml (or swagger.yaml): A complete OpenAPI Specification (v3.0) defining all API endpoints, request/response schemas, authentication methods, and examples. This enables automatic generation of API clients and interactive documentation (e.g., Swagger UI).
* README.md: The central project documentation. It provides a quick start guide, details on local setup, running tests, deployment instructions, and an overview of the microservice's capabilities.
Database Layer:
* Includes setup for an ORM (e.g., SQLAlchemy, TypeORM, Hibernate) to interact with your chosen database.
* Migration Scripts (if applicable): Initial migration scripts (e.g., Alembic, Flyway) are provided to set up the database schema.
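The idea behind versioned migrations can be sketched with a minimal, idempotent runner (a stand-in for Alembic or Flyway, using in-memory SQLite; the MIGRATIONS list and table names are invented for this example):

```python
import sqlite3

# Ordered (version, SQL) pairs; a real tool like Alembic generates these.
MIGRATIONS = [
    (1, "CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"),
    (2, "ALTER TABLE products ADD COLUMN price REAL DEFAULT 0"),
]

def migrate(conn):
    # Track the applied schema version so re-running is a no-op.
    conn.execute("CREATE TABLE IF NOT EXISTS schema_version (version INTEGER)")
    row = conn.execute("SELECT MAX(version) FROM schema_version").fetchone()
    current = row[0] or 0
    for version, sql in MIGRATIONS:
        if version > current:
            conn.execute(sql)
            conn.execute(
                "INSERT INTO schema_version (version) VALUES (?)", (version,)
            )
            current = version
    conn.commit()
    return current
```

Running migrate twice applies each migration exactly once, which is the core property migration tools guarantee.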
Testing (tests/):
* Unit Tests (tests/unit/): Small, isolated tests for individual functions, classes, or modules, ensuring internal logic correctness.
* Integration Tests (tests/integration/): Tests that verify the interaction between different components (e.g., API endpoints with the database, services with external dependencies).
* Test Runner Configuration: Pre-configured with a popular testing framework (e.g., Pytest, Jest, JUnit) and example test cases.
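A pytest-style unit test for a service-layer function might look like the following sketch (apply_discount is a hypothetical function, not one shipped in the scaffold; the error-path check uses plain try/except so the snippet stays dependency-free, though pytest.raises is the idiomatic form):

```python
def apply_discount(price, percent):
    """Hypothetical service-layer function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("discount must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_apply_discount_happy_path():
    assert apply_discount(100.0, 25) == 75.0

def test_apply_discount_rejects_bad_percent():
    # With pytest installed: `with pytest.raises(ValueError): ...`
    try:
        apply_discount(100.0, 150)
    except ValueError:
        pass
    else:
        raise AssertionError("expected ValueError")
```

Tests named test_* are discovered automatically when you run pytest from the project root.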
CI/CD Pipeline (ci/):
* GitHub Actions (or GitLab CI, Jenkinsfile, etc.): A fully configured pipeline definition that automates the build, test, lint, and deployment processes.
* Build Stage: Compiles code (if applicable), installs dependencies.
* Test Stage: Runs unit and integration tests, reports coverage.
* Lint/Static Analysis Stage: Ensures code quality and adherence to style guides.
* Container Image Build & Push: Builds the Docker image and pushes it to a configured container registry.
* Deployment Trigger: Initiates deployment to staging or production environments upon successful completion of previous stages.
Deployment (deploy/):
* Kubernetes Manifests (deploy/kubernetes/):
* deployment.yaml: Defines the Kubernetes Deployment for your microservice, including replica sets, resource limits, and readiness/liveness probes.
* service.yaml: Defines the Kubernetes Service to expose your microservice within the cluster.
* ingress.yaml (optional): Configures an Ingress resource for external access.
* Environment Configuration (deploy/env/): Example files for managing environment-specific variables (e.g., database connection strings, API keys) securely.
Follow these steps to quickly get your microservice up and running and to begin development:
Before you start, ensure you have the following installed:
* Git
* Docker
* Docker Compose
git clone <your-repository-url>
cd <your-microservice-name>
This is the recommended way to run the service locally, as it provides a consistent environment.
docker-compose up --build
This command will:
* Build the Docker image for your microservice.
* Start the microservice container.
* Start a linked database container (e.g., PostgreSQL).
* Your API will typically be accessible at http://localhost:<port> (check docker-compose.yml for the exact port, often 8000 or 3000).
* Access the interactive API documentation (Swagger UI) at http://localhost:<port>/docs (or similar endpoint, refer to README.md).
While the docker-compose up command is running:
docker-compose exec <service-name> <test-command>
# Example for Python/Flask: docker-compose exec app pytest
# Example for Node.js/Express: docker-compose exec app npm test
Alternatively, you can run tests directly on your host machine after installing dependencies, though the Dockerized approach is generally preferred for consistency.
If you need to build the Docker image without docker-compose:
docker build -t <your-image-name>:<tag> .
The deploy/kubernetes/ directory contains the necessary manifests.
1. Configure your Kubernetes context: Ensure kubectl is configured to connect to your target cluster.
2. Apply manifests:
kubectl apply -f deploy/kubernetes/
3. Verify deployment:
kubectl get deployments
kubectl get services
Note: You will need to push your Docker image to a container registry (e.g., Docker Hub, AWS ECR, GCP GCR) and update the image reference in deploy/kubernetes/deployment.yaml before deploying.
This scaffold provides a solid starting point. Here’s how you can customize and extend it:
* Add New Endpoints: Navigate to src/api/ to define your routes and input validation, and connect them to services in src/services/.
* Extend the Data Model: Add new models in src/models/ and create corresponding database migrations.
* Implement Business Logic: Add services in src/services/ to encapsulate your core application logic.
* Integrate External Services: Add client dependencies to requirements.txt/package.json/pom.xml and implement client code in src/services/ or a dedicated src/clients/ directory.
* Add Authentication/Authorization: Wire security middleware into the src/api/ layer. Ensure sensitive configuration is managed via environment variables.

We encourage you to thoroughly review the generated code and documentation. If you have any questions, require further assistance, or wish to provide feedback on this scaffolding process, please do not hesitate to reach out. Your input is invaluable as we continuously strive to improve our services.
Thank you for using PantheraHive's Microservice Scaffolder!