Project: Microservice Scaffolder Workflow
Step: 1 of 3 - Plan Architecture (gemini -> plan_architecture)
Date: 2023-10-27
Deliverable: Detailed Microservice Architecture Plan
This document outlines a comprehensive architectural plan for generating a complete, production-ready microservice using the "Microservice Scaffolder" workflow. The goal is to define a robust, scalable, and maintainable structure that includes core application logic, API design, data persistence, testing, CI/CD, and deployment strategies. This plan serves as the blueprint for subsequent code generation and implementation steps, ensuring consistency, best practices, and accelerated development.
The proposed architecture emphasizes modularity, loose coupling, and adherence to modern microservice principles, enabling rapid iteration and independent deployment.
The microservice will be designed around the following foundational principles:
To ensure broad applicability, modern tooling, and strong community support, the following technology stack is recommended for the scaffolded microservice:
* Rationale: FastAPI offers high performance comparable to NodeJS and Go, built-in data validation (Pydantic), automatic interactive API documentation (Swagger UI/OpenAPI), and excellent asynchronous capabilities. It's highly productive and well-suited for microservices.
* Rationale: PostgreSQL is a powerful, open-source, object-relational database system known for its reliability, feature robustness, and performance. It's a solid choice for most microservice data persistence needs.
* Rationale: SQLAlchemy is Python's leading SQL toolkit and Object Relational Mapper. It provides a full suite of well-known persistence patterns, designed for efficient and high-performing database access.
* Rationale: Docker provides a consistent environment for development, testing, and production, ensuring "it works on my machine" translates to "it works everywhere."
* Rationale: Simplifies multi-container application setup for local development and integration testing.
The scaffolded microservice will follow a layered architecture to separate concerns and enhance maintainability.
* app/main.py: FastAPI application entry point.
* app/api/v1/endpoints/*.py: Defines API routes (e.g., users.py, items.py).
* app/schemas/*.py: Pydantic models for request (input) and response (output) data validation and serialization.
* app/services/*.py: Modules containing functions that implement specific business logic (e.g., user_service.py for user creation, retrieval, and updates). These services interact with repositories.
* app/models/*.py: SQLAlchemy models defining the database schema (tables, columns, relationships).
* app/crud/*.py: Repository functions (Create, Read, Update, Delete) that interact directly with SQLAlchemy models to perform database operations (e.g., user_crud.py for get_user, create_user).
* app/db/session.py: Manages database session creation and lifecycle.
* .env file: For local development environment variables.
* app/core/config.py: Loads environment variables and provides structured access to settings (e.g., using Pydantic's BaseSettings).
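The layered flow described above (endpoint → service → repository) can be sketched in plain Python. This is an illustrative sketch with hypothetical names, not code from the scaffold; a dict stands in for the database:

```python
def get_user_repo(db: dict, user_id: int):
    """Repository layer: raw data access (a dict stands in for the DB here)."""
    return db.get(user_id)

def get_user_service(db: dict, user_id: int):
    """Service layer: business rules (here, hide inactive users)."""
    user = get_user_repo(db, user_id)
    if user is None or not user.get("is_active", True):
        return None
    return user

def get_user_endpoint(db: dict, user_id: int):
    """API layer: translate the service result into a response shape."""
    user = get_user_service(db, user_id)
    if user is None:
        return {"status": 404, "detail": "User not found"}
    return {"status": 200, "data": user}
```

Each layer only calls the one directly beneath it, which is what makes the layers independently testable and replaceable.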
Each model will include created_at and updated_at timestamps for auditing, and a deleted_at timestamp to support soft deletion instead of permanent removal. Example model:
# app/models/user.py
from sqlalchemy import Column, Integer, String, DateTime, Boolean
from sqlalchemy.orm import declarative_base  # sqlalchemy.ext.declarative is deprecated as of SQLAlchemy 2.0
from datetime import datetime
Base = declarative_base()
class User(Base):
__tablename__ = "users"
id = Column(Integer, primary_key=True, index=True)
uuid = Column(String, unique=True, index=True, nullable=False) # Example for UUID
email = Column(String, unique=True, index=True, nullable=False)
hashed_password = Column(String, nullable=False)
is_active = Column(Boolean, default=True)
created_at = Column(DateTime, default=datetime.utcnow)
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
deleted_at = Column(DateTime, nullable=True)
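The soft-deletion semantics behind the deleted_at column can be illustrated independently of SQLAlchemy. This is a hedged plain-Python sketch of the pattern, not code from the scaffold:

```python
from datetime import datetime, timezone

def soft_delete(record: dict) -> dict:
    """Mark a record as deleted by stamping deleted_at, instead of removing the row."""
    record["deleted_at"] = datetime.now(timezone.utc)
    return record

def active_records(records: list) -> list:
    """Queries filter out soft-deleted rows (the SQL equivalent: WHERE deleted_at IS NULL)."""
    return [r for r in records if r.get("deleted_at") is None]
```

In the real service, the filter would be applied in the CRUD layer so that soft-deleted rows never leak into API responses.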
The API will follow REST conventions:
* Resource-oriented, plural-noun URLs (e.g., /users, /products).
* Standard HTTP methods: GET for retrieval, POST for creation, PUT/PATCH for updates, DELETE for removal.
* All routes prefixed with /api/v1 for versioning and clarity.
Example endpoints for a users resource:
* GET /api/v1/users: Retrieve a list of users.
* POST /api/v1/users: Create a new user.
* GET /api/v1/users/{user_id}: Retrieve a specific user.
* PUT /api/v1/users/{user_id}: Fully update a specific user.
* PATCH /api/v1/users/{user_id}: Partially update a specific user.
* DELETE /api/v1/users/{user_id}: Delete a specific user.
API versioning will use URL path prefixes (/api/v1/, /api/v2/, etc.) for clear and explicit API evolution.
A comprehensive testing strategy is crucial for microservice reliability. The scaffold will include templates for:
* Unit tests: pytest with unittest.mock or pytest-mock for isolating dependencies (e.g., database calls, external API calls).
* Integration tests: pytest with httpx (for making HTTP requests to the FastAPI app) and a temporary test database (e.g., using pytest-docker, or an in-memory SQLite for simpler cases).
* End-to-end tests: pytest with httpx or dedicated E2E tools (e.g., Playwright if a UI is involved; for a pure API microservice, httpx is sufficient).
* Tooling: pytest (primary test runner), httpx (API testing), SQLAlchemy's testing utilities, pytest-mock.
The scaffold will include a template for an automated CI/CD pipeline to ensure code quality, test coverage, and efficient deployment.
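As a sketch of the unit-testing approach with unittest.mock, the repository dependency of a (hypothetical, not-from-the-scaffold) service function is replaced by a mock so no database is touched:

```python
from unittest.mock import MagicMock

def get_user_email(repo, user_id):
    """Hypothetical service function whose repository dependency we isolate."""
    user = repo.get(user_id)
    return user["email"] if user else None

# Stand in a mock for the repository; record and assert on its calls.
repo = MagicMock()
repo.get.return_value = {"id": 1, "email": "a@example.com"}

assert get_user_email(repo, 1) == "a@example.com"
repo.get.assert_called_once_with(1)
```

The same pattern applies to external API clients or any other collaborator injected into the service layer.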
The pipeline will enforce linting and formatting (flake8, black, isort).
The scaffold will provide basic deployment artifacts and scripts, focusing on container orchestration.
* Rationale: Kubernetes is the industry standard for container orchestration, providing robust features for scaling, self-healing, load balancing, and rolling updates.
* Rationale: Terraform allows defining infrastructure as code, making environments reproducible, reviewable, and version-controlled.
This document provides the complete, production-ready code scaffolding for your new microservice, adhering to best practices for development, testing, and deployment. The generated code includes a robust backend API, database integration, containerization setup, testing framework, CI/CD pipeline configuration, and basic deployment scripts.
This microservice, named ProductService, is designed to manage products with basic CRUD (Create, Read, Update, Delete) operations. It is built using Python with FastAPI, SQLAlchemy for database interaction, and PostgreSQL as the database.
Key Technologies Used:
Project Directory Structure:
product-service/
├── .github/
│ └── workflows/
│ └── ci-cd.yml
├── alembic/
│ ├── versions/
│ │ └── <timestamp>_initial_migration.py
│ └── env.py
│ └── script.py.mako
├── app/
│ ├── api/
│ │ └── v1/
│ │ └── endpoints/
│ │ └── products.py
│ │ └── __init__.py
│ │ └── __init__.py
│ ├── core/
│ │ ├── config.py
│ │ └── database.py
│ │ └── security.py
│ │ └── __init__.py
│ ├── crud/
│ │ └── products.py
│ │ └── __init__.py
│ ├── models/
│ │ └── product.py
│ │ └── __init__.py
│ ├── schemas/
│ │ └── product.py
│ │ └── __init__.py
│ ├── main.py
│ └── __init__.py
├── tests/
│ ├── api/
│ │ └── v1/
│ │ └── test_products.py
│ ├── conftest.py
│ └── __init__.py
├── .dockerignore
├── .env.example
├── Dockerfile
├── docker-compose.yml
├── entrypoint.sh
├── requirements.txt
├── alembic.ini
├── README.md
This section provides the Python code for the FastAPI application, including configuration, database setup, models, schemas, CRUD operations, and API endpoints.
requirements.txt
fastapi[all]>=0.104.1
uvicorn[standard]>=0.23.2
SQLAlchemy>=2.0.23
psycopg2-binary>=2.9.9
alembic>=1.12.1
python-dotenv>=1.0.0
pydantic-settings>=2.0.3
pytest>=7.4.3
httpx>=0.25.1
black>=23.11.0
flake8>=6.1.0
isort>=5.12.0
Configuration (app/core/config.py): Handles environment variables and application settings using Pydantic Settings.
from pydantic_settings import BaseSettings, SettingsConfigDict
from typing import Optional
class Settings(BaseSettings):
"""
Application settings loaded from environment variables.
Uses .env file for local development.
"""
PROJECT_NAME: str = "ProductService"
API_V1_STR: str = "/api/v1"
# Database settings
POSTGRES_SERVER: str
POSTGRES_USER: str
POSTGRES_PASSWORD: str
POSTGRES_DB: str
POSTGRES_PORT: int = 5432
DATABASE_URL: Optional[str] = None # Will be constructed if not provided
# Test database settings (for pytest)
TEST_POSTGRES_DB: str = "test_product_db"
# CORS settings
BACKEND_CORS_ORIGINS: list[str] = ["http://localhost:3000", "http://localhost:8080"]
model_config = SettingsConfigDict(env_file=".env", extra="ignore")
def get_database_url(self) -> str:
"""Constructs the database URL if not explicitly set."""
if self.DATABASE_URL:
return self.DATABASE_URL
return (
f"postgresql+psycopg2://{self.POSTGRES_USER}:{self.POSTGRES_PASSWORD}@"
f"{self.POSTGRES_SERVER}:{self.POSTGRES_PORT}/{self.POSTGRES_DB}"
)
def get_test_database_url(self) -> str:
"""Constructs the test database URL."""
return (
f"postgresql+psycopg2://{self.POSTGRES_USER}:{self.POSTGRES_PASSWORD}@"
f"{self.POSTGRES_SERVER}:{self.POSTGRES_PORT}/{self.TEST_POSTGRES_DB}"
)
settings = Settings()
Database Setup (app/core/database.py): Sets up the SQLAlchemy engine, session factory, and declarative base for models.
from sqlalchemy import create_engine
from sqlalchemy.orm import declarative_base  # sqlalchemy.ext.declarative is deprecated as of SQLAlchemy 2.0
from sqlalchemy.orm import sessionmaker, Session
from app.core.config import settings
# Create the SQLAlchemy engine
engine = create_engine(settings.get_database_url())
# Create a SessionLocal class
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
# Base class for our models
Base = declarative_base()
def get_db():
"""
Dependency to get a database session.
Yields a session and ensures it's closed afterwards.
"""
db = SessionLocal()
try:
yield db
finally:
db.close()
Product Model (app/models/product.py): Defines the Product SQLAlchemy model.
from sqlalchemy import Column, Integer, String, Float, DateTime, func
from app.core.database import Base
class Product(Base):
"""
SQLAlchemy model for a Product.
"""
__tablename__ = "products"
id = Column(Integer, primary_key=True, index=True)
name = Column(String, index=True, nullable=False)
description = Column(String, nullable=True)
price = Column(Float, nullable=False)
stock = Column(Integer, default=0, nullable=False)
created_at = Column(DateTime, server_default=func.now(), nullable=False)
updated_at = Column(DateTime, onupdate=func.now(), server_default=func.now(), nullable=False)
def __repr__(self):
return f"<Product(id={self.id}, name='{self.name}', price={self.price})>"
Pydantic Schemas (app/schemas/product.py): Defines Pydantic models for request validation and response serialization.
from pydantic import BaseModel, Field
from datetime import datetime
from typing import Optional
class ProductBase(BaseModel):
"""Base schema for Product."""
name: str = Field(..., min_length=1, max_length=100)
description: Optional[str] = Field(None, max_length=500)
price: float = Field(..., gt=0)
stock: int = Field(0, ge=0)
class ProductCreate(ProductBase):
"""Schema for creating a new Product."""
pass
class ProductUpdate(ProductBase):
"""Schema for updating an existing Product."""
name: Optional[str] = Field(None, min_length=1, max_length=100)
price: Optional[float] = Field(None, gt=0)
stock: Optional[int] = Field(None, ge=0)
class ProductInDB(ProductBase):
"""Schema for Product as stored in the database (includes IDs and timestamps)."""
id: int
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True  # Pydantic v2 setting (replaces orm_mode); allows building the schema from ORM objects
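The split between PUT-style full updates and PATCH-style partial updates rests on applying only the fields the client actually sent, which the CRUD layer gets via model_dump(exclude_unset=True). The semantics can be sketched in plain Python:

```python
def apply_partial_update(current: dict, provided: dict) -> dict:
    """Apply only the fields the client actually sent, leaving the rest intact
    (mirrors model_dump(exclude_unset=True) followed by setattr in the CRUD layer)."""
    updated = dict(current)
    for field, value in provided.items():
        updated[field] = value
    return updated
```

A PATCH of {"price": 12.5} therefore changes price while name and stock keep their stored values, which is why every field on ProductUpdate is Optional.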
CRUD Operations (app/crud/products.py): Abstracts database interactions for the Product model.
from sqlalchemy.orm import Session
from app.models.product import Product
from app.schemas.product import ProductCreate, ProductUpdate
from typing import List, Optional
class CRUDProduct:
"""
CRUD operations for the Product model.
"""
def get(self, db: Session, product_id: int) -> Optional[Product]:
"""Retrieve a product by its ID."""
return db.query(Product).filter(Product.id == product_id).first()
def get_multi(self, db: Session, skip: int = 0, limit: int = 100) -> List[Product]:
"""Retrieve multiple products with pagination."""
return db.query(Product).offset(skip).limit(limit).all()
def create(self, db: Session, obj_in: ProductCreate) -> Product:
"""Create a new product."""
db_obj = Product(**obj_in.model_dump())
db.add(db_obj)
db.commit()
db.refresh(db_obj)
return db_obj
def update(self, db: Session, db_obj: Product, obj_in: ProductUpdate) -> Product:
"""Update an existing product."""
obj_data = obj_in.model_dump(exclude_unset=True) # Only update provided fields
for field, value in obj_data.items():
setattr(db_obj, field, value)
db.add(db_obj)
db.commit()
db.refresh(db_obj)
return db_obj
def delete(self, db: Session, product_id: int) -> Optional[Product]:
"""Delete a product by its ID."""
obj = db.query(Product).filter(Product.id == product_id).first()
if obj:
db.delete(obj)
db.commit()
return obj
product = CRUDProduct()
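The skip/limit parameters of get_multi implement offset-based pagination; a plain-Python sketch of the semantics:

```python
def paginate(items: list, skip: int = 0, limit: int = 100) -> list:
    """Offset-based pagination: drop the first `skip` items, return at most `limit`
    (the SQL equivalent is OFFSET :skip LIMIT :limit)."""
    return items[skip : skip + limit]
```

Requesting past the end simply yields a shorter (possibly empty) page rather than an error, matching the OFFSET/LIMIT behavior of the database query.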
API Endpoints (app/api/v1/endpoints/products.py): Defines the FastAPI routes for Product resources.
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.orm import Session
from typing import List
from app.core.database import get_db
from app.crud.products import product as crud_product
from app.schemas.product import ProductCreate, ProductUpdate, ProductInDB
router = APIRouter()
@router.post("/", response_model=ProductInDB, status_code=status.HTTP_201_CREATED)
def create_product(
product_in: ProductCreate,
db: Session = Depends(get_db)
):
"""
Create a new product.
"""
return crud_product.create(db=db, obj_in=product_in)
@router.get("/", response_model=List[ProductInDB])
def read_products(
skip: int = 0,
limit: int = 100,
db: Session = Depends(get_db)
):
"""
Retrieve multiple products.
"""
products = crud_product.get_multi(db=db, skip=skip, limit=limit)
return products
@router.get("/{product_id}", response_model=ProductInDB)
def read_product(
product_id: int,
db: Session = Depends(get_db)
):
"""
Retrieve a single product by ID.
"""
product = crud_product.get(db=db, product_id=product_id)
if not product:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Product not found")
return product
@router.put("/{product_id}", response_model=ProductInDB)
def update_product(
product_id: int,
product_in: ProductUpdate,
db: Session = Depends(get_db)
):
"""
Update an existing product.
"""
product = crud_product.get(db=db, product_id=product_id)
if not product:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Product not found")
return crud_product.update(db=db, db_obj=product, obj_in=product_in)
@router.delete("/{product_id}", response_model=ProductInDB)
def delete_product(
product_id: int,
db: Session = Depends(get_db)
):
"""
Delete a product by ID.
"""
product = crud_product.get(db=db, product_id=product_id)
if not product:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Product not found")
crud_product.delete(db=db, product_id=product_id)
return product
Application Entry Point (app/main.py): Initializes the FastAPI application and includes the API routers.
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.core.config import settings
from app.api.v1.endpoints import products
app = FastAPI(
title=settings.PROJECT_NAME,
openapi_url=f"{settings.API_V1_STR}/openapi.json"
)
# Set up CORS middleware
if settings.BACKEND_CORS_ORIGINS:
app.add_middleware(
CORSMiddleware,
allow_origins=[str(origin) for origin in settings.BACKEND_CORS_ORIGINS],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
# Include API routers
app.include_router(products.router, prefix=settings.API_V1_STR + "/products", tags=["products"])
# The source was truncated at this point; a simple health-check endpoint is an
# assumed completion, not code from the original:
@app.get("/health", tags=["health"])
def health_check():
    return {"status": "ok"}
This document details the comprehensive microservice scaffolding generated for your "Product Catalog Service." This deliverable includes a complete, production-ready microservice structure, encompassing API routes, database models, Docker setup, testing frameworks, CI/CD pipeline configuration, and initial deployment scripts.
The "Product Catalog Service" microservice has been fully scaffolded, providing a robust foundation for managing product information. This service is designed to be highly scalable, maintainable, and easily deployable, adhering to modern microservice best practices. The chosen technology stack for this example is Python with FastAPI for the API, SQLAlchemy for ORM, and PostgreSQL as the database, containerized with Docker.
The generated project adheres to a standard, modular structure designed for clarity and scalability.
product-catalog-service/
├── .github/
│ └── workflows/
│ └── ci.yml # GitHub Actions CI/CD pipeline
├── alembic/ # Database migration scripts
│ ├── versions/
│ └── env.py
│ └── script.py.mako
├── app/
│ ├── api/
│ │ ├── v1/
│ │ │ └── endpoints/
│ │ │ ├── products.py # Product API endpoints
│ │ │ └── health.py # Health check endpoint
│ │ │ └── __init__.py
│ │ └── __init__.py
│ ├── core/
│ │ ├── config.py # Application configuration
│ │ ├── database.py # Database connection setup
│ │ └── security.py # Basic security utilities (if needed)
│ ├── crud/ # Create, Read, Update, Delete operations
│ │ └── product.py # CRUD for Product model
│ ├── models/ # SQLAlchemy ORM models
│ │ └── product.py # Product database model
│ │ └── __init__.py
│ ├── schemas/ # Pydantic schemas for request/response validation
│ │ ├── product.py # Product schemas (create, update, response)
│ │ └── __init__.py
│ ├── services/ # Business logic services
│ │ └── product_service.py # Product-specific business logic
│ ├── main.py # FastAPI application entry point
│ └── __init__.py
├── tests/
│ ├── unit/
│ │ └── test_product_model.py
│ ├── integration/
│ │ └── test_product_api.py
│ └── conftest.py # Pytest fixtures
├── scripts/
│ ├── deploy.sh # Example deployment script
│ └── build.sh # Example build script
├── .env.example # Environment variables example
├── Dockerfile # Docker build file for the application
├── docker-compose.yml # Docker Compose for local development (app + db)
├── entrypoint.sh # Script to run inside the Docker container
├── requirements.txt # Python dependencies
├── README.md # Project documentation
├── alembic.ini # Alembic configuration
└── pyproject.toml # Poetry/Pipenv/setuptools config (if used)
The app/api/ directory contains the definition of your service's RESTful API endpoints, leveraging FastAPI's robust and performant framework.
app/main.py: The entry point for the FastAPI application, where the main application instance is created and API routers are included.
app/api/v1/endpoints/products.py: Defines the CRUD operations for products:
* GET /api/v1/products: Retrieve all products.
* GET /api/v1/products/{product_id}: Retrieve a single product by ID.
* POST /api/v1/products: Create a new product.
* PUT /api/v1/products/{product_id}: Update an existing product.
* DELETE /api/v1/products/{product_id}: Delete a product.
app/api/v1/endpoints/health.py: Provides a simple health check endpoint (GET /api/v1/health) for monitoring.
app/schemas/product.py: Pydantic models define the data structure for requests (ProductCreate, ProductUpdate) and responses (ProductResponse), ensuring strict validation and clear API contracts.
The service interacts with a PostgreSQL database, managed by SQLAlchemy ORM and Alembic for migrations.
app/models/product.py: Defines the Product SQLAlchemy model, mapping Python objects to database tables.
# Example: app/models/product.py
from sqlalchemy import Column, Integer, String, Float, Boolean, DateTime
from sqlalchemy.orm import declarative_base  # sqlalchemy.ext.declarative is deprecated as of SQLAlchemy 2.0
from sqlalchemy.sql import func
Base = declarative_base()
class Product(Base):
__tablename__ = "products"
id = Column(Integer, primary_key=True, index=True)
name = Column(String, index=True, nullable=False)
description = Column(String, nullable=True)
price = Column(Float, nullable=False)
is_available = Column(Boolean, default=True)
created_at = Column(DateTime(timezone=True), server_default=func.now())
updated_at = Column(DateTime(timezone=True), onupdate=func.now())
def __repr__(self):
return f"<Product(id={self.id}, name='{self.name}')>"
app/core/database.py: Configures the database connection (SQLAlchemy engine and session factory).
alembic/: Contains the files needed to manage database schema migrations, enabling version-controlled database changes.
* alembic.ini: Alembic configuration file.
* versions/: Directory for generated migration scripts.
The app/services/ directory encapsulates the core business logic, separating it from the API endpoints and database operations.
app/services/product_service.py: Contains functions that orchestrate CRUD operations using app/crud/product.py and implement any product-specific business rules (e.g., price calculation, inventory checks).
app/crud/product.py: Provides generic database interaction methods (create, read, update, delete) for the Product model, keeping the service layer clean and focused on business logic.
The microservice is fully containerized using Docker, ensuring consistent environments across development, testing, and production.
Dockerfile: Defines the steps to build the Docker image for the "Product Catalog Service."
# Example: Dockerfile
FROM python:3.9-slim-buster
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Expose the port FastAPI runs on
EXPOSE 8000
# Run database migrations before starting the application
# This is a simple approach for development/small scale; for production,
# consider separate migration jobs or init containers.
ENTRYPOINT ["/app/entrypoint.sh"]
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
docker-compose.yml: Facilitates local development by orchestrating the application service and its dependencies (e.g., the PostgreSQL database).
# Example: docker-compose.yml
version: '3.8'
services:
db:
image: postgres:13
environment:
POSTGRES_DB: ${POSTGRES_DB:-product_catalog_db}
POSTGRES_USER: ${POSTGRES_USER:-user}
POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-password}
ports:
- "5432:5432"
volumes:
- db_data:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U user -d product_catalog_db"]
interval: 5s
timeout: 5s
retries: 5
app:
build: .
ports:
- "8000:8000"
environment:
DATABASE_URL: postgresql://${POSTGRES_USER:-user}:${POSTGRES_PASSWORD:-password}@db:5432/${POSTGRES_DB:-product_catalog_db}
ENVIRONMENT: development
depends_on:
db:
condition: service_healthy
volumes:
- .:/app # Mount current directory for live code changes during development
volumes:
db_data:
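The ${VAR:-default} substitutions in the compose file fall back to the default when the variable is unset or empty; the equivalent lookup expressed in Python:

```python
import os

def env_or_default(name: str, default: str) -> str:
    """Equivalent of shell/compose ${NAME:-default}: use the default when the
    variable is unset OR set to the empty string."""
    return os.environ.get(name) or default
```

This is why the compose file works out of the box with no .env present, while still honoring any values you export.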
entrypoint.sh: A script executed inside the Docker container to ensure database migrations are run before the application starts.
#!/bin/bash
# Example: entrypoint.sh
echo "Running database migrations..."
alembic upgrade head
echo "Migrations complete."
exec "$@"
A comprehensive testing suite is included to ensure the reliability and correctness of the service. Pytest is used as the testing framework.
tests/: Root directory for all tests.
* tests/unit/test_product_model.py: Unit tests for individual components, such as the Product SQLAlchemy model, ensuring its methods and properties behave as expected.
* tests/integration/test_product_api.py: Integration tests that verify the interaction between different components, e.g., API endpoints with the database, ensuring the full request-response cycle works correctly.
* tests/conftest.py: Contains Pytest fixtures for setting up test databases, client instances, and other shared resources, promoting test reusability and isolation.
To run tests:
docker-compose run app pytest
A GitHub Actions workflow (.github/workflows/ci.yml) is provided to automate the build, test, and deployment process, ensuring continuous integration and delivery.
# Example: .github/workflows/ci.yml
name: Product Catalog Service CI/CD
on:
push:
branches:
- main
pull_request:
branches:
- main
jobs:
build-and-test:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.9'
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements.txt
pip install pytest alembic python-dotenv
- name: Run Docker Compose for test environment
run: docker-compose -f docker-compose.test.yml up -d
- name: Wait for DB to be ready
run: |
docker-compose -f docker-compose.test.yml exec db pg_isready -U user -d product_catalog_db -t 0
sleep 10 # Give it a moment more
- name: Run migrations in test DB
run: docker-compose -f docker-compose.test.yml exec app alembic upgrade head
- name: Run tests
run: docker-compose -f docker-compose.test.yml exec app pytest
- name: Build Docker image
run: docker build -t your-registry/product-catalog-service:${{ github.sha }} .
# - name: Run security scans (e.g., Trivy, Bandit)
# run: |
# # Example: docker run --rm -v /var/run/docker.sock:/var/run/docker.sock aquasec/trivy:0.40.0 image your-registry/product-catalog-service:${{ github.sha }}
# # Example: bandit -r app/ -ll -f json
deploy:
runs-on: ubuntu-latest
needs: build-and-test
if: github.ref == 'refs/heads/main'
steps:
- name: Checkout code
uses: actions/checkout@v3
- name: Log in to Docker Hub / Container Registry
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Push Docker image
run: docker push your-registry/product-catalog-service:${{ github.sha }}
- name: Deploy to Kubernetes / ECS / VM (Example using a script)
uses: appleboy/ssh-action@master
with:
host: ${{ secrets.DEPLOY_HOST }}
username: ${{ secrets.DEPLOY_USERNAME }}
key: ${{ secrets.DEPLOY_KEY }}
script: |
cd /path/to/your/deployment/repo
# Pull latest image, update k8s deployment, restart service, etc.
# Example for K8s:
# helm upgrade --install product-catalog-service ./helm-chart --set image.tag=${{ github.sha }}
# Example for Docker/VM:
# docker pull your-registry/product-catalog-service:${{ github.sha }}
# docker stop product-catalog-