DevOps Pipeline Generator
Run ID: 69cbd8b661b1021a29a8cd99 · 2026-03-31
PantheraHive BOS

DevOps Pipeline Configuration Generation

This document provides comprehensive CI/CD pipeline configurations for GitHub Actions, GitLab CI, and Jenkins. The configurations are robust templates incorporating essential stages such as linting, testing, building, and deployment; they demonstrate best practices and serve as a solid foundation for your specific application requirements.


1. Introduction to CI/CD Pipeline Generation

The objective of this step is to deliver actionable CI/CD pipeline configurations that can be directly integrated into your version control system or CI/CD server. Each configuration example is structured to be modular and easy to understand, allowing for straightforward customization to fit your project's technology stack, testing frameworks, and deployment targets.

For these examples, we will consider a generic Node.js application that utilizes npm for dependency management, ESLint for linting, Jest for testing, Docker for containerization, and Kubernetes as the deployment target.
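
The pipelines below repeatedly invoke `npm run lint`, `npm test`, and `npm run build`; they assume a `package.json` with a scripts stanza of roughly this shape (the script bodies here are illustrative, not prescribed):

```json
{
  "scripts": {
    "lint": "eslint .",
    "test": "jest --ci",
    "build": "webpack --mode production"
  }
}
```

Substitute whatever commands your project actually uses; the pipeline configurations only depend on the script names.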


2. Key Considerations for CI/CD Pipeline Design

Before diving into the specific configurations, it's crucial to understand the common principles that guide effective CI/CD pipeline design:

*   **Fail fast:** Run inexpensive checks such as linting before longer-running tests and builds, so feedback arrives as early as possible.
*   **Sequential, dependent stages:** Express ordering explicitly (e.g., tests run only after linting succeeds) so a broken stage stops the pipeline.
*   **Dependency caching:** Cache package installs between runs to keep pipelines fast.
*   **Secrets management:** Keep credentials out of configuration files, injecting them through the platform's secret store.
*   **Controlled deployments:** Deploy only from designated branches, and gate production releases behind environment protection or manual approval.


3. GitHub Actions Configuration

GitHub Actions allows you to automate, customize, and execute your software development workflows directly in your repository. Workflows are defined in YAML files (.github/workflows/*.yml).

github-ci-cd.yml

**Key Features & Customization for GitHub Actions:**

*   **Triggers (`on`):** Configured for `push` to `main` and `develop` branches, and `pull_request` events.
*   **Environment Variables (`env`):** Define common variables like `NODE_VERSION`, `DOCKER_IMAGE_NAME`, and `DOCKER_REGISTRY`.
*   **Jobs (`jobs`):**
    *   `lint`: Installs dependencies and runs `npm run lint`.
    *   `test`: Installs dependencies and runs `npm test`. `needs: lint` ensures sequential execution.
    *   `build`: Builds a Docker image, logs into GitHub Container Registry (`ghcr.io`), and pushes the tagged image. The tag is dynamically generated based on the branch and commit SHA.
    *   `deploy`: Conditionally runs only on `main` branch pushes. It uses AWS credentials (stored as GitHub Secrets) to update Kubeconfig and then applies Kubernetes manifests, replacing the image tag dynamically.
*   **Secrets:** `GITHUB_TOKEN` is automatically provided for GitHub Container Registry. `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` must be configured as repository secrets.
*   **Artifacts:** Test results can be uploaded as artifacts for later inspection.
*   **Environments:** The `deploy` job uses a GitHub Environment (`Production`) for added protection and visibility, including optional manual approval.
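
The file's content was collapsed in this view; the following is a minimal sketch consistent with the features listed above. The image name, registry path, cluster name, and manifest locations are placeholders:

```yaml
name: CI/CD

on:
  push:
    branches: [main, develop]
  pull_request:

env:
  NODE_VERSION: '18'
  DOCKER_IMAGE_NAME: my-nodejs-app
  DOCKER_REGISTRY: ghcr.io

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: npm
      - run: npm ci
      - run: npm run lint

  test:
    needs: lint
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: npm
      - run: npm ci
      - run: npm test

  build:
    needs: test
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - name: Log in to GitHub Container Registry
        uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Build and push image
        run: |
          # Tag derived from branch name and short commit SHA
          IMAGE_TAG="${GITHUB_REF_NAME}-${GITHUB_SHA::7}"
          docker build -t ${{ env.DOCKER_REGISTRY }}/${{ github.repository_owner }}/${{ env.DOCKER_IMAGE_NAME }}:${IMAGE_TAG} .
          docker push ${{ env.DOCKER_REGISTRY }}/${{ github.repository_owner }}/${{ env.DOCKER_IMAGE_NAME }}:${IMAGE_TAG}

  deploy:
    needs: build
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    environment: Production # Enables protection rules / manual approval
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Deploy to Kubernetes
        run: |
          aws eks update-kubeconfig --name my-cluster   # cluster name is a placeholder
          IMAGE_TAG="${GITHUB_REF_NAME}-${GITHUB_SHA::7}"
          sed -i "s|IMAGE_TAG|${IMAGE_TAG}|g" k8s/deployment.yaml
          kubectl apply -f k8s/
```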

---

4. GitLab CI Configuration

GitLab CI/CD is built into GitLab and uses a `.gitlab-ci.yml` file at the root of your repository to define your pipeline.

.gitlab-ci.yml


Step 1 of 3: Infrastructure Needs Analysis for DevOps Pipeline Generation

Workflow: DevOps Pipeline Generator

Step: gemini → analyze_infrastructure_needs


1. Introduction & Purpose

This document outlines the initial analysis of infrastructure needs for generating your comprehensive CI/CD pipeline. The goal of this step is to identify the foundational components and environmental considerations crucial for designing an effective, scalable, and secure DevOps pipeline using GitHub Actions, GitLab CI, or Jenkins.

Given the broad request for a "DevOps Pipeline Generator," this analysis focuses on defining the critical data points required to tailor the pipeline to your specific application, team, and operational context. Without these specifics, we cannot precisely define the compute, storage, networking, and security infrastructure required. This deliverable therefore serves as a structured framework to gather that essential information.

2. Core Infrastructure Components for CI/CD Pipelines

A robust CI/CD pipeline relies on several interconnected infrastructure components. Understanding these categories is the first step in identifying your specific requirements:

  • Code Repository Hosting: Where your source code resides (e.g., GitHub, GitLab, Bitbucket, Azure Repos).
  • CI/CD Runner/Agent Infrastructure: The compute resources that execute your pipeline jobs (e.g., linting, testing, building).
      * Managed Runners: Provided by the CI/CD platform (e.g., GitHub-hosted runners, GitLab.com shared runners).
      * Self-Hosted Runners: Virtual machines, containers, or Kubernetes clusters managed by you, offering more control and custom environments.
  • Artifact Storage: Where compiled binaries, build outputs, and deployable packages are stored.
      * Cloud Storage (e.g., AWS S3, Azure Blob Storage, Google Cloud Storage) for cost-effective, scalable, and durable storage.
      * Dedicated Artifact Repositories (e.g., JFrog Artifactory, Sonatype Nexus) for advanced artifact management, versioning, and security.
  • Container Registry (if applicable): Where Docker images or other container images are stored and managed.
      * Cloud-Native Registries (e.g., Amazon ECR, Azure Container Registry, Google Container Registry, GitLab Container Registry).
      * Self-Hosted Registries (e.g., Docker Registry).
  • Deployment Target Infrastructure: The environment where your application will be deployed.
      * Virtual Machines (VMs) (e.g., AWS EC2, Azure VMs, GCP Compute Engine, on-premise servers).
      * Container Orchestration Platforms (e.g., Kubernetes - EKS, AKS, GKE, OpenShift).
      * Serverless Platforms (e.g., AWS Lambda, Azure Functions, Google Cloud Functions).
      * Platform-as-a-Service (PaaS) (e.g., AWS Elastic Beanstalk, Azure App Service, Google App Engine, Heroku).
  • Secret Management: Secure storage and retrieval of sensitive credentials, API keys, and environment variables.
      * Cloud Secret Managers (e.g., AWS Secrets Manager, Azure Key Vault, GCP Secret Manager).
      * Dedicated Solutions (e.g., HashiCorp Vault).
      * CI/CD Platform Integrations (e.g., GitHub Secrets, GitLab CI/CD variables with masked/protected flags, Jenkins Credentials).
  • Monitoring & Logging Infrastructure: Tools and platforms to observe pipeline execution, application health, and performance.
      * Cloud-Native (e.g., AWS CloudWatch, Azure Monitor, GCP Cloud Logging/Monitoring).
      * Third-Party (e.g., Prometheus, Grafana, ELK Stack, Datadog, Splunk).
  • Network Configuration: Firewall rules, VPC/VNet setup, load balancers, and API gateways to secure and route traffic.

3. Key Factors Influencing Infrastructure Decisions

To precisely analyze your infrastructure needs, we require detailed information across several critical dimensions. These factors directly impact the choice of tools, architecture, and cost:

  • 3.1. Application Details:
      * Application Type (e.g., web application (frontend/backend), API, microservice, mobile backend, data processing, machine learning model, static site).
      * Programming Languages & Frameworks (e.g., Node.js, Python, Java (Spring Boot), .NET Core, Go, Ruby on Rails, PHP, React, Angular, Vue.js).
      * Database Technologies (e.g., PostgreSQL, MySQL, MongoDB, Redis, DynamoDB).
      * Containerization: Is your application already containerized (Docker, Podman)? Are you planning to containerize it?
      * Build System (e.g., Maven, Gradle, npm, yarn, pip, Go Modules).
  • 3.2. Deployment Environment:
      * Cloud Provider(s) (e.g., AWS, Azure, Google Cloud Platform (GCP), multi-cloud, hybrid cloud, on-premise).
      * Target Environments: development, staging, production. Do these environments have different infrastructure needs?
      * Existing Infrastructure: What services are you currently using (e.g., specific Kubernetes clusters, existing VMs, PaaS services)?
  • 3.3. CI/CD Platform Preference:
      * Primary Choice: GitHub Actions, GitLab CI, or Jenkins.
      * Secondary/Backup Choice, if any.
      * Rationale: Why is this platform preferred (e.g., existing ecosystem, team familiarity, specific features)?
  • 3.4. Scalability & Performance:
      * Expected Build Frequency: How many builds per day/week?
      * Peak Load Requirements: Are there specific times when build/deployment activity will be high?
      * Deployment Frequency: How often do you expect to deploy to production?
      * Geographic Distribution: Are your users or deployment targets geographically distributed?
  • 3.5. Security & Compliance:
      * Industry Regulations (e.g., HIPAA, GDPR, PCI DSS, SOC 2).
      * Security Scans: Requirements for static application security testing (SAST), dynamic application security testing (DAST), software composition analysis (SCA), and vulnerability scanning.
      * Access Control: Specific requirements for who can trigger builds, approve deployments, or access secrets.
  • 3.6. Team & Operational Context:
      * Team Size & Expertise: Level of familiarity with CI/CD tools, cloud platforms, and infrastructure as code.
      * Maintenance & Support: Desired level of operational overhead (managed services vs. self-managed).
      * Budget Constraints: Any specific budget limitations for infrastructure and tooling.

4. Current Trends & Data Insights

Analyzing your infrastructure needs also involves considering current industry trends and best practices to ensure a future-proof and efficient pipeline:

  • Cloud-Native CI/CD (Trend): A significant shift towards leveraging cloud provider-specific CI/CD services (e.g., AWS CodePipeline/CodeBuild, Azure DevOps Pipelines) or integrated solutions like GitHub Actions and GitLab CI that seamlessly interact with cloud resources. This reduces operational overhead.
  • GitOps (Trend): Managing infrastructure and application deployments through Git repositories as the single source of truth. This promotes declarative infrastructure and automated deployments.
  • DevSecOps Integration (Best Practice): Embedding security checks (SAST, DAST, SCA) directly into the CI/CD pipeline stages to identify vulnerabilities early and shift security left.
  • Containerization & Kubernetes (Data Insight): Over 80% of organizations using containers are also using Kubernetes for orchestration (CNCF Survey 2022). This implies a strong need for container registries and Kubernetes-compatible deployment strategies.
  • Infrastructure as Code (IaC) (Best Practice): Using tools like Terraform, CloudFormation, or Pulumi to define and provision infrastructure, ensuring consistency, repeatability, and version control.
  • Serverless Adoption (Trend): Increasing use of serverless compute (Lambda, Functions) for specific application components, impacting deployment strategies and monitoring.
  • Managed Services Preference (Recommendation): Opting for managed services (e.g., managed artifact repositories, cloud databases, hosted CI/CD runners) whenever possible to reduce administrative burden and leverage provider expertise.

5. Preliminary Recommendations (General)

Based on common best practices and the typical needs of a modern DevOps pipeline, we offer these preliminary recommendations:

  • Prioritize Managed Services: Where feasible, utilize managed services for CI/CD runners, artifact storage, container registries, and databases. This reduces operational overhead and maintenance costs.
  • Embrace Infrastructure as Code (IaC): Plan to define your deployment target infrastructure (VPCs, subnets, Kubernetes clusters, VMs) using IaC tools (e.g., Terraform, CloudFormation) for consistency and automation.
  • Implement Robust Secret Management: Integrate a dedicated secret management solution (e.g., cloud-native secret managers or HashiCorp Vault) from the outset to handle sensitive credentials securely.
  • Design for Security: Incorporate security scanning and compliance checks directly into the pipeline stages (DevSecOps) to catch issues early.
  • Modularity and Extensibility: Design the pipeline stages to be modular, allowing for easy updates, additions, and reusability across different projects or environments.
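
As one way to realize the modularity recommendation, GitLab CI can compose a pipeline from shared template files via `include:`. A sketch, in which the shared-templates project path and file names are hypothetical:

```yaml
# .gitlab-ci.yml in each application repository
include:
  - project: 'platform/ci-templates'   # hypothetical shared-templates project
    ref: main
    file:
      - '/templates/lint.yml'
      - '/templates/test.yml'
      - '/templates/build.yml'

stages:
  - lint
  - test
  - build
```

Each application repository then only declares its stages and any overrides, while the stage definitions are maintained once in the shared project.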

6. Actionable Next Steps

To proceed with generating your tailored CI/CD pipeline configuration, we require the specific details outlined in Section 3. Please provide comprehensive answers to the following:

  1. Application Details:
      * What is the primary programming language and framework of your application?
      * Is the application containerized (e.g., Docker)? If not, is containerization planned?
      * What database technologies are you using or planning to use?
  2. Deployment Environment:
      * What is your primary cloud provider (AWS, Azure, GCP, on-premise, hybrid)?
      * What is your target deployment environment (e.g., Kubernetes, VMs, serverless, PaaS)?
      * Are there any existing infrastructure components (e.g., specific Kubernetes clusters, existing artifact repositories) that must be integrated?
  3. CI/CD Platform Preference:
      * Please confirm your preferred CI/CD platform: GitHub Actions, GitLab CI, or Jenkins.
      * Briefly state why this platform is preferred.
  4. Security & Compliance:
      * Are there any specific industry regulations or compliance requirements (e.g., HIPAA, PCI DSS) that your pipeline and infrastructure must adhere to?
      * Do you require specific security scanning tools to be integrated (SAST, DAST, SCA)?
  5. Scalability & Performance:
      * What are your estimated daily/weekly build and deployment frequencies?
  6. Team & Budget:
      * What is your team's familiarity level with the chosen CI/CD platform and cloud provider?
      * Are there any specific budget constraints that should influence infrastructure choices?

Upon receiving this information, we will proceed to Step 2: Define Pipeline Stages & Logic, where we will translate your requirements into a detailed pipeline structure and logic, ready for configuration generation.

# Define the stages of the pipeline
stages:
  - lint
  - test
  - build
  - deploy

variables:
  NODE_VERSION: '18' # Plain major version, so node:${NODE_VERSION}-alpine is a valid image tag
  DOCKER_IMAGE_NAME: 'my-nodejs-app'
  DOCKER_REGISTRY: '${CI_REGISTRY}' # GitLab Container Registry

default:
  image: node:${NODE_VERSION}-alpine # Default image for all jobs unless overridden
  before_script:
    - npm ci --cache .npm --prefer-offline # Use npm ci for clean installs and cache
  cache:
    key: ${CI_COMMIT_REF_SLUG}
    paths:
      - .npm/
      - node_modules/

lint_job:
  stage: lint
  script:
    - echo "Running ESLint..."
    - npm run lint # Assuming a 'lint' script in package.json
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
    - if: $CI_COMMIT_BRANCH == "main" || $CI_COMMIT_BRANCH == "develop"

test_job:
  stage: test
  script:
    - echo "Running Jest Tests..."
    - npm test # Assuming a 'test' script in package.json
  artifacts:
    when: always
    reports:
      junit: junit.xml # Path to your test results file (e.g., generated by jest-junit)
    paths:
      - coverage/ # Optional: upload coverage reports
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
    - if: $CI_COMMIT_BRANCH == "main" || $CI_COMMIT_BRANCH == "develop"

build_docker_image:
  stage: build
  image: docker:latest # Use an image that has the Docker client installed
  services:
    - docker:dind # Docker-in-Docker service for building images
  variables:
    DOCKER_TLS_CERTDIR: "/certs" # Required for the dind service
  before_script: [] # Override the default npm install; not available (or needed) in the docker image
  script:
    - echo "Building Docker image..."
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    - |
      if [ "$CI_COMMIT_BRANCH" == "main" ]; then
        IMAGE_TAG="latest"
      elif [ "$CI_COMMIT_BRANCH" == "develop" ]; then
        IMAGE_TAG="develop-${CI_COMMIT_SHORT_SHA}"
      else
        IMAGE_TAG="${CI_COMMIT_SHORT_SHA}"
      fi
    - docker build -t ${DOCKER_REGISTRY}/${DOCKER_IMAGE_NAME}:${IMAGE_TAG} .
    - docker push ${DOCKER_REGISTRY}/${DOCKER_IMAGE_NAME}:${IMAGE_TAG}
    - echo "Docker image pushed: ${DOCKER_REGISTRY}/${DOCKER_IMAGE_NAME}:${IMAGE_TAG}"
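
The `build_docker_image` job's `docker build` command assumes a Dockerfile at the repository root. A minimal multi-stage sketch for a Node.js application, in which the output directory and start command are assumptions:

```dockerfile
# Build stage: install all dependencies and compile the app
FROM node:18-alpine AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Runtime stage: production dependencies and build output only
FROM node:18-alpine
WORKDIR /app
ENV NODE_ENV=production
COPY --from=build /app/package*.json ./
RUN npm ci --omit=dev
COPY --from=build /app/dist ./dist
EXPOSE 3000
CMD ["node", "dist/server.js"]
```

The multi-stage split keeps dev dependencies and source files out of the image that gets pushed to the registry.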

gemini Output

DevOps Pipeline Generator: Comprehensive CI/CD Pipeline Configurations

This document provides detailed and professional CI/CD pipeline configurations for GitHub Actions, GitLab CI, and Jenkins. Each configuration includes essential stages such as linting, testing, building, and deployment, designed to ensure code quality, reliability, and efficient delivery. These templates are robust, extensible, and serve as a strong foundation for your project's automation needs.


1. Executive Summary

This deliverable provides validated and documented CI/CD pipeline configurations tailored for modern software development workflows. We have generated example pipeline definitions for three leading CI/CD platforms: GitHub Actions, GitLab CI, and Jenkins. Each configuration is structured to support a typical application lifecycle, encompassing:

  • Linting: Ensuring code style consistency and identifying potential issues early.
  • Testing: Executing unit, integration, and (optionally) end-to-end tests to validate functionality.
  • Building: Compiling source code, resolving dependencies, and packaging artifacts.
  • Deployment: Automating the release process to various environments (e.g., staging, production), often including manual approval steps for critical releases.

These configurations are designed for clarity, maintainability, and security, leveraging platform-specific features like secrets management, caching, and environment controls.


2. Generated Pipeline Configurations

Below are the detailed pipeline configurations for each specified platform. These examples assume a generic application (e.g., a Node.js or Python application, but the structure is adaptable to most languages/frameworks).

2.1. GitHub Actions Configuration (.github/workflows/main.yml)

This GitHub Actions workflow is triggered on pushes to main and pull requests. It includes linting, testing, building, and conditional deployment stages.


name: CI/CD Pipeline

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

env:
  NODE_VERSION: '18.x' # Example: Specify Node.js version
  # Add other environment variables as needed, e.g., for Python, Java, Go

jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm' # Or 'yarn', 'pnpm'

      - name: Install dependencies
        run: npm ci # Or yarn install --frozen-lockfile

      - name: Run Linter (e.g., ESLint)
        run: npm run lint # Or your specific lint command

  test:
    needs: lint
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run Unit and Integration Tests
        run: npm test # Or your specific test command

      # Example: Upload test results for reporting
      - name: Upload test results
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: test-results
          path: ./test-results.xml # Adjust path to your test report file

  build:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Build Application
        run: npm run build # Or your specific build command (e.g., webpack, maven, go build)

      - name: Archive Build Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: application-build-${{ github.sha }}
          path: ./dist # Adjust path to your build output directory

  deploy-staging:
    needs: build
    runs-on: ubuntu-latest
    environment: Staging # GitHub Environments for protection rules and secrets
    if: github.ref == 'refs/heads/main' # Only deploy main branch to staging
    steps:
      - name: Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: application-build-${{ github.sha }}
          path: ./build-output

      - name: Deploy to Staging Environment
        run: |
          echo "Deploying to Staging..."
          # Example: Use a deployment script or tool
          # Your actual deployment commands here, e.g.,
          # aws s3 sync ./build-output s3://${{ secrets.AWS_S3_BUCKET_STAGING }}
          # kubectl apply -f kubernetes/staging.yaml
          # ssh user@staging-server "deploy.sh"
          echo "Deployment to Staging complete."
        env:
          # Access environment-specific secrets
          STAGING_API_KEY: ${{ secrets.STAGING_API_KEY }}

  deploy-production:
    needs: deploy-staging
    runs-on: ubuntu-latest
    environment: Production # GitHub Environments for protection rules and secrets
    if: github.ref == 'refs/heads/main' # Only deploy main branch
    # Requires manual approval in GitHub UI if 'Production' environment is configured for it
    steps:
      - name: Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: application-build-${{ github.sha }}
          path: ./build-output

      - name: Deploy to Production Environment
        run: |
          echo "Deploying to Production..."
          # Your actual production deployment commands here
          echo "Deployment to Production complete."
        env:
          # Access environment-specific secrets
          PROD_API_KEY: ${{ secrets.PROD_API_KEY }}

2.2. GitLab CI Configuration (.gitlab-ci.yml)

This GitLab CI pipeline defines stages for linting, testing, building, and deployment, with specific rules for branches and environments.


image: node:18 # Example: Specify a base Docker image

variables:
  # Add global variables here
  NPM_CACHE_DIR: "$CI_PROJECT_DIR/.npm" # Cache directory for npm

cache:
  key: ${CI_COMMIT_REF_SLUG}
  paths:
    - .npm/
    - node_modules/
  policy: pull-push # Cache dependencies between jobs

stages:
  - lint
  - test
  - build
  - deploy

.install_dependencies:
  before_script:
    - npm config set cache ${NPM_CACHE_DIR}
    - npm ci --cache ${NPM_CACHE_DIR} --prefer-offline # Use cached dependencies

lint_job:
  extends: .install_dependencies # Reuse the shared before_script
  stage: lint
  script:
    - npm run lint # Your specific lint command
  tags:
    - docker # Example: Use a specific runner tag

test_job:
  extends: .install_dependencies
  stage: test
  script:
    - npm test # Your specific test command
  artifacts:
    when: always
    reports:
      junit:
        - junit.xml # Adjust path to your test report file (e.g., generated by Jest, Mocha)
  tags:
    - docker

build_job:
  extends: .install_dependencies
  stage: build
  script:
    - npm run build # Your specific build command
  artifacts:
    paths:
      - dist/ # Adjust path to your build output directory
    expire_in: 1 week
  tags:
    - docker

deploy_staging_job:
  stage: deploy
  image: alpine/git:latest # Example: A lightweight image for deployment scripts
  script:
    - echo "Deploying to Staging..."
    # Your actual deployment commands here, e.g.:
    # - apk add --no-cache python3 py3-pip
    # - pip install awscli
    # - aws s3 sync ./dist s3://$AWS_S3_BUCKET_STAGING
    - echo "Deployment to Staging complete."
  environment:
    name: staging
    url: https://staging.example.com
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"' # Only deploy main branch
  variables:
    # Access environment-specific variables defined in GitLab CI/CD settings
    STAGING_API_KEY: $STAGING_API_KEY
  tags:
    - docker

deploy_production_job:
  stage: deploy
  image: alpine/git:latest
  script:
    - echo "Deploying to Production..."
    # Your actual production deployment commands here
    - echo "Deployment to Production complete."
  environment:
    name: production
    url: https://app.example.com
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
      when: manual # Requires manual approval in GitLab UI
  variables:
    # Access environment-specific variables defined in GitLab CI/CD settings
    PROD_API_KEY: $PROD_API_KEY
  tags:
    - docker
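
The `test_job` above (`junit: junit.xml`) and the GitHub Actions test job both expect a JUnit-format XML report. With Jest this is commonly produced by the `jest-junit` reporter; a sketch of the `package.json` configuration, assuming `jest-junit` is installed as a dev dependency:

```json
{
  "jest": {
    "reporters": [
      "default",
      ["jest-junit", { "outputDirectory": ".", "outputName": "junit.xml" }]
    ]
  }
}
```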

2.3. Jenkins Pipeline Configuration (Jenkinsfile)

This Jenkins pipeline uses the declarative Groovy DSL in a Jenkinsfile to define agent selection, build stages, and conditional deployment.


// Jenkinsfile (Declarative Pipeline)

pipeline {
    agent {
        // Example: Use a Docker agent for consistency
        docker {
            image 'node:18-alpine' // Or 'maven:3-jdk-11', 'python:3.9-slim'
            args '-v /var/run/docker.sock:/var/run/docker.sock' // If Docker-in-Docker is needed
        }
        // Alternatively, use a label for a specific Jenkins agent:
        // agent { label 'my-build-agent' }
    }

    environment {
        // Global environment variables
        NODE_OPTIONS = '--max-old-space-size=4096'
        // Add other environment variables as needed
    }

    stages {
        stage('Checkout') {
            steps {
                checkout scm
            }
        }

        stage('Lint') {
            steps {
                sh 'npm ci' // Install dependencies
                sh 'npm run lint' // Your specific lint command
            }
        }

        stage('Test') {
            steps {
                sh 'npm test' // Your specific test command
                // Example: Publish JUnit test results
                junit '**/target/surefire-reports/*.xml' // Adjust path for your test reports
            }
        }

        stage('Build') {
            steps {
                sh 'npm run build' // Your specific build command
                // Archive artifacts
                archiveArtifacts artifacts: 'dist/**/*', fingerprint: true // Adjust path
            }
        }

        stage('Deploy to Staging') {
            when {
                branch 'main' // Only deploy main branch to staging
            }
            steps {
                script {
                    echo "Downloading artifacts for deployment..."
                    // Retrieve archived artifacts (if not already available in workspace)
                    // This step is often not needed if artifacts are directly in workspace
                    // sh 'cp -r $(find . -name "dist" -type d) .' // Example to bring 'dist' to root

                    echo "Deploying to Staging..."
                    // Your actual deployment commands here
                    // Example:
                    // withCredentials([usernamePassword(credentialsId: 'aws-deploy-user', usernameVariable: 'AWS_ACCESS_KEY_ID', passwordVariable: 'AWS_SECRET_ACCESS_KEY')]) {
                    //     sh 'aws s3 sync ./dist s3://staging-app-bucket'
                    // }
                    // sh 'ssh -i /var/lib/jenkins/.ssh/id_rsa user@staging-server "deploy_script.sh"'
                    echo "Deployment to Staging complete."
                }
            }
        }

        stage('Deploy to Production') {
            when {
                branch 'main' // Only deploy main branch to production
            }
            steps {
                script {
                    // Manual approval step
                    input message: 'Proceed with deployment to Production?', ok: 'Deploy'

                    echo "Downloading artifacts for deployment..."
                    // Your artifact retrieval if needed

                    echo "Deploying to Production..."
                    // Your actual production deployment commands here
                    echo "Deployment to Production complete."
                }
            }
        }
    }

    post {
        always {
            cleanWs() // Clean up workspace after pipeline execution
        }
        success {
            echo 'Pipeline finished successfully!'
            // Example: Send Slack notification
            // slackSend channel: '#devops-alerts', message: "Pipeline ${env.JOB_NAME} Build ${env.BUILD_NUMBER} SUCCESS"
        }
        failure {
            echo 'Pipeline failed!'
            // slackSend channel: '#devops-alerts', message: "Pipeline ${env.JOB_NAME} Build ${env.BUILD_NUMBER} FAILED"
        }
    }
}
\n
\n

"+slugTitle(pn)+"

\n

Built with PantheraHive BOS

\n
\n \n
\n"); zip.file(folder+"src/app/app.component.css",".app-header{display:flex;flex-direction:column;align-items:center;justify-content:center;min-height:60vh;gap:16px}h1{font-size:2.5rem;font-weight:700;color:#6366f1}\n"); } zip.file(folder+"src/app/app.config.ts","import { ApplicationConfig, provideZoneChangeDetection } from '@angular/core';\nimport { provideRouter } from '@angular/router';\nimport { routes } from './app.routes';\n\nexport const appConfig: ApplicationConfig = {\n providers: [\n provideZoneChangeDetection({ eventCoalescing: true }),\n provideRouter(routes)\n ]\n};\n"); zip.file(folder+"src/app/app.routes.ts","import { Routes } from '@angular/router';\n\nexport const routes: Routes = [];\n"); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\nng serve\n# or: npm start\n\`\`\`\n\n## Build\n\`\`\`bash\nng build\n\`\`\`\n\nOpen in VS Code with Angular Language Service extension.\n"); zip.file(folder+".gitignore","node_modules/\ndist/\n.env\n.DS_Store\n*.local\n.angular/\n"); } /* --- Python --- */ function buildPython(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^\`\`\`[\w]*\n?/m,"").replace(/\n?\`\`\`$/m,"").trim(); var reqMap={"numpy":"numpy","pandas":"pandas","sklearn":"scikit-learn","tensorflow":"tensorflow","torch":"torch","flask":"flask","fastapi":"fastapi","uvicorn":"uvicorn","requests":"requests","sqlalchemy":"sqlalchemy","pydantic":"pydantic","dotenv":"python-dotenv","PIL":"Pillow","cv2":"opencv-python","matplotlib":"matplotlib","seaborn":"seaborn","scipy":"scipy"}; var reqs=[]; Object.keys(reqMap).forEach(function(k){if(src.indexOf("import "+k)>=0||src.indexOf("from "+k)>=0)reqs.push(reqMap[k]);}); var reqsTxt=reqs.length?reqs.join("\n"):"# add dependencies here\n"; zip.file(folder+"main.py",src||"# 
"+title+"\n# Generated by PantheraHive BOS\n\nprint(title+\" loaded\")\n"); zip.file(folder+"requirements.txt",reqsTxt); zip.file(folder+".env.example","# Environment variables\n"); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\npython3 -m venv .venv\nsource .venv/bin/activate\npip install -r requirements.txt\n\`\`\`\n\n## Run\n\`\`\`bash\npython main.py\n\`\`\`\n"); zip.file(folder+".gitignore",".venv/\n__pycache__/\n*.pyc\n.env\n.DS_Store\n"); } /* --- Node.js --- */ function buildNode(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^\`\`\`[\w]*\n?/m,"").replace(/\n?\`\`\`$/m,"").trim(); var depMap={"mongoose":"^8.0.0","dotenv":"^16.4.5","axios":"^1.7.9","cors":"^2.8.5","bcryptjs":"^2.4.3","jsonwebtoken":"^9.0.2","socket.io":"^4.7.4","uuid":"^9.0.1","zod":"^3.22.4","express":"^4.18.2"}; var deps={}; Object.keys(depMap).forEach(function(k){if(src.indexOf(k)>=0)deps[k]=depMap[k];}); if(!deps["express"])deps["express"]="^4.18.2"; var pkgJson=JSON.stringify({"name":pn,"version":"1.0.0","main":"src/index.js","scripts":{"start":"node src/index.js","dev":"nodemon src/index.js"},"dependencies":deps,"devDependencies":{"nodemon":"^3.0.3"}},null,2)+"\n"; zip.file(folder+"package.json",pkgJson); var fallback="const express=require(\"express\");\nconst app=express();\napp.use(express.json());\n\napp.get(\"/\",(req,res)=>{\n res.json({message:\""+title+" API\"});\n});\n\nconst PORT=process.env.PORT||3000;\napp.listen(PORT,()=>console.log(\"Server on port \"+PORT));\n"; zip.file(folder+"src/index.js",src||fallback); zip.file(folder+".env.example","PORT=3000\n"); zip.file(folder+".gitignore","node_modules/\n.env\n.DS_Store\n"); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\n\`\`\`\n\n## Run\n\`\`\`bash\nnpm run dev\n\`\`\`\n"); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var 
title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:"\n\n\n\n\n"+title+"\n\n\n\n"+code+"\n\n\n\n"; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */\n*{margin:0;padding:0;box-sizing:border-box}\nbody{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e}\n"); zip.file(folder+"script.js","/* "+title+" — scripts */\n"); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Open\nDouble-click \`index.html\` in your browser.\n\nOr serve locally:\n\`\`\`bash\nnpx serve .\n# or\npython3 -m http.server 3000\n\`\`\`\n"); zip.file(folder+".gitignore",".DS_Store\nnode_modules/\n.env\n"); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/\.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else 
if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/\*\*(.+?)\*\*/g,"$1"); hc=hc.replace(/\n{2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\nFiles:\n- "+app+".md (Markdown)\n- "+app+".html (styled HTML)\n"); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); } function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}