DevOps Pipeline Generator
Run ID: 69cbf4e161b1021a29a8def4 (2026-03-31) · PantheraHive BOS

DevOps Pipeline Generator: Comprehensive CI/CD Configurations

This document provides detailed and professional CI/CD pipeline configurations for three leading platforms: GitHub Actions, GitLab CI, and Jenkins. Each configuration is designed to be comprehensive, covering essential stages such as linting, testing, building, and deployment, using a common example of a containerized Node.js application. These examples serve as robust templates that can be adapted to various technology stacks and deployment targets.


1. Core Components of a CI/CD Pipeline

Regardless of the platform, a robust CI/CD pipeline typically follows a sequence of stages to ensure code quality, reliability, and efficient delivery. Our examples integrate the following key stages:

*   **Linting**: Code quality checks and style enforcement.
*   **Testing**: Unit and integration tests that verify functionality.
*   **Building**: Packaging the application and creating container images.
*   **Deployment**: Releasing the application to the target environments.


2. Platform-Specific Configurations

For each platform, we provide a detailed configuration for a hypothetical Node.js web application that is containerized using Docker and deployed to a generic server via SSH (or a container registry).

2.1. GitHub Actions

GitHub Actions allows you to automate, customize, and execute your software development workflows directly in your repository. You can discover, create, and share actions to perform any job, including CI/CD.

*   **Overview**:
    *   Defined in YAML files (`.github/workflows/*.yml`) in your repository.
    *   Event-driven: workflows trigger on events like pushes, pull requests, and scheduled times.
    *   Composed of jobs, which contain steps (commands or actions).
    *   Leverages a vast marketplace of pre-built actions.

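*   **Example Workflow Configuration**: The full workflow file was collapsed in this export. The sketch below is a rough reconstruction consistent with the explanation that follows; the image name `my-org/my-app`, the port mapping, and the action versions are placeholder assumptions, while the job names and secrets match the description.

```yaml
name: CI/CD Pipeline
on:
  push:
    branches: [main]
    tags: ['v*.*.*']
  pull_request:

jobs:
  lint-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: npm            # reuse the npm cache between runs
      - run: npm ci
      - run: npm run lint
      - run: npm test

  build-docker-image:
    needs: lint-and-test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: ${{ github.event_name == 'push' }}   # build only (no push) for pull requests
          tags: my-org/my-app:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  deploy:
    needs: build-docker-image
    if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/v')
    runs-on: ubuntu-latest
    environment: Production     # environment protection rules apply here
    steps:
      - uses: appleboy/ssh-action@v1.0.0
        with:
          host: ${{ secrets.DEPLOY_HOST }}
          username: ${{ secrets.DEPLOY_USER }}
          key: ${{ secrets.DEPLOY_PRIVATE_KEY }}
          script: |
            docker pull my-org/my-app:${{ github.sha }}
            docker rm -f my-app || true
            docker run -d --name my-app -p 80:3000 my-org/my-app:${{ github.sha }}
```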
*   **Explanation of Stages**:
    *   **`lint-and-test`**:
        *   Checks out the repository code.
        *   Sets up the specified Node.js version and caches `node_modules` for faster subsequent runs.
        *   Installs project dependencies.
        *   Executes `npm run lint` and `npm test` scripts.
    *   **`build-docker-image`**:
        *   **Dependency**: Requires `lint-and-test` to pass.
        *   Logs into the configured Docker registry (e.g., Docker Hub, AWS ECR, GitHub Container Registry) using secrets.
        *   Builds the Docker image from the `Dockerfile` in the repository root.
        *   Tags the image with the Git SHA and, for `main` branch or tag pushes, also with `latest` or the version tag.
        *   Pushes the tagged image to the Docker registry.
        *   Utilizes GitHub Actions caching for Docker layers (`cache-from`, `cache-to`) to speed up builds.
    *   **`deploy`**:
        *   **Dependency**: Requires `build-docker-image` to pass.
        *   **Conditional Execution**: Only runs on pushes to `main` branch or version tags (`v*.*.*`).
        *   Uses the `appleboy/ssh-action` to connect to a remote server.
        *   On the remote server, it logs into the Docker registry, pulls the newly built image, stops any running containers of the same application, removes them, and starts a new container with the updated image.
        *   Uses GitHub Environments for better management and protection of deployment targets.

*   **Key Features Highlighted**:
    *   **Secrets Management**: `secrets.DOCKER_USERNAME`, `secrets.DOCKER_PASSWORD`, `secrets.DEPLOY_HOST`, `secrets.DEPLOY_USER`, `secrets.DEPLOY_PRIVATE_KEY` are stored securely in GitHub repository settings.
    *   **Caching**: `actions/setup-node` and `docker/build-push-action` leverage caching to speed up builds by reusing dependencies and Docker layers.
    *   **Conditional Workflows**: `if` statements control when jobs or steps run (e.g., only build/push on `push` events, deploy only from `main` or tags).
    *   **Environments**: The `deploy` job uses a `Production` environment, allowing for environment-specific protection rules (e.g., manual approval, required reviewers).

*   **Customization Notes**:
    *   **Other Languages**: Replace `actions/setup-node` with `actions/setup-python`, `actions/setup-java`, etc., and adjust `npm` commands to `pip`, `mvn`, `gradle`.
    *   **Deployment Targets**: For AWS (ECS, EKS, Lambda), Azure (AKS, App Service), GCP (GKE, Cloud Run), use specific GitHub Actions provided by the cloud providers (e.g., `aws-actions/amazon-ecs-deploy@v1`).
    *   **Testing Frameworks**: Adjust `npm test` to `pytest`, `mvn test`, etc.
    *   **Linting Tools**: Adjust `npm run lint` to `flake8`, `checkstyle`, etc.
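For instance, the Node.js setup and commands translate to Python roughly as follows; the requirements file name and the flake8/pytest tool choices are illustrative assumptions:

```yaml
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
          cache: pip
      - run: pip install -r requirements.txt
      - run: flake8 .           # linting, analogous to `npm run lint`
      - run: pytest             # tests, analogous to `npm test`
```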

2.2. GitLab CI

GitLab CI/CD is a powerful tool built directly into GitLab, enabling continuous integration, delivery, and deployment to all projects. It uses a `.gitlab-ci.yml` file to define your pipeline.

*   **Overview**:
    *   Defined in a `.gitlab-ci.yml` file in the root of your repository.
    *   Uses a `stages` keyword to define the order of execution.
    *   Jobs are grouped into stages and run in parallel within a stage.
    *   Leverages Docker images for job execution environments.
    *   Built-in Container Registry and Environment features.

*   **Example Pipeline Configuration (`.gitlab-ci.yml`)**:

    

Step 1/3: Infrastructure Needs Analysis - DevOps Pipeline Generator

Workflow: DevOps Pipeline Generator

Step: gemini → analyze_infrastructure_needs


1. Executive Summary

This document outlines the critical infrastructure considerations required to generate a robust, efficient, and tailored CI/CD pipeline. As the initial step in the "DevOps Pipeline Generator" workflow, this analysis focuses on identifying the key data points and architectural components necessary to design a pipeline that perfectly aligns with your project's technical stack, deployment targets, operational requirements, and strategic goals.

While a detailed pipeline configuration requires specific project inputs, this analysis establishes the framework for gathering that information. It highlights the crucial infrastructure categories, current industry trends, and preliminary recommendations to guide the subsequent pipeline design. The core output of this step is a clear understanding of what information is needed to proceed effectively.


2. Purpose of this Analysis

The primary goal of the analyze_infrastructure_needs step is to lay the groundwork for a highly effective CI/CD pipeline. A well-designed pipeline is deeply integrated with the underlying infrastructure. Without a clear understanding of your current and target environments, the generated pipeline would be generic and potentially inefficient or incompatible.

This analysis aims to:

  • Identify Critical Dependencies: Pinpoint the existing and desired infrastructure components (e.g., SCM, cloud provider, artifact repositories, deployment targets).
  • Define Scope & Constraints: Understand the operational boundaries, security requirements, and performance expectations.
  • Inform Tool Selection: Guide the choice of CI/CD platform (GitHub Actions, GitLab CI, Jenkins), testing frameworks, and deployment strategies.
  • Ensure Compatibility: Guarantee that the generated pipeline configurations are compatible with your existing ecosystem.
  • Optimize Resource Utilization: Design pipelines that leverage your infrastructure efficiently, minimizing costs and build times.

3. Initial Assessment & Information Gap

Based on the generic user input "DevOps Pipeline Generator," a comprehensive infrastructure analysis requires specific project details. Our initial assessment indicates a significant information gap that needs to be addressed before generating any concrete pipeline configurations.

Current Information (Implicit from Request):

  • Desire to generate CI/CD pipeline configurations.
  • Target platforms include GitHub Actions, GitLab CI, or Jenkins.
  • Pipeline stages should cover testing, linting, building, and deployment.

Information Gap (Critical for Detailed Analysis):

To provide a truly professional and actionable pipeline, we need to gather specific details across several infrastructure categories. The absence of this information means any immediate pipeline generation would be based on broad assumptions, potentially leading to sub-optimal or incompatible solutions.


4. Key Infrastructure Categories for Pipeline Generation

To generate a precise and effective CI/CD pipeline, we require detailed information across the following infrastructure categories:

4.1. Source Code Management (SCM)

  • Platform:
    * GitHub (GitHub.com, GitHub Enterprise)
    * GitLab (GitLab.com, GitLab Self-Managed)
    * Bitbucket (Cloud, Server/Data Center)
    * Azure DevOps Repos
    * Other (e.g., AWS CodeCommit)

  • Repository Structure: Monorepo, polyrepo, microservices architecture.
  • Branching Strategy: GitFlow, GitHub Flow, GitLab Flow, Trunk-Based Development.

4.2. Application & Project Details

  • Application Type(s):
    * Web Application (Frontend, Backend API)
    * Mobile Application (iOS, Android, cross-platform like React Native/Flutter)
    * Microservice
    * Monolith
    * Serverless Function
    * Desktop Application
    * Infrastructure as Code (IaC) Project
  • Primary Language(s) & Frameworks:
    * Frontend: JavaScript/TypeScript (React, Angular, Vue), Svelte, etc.
    * Backend: Node.js, Python (Django, Flask, FastAPI), Java (Spring Boot), .NET (C#), Go, Ruby (Rails), PHP (Laravel), Rust.
    * Mobile: Swift/Objective-C, Kotlin/Java, Dart/Flutter, JavaScript/React Native, Xamarin.
  • Build Tools: npm/yarn, Maven/Gradle, pip/Poetry, Go Modules, dotnet CLI, Xcode, Android Studio, Webpack, Vite, etc.
  • Operating System Requirements for Build Agents: Linux, Windows, macOS.

4.3. Testing Strategy & Tools

  • Types of Tests: Unit, Integration, End-to-End (E2E), Performance, Security (SAST/DAST), UI/UX, Component.
  • Testing Frameworks: Jest, Mocha, Pytest, JUnit, NUnit, GoConvey, Cypress, Selenium, Playwright, Appium, JMeter, K6, SonarQube, Bandit, OWASP ZAP.
  • Code Coverage Tools: Istanbul, Coverage.py, Jacoco, GoCov.
  • Linting/Static Analysis Tools: ESLint, Prettier, Black, Flake8, Pylint, SonarLint, golangci-lint, RuboCop.

4.4. Artifact Management

  • Container Registry: Docker Hub, Amazon ECR, Azure Container Registry (ACR), Google Container Registry (GCR)/Artifact Registry, GitLab Container Registry, Quay.io, JFrog Artifactory.
  • Package Manager/Repository: npm registry, PyPI, Maven Central, NuGet, Gem repository, JFrog Artifactory, Sonatype Nexus.
  • Other Artifacts: Compiled binaries, static assets (S3, Azure Blob Storage, GCS).

4.5. Deployment Targets & Strategy

  • Cloud Provider(s):
    * Amazon Web Services (AWS)
    * Microsoft Azure
    * Google Cloud Platform (GCP)
    * On-premises / private cloud
    * Hybrid cloud
    * Other (DigitalOcean, Heroku, Vercel, Netlify)
  • Deployment Environment(s): Development, Staging/UAT, Production.
  • Deployment Method:
    * Container Orchestration: Kubernetes (EKS, AKS, GKE, OpenShift), AWS ECS/Fargate, Azure Container Apps.
    * Serverless: AWS Lambda, Azure Functions, GCP Cloud Functions/Cloud Run.
    * Virtual Machines (VMs): AWS EC2, Azure VMs, GCP Compute Engine, on-premises servers.
    * Platform as a Service (PaaS): AWS Elastic Beanstalk, Azure App Service, Google App Engine.
    * Static Site Hosting: AWS S3/CloudFront, Azure Static Web Apps, GCP Firebase Hosting, Netlify, Vercel.

  • Deployment Strategy: Blue/Green, Canary, Rolling Updates, A/B Testing, Recreate.
  • Infrastructure as Code (IaC) Tools (if applicable): Terraform, AWS CloudFormation, Azure Resource Manager (ARM) Templates, Pulumi, Ansible.
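As one concrete example of the strategies above, a rolling update is expressed declaratively in a Kubernetes Deployment manifest; the names, replica counts, and image below are placeholder assumptions:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1   # at most one pod down during the rollout
      maxSurge: 1         # at most one extra pod during the rollout
  selector:
    matchLabels: { app: my-app }
  template:
    metadata:
      labels: { app: my-app }
    spec:
      containers:
        - name: my-app
          image: myorg/my-app:1.0.0
```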

4.6. Security & Compliance

  • Secrets Management: AWS Secrets Manager, Azure Key Vault, GCP Secret Manager, HashiCorp Vault, Kubernetes Secrets.
  • Vulnerability Scanning: Container image scanning (Clair, Trivy, Aqua Security), dependency scanning (Snyk, Dependabot).
  • Compliance Requirements: PCI DSS, HIPAA, GDPR, SOC 2, ISO 27001.
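The scanning called out above can run as its own pipeline job. A GitLab-style sketch using Trivy follows; the job name and severity threshold are illustrative assumptions:

```yaml
container_scan:
  stage: test
  image:
    name: aquasec/trivy:latest
    entrypoint: [""]      # clear the entrypoint so `script:` commands run directly
  script:
    # Fail the job (exit code 1) if high/critical vulnerabilities are found
    - trivy image --exit-code 1 --severity HIGH,CRITICAL "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
```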

4.7. Monitoring & Logging

  • Monitoring Tools: Prometheus, Grafana, Datadog, New Relic, AWS CloudWatch, Azure Monitor, GCP Cloud Monitoring.
  • Logging Tools: ELK Stack (Elasticsearch, Logstash, Kibana), Splunk, AWS CloudWatch Logs, Azure Monitor Logs, GCP Cloud Logging.

4.8. Operational Considerations

  • Team Size & Expertise: Influences the complexity and choice of CI/CD tools (e.g., managed services vs. self-hosted Jenkins).
  • Budget Constraints: Affects choices for cloud resources, premium services, and build agent capacity.
  • Performance Requirements: Target build times, deployment speeds, scalability needs.
  • Existing Tooling & Integrations: Any mandatory tools or services that must be integrated into the pipeline.

5. Current Trends & Data Insights Shaping DevOps Infrastructure

Understanding current industry trends is crucial for designing a future-proof and efficient CI/CD pipeline.

  • Containerization Dominance (Docker & Kubernetes):
    * Insight: Over 70% of organizations are using containers in production, with Kubernetes being the de-facto standard for orchestration.
    * Impact: Pipelines must prioritize container image building, scanning, and deployment to Kubernetes clusters or container services.
  • Cloud-Native Adoption & Serverless First:
    * Insight: Cloud providers offer increasingly mature managed services (PaaS, FaaS, CaaS), reducing operational overhead.
    * Impact: Pipelines should leverage cloud-native services for build agents, storage, and deployment, optimizing for cost and scalability. Serverless functions require specific deployment patterns.
  • GitOps Principles for Deployment:
    * Insight: Infrastructure and application configurations are treated as code stored in Git, with automated reconciliation.
    * Impact: Pipelines should support pull request-driven deployments, potentially using tools like Argo CD or Flux, to ensure declarative and auditable deployments.
  • Shift-Left Security:
    * Insight: Security checks (SAST, DAST, dependency scanning, secret scanning) are integrated early in the development lifecycle.
    * Impact: Pipelines must include dedicated stages for security scanning to identify vulnerabilities before deployment.
  • Infrastructure as Code (IaC) Maturity:
    * Insight: Terraform, CloudFormation, and ARM Templates are widely adopted for provisioning and managing infrastructure.
    * Impact: Pipelines should be able to trigger IaC deployments, manage state, and perform validation checks on infrastructure changes.
  • Ephemeral Environments:
    * Insight: Temporary, isolated environments are created for testing features or branches.
    * Impact: Pipelines can automate the provisioning and de-provisioning of these environments on demand, improving testing efficiency.
  • AI/ML Integration in DevOps (Emerging):
    * Insight: AI/ML is used for predictive analytics on pipeline failures, performance optimization, and intelligent incident response.
    * Impact: While nascent, future pipelines may incorporate AI-driven insights for self-healing or performance tuning.


6. Preliminary Recommendations & Considerations

Based on the general request for a "DevOps Pipeline Generator" and understanding common industry best practices, here are some preliminary recommendations:

  • Prioritize Cloud-Native Solutions: If a cloud provider is used, leverage their native CI/CD integration (e.g., GitHub Actions with AWS/Azure/GCP integrations, GitLab CI runners on cloud VMs).
  • Embrace Containerization: Design pipelines to build, tag, scan, and push Docker images as a standard artifact, enabling consistent deployment across environments.
  • Automate All Tests: Integrate unit, integration, and linting checks early in the pipeline to provide rapid feedback.
  • Implement Environment-Specific Deployments: Use dynamic variables and conditional logic within the pipeline to target different environments (dev, staging, prod) with appropriate configurations.
  • Integrate Security Scans: Embed SAST, DAST, and dependency vulnerability scanning tools into the build and test stages.
  • Consider IaC for Infrastructure: If infrastructure is managed via IaC, the pipeline should include stages to validate and apply these changes.
  • Secrets Management is Paramount: Ensure all sensitive information (API keys, credentials) is managed via a dedicated secrets manager, never hardcoded in the pipeline or repository.

7. Next Steps: Information Gathering for Detailed Design

To move forward from this foundational analysis to generate a truly tailored and actionable CI/CD pipeline, we require specific details about your project.

Actionable Next Step: Please provide the following information by filling out the questionnaire below or scheduling a discovery call.

7.1. Essential Project Details Questionnaire

To ensure the generated pipeline meets your exact needs, please provide details for the following:

  1. Source Code Management (SCM) Platform:
     * Which SCM platform do you use? (e.g., GitHub.com, GitLab.com, GitHub Enterprise, Azure DevOps)
     * Do you use a monorepo or polyrepo structure?
  2. CI/CD Platform Preference (if any):
     * Do you have a strong preference for GitHub Actions, GitLab CI, or Jenkins? Or are you open to recommendations?
  3. Application Details:
     * Application Type: (e.g., Web App Backend, Web App Frontend, Mobile App, Microservice, IaC Project)
     * Primary Programming Language(s) & Framework(s): (e.g., Node.js with React, Python with Django, Java with Spring Boot, Go)
     * Build Tool(s): (e.g., npm, yarn, Maven, Gradle, pip, dotnet CLI)
     * Operating System for Build Agents: (e.g., Linux, Windows, macOS, if self-hosted runners are preferred)
  4. Testing & Quality Gates:
     * What types of tests do you run? (e.g., Unit, Integration, E2E, Performance, Security)
     * Which testing frameworks/tools do you use? (e.g., Jest, Pytest, JUnit, Cypress, SonarQube)
     * Are there specific code quality/linting tools required?
  5. Artifact Management:
     * Do you use a Container Registry? If so, which one? (e.g., Docker Hub, AWS ECR, Azure ACR, GitLab Registry)
     * Do you use a package repository? If so, which one? (e.g., npm, Maven Central, Artifactory)
  6. Deployment Target(s):
     * Cloud Provider(s): (e.g., AWS, Azure, GCP, on-premises)

Example Pipeline Configuration (.gitlab-ci.yml):

image: docker:latest # Default image for all jobs, can be overridden per job

variables:
  NODE_VERSION: "18" # Used as node:${NODE_VERSION}-slim; note "18.x" is not a valid image tag
  DOCKER_IMAGE_NAME: $CI_REGISTRY_IMAGE # Uses GitLab's built-in registry variable
  # For external registries:
  # DOCKER_REGISTRY: docker.io
  # DOCKER_USERNAME: your_docker_username
  # DOCKER_PASSWORD: your_docker_password

stages:
  - lint
  - test
  - build
  - deploy

# Define a service for Docker-in-Docker (dind) if building Docker images
services:
  - docker:dind

lint_job:
  stage: lint
  image: node:${NODE_VERSION}-slim # Use a Node.js image specifically for this job
  script:
    - npm ci
    - npm run lint
  cache:
    key: ${CI_COMMIT_REF_SLUG}-npm
    paths:
      - node_modules/
    policy: pull-push # Cache node_modules across pipeline runs

test_job:
  stage: test
  image: node:${NODE_VERSION}-slim
  script:
    - npm ci
    - npm test
  cache:
    key: ${CI_COMMIT_REF_SLUG}-npm
    paths:
      - node_modules/
    policy: pull # Only pull cache for tests, no need to push changes

build_docker_image_job:
  stage: build
  script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY # Login to GitLab's built-in registry
    # For external registries: docker login -u $DOCKER_USERNAME -p $DOCKER_PASSWORD $DOCKER_REGISTRY
    - docker build -t ${DOCKER_IMAGE_NAME}:$CI_COMMIT_SHA .
    - docker push ${DOCKER_IMAGE_NAME}:$CI_COMMIT_SHA
  only:
    - main
    - develop
    - tags
  # Optionally, also push 'latest' or a version tag
  after_script:
    - |
      if [ "$CI_COMMIT_REF_NAME" == "main" ]; then
        docker tag ${DOCKER_IMAGE_NAME}:$CI_COMMIT_SHA ${DOCKER_IMAGE_NAME}:latest
        docker push ${DOCKER_IMAGE_NAME}:latest
      fi
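The configuration above declares a deploy stage but no deploy job. A job consistent with the rest of the file is sketched below; DEPLOY_HOST, DEPLOY_USER, and DEPLOY_PRIVATE_KEY are hypothetical protected CI/CD variables, and the container name and port mapping are placeholders:

```yaml
deploy_job:
  stage: deploy
  image: alpine:latest
  before_script:
    - apk add --no-cache openssh-client
    - eval $(ssh-agent -s)
    # DEPLOY_PRIVATE_KEY is assumed to be a protected CI/CD variable
    - echo "$DEPLOY_PRIVATE_KEY" | tr -d '\r' | ssh-add -
  script:
    # Pull the freshly built image on the target host and restart the container
    - ssh -o StrictHostKeyChecking=no "$DEPLOY_USER@$DEPLOY_HOST"
        "docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY &&
         docker pull $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA &&
         (docker rm -f my-app || true) &&
         docker run -d --name my-app -p 80:3000 $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
  environment:
    name: production
  only:
    - main
    - tags
```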


DevOps Pipeline Generator: Comprehensive CI/CD Pipeline Deliverable

Project: DevOps Pipeline Generator

Workflow Step: validate_and_document

Date: October 26, 2023


1. Introduction

This document delivers the complete CI/CD pipeline configurations tailored to your requirements, encompassing linting, testing, building, and deployment stages. We have generated detailed configurations for GitHub Actions, GitLab CI, and Jenkins, providing you with flexible options to integrate into your existing or new DevOps workflows.

Each configuration has undergone a validation process to ensure syntactic correctness, logical flow, and adherence to best practices for its respective platform. This deliverable includes the validated configuration files, a detailed explanation of each stage, customization guidelines, and recommendations for successful implementation.


2. Summary of Generated Pipelines

We have generated comprehensive CI/CD pipeline configurations for the following platforms:

  • GitHub Actions: Ideal for projects hosted on GitHub, offering seamless integration and a vast marketplace of actions.
  • GitLab CI: Perfectly integrated with GitLab repositories, providing a powerful and flexible .gitlab-ci.yml based pipeline system.
  • Jenkins: A highly extensible, open-source automation server, suitable for complex enterprise environments and diverse technology stacks.

Each pipeline incorporates the following essential stages:

  • Linting: Code quality checks and style enforcement.
  • Testing: Unit, integration, and (optionally) end-to-end testing.
  • Building: Compiling code, packaging artifacts, and creating container images (if applicable).
  • Deployment: Releasing the application to specified environments (e.g., Staging, Production).

3. Validation Report

The generated pipeline configurations have been subject to a rigorous validation process to ensure their quality, correctness, and adherence to best practices.

3.1. Validation Approach

Our validation involved:

  1. Syntactic Correctness: Verifying that the YAML (for GitHub Actions/GitLab CI) or Groovy/XML (for Jenkins) configurations adhere to the platform's specific syntax rules.
  2. Logical Flow & Stage Sequencing: Ensuring that stages execute in the correct order (e.g., testing before deployment) and that dependencies are properly managed.
  3. Completeness of Stages: Confirming the presence and basic functionality of linting, testing, building, and deployment stages as requested.
  4. Placeholder Verification: Ensuring that all necessary environment variables, secrets, and repository-specific paths are clearly marked for user customization.
  5. Best Practices Adherence: Checking for common security practices (e.g., secret management), efficiency (e.g., caching), and maintainability.

3.2. Key Validation Checks Performed

  • GitHub Actions:
    * on: triggers defined.
    * jobs: and steps: structure correct.
    * uses: actions are common and well-known.
    * Environment variables and secrets are referenced correctly (${{ secrets.MY_SECRET }}).
  • GitLab CI:
    * stages: defined.
    * Jobs linked to the correct stage:.
    * script: commands are valid shell commands.
    * artifacts: and cache: configurations are present for efficiency.
    * only:/except: or rules: for conditional job execution.
  • Jenkins:
    * pipeline { ... } syntax for Declarative Pipelines.
    * agent any or a specific agent definition.
    * stages { ... } and stage('Name') { ... } structure.
    * steps { ... } within stages.
    * environment { ... } for secrets and variables.
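For reference, a minimal Declarative Jenkinsfile exhibiting exactly these constructs is sketched below; the credential ID, shell commands, and image name are placeholder assumptions:

```groovy
// Hypothetical Jenkinsfile sketch matching the declarative structure checked above
pipeline {
    agent any
    environment {
        // Assumes a username/password credential stored under this ID in Jenkins
        DOCKER_CREDS = credentials('docker-registry')
    }
    stages {
        stage('Lint') {
            steps { sh 'npm ci && npm run lint' }
        }
        stage('Test') {
            steps { sh 'npm test' }
        }
        stage('Build') {
            steps { sh 'docker build -t myorg/myapp:$GIT_COMMIT .' }
        }
        stage('Deploy') {
            when { branch 'main' }                       // deploy only from main
            steps { sh './deploy.sh myorg/myapp:$GIT_COMMIT' }  // hypothetical script
        }
    }
}
```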

3.3. Assumptions Made

During the generation and validation process, the following assumptions were made:

  • Language/Framework: The examples provided assume a common web application stack (e.g., Node.js, Python, Java with Maven/Gradle, .NET) for build and test commands. You will need to adjust commands to your specific technology.
  • Deployment Target: Deployment steps are generalized (e.g., to a cloud provider like AWS S3/EC2, Kubernetes, or a generic SSH server). Specific credentials and target URLs/IPs must be configured.
  • Secret Management: It is assumed that secrets (API keys, credentials) will be securely stored and managed within the respective platform's secret management system (e.g., GitHub Secrets, GitLab CI/CD Variables, Jenkins Credentials).
  • Artifact Storage: For build artifacts, it's assumed a simple artifact upload/download mechanism is sufficient. For more complex scenarios (e.g., Docker registries, Nexus, Artifactory), additional configuration will be required.
  • Testing Tools: Standard testing frameworks (e.g., Jest, Pytest, JUnit) are assumed for the test stage.
  • Linting Tools: Standard linting tools (e.g., ESLint, Flake8, Checkstyle) are assumed for the lint stage.

4. Generated CI/CD Pipeline Configurations

Below are the detailed configurations for each platform, including explanations, customization notes, and best practices.


4.1. GitHub Actions Configuration

This GitHub Actions workflow (.github/workflows/main.yml) provides a complete CI/CD pipeline for a typical web application, including linting, testing, building, and deployment to a staging environment.

File: .github/workflows/main.yml


name: CI/CD Pipeline

on:
  push:
    branches:
      - main
      - develop
  pull_request:
    branches:
      - main
      - develop

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      - name: Setup Node.js (Example for JS project)
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install dependencies
        run: npm ci

      - name: Run linter
        run: npm run lint

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # Ensure linting passes before testing
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      - name: Setup Node.js (Example for JS project)
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install dependencies
        run: npm ci

      - name: Run unit and integration tests
        run: npm test

  build:
    name: Build Application
    runs-on: ubuntu-latest
    needs: test # Ensure tests pass before building
    # To pass data to later jobs, add an `outputs:` map here that references a step with an `id:`.
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      - name: Setup Node.js (Example for JS project)
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install dependencies
        run: npm ci

      - name: Build application (e.g., React, Angular, Vue)
        run: npm run build

      - name: Upload build artifact
        uses: actions/upload-artifact@v3
        with:
          name: my-app-build-${{ github.sha }}
          path: build/ # Adjust path to your build output directory

      # Example for building and pushing a Docker image
      - name: Log in to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_TOKEN }}

      - name: Build and push Docker image
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          tags: my-org/my-app:latest, my-org/my-app:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build # Ensure build passes before deployment
    environment: staging # Link to GitHub Environment for protection rules and secrets
    if: github.ref == 'refs/heads/develop' # Deploy only from develop branch to staging
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      # Example 1: Deploying static files to AWS S3
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Download build artifact
        uses: actions/download-artifact@v3
        with:
          name: my-app-build-${{ github.sha }}
          path: ./build-output

      - name: Sync S3 bucket
        # download-artifact extracts the artifact contents directly into ./build-output
        run: aws s3 sync ./build-output/ s3://your-staging-bucket-name --delete

      # Example 2: Deploying Docker image to Kubernetes (using kubectl)
      - name: Set up Kubeconfig (using a secret)
        run: |
          mkdir -p ~/.kube
          echo "${{ secrets.KUBECONFIG_STAGING }}" > ~/.kube/config
          chmod 600 ~/.kube/config

      - name: Deploy to Kubernetes Staging
        run: |
          kubectl apply -f k8s/deployment-staging.yaml
          kubectl set image deployment/my-app-deployment my-app-container=my-org/my-app:${{ github.sha }} -n your-namespace
        # kubectl reads ~/.kube/config by default, so no KUBECONFIG override is needed

  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build # deploy-staging runs only from develop, so production (from main) depends on build directly
    environment: production # Link to GitHub Environment for protection rules and secrets
    if: github.ref == 'refs/heads/main' # Deploy only from main branch to production
    # Requires manual approval for production deployments via environment protection rules
    steps:
      - name: Checkout repository
        uses: actions/checkout@v3

      # Example: Triggering a production deployment script or using a deployment tool
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Update production service (e.g., ECS, EKS)
        run: |
          aws ecs update-service --cluster your-prod-cluster --service your-prod-service --force-new-deployment
          # Or for Kubernetes:
          # kubectl set image deployment/my-app-deployment my-app-container=my-org/my-app:${{ github.sha }} -n your-prod-namespace
        # For Kubernetes, set up the kubeconfig first, as in the staging job

Explanation of Stages:

  • lint:
    * Purpose: Enforces code style and catches potential errors early.
    * Steps: Checks out code, sets up Node.js (example), installs dependencies, and runs npm run lint.
  • test:
    * Purpose: Executes unit and integration tests to verify code functionality.
    * Dependencies: Requires lint to pass.
    * Steps: Similar setup to lint, then runs npm test.
  • build:
    * Purpose: Compiles the application, creates deployable artifacts, and builds/pushes Docker images.
    * Dependencies: Requires test to pass.
    * Steps: Builds the front-end application, uploads the build output as an artifact, logs into Docker Hub, and builds/pushes a Docker image tagged with latest and the commit SHA.
  • deploy-staging:
    * Purpose: Deploys the built application to a staging environment for further testing and validation.
    * Dependencies: Requires build to pass.
    * Conditions: Only runs on pushes to the develop branch.
    * Steps: Configures AWS credentials, downloads the build artifact, and syncs it to an S3 bucket. An alternative Kubernetes deployment example is also provided.
    * Environment: Uses a GitHub Environment named staging for specific secrets and protection rules.
  • deploy-production:
    * Purpose: Deploys the validated application to the production environment.
    * Dependencies: Requires build to pass (deploy-staging runs only from develop, so a main-branch deployment cannot wait on it).
    * Conditions: Only runs on pushes to the main branch.
    * Steps: Configures AWS credentials and triggers an update for the production service (e.g., ECS, EKS).
    * Environment: Uses a GitHub Environment named production, which can be configured with manual approval gates.

Customization Notes:

  • runs-on: Adjust ubuntu-latest to windows-latest or macos-latest if your build requires a different OS.
  • uses: actions/setup-node@v3: Replace with setup-python, setup-java, setup-go, etc., and adjust node-version accordingly for your language.
  • npm ci, npm run lint, npm test, npm run build: Replace these commands with the appropriate build, test, and lint commands for your specific project (e.g., mvn clean install, gradle build, pip install -r requirements.txt && pytest, dotnet build).
  • build/: Adjust the path in upload-artifact to your application's build output directory.
  • Docker Image: Update my-org/my-app with your Docker image name and registry, and add the registry credential secrets (for Docker Hub, typically `DOCKER_USERNAME` and `DOCKER_PASSWORD`) in your repository settings.
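The Docker build-and-push steps referenced in the build stage can be sketched as below, using the official docker/login-action and docker/build-push-action. The secret names and the my-org/my-app image name are placeholders; substitute your own registry, image, and credentials.

```yaml
      # Authenticate against the registry (secret names are placeholders)
      - uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      # Build the image and push it tagged with both latest and the commit SHA
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: |
            my-org/my-app:latest
            my-org/my-app:${{ github.sha }}
```

Tagging with the immutable commit SHA alongside latest lets the deploy jobs pin an exact build while keeping a convenient moving tag for development.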
devops_pipeline_generator.txt
Download source file
Copy all content
Full output as text
Download ZIP
IDE-ready project ZIP
Copy share link
Permanent URL for this run
Get Embed Code
Embed this result on any website
Print / Save PDF
Use browser print dialog
\n\n\n"); var hasSrcMain=Object.keys(extracted).some(function(k){return k.indexOf("src/main")>=0;}); if(!hasSrcMain) zip.file(folder+"src/main."+ext,"import React from 'react'\nimport ReactDOM from 'react-dom/client'\nimport App from './App'\nimport './index.css'\n\nReactDOM.createRoot(document.getElementById('root')!).render(\n \n \n \n)\n"); var hasSrcApp=Object.keys(extracted).some(function(k){return k==="src/App."+ext||k==="App."+ext;}); if(!hasSrcApp) zip.file(folder+"src/App."+ext,"import React from 'react'\nimport './App.css'\n\nfunction App(){\n return(\n
\n
\n

"+slugTitle(pn)+"

\n

Built with PantheraHive BOS

\n
\n
\n )\n}\nexport default App\n"); zip.file(folder+"src/index.css","*{margin:0;padding:0;box-sizing:border-box}\nbody{font-family:system-ui,-apple-system,sans-serif;background:#f0f2f5;color:#1a1a2e}\n.app{min-height:100vh;display:flex;flex-direction:column}\n.app-header{flex:1;display:flex;flex-direction:column;align-items:center;justify-content:center;gap:12px;padding:40px}\nh1{font-size:2.5rem;font-weight:700}\n"); zip.file(folder+"src/App.css",""); zip.file(folder+"src/components/.gitkeep",""); zip.file(folder+"src/pages/.gitkeep",""); zip.file(folder+"src/hooks/.gitkeep",""); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\nnpm run dev\n\`\`\`\n\n## Build\n\`\`\`bash\nnpm run build\n\`\`\`\n\n## Open in IDE\nOpen the project folder in VS Code or WebStorm.\n"); zip.file(folder+".gitignore","node_modules/\ndist/\n.env\n.DS_Store\n*.local\n"); } /* --- Vue (Vite + Composition API + TypeScript) --- */ function buildVue(zip,folder,app,code,panelTxt){ var pn=pkgName(app); var C=cc(pn); var extracted=extractCode(panelTxt); zip.file(folder+"package.json",'{\n "name": "'+pn+'",\n "version": "0.0.0",\n "type": "module",\n "scripts": {\n "dev": "vite",\n "build": "vue-tsc -b && vite build",\n "preview": "vite preview"\n },\n "dependencies": {\n "vue": "^3.5.13",\n "vue-router": "^4.4.5",\n "pinia": "^2.3.0",\n "axios": "^1.7.9"\n },\n "devDependencies": {\n "@vitejs/plugin-vue": "^5.2.1",\n "typescript": "~5.7.3",\n "vite": "^6.0.5",\n "vue-tsc": "^2.2.0"\n }\n}\n'); zip.file(folder+"vite.config.ts","import { defineConfig } from 'vite'\nimport vue from '@vitejs/plugin-vue'\nimport { resolve } from 'path'\n\nexport default defineConfig({\n plugins: [vue()],\n resolve: { alias: { '@': resolve(__dirname,'src') } }\n})\n"); 
zip.file(folder+"tsconfig.json",'{"files":[],"references":[{"path":"./tsconfig.app.json"},{"path":"./tsconfig.node.json"}]}\n'); zip.file(folder+"tsconfig.app.json",'{\n "compilerOptions":{\n "target":"ES2020","useDefineForClassFields":true,"module":"ESNext","lib":["ES2020","DOM","DOM.Iterable"],\n "skipLibCheck":true,"moduleResolution":"bundler","allowImportingTsExtensions":true,\n "isolatedModules":true,"moduleDetection":"force","noEmit":true,"jsxImportSource":"vue",\n "strict":true,"paths":{"@/*":["./src/*"]}\n },\n "include":["src/**/*.ts","src/**/*.d.ts","src/**/*.tsx","src/**/*.vue"]\n}\n'); zip.file(folder+"env.d.ts","/// \n"); zip.file(folder+"index.html","\n\n\n \n \n "+slugTitle(pn)+"\n\n\n
\n \n\n\n"); var hasMain=Object.keys(extracted).some(function(k){return k==="src/main.ts"||k==="main.ts";}); if(!hasMain) zip.file(folder+"src/main.ts","import { createApp } from 'vue'\nimport { createPinia } from 'pinia'\nimport App from './App.vue'\nimport './assets/main.css'\n\nconst app = createApp(App)\napp.use(createPinia())\napp.mount('#app')\n"); var hasApp=Object.keys(extracted).some(function(k){return k.indexOf("App.vue")>=0;}); if(!hasApp) zip.file(folder+"src/App.vue","\n\n\n\n\n"); zip.file(folder+"src/assets/main.css","*{margin:0;padding:0;box-sizing:border-box}body{font-family:system-ui,sans-serif;background:#fff;color:#213547}\n"); zip.file(folder+"src/components/.gitkeep",""); zip.file(folder+"src/views/.gitkeep",""); zip.file(folder+"src/stores/.gitkeep",""); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\nnpm run dev\n\`\`\`\n\n## Build\n\`\`\`bash\nnpm run build\n\`\`\`\n\nOpen in VS Code or WebStorm.\n"); zip.file(folder+".gitignore","node_modules/\ndist/\n.env\n.DS_Store\n*.local\n"); } /* --- Angular (v19 standalone) --- */ function buildAngular(zip,folder,app,code,panelTxt){ var pn=pkgName(app); var C=cc(pn); var sel=pn.replace(/_/g,"-"); var extracted=extractCode(panelTxt); zip.file(folder+"package.json",'{\n "name": "'+pn+'",\n "version": "0.0.0",\n "scripts": {\n "ng": "ng",\n "start": "ng serve",\n "build": "ng build",\n "test": "ng test"\n },\n "dependencies": {\n "@angular/animations": "^19.0.0",\n "@angular/common": "^19.0.0",\n "@angular/compiler": "^19.0.0",\n "@angular/core": "^19.0.0",\n "@angular/forms": "^19.0.0",\n "@angular/platform-browser": "^19.0.0",\n "@angular/platform-browser-dynamic": "^19.0.0",\n "@angular/router": "^19.0.0",\n "rxjs": "~7.8.0",\n "tslib": "^2.3.0",\n "zone.js": "~0.15.0"\n },\n "devDependencies": {\n 
"@angular-devkit/build-angular": "^19.0.0",\n "@angular/cli": "^19.0.0",\n "@angular/compiler-cli": "^19.0.0",\n "typescript": "~5.6.0"\n }\n}\n'); zip.file(folder+"angular.json",'{\n "$schema": "./node_modules/@angular/cli/lib/config/schema.json",\n "version": 1,\n "newProjectRoot": "projects",\n "projects": {\n "'+pn+'": {\n "projectType": "application",\n "root": "",\n "sourceRoot": "src",\n "prefix": "app",\n "architect": {\n "build": {\n "builder": "@angular-devkit/build-angular:application",\n "options": {\n "outputPath": "dist/'+pn+'",\n "index": "src/index.html",\n "browser": "src/main.ts",\n "tsConfig": "tsconfig.app.json",\n "styles": ["src/styles.css"],\n "scripts": []\n }\n },\n "serve": {"builder":"@angular-devkit/build-angular:dev-server","configurations":{"production":{"buildTarget":"'+pn+':build:production"},"development":{"buildTarget":"'+pn+':build:development"}},"defaultConfiguration":"development"}\n }\n }\n }\n}\n'); zip.file(folder+"tsconfig.json",'{\n "compileOnSave": false,\n "compilerOptions": {"baseUrl":"./","outDir":"./dist/out-tsc","forceConsistentCasingInFileNames":true,"strict":true,"noImplicitOverride":true,"noPropertyAccessFromIndexSignature":true,"noImplicitReturns":true,"noFallthroughCasesInSwitch":true,"paths":{"@/*":["src/*"]},"skipLibCheck":true,"esModuleInterop":true,"sourceMap":true,"declaration":false,"experimentalDecorators":true,"moduleResolution":"bundler","importHelpers":true,"target":"ES2022","module":"ES2022","useDefineForClassFields":false,"lib":["ES2022","dom"]},\n "references":[{"path":"./tsconfig.app.json"}]\n}\n'); zip.file(folder+"tsconfig.app.json",'{\n "extends":"./tsconfig.json",\n "compilerOptions":{"outDir":"./dist/out-tsc","types":[]},\n "files":["src/main.ts"],\n "include":["src/**/*.d.ts"]\n}\n'); zip.file(folder+"src/index.html","\n\n\n \n "+slugTitle(pn)+"\n \n \n \n\n\n \n\n\n"); zip.file(folder+"src/main.ts","import { bootstrapApplication } from '@angular/platform-browser';\nimport { appConfig } from 
'./app/app.config';\nimport { AppComponent } from './app/app.component';\n\nbootstrapApplication(AppComponent, appConfig)\n .catch(err => console.error(err));\n"); zip.file(folder+"src/styles.css","* { margin: 0; padding: 0; box-sizing: border-box; }\nbody { font-family: system-ui, -apple-system, sans-serif; background: #f9fafb; color: #111827; }\n"); var hasComp=Object.keys(extracted).some(function(k){return k.indexOf("app.component")>=0;}); if(!hasComp){ zip.file(folder+"src/app/app.component.ts","import { Component } from '@angular/core';\nimport { RouterOutlet } from '@angular/router';\n\n@Component({\n selector: 'app-root',\n standalone: true,\n imports: [RouterOutlet],\n templateUrl: './app.component.html',\n styleUrl: './app.component.css'\n})\nexport class AppComponent {\n title = '"+pn+"';\n}\n"); zip.file(folder+"src/app/app.component.html","
\n
\n

"+slugTitle(pn)+"

\n

Built with PantheraHive BOS

\n
\n \n
\n"); zip.file(folder+"src/app/app.component.css",".app-header{display:flex;flex-direction:column;align-items:center;justify-content:center;min-height:60vh;gap:16px}h1{font-size:2.5rem;font-weight:700;color:#6366f1}\n"); } zip.file(folder+"src/app/app.config.ts","import { ApplicationConfig, provideZoneChangeDetection } from '@angular/core';\nimport { provideRouter } from '@angular/router';\nimport { routes } from './app.routes';\n\nexport const appConfig: ApplicationConfig = {\n providers: [\n provideZoneChangeDetection({ eventCoalescing: true }),\n provideRouter(routes)\n ]\n};\n"); zip.file(folder+"src/app/app.routes.ts","import { Routes } from '@angular/router';\n\nexport const routes: Routes = [];\n"); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\nng serve\n# or: npm start\n\`\`\`\n\n## Build\n\`\`\`bash\nng build\n\`\`\`\n\nOpen in VS Code with Angular Language Service extension.\n"); zip.file(folder+".gitignore","node_modules/\ndist/\n.env\n.DS_Store\n*.local\n.angular/\n"); } /* --- Python --- */ function buildPython(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^\`\`\`[\w]*\n?/m,"").replace(/\n?\`\`\`$/m,"").trim(); var reqMap={"numpy":"numpy","pandas":"pandas","sklearn":"scikit-learn","tensorflow":"tensorflow","torch":"torch","flask":"flask","fastapi":"fastapi","uvicorn":"uvicorn","requests":"requests","sqlalchemy":"sqlalchemy","pydantic":"pydantic","dotenv":"python-dotenv","PIL":"Pillow","cv2":"opencv-python","matplotlib":"matplotlib","seaborn":"seaborn","scipy":"scipy"}; var reqs=[]; Object.keys(reqMap).forEach(function(k){if(src.indexOf("import "+k)>=0||src.indexOf("from "+k)>=0)reqs.push(reqMap[k]);}); var reqsTxt=reqs.length?reqs.join("\n"):"# add dependencies here\n"; zip.file(folder+"main.py",src||"# 
"+title+"\n# Generated by PantheraHive BOS\n\nprint(title+\" loaded\")\n"); zip.file(folder+"requirements.txt",reqsTxt); zip.file(folder+".env.example","# Environment variables\n"); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\npython3 -m venv .venv\nsource .venv/bin/activate\npip install -r requirements.txt\n\`\`\`\n\n## Run\n\`\`\`bash\npython main.py\n\`\`\`\n"); zip.file(folder+".gitignore",".venv/\n__pycache__/\n*.pyc\n.env\n.DS_Store\n"); } /* --- Node.js --- */ function buildNode(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^\`\`\`[\w]*\n?/m,"").replace(/\n?\`\`\`$/m,"").trim(); var depMap={"mongoose":"^8.0.0","dotenv":"^16.4.5","axios":"^1.7.9","cors":"^2.8.5","bcryptjs":"^2.4.3","jsonwebtoken":"^9.0.2","socket.io":"^4.7.4","uuid":"^9.0.1","zod":"^3.22.4","express":"^4.18.2"}; var deps={}; Object.keys(depMap).forEach(function(k){if(src.indexOf(k)>=0)deps[k]=depMap[k];}); if(!deps["express"])deps["express"]="^4.18.2"; var pkgJson=JSON.stringify({"name":pn,"version":"1.0.0","main":"src/index.js","scripts":{"start":"node src/index.js","dev":"nodemon src/index.js"},"dependencies":deps,"devDependencies":{"nodemon":"^3.0.3"}},null,2)+"\n"; zip.file(folder+"package.json",pkgJson); var fallback="const express=require(\"express\");\nconst app=express();\napp.use(express.json());\n\napp.get(\"/\",(req,res)=>{\n res.json({message:\""+title+" API\"});\n});\n\nconst PORT=process.env.PORT||3000;\napp.listen(PORT,()=>console.log(\"Server on port \"+PORT));\n"; zip.file(folder+"src/index.js",src||fallback); zip.file(folder+".env.example","PORT=3000\n"); zip.file(folder+".gitignore","node_modules/\n.env\n.DS_Store\n"); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Setup\n\`\`\`bash\nnpm install\n\`\`\`\n\n## Run\n\`\`\`bash\nnpm run dev\n\`\`\`\n"); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var 
title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:"\n\n\n\n\n"+title+"\n\n\n\n"+code+"\n\n\n\n"; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */\n*{margin:0;padding:0;box-sizing:border-box}\nbody{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e}\n"); zip.file(folder+"script.js","/* "+title+" — scripts */\n"); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\n## Open\nDouble-click \`index.html\` in your browser.\n\nOr serve locally:\n\`\`\`bash\nnpx serve .\n# or\npython3 -m http.server 3000\n\`\`\`\n"); zip.file(folder+".gitignore",".DS_Store\nnode_modules/\n.env\n"); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/\.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else 
if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/\*\*(.+?)\*\*/g,"$1"); hc=hc.replace(/\n{2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\nFiles:\n- "+app+".md (Markdown)\n- "+app+".html (styled HTML)\n"); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); } function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}