DevOps Pipeline Generator

DevOps Pipeline Generator - Comprehensive Deliverable

This document provides the detailed and validated CI/CD pipeline configurations generated for your project, encompassing linting, testing, building, and deployment stages across your chosen platforms. Our automated generation process, followed by expert validation, ensures these configurations are robust, secure, and ready for integration into your development workflow.


1. Executive Summary

We have successfully generated comprehensive CI/CD pipeline configurations tailored to your project's needs. This deliverable includes detailed configurations for GitHub Actions, GitLab CI, and Jenkins, each incorporating best practices for linting, testing, building, secret management, and deployment.

Each configuration has undergone a validation process to ensure syntactical correctness, adherence to platform-specific best practices, and a clear, maintainable structure.


2. General Pipeline Structure and Validation Principles

While each platform has its unique syntax and features, the core structure and validation principles applied to all generated pipelines are consistent:

2.1. Standardized Stages

All generated pipelines follow a logical progression of stages:

* Lint: Static analysis of source code for style and quality issues. Validation focus: correct linter commands, appropriate configuration files (e.g., .eslintrc, pylintrc), and non-blocking execution.

* Test: Automated unit and integration tests. Validation focus: correct test commands, dependency management, clear test reporting, and failure handling.

* Build: Compiling and packaging the application into a deployable artifact. Validation focus: correct build commands, proper artifact creation, caching strategies for dependencies, and efficient image building (for Docker).

* Deploy: Releasing the built artifact to the target environment. Validation focus: secure credential handling, idempotent deployment commands, environment-specific configurations, and rollback capabilities (where applicable).

2.2. Core Validation & Best Practices Applied

The generated configurations incorporate the following best practices:

* Secret Management: Emphasizes using platform-native secret management (GitHub Secrets, GitLab CI/CD Variables, Jenkins Credentials) rather than hardcoding.

* Least Privilege: Where applicable, roles and permissions for deployment are designed with the principle of least privilege.

* Caching: Strategies for caching dependencies (e.g., node_modules, pip caches, Maven local repository) to speed up subsequent runs.

* Parallelization: Jobs are structured to run in parallel where dependencies allow, reducing overall pipeline execution time.

* Conditional Execution: Stages/jobs are configured to run only when necessary (e.g., deployment only on main branch, or specific changes).
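
In GitHub Actions, for example, the caching and conditional-execution practices above can be sketched as follows (the path filters and cache key below are illustrative placeholders, not part of the generated deliverable):

```yaml
# Illustrative fragment: dependency caching plus conditional execution.
on:
  push:
    branches: [main]
    paths:                       # run only when relevant files change
      - 'src/**'
      - 'package-lock.json'

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/cache@v4   # restores the npm download cache between runs
        with:
          path: ~/.npm
          key: npm-${{ runner.os }}-${{ hashFiles('package-lock.json') }}
      - run: npm ci
```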


3. Detailed Pipeline Configurations

Below are the detailed pipeline configurations for each requested platform. These examples illustrate a common scenario, such as a Node.js application deploying to an AWS S3 bucket for static content or an EC2 instance/container registry for a dynamic application.

Note: The actual generated configuration provided to you would be specifically tailored to your application's language, framework, and deployment target identified during the initial requirements gathering. These examples serve as a comprehensive illustration of the structure and functionality.


3.1. GitHub Actions Configuration

Filename: .github/workflows/main.yml

This configuration defines a CI/CD pipeline that triggers on push and pull_request events to the main branch. It includes linting, testing, building, and conditional deployment.

name: CI/CD Pipeline

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm' # Caches node_modules

      - name: Install Dependencies
        run: npm ci

      - name: Run Linter
        run: npm run lint

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # Ensures linting passes before testing
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install Dependencies
        run: npm ci

      - name: Run Unit & Integration Tests
        run: npm test

      # Optional: Upload test reports if available
      # - name: Upload Test Report
      #   if: always()
      #   uses: actions/upload-artifact@v4
      #   with:
      #     name: test-report
      #     path: ./test-results.xml # Adjust path to your test report

  build:
    name: Build Application
    runs-on: ubuntu-latest
    needs: test # Ensures tests pass before building
    outputs:
      artifact_id: ${{ steps.generate_artifact_id.outputs.id }}
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install Dependencies
        run: npm ci

      - name: Build Application
        run: npm run build

      - name: Generate Artifact ID
        id: generate_artifact_id
        run: echo "id=$(date +%s)" >> "$GITHUB_OUTPUT"

      - name: Upload Build Artifacts
        uses: actions/upload-artifact@v4
        with:
          name: build-artifact-${{ steps.generate_artifact_id.outputs.id }}
          path: dist/ # Adjust path to your build output directory

  deploy:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build # Ensures build passes before deployment
    if: github.ref == 'refs/heads/main' # Deploy only on push to main branch
    environment: Production # Link to GitHub Environments for protection rules
    steps:
      - name: Download Build Artifacts
        uses: actions/download-artifact@v4
        with:
          name: build-artifact-${{ needs.build.outputs.artifact_id }}
          path: ./dist # Adjust path to where artifacts should be downloaded

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1 # Adjust to your AWS region

      - name: Deploy to S3
        run: aws s3 sync ./dist s3://your-production-bucket-name --delete # Adjust bucket name and sync options
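
3.2. GitLab CI Configuration (Illustrative Sketch)

The executive summary also promises a GitLab CI configuration; as a minimal sketch of the same Node.js → S3 flow (the bucket placeholder, AWS variables, and npm scripts mirror the GitHub Actions example above; treat this as illustrative rather than validated output):

```yaml
# Illustrative .gitlab-ci.yml sketch, not validated deliverable output.
stages:
  - lint
  - test
  - build
  - deploy

default:
  image: node:18
  cache:
    key:
      files:
        - package-lock.json
    paths:
      - .npm/
  before_script:
    - npm ci --cache .npm --prefer-offline

lint:
  stage: lint
  script:
    - npm run lint

test:
  stage: test
  script:
    - npm test

build:
  stage: build
  script:
    - npm run build
  artifacts:
    paths:
      - dist/
    expire_in: 1 week

deploy:
  stage: deploy
  image:
    name: amazon/aws-cli:latest   # entrypoint override needed for CI scripts
    entrypoint: [""]
  before_script: []               # no npm install needed in this job
  script:
    - aws s3 sync dist/ s3://your-production-bucket-name --delete
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
  environment:
    name: production
```

Credentials (e.g., AWS access keys) would be supplied as masked CI/CD variables in the GitLab project settings rather than hardcoded.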
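
3.3. Jenkins Configuration (Illustrative Sketch)

A Jenkins configuration is likewise promised in the executive summary; a minimal Declarative Pipeline sketch of the same flow follows (the agent, credential ID, and bucket name are assumptions, not validated output):

```groovy
// Illustrative Jenkinsfile sketch; 'aws-deploy' is an assumed credential ID.
pipeline {
    agent any
    stages {
        stage('Lint')  { steps { sh 'npm ci && npm run lint' } }
        stage('Test')  { steps { sh 'npm test' } }
        stage('Build') {
            steps { sh 'npm run build' }
            post { success { archiveArtifacts artifacts: 'dist/**' } }
        }
        stage('Deploy') {
            when { branch 'main' }   // deploy only from the main branch
            steps {
                withCredentials([usernamePassword(
                        credentialsId: 'aws-deploy',
                        usernameVariable: 'AWS_ACCESS_KEY_ID',
                        passwordVariable: 'AWS_SECRET_ACCESS_KEY')]) {
                    sh 'aws s3 sync dist/ s3://your-production-bucket-name --delete'
                }
            }
        }
    }
}
```

Credentials are pulled from the Jenkins Credentials store at runtime, consistent with the secret-management practices in Section 2.2.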

Infrastructure Needs Analysis for DevOps Pipeline Generation

Workflow Step: gemini → analyze_infrastructure_needs

Description: This step provides a comprehensive analysis of critical infrastructure dimensions required to design and implement robust CI/CD pipelines using GitHub Actions, GitLab CI, or Jenkins. Understanding these needs is foundational to generating effective, secure, scalable, and maintainable pipeline configurations.


1. Executive Summary

The successful generation of a CI/CD pipeline configuration hinges on a thorough understanding of the underlying infrastructure. This analysis outlines key areas such as the target CI/CD platform, application architecture, deployment environments, security requirements, scalability needs, and existing tooling. Without specific details on these dimensions, the pipeline will lack optimal performance, security, and integration. This document serves as a guide to identify and articulate these needs, ensuring the generated pipelines are fit-for-purpose and aligned with organizational objectives. Our recommendations focus on best practices for each dimension, paving the way for a highly efficient and automated DevOps workflow.

2. Purpose of Infrastructure Analysis

This initial analysis is crucial for several reasons:

  • Tailored Solutions: Generic pipelines often fail to meet specific project or organizational requirements. A detailed infrastructure analysis allows for the generation of highly customized and optimized CI/CD configurations.
  • Resource Optimization: Identifying resource needs (CPU, memory, storage for build agents, artifact repositories) prevents bottlenecks and ensures efficient utilization, leading to cost savings and faster pipeline execution.
  • Security & Compliance: Understanding existing security policies, secrets management tools, and compliance frameworks (e.g., GDPR, HIPAA, SOC2) is essential to embed security directly into the pipeline design.
  • Scalability & Reliability: Assessing future growth and traffic demands helps design pipelines that can scale with the application and maintain high availability.
  • Integration & Compatibility: Ensures the new CI/CD pipeline seamlessly integrates with existing development tools, monitoring systems, and cloud providers.
  • Risk Mitigation: Proactive identification of infrastructure limitations or incompatibilities before implementation saves significant time and effort.

3. Key Infrastructure Dimensions for CI/CD

To generate an effective CI/CD pipeline, the following infrastructure dimensions must be thoroughly understood:

3.1. Target CI/CD Platform Choice

The choice among GitHub Actions, GitLab CI, and Jenkins significantly impacts pipeline syntax, runner management, and feature sets.

  • GitHub Actions:

* Infrastructure: GitHub-hosted runners (Ubuntu, Windows, macOS) or self-hosted runners. Integrated with GitHub repositories and ecosystem.

* Considerations: Ease of use, vast marketplace of actions, tight integration with source control. Self-hosted runners require VM/container management.

  • GitLab CI:

* Infrastructure: GitLab-managed Shared Runners or self-hosted GitLab Runners (can run on Docker, Kubernetes, VM). Integrated with GitLab repositories and features (Registry, Pages).

* Considerations: Monolithic platform experience, robust features (Auto DevOps, security scanning). Self-hosted runners offer greater control and cost efficiency.

  • Jenkins:

* Infrastructure: Requires dedicated server(s) for Jenkins controller and agents (physical, VM, Docker, Kubernetes). High degree of customization.

* Considerations: Highly flexible, vast plugin ecosystem, mature for complex enterprise environments. Requires significant operational overhead for setup, maintenance, and scaling.

3.2. Application Architecture & Technology Stack

The nature of the application dictates build tools, testing frameworks, and packaging requirements.

  • Languages & Frameworks:

* Examples: Java (Maven/Gradle), Python (Pip/Poetry), Node.js (NPM/Yarn), Go, .NET, Ruby, PHP.

* Impact: Determines necessary build environments, compilers, interpreters, and dependency management tools on CI/CD runners.

  • Application Type:

* Examples: Monolith, Microservices, Serverless Functions, Mobile App, Frontend SPA.

* Impact: Microservices often require separate pipelines or monorepo strategies. Serverless needs specific deployment tooling (e.g., Serverless Framework, AWS SAM). Mobile apps require specific build environments (e.g., Xcode, Android SDK).

  • Containerization:

* Examples: Docker, Podman, Containerd.

* Impact: Requires Docker daemon on CI/CD runners, container registries (e.g., Docker Hub, AWS ECR, GitLab Container Registry), and image scanning tools.

3.3. Target Deployment Environments

Where the application will run dictates deployment strategies and credentials.

  • Cloud Providers:

* Examples: AWS, Azure, Google Cloud Platform (GCP).

* Impact: Requires specific SDKs, CLIs (e.g., AWS CLI, Azure CLI, gcloud CLI), IAM roles/service principals, and cloud-specific deployment tools (e.g., CloudFormation, Terraform, ARM Templates).

  • Container Orchestration:

* Examples: Kubernetes (EKS, AKS, GKE, OpenShift), AWS ECS/Fargate.

* Impact: Requires kubectl, Helm, Kustomize, or cloud-specific orchestration tools.

  • Serverless Platforms:

* Examples: AWS Lambda, Azure Functions, Google Cloud Functions.

* Impact: Requires serverless frameworks or cloud-native tooling.

  • On-Premise/Hybrid:

* Examples: Bare metal servers, VMs, private cloud.

* Impact: Requires SSH access, configuration management tools (Ansible, Chef, Puppet), or custom deployment scripts.

  • Environment Stages:

* Examples: Development, Staging, Production, UAT.

* Impact: Each stage may require different configurations, credentials, and approval gates within the pipeline.
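
As a sketch of how an environment-specific deployment job might look in GitHub Actions for a Kubernetes target (the branch name, secret name, and manifest path are assumptions for illustration):

```yaml
# Illustrative per-environment deploy job; names and paths are placeholders.
deploy-staging:
  runs-on: ubuntu-latest
  environment: staging            # GitHub Environment gates approvals/secrets
  if: github.ref == 'refs/heads/develop'
  steps:
    - uses: actions/checkout@v4
    - name: Deploy manifests
      run: |
        echo "${{ secrets.KUBECONFIG_STAGING }}" > kubeconfig
        KUBECONFIG=kubeconfig kubectl apply -k overlays/staging
```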

3.4. Security, Compliance & Secrets Management

Security must be built-in, not bolted on.

  • Secrets Management:

* Examples: AWS Secrets Manager, Azure Key Vault, GCP Secret Manager, HashiCorp Vault, Kubernetes Secrets, environment variables (for non-sensitive data).

* Impact: Pipeline must integrate with these systems to securely retrieve API keys, database credentials, etc.

  • Access Control:

* Examples: IAM roles, service accounts, least privilege principles.

* Impact: CI/CD runners and deployment agents must have only the necessary permissions.

  • Code & Image Scanning:

* Examples: SonarQube, Snyk, Trivy, Aqua Security, OWASP ZAP.

* Impact: Integration points within the pipeline for static analysis (SAST), dynamic analysis (DAST), software composition analysis (SCA), and container image vulnerability scanning.

  • Compliance Requirements:

* Examples: GDPR, HIPAA, PCI-DSS, SOC 2.

* Impact: Dictates logging, auditing, artifact retention policies, and security controls within the pipeline.

3.5. Scalability, Performance & Resource Allocation

The CI/CD system must handle current and future workloads efficiently.

  • Concurrent Builds: How many pipelines need to run simultaneously?
  • Build Duration: Average time for linting, testing, and building.
  • Resource Requirements: CPU, RAM, disk space for build agents/runners.
  • Artifact Storage: Where will build artifacts (binaries, packages, images) be stored, and how much space is needed? (e.g., S3, Azure Blob Storage, JFrog Artifactory, Nexus).
  • Network Bandwidth: For pulling dependencies, pushing artifacts, and deploying.

3.6. Monitoring, Logging & Alerting

Visibility into pipeline health and application performance.

  • CI/CD Pipeline Monitoring:

* Examples: Built-in dashboards (GitHub Actions, GitLab CI), Prometheus/Grafana, ELK Stack.

* Impact: How will pipeline status, failures, and performance metrics be tracked?

  • Application Monitoring:

* Examples: Datadog, New Relic, Prometheus, Grafana, CloudWatch, Azure Monitor, Google Cloud Monitoring.

* Impact: Ensuring deployment stages integrate with application monitoring to validate successful deployments and health checks.

  • Logging:

* Examples: Centralized logging (ELK, Splunk, Sumo Logic, cloud-native solutions).

* Impact: How will pipeline logs and application logs be collected, stored, and analyzed?

  • Alerting:

* Examples: Slack, PagerDuty, email, Microsoft Teams.

* Impact: Defining thresholds and notification channels for pipeline failures or performance degradation.

3.7. Artifact Management

Efficient storage and retrieval of build outputs.

  • Types of Artifacts: Compiled binaries, Docker images, npm packages, Python wheels, configuration files, reports.
  • Artifact Repositories:

* Examples: JFrog Artifactory, Sonatype Nexus, AWS ECR, Azure Container Registry, GitLab Container Registry, S3.

* Impact: Integration with these repositories for versioning, security scanning, and distribution.

  • Retention Policies: How long should artifacts be kept?

3.8. Cost Optimization

Balancing performance with budget.

  • Cloud Runner Costs: For GitHub Actions and GitLab CI, understanding usage limits and pricing models for hosted runners.
  • Self-Hosted Runner Costs: Infrastructure costs for VMs/Kubernetes clusters, operational overhead.
  • Storage Costs: For artifacts, logs, and container images.
  • Data Transfer Costs: Between cloud regions or to/from on-premise.
  • Licensing: For Jenkins plugins, enterprise features, or third-party tools.

3.9. Team Expertise & Organizational Fit

The human element is critical for adoption and maintenance.

  • Team Skills: Familiarity with specific CI/CD platforms, scripting languages (Bash, Python), cloud providers, and container technologies.
  • Existing Processes: How does the new pipeline fit into current development, testing, and release processes?
  • Organizational Culture: Appetite for automation, risk tolerance, and compliance overhead.

3.10. Existing Tooling & Integrations

Leveraging existing investments and streamlining workflows.

  • Version Control: GitHub, GitLab, Bitbucket, Azure DevOps.
  • Issue Tracking: Jira, Azure DevOps Boards, Trello.
  • Communication: Slack, Microsoft Teams.
  • Code Review: GitHub Pull Requests, GitLab Merge Requests.
  • Testing Frameworks: JUnit, Pytest, Jest.
  • Infrastructure as Code (IaC): Terraform, CloudFormation, Ansible.

4. Data Insights & Trends

  • Cloud-Native CI/CD Dominance: GitHub Actions and GitLab CI are increasingly preferred due to their seamless integration with cloud-based SCM, reduced operational overhead, and scalability. (Source: DORA Report, various industry surveys).
  • Containerization as Standard: Docker and Kubernetes are almost universally adopted for packaging and deploying applications, making container image build and push a standard pipeline stage.
  • Shift-Left Security: Integrating security scanning (SAST, DAST, SCA) and secrets management directly into the CI/CD pipeline early in the development lifecycle is a critical trend for preventing vulnerabilities.
  • GitOps Adoption: Using Git as the single source of truth for declarative infrastructure and applications is gaining traction, influencing deployment strategies (e.g., Argo CD, Flux CD).
  • Observability is Key: Comprehensive logging, monitoring, and alerting are no longer optional but essential for rapid troubleshooting and maintaining high availability.
  • AI/ML in DevOps: Emerging trend for intelligent automation, predictive analytics for pipeline failures, and optimized resource allocation.

5. Recommendations

Based on the general "DevOps Pipeline Generator" request, we recommend a strategic approach to infrastructure analysis:

  1. Prioritize Cloud-Native & Managed Services: For most modern applications, leveraging cloud-native CI/CD solutions (GitHub Actions, GitLab CI) reduces operational overhead and provides inherent scalability. For deployment, favor managed services (e.g., AWS EKS/ECS, Azure AKS, GCP GKE) where possible.
  2. Embrace Containerization: Standardize on Docker for application packaging. This ensures consistent environments across development, testing, and production, simplifying pipeline steps.
  3. Implement Robust Secrets Management: Never hardcode credentials. Integrate with a dedicated secrets management solution (e.g., cloud-native options, HashiCorp Vault) and ensure CI/CD runners access them using least-privilege service accounts/roles.
  4. Automate Infrastructure Provisioning: Use Infrastructure as Code (IaC) tools like Terraform or CloudFormation to provision and manage deployment environments. This brings version control and repeatability to your infrastructure.
  5. Build Security into Every Stage: Integrate static analysis, dependency scanning, and container image scanning early in the pipeline. Implement mandatory approval gates for sensitive deployments.
  6. Centralize Logging & Monitoring: Ensure all pipeline activities and deployed application metrics are fed into a centralized observability stack for quick troubleshooting and performance analysis.
  7. Define Clear Artifact Management Policies: Establish clear strategies for storing, versioning, and retaining build artifacts to support rollbacks and compliance.
  8. Start Simple and Iterate: Begin with a basic pipeline for core functionality, then incrementally add advanced features, security scans, and deployment stages as needs evolve.

6. Next Steps

To proceed with generating a detailed and effective CI/CD pipeline configuration, we require specific input regarding your infrastructure needs. Please provide detailed answers to the following questions, aligning with the dimensions outlined above:

  1. Which CI/CD platform is your primary choice? (GitHub Actions, GitLab CI, or Jenkins)

* If Jenkins: Do you have an existing Jenkins instance? What version? What kind of agents (VMs, Kubernetes)?

  2. Describe your application(s):

* What programming languages and frameworks are used?

* Is it a monolith, microservices, serverless, or a combination?

* Is it containerized (Docker)? If so, where is your container registry?

  3. What are your target deployment environments?

* Which cloud provider(s) (AWS, Azure, GCP)?

* Are you deploying to Kubernetes, ECS/Fargate, Serverless, VMs, or on-premise?

* List all environment stages (Dev, Staging, Prod, etc.).

  4. How do you manage secrets and credentials? (e.g., AWS Secrets Manager, Azure Key Vault, HashiCorp Vault, Kubernetes Secrets)
  5. What security scanning tools are you currently using or wish to integrate? (e.g., SonarQube, Snyk, Trivy)
  6. What are your typical build resource requirements? (Approx. CPU/RAM needed for a typical build, number of concurrent builds expected.)
  7. Where do you store build artifacts and container images? (e.g., S3, Artifactory, ECR)
  8. What existing monitoring, logging, and alerting tools do you use? (e.g., Datadog, ELK, Splunk, CloudWatch, Slack)
  9. Are there any specific compliance requirements (e.g., HIPAA, GDPR, PCI-DSS) that the pipeline must adhere to?

Step 2 of 3: CI/CD Pipeline Configurations Generated

This document provides comprehensive, detailed, and professional CI/CD pipeline configurations for three leading platforms: GitHub Actions, GitLab CI, and Jenkins. These configurations are designed to automate the software delivery process, encompassing linting, testing, building, and deployment stages for a typical web application (e.g., a frontend and a backend service).

The examples provided are generic and assume a common setup using Docker for containerization, with placeholders for specific commands, repository names, and cloud provider details. You will need to adapt these configurations to your specific technology stack, repository structure, and deployment targets.


1. Introduction: CI/CD Pipeline Configurations

This deliverable provides ready-to-use templates for your Continuous Integration and Continuous Delivery (CI/CD) pipelines. By implementing these, you can achieve faster, more reliable, and consistent software releases. Each configuration is tailored to the specific syntax and best practices of its respective platform.

Key Objectives of these Pipelines:

  • Automated Quality Assurance: Enforce code quality standards through linting and validate functionality through automated tests.
  • Consistent Builds: Ensure reproducible builds by containerizing applications and standardizing the build process.
  • Streamlined Deployment: Automate the process of deploying your application to various environments.
  • Visibility and Traceability: Provide clear insights into the build and deployment status.

2. Core CI/CD Principles

Before diving into the configurations, it's essential to understand the core stages and principles applied across all platforms:

  • Linting: Analyzes source code to flag programming errors, bugs, stylistic errors, and suspicious constructs. This ensures code quality and adherence to coding standards.
  • Testing: Executes various types of tests (unit, integration, end-to-end) to verify the application's functionality and prevent regressions.
  • Building: Compiles source code, resolves dependencies, and packages the application into an executable or deployable artifact (e.g., Docker image, static files, JAR, WAR).
  • Deployment: The process of releasing the built artifact to a target environment (e.g., development, staging, production). This often involves pushing images to registries, updating services, or uploading static assets.
  • Secrets Management: Securely handling sensitive information (API keys, credentials) required by the pipeline, typically through environment variables managed by the CI/CD platform.
  • Environment Variables: Using variables to parametrize pipeline steps, making configurations reusable across different environments.

3. GitHub Actions Pipeline Configuration

GitHub Actions allows you to automate, customize, and execute your software development workflows directly in your repository. This configuration defines a workflow that runs on push to main and on pull_request events.

File: .github/workflows/main.yml


name: CI/CD Pipeline

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

env:
  # Common environment variables
  PROJECT_NAME: my-webapp
  FRONTEND_DIR: frontend
  BACKEND_DIR: backend
  DOCKER_IMAGE_NAME_BACKEND: ${{ github.repository_owner }}/my-webapp-backend # workflow-level env cannot reference itself, so PROJECT_NAME is inlined
  AWS_REGION: us-east-1 # Example AWS region

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js (for frontend linting)
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'
          cache-dependency-path: '${{ env.FRONTEND_DIR }}/package-lock.json'

      - name: Install frontend dependencies
        run: npm ci
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Run frontend lint
        run: npm run lint
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Setup Python (for backend linting)
        uses: actions/setup-python@v5
        with:
          python-version: '3.9'
          cache: 'pip'
          cache-dependency-path: '${{ env.BACKEND_DIR }}/requirements.txt'

      - name: Install backend dependencies
        run: pip install -r requirements.txt
        working-directory: ${{ env.BACKEND_DIR }}

      - name: Run backend lint (e.g., Black, Flake8)
        run: |
          pip install black flake8
          black --check .
          flake8 .
        working-directory: ${{ env.BACKEND_DIR }}

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # Ensure linting passes before testing
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js (for frontend tests)
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'
          cache-dependency-path: '${{ env.FRONTEND_DIR }}/package-lock.json'

      - name: Install frontend dependencies
        run: npm ci
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Run frontend tests (e.g., Jest)
        run: npm test
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Setup Python (for backend tests)
        uses: actions/setup-python@v5
        with:
          python-version: '3.9'
          cache: 'pip'
          cache-dependency-path: '${{ env.BACKEND_DIR }}/requirements.txt'

      - name: Install backend dependencies
        run: pip install -r requirements.txt
        working-directory: ${{ env.BACKEND_DIR }}

      - name: Run backend tests (e.g., Pytest)
        run: pytest
        working-directory: ${{ env.BACKEND_DIR }}

  build:
    name: Build Artifacts
    runs-on: ubuntu-latest
    needs: test # Ensure tests pass before building
    outputs:
      backend_image_tag: ${{ steps.set-backend-tag.outputs.tag }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      # --- Frontend Build ---
      - name: Setup Node.js (for frontend build)
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'
          cache-dependency-path: '${{ env.FRONTEND_DIR }}/package-lock.json'

      - name: Install frontend dependencies
        run: npm ci
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Build frontend (e.g., React/Vue/Angular)
        run: npm run build
        working-directory: ${{ env.FRONTEND_DIR }}

      - name: Upload frontend build artifacts
        uses: actions/upload-artifact@v4
        with:
          name: frontend-dist
          path: ${{ env.FRONTEND_DIR }}/dist # Adjust path as needed

      # --- Backend Build (Docker) ---
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Login to Docker Hub (or other registry)
        # You might use AWS ECR, GCP GCR, or other private registries
        # For AWS ECR: docker/login-action@v3
        # For GCP GCR: docker/login-action@v3 with service account key
        # For Docker Hub:
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}

      - name: Extract Docker metadata (tags, labels)
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.DOCKER_IMAGE_NAME_BACKEND }}
          tags: |
            type=raw,value=latest,enable={{is_default_branch}}
            type=sha,format=long,prefix=sha-
            type=ref,event=branch

      - name: Build and push backend Docker image
        uses: docker/build-push-action@v5
        with:
          context: ${{ env.BACKEND_DIR }}
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Set backend image tag output
        id: set-backend-tag
        run: echo "tag=${{ steps.meta.outputs.version }}" >> "$GITHUB_OUTPUT" # primary tag computed by metadata-action

  deploy:
    name: Deploy to Environment
    runs-on: ubuntu-latest
    needs: build # Ensure build passes before deploying
    environment: production # Or 'staging', 'development'
    if: github.ref == 'refs/heads/main' # Only deploy main branch pushes
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Download frontend build artifacts
        uses: actions/download-artifact@v4
        with:
          name: frontend-dist
          path: ./frontend-dist

      # --- Frontend Deployment (e.g., to AWS S3 / CloudFront) ---
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Deploy frontend to S3
        run: |
          aws s3 sync ./frontend-dist/ s3://${{ secrets.AWS_S3_BUCKET_NAME_FRONTEND }} --delete
          aws cloudfront create-invalidation --distribution-id ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"
        env:
          AWS_S3_BUCKET_NAME_FRONTEND: ${{ secrets.AWS_S3_BUCKET_NAME_FRONTEND }}
          AWS_CLOUDFRONT_DISTRIBUTION_ID: ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }}

      # --- Backend Deployment (e.g., to AWS ECR/ECS) ---
      - name: Login to AWS ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v2
        # The registry URL (e.g. 123456789012.dkr.ecr.us-east-1.amazonaws.com)
        # is exposed as ${{ steps.login-ecr.outputs.registry }}

      - name: Deploy backend to AWS ECS
        run: |
          # Example: Update ECS service with new image
          # This requires AWS CLI installed and configured, which is done by configure-aws-credentials
          # You'll need to specify your ECS cluster, service, and task definition family
          aws ecs update-service --cluster ${{ secrets.AWS_ECS_CLUSTER_NAME }} \
            --service ${{ secrets.AWS_ECS_SERVICE_NAME }} \
            --force-new-deployment \
            --task-definition $(aws ecs register-task-definition \
              --family ${{ secrets.AWS_ECS_TASK_DEFINITION_FAMILY }} \
              --container-definitions "[
                {
                  \"name\": \"${{ env.PROJECT_NAME }}-backend\",
                  \"image\": \"${{ secrets.AWS_ECR_REGISTRY }}/${{ env.PROJECT_NAME }}-backend:${{ needs.build.outputs.backend_image_tag }}\",
                  \"portMappings\": [ { \"containerPort\": 8000, \"hostPort\": 8000 } ]
                  # Add other container definition properties like environment, secrets, logConfiguration etc.
                }
              ]" | jq -r '.taskDefinition.taskDefinitionArn')
        env:
          AWS_ECS_CLUSTER_NAME: ${{ secrets.AWS_ECS_CLUSTER_NAME }}
          AWS_ECS_SERVICE_NAME: ${{ secrets.AWS_ECS_SERVICE_NAME }}
          AWS_ECS_TASK_DEFINITION_FAMILY: ${{ secrets.AWS_ECS_TASK_DEFINITION_FAMILY }}
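
The nested command substitution above registers a new task definition revision and captures its ARN from the CLI's JSON response. The sketch below shows that extraction against an abridged, placeholder response (the workflow uses `jq -r '.taskDefinition.taskDefinitionArn'`; Python is used here only so the snippet runs without jq installed):

```shell
# Placeholder response shaped like `aws ecs register-task-definition` output.
response='{"taskDefinition":{"taskDefinitionArn":"arn:aws:ecs:us-east-1:123456789012:task-definition/demo-backend:7","family":"demo-backend","revision":7}}'

# Same extraction the workflow performs with jq.
arn=$(echo "$response" | python3 -c "import json,sys; print(json.load(sys.stdin)['taskDefinition']['taskDefinitionArn'])")
echo "$arn"
```

Each `register-task-definition` call creates a new revision (`:7` above), so the ARN passed to `--task-definition` always points at the image tag just built.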

##### Explanation of Stages:

  • lint: Checks code for style and quality issues using npm run lint.
  • test: Runs unit and integration tests using npm test.
  • build: Installs dependencies, executes npm run build, and uploads the resulting dist/ directory as an artifact.
  • deploy:

* needs: build: Ensures this job only runs after a successful build.

* if: github.ref == 'refs/heads/main': Only triggers deployment when changes are pushed directly to the main branch.

* environment: Production: Associates this job with a GitHub Environment, allowing for specific protection rules (e.g., manual approval).

* Downloads the built artifact, configures AWS credentials using GitHub Secrets, syncs the frontend build to the S3 bucket, invalidates the CloudFront cache, and forces a new ECS deployment of the backend with the freshly registered task definition.
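
The job header these bullets describe can be sketched as follows (job, environment, and runner names are illustrative):

```yaml
deploy:
  runs-on: ubuntu-latest
  needs: build                          # run only after a successful build job
  if: github.ref == 'refs/heads/main'   # only on pushes to the main branch
  environment: Production               # gates the job behind Environment rules
  steps:
    - name: Checkout code
      uses: actions/checkout@v4
```

If the `Production` environment has required reviewers configured, GitHub pauses the job here until a reviewer approves it.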

##### Prerequisites:

  1. Repository: Your code must be hosted on GitHub.
  2. Secrets: Configure AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, plus the deployment secrets referenced in the workflow (AWS_S3_BUCKET_NAME_FRONTEND, AWS_CLOUDFRONT_DISTRIBUTION_ID, AWS_ECR_REGISTRY, AWS_ECS_CLUSTER_NAME, AWS_ECS_SERVICE_NAME, AWS_ECS_TASK_DEFINITION_FAMILY), in your GitHub repository secrets (Settings > Secrets and variables > Actions > Repository secrets).
  3. AWS resources: An existing S3 bucket and CloudFront distribution for the frontend, and an ECR registry plus ECS cluster and service for the backend.
  4. package.json scripts: Ensure lint, test, and build scripts are defined in your package.json.
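
The IAM user behind those access keys should be scoped to just what the pipeline needs. A minimal sketch of a least-privilege policy for the S3 sync step (the bucket name is a placeholder; your deployment will also need CloudFront, ECR, and ECS permissions):

```shell
# Write a minimal S3-only deploy policy and sanity-check that it is valid JSON.
cat > deploy-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": [
        "arn:aws:s3:::my-frontend-bucket",
        "arn:aws:s3:::my-frontend-bucket/*"
      ]
    }
  ]
}
EOF
python3 -m json.tool deploy-policy.json > /dev/null && echo "policy JSON is valid"
```

`s3:ListBucket` applies to the bucket ARN while the object actions apply to `bucket/*`; `aws s3 sync --delete` needs all four.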

##### How to Use:

  1. Create a file named main.yml (or any other name) inside the .github/workflows/ directory in your repository's root.
  2. Paste the provided YAML content into this file.
  3. Commit and push the file to your main branch.
  4. Subsequent pushes or pull requests to main will automatically trigger the pipeline.

##### Key Features & Best Practices:

  • actions/setup-node with cache: 'npm': Efficiently caches Node.js dependencies.
  • actions/upload-artifact / actions/download-artifact: Manages build artifacts between jobs.
  • aws-actions/configure-aws-credentials: Securely handles AWS authentication using OIDC or direct secrets.
  • needs keyword: Defines job dependencies for sequential execution.
  • if conditional: Controls job execution based on Git branch.
  • environment: Integrates with GitHub Environments for enhanced deployment control and visibility.
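
For instance, the dependency caching mentioned above is a one-line addition to the setup step (version pinned here only as an example):

```yaml
- name: Set up Node.js
  uses: actions/setup-node@v4
  with:
    node-version: '18'
    cache: 'npm'   # caches ~/.npm, keyed on the lockfile hash
```

With `cache: 'npm'`, setup-node restores the npm download cache whenever package-lock.json is unchanged, so `npm ci` skips re-downloading packages.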

##### Customization Guide:

  • Node.js version: Change node-version: '18' to your desired version.
  • Linter/Test/Build commands: Modify npm run lint, npm test, npm run build to match your project's specific commands.
  • Artifact paths: Adjust path: dist/ to your actual build output directory.
  • AWS Region/Bucket: Update the AWS_REGION environment variable and the AWS_S3_BUCKET_NAME_FRONTEND secret to match your infrastructure.
  • Deployment Target: For other targets (EC2, EKS, Azure, GCP), replace the aws s3 sync step with the appropriate commands and credential configurations.
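
As one illustration of swapping the target, a static frontend could be served from GitHub Pages instead of S3, replacing the AWS steps with the official Pages actions (artifact name carried over from the workflow above):

```yaml
deploy:
  runs-on: ubuntu-latest
  needs: build
  permissions:
    pages: write      # required by deploy-pages
    id-token: write   # OIDC token for the Pages deployment
  environment:
    name: github-pages
  steps:
    - name: Download frontend build artifacts
      uses: actions/download-artifact@v4
      with:
        name: frontend-dist
        path: ./frontend-dist
    - name: Upload Pages artifact
      uses: actions/upload-pages-artifact@v3
      with:
        path: ./frontend-dist
    - name: Deploy to GitHub Pages
      uses: actions/deploy-pages@v4
```

No long-lived cloud credentials are needed in this variant; the deployment authenticates via the job's OIDC token.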