DevOps Pipeline Generator

DevOps Pipeline Configuration Generation

This document provides comprehensive, production-oriented CI/CD pipeline configurations for three leading platforms: GitHub Actions, GitLab CI, and Jenkins. These configurations automate the software development lifecycle across the essential stages of linting, testing, building, and deployment. Each example targets a generic web application (e.g., Node.js based) but can be adapted to other technology stacks.


1. Core CI/CD Pipeline Stages

Regardless of the platform, a robust CI/CD pipeline typically includes the following stages:

  1. Linting: Static code analysis to enforce code style and identify potential issues early.
  2. Testing: Execution of unit, integration, and optionally end-to-end tests to validate functionality.
  3. Building: Compilation of code, dependency resolution, and artifact creation (e.g., Docker images, executables).
  4. Deployment: Staged rollout to various environments (e.g., Development, Staging, Production).

2. GitHub Actions Configuration

GitHub Actions allows you to automate, customize, and execute your software development workflows directly in your repository. Workflows are defined in YAML files (.yml) within the .github/workflows/ directory.

Assumptions:

  • Node.js application.
  • npm run lint for linting.
  • npm test for running tests.
  • npm run build for building static assets (output to build/ directory).
  • Deployment target: Generic cloud provider.

File: .github/workflows/main.yml
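The workflow itself can be sketched minimally under the assumptions above; action versions and script names may need adjusting for your project, and a fully annotated variant appears in section 3.1 of the deliverable below.

```yaml
name: CI

on:
  push:
    branches: [main]
  pull_request:

jobs:
  ci:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
      # Install, lint, test, and build using the package.json scripts assumed above
      - run: npm ci
      - run: npm run lint
      - run: npm test
      - run: npm run build
      # Keep the static build output as a downloadable artifact
      - uses: actions/upload-artifact@v4
        with:
          name: build
          path: build/
```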
Key Considerations for GitHub Actions:

  • Secrets: Store sensitive information (e.g., AWS credentials, API keys) as repository or organization secrets. Access them via ${{ secrets.YOUR_SECRET_NAME }}.
  • Environments: Use GitHub Environments for better management of deployment targets, approvals, and environment-specific secrets.
  • Reusable Workflows: For complex pipelines or common steps, consider creating reusable workflows to promote DRY principles.
  • Matrix Strategies: Run jobs across multiple versions of a language, OS, or other variables.

3. GitLab CI Configuration

GitLab CI/CD is an integrated part of GitLab, enabling continuous integration, delivery, and deployment directly from your GitLab repository. Pipelines are defined in a .gitlab-ci.yml file at the root of your repository.

Assumptions:

  • Node.js application.
  • npm run lint for linting.
  • npm test for running tests.
  • npm run build for building static assets (output to build/ directory).
  • Deployment target: Generic cloud provider.

File: .gitlab-ci.yml
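A minimal sketch of this file under the assumptions above (stage names and scripts are illustrative; a fuller version with Docker builds and deployments appears in section 3.2 of the deliverable below):

```yaml
image: node:20

stages:
  - lint
  - test
  - build

lint:
  stage: lint
  script:
    - npm ci
    - npm run lint

test:
  stage: test
  script:
    - npm ci
    - npm test

build:
  stage: build
  script:
    - npm ci
    - npm run build
  artifacts:
    paths:
      - build/ # static assets produced by npm run build
```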


Infrastructure Needs Analysis for DevOps Pipeline Generation

Workflow Step: gemini → analyze_infrastructure_needs

Description: Generate complete CI/CD pipeline configurations for GitHub Actions, GitLab CI, or Jenkins with testing, linting, building, and deployment stages.

1. Introduction

This document presents a comprehensive analysis of the infrastructure needs essential for establishing robust and efficient CI/CD pipelines. The objective is to lay the groundwork for generating effective pipeline configurations across GitHub Actions, GitLab CI, or Jenkins, encompassing critical stages such as linting, testing, building, and deployment. A well-defined infrastructure strategy is paramount to ensure scalability, security, reliability, and cost-effectiveness of your DevOps initiatives.

2. Current Landscape & User Input Summary

The user has requested the generation of detailed CI/CD pipeline configurations, specifying a choice between GitHub Actions, GitLab CI, or Jenkins. This indicates a primary focus on modern, cloud-native CI/CD solutions or a powerful, extensible on-premise/hybrid option. The pipeline must cover the full lifecycle, from code commit to production deployment.

3. Detailed Infrastructure Needs Analysis

To support a complete CI/CD pipeline, several key infrastructure components and considerations are required.

3.1. CI/CD Platform & Runners

The core of the pipeline relies on the chosen CI/CD platform and its execution agents.

  • GitHub Actions:

* Runners: GitHub-hosted runners (Ubuntu, Windows, macOS) provide a managed, scalable solution. For specific needs (e.g., custom hardware, specific OS versions, network access to private resources), self-hosted runners are required.

* Scalability: GitHub-hosted runners scale automatically. Self-hosted runners require manual scaling or integration with cloud auto-scaling groups.

* Connectivity: Self-hosted runners can operate within private networks, requiring secure access to internal resources.

  • GitLab CI:

* Runners: GitLab.com shared runners offer convenience. Private/specific runners (Docker executor, Kubernetes executor, Shell executor) are common for tailored environments, security, or performance.

* Executors: Docker executor (runs jobs in isolated Docker containers), Kubernetes executor (dynamically provisions pods in a Kubernetes cluster), Shell executor (runs jobs directly on the host machine).

* Scalability: Kubernetes executor offers dynamic scaling. Other self-hosted runners require infrastructure management.

* Connectivity: Private runners are typically deployed within an organization's network, ensuring secure access to internal systems.

  • Jenkins:

* Architecture: Controller-agent architecture (formerly "master-agent"). The Jenkins controller orchestrates jobs, while agents (nodes) execute them.

* Agents: Can be physical machines, VMs (EC2, Azure VMs, GCP Compute Engine), Docker containers, or Kubernetes pods.

* Scalability: Requires careful planning for agent provisioning and management. Cloud plugins (e.g., EC2 plugin, Kubernetes plugin) enable dynamic agent provisioning.

* Connectivity: Agents need secure network access to the Jenkins master and target deployment environments.

  • Key Considerations:

* Operating System: Linux, Windows, macOS requirements for builds.

* Compute Resources: CPU, RAM, and disk space for build and test processes.

* Network Access: Ingress/egress rules, proxy configurations, and private link requirements.

* Cost: Managed runners (GitHub/GitLab) incur usage costs; self-hosted runners incur infrastructure costs.
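To make the runner-selection trade-off concrete, a GitHub Actions job can be routed to self-hosted capacity purely by labels. The labels below are the defaults applied at runner registration; custom labels work the same way, and the build command is a placeholder.

```yaml
jobs:
  build:
    # Route this job to a self-hosted runner by label instead of a GitHub-hosted image;
    # 'self-hosted', 'linux', and 'x64' are default registration labels.
    runs-on: [self-hosted, linux, x64]
    steps:
      - uses: actions/checkout@v4
      - run: make build # hypothetical build command
```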

3.2. Build & Artifact Management

Efficient handling of build dependencies, outputs, and container images is crucial.

  • Build Tools & Dependencies:

* Languages/Frameworks: Java (Maven, Gradle), Node.js (npm, Yarn), Python (pip), Go, .NET, Ruby, etc.

* Dependency Caching: Implementing caching mechanisms (e.g., Maven local repository, npm cache, Docker layer caching) to speed up builds.

  • Artifact Storage:

* Binaries/Packages: Storing compiled code, libraries, and other build outputs.

* Cloud Storage: AWS S3, Azure Blob Storage, Google Cloud Storage (cost-effective, highly available).

* Dedicated Artifact Repositories: JFrog Artifactory, Sonatype Nexus (advanced features like proxying, security scanning, metadata).

* Platform-Specific: GitHub Packages, GitLab Package Registry.

* Container Registry:

* Docker Images: Storing Docker images for deployment.

* Cloud Registries: AWS ECR, Azure Container Registry (ACR), Google Container Registry (GCR).

* Platform-Specific: GitHub Container Registry, GitLab Container Registry.

* Public/Private: Docker Hub (public/private repositories).

  • Key Considerations:

* Storage Costs & Retention Policies: Managing storage consumption and defining artifact lifecycle.

* Security: Access control (IAM roles, service accounts), vulnerability scanning of container images.

* Global Distribution: For geographically dispersed teams/deployments, consider CDN integration or multi-region replication.
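The dependency-caching mechanism mentioned above can be sketched with the official actions/cache action in GitHub Actions; the lockfile-hash key shown is one common convention, not the only option.

```yaml
- name: Cache npm dependencies
  uses: actions/cache@v4
  with:
    # Cache npm's download directory, keyed on the lockfile so any dependency
    # change produces a fresh cache entry; restore-keys allows partial reuse.
    path: ~/.npm
    key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
    restore-keys: |
      ${{ runner.os }}-node-
```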

3.3. Deployment Environments

The infrastructure where the application will be deployed and run.

  • Environment Types:

* Development (Dev): For early testing, often ephemeral.

* Staging/UAT: Mirroring production, for integration and user acceptance testing.

* Production (Prod): Live environment, requiring high availability and resilience.

  • Target Infrastructure:

* Virtual Machines (VMs): AWS EC2, Azure VMs, GCP Compute Engine. Requires robust configuration management (Ansible, Chef, Puppet) or image-based deployments (AMI, VM Images).

* Container Orchestration: Kubernetes (AWS EKS, Azure AKS, GCP GKE, OpenShift). Requires Kubernetes cluster management, Helm charts, and manifest files.

* Serverless: AWS Lambda, Azure Functions, GCP Cloud Functions. Requires function packaging and deployment.

* Platform-as-a-Service (PaaS): AWS Elastic Beanstalk, Azure App Service, Heroku. Simplified deployment but less control.

  • Infrastructure as Code (IaC):

* Tools: Terraform, AWS CloudFormation, Azure Resource Manager (ARM) Templates, Pulumi.

* Benefits: Automated provisioning, version control, consistency, and reduced manual errors.

  • Key Considerations:

* Scalability & High Availability: Designing environments to handle traffic spikes and ensure continuous operation.

* Network Segmentation: Isolating environments for security and performance.

* Monitoring & Alerting: Integrating with monitoring solutions to track application health.

* Rollback Strategy: Mechanisms to revert to previous stable versions.

3.4. Security & Secrets Management

Protecting sensitive information and ensuring pipeline integrity.

  • Secrets Management:

* Pipeline Secrets: API keys, database credentials, environment variables, SSH keys.

* Tools: GitHub Secrets, GitLab CI/CD Variables (masked/protected), Jenkins Credentials, HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, GCP Secret Manager.

* Best Practices: Least privilege, regular rotation, audit trails, encryption at rest and in transit.

  • Access Control (IAM):

* Principle of Least Privilege: Granting only necessary permissions to CI/CD pipelines and deployment agents.

* Service Accounts/Roles: Using cloud provider IAM roles or service accounts for pipeline authentication to cloud resources.

  • Vulnerability Scanning:

* Static Application Security Testing (SAST): Scanning source code for vulnerabilities (e.g., SonarQube, Bandit).

* Dynamic Application Security Testing (DAST): Scanning running applications for vulnerabilities (e.g., OWASP ZAP).

* Container Image Scanning: Identifying vulnerabilities in Docker images (e.g., Trivy, Clair, Anchore).

* Dependency Scanning: Checking third-party libraries for known vulnerabilities.

  • Key Considerations:

* Compliance: Meeting industry standards (PCI DSS, HIPAA, GDPR).

* Auditability: Logging all access and changes to sensitive data.
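As one hedged example of wiring image scanning into a pipeline, a GitLab CI job can run Trivy against a freshly built image; the image name and stage below are placeholders to adapt.

```yaml
container_scan:
  stage: test
  image:
    name: aquasec/trivy:latest
    entrypoint: [""] # override the image's trivy entrypoint so GitLab can run a shell
  script:
    # Fail the job when HIGH or CRITICAL vulnerabilities are found;
    # 'my-app:latest' is a placeholder for the image built earlier in the pipeline.
    - trivy image --severity HIGH,CRITICAL --exit-code 1 my-app:latest
```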

3.5. Monitoring & Logging Integration

Gaining visibility into pipeline execution and application performance.

  • Pipeline Monitoring:

* Built-in Dashboards: GitHub Actions, GitLab CI, Jenkins provide native dashboards for job status.

* Custom Alerts: Configuring notifications for pipeline failures, long-running jobs, or specific events.

  • Application Monitoring:

* Metrics: Prometheus, Grafana, DataDog, New Relic, CloudWatch, Azure Monitor, GCP Operations.

* Logs: Centralized logging solutions (ELK Stack: Elasticsearch, Logstash, Kibana; Splunk; Sumo Logic; CloudWatch Logs; Azure Monitor Logs; GCP Cloud Logging).

* Tracing: Distributed tracing (Jaeger, Zipkin, AWS X-Ray) for microservices architectures.

  • Key Considerations:

* Centralized Observability: Aggregating logs, metrics, and traces for a unified view.

* Alerting Strategy: Defining thresholds and notification channels (Slack, PagerDuty, email).

* Cost Management: Monitoring data ingestion and retention costs.

3.6. Networking & Connectivity

Ensuring secure and efficient communication between pipeline components and target environments.

  • Virtual Private Clouds (VPCs)/Virtual Networks (VNets):

* Segmentation: Isolating CI/CD infrastructure and deployment targets into secure network segments.

* Subnets: Public and private subnets for controlled access.

  • Security Groups/Network ACLs:

* Firewall Rules: Controlling ingress and egress traffic at the instance and subnet levels.

* Least Privilege: Allowing only necessary ports and protocols.

  • Private Connectivity:

* VPN/Direct Connect/ExpressRoute: Secure connections between on-premises infrastructure and cloud environments.

* VPC Peering/Private Link: Secure communication between different VPCs or cloud services.

  • Key Considerations:

* DNS Resolution: Proper DNS configuration for internal and external services.


Key Considerations for GitLab CI:

  • Stages: Define the order of operations for your pipeline.
  • Jobs: Individual tasks that run in a stage. They can run in parallel by default within a stage.
  • rules / only / except: Control when jobs run based on branches, tags, or other variables. rules is the more modern and flexible approach.
  • Variables: Use GitLab CI/CD variables (project/group level) for sensitive data or environment-specific values. Mark them as "protected" for protected branches and "masked" to hide their values in job logs.
  • Artifacts: Store files generated by a job (e.g., build output) to be passed to subsequent jobs or downloaded by users.
  • Services: Easily link other Docker containers (e.g., databases) to your job for testing.
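The rules keyword described above can be illustrated with a small sketch; deploy.sh is a placeholder script, and the conditions are examples of a common pattern.

```yaml
deploy_job:
  stage: deploy
  script:
    - ./deploy.sh # placeholder deployment script
  rules:
    - if: $CI_COMMIT_TAG # tagged releases require a manual gate
      when: manual
    - if: $CI_COMMIT_BRANCH == "main" # runs automatically on main
    - when: never # skipped for every other ref
```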

4. Jenkins Pipeline Configuration

Jenkins is a powerful open-source automation server that supports a wide range of CI/CD needs. Modern Jenkins pipelines are defined as code in a Jenkinsfile stored at the root of the repository, using either Declarative or Scripted syntax; the Declarative syntax used in this document is recommended for most projects.


Deliverable: Comprehensive CI/CD Pipeline Configurations

This document provides a detailed overview and structured configurations for your Continuous Integration/Continuous Delivery (CI/CD) pipelines, tailored for GitHub Actions, GitLab CI, and Jenkins. These configurations are designed to be robust, extensible, and adhere to modern DevOps best practices, encompassing linting, testing, building, and deployment stages.


1. Introduction

You have requested a comprehensive set of CI/CD pipeline configurations. This deliverable fulfills that request by providing ready-to-use templates and detailed explanations for implementing a robust CI/CD workflow across popular platforms. The aim is to accelerate your development lifecycle, ensure code quality, and automate reliable deployments.

Each generated configuration is designed with the following principles in mind:

  • Automation: Minimize manual intervention at every stage.
  • Consistency: Ensure reproducible builds and deployments.
  • Feedback: Provide rapid feedback on code changes.
  • Scalability: Support growing projects and complex architectures.
  • Security: Incorporate best practices for secret management and secure deployments.

2. Overview of Generated Pipeline Configurations

The generated pipelines are structured to provide a clear, sequential flow from code commit to production deployment. They abstract common operations into reusable stages, making them easy to understand, maintain, and extend.

Core Stages Included:

  1. Linting: Static code analysis to enforce code style and identify potential issues early.
  2. Testing: Execution of unit, integration, and optionally end-to-end tests to validate functionality.
  3. Building: Compilation of code, dependency resolution, artifact creation (e.g., Docker images, executables).
  4. Deployment: Staged rollout to various environments (e.g., Development, Staging, Production).

Supported Platforms:

  • GitHub Actions: For projects hosted on GitHub.
  • GitLab CI: For projects hosted on GitLab.
  • Jenkins Pipeline (Declarative): For Jenkins servers.

3. Detailed Pipeline Configurations by Platform

Below are the detailed structures for each platform, outlining the YAML or Groovy configurations for each stage.

3.1 GitHub Actions

File Location: .github/workflows/main.yml (or a similar descriptive name)

Key Concepts:

  • Workflows: Automated processes composed of one or more jobs.
  • Jobs: Sets of steps that execute on the same runner. Can run in parallel or sequentially.
  • Steps: Individual tasks within a job (e.g., running a script, using an action).
  • Actions: Reusable pieces of code from the GitHub Marketplace or custom scripts.

Example Structure (.github/workflows/main.yml):


name: CI/CD Pipeline

on:
  push:
    branches:
      - main
      - develop
  pull_request:
    branches:
      - main
      - develop

env:
  DOCKER_IMAGE_NAME: my-app
  AWS_REGION: us-east-1 # Example for AWS deployment

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Node.js (example for JS/TS project)
        uses: actions/setup-node@v4
        with:
          node-version: '20'
      - name: Install dependencies
        run: npm install
      - name: Run Linters
        run: npm run lint # Assumes a 'lint' script in package.json
      # Add steps for other linters (e.g., Python black, gofmt, etc.)

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # Ensures linting passes before testing
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up environment (e.g., Python, Java, Node.js)
        # Use appropriate setup action for your language/runtime
        uses: actions/setup-python@v5 # Example for Python
        with:
          python-version: '3.9'
      - name: Install dependencies
        run: pip install -r requirements.txt # Example for Python
      - name: Run Unit Tests
        run: pytest # Example for Python
      - name: Run Integration Tests
        run: pytest integration_tests/
      - name: Upload Test Results (optional)
        uses: actions/upload-artifact@v4
        with:
          name: test-results
          path: junit.xml # Or other test report format

  build:
    name: Build Application
    runs-on: ubuntu-latest
    needs: test # Ensures tests pass before building
    outputs:
      image_tag: ${{ steps.set_tag.outputs.tag }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Log in to Docker Hub (or other registry)
        uses: docker/login-action@v3
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_PASSWORD }}
      - name: Get current commit SHA as image tag
        id: set_tag
        run: echo "tag=$(echo ${{ github.sha }} | cut -c1-7)" >> $GITHUB_OUTPUT
      - name: Build Docker image
        run: |
          docker build -t ${{ env.DOCKER_IMAGE_NAME }}:${{ steps.set_tag.outputs.tag }} .
          docker tag ${{ env.DOCKER_IMAGE_NAME }}:${{ steps.set_tag.outputs.tag }} ${{ env.DOCKER_IMAGE_NAME }}:latest
      - name: Push Docker image
        run: |
          docker push ${{ env.DOCKER_IMAGE_NAME }}:${{ steps.set_tag.outputs.tag }}
          docker push ${{ env.DOCKER_IMAGE_NAME }}:latest
      - name: Upload build artifact (optional, e.g., JAR, WAR, binary)
        uses: actions/upload-artifact@v4
        with:
          name: application-artifact
          path: dist/

  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build # Ensures build passes before deployment
    environment: Staging # GitHub Environments for protection rules
    if: github.ref == 'refs/heads/develop' # Deploy develop branch to staging
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Configure AWS Credentials (example)
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}
      - name: Deploy to Staging Environment
        # Example: Update ECS service, deploy to Kubernetes, upload to S3, etc.
        run: |
          # Replace with your actual deployment command
          echo "Deploying ${{ needs.build.outputs.image_tag }} to Staging..."
          aws ecs update-service --cluster my-ecs-cluster-staging --service my-app-staging --force-new-deployment --task-definition my-app-staging-task-def:${{ needs.build.outputs.image_tag }}
          echo "Deployment to Staging complete."

  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: build # Staging runs only on develop, so production (main) must depend directly on build; this also exposes needs.build.outputs.image_tag used below
    environment: Production # GitHub Environments for protection rules and approvals
    if: github.ref == 'refs/heads/main' # Deploy main branch to production
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Configure AWS Credentials (example)
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}
      - name: Deploy to Production Environment
        # Example: Update ECS service, deploy to Kubernetes, upload to S3, etc.
        run: |
          # Replace with your actual deployment command
          echo "Deploying ${{ needs.build.outputs.image_tag }} to Production..."
          aws ecs update-service --cluster my-ecs-cluster-prod --service my-app-prod --force-new-deployment --task-definition my-app-prod-task-def:${{ needs.build.outputs.image_tag }}
          echo "Deployment to Production complete."

3.2 GitLab CI

File Location: .gitlab-ci.yml

Key Concepts:

  • Stages: Define the order of jobs. Jobs in the same stage run in parallel.
  • Jobs: Individual tasks executed by a runner.
  • Keywords: image, script, before_script, after_script, only/except, rules, artifacts, cache, variables.

Example Structure (.gitlab-ci.yml):


image: docker:latest # Default image for all jobs, can be overridden

variables:
  DOCKER_HOST: tcp://docker:2375 # Required for 'docker build' within Docker
  DOCKER_TLS_CERTDIR: "" # Disable TLS for Docker-in-Docker
  DOCKER_IMAGE_NAME: $CI_REGISTRY_IMAGE # GitLab's built-in registry variable
  AWS_REGION: us-east-1 # Example for AWS deployment

stages:
  - lint
  - test
  - build
  - deploy-staging
  - deploy-production

.docker_build_template: &docker_build_definition
  image: docker:latest
  services:
    - docker:dind # Docker in Docker for building images
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY

lint_job:
  stage: lint
  image: node:20 # Example for Node.js linter
  script:
    - npm install
    - npm run lint # Assumes a 'lint' script in package.json
  # Add scripts for other linters (e.g., Python black, gofmt, etc.)
  rules:
    - if: $CI_COMMIT_BRANCH

test_job:
  stage: test
  image: python:3.9 # Example for Python tests
  script:
    - pip install -r requirements.txt
    - pytest
    - pytest integration_tests/
  artifacts:
    when: always
    reports:
      junit: junit.xml # Generate JUnit XML report
  rules:
    - if: $CI_COMMIT_BRANCH

build_job:
  <<: *docker_build_definition # Use the Docker build template
  stage: build
  script:
    - docker build -t $DOCKER_IMAGE_NAME:$CI_COMMIT_SHORT_SHA .
    - docker tag $DOCKER_IMAGE_NAME:$CI_COMMIT_SHORT_SHA $DOCKER_IMAGE_NAME:latest
    - docker push $DOCKER_IMAGE_NAME:$CI_COMMIT_SHORT_SHA
    - docker push $DOCKER_IMAGE_NAME:latest
  artifacts:
    paths:
      - application-artifact/ # If you have non-Docker build artifacts
  rules:
    - if: $CI_COMMIT_BRANCH

deploy_staging_job:
  stage: deploy-staging
  image: amazon/aws-cli:latest # Or other cloud CLI image
  script:
    - aws configure set default.region $AWS_REGION
    - aws ecs update-service --cluster my-ecs-cluster-staging --service my-app-staging --force-new-deployment --task-definition my-app-staging-task-def:$CI_COMMIT_SHORT_SHA
    - echo "Deployment to Staging complete."
  environment:
    name: staging
    url: https://staging.example.com
  rules:
    - if: $CI_COMMIT_BRANCH == "develop"

deploy_production_job:
  stage: deploy-production
  image: amazon/aws-cli:latest # Or other cloud CLI image
  script:
    - aws configure set default.region $AWS_REGION
    - aws ecs update-service --cluster my-ecs-cluster-prod --service my-app-prod --force-new-deployment --task-definition my-app-prod-task-def:$CI_COMMIT_SHORT_SHA
    - echo "Deployment to Production complete."
  environment:
    name: production
    url: https://production.example.com
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
      when: manual # Requires manual approval for production deployment

3.3 Jenkins Pipeline (Declarative)

File Location: Jenkinsfile at the root of your repository.

Key Concepts:

  • Pipeline: A user-defined block that contains the entire build process definition.
  • Agent: Specifies where the pipeline or an individual stage executes (e.g., any available node, a labeled node, or a Docker container).
  • Stages / Steps: stages groups the work into named phases (Lint, Test, Build, Deploy); each stage runs a sequence of steps (shell commands or plugin invocations).
  • post: Defines actions that run after the pipeline or a stage completes (e.g., notifications on failure).
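A minimal Declarative Jenkinsfile sketch following the same lint → test → build → deploy flow as the examples above; the image name, shell commands, and branch names are assumptions to adapt.

```groovy
pipeline {
    agent any

    environment {
        DOCKER_IMAGE_NAME = 'my-app' // placeholder image name
    }

    stages {
        stage('Lint') {
            steps { sh 'npm ci && npm run lint' }
        }
        stage('Test') {
            steps { sh 'npm test' }
        }
        stage('Build') {
            steps {
                // Tag the image with the short commit SHA
                sh 'docker build -t $DOCKER_IMAGE_NAME:$(git rev-parse --short HEAD) .'
            }
        }
        stage('Deploy to Staging') {
            when { branch 'develop' }
            steps { sh 'echo "Deploying to staging..."' } // replace with real deploy command
        }
        stage('Deploy to Production') {
            when { branch 'main' }
            input { message 'Deploy to production?' } // manual approval gate
            steps { sh 'echo "Deploying to production..."' }
        }
    }

    post {
        failure { echo 'Pipeline failed.' }
    }
}
```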
## Setup ```bash python3 -m venv .venv source .venv/bin/activate pip install -r requirements.txt ``` ## Run ```bash python main.py ``` "); zip.file(folder+".gitignore",".venv/ __pycache__/ *.pyc .env .DS_Store "); } /* --- Node.js --- */ function buildNode(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^```[w]* ?/m,"").replace(/ ?```$/m,"").trim(); var depMap={"mongoose":"^8.0.0","dotenv":"^16.4.5","axios":"^1.7.9","cors":"^2.8.5","bcryptjs":"^2.4.3","jsonwebtoken":"^9.0.2","socket.io":"^4.7.4","uuid":"^9.0.1","zod":"^3.22.4","express":"^4.18.2"}; var deps={}; Object.keys(depMap).forEach(function(k){if(src.indexOf(k)>=0)deps[k]=depMap[k];}); if(!deps["express"])deps["express"]="^4.18.2"; var pkgJson=JSON.stringify({"name":pn,"version":"1.0.0","main":"src/index.js","scripts":{"start":"node src/index.js","dev":"nodemon src/index.js"},"dependencies":deps,"devDependencies":{"nodemon":"^3.0.3"}},null,2)+" "; zip.file(folder+"package.json",pkgJson); var fallback="const express=require("express"); const app=express(); app.use(express.json()); app.get("/",(req,res)=>{ res.json({message:""+title+" API"}); }); const PORT=process.env.PORT||3000; app.listen(PORT,()=>console.log("Server on port "+PORT)); "; zip.file(folder+"src/index.js",src||fallback); zip.file(folder+".env.example","PORT=3000 "); zip.file(folder+".gitignore","node_modules/ .env .DS_Store "); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. 
## Setup ```bash npm install ``` ## Run ```bash npm run dev ``` "); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:" "+title+" "+code+" "; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */ *{margin:0;padding:0;box-sizing:border-box} body{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e} "); zip.file(folder+"script.js","/* "+title+" — scripts */ "); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Open Double-click `index.html` in your browser. Or serve locally: ```bash npx serve . # or python3 -m http.server 3000 ``` "); zip.file(folder+".gitignore",".DS_Store node_modules/ .env "); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ 
buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/**(.+?)**/g,"$1"); hc=hc.replace(/ {2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. Files: - "+app+".md (Markdown) - "+app+".html (styled HTML) "); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); }function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}