DevOps Pipeline Generator

This document provides professional, ready-to-use CI/CD pipeline configurations for three leading platforms: GitHub Actions, GitLab CI, and Jenkins. Each configuration template includes the common stages of linting, testing, building, and deployment, and is designed to adapt to a variety of project types and deployment targets.


DevOps Pipeline Generator: Detailed CI/CD Configurations

1. Introduction

This deliverable provides ready-to-use, robust CI/CD pipeline configurations. These templates are designed to accelerate your development workflow by automating the process of code quality checks, testing, building artifacts, and deploying your applications. We cover the most popular platforms: GitHub Actions, GitLab CI, and Jenkins, demonstrating best practices for each.

2. Core CI/CD Pipeline Stages

Regardless of the platform, a robust CI/CD pipeline typically follows a sequence of stages (linting, testing, building, and deploying) to ensure code quality, functionality, and successful delivery.

3. GitHub Actions Configuration (Example: Node.js Web Application)

GitHub Actions provides a powerful and flexible way to automate workflows directly within your GitHub repository. Workflows are defined in YAML files (.yml) stored in the .github/workflows/ directory.

This example demonstrates a pipeline for a Node.js application, including linting with ESLint, testing with Jest, building a production-ready package, and deploying to an AWS S3 bucket (for static sites or frontend assets).

File: .github/workflows/node-ci-cd.yml
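A minimal sketch of such a workflow is shown below. It assumes `lint`, `test`, and `build` scripts in `package.json`, a build output directory of `dist/`, and repository secrets `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `S3_BUCKET` (all placeholder names to adapt to your project):

```yaml
name: Node CI/CD

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  lint-test-build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '18.x'
          cache: 'npm'
      - run: npm ci
      - run: npm run lint   # ESLint
      - run: npm test       # Jest
      - run: npm run build  # emits ./dist
      - uses: actions/upload-artifact@v4
        with:
          name: dist
          path: dist

  deploy:
    runs-on: ubuntu-latest
    needs: lint-test-build
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    environment: production
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: dist
          path: dist
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1  # replace with your region
      # Sync the built assets to the S3 bucket (static site / frontend hosting)
      - run: aws s3 sync dist "s3://${{ secrets.S3_BUCKET }}" --delete
```

Splitting build and deploy into separate jobs lets the `environment: production` gate (and any required reviewers configured on it) apply only to the deployment step.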

**Key GitHub Actions Concepts:**

*   **`on`**: Defines when the workflow runs (e.g., `push`, `pull_request`, `workflow_dispatch`).
*   **`jobs`**: A workflow consists of one or more jobs. Each job runs on a fresh instance of the `runs-on` operating system.
*   **`steps`**: A sequence of tasks within a job. Steps can run commands (`run`) or use pre-built actions (`uses`).
*   **`uses`**: References a reusable action from the GitHub Marketplace or a local path.
*   **`secrets`**: Securely store sensitive information (e.g., API keys, credentials) that are injected into your workflow at runtime. Access via `${{ secrets.NAME }}`.
*   **`environment`**: Link jobs to specific GitHub Environments for better control over deployments (e.g., manual approvals, environment-specific secrets).
*   **`needs`**: Specifies job dependencies, ensuring jobs run in the correct order.
*   **`if`**: Conditionally executes a step or job based on an expression.
*   **`outputs`**: Allows jobs to pass data to subsequent jobs.
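For example, `needs` and `outputs` can be combined so one job passes data to the next (the job names, step id, and version string here are illustrative):

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    outputs:
      version: ${{ steps.ver.outputs.version }}
    steps:
      # Write a key=value pair to GITHUB_OUTPUT to expose it as a step output
      - id: ver
        run: echo "version=1.2.3" >> "$GITHUB_OUTPUT"

  deploy:
    runs-on: ubuntu-latest
    needs: build  # waits for 'build' and gains access to its outputs
    if: github.ref == 'refs/heads/main'
    steps:
      - run: echo "Deploying version ${{ needs.build.outputs.version }}"
```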

4. GitLab CI Configuration (Example: Python Web Application with Docker)

GitLab CI/CD is tightly integrated with GitLab repositories, using a `.gitlab-ci.yml` file at the root of your project. It's highly configurable and supports Docker executors, making it ideal for containerized applications.

This example demonstrates a pipeline for a Python application, including linting with Black/Flake8, testing with pytest, building a Docker image, and pushing it to the GitLab Container Registry, then deploying to a Kubernetes cluster.

**File:** `.gitlab-ci.yml`
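A minimal sketch of such a pipeline is shown below. It assumes a `requirements.txt` and a `Dockerfile` at the project root, a Kubernetes deployment named `your-app` reachable from the runner (e.g., via a configured GitLab agent or `KUBECONFIG` variable), and uses GitLab's predefined registry variables (all application names are placeholders):

```yaml
stages:
  - lint
  - test
  - build
  - deploy

variables:
  IMAGE_TAG: $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA

default:
  image: python:3.12

lint:
  stage: lint
  before_script:
    - pip install black flake8
  script:
    - black --check .
    - flake8 .

test:
  stage: test
  before_script:
    - pip install -r requirements.txt pytest
  script:
    - pytest

build:
  stage: build
  image: docker:27
  services:
    - docker:27-dind  # Docker-in-Docker service for image builds
  script:
    - echo "$CI_REGISTRY_PASSWORD" | docker login -u "$CI_REGISTRY_USER" --password-stdin "$CI_REGISTRY"
    - docker build -t "$IMAGE_TAG" .
    - docker push "$IMAGE_TAG"

deploy:
  stage: deploy
  image:
    name: bitnami/kubectl:latest
    entrypoint: [""]  # override entrypoint so the runner can execute the script
  script:
    - kubectl set image deployment/your-app app="$IMAGE_TAG"
  environment:
    name: production
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
      when: manual  # require a manual trigger for production
```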


Initial Infrastructure Needs Analysis for DevOps Pipeline Generation

This document outlines the foundational infrastructure and operational considerations critical for designing a robust, efficient, and secure CI/CD pipeline. This initial analysis serves as the blueprint for generating tailored pipeline configurations for GitHub Actions, GitLab CI, or Jenkins.

Our goal in this step is to understand your unique environment, application landscape, and strategic objectives to ensure the generated pipeline perfectly aligns with your development and deployment workflows.

1. Key Infrastructure Dimensions for CI/CD Pipeline Design

To generate an effective CI/CD pipeline, we must analyze several interconnected dimensions of your infrastructure and operational needs.

1.1. CI/CD Platform Selection

The choice of CI/CD platform significantly influences pipeline design, integration capabilities, and operational overhead.

  • GitHub Actions:

* Strengths: Deep integration with GitHub repositories, YAML-based workflows, extensive marketplace of actions, native support for matrix builds, easy to get started for GitHub users.

* Considerations: Hosted runners can incur costs for large teams/usage, less customizable than Jenkins for highly complex, legacy environments.

  • GitLab CI:

* Strengths: Seamlessly integrated into GitLab, powerful YAML syntax, built-in container registry, Auto DevOps features, robust for monorepos, self-hosted or SaaS options.

* Considerations: Tightly coupled with GitLab ecosystem, potential learning curve for new users.

  • Jenkins:

* Strengths: Highly extensible via a vast plugin ecosystem, supports complex workflows (Jenkinsfile), self-hosted for maximum control, suitable for hybrid/on-premise environments.

* Considerations: Requires more operational overhead (maintenance, scaling, security patching), steeper learning curve, configuration can become complex.

1.2. Application Architecture & Technology Stack

The nature of your application dictates specific build, test, and deployment steps.

  • Programming Languages & Frameworks: (e.g., Python/Django/Flask, Node.js/React/Angular/Vue, Java/Spring Boot, Go, .NET, PHP/Laravel, Ruby/Rails). This determines required build tools (npm, yarn, Maven, Gradle, pip, Poetry, Go Modules, Composer, Bundler).
  • Application Type:

* Monolith: Single codebase, potentially simpler deployment but larger build times.

* Microservices: Distributed architecture, requires independent pipelines or monorepo strategies, robust containerization.

* Serverless: (AWS Lambda, Azure Functions, Google Cloud Functions) Specific deployment tools (Serverless Framework, SAM CLI, Zappa).

* Static Websites/SPAs: Focus on build, linting, testing, and deployment to CDNs/object storage.

  • Containerization: Use of Docker, Docker Compose, Kubernetes. This implies building Docker images, pushing to registries, and orchestrating deployments.

1.3. Target Deployment Environment & Strategy

Where and how your application is deployed is central to pipeline design.

  • Cloud Provider(s): AWS, Azure, Google Cloud Platform (GCP), DigitalOcean, Heroku, Vercel, Netlify, On-premise, Hybrid.
  • Compute Service:

* Virtual Machines (VMs): AWS EC2, Azure VMs, GCP Compute Engine (requires provisioning, configuration management like Ansible/Chef/Puppet).

* Container Orchestration: Kubernetes (EKS, AKS, GKE), AWS ECS/Fargate, Azure Container Apps.

* Serverless: AWS Lambda, Azure Functions, GCP Cloud Functions.

* Platform as a Service (PaaS): AWS Elastic Beanstalk, Azure App Service, Heroku, DigitalOcean App Platform.

  • Container Registry: Docker Hub, Amazon ECR, Azure Container Registry (ACR), Google Container Registry (GCR), GitLab Container Registry.
  • Artifact Repository: Nexus, Artifactory, S3 buckets, Azure Blob Storage (for non-container artifacts).
  • Infrastructure as Code (IaC) Tools: Terraform, AWS CloudFormation, Azure Resource Manager (ARM) templates, Pulumi (if infrastructure changes are part of the pipeline).
  • Deployment Strategy: Blue/Green, Canary, Rolling Updates, All-at-once.
  • Environment Strategy: Dedicated environments for Development, Staging, Production, Feature branches, Pull Request previews.

1.4. Testing & Quality Assurance

Integrating various testing and quality gates is crucial for reliable deployments.

  • Testing Frameworks:

* Unit Tests: Jest, Pytest, JUnit, GoTest.

* Integration Tests: Postman, Cypress, Playwright.

* End-to-End (E2E) Tests: Selenium, Cypress, Playwright.

  • Code Quality & Linting: ESLint, Prettier, Black, Flake8, SonarQube, Bandit.
  • Code Coverage: Tools to measure and enforce coverage thresholds.
  • Security Testing: Static Application Security Testing (SAST - SonarQube, Snyk, GitHub CodeQL), Dynamic Application Security Testing (DAST), Dependency Scanning.

1.5. Security & Compliance Requirements

Security must be "shifted left" and integrated into the pipeline.

  • Secret Management: AWS Secrets Manager, Azure Key Vault, Google Secret Manager, HashiCorp Vault, Kubernetes Secrets, environment variables.
  • Vulnerability Scanning: Container image scanning (Trivy, Clair, Docker Scan), dependency scanning (Snyk, OWASP Dependency-Check).
  • Compliance: Adherence to industry standards (e.g., SOC2, HIPAA, GDPR, PCI DSS) often requires specific audit trails, access controls, and security checks within the pipeline.
  • Access Control: Granular permissions for CI/CD agents/runners.

1.6. Operational Considerations

Beyond deployment, ensuring the application's health and maintainability is key.

  • Monitoring & Logging: Integration with tools like Prometheus, Grafana, ELK Stack, Datadog, AWS CloudWatch, Azure Monitor, GCP Operations.
  • Alerting: PagerDuty, Slack, Email notifications for pipeline failures or critical events.
  • Rollback Strategy: Automated or manual procedures for reverting to a previous stable version.
  • Pipeline Performance: Requirements for build/deployment speed and resource usage.
  • Existing Tooling & Integrations: Version control (Git), artifact repositories, notification systems, project management tools (Jira, Trello).

2. Industry Trends & Data Insights

Understanding current trends ensures the generated pipeline is modern, efficient, and future-proof.

  • Cloud-Native & Containerization Dominance: Over 70% of organizations are adopting containers, with Kubernetes becoming the de facto standard for orchestration. This necessitates pipelines that efficiently build, scan, and deploy container images.
  • DevSecOps Shift Left: Security is no longer an afterthought. Integrating SAST, DAST, and dependency scanning directly into CI/CD pipelines is a top priority, with 60% of organizations reporting increased security automation.
  • GitOps Principles: Managing infrastructure and application deployments declaratively through Git repositories is gaining traction. This means pipelines should be capable of reacting to Git events and applying desired state changes.
  • Automation Everywhere: The push for fully automated pipelines, from code commit to production, continues to accelerate, reducing manual errors and increasing deployment frequency.
  • AI/ML in DevOps: Emerging trend for predictive analytics in pipeline health, anomaly detection, and optimizing resource allocation for builds.
  • YAML-based Configuration: GitHub Actions and GitLab CI, with their YAML-based configurations, have popularized declarative, version-controlled pipeline definitions, making them easier to manage and audit than traditional GUI-based configurations.

3. Key Recommendations

Based on the general scope of "DevOps Pipeline Generator," here are initial recommendations:

  • Prioritize Specific Requirements: The most critical step is to clearly define the specific application, environment, and operational requirements. Ambiguity here will lead to a less optimal pipeline.
  • Embrace Automation & Idempotency: Design every pipeline stage to be fully automated and idempotent (running it multiple times yields the same result) to ensure reliability and consistency.
  • Integrate Security Early: Implement security checks (linting, dependency scanning, SAST) as early as possible in the pipeline to catch issues before they escalate.
  • Leverage Cloud-Native Services (if applicable): If deploying to a specific cloud provider, utilize their native CI/CD integrations, secret management, and compute services for better performance, cost-efficiency, and integration.
  • Consider Team Skillset & Culture: The chosen CI/CD platform and tools should align with your team's existing skills and preferences to ensure maintainability and adoption. A complex, unfamiliar system will hinder productivity.
  • Start Simple and Iterate: Begin with a core pipeline (build, test, deploy to dev) and progressively add more advanced stages (security, performance, multi-environment deployments) as your needs evolve.

4. Next Steps & Required Information

To proceed with generating your tailored CI/CD pipeline, we require specific details about your project. Please provide the following information:

4.1. CI/CD Platform Preference

  • Which CI/CD platform do you prefer? (Choose one or specify "Undecided, recommend based on answers below")

* [ ] GitHub Actions

* [ ] GitLab CI

* [ ] Jenkins

* [ ] Other (Please specify): ______________________

4.2. Application Details

  • Primary Programming Language(s) & Framework(s): (e.g., Python/Django, Node.js/React, Java/Spring Boot)

________________________________________________

Key GitLab CI Concepts:

  • image: Specifies the Docker image to use for all jobs, or for individual jobs.
  • variables: Define global or job-specific variables. GitLab provides many predefined CI/CD variables (e.g., $CI_REGISTRY_IMAGE).
  • stages: Defines the order of execution for jobs. Jobs in the same stage run in parallel.
  • cache: Specifies files or directories to cache between pipeline runs to speed up execution (e.g., node_modules, .venv).
  • script: The main commands to execute for a job.
  • before_script / after_script: Commands to run before/after the main script.
  • rules: Powerful way to control when jobs run based on branches, tags, variables, or changes to files.
  • services: Used to run linked Docker containers (e.g., docker:dind for building Docker images).
  • artifacts: Specifies files or directories to attach to a job after it finishes. Can be downloaded or passed to subsequent jobs.
  • environment: Provides a way to define deployment environments, track deployments, and access environment-specific variables.
  • when: manual: Requires a manual trigger in the GitLab UI before the job executes.
  • extends / YAML anchors (&, *): Reusable configuration blocks to reduce duplication.
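As an illustration, a hidden job template reused via `extends` might look like this (`.deploy_template` and `deploy.sh` are placeholder names):

```yaml
# Hidden job (leading dot) used purely as a template
.deploy_template:
  image: alpine:3.20
  before_script:
    - apk add --no-cache curl
  script:
    - ./deploy.sh "$ENVIRONMENT"

deploy_staging:
  extends: .deploy_template
  variables:
    ENVIRONMENT: staging

deploy_production:
  extends: .deploy_template
  variables:
    ENVIRONMENT: production
  when: manual  # gate production behind a manual trigger
```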

5. Jenkins Pipeline Configuration (Example: Java Spring Boot Application)

Jenkins Pipelines are defined using a Jenkinsfile, typically written in Groovy and stored in the root of the project's source control repository.
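A minimal declarative Jenkinsfile sketch for a Maven-based Spring Boot application is shown below; the tool names (`jdk17`, `maven3`), the Checkstyle goal, and the deploy script are placeholders that depend on your Jenkins and project configuration:

```groovy
pipeline {
    agent any

    tools {
        jdk 'jdk17'      // names as configured under Manage Jenkins > Tools
        maven 'maven3'
    }

    stages {
        stage('Lint') {
            steps {
                // assumes maven-checkstyle-plugin is configured in pom.xml
                sh 'mvn -B checkstyle:check'
            }
        }
        stage('Test') {
            steps {
                sh 'mvn -B test'
            }
            post {
                always {
                    junit 'target/surefire-reports/*.xml'
                }
            }
        }
        stage('Build') {
            steps {
                sh 'mvn -B -DskipTests package'
                archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
            }
        }
        stage('Deploy') {
            when { branch 'main' }
            steps {
                input message: 'Deploy to production?'
                sh './scripts/deploy.sh'  // placeholder deployment script
            }
        }
    }

    post {
        failure {
            echo 'Pipeline failed.'
        }
    }
}
```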


DevOps Pipeline Generator: Comprehensive CI/CD Pipeline Configurations

This document provides detailed and professionally structured CI/CD pipeline configurations tailored for GitHub Actions, GitLab CI, and Jenkins. These pipelines are designed to ensure code quality, automate testing, streamline artifact generation, and facilitate robust deployments across various environments.

The generated configurations incorporate essential stages: Linting, Testing, Building, and Deployment, reflecting best practices for modern software delivery.


1. Introduction

Welcome! This deliverable provides you with ready-to-use CI/CD pipeline configurations, acting as a robust foundation for your software development lifecycle. By leveraging these templates, you can significantly accelerate your development velocity, improve code quality, and ensure consistent, reliable deployments.

We have generated configurations for the following popular CI/CD platforms:

  • GitHub Actions
  • GitLab CI
  • Jenkins Declarative Pipeline

Each configuration is designed to be easily adaptable to your specific project needs, technology stack, and deployment targets.

2. Core Pipeline Stages Explained

All generated pipelines adhere to a common structure, incorporating the following critical stages:

  • Linting:

* Purpose: To enforce coding standards, identify potential errors, and maintain code consistency.

* Actions: Runs static analysis tools (e.g., ESLint, Prettier, Pylint, Black, SonarQube) to check code style, syntax, and potential bugs.

* Outcome: Early detection of code quality issues, leading to cleaner and more maintainable codebases.

  • Testing:

* Purpose: To verify the correctness and functionality of the application code.

* Actions: Executes various types of tests, including:

* Unit Tests: Isolated testing of individual components or functions.

* Integration Tests: Verifying interactions between different parts of the system.

* End-to-End (E2E) Tests (optional): Simulating user scenarios to test the entire application flow.

* Outcome: Confidence in code changes, reduced regressions, and higher software reliability.

  • Building:

* Purpose: To compile source code, resolve dependencies, and package the application into deployable artifacts.

* Actions:

* Installs dependencies.

* Compiles code (if applicable, e.g., Java, C#).

* Creates deployable assets (e.g., Docker images, JAR files, npm packages, static website bundles).

* Pushes artifacts to a registry (e.g., Docker Hub, AWS ECR, Artifactory, npm registry).

* Outcome: Versioned, immutable artifacts ready for deployment.

  • Deployment:

* Purpose: To release the built application artifacts to designated environments.

* Actions:

* Staging Deployment: Deploys the application to a pre-production environment for further testing and validation.

* Production Deployment: Deploys the application to the live production environment, often triggered manually or after successful staging validation.

* Rollback capabilities or advanced deployment strategies (e.g., Canary, Blue/Green), where applicable.

* Outcome: Automated and reliable delivery of applications to users.

3. Detailed Pipeline Configurations

Below are the detailed configurations for each CI/CD platform. These examples use a generic Node.js application for illustration but are designed to be easily adaptable to other technology stacks (Python, Java, Go, etc.) by modifying the build and test commands.


3.1. GitHub Actions Configuration

File Location: .github/workflows/main.yml

This configuration defines a workflow that triggers on pushes to the main branch and pull requests. It includes linting, testing, building a Docker image, and deploying to a placeholder environment.


name: CI/CD Pipeline

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

env:
  NODE_VERSION: '18.x' # Specify Node.js version
  DOCKER_IMAGE_NAME: your-app-name
  DOCKER_REGISTRY: ghcr.io/${{ github.repository_owner }} # Example: GitHub Container Registry

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm' # Cache npm dependencies

      - name: Install dependencies
        run: npm ci

      - name: Run ESLint
        run: npm run lint # Assuming 'lint' script in package.json

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # This job depends on 'lint' job
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      - name: Install dependencies
        run: npm ci

      - name: Run Unit and Integration Tests
        run: npm test # Assuming 'test' script in package.json

  build:
    name: Build Docker Image
    runs-on: ubuntu-latest
    needs: test # This job depends on 'test' job
    if: github.event_name == 'push' # Only build Docker image on push to main
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Log in to Docker Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.DOCKER_REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }} # Use GITHUB_TOKEN for GHCR

      - name: Build and push Docker image
        id: docker_build
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: |
            ${{ env.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE_NAME }}:latest
            ${{ env.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE_NAME }}:${{ github.sha }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Echo Docker image digest
        run: echo "Docker Image Digest: ${{ steps.docker_build.outputs.digest }}"

  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build # This job depends on 'build' job
    environment: Staging # Define a staging environment for secrets/variables
    if: github.event_name == 'push' # Only deploy on push to main
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Configure AWS Credentials (Example for AWS ECR/ECS)
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1 # Replace with your AWS region

      - name: Deploy to ECS (Example)
        # Replace with your actual deployment command/action
        run: |
          echo "Deploying ${{ env.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE_NAME }}:${{ github.sha }} to Staging..."
          # Example: Update ECS service with new image
          # aws ecs update-service --cluster your-ecs-cluster-staging --service your-ecs-service-staging --force-new-deployment
          echo "Deployment to Staging successful!"

  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: deploy-staging # This job depends on successful staging deployment
    environment: Production # Define a production environment for secrets/variables
    if: github.event_name == 'push' # Only deploy on push to main
    # This deployment is typically gated by a manual approval: configure
    # required reviewers or a wait timer on the 'Production' environment
    # under the repository's Settings > Environments.
    # Example: environment: { name: Production, url: 'https://your-prod-app.com' }
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Configure AWS Credentials (Example for AWS ECR/ECS)
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1 # Replace with your AWS region

      - name: Deploy to Production (Example)
        # Replace with your actual deployment command/action
        run: |
          echo "Deploying ${{ env.DOCKER_REGISTRY }}/${{ env.DOCKER_IMAGE_NAME }}:${{ github.sha }} to Production..."
          # Example: Update ECS service with new image
          # aws ecs update-service --cluster your-ecs-cluster-prod --service your-ecs-service-prod --force-new-deployment
          echo "Deployment to Production successful!"
## Setup ```bash npm install ``` ## Run ```bash npm run dev ``` "); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:" "+title+" "+code+" "; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */ *{margin:0;padding:0;box-sizing:border-box} body{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e} "); zip.file(folder+"script.js","/* "+title+" — scripts */ "); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Open Double-click `index.html` in your browser. Or serve locally: ```bash npx serve . # or python3 -m http.server 3000 ``` "); zip.file(folder+".gitignore",".DS_Store node_modules/ .env "); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ 
buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/**(.+?)**/g,"$1"); hc=hc.replace(/ {2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. Files: - "+app+".md (Markdown) - "+app+".html (styled HTML) "); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); }function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}