
DevOps Pipeline Generator: Comprehensive CI/CD Pipeline Configurations

This document provides detailed and professional Continuous Integration/Continuous Delivery (CI/CD) pipeline configurations, designed to streamline your software development lifecycle. We have generated configurations that integrate essential stages such as linting, testing, building, and deployment, adhering to modern DevOps best practices.

1. Introduction

The goal of this deliverable is to provide ready-to-use CI/CD pipeline configurations tailored for your project. We understand the critical need for automation, reliability, and speed in modern software delivery. This output includes a detailed, actionable example for GitHub Actions, alongside conceptual frameworks and key considerations for GitLab CI and Jenkins, ensuring you have a robust foundation for your chosen platform.

2. Core Pipeline Stages & Best Practices

Regardless of the CI/CD platform, a well-structured pipeline typically includes the following core stages to ensure code quality, functionality, and efficient delivery:

  • Linting: static analysis to enforce code style and catch defects before merge.
  • Testing: automated unit and integration tests that validate behavior.
  • Building: compiling and packaging the application, typically into a versioned Docker image.
  • Deployment: promoting the packaged build through staging and, after validation, to production.

General Best Practices Applied:

  • Reproducible, clean dependency installs (npm ci) with dependency caching to keep runs fast.
  • Explicit job dependencies (lint → test → build → deploy) so failures stop the pipeline early.
  • Credentials stored as platform secrets, never hardcoded in the repository.
  • Separate staging and production environments, with production gated behind branch conditions and protection rules.

3. Generated Pipeline Configurations: GitHub Actions (Detailed Example)

Below is a comprehensive GitHub Actions workflow (.github/workflows/main.yml) example for a typical Node.js web application that uses Docker for containerization and aims to deploy to a cloud service (e.g., AWS ECR/ECS). This configuration can be adapted for Python, Java, Go, or other languages with minor modifications to build and test steps.

main.yml Example: Node.js Web App with Docker and Cloud Deployment

# .github/workflows/main.yml
name: CI/CD Pipeline

on:
  push:
    branches:
      - main
      - develop
    tags:
      - 'v*.*.*' # Trigger on version tags like v1.0.0
  pull_request:
    branches:
      - main
      - develop
  workflow_dispatch: # Allows manual triggering of the workflow

env:
  # Define common environment variables for the workflow
  AWS_REGION: us-east-1
  ECR_REPOSITORY: my-node-app
  DOCKER_IMAGE_NAME: my-node-app

jobs:
  # --- 1. Linting Stage ---
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm' # Cache npm dependencies

      - name: Install Dependencies
        run: npm ci # Use npm ci for clean installs in CI environments

      - name: Run ESLint
        run: npm run lint # Assuming you have a 'lint' script in package.json

  # --- 2. Testing Stage ---
  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # This job depends on the 'lint' job completing successfully
    steps:
      - name: Checkout Code
        uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '18'
          cache: 'npm'

      - name: Install Dependencies
        run: npm ci

      - name: Run Unit and Integration Tests
        run: npm test # Assuming 'npm test' runs your test suite (e.g., Jest)

      # Optional: Upload test results for reporting
      - name: Upload Test Results
        if: always() # Run even if tests fail
        uses: actions/upload-artifact@v4
        with:
          name: test-results
          path: junit.xml # Or whatever your test reporter outputs

  # --- 3. Building Stage (Application & Docker Image) ---
  build:
    name: Build Application & Docker Image
    runs-on: ubuntu-latest
    needs: test # This job depends on the 'test' job completing successfully
    outputs:
      image_tag: ${{ steps.set-image-tag.outputs.image_tag }} # Output the generated image tag
    steps:
      - name: Checkout Code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v2

      - name: Set Docker Image Tag
        id: set-image-tag
        run: |
          IMAGE_TAG="latest"
          if [[ "${{ github.ref }}" == "refs/heads/main" ]]; then
            IMAGE_TAG="${{ github.sha }}"
          elif [[ "${{ github.ref }}" == refs/tags/v* ]]; then # glob must be unquoted to pattern-match
            IMAGE_TAG="${{ github.ref_name }}"
          fi
          echo "IMAGE_TAG=$IMAGE_TAG" >> $GITHUB_ENV
          echo "image_tag=$IMAGE_TAG" >> $GITHUB_OUTPUT

      - name: Build and Push Docker Image
        uses: docker/build-push-action@v5
        with:
          context: .
          push: true
          tags: ${{ steps.login-ecr.outputs.registry }}/${{ env.ECR_REPOSITORY }}:${{ env.IMAGE_TAG }}
          cache-from: type=gha,scope=${{ github.workflow }}
          cache-to: type=gha,scope=${{ github.workflow }},mode=max

  # --- 4. Deployment Stage (Staging Environment) ---
  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build # This job depends on the 'build' job completing successfully
    environment:
      name: Staging
      url: https://staging.example.com # Optional: URL to deployed application
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Deploy to AWS ECS (Staging)
        run: |
          # Example: roll the ECS service onto the new image.
          # Note: `aws ecs update-service` cannot override a container image in place;
          # promoting a new image tag requires registering a new task definition
          # revision first (e.g., via aws-actions/amazon-ecs-deploy-task-definition),
          # then updating the service to that revision.
          echo "Deploying image $IMAGE_TAG to Staging ECS service..."
          aws ecs update-service \
            --cluster my-ecs-cluster-staging \
            --service my-app-staging-service \
            --force-new-deployment
          echo "Staging deployment initiated."
        env:
          # Pass the image tag from the build job
          IMAGE_TAG: ${{ needs.build.outputs.image_tag }}

  # --- 5. Deployment Stage (Production Environment) ---
  deploy-production:
    name: Deploy to Production
    runs-on: ubuntu-latest
    needs: deploy-staging # Production deployment depends on successful staging deployment
    environment:
      name: Production
      url: https://www.example.com
    if: github.ref == 'refs/heads/main' && github.event_name == 'push' # Only deploy main branch pushes to production
    # Note: manual approval for production deployments is configured via
    # environment protection rules (required reviewers) in the repository
    # settings, not in the workflow file itself.
    steps:
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Deploy to AWS ECS (Production)
        run: |
          # As in staging: register a new task definition revision with the new
          # image tag before (or instead of) forcing a new deployment.
          echo "Deploying image $IMAGE_TAG to Production ECS service..."
          aws ecs update-service \
            --cluster my-ecs-cluster-production \
            --service my-app-production-service \
            --force-new-deployment
          echo "Production deployment initiated."
        env:
          IMAGE_TAG: ${{ needs.build.outputs.image_tag }}


DevOps Pipeline Infrastructure Needs Analysis

Introduction

This document provides a comprehensive analysis of the foundational infrastructure requirements necessary for establishing a robust, scalable, and secure Continuous Integration/Continuous Delivery (CI/CD) pipeline. A well-designed CI/CD infrastructure is critical for accelerating software delivery, improving code quality, and ensuring operational stability. This analysis serves as the initial step in generating a tailored pipeline configuration, laying the groundwork by identifying the key components and considerations for a modern DevOps workflow.

Key Infrastructure Pillars for CI/CD

Building an effective CI/CD pipeline requires careful consideration of several interconnected infrastructure components. These pillars ensure that code can be efficiently built, tested, deployed, and monitored across various environments.

1. Version Control System (VCS)

The VCS is the bedrock of any CI/CD pipeline, acting as the single source of truth for all code and configuration.

  • Purpose: Stores source code, tracks changes, facilitates collaboration, and triggers pipeline execution upon code commits.
  • Common Choices:
    * GitHub: Widely adopted, strong ecosystem, extensive integrations, ideal for open-source and private projects.
    * GitLab: Comprehensive platform offering VCS, CI/CD, container registry, and more, all integrated.
    * Bitbucket: Popular among Atlassian suite users, good for private repositories.

  • Infrastructure Needs: Typically cloud-hosted and managed by the provider, requiring only network access for CI/CD tools. For self-hosted solutions (e.g., GitLab Self-Managed), server resources, backup, and maintenance are required.
  • Recommendation: Align your CI/CD platform choice with your VCS for tighter integration and simplified workflows (e.g., GitHub Actions with GitHub, GitLab CI with GitLab).

2. CI/CD Platform (Orchestrator)

This is the engine that orchestrates the entire pipeline, defining and executing stages like testing, building, and deployment.

  • Purpose: Automates the software delivery process, from code commit to production deployment.
  • Common Choices & Infrastructure Considerations:
    * GitHub Actions:
      - Infrastructure: Fully managed by GitHub. Utilizes GitHub-hosted runners (Ubuntu, Windows, macOS) or self-hosted runners.
      - Pros: Deep integration with GitHub, YAML-based configuration, vast marketplace of actions.
      - Cons: Less flexible for complex on-premise integrations compared to Jenkins.
    * GitLab CI:
      - Infrastructure: Integrated directly into GitLab. Uses GitLab-managed shared runners or self-hosted GitLab Runners. Can be cloud-hosted or self-managed.
      - Pros: Single platform for VCS, CI/CD, registry, and more; powerful YAML syntax.
      - Cons: Self-managed instances require significant operational overhead.

    * Jenkins:
      - Infrastructure: Self-hosted (on-premise or cloud VMs/containers). Requires dedicated server(s) for the Jenkins controller and agents/executors.
      - Pros: Highly extensible (thousands of plugins), extremely flexible, suitable for complex, legacy, or hybrid environments.
      - Cons: Higher maintenance burden, requires dedicated infrastructure management, XML/Groovy-based configurations can be less intuitive for new users.

  • Recommendation: For cloud-native projects, GitHub Actions or GitLab CI offer excellent out-of-the-box experiences with minimal infrastructure management. For highly customized, on-premise, or legacy environments, Jenkins provides unparalleled flexibility but demands more operational effort.
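
The detailed example in this document targets GitHub Actions; as conceptual counterparts, minimal sketches for GitLab CI and Jenkins follow. Stage names, image tags, and deploy commands are illustrative assumptions, not drawn from a real project.

```yaml
# .gitlab-ci.yml -- conceptual sketch of the same lint -> test -> build -> deploy flow
stages: [lint, test, build, deploy]

default:
  image: node:18

lint:
  stage: lint
  script:
    - npm ci
    - npm run lint

test:
  stage: test
  script:
    - npm ci
    - npm test

build:
  stage: build
  image: docker:27
  services: [docker:27-dind]
  script:
    # CI_REGISTRY_* are GitLab's predefined CI/CD variables
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

deploy-staging:
  stage: deploy
  environment: staging
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
  script:
    - echo "Deploy $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA to staging"
```

A declarative Jenkinsfile covering the same stages might look like this (agent labels and shell commands are likewise assumptions):

```groovy
// Jenkinsfile -- declarative pipeline sketch of the same four stages
pipeline {
  agent any
  environment {
    IMAGE = "my-node-app:${env.GIT_COMMIT}"
  }
  stages {
    stage('Lint')  { steps { sh 'npm ci && npm run lint' } }
    stage('Test')  { steps { sh 'npm ci && npm test' } }
    stage('Build') { steps { sh 'docker build -t "$IMAGE" .' } }
    stage('Deploy Staging') {
      when { branch 'main' }
      steps { sh 'echo "Deploy $IMAGE to staging"' }
    }
  }
  post {
    failure { echo 'Pipeline failed; check stage logs' }
  }
}
```

Both sketches mirror the job ordering of the GitHub Actions example: GitLab CI sequences jobs via stages, while Jenkins declarative pipelines run stages sequentially by default.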

3. Build & Test Environments (Runners/Agents)

These are the compute resources where pipeline jobs (compiling, testing, linting) are executed.

  • Purpose: Provide isolated and consistent environments for executing pipeline stages.
  • Types:
    * Cloud-Hosted/Managed Runners: Provided by the CI/CD platform (e.g., GitHub-hosted runners, GitLab Shared Runners).
      - Infrastructure Needs: None from the user's perspective; managed by the provider.
      - Pros: Zero maintenance, on-demand scalability, pay-per-use.
      - Cons: Limited customization, potential cold start delays, shared resources.
    * Self-Hosted Runners/Agents: Machines (VMs, containers, physical servers) provisioned and managed by the user.
      - Infrastructure Needs: Dedicated compute resources (CPU, RAM, storage), operating system (Linux, Windows, macOS), network connectivity, security hardening, monitoring.
      - Pros: Full control over environment, custom hardware/software, access to internal networks, potentially lower cost for high usage.
      - Cons: Significant operational overhead, requires scaling management, security responsibility.

  • Recommendation: Start with managed runners for simplicity and speed. Transition to self-hosted runners if specific requirements arise (e.g., specialized hardware, strict network isolation, cost optimization for high volume).
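
If specific requirements later justify self-hosted runners, a GitHub Actions job opts into them by label; a minimal sketch (the labels shown are illustrative assumptions):

```yaml
jobs:
  build:
    # Route this job to a registered self-hosted runner matching every label
    runs-on: [self-hosted, linux, x64]
    steps:
      - uses: actions/checkout@v4
      - run: echo "Running on self-hosted hardware"
```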

4. Artifact & Image Management

Storage for the outputs of your build process, ensuring traceability and reusability.

  • Purpose: Store compiled binaries, packages, Docker images, test reports, and other build artifacts.
  • Common Tools & Infrastructure:
    * Container Registries: For Docker/OCI images (e.g., Docker Hub, AWS ECR, Google Container Registry, Azure Container Registry, GitLab Container Registry).
      - Infrastructure Needs: Typically cloud-managed services, requiring network access and IAM policies. Self-hosted options (e.g., Harbor) require dedicated server resources.
    * Package Repositories: For language-specific packages (e.g., Maven Central, npm registry, NuGet, PyPI). Private instances like JFrog Artifactory or Sonatype Nexus support multiple formats.
      - Infrastructure Needs: Cloud-managed services or self-hosted servers with database and storage.
    * Object Storage: For generic build artifacts, logs, and reports (e.g., AWS S3, Google Cloud Storage, Azure Blob Storage).
      - Infrastructure Needs: Cloud-managed, requiring IAM and network access.

  • Recommendation: Utilize dedicated artifact repositories for proper versioning, security scanning, and efficient distribution. Cloud-native registries are often the easiest to integrate and scale.
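
As a minimal illustration of the object-storage option, a workflow step could sync build output to a commit-scoped prefix (the bucket name and dist/ path are placeholder assumptions):

```yaml
      - name: Upload build artifacts to S3
        run: |
          # Copy the build output under a commit-scoped prefix for traceability
          aws s3 cp ./dist "s3://my-artifact-bucket/builds/${GITHUB_SHA}/" --recursive
```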

5. Deployment Targets & Environments

The infrastructure where your application will ultimately run.

  • Purpose: Provide distinct environments (Development, Staging, Production) for testing and operating the application.
  • Common Targets & Infrastructure:
    * Virtual Machines (VMs): AWS EC2, Azure VMs, Google Compute Engine.
      - Infrastructure Needs: Provisioning and management of VMs, network configuration (VPCs, subnets, security groups), OS patching, monitoring agents.
    * Container Orchestration (Kubernetes): AWS EKS, Azure AKS, Google GKE, OpenShift.
      - Infrastructure Needs: Kubernetes cluster setup and management, node pools, networking (CNI), storage (CSI), ingress controllers, service meshes.
    * Serverless Platforms: AWS Lambda, Azure Functions, Google Cloud Functions.
      - Infrastructure Needs: Managed by the cloud provider, requiring function code, configuration, IAM roles, and API Gateway/event source setup.
    * Platform-as-a-Service (PaaS): Heroku, AWS Elastic Beanstalk, Azure App Service, Google App Engine.
      - Infrastructure Needs: Managed by the cloud provider, requiring application code and configuration.

  • Recommendation: Choose deployment targets based on application architecture, scalability requirements, operational expertise, and cost considerations. Define a clear strategy for environment provisioning (e.g., Infrastructure as Code).
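
For a Kubernetes target, the ECS deployment step in the example above could be swapped for a kubectl rollout; a hedged sketch (the deployment, container, and registry variable names are assumptions):

```yaml
      - name: Deploy to Kubernetes (Staging)
        run: |
          # Point the deployment at the newly built image, then wait for the rollout
          kubectl set image deployment/my-node-app \
            my-node-app="$ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG"
          kubectl rollout status deployment/my-node-app --timeout=120s
```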

6. Secrets Management

Securely storing and accessing sensitive information during the pipeline execution and application runtime.

  • Purpose: Protect credentials (API keys, database passwords, cloud tokens), certificates, and other sensitive data.
  • Common Tools & Infrastructure:

* CI/CD Platform Native Secrets: GitHub Secrets, GitLab CI/CD Variables, Jenkins Credentials.

* Infrastructure Needs: Built-in to the platform, typically encrypted at rest and in transit.

* Cloud-Native Secret Managers: AWS Secrets Manager, Azure Key Vault, Google Secret Manager.

* Infrastructure Needs: Managed services, requiring IAM policies for access control.

* Dedicated Secret Management Tools: HashiCorp Vault.

* Infrastructure Needs: Dedicated server(s) for Vault server, storage backend (e.g., Consul, S3), robust security configuration, and client libraries/integrations.

  • Recommendation: Integrate a robust secrets management solution from the outset. Cloud-native options offer good integration with cloud environments, while dedicated tools like Vault provide advanced features for complex hybrid scenarios. Avoid hardcoding secrets.
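
In the GitHub Actions example above, platform-native secrets are read from the secrets context; pulling a value from a cloud secret manager adds one call at runtime. The secret names below are placeholders:

```yaml
      - name: Use platform and cloud secrets
        env:
          API_KEY: ${{ secrets.API_KEY }}  # GitHub-native encrypted secret
        run: |
          # Fetch a runtime secret from AWS Secrets Manager (secret id is illustrative)
          DB_PASSWORD=$(aws secretsmanager get-secret-value \
            --secret-id staging/db-password \
            --query SecretString --output text)
          echo "Secrets loaded (never echo their values)"
```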

7. Monitoring, Logging & Alerting

Observability tools for tracking pipeline health and deployed application performance.

  • Purpose: Gain insights into pipeline execution, identify bottlenecks, detect application issues, and trigger alerts.
  • Common Tools & Infrastructure:
    * Logging: Centralized log aggregation (e.g., the ELK Stack of Elasticsearch, Logstash, and Kibana; Grafana Loki; cloud-native services like AWS CloudWatch Logs, Azure Monitor Logs, Google Cloud Logging).
      - Infrastructure Needs: Dedicated servers for log collectors/processors, storage for logs, visualization dashboards. Managed services simplify this.
    * Monitoring: Metrics collection and visualization (e.g., Prometheus/Grafana, or cloud-native services like AWS CloudWatch, Azure Monitor, Google Cloud Monitoring).
      - Infrastructure Needs: Dedicated servers for metric collectors, time-series databases, dashboarding tools.
    * Alerting: Integration with communication platforms (e.g., PagerDuty, Slack, email) to notify teams of critical events.

  • Recommendation: Implement comprehensive monitoring and logging for both the CI/CD pipeline itself and the deployed applications. This is crucial for rapid issue detection and resolution, contributing to overall system reliability.
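
Alerting can also cover the pipeline itself; for example, a final step can post to a Slack incoming webhook when an earlier step fails (the webhook secret name is an assumption):

```yaml
      - name: Notify Slack on failure
        if: failure()  # runs only when a previous step in the job failed
        run: |
          curl -s -X POST -H 'Content-type: application/json' \
            --data '{"text":"CI/CD pipeline failed on ${{ github.ref_name }}"}' \
            "${{ secrets.SLACK_WEBHOOK_URL }}"
```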

8. Security & Compliance

Integrating security practices and tools throughout the pipeline.

  • Purpose: Identify and mitigate security vulnerabilities, ensure compliance with regulatory standards.
  • Aspects & Infrastructure:
    * Static Application Security Testing (SAST): Tools integrated into the build stage (e.g., SonarQube, Snyk, Checkmarx).
      - Infrastructure Needs: Dedicated server for the SAST tool, integration with CI/CD.
    * Dynamic Application Security Testing (DAST): Tools for testing running applications (e.g., OWASP ZAP, Burp Suite).
      - Infrastructure Needs: A test environment where the DAST tool can scan the application.
    * Dependency Scanning: Identify vulnerabilities in third-party libraries (e.g., Snyk, Trivy, OWASP Dependency-Check).
      - Infrastructure Needs: Can run directly on CI/CD runners or integrate with artifact repositories.
    * Container Image Scanning: Identify vulnerabilities in Docker images (e.g., Trivy, Clair, built-in cloud registry scanners).
      - Infrastructure Needs: Integrated with the container registry or CI/CD pipeline.
    * Network Security: Firewalls, VPCs, network segmentation, access control lists (ACLs).
      - Infrastructure Needs: Configured at the cloud provider or on-premise network level.

  • Recommendation: Adopt a "Shift-Left" security approach, embedding security checks early and throughout the CI/CD pipeline.
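
Shifting image scanning left can be a single workflow step; a sketch using the community Trivy action (treat the action reference and input names as assumptions to verify against its documentation):

```yaml
      - name: Scan Docker image with Trivy
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: my-node-app:latest
          severity: CRITICAL,HIGH
          exit-code: '1'  # fail the job if vulnerabilities are found
```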

Data Insights & Trends

The landscape of CI/CD infrastructure is constantly evolving, driven by cloud adoption, automation, and security concerns.

  • Cloud-Native Dominance: The move towards managed cloud services for CI/CD platforms, artifact storage, and deployment targets continues to accelerate. This reduces operational overhead and improves scalability.
  • Containerization & Kubernetes: Docker and Kubernetes are de facto standards for application packaging and orchestration, driving demand for CI/CD pipelines that natively support container builds, scanning, and deployments to Kubernetes clusters.

DevOps Pipeline Generator: Generated CI/CD Configurations

This deliverable provides comprehensive CI/CD pipeline configurations for three popular platforms: GitHub Actions, GitLab CI, and Jenkins. Each configuration includes essential stages: linting, testing, building, and deployment, designed to ensure code quality, reliability, and efficient delivery.

The examples are based on a generic Node.js application, containerized with Docker, but are structured to be easily adaptable to other languages, frameworks, and deployment targets.


1. GitHub Actions Configuration

Overview

GitHub Actions is an event-driven automation platform directly integrated into GitHub. This configuration defines a workflow for a Node.js application, triggered on pushes to the main branch and pull requests. It performs linting, testing, Docker image building, pushing the image to a container registry, and deployment.

Explanation of GitHub Actions Configuration

  • name: The name of your workflow, displayed in the GitHub Actions tab.
  • on: Defines the events that trigger the workflow.
    * push: Triggers on pushes to the main and develop branches, and on any tag starting with v.
    * pull_request: Triggers on pull requests targeting main or develop.
    * workflow_dispatch: Allows manual triggering from the GitHub UI.
  • env: Defines environment variables accessible to all jobs in the workflow.
  • jobs: A collection of named jobs that run in parallel by default, unless dependencies are specified.
    * lint:
      - runs-on: Specifies the runner environment (e.g., ubuntu-latest).
      - steps: A sequence of tasks to be executed.
      - uses: Reuses existing actions (e.g., actions/checkout@v4 for checking out code, actions/setup-node@v4 for setting up Node.js).
      - with: Provides input parameters to an action.
      - run: Executes a shell command.
      - cache: Configures dependency caching to speed up builds.
    * test:
      - needs: lint: Ensures this job only runs after lint successfully completes.
      - if: always(): An example of a conditional step, ensuring artifact upload even if tests fail.
      - actions/upload-artifact@v4: Used to save generated files (like test reports) as workflow artifacts.
    * build:
      - needs: test: Runs only after the test job succeeds.
      - outputs: Exposes the generated image_tag so downstream deploy jobs can reference it via needs.build.outputs.image_tag.
      - docker/build-push-action@v5: Builds the Docker image and pushes it to the registry, reusing the GitHub Actions cache for Docker layers.

if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/\*\*(.+?)\*\*/g,"$1"); hc=hc.replace(/\n{2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\nFiles:\n- "+app+".md (Markdown)\n- "+app+".html (styled HTML)\n"); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); } function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}