DevOps Pipeline Generator
Run ID: 69ccb2863e7fb09ff16a4470 (2026-04-01)
PantheraHive BOS

This document provides comprehensive CI/CD pipeline configurations for GitHub Actions, GitLab CI, and Jenkins, encompassing linting, testing, building, and deployment stages. This deliverable is designed to be directly actionable, offering detailed explanations and validation steps to ensure smooth integration into your development workflow.


## 1. Introduction to CI/CD Pipeline Configurations

This deliverable provides ready-to-use CI/CD pipeline configurations tailored for three popular platforms: GitHub Actions, GitLab CI, and Jenkins. Each configuration is designed to automate the software delivery process, ensuring code quality, reliability, and efficient deployment. The pipelines include four essential stages: linting, testing, building, and deployment.

For demonstration purposes, the examples provided assume a Node.js application, but the structure and principles are highly adaptable to other technology stacks (e.g., Python, Java, Go, .NET). Placeholders for specific credentials and deployment targets are clearly marked.

## 2. Key Considerations & Assumptions

Before implementing these pipelines, review the assumptions called out in each configuration and replace the clearly marked placeholders (credentials, regions, bucket names, and deployment targets) with values for your environment.

## 3. CI/CD Pipeline Configurations

### 3.1. GitHub Actions

GitHub Actions provides a powerful and flexible CI/CD solution directly within your GitHub repository. Workflows are defined in YAML files (`.github/workflows/*.yml`).

#### 3.1.1. Overview

This GitHub Actions workflow will:

  1. Trigger on pushes to the main branch and on pull requests.
  2. Set up Node.js.
  3. Install dependencies.
  4. Run linting checks.
  5. Execute unit and integration tests.
  6. Build the application (e.g., static assets, Docker image).
  7. Deploy to a specified environment (e.g., AWS S3).

#### 3.1.2. Configuration (`.github/workflows/main.yml`)

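The embedded workflow file did not survive this export as text; the following is a minimal sketch reconstructed from the stage breakdown in 3.1.3, assuming a Node.js static-site build deployed to AWS S3 with CloudFront. The bucket name, distribution ID, region, and npm script names are placeholders, not verified values.

```yaml
name: CI/CD Pipeline

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  build_and_test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - run: npm ci          # clean install from package-lock.json
      - run: npm run lint    # e.g. ESLint
      - run: npm run test    # e.g. Jest, Mocha
      - run: npm run build
      - uses: actions/upload-artifact@v4
        with:
          name: build-output
          path: build/

  deploy:
    runs-on: ubuntu-latest
    needs: build_and_test            # only runs if build_and_test passed
    if: github.ref == 'refs/heads/main'
    environment: production          # enables protection rules / approvals
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: build-output
          path: build/
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1      # placeholder region
      # Placeholder bucket and distribution ID:
      - run: aws s3 sync build/ s3://YOUR_BUCKET_NAME --delete
      - run: aws cloudfront create-invalidation --distribution-id YOUR_DISTRIBUTION_ID --paths "/*"
```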
#### 3.1.3. Stage Breakdown

*   **Setup**: Uses `actions/checkout@v4` to get the code and `actions/setup-node@v4` to set up the Node.js environment.
*   **Install Dependencies**: `npm ci` ensures a clean installation of dependencies based on `package-lock.json`.
*   **Linting**: `npm run lint` executes your configured linting script (e.g., ESLint).
*   **Testing**: `npm run test` runs your test suite (e.g., Jest, Mocha).
*   **Building**: `npm run build` compiles or bundles your application for production. An `upload-artifact` step is included to pass build artifacts between jobs.
*   **Deployment**:
    *   `needs: build_and_test` ensures this job only runs if the previous job passed.
    *   `environment: production` links to a GitHub environment, allowing for protection rules (e.g., manual approvals).
    *   **AWS S3 Example**: Uses `aws-actions/configure-aws-credentials@v4` for authentication and `aws s3 sync` to upload static assets. CloudFront invalidation is included.
    *   **Docker/ECS Example (commented out)**: Demonstrates building and pushing a Docker image to ECR, then deploying to Amazon ECS.

#### 3.1.4. Validation Steps

1.  **Repository Setup**: Ensure your project is hosted on GitHub.
2.  **Add Workflow File**: Create the `.github/workflows/main.yml` file in your repository with the provided content.
3.  **Define Secrets**:
    *   Go to your GitHub repository -> `Settings` -> `Secrets and variables` -> `Actions` -> `New repository secret`.
    *   Add `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` (or your chosen cloud provider credentials).
    *   If using GitHub Environments, configure secrets and protection rules under `Settings` -> `Environments`.
4.  **Push to `main`**: Make a commit and push it to the `main` branch.
5.  **Monitor Workflow**: Go to your repository -> `Actions` tab. You should see the "CI/CD Pipeline" workflow running.
6.  **Verify Stages**: Check that each job (`Build and Test`, `Deploy to Production`) and its steps execute successfully.
7.  **Verify Deployment**: Confirm that your application is updated in the target environment.

### 3.2. GitLab CI

GitLab CI/CD is tightly integrated with GitLab repositories, using a `.gitlab-ci.yml` file to define pipelines.

#### 3.2.1. Overview
This GitLab CI pipeline will:
1.  Trigger on pushes to the `main` branch.
2.  Define stages: `build`, `test`, `deploy`.
3.  Set up Node.js.
4.  Install dependencies, run linting, and execute tests within the `test` stage.
5.  Build the application or Docker image within the `build` stage.
6.  Deploy to a specified environment within the `deploy` stage.

#### 3.2.2. Configuration (`.gitlab-ci.yml`)
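As a minimal skeleton of the stage layout described in the overview above (job names and npm scripts are illustrative assumptions):

```yaml
image: node:18-alpine

stages:
  - build
  - test
  - deploy

before_script:
  - npm ci

build_job:
  stage: build
  script: npm run build
  artifacts:
    paths: [build/]

lint_job:
  stage: test
  script: npm run lint

test_job:
  stage: test
  script: npm run test

deploy_job:
  stage: deploy
  script: echo "Replace with your deployment command"
  only: [main]
```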


Step 1 of 3: Infrastructure Needs Analysis for DevOps Pipeline Generation

Workflow: DevOps Pipeline Generator

Step: gemini → analyze_infrastructure_needs

Executive Summary

This document presents a comprehensive analysis of the foundational infrastructure requirements for establishing a robust and efficient CI/CD pipeline. As the initial phase of generating your DevOps pipeline, this analysis identifies critical components, general considerations, and industry best practices. It aims to provide a clear understanding of the architectural decisions necessary to support continuous integration, delivery, and deployment, ensuring scalability, security, and maintainability. While this step provides a general framework, subsequent steps will refine these needs based on your specific environment and preferences.

1. Introduction: The Foundation of a Successful CI/CD Pipeline

A well-designed CI/CD pipeline is only as effective as the underlying infrastructure that supports it. This analysis focuses on identifying the key infrastructure components and considerations essential for building, testing, packaging, and deploying software consistently and reliably. Understanding these needs upfront helps in selecting the right tools, optimizing costs, and ensuring the long-term success of your DevOps initiatives.

2. Key Infrastructure Components Analyzed

To generate an effective CI/CD pipeline, several core infrastructure components must be considered. Each plays a vital role in the end-to-end automation process.

2.1. Source Code Management (SCM) System

The SCM system is the origin point for all CI/CD activities, triggering builds upon code changes.

  • Requirements:
    * Reliable version control (Git-based preferred).
    * Integration with chosen CI/CD platform (webhooks, APIs).
    * Branching and merging capabilities.
    * Access control and security features.
    * Code review functionality.

  • Common Choices: GitHub, GitLab, Bitbucket, Azure Repos.
  • Impact on CI/CD: Directly influences pipeline triggers, code checkout processes, and integration capabilities. Cloud-hosted SCMs often come with integrated CI/CD solutions (e.g., GitHub Actions, GitLab CI).
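As an illustration of the SCM-to-pipeline integration point, a Git-based SCM typically fires the pipeline via push and merge-request events. In GitHub Actions, for example, this trigger wiring is declared directly in the workflow file (branch names are assumptions):

```yaml
# Trigger configuration (GitHub Actions syntax): the SCM fires the
# pipeline on pushes to main and on pull requests targeting it.
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
```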

2.2. CI/CD Orchestration Platform

This is the central brain of the pipeline, defining and executing the workflow stages.

  • Requirements:
    * Pipeline definition language (YAML, Groovy DSL, UI).
    * Stage and step orchestration.
    * Integration with SCM, build tools, testing frameworks, and deployment targets.
    * Scalable agent/runner management.
    * Reporting and monitoring capabilities.
    * Security for credentials and access.

  • Common Choices: GitHub Actions, GitLab CI, Jenkins, Azure DevOps Pipelines, CircleCI, Travis CI.
  • Impact on CI/CD: The choice heavily dictates the pipeline syntax, available integrations, self-hosting vs. managed service options, and overall operational overhead.

2.3. Build/Test Execution Environment (Runners/Agents)

These are the compute resources where actual build, test, linting, and packaging tasks are executed.

  • Requirements:
    * Sufficient CPU, memory, and disk space for builds.
    * Pre-installed tools (compilers, package managers, language runtimes, Docker).
    * Scalability to handle concurrent builds.
    * Isolation between builds (often achieved via containers or ephemeral VMs).
    * Network access to SCM, artifact repositories, and deployment targets.
  • Common Choices:
    * Managed Runners: Provided by cloud CI/CD platforms (e.g., GitHub-hosted runners, GitLab Shared Runners).
    * Self-hosted Runners/Agents: VMs (AWS EC2, Azure VMs, GCP Compute Engine) or Kubernetes clusters (EKS, AKS, GKE) running agents.
    * Containerization: Using Docker containers for isolated and reproducible build environments.

  • Impact on CI/CD: Determines build speed, cost, security posture, and customizability of the build environment. Self-hosted options offer more control but require more maintenance.
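To make the managed vs. self-hosted distinction concrete: in GitHub Actions, switching a job to a self-hosted runner is a one-line change, with the runner selected by the labels it registered with (the `linux` and `x64` labels below are the defaults a Linux runner registers):

```yaml
jobs:
  build:
    # Managed runner:      runs-on: ubuntu-latest
    # Self-hosted runner, selected by its registered labels:
    runs-on: [self-hosted, linux, x64]
```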

2.4. Artifact & Container Image Storage

Repositories for storing build outputs (JARs, WARs, NuGet packages, Docker images, etc.).

  • Requirements:
    * High availability and durability.
    * Scalable storage capacity.
    * Access control and versioning.
    * Integration with CI/CD platform and deployment tools.
    * Vulnerability scanning for container images.
  • Common Choices:
    * Container Registries: Docker Hub, Amazon ECR, Azure Container Registry, Google Container Registry, GitLab Container Registry.
    * Artifact Repositories: JFrog Artifactory, Sonatype Nexus, AWS CodeArtifact.

  • Impact on CI/CD: Essential for managing build outputs, ensuring traceability, and providing a stable source for deployments. Centralized artifact management prevents "dependency hell."

2.5. Deployment Targets

The environments where the application will be deployed (development, staging, production).

  • Requirements:
    * Appropriate compute resources (VMs, containers, serverless functions).
    * Network configuration (load balancers, firewalls, DNS).
    * Security considerations (IAM, network segmentation).
    * Configuration management (Ansible, Chef, Puppet, Terraform).
  • Common Choices:
    * Virtual Machines: AWS EC2, Azure VMs, GCP Compute Engine.
    * Container Orchestration: Kubernetes (EKS, AKS, GKE, OpenShift).
    * Serverless Platforms: AWS Lambda, Azure Functions, Google Cloud Functions.
    * Platform as a Service (PaaS): AWS Elastic Beanstalk, Azure App Service, Google App Engine, Heroku.

  • Impact on CI/CD: Influences the deployment strategy (e.g., blue/green, canary, rolling updates), required deployment tools, and necessary credentials/permissions.

2.6. Secret Management

Securely storing and injecting sensitive information (API keys, database credentials, tokens) into the pipeline.

  • Requirements:
    * Strong encryption at rest and in transit.
    * Granular access control (least privilege).
    * Auditing capabilities.
    * Integration with the CI/CD platform.

  • Common Choices: HashiCorp Vault, AWS Secrets Manager, Azure Key Vault, Google Secret Manager, built-in CI/CD secret management (GitHub Secrets, GitLab CI/CD Variables).
  • Impact on CI/CD: Critical for pipeline security, preventing hardcoded credentials, and ensuring compliance.
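Built-in secret stores inject values into jobs at runtime rather than hardcoding them in the repository. A sketch of this pattern in GitHub Actions syntax (the secret name and registry user are placeholders):

```yaml
steps:
  - name: Log in to registry
    env:
      # Injected at runtime from the platform's secret store;
      # never committed to SCM (DOCKER_TOKEN is a placeholder name)
      DOCKER_TOKEN: ${{ secrets.DOCKER_TOKEN }}
    run: echo "$DOCKER_TOKEN" | docker login -u my-user --password-stdin
```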

2.7. Monitoring & Logging

Observability solutions to track pipeline execution, application performance, and infrastructure health.

  • Requirements:
    * Centralized log aggregation.
    * Real-time metrics and dashboards.
    * Alerting capabilities.
    * Traceability across pipeline stages and deployed applications.

  • Common Choices: Prometheus/Grafana, ELK Stack (Elasticsearch, Logstash, Kibana), Splunk, Datadog, New Relic, cloud-native services (AWS CloudWatch, Azure Monitor, Google Cloud Logging/Monitoring).
  • Impact on CI/CD: Provides visibility into pipeline performance, helps in troubleshooting failures, and monitors the health of deployed applications.

2.8. Security & Compliance Tools

Integrating security scans and compliance checks throughout the pipeline ("Shift Left").

  • Requirements:
    * Static Application Security Testing (SAST).
    * Dynamic Application Security Testing (DAST).
    * Software Composition Analysis (SCA) for open-source vulnerabilities.
    * Container image scanning.
    * Infrastructure as Code (IaC) security scanning.
    * Compliance reporting.

  • Common Choices: SonarQube, Snyk, Aqua Security, Trivy, Checkmarx, Qualys, native cloud security services.
  • Impact on CI/CD: Embeds security early in the development lifecycle, reducing risks and ensuring adherence to regulatory standards.
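For instance, container image scanning can be inserted as an ordinary pipeline job. The sketch below uses the Trivy CLI (one of the tools listed above) in a GitLab CI job; the image reference is the GitLab-provided registry variable and the severity threshold is an assumption:

```yaml
container_scan:
  stage: test
  image:
    name: aquasec/trivy:latest
    entrypoint: [""]   # override the image entrypoint so GitLab can run the script
  script:
    # Fail the pipeline if HIGH or CRITICAL vulnerabilities are found
    - trivy image --exit-code 1 --severity HIGH,CRITICAL "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHA"
```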

3. General Infrastructure Considerations

Beyond individual components, several overarching factors influence infrastructure choices.

  • Scalability & Performance: Can the infrastructure handle peak loads (e.g., multiple concurrent builds, high deployment frequency) and grow with demand?
  • Cost Optimization: Balancing performance and reliability with budget constraints. Utilizing managed services, spot instances, and serverless compute can reduce costs.
  • Maintenance & Operational Overhead: The effort required to set up, maintain, and troubleshoot the infrastructure. Managed services typically reduce this, while self-hosted solutions increase it.
  • Integration & Ecosystem: How well do the chosen components integrate with existing tools and the broader technology stack?
  • Team Skillset & Learning Curve: The expertise required within the team to manage and operate the chosen infrastructure.
  • Compliance & Governance: Meeting industry-specific regulations (HIPAA, GDPR, PCI-DSS) and internal governance policies.
  • Hybrid/Multi-Cloud Strategy: If applicable, ensuring compatibility and consistency across different cloud providers or on-premise environments.

4. Preliminary Recommendations & Best Practices

Based on typical industry trends and best practices, we recommend the following foundational approaches:

  • Cloud-Native First Approach: Leverage cloud provider services (AWS, Azure, GCP) for SCM, CI/CD, artifact storage, and deployment targets. This generally offers higher scalability, reliability, and reduced operational overhead compared to purely on-premise solutions.
  • Managed Services Preference: Prioritize managed services (e.g., GitHub Actions, AWS ECR, Azure Key Vault) where possible. This offloads infrastructure management, patching, and scaling to the provider, allowing your team to focus on application development.
  • Infrastructure as Code (IaC): Define and manage all infrastructure components (deployment targets, networking, security groups) using IaC tools like Terraform or AWS CloudFormation. This ensures consistency, repeatability, and version control for your infrastructure.
  • Containerization for Builds & Deployments: Utilize Docker containers for build environments and application deployments. This provides environment consistency, isolation, and simplifies dependency management.
  • Security by Design: Integrate secret management, least-privilege access, and automated security scanning from the very beginning of the pipeline design.

5. Data Insights & Trends

  • Cloud CI/CD Dominance: Reports indicate a significant shift towards cloud-native CI/CD solutions. GitHub Actions, GitLab CI, and Azure DevOps Pipelines have seen substantial growth due to their seamless integration with SCM and managed service benefits.

*Insight:* Organizations increasingly prefer integrated, scalable, and low-maintenance CI/CD platforms.

  • Containerization as Standard: Docker and Kubernetes continue to be the de-facto standard for packaging and running applications, extending into CI/CD build environments.

*Insight:* Container-based builds offer reproducibility and isolation, reducing "it works on my machine" issues. Kubernetes is a primary deployment target for new applications.

  • Shift-Left Security: Integrating security testing into earlier stages of the CI/CD pipeline is a critical trend, moving away from post-deployment security checks.

*Insight:* Proactive security identification reduces remediation costs and risks significantly.

  • Observability is Key: Beyond basic monitoring, teams are adopting comprehensive observability practices (logs, metrics, traces) to gain deeper insights into pipeline performance and application behavior.

*Insight:* Better observability leads to faster troubleshooting and improved system reliability.

  • IaC Adoption: The use of IaC tools for provisioning and managing infrastructure is widespread, driven by the need for automation, consistency, and version control.

*Insight:* IaC is fundamental for achieving true CI/CD for infrastructure, enabling faster provisioning and disaster recovery.

6. Next Steps: Refining Your Infrastructure Profile

To proceed with generating your tailored CI/CD pipeline, we require more specific information about your existing environment and preferences. Please provide details on the following:

  1. Current SCM System: Do you currently use GitHub, GitLab, Bitbucket, Azure Repos, or something else? (e.g., "GitHub.com")
  2. Preferred CI/CD Orchestration Platform: Do you have a preference for GitHub Actions, GitLab CI, Jenkins, Azure DevOps, or are you open to recommendations? (e.g., "Open to recommendations, but leaning towards GitHub Actions")
  3. Cloud Provider(s): Are you primarily using AWS, Azure, Google Cloud Platform, a hybrid approach, or on-premise infrastructure? (e.g., "AWS primary, some on-premise")
  4. Deployment Target(s): Where will your applications be deployed? (e.g., "Kubernetes on EKS", "AWS Lambda", "Azure App Service", "On-premise VMs")
  5. Application Type(s): What kind of applications will this pipeline build and deploy? (e.g., "Microservices in Python/Node.js", "Java Spring Boot applications", ".NET Core web apps", "Static websites")
  6. Containerization Strategy: Do you already use Docker for your applications? Do you have a preferred container registry? (e.g., "Yes, Docker. Using Amazon ECR")
  7. Testing Frameworks: What testing frameworks are used (e.g., JUnit, Pytest, Jest)? Are there specific linting tools? (e.g., "Pytest for unit/integration, Black for linting")
  8. Security Requirements: Are there specific security scanning tools or compliance standards (e.g., SOC2, PCI-DSS) that need to be integrated? (e.g., "Snyk for SCA, no specific compliance yet")
  9. Existing Tooling: Are there any other existing tools or services (e.g., Jira, Slack, PagerDuty) that need to be integrated with the CI/CD pipeline? (e.g., "Slack for notifications, Jira for issue tracking")

Your input on these points will enable us to move to Step 2: Define CI/CD Tool and Workflow and generate a tailored pipeline configuration.

gemini Output

This document provides comprehensive, detailed CI/CD pipeline configurations for GitHub Actions, GitLab CI, and Jenkins. These configurations are designed to cover standard DevOps practices including linting, testing, building, and deployment, and are presented with clear explanations and guidance for adaptation.

For consistency and clarity, all examples assume a Node.js application that is built into a Docker image and deployed to generic Staging and Production environments. You can adapt these examples to fit your specific technology stack and deployment targets.


1. Introduction to CI/CD Pipeline Configurations

A robust CI/CD pipeline is fundamental for modern software development, enabling automated and efficient delivery of applications. This output delivers actionable pipeline configurations for three leading platforms:

  • GitHub Actions: Integrated directly into GitHub repositories, offering powerful automation capabilities.
  • GitLab CI: Built into GitLab, providing a seamless experience for projects hosted on GitLab.
  • Jenkins: A highly extensible, open-source automation server, widely used for its flexibility.

Each configuration includes stages for:

  • Linting: Ensuring code quality and adherence to coding standards.
  • Testing: Running unit and integration tests to validate functionality.
  • Building: Compiling source code, generating artifacts, and creating Docker images.
  • Deployment: Automating the release of applications to staging and production environments.

2. GitHub Actions CI/CD Pipeline Configuration

GitHub Actions allows you to automate, customize, and execute your software development workflows directly in your repository. Workflows are defined using YAML files and run on events like pushes or pull requests.

2.1. Key Concepts

  • Workflow: An automated process composed of one or more jobs. Defined in a .yml file in .github/workflows/.
  • Event: An activity that triggers a workflow (e.g., push, pull_request, workflow_dispatch).
  • Job: A set of steps that execute on the same runner. Jobs can run in parallel or sequentially.
  • Step: An individual task within a job, which can be a shell command or an action.
  • Action: A reusable unit of work provided by GitHub, the community, or custom-built.
  • Runner: A server that runs your workflow when it's triggered. GitHub-hosted runners are default, but self-hosted runners are also an option.

2.2. Example Configuration (.github/workflows/main.yml)

Create this file in your repository: .github/workflows/main.yml


```yaml
name: Node.js CI/CD Pipeline

on:
  push:
    branches:
      - main
      - develop
  pull_request:
    branches:
      - main
      - develop
  workflow_dispatch: # Allows manual triggering of the workflow

env:
  NODE_VERSION: '16.x' # Specify Node.js version
  DOCKER_IMAGE_NAME: my-node-app # Name for your Docker image

jobs:
  lint:
    name: Lint Code
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm' # Cache npm dependencies

      - name: Install Dependencies
        run: npm ci

      - name: Run Lint
        run: npm run lint

  test:
    name: Run Tests
    runs-on: ubuntu-latest
    needs: lint # This job depends on the 'lint' job completing successfully
    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: ${{ env.NODE_VERSION }}
          cache: 'npm'

      - name: Install Dependencies
        run: npm ci

      - name: Run Unit and Integration Tests
        run: npm test

  build_and_push_docker:
    name: Build & Push Docker Image
    runs-on: ubuntu-latest
    needs: test # This job depends on the 'test' job completing successfully
    if: github.event_name == 'push' # Only build and push on push events

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      - name: Log in to Docker Hub
        uses: docker/login-action@v2
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_TOKEN }}

      - name: Build and Push Docker Image
        uses: docker/build-push-action@v4
        with:
          context: .
          push: true
          # Tag with the commit SHA; also tag 'latest' for main branch pushes
          tags: |
            ${{ secrets.DOCKER_USERNAME }}/${{ env.DOCKER_IMAGE_NAME }}:${{ github.sha }}
            ${{ github.ref == 'refs/heads/main' && format('{0}/{1}:latest', secrets.DOCKER_USERNAME, env.DOCKER_IMAGE_NAME) || '' }}

  deploy_staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    needs: build_and_push_docker
    if: github.event_name == 'push' # Only deploy on push events
    environment:
      name: Staging # Links to a GitHub Environment for protection rules
      url: https://staging.your-app.com # Optional: URL for the deployed environment

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v3

      - name: Configure AWS Credentials (Example for AWS ECR/ECS/EKS)
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Deploy to Staging Environment
        # Placeholder: replace with your real deployment command, e.g.
        # `aws ecs update-service` or `kubectl set image` for your platform.
        run: echo "Deploying ${{ env.DOCKER_IMAGE_NAME }}:${{ github.sha }} to Staging"
```
`.gitlab-ci.yml`:

```yaml
image: node:18-alpine # Use a Node.js Docker image as base for all jobs

stages:
  - build
  - test
  - deploy

variables:
  # Define global variables
  NPM_CACHE_DIR: "$CI_PROJECT_DIR/.npm"
  # Optional: DOCKER_IMAGE_NAME: "$CI_REGISTRY_IMAGE/$CI_COMMIT_REF_SLUG:$CI_COMMIT_SHA"

cache:
  key: ${CI_COMMIT_REF_SLUG} # Cache per branch/MR
  paths:
    - node_modules/
    - .npm/ # Cache npm packages globally

before_script:
  - echo "Starting CI/CD for $CI_PROJECT_NAME on branch $CI_COMMIT_REF_NAME"
  - npm config set cache $NPM_CACHE_DIR
  - npm ci --cache $NPM_CACHE_DIR # Install dependencies, using cache
  - echo "Dependencies installed."

build_job:
  stage: build
  script:
    - echo "Running build stage..."
    - npm run build
    - echo "Build completed."
  artifacts:
    paths:
      - build/ # Adjust path to your build output directory
    expire_in: 1 day # How long to keep the artifact
  only:
    - main

lint_job:
  stage: test
  script:
    - echo "Running lint stage..."
    - npm run lint
    - echo "Lint completed."
  only:
    - main
    - merge_requests

test_job:
  stage: test
  script:
    - echo "Running test stage..."
    - npm run test
    - echo "Tests completed."
  coverage: '/All files[^|]*\|[^|]*\s+([\d\.]+)/' # Example regex for coverage parsing
  only:
    - main
    - merge_requests

deploy_staging_job:
  stage: deploy
  image: alpine:latest # Minimal image for deployment (no Node.js needed)
  before_script: [] # Override the global before_script: this image has no npm
  script:
    - echo "Deploying to staging environment..."
    - apk add --no-cache rsync openssh-client # Alpine uses apk, not apt-get
    - eval "$(ssh-agent -s)"
    - echo "$SSH_PRIVATE_KEY" | tr -d '\r' | ssh-add - > /dev/null
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    # Placeholder: copy artifacts to the staging host, e.g.
    # - rsync -avz build/ user@staging-host:/var/www/app/
  only:
    - main
```
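For Jenkins, the third platform this document covers, an equivalent declarative Jenkinsfile implementing the same lint, test, build, and deploy stages might look like the sketch below. The agent choice, npm script names, artifact path, and the deploy command are placeholder assumptions, not a verified configuration.

```groovy
// Jenkinsfile (declarative pipeline), checked into the repository root
pipeline {
    agent any // Placeholder: pin to a labeled agent with Node.js installed

    stages {
        stage('Install') {
            steps {
                sh 'npm ci' // clean install from package-lock.json
            }
        }
        stage('Lint') {
            steps {
                sh 'npm run lint'
            }
        }
        stage('Test') {
            steps {
                sh 'npm run test'
            }
        }
        stage('Build') {
            steps {
                sh 'npm run build'
                // Keep build outputs attached to the Jenkins run
                archiveArtifacts artifacts: 'build/**', fingerprint: true
            }
        }
        stage('Deploy') {
            when { branch 'main' } // deploy only from main
            steps {
                // Placeholder: replace with your real deployment command,
                // e.g. rsync to a server or docker push + service update.
                sh 'echo "Deploying build to production"'
            }
        }
    }
}
```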

## Setup ```bash python3 -m venv .venv source .venv/bin/activate pip install -r requirements.txt ``` ## Run ```bash python main.py ``` "); zip.file(folder+".gitignore",".venv/ __pycache__/ *.pyc .env .DS_Store "); } /* --- Node.js --- */ function buildNode(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^```[w]* ?/m,"").replace(/ ?```$/m,"").trim(); var depMap={"mongoose":"^8.0.0","dotenv":"^16.4.5","axios":"^1.7.9","cors":"^2.8.5","bcryptjs":"^2.4.3","jsonwebtoken":"^9.0.2","socket.io":"^4.7.4","uuid":"^9.0.1","zod":"^3.22.4","express":"^4.18.2"}; var deps={}; Object.keys(depMap).forEach(function(k){if(src.indexOf(k)>=0)deps[k]=depMap[k];}); if(!deps["express"])deps["express"]="^4.18.2"; var pkgJson=JSON.stringify({"name":pn,"version":"1.0.0","main":"src/index.js","scripts":{"start":"node src/index.js","dev":"nodemon src/index.js"},"dependencies":deps,"devDependencies":{"nodemon":"^3.0.3"}},null,2)+" "; zip.file(folder+"package.json",pkgJson); var fallback="const express=require("express"); const app=express(); app.use(express.json()); app.get("/",(req,res)=>{ res.json({message:""+title+" API"}); }); const PORT=process.env.PORT||3000; app.listen(PORT,()=>console.log("Server on port "+PORT)); "; zip.file(folder+"src/index.js",src||fallback); zip.file(folder+".env.example","PORT=3000 "); zip.file(folder+".gitignore","node_modules/ .env .DS_Store "); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. 
## Setup ```bash npm install ``` ## Run ```bash npm run dev ``` "); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:" "+title+" "+code+" "; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */ *{margin:0;padding:0;box-sizing:border-box} body{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e} "); zip.file(folder+"script.js","/* "+title+" — scripts */ "); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Open Double-click `index.html` in your browser. Or serve locally: ```bash npx serve . # or python3 -m http.server 3000 ``` "); zip.file(folder+".gitignore",".DS_Store node_modules/ .env "); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ 
buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/**(.+?)**/g,"$1"); hc=hc.replace(/ {2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. Files: - "+app+".md (Markdown) - "+app+".html (styled HTML) "); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); }function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}