AI Code Review

AI Code Review Report: Comprehensive Refactoring & Suggestions

This report provides a detailed analysis of the submitted codebase, identifying areas for improvement across various dimensions, along with specific refactoring suggestions and actionable recommendations. Our AI-driven review aims to enhance code quality, maintainability, performance, security, and robustness.


1. Overall Code Health Assessment

Summary: The codebase exhibits a solid foundation with clear intent. However, there are several opportunities to enhance its long-term maintainability, efficiency, and adherence to modern best practices. Key areas for improvement include simplifying complex logic, optimizing resource usage, and strengthening error handling mechanisms.

Overall Rating: Good with significant potential for optimization and refinement.


2. Executive Summary of Key Findings

Based on our comprehensive analysis, the most impactful findings and recommendations are:


3. Detailed Review Categories & Refactoring Suggestions

This section breaks down the review into specific categories, providing observations, recommendations, and illustrative refactoring suggestions.

3.1. Readability & Maintainability

Observations:

Recommendations:

Refactoring Suggestions (Illustrative):

*   **Example: Optimize Loop Operations:**
    *   Replace eagerly built list comprehensions such as `[item for item in items if condition]` with generator expressions or `filter()` when the result is only iterated once; this avoids materializing an intermediate list in memory.
    *   Pre-compute values outside loops if they don't change within the loop.
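A short sketch of both points (the values are illustrative):

```python
import math

# Eager: the list comprehension materializes every element before summing.
total_eager = sum([n * n for n in range(100_000) if n % 2 == 0])

# Lazy: a generator expression yields one value at a time, so peak memory stays flat.
total_lazy = sum(n * n for n in range(100_000) if n % 2 == 0)

# Pre-compute loop-invariant work once, outside the loop.
factor = math.sqrt(2)  # constant across iterations; no need to recompute inside
scaled = [n * factor for n in range(10)]
```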

#### 3.3. Security Vulnerabilities & Best Practices

**Observations:**
*   **Inadequate Input Validation:** Lack of strict validation for user-supplied input, potentially leading to injection attacks (SQL, XSS, command).
*   **Hardcoded Credentials/Secrets:** Sensitive information (API keys, database passwords) found directly in the codebase.
*   **Insufficient Authorization Checks:** Missing or weak checks to ensure users have the necessary permissions for actions.
*   **Logging of Sensitive Data:** Potential for logging sensitive user data (e.g., passwords, PII) in plain text.

**Recommendations:**
*   **Implement Strict Input Validation:** Sanitize and validate all external inputs against expected types, formats, and lengths. Use parameterized queries for database interactions.
*   **Externalize Secrets:** Use environment variables, secret management services (e.g., AWS Secrets Manager, HashiCorp Vault), or configuration files for credentials.
*   **Principle of Least Privilege:** Ensure all components and users only have access to the resources absolutely necessary.
*   **Secure Logging:** Mask or redact sensitive information before logging.
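As one illustration of externalizing secrets, a credential can be read from the environment at startup; the `DB_PASSWORD` name and `get_db_password` helper here are hypothetical, and in a real deployment a secret manager would typically inject the value:

```python
import os

def get_db_password() -> str:
    """Read the database password from the environment instead of source code.

    DB_PASSWORD is a hypothetical variable name; in production the value might
    be injected by AWS Secrets Manager, Vault, or similar at deploy time.
    """
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        raise RuntimeError("DB_PASSWORD is not set; refusing to start.")
    return password
```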

**Refactoring Suggestions (Illustrative):**
*   **Example: Use Parameterized Queries:**
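A minimal sketch using Python's built-in `sqlite3` module (the table and column names are illustrative); placeholders keep user data separate from the SQL text, so crafted input cannot rewrite the query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("alice",))

# Unsafe alternative: f"SELECT ... WHERE name = '{user_input}'" would let this
# string terminate the literal and inject its own statements.
user_input = "alice'; DROP TABLE users; --"
rows = conn.execute(
    "SELECT id, name FROM users WHERE name = ?", (user_input,)
).fetchall()
assert rows == []  # the malicious string matched nothing and executed nothing

rows = conn.execute("SELECT name FROM users WHERE name = ?", ("alice",)).fetchall()
```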
    

AI Code Review Report: process_user_data Function

Date: October 26, 2023

Reviewer: PantheraHive AI

Workflow Step: collab → analyze_code

Objective: Comprehensive code review with suggestions and refactoring for the provided process_user_data function.


1. Overall Summary

The process_user_data function aims to parse a semi-colon delimited string of JSON objects, extract user information, and categorize users as adults based on age, producing a list of processed dictionaries.

Strengths:

  • Basic error handling for json.JSONDecodeError and general exceptions is present.
  • Attempts to extract relevant fields and handle missing keys using .get().
  • Includes logic to filter adults and extract specific data points like email domain and a truncated ID.

Areas for Improvement:

The current implementation can be significantly improved in terms of readability, maintainability, robustness, and adherence to the Single Responsibility Principle. There are opportunities to enhance error handling, improve data structure clarity, and make the function more modular and testable. The parsing and processing logic are tightly coupled, making it harder to debug or extend.


2. Key Findings & Recommendations

Below is a detailed breakdown of findings, categorized for clarity, along with actionable recommendations.

2.1. Readability & Maintainability

  • Finding 2.1.1: Lack of Type Hints

* The function signature and internal variables lack type hints, which reduces code clarity and makes it harder for static analysis tools or other developers to understand expected inputs and outputs.

* Recommendation: Add type hints to the function signature (data_list_str: str) and for complex internal variables (e.g., user_data: dict).

  • Finding 2.1.2: Magic Strings/Numbers

* The delimiter ';' and the age threshold 18 are hardcoded "magic values" within the function. If these values need to change, they must be found and updated directly in the code, which is error-prone.

* Recommendation: Define these as constants at the module level or pass them as parameters if they need to be configurable per call.

  • Finding 2.1.3: Complex Conditional Logic

* The if 'age' in user_data and user_data['age'] > 18: block contains a significant amount of data extraction and transformation logic. This nested complexity can make the code harder to follow and modify.

* Recommendation: Extract the logic for processing an adult user into a separate helper function.

  • Finding 2.1.4: Missing Docstrings

* The function lacks a docstring, which is crucial for explaining its purpose, arguments, what it returns, and any potential exceptions or side effects.

* Recommendation: Add a comprehensive docstring following a standard format (e.g., reStructuredText, Google, NumPy).

  • Finding 2.1.5: Inconsistent Error Handling (Printing vs. Logging/Raising)

* Errors are currently handled by printing messages to stdout. In a production environment, print() statements are generally insufficient for error reporting. Errors should ideally be logged with appropriate severity levels or, in some cases, re-raised after specific handling.

* Recommendation: Replace print() with a proper logging mechanism (e.g., Python's logging module). Consider if certain errors should halt processing or return a specific error status.
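A hedged sketch of the logging pattern (the parse_item helper is hypothetical); logger.exception records the full traceback at ERROR severity, which print() cannot do:

```python
import json
import logging

logger = logging.getLogger(__name__)

def parse_item(raw: str):
    """Parse one JSON fragment, logging failures instead of printing them."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Captures the active exception and its traceback in the log record.
        logger.exception("Failed to parse item %r", raw)
        return None
```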

2.2. Robustness & Error Handling

  • Finding 2.2.1: Broad Exception Catching (except Exception as e)

* Catching a generic Exception can mask underlying issues and make debugging difficult. It's best practice to catch specific exceptions.

* Recommendation: Identify and catch more specific exceptions if possible, or at least log the full traceback for the generic exception.

  • Finding 2.2.2: Insufficient Input Validation

* The input data_list_str is assumed to be a string. split(';') only works when it actually is one; a None or other unexpected type would raise an AttributeError before any of the function's error handling runs.

* Recommendation: Add an initial check for the input type and value.

  • Finding 2.2.3: Ambiguous Error Reporting for Malformed Data

* When json.JSONDecodeError occurs, the function prints a message and skips. The final processed_results list gives no indication of which items failed or why.

* Recommendation: Consider returning a tuple of (successful_results, failed_items) or including an error_message field in the result for failed items to provide better feedback.

2.3. Performance & Efficiency

  • Finding 2.3.1: Repeated String Operations

* user_data.get('email', '') is called multiple times, and split('@') is also called twice on the same string in the adult processing branch.

* Recommendation: Store the result of user_data.get('email', '') in a temporary variable and perform split('@') once, storing its result if needed.

  • Finding 2.3.2: Redundant Calculations

* The full_name.strip() is called after concatenation. It's more efficient to strip individual name parts before concatenation, especially if one or both could be empty.

* Recommendation: Strip the first_name and last_name components before concatenating them.
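Both micro-optimizations can be sketched together (the sample data is illustrative); str.partition splits at most once and never raises:

```python
user_data = {"email": "jane.doe@example.com", "first_name": " Jane ", "last_name": "Doe"}

# Fetch the email once and split it once.
email = user_data.get("email", "")
_local, _sep, email_domain = email.partition("@")

# Strip the name parts before concatenating; a final strip still guards
# against a lone part leaving a stray space.
first = (user_data.get("first_name") or "").strip()
last = (user_data.get("last_name") or "").strip()
full_name = f"{first} {last}".strip()
```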

2.4. Design & Architecture

  • Finding 2.4.1: Single Responsibility Principle Violation

* The function is responsible for:

1. Splitting a string.

2. Parsing JSON.

3. Validating user data (age).

4. Transforming user data.

5. Handling errors at multiple stages.

* This makes the function harder to test, maintain, and reuse.

* Recommendation: Break down the function into smaller, more focused functions: one for parsing a single JSON string, one for processing a single user dictionary, and an orchestrator function that ties them together.

  • Finding 2.4.2: Tight Coupling

* The parsing logic (splitting and json.loads) is tightly coupled with the business logic (age check, data transformation). This means if the input format changes (e.g., list of dicts instead of semi-colon delimited string), the entire function needs modification.

* Recommendation: Decouple the parsing step from the processing step. The main function should ideally accept an iterable of already-parsed user dictionaries.

  • Finding 2.4.3: Lack of Data Structures for Intermediate Results

* The user_data is a generic dictionary. While functional, using a typing.TypedDict or a simple dataclass could provide better structure, type safety, and readability for the user profile data.

* Recommendation: Define a dataclass for UserData to represent the parsed user information, making it easier to work with.


3. Refactoring Suggestions & Improved Code

3.1. Proposed Refactoring Strategy

The refactoring will focus on:

  1. Modularity: Breaking down the monolithic function into smaller, single-responsibility components.
  2. Type Safety: Introducing type hints and potentially a dataclass for clarity.
  3. Robustness: Improving error handling with proper logging and specific exceptions.
  4. Readability: Enhancing code clarity through better variable names, constants, and structured logic.
  5. Configurability: Making magic numbers/strings configurable.

3.2. Refactored Code Example

Here's a production-ready version of the process_user_data functionality, broken into modular components.


import json
import logging
from typing import List, Dict, Any, Optional, Tuple
from dataclasses import dataclass, field

# --- Configuration Constants ---
JSON_DELIMITER = ';'
ADULT_AGE_THRESHOLD = 18

# --- Setup Logging ---
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

# --- Data Structures ---
@dataclass
class UserProfile:
    """Represents a standardized user profile after initial parsing."""
    user_id: Optional[int] = None
    first_name: Optional[str] = None
    last_name: Optional[str] = None
    age: Optional[int] = None
    email: Optional[str] = None
    # Additional fields from raw data can be added here

@dataclass
class ProcessedUserResult:
    """Represents the final processed user data."""
    user_id: Optional[int]
    name: str
    is_adult: bool
    email_domain: Optional[str] = None
    short_id: Optional[str] = None
    errors: List[str] = field(default_factory=list)

# --- Helper Functions ---

def parse_json_string(json_str: str) -> Tuple[Optional[Dict[str, Any]], Optional[str]]:
    """
    Parses a single JSON string into a dictionary.

    Args:
        json_str: The JSON string to parse.

    Returns:
        A tuple containing the parsed dictionary (or None on error) and an error message (or None).
    """
    trimmed_str = json_str.strip()
    if not trimmed_str:
        return None, "Empty string provided for JSON parsing."
    try:
        return json.loads(trimmed_str), None
    except json.JSONDecodeError as e:
        return None, f"Invalid JSON format: {e} for string '{trimmed_str}'"
    except Exception as e:
        return None, f"Unexpected error during JSON parsing: {e} for string '{trimmed_str}'"

def transform_raw_user_data(raw_data: Dict[str, Any]) -> UserProfile:
    """
    Transforms a raw dictionary into a structured UserProfile dataclass.

    Args:
        raw_data: A dictionary containing raw user information.

    Returns:
        A UserProfile instance.
    """
    return UserProfile(
        user_id=raw_data.get('id'),
        first_name=raw_data.get('first_name'),
        last_name=raw_data.get('last_name'),
        age=raw_data.get('age'),
        email=raw_data.get('email')
    )

def process_single_user_profile(user_profile: UserProfile) -> ProcessedUserResult:
    """
    Processes a single UserProfile to determine adult status and extract derived fields.

    Args:
        user_profile: A UserProfile dataclass instance.

    Returns:
        A ProcessedUserResult instance.
    """
    is_adult = user_profile.age is not None and user_profile.age > ADULT_AGE_THRESHOLD

    first = (user_profile.first_name or '').strip()
    last = (user_profile.last_name or '').strip()
    full_name = f"{first} {last}".strip()

    email_domain: Optional[str] = None
    if user_profile.email and '@' in user_profile.email:
        email_domain = user_profile.email.split('@', 1)[1]

    # The truncation length for the short ID is illustrative.
    short_id = str(user_profile.user_id)[:8] if user_profile.user_id is not None else None

    return ProcessedUserResult(
        user_id=user_profile.user_id,
        name=full_name,
        is_adult=is_adult,
        email_domain=email_domain,
        short_id=short_id,
    )
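The listing does not reach the orchestrator function recommended in Finding 2.4.1. A minimal, self-contained sketch of that top-level function follows; it repeats a simplified parse_json_string so the snippet runs on its own, and it returns failed items alongside successes as Finding 2.2.3 suggests:

```python
import json
import logging
from typing import Any, Dict, List, Optional, Tuple

logger = logging.getLogger(__name__)

JSON_DELIMITER = ';'

def parse_json_string(json_str: str) -> Tuple[Optional[Dict[str, Any]], Optional[str]]:
    """Simplified stand-in for the helper defined in the listing above."""
    trimmed = json_str.strip()
    if not trimmed:
        return None, "Empty string provided for JSON parsing."
    try:
        return json.loads(trimmed), None
    except json.JSONDecodeError as e:
        return None, f"Invalid JSON format: {e}"

def process_user_data(
    data_list_str: str,
    delimiter: str = JSON_DELIMITER,
) -> Tuple[List[Dict[str, Any]], List[str]]:
    """Split and parse a delimited string of JSON objects.

    Returns:
        (parsed_users, errors): the successfully parsed dictionaries, plus one
        error message per item that failed, so callers can see what was skipped.
    """
    if not isinstance(data_list_str, str) or not data_list_str.strip():
        return [], ["Input must be a non-empty string."]

    parsed_users: List[Dict[str, Any]] = []
    errors: List[str] = []
    for chunk in data_list_str.split(delimiter):
        raw, error = parse_json_string(chunk)
        if error is not None:
            logger.warning("Skipping item: %s", error)
            errors.append(error)
        else:
            parsed_users.append(raw)
    return parsed_users, errors
```

Each dictionary in parsed_users would then flow through transform_raw_user_data and process_single_user_profile from the listing.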

3.6. Scalability & Architectural Considerations

Observations:

  • Potential Single Points of Failure: Centralized components without redundancy could become bottlenecks.
  • Lack of Asynchronous Processing: Synchronous operations for long-running tasks can block main threads/processes.
  • Monolithic Structure: Components are tightly integrated, making it challenging to scale parts of the application independently.

Recommendations:

  • Introduce Message Queues: Decouple heavy processing tasks using message queues (e.g., RabbitMQ, Kafka, SQS).
  • Implement Load Balancing: Distribute incoming requests across multiple instances of services.
  • Modularize Components: Break down large applications into smaller, independent services or modules.

Refactoring Suggestions (Illustrative):

  • Example: Asynchronous Task Processing:

* Move email sending, image processing, or complex report generation to a background worker queue instead of processing them inline with web requests.

* Utilize async/await patterns for I/O-bound operations if applicable to the language/framework.
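As one hedged illustration (the send_email function and address are hypothetical), inline work can be handed to a worker pool so the request handler returns promptly; the same shape applies when the pool is replaced by a durable message queue:

```python
from concurrent.futures import ThreadPoolExecutor
import time

def send_email(recipient: str) -> str:
    """Hypothetical slow task; the sleep stands in for network I/O."""
    time.sleep(0.01)
    return f"sent to {recipient}"

# Submit the task to a worker pool instead of running it inline with the
# request; a production system would enqueue it to RabbitMQ, Kafka, or SQS.
with ThreadPoolExecutor(max_workers=4) as executor:
    future = executor.submit(send_email, "user@example.com")
    # ... the request could finish here; the result is collected later.
    result = future.result()
```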


4. Prioritized Refactoring Actions

Based on the detailed review, here is a prioritized list of actionable refactoring tasks, ordered by estimated impact and urgency:

| Priority | Category                    | Specific Action                                              | Estimated Impact | Estimated Effort |
| :------- | :-------------------------- | :----------------------------------------------------------- | :--------------- | :--------------- |
| High     | Readability/Maintainability | Break down process_complex_data function.                    | High             | Medium           |
| High     | Security                    | Implement parameterized queries for all DB access.           | High             | Medium           |
| Medium   | Performance                 | Introduce caching for get_user_settings calls.               | Medium           | Low              |
| Medium   | Error Handling              | Standardize API error responses to a consistent JSON format. | Medium           | Medium           |
| Low |
