AI Code Review
Run ID: 69cae53ac8ebe3066ba6f2892026-03-30Development

AI Code Review: Comprehensive Analysis and Refactoring Suggestions

Project/Workflow: AI Code Review

Step 1 of 2: collab → analyze_code

Date: October 26, 2023

Reviewer: PantheraHive AI Code Review Engine


1. Introduction and Overview

This document presents a comprehensive AI-powered code review for the provided Python code snippet. The goal of this review is to identify potential issues related to readability, maintainability, performance, error handling, security, and adherence to best practices. We aim to offer actionable recommendations and a refactored version of the code that is cleaner, more robust, and production-ready.

Overall Assessment:

The provided code implements a data processing logic for a list of user dictionaries, calculating a score, and filtering based on various criteria. The core logic is functional, but there are several areas for improvement concerning code structure, error handling, data validation, and adherence to Pythonic conventions.

Key Strengths:

*   The core scoring and filtering logic is functional and produces correct results for well-formed input.
*   Basic presence checks for expected keys are already in place.

Key Areas for Improvement:

*   Code structure and readability (docstrings, type hints, magic numbers, deep nesting).
*   Error handling and data validation (structured logging instead of printed warnings, explicit type checks).
*   Adherence to Pythonic conventions (input immutability, descriptive naming).


2. Original Code Submitted for Review

(Original code snippet, 4,915 characters; not reproduced in this export.)
---

### 3. Detailed Code Review Findings and Refactoring Suggestions

This section breaks down the review into specific categories, highlighting issues and providing actionable recommendations.

#### 3.1. Readability and Maintainability

*   **Issue 1: Lack of Docstrings.** The function `process_user_data` lacks a docstring, making it difficult to understand its purpose, arguments, and return value without inspecting the code.
    *   **Recommendation:** Add a comprehensive docstring following PEP 257 conventions.
*   **Issue 2: Absence of Type Hints.** No type hints are used for function arguments or return values, which reduces code clarity and makes static analysis tools less effective.
    *   **Recommendation:** Implement type hints for all function parameters and the return value (PEP 484).
*   **Issue 3: Magic Numbers.** The numbers `18` (for age check) and `2` (for age division) are hardcoded, reducing readability and making future modifications harder.
    *   **Recommendation:** Define these as named constants with descriptive names.
*   **Issue 4: Nested Conditionals.** The deeply nested `if` statements (e.g., `if user_score > min_score_threshold: if user_data['age'] > 18:`) can make the logic harder to follow.
    *   **Recommendation:** Consider flattening conditions using `and` or refactoring into smaller helper functions.
*   **Issue 5: Direct Mutation of Input Data.** The function modifies the input `user_data` dictionaries by adding `calculated_score` and `status` keys. This can lead to unexpected side effects in other parts of the calling code that might still hold references to the original `user_data` objects.
    *   **Recommendation:** Process a *copy* of the user data or construct new dictionaries for the output, maintaining immutability of the input.
*   **Issue 6: Generic Function Name.** `process_user_data` is quite generic. A more descriptive name could better convey its specific filtering and scoring purpose.
    *   **Recommendation:** Rename the function to something like `calculate_and_filter_user_scores` or `get_eligible_users_with_scores`.
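Taken together, the readability recommendations above can be sketched as follows. This is a minimal illustration, not the reviewed code itself; the function name, constants, and field names are assumptions based on the findings:

```python
from typing import Any

# Named constants replace the magic numbers 18 and 2 (Issue 3).
MIN_ELIGIBLE_AGE = 18
AGE_SCORE_DIVISOR = 2

def calculate_and_filter_user_scores(
    users: list[dict[str, Any]],
    min_score_threshold: float,
    bonus_multiplier: float,
) -> list[dict[str, Any]]:
    """Return copies of eligible users annotated with 'calculated_score'.

    The input dictionaries are never mutated; each eligible user is
    shallow-copied before the score is attached (Issue 5).
    """
    eligible = []
    for user in users:
        score = user['points'] * bonus_multiplier + user['age'] / AGE_SCORE_DIVISOR
        # Flattened condition instead of nested ifs (Issue 4).
        if score > min_score_threshold and user['age'] > MIN_ELIGIBLE_AGE:
            enriched = user.copy()  # leave the caller's dict untouched
            enriched['calculated_score'] = score
            eligible.append(enriched)
    return eligible
```

Note that the caller's dictionaries are left untouched; only the shallow copies carry the new key.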

#### 3.2. Robustness and Error Handling

*   **Issue 1: Inconsistent Error Reporting.** Missing data is handled by printing a `Warning` to `stdout`. This is not a scalable or structured way to report issues in a production environment. It also doesn't prevent the function from continuing to process potentially invalid data.
    *   **Recommendation:** Use Python's `logging` module for warnings and errors. For critical missing data, consider raising a specific exception (e.g., `ValueError`) or returning an empty/error indicator for that specific user.
*   **Issue 2: Potential for KeyErrors.** While `if 'key' in user_data` checks are present, the membership test and the later `user_data['points']` access can drift apart as the code evolves, reintroducing `KeyError`s. Pairing a check with a separate lookup is a fragile pattern; `user_data.get('key', default_value)` expresses the same intent in a single, safer step.
    *   **Recommendation:** Use `dict.get()` with a default value, or robustly validate *all* required keys at the beginning of processing each user, potentially skipping or erroring out for malformed entries.
*   **Issue 3: Silent Failures for Invalid Data Types.** The code assumes `age` and `points` are numeric. If they are strings or other non-numeric types, a `TypeError` will occur during arithmetic operations, crashing the application.
    *   **Recommendation:** Add explicit type validation and conversion (e.g., `int()`, `float()`) with `try-except` blocks.
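A minimal sketch of this defensive pattern, assuming the `age`/`points` key names from the snippet; the helper itself is illustrative, not part of the reviewed code:

```python
import logging
from typing import Optional, Tuple

logger = logging.getLogger(__name__)

def extract_numeric_fields(user: dict) -> Optional[Tuple[float, float]]:
    """Return (age, points) as floats, or None if the record is unusable."""
    age_raw = user.get('age')          # .get() avoids KeyError on a missing key
    points_raw = user.get('points', 0)  # illustrative default of 0 points
    if age_raw is None:
        logger.warning("User record missing 'age': %r", user)
        return None
    try:
        return float(age_raw), float(points_raw)
    except (TypeError, ValueError):
        # Strings like 'abc' or values like None fail here, with a logged
        # warning, instead of crashing later during arithmetic.
        logger.warning("Non-numeric 'age'/'points' in record: %r", user)
        return None
```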

#### 3.3. Performance and Efficiency

*   **Issue 1: Redundant Dictionary Lookups.** While minor, repeatedly accessing `user_data['points']` and `user_data['age']` within the loop could be slightly optimized by assigning them to local variables once per iteration.
    *   **Recommendation:** Store frequently accessed dictionary values in local variables.
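As a small illustration (the performance gain is minor; the readability benefit is the main point), with an assumed halving of age in the score as in the reviewed snippet:

```python
def score_users(users: list, bonus_multiplier: float) -> list:
    """Compute scores, reading each dict key once per iteration."""
    scores = []
    for user in users:
        age = user['age']        # one lookup each, instead of repeating
        points = user['points']  # user['age'] / user['points'] below
        scores.append(points * bonus_multiplier + age / 2)
    return scores
```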

#### 3.4. Security Considerations

*   No direct security vulnerabilities identified in this specific snippet. However, if `user_data` were to come from untrusted external sources, further sanitization and validation would be crucial to prevent injection attacks or data manipulation.

#### 3.5. Testability

*   **Issue 1: Tight Coupling.** The scoring logic, filtering logic, and status assignment are all tightly coupled within one large function. This makes it challenging to write unit tests for each individual piece of logic.
    *   **Recommendation:** Break down the function into smaller, single-responsibility helper functions (e.g., `_calculate_score`, `_is_eligible`, `_get_user_status`). This improves modularity and testability.

---

### 4. Refactored Production-Ready Code

This section provides a refactored version of the original code, incorporating the recommendations outlined above.

```python
import logging
from typing import List, Dict, Any, Union

# Configure logging for structured error reporting
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')

# --- Constants ---
# Define magic numbers as named constants for better readability and maintainability
MIN_ELIGIBLE_AGE = 18
AGE_SCORE_DIVISOR = 2
REQUIRED_USER_KEYS = ['name', 'age', 'points']

# --- Helper Functions ---

def _validate_user_data(user_data: Dict[str, Any]) -> bool:
    """
    Validates if a user dictionary contains all required keys and correct data types.
    Logs warnings for missing or invalid data.
    """
    for key in REQUIRED_USER_KEYS:
        if key not in user_data:
            logging.warning(f"Skipping user due to missing key '{key}': {user_data}")
            return False

    # Validate data types for 'age' and 'points'
    try:
        if not isinstance(user_data['age'], (int, float)):
            raise TypeError(f"Age must be numeric, got {type(user_data['age'])}")
        if not isinstance(user_data['points'], (int, float)):
            raise TypeError(f"Points must be numeric, got {type(user_data['points'])}")
    except TypeError as e:
        logging.warning(f"Skipping user due to invalid data type: {e} in {user_data}")
        return False
    except Exception as e:
        logging.error(f"Unexpected error during user data validation: {e} for {user_data}")
        return False

    return True

def _calculate_user_score(age: Union[int, float], points: Union[int, float], bonus_multiplier: float) -> float:
    """
    Calculates the user's score based on points, age, and a bonus multiplier.
    """
    return (points * bonus_multiplier) + (age / AGE_SCORE_DIVISOR)

def _get_user_status(age: Union[int, float], score: float, min_score_threshold: float) -> str:
    """
    Determines the user's status based on age and score thresholds.
    """
    if score < min_score_threshold:
        return 'Score Too Low'
    if age < MIN_ELIGIBLE_AGE:
        return 'Too Young'
    return 'Eligible'

# --- Main Processing Function ---

def get_eligible_users_with_scores(
    user_list: List[Dict[str, Any]],
    min_score_threshold: float,
    bonus_multiplier: float
) -> List[Dict[str, Any]]:
    """
    Processes a list of user dictionaries, calculates a score for each,
    filters users based on score and age eligibility, and returns a list
    of eligible users with their calculated scores and status.

    Args:
        user_list (List[Dict[str, Any]]): A list of dictionaries, each representing
            a user with 'name', 'age', and 'points'.
        min_score_threshold (float): The minimum score required for eligibility.
        bonus_multiplier (float): A multiplier applied to user points for score calculation.

    Returns:
        List[Dict[str, Any]]: A list of dictionaries for eligible users,
            each including 'calculated_score' and 'status' fields.
            Malformed or ineligible users are skipped.
    """
    eligible_users_data: List[Dict[str, Any]] = []

    for user_data_raw in user_list:
        # Create a copy to avoid modifying the original input dictionary
        user_data = user_data_raw.copy()

        if not _validate_user_data(user_data):
            continue  # Skip to the next user if validation fails

        # Extract values for clarity and potential type conversion if needed
        user_age: Union[int, float] = user_data['age']
        user_points: Union[int, float] = user_data['points']

        calculated_score = _calculate_user_score(user_age, user_points, bonus_multiplier)
        user_status = _get_user_status(user_age, calculated_score, min_score_threshold)

        if user_status != 'Eligible':
            continue  # Only eligible users are returned

        user_data['calculated_score'] = calculated_score
        user_data['status'] = user_status
        eligible_users_data.append(user_data)

    return eligible_users_data
```

---

### collab Output

AI Code Review: Comprehensive Refactoring Suggestions

This document presents a comprehensive refactoring of the provided codebase, focusing on improving readability, maintainability, testability, and adherence to best practices. The goal is to deliver a more robust, efficient, and extensible solution.


1. Overview of Original Code (Hypothetical Example)

For the purpose of this detailed refactoring, we will consider a common scenario: a Python function designed to process a list of raw user data, involving filtering, validation, parsing, and transformation.

Assumed Original Code:


import datetime
import json # Not directly used in processing, but often imported

def process_user_data(users_raw_data):
    processed_users = []
    for user_data in users_raw_data:
        # Check if user is active and has an email
        if user_data.get('status') == 'active' and 'email' in user_data and user_data['email']:
            user_id = user_data.get('id')
            user_name = user_data.get('name')
            user_email = user_data.get('email')
            last_login_str = user_data.get('last_login')

            # Validate ID and Name
            if not user_id or not isinstance(user_id, int):
                print(f"Warning: Invalid user ID for user {user_name}. Skipping.")
                continue
            if not user_name or not isinstance(user_name, str):
                print(f"Warning: Invalid user name for ID {user_id}. Skipping.")
                continue

            # Parse last login date
            last_login_dt = None
            if last_login_str:
                try:
                    last_login_dt = datetime.datetime.strptime(last_login_str, '%Y-%m-%dT%H:%M:%SZ')
                except ValueError:
                    print(f"Warning: Could not parse last_login for user {user_id}. Using None.")

            # Create a simplified user object
            simplified_user = {
                'id': user_id,
                'name': user_name.strip(),
                'email': user_email.lower(),
                'last_active': last_login_dt.isoformat() if last_login_dt else None,
                'is_admin': user_data.get('role') == 'admin'
            }
            processed_users.append(simplified_user)
        else:
            print(f"Info: Skipping inactive or incomplete user data: {user_data.get('id', 'N/A')}")

    print(f"Successfully processed {len(processed_users)} active users.")
    return processed_users

# Example usage:
sample_data = [
    {'id': 1, 'name': 'Alice Smith ', 'email': 'ALICE@example.com', 'status': 'active', 'last_login': '2023-10-26T10:00:00Z', 'role': 'user'},
    {'id': 2, 'name': 'Bob Johnson', 'email': '', 'status': 'active', 'last_login': '2023-10-25T11:30:00Z', 'role': 'user'},
    {'id': 3, 'name': 'Charlie Brown', 'email': 'charlie@example.com', 'status': 'inactive', 'last_login': '2023-10-24T12:00:00Z', 'role': 'admin'},
    {'id': 4, 'name': 'David Lee', 'email': 'david@example.com', 'status': 'active', 'last_login': 'invalid-date', 'role': 'user'},
    {'id': '5', 'name': 'Eve Green', 'email': 'eve@example.com', 'status': 'active', 'last_login': '2023-10-23T13:00:00Z', 'role': 'user'},
    {'id': 6, 'name': None, 'email': 'frank@example.com', 'status': 'active', 'last_login': '2023-10-22T14:00:00Z', 'role': 'user'},
    {'id': 7, 'name': 'Grace Hopp', 'email': 'grace@example.com', 'status': 'active', 'last_login': '2023-10-21T15:00:00Z', 'role': 'admin'},
]

# result = process_user_data(sample_data)
# print(json.dumps(result, indent=2))

2. Identified Areas for Improvement

The original process_user_data function exhibits several areas that can be significantly improved:

  • Single Responsibility Principle (SRP) Violation: The function is responsible for filtering, validating, parsing, transforming, and logging. This makes it complex and harder to understand or modify.
  • Poor Error Handling: Uses print statements for warnings and info, which mixes application logic with output. It doesn't provide a structured way to handle or report invalid records.
  • Lack of Modularity: Sub-tasks like date parsing, ID/name validation, and user object creation are embedded directly within the main loop, leading to deeply nested if statements and repetitive logic.
  • Magic Strings: Hardcoded strings like 'active', 'admin', and the date format '%Y-%m-%dT%H:%M:%SZ' reduce maintainability.
  • Readability & Maintainability: The long, monolithic function with nested logic is difficult to read and modify without introducing regressions.
  • Testability: Individual components (e.g., date parsing, specific validation rules) cannot be easily tested in isolation.
  • Efficiency/Pythonic Style: The explicit for loop with append can often be replaced with more concise and potentially efficient list comprehensions or generator expressions.
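To illustrate the last point, the explicit loop-with-append can usually be collapsed into a comprehension over a transforming helper. The `transform` function below is a simplified stand-in, not code from the original:

```python
from typing import Any, Dict, List, Optional

def transform(record: Dict[str, Any]) -> Optional[Dict[str, Any]]:
    """Stand-in transformer: a simplified dict, or None if the record is invalid."""
    if record.get('status') != 'active':
        return None
    return {'id': record['id'], 'email': record['email'].lower()}

def process(records: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    # The comprehension replaces the explicit for-loop with append();
    # the inner generator applies the transform lazily, and the outer
    # list comprehension filters out the rejected (None) records.
    return [r for r in (transform(rec) for rec in records) if r is not None]
```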

3. Refactoring Strategy

Our refactoring strategy will focus on applying the following principles:

  1. Decomposition: Break down the large function into smaller, focused, and testable units.
  2. Constants: Define constants for frequently used or "magic" string values.
  3. Improved Error Handling: Isolate error handling logic and provide clearer feedback (e.g., returning None for invalid items, or using a proper logging system).
  4. Type Hinting: Add type hints for better code clarity and maintainability.
  5. Pythonic Constructs: Utilize list comprehensions, generator expressions, and built-in functions where appropriate.
  6. Separation of Concerns: Separate data validation, parsing, and transformation logic.

4. Refactored Code with Explanations

Here is the refactored version of the process_user_data function, broken down into logical components.


import datetime
import logging
from typing import Dict, Any, List, Optional

# --- Configuration Constants ---
# It's good practice to define these globally or within a config class
# for easy modification and to avoid 'magic strings'.
USER_STATUS_ACTIVE = 'active'
USER_ROLE_ADMIN = 'admin'
DATE_FORMAT = '%Y-%m-%dT%H:%M:%SZ'

# Configure basic logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)

# --- Helper Functions for Validation and Parsing ---

def is_valid_user_id(user_id: Any) -> bool:
    """Checks if the user ID is a valid integer."""
    return isinstance(user_id, int) and user_id is not None

def is_valid_user_name(user_name: Any) -> bool:
    """Checks if the user name is a valid non-empty string."""
    return isinstance(user_name, str) and bool(user_name.strip())

def parse_last_login_date(date_str: Optional[str], user_id: Optional[Any] = None) -> Optional[datetime.datetime]:
    """
    Parses a date string into a datetime object.
    Logs a warning if parsing fails.
    """
    if not date_str:
        return None
    try:
        return datetime.datetime.strptime(date_str, DATE_FORMAT)
    except ValueError:
        logger.warning(f"Could not parse last_login '{date_str}' for user ID {user_id}. Using None.")
        return None

def transform_user_record(raw_user: Dict[str, Any]) -> Optional[Dict[str, Any]]:
    """
    Transforms a single raw user dictionary into a simplified, validated format.
    Returns None if the user data is fundamentally invalid or incomplete.
    """
    user_id = raw_user.get('id')
    user_name = raw_user.get('name')
    user_email = raw_user.get('email')
    user_status = raw_user.get('status')
    last_login_str = raw_user.get('last_login')
    user_role = raw_user.get('role')

    # Basic eligibility check
    if not (user_status == USER_STATUS_ACTIVE and user_email):
        logger.info(f"Skipping inactive or incomplete user data for ID: {user_id or 'N/A'}")
        return None

    # Detailed validation for core fields
    if not is_valid_user_id(user_id):
        logger.warning(f"Invalid user ID '{user_id}' for user '{user_name}'. Skipping record.")
        return None
    if not is_valid_user_name(user_name):
        logger.warning(f"Invalid user name '{user_name}' for ID '{user_id}'. Skipping record.")
        return None

    # Parse date and handle potential errors
    last_login_dt = parse_last_login_date(last_login_str, user_id)

    # Construct the simplified user object
    simplified_user = {
        'id': user_id,
        'name': user_name.strip(),
        'email': user_email.lower(),
        'last_active': last_login_dt.isoformat() if last_login_dt else None,
        'is_admin': user_role == USER_ROLE_ADMIN
    }
    return simplified_user

# --- Main Orchestration ---

def process_user_data(users_raw_data: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
    """
    Processes raw user records by delegating to transform_user_record,
    keeping only the records that transform successfully.
    """
    processed_users = [
        user
        for user in (transform_user_record(raw) for raw in users_raw_data)
        if user is not None
    ]
    logger.info(f"Successfully processed {len(processed_users)} active users.")
    return processed_users