Data Migration Planner

Data Migration Planner: Step 2 of 3 - Code Generation for Data Migration

This document provides a comprehensive set of production-ready code components for a robust data migration. It encompasses field mapping, data transformation rules, validation scripts, and rollback procedures, designed for clarity, maintainability, and extensibility. Timeline estimates, while not directly coded, are discussed in the context of project management and workflow integration.

The code is presented primarily in Python, a widely used language for data engineering, and utilizes a modular structure to facilitate testing and deployment.


1. Project Structure

A well-organized project structure is crucial for managing data migration code.

data_migration_project/
├── config/
│   └── migration_config.yaml         # Field mappings, transformation rules, validation thresholds
├── scripts/
│   ├── __init__.py
│   ├── extract_data.py               # Source data extraction logic
│   ├── transform_data.py             # Data transformation engine
│   ├── load_data.py                  # Target data loading logic
│   ├── validate_data.py              # Pre/post migration validation scripts
│   ├── rollback_procedures.py        # Rollback logic and scripts
│   └── utils.py                      # Common utility functions (e.g., DB connections)
├── tests/
│   ├── __init__.py
│   ├── test_transformations.py       # Unit tests for transformation rules
│   ├── test_validations.py           # Unit tests for validation scripts
│   └── test_full_migration.py        # Integration tests (e.g., using sample data)
├── main_migration.py                 # Orchestration script for the entire migration process
├── requirements.txt                  # Python dependencies
└── README.md                         # Project description and setup instructions
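The config/migration_config.yaml file above drives the field mappings, transformation rules, and validation thresholds used by the scripts. A plausible shape for it is sketched below; every entity, field, and threshold name here is an illustrative assumption rather than a fixed schema.

```yaml
# Illustrative sketch of migration_config.yaml; all names are assumptions.
entities:
  customers:
    source_table: Customers
    target_table: Account
    field_mappings:
      - source_field: CustomerID
        target_field: ExternalID__c
        transformation: direct_map
      - source_field: CustFirstName
        target_field: FirstName
        transformation: capitalize
validation:
  count_delta_allowed: 0        # post-load count check tolerance
  sample_size: 1000             # rows sampled for field-level comparison
rollback:
  strategy: snapshot_restore    # handled by rollback_procedures.py
```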

This document outlines a comprehensive study plan designed to equip an individual with the necessary knowledge and skills to excel as a Data Migration Planner. The plan covers fundamental concepts, practical techniques, and strategic considerations required to effectively plan and execute complex data migration projects, aligning with the core responsibilities outlined in the "Data Migration Planner" workflow description.


Comprehensive Study Plan: Mastering Data Migration Planning

1. Introduction and Purpose

This study plan is meticulously crafted for aspiring Data Migration Planners, Data Architects, or Project Managers who need to acquire a deep understanding of data migration methodologies and best practices. Its primary goal is to provide a structured learning path that enables individuals to confidently plan, design, and oversee complete data migrations, from initial discovery to post-migration validation and rollback strategies.

2. Overall Learning Objective

Upon successful completion of this study plan, the learner will be able to:

  • Understand various data migration types, strategies, and their associated challenges.
  • Perform comprehensive source and target system analysis, including data profiling and quality assessment.
  • Design detailed field mappings and data transformation rules.
  • Develop robust data validation strategies and create effective validation scripts.
  • Formulate comprehensive error handling and rollback procedures.
  • Estimate project timelines, manage risks, and ensure compliance and security throughout the migration lifecycle.
  • Select appropriate tools and technologies for specific migration scenarios.

3. Weekly Schedule and Learning Objectives

This 12-week schedule provides a structured progression through the key domains of data migration planning.

Week 1: Foundations of Data Migration

  • Learning Objectives:

* Define data migration, its purpose, and common drivers (e.g., system upgrades, mergers).

* Identify different types of migrations (e.g., storage, database, application, cloud).

* Understand the typical phases of a data migration project lifecycle.

* Recognize common challenges and risks in data migration.

  • Recommended Resources:

* "Data Migration: Strategies for a Successful Migration" (Online course/book chapter).

* Industry whitepapers on data migration best practices (e.g., Gartner, Forrester).

* Articles on "Big Bang vs. Phased Migration" strategies.

  • Milestone: Write a summary comparing different migration strategies, outlining pros and cons for a hypothetical scenario.

Week 2: Source & Target System Analysis

  • Learning Objectives:

* Learn techniques for analyzing source and target database schemas, data models, and application interfaces.

* Identify data dependencies

scripts/transform_data.py

import pandas as pd
import uuid
import re
from datetime import datetime
from typing import Dict, Any, List, Optional

from scripts.utils import logger

# --- Custom Transformation Functions ---
# These functions are called based on the 'transformation' type in migration_config.yaml.

def generate_uuid_from_string(value: Any) -> Optional[uuid.UUID]:
    """Generates a deterministic UUID from a string value."""
    if pd.isna(value):
        return None
    return uuid.uuid5(uuid.NAMESPACE_DNS, str(value))

def capitalize_string(value: Any) -> Optional[str]:
    """Capitalizes the first letter of each word in a string."""
    if pd.isna(value):
        return None
    return str(value).title()

def concatenate_address(*args: Any) -> Optional[str]:
    """Concatenates multiple address components into a single string."""
    parts = [str(arg).strip() for arg in args if not pd.isna(arg) and str(arg).strip()]
    return ", ".join(parts) if parts else None

def validate_and_standardize_email(email: Any) -> Optional[str]:
    """Validates email format and converts to lowercase."""
    if pd.isna(email):
        return None
    email_str = str(email).strip().lower()
    if re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email_str):
        return email_str
    logger.warning("Invalid email format: %s", email)
    return None
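These functions are selected at run time by the 'transformation' key in migration_config.yaml. A minimal, self-contained sketch of that dispatch follows; the TRANSFORMATIONS registry and apply_rule are illustrative assumptions, not names taken from the project.

```python
# Minimal sketch of config-driven transformation dispatch.
# Registry and rule keys are assumptions for illustration.
from typing import Any, Callable, Dict

def capitalize_string(value: Any) -> Any:
    """Title-case string values; pass other types through unchanged."""
    return value.title() if isinstance(value, str) else value

TRANSFORMATIONS: Dict[str, Callable[[Any], Any]] = {
    "capitalize": capitalize_string,
    "direct_map": lambda v: v,
}

def apply_rule(rule: Dict[str, str], row: Dict[str, Any]) -> Any:
    """Look up the transformation named in the rule and apply it to the source field."""
    fn = TRANSFORMATIONS[rule["transformation"]]
    return fn(row[rule["source_field"]])

row = {"CustFirstName": "john"}
rule = {"source_field": "CustFirstName", "transformation": "capitalize"}
# apply_rule(rule, row) yields "John"
```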


Data Migration Plan: [Source System Name] to [Target System Name]

Document Version: 1.0

Date: October 26, 2023

Prepared For: [Customer Name]

Prepared By: PantheraHive Solutions


Executive Summary

This document outlines a comprehensive plan for the data migration from [Source System Name] to [Target System Name]. The objective is to ensure a smooth, accurate, and secure transfer of critical business data, minimizing downtime and preserving data integrity. This plan details the strategy, scope, field mappings, transformation rules, validation procedures, rollback protocols, and a projected timeline for the entire migration process. Adherence to this plan will facilitate a successful transition, enabling [Customer Name] to leverage the full capabilities of the new [Target System Name] with confidence in its underlying data.


1. Introduction and Project Goals

The purpose of this document is to provide a detailed roadmap for the data migration project. This migration is crucial for [state the primary business driver, e.g., "upgrading to a more robust platform," "consolidating disparate systems," "improving data analytics capabilities"].

Key Project Goals:

  • Migrate all in-scope data from [Source System Name] to [Target System Name] accurately and completely.
  • Ensure data integrity and consistency throughout the migration process.
  • Minimize business disruption and system downtime during cutover.
  • Implement robust validation and rollback mechanisms to mitigate risks.
  • Provide a clear, actionable plan for all stakeholders involved.

2. Scope of Migration

This section defines what data will be migrated and what will be excluded.

2.1. In-Scope Data Entities:

The following data entities and their associated records will be migrated:

  • [Entity 1, e.g., Customers]: All active and historical customer records.
  • [Entity 2, e.g., Products]: All active product catalog items, including descriptions, pricing, and inventory levels.
  • [Entity 3, e.g., Orders]: All completed orders from the last [X] years, including line items and associated customer/product references.
  • [Entity 4, e.g., Sales Representatives]: All active sales representative profiles and their historical performance data (last [Y] years).
  • Add other specific entities as required.

2.2. Out-of-Scope Data Entities:

The following data entities or types will NOT be migrated:

  • Archived data older than [Z] years (e.g., orders, inactive customer accounts).
  • System configuration settings from [Source System Name].
  • Temporary or transient data (e.g., session logs, incomplete drafts).
  • Audit trails or change history logs, unless specifically identified as in-scope.
  • Add other specific exclusions as required.

2.3. Data Volume and Complexity:

  • Estimated Total Records: [e.g., 5,000,000 records]
  • Key Entities Volume:

* Customers: [e.g., 500,000 records]

* Products: [e.g., 10,000 records]

* Orders: [e.g., 2,000,000 records]

  • Data Relationships: Complex, involving many-to-many relationships (e.g., customers to multiple orders, products to multiple orders).
  • Data Quality Concerns (Identified during Discovery): [e.g., Duplicate customer records, inconsistent product categories, missing address data]. These will be addressed via transformation rules.

3. Data Migration Strategy

3.1. Approach:

  • Phased Migration / Big Bang: [Choose one and justify, e.g., "Big Bang approach will be used for a single, complete cutover to minimize complexity in managing dual systems during transition." OR "A phased approach will be used, migrating 'Customers' first, then 'Products', and finally 'Orders' to allow for incremental validation and user acclimatization."]
  • Migration Tooling: [e.g., Custom ETL scripts (Python/SQL), Commercial ETL tool (e.g., Informatica, Talend), Cloud-native migration services (e.g., AWS DMS, Azure Data Factory), Direct API calls.]

3.2. High-Level Process Flow:

  1. Preparation: Data profiling, cleansing plan, environment setup.
  2. Extraction: Extract data from [Source System Name].
  3. Transformation: Apply cleansing, mapping, and business rules.
  4. Loading: Load transformed data into [Target System Name].
  5. Validation: Verify data accuracy, completeness, and integrity.
  6. Cutover: Transition users to the new system.
  7. Post-Migration Support: Monitor and address issues.
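The phases above map naturally onto an orchestration skeleton such as the main_migration.py file from the earlier project structure. The sketch below is illustrative only; the phase callables are stubs standing in for real extraction, transformation, and loading logic.

```python
# Illustrative orchestration skeleton mirroring the seven phases above.
# Phase names follow the process flow; the stub callables are assumptions.
import logging
from typing import Callable, List, Tuple

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("migration")

def run_migration() -> List[str]:
    """Run each phase in order, halting on the first failure for rollback review."""
    phases: List[Tuple[str, Callable[[], bool]]] = [
        ("preparation", lambda: True),             # profiling, cleansing plan, env setup
        ("extraction", lambda: True),              # pull data from the source system
        ("transformation", lambda: True),          # cleansing, mapping, business rules
        ("loading", lambda: True),                 # write into the target system
        ("validation", lambda: True),              # counts, spot checks, integrity
        ("cutover", lambda: True),                 # switch users to the new system
        ("post_migration_support", lambda: True),  # monitor and address issues
    ]
    completed = []
    for name, step in phases:
        log.info("Starting phase: %s", name)
        if not step():
            log.error("Phase failed: %s; halting for rollback review", name)
            break
        completed.append(name)
    return completed
```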

4. Detailed Migration Plan

4.1. Data Extraction Strategy

  • Method: [e.g., Direct database queries (SQL Server, Oracle), API calls (REST/SOAP), Export to flat files (CSV, XML) from source application UI.]
  • Tooling: [e.g., SQL Server Integration Services (SSIS) packages, Python scripts with database connectors, Talend Open Studio.]
  • Frequency: Data will be extracted once for the initial load, with potential incremental extracts for delta changes during testing phases.
  • Security: Extraction will be performed by authorized personnel with read-only access to the source system. Data will be encrypted in transit and at rest during staging.

4.2. Data Transformation and Mapping

This is the core of the migration, detailing how source data fields are mapped to target fields and any necessary modifications.

4.2.1. Field Mapping (Example Table)

The following table provides an example of the field mapping structure. A comprehensive mapping document will be maintained in an external spreadsheet or data dictionary.

| Source Table/Entity | Source Field Name | Source Data Type | Target Table/Entity | Target Field Name | Target Data Type | Transformation Rule | Notes/Comments |
| :------------------ | :---------------- | :--------------- | :------------------ | :---------------- | :--------------- | :------------------ | :------------- |
| Customers | CustomerID | INT | Account | ExternalID__c | VARCHAR(50) | Direct Map | Used for external system reference |
| Customers | CustFirstName | VARCHAR(50) | Account | FirstName | VARCHAR(50) | Direct Map | |
| Customers | CustLastName | VARCHAR(50) | Account | LastName | VARCHAR(50) | Direct Map | |
| Customers | CustAddress | VARCHAR(255) | Account | BillingStreet | TEXT | Split: Street, City, State, Zip | Requires parsing of compound address field |
| Customers | CustTypeID | INT | Account | AccountType | Picklist | Lookup: CustTypeID to AccountType mapping table | 1->'Corporate', 2->'Individual', 3->'Partner' |
| Products | ProductID | INT | Product2 | ProductCode | VARCHAR(80) | Prefix: "PROD-" | Ensures unique product code format |
| Products | ItemPrice | DECIMAL(10,2) | Product2 | UnitPrice__c | Currency | Direct Map, Data Type Conversion | |
| Orders | OrderDate | DATETIME | Order | EffectiveDate | DATE | Convert: DATETIME to DATE | Time component is not required in Target |
| Orders | OrderStatus | VARCHAR(20) | Order | Status | Picklist | Map: "Pending"-> "Draft", "Complete"->"Activated" | Standardize status values |

4.2.2. Transformation Rules (Examples)

Detailed rules will be applied to ensure data fits the target system's structure and business logic.

  • Data Type Conversions:

* DATETIME to DATE (e.g., OrderDate to EffectiveDate).

* INT to VARCHAR (e.g., CustomerID to ExternalID__c).

* DECIMAL to Currency (e.g., ItemPrice to UnitPrice__c).

  • Value Lookups and Mappings:

* Source: CustTypeID (1, 2, 3) -> Target: AccountType ('Corporate', 'Individual', 'Partner').

* Source: OrderStatus ('Pending', 'Complete') -> Target: Status ('Draft', 'Activated').

  • Data Splitting:

* Splitting a single CustAddress field into BillingStreet, BillingCity, BillingState, BillingPostalCode using regular expressions or string parsing.

  • Data Concatenation:

* Combining CustFirstName and CustLastName into a FullName field if required by the target.

  • Default Values:

* Assigning a default Country value ('USA') if the source field is null or empty.

  • Data Cleansing/Standardization:

* Removing leading/trailing spaces from string fields.

* Converting text to proper case (e.g., "john doe" to "John Doe").

* Standardizing phone number formats (e.g., removing non-numeric characters).

* De-duplication logic for key entities (e.g., identifying and merging duplicate customer records based on name and email).

  • Prefix/Suffix Addition:

* Adding "PROD-" prefix to ProductID to form ProductCode.

  • Conditional Logic:

* If CustStatus is 'Inactive' and LastActivityDate is older than 5 years, set Account.IsActive to FALSE, otherwise TRUE.
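Several of the rules above (value lookups, default values, conditional logic) can be sketched in Python. The mapping values mirror the field mapping table in 4.2.1; the function names themselves are illustrative assumptions.

```python
# Hedged sketches of three transformation rules described above.
from datetime import date, timedelta
from typing import Optional

# Value lookup: CustTypeID -> AccountType, per the mapping table in 4.2.1.
CUST_TYPE_MAP = {1: "Corporate", 2: "Individual", 3: "Partner"}

def map_cust_type(cust_type_id: int) -> Optional[str]:
    """Return the target picklist value, or None for an unmapped ID."""
    return CUST_TYPE_MAP.get(cust_type_id)

def default_country(country: Optional[str]) -> str:
    """Default value rule: fall back to 'USA' when the source is null or empty."""
    return country.strip() if country and country.strip() else "USA"

def account_is_active(cust_status: str, last_activity: date, today: date) -> bool:
    """Conditional logic rule: Inactive with >5 years of inactivity -> IsActive FALSE."""
    five_years_ago = today - timedelta(days=5 * 365)
    return not (cust_status == "Inactive" and last_activity < five_years_ago)
```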

4.3. Data Loading Strategy

  • Method: [e.g., Salesforce Data Loader, SQL INSERT statements, API calls (batch processing), custom bulk upload utilities.]
  • Tooling: [e.g., Custom Python scripts, SSIS packages, proprietary target system loaders.]
  • Order of Loading: Data will be loaded in a specific order to respect referential integrity (e.g., Accounts first, then Products, then Orders, then Order Line Items).
  • Error Handling: Implement robust logging for failed records during loading, with mechanisms for review and re-processing.
  • Performance: Batch loading will be utilized where possible to optimize performance and reduce API call limits/database overhead.
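Batch loading with per-record error capture, as described above, might look like the following sketch; the load_record callback and the default batch size are assumptions, standing in for a real loader (Data Loader API call, bulk insert, etc.).

```python
# Sketch of batched loading with error capture for later review/re-processing.
from typing import Callable, Iterable, List, Tuple

def load_in_batches(
    records: Iterable[dict],
    load_record: Callable[[dict], None],
    batch_size: int = 200,
) -> Tuple[int, List[dict]]:
    """Load records in batches; failures are collected rather than fatal."""
    loaded = 0
    failed: List[dict] = []
    batch: List[dict] = []
    for rec in records:
        batch.append(rec)
        if len(batch) == batch_size:
            loaded, failed = _flush(batch, load_record, loaded, failed)
            batch = []
    if batch:  # flush the final partial batch
        loaded, failed = _flush(batch, load_record, loaded, failed)
    return loaded, failed

def _flush(batch, load_record, loaded, failed):
    for rec in batch:
        try:
            load_record(rec)
            loaded += 1
        except Exception:
            failed.append(rec)  # logged and queued for re-processing
    return loaded, failed
```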

4.4. Data Validation and Quality Assurance

Validation is critical to ensure the migrated data is accurate, complete, and consistent.

4.4.1. Pre-Migration Validation (Source Data Profiling & Cleansing)

  • Purpose: Identify data quality issues in the source system before migration.
  • Activities:

* Data Profiling: Analyze source data for completeness, uniqueness, consistency, and validity.

* Data Cleansing: Work with business users to correct identified data errors in the source system or define transformation rules to handle them.

* Schema Analysis: Confirm source schema matches expected structure for extraction.

4.4.2. Post-Migration Validation (Validation Scripts)

Automated and manual checks will be performed on the target system post-load.

  • Count Checks:

* Script Example (Conceptual):


        -- Source System Count
        SELECT COUNT(*) FROM SourceDB.dbo.Customers;
        -- Target System Count
        SELECT COUNT(*) FROM TargetDB.dbo.Account;
        -- Expected: Counts should match or be within a defined delta after exclusions/filters
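The conceptual count check above can be automated and extended with a sampled field-level comparison. In the sketch below, the row counts and sample rows are assumed to come from queries like the SQL shown; the helper names and the allowed delta are illustrative assumptions.

```python
# Sketch of post-load validation: count check plus sampled field comparison.
from typing import Dict, List

def counts_match(source_count: int, target_count: int, allowed_delta: int = 0) -> bool:
    """Counts should match, or fall within the delta documented for exclusions/filters."""
    return abs(source_count - target_count) <= allowed_delta

def sample_mismatches(
    source_rows: List[Dict],
    target_rows: List[Dict],
    key: str,
    field_map: Dict[str, str],
) -> List:
    """Compare mapped fields on a keyed sample; return keys whose values differ or are missing."""
    target_by_key = {r[key]: r for r in target_rows}
    bad = []
    for s in source_rows:
        t = target_by_key.get(s[key])
        if t is None:
            bad.append(s[key])  # record missing from target
            continue
        for src_field, tgt_field in field_map.items():
            if s.get(src_field) != t.get(tgt_field):
                bad.append(s[key])
                break
    return bad
```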