Data Migration Planner

Plan a complete data migration with field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates.

This detailed architectural plan is Step 1 of 3 in the "Data Migration Planner" workflow. It lays the foundation for a successful data migration, addressing the key components and strategies required to move data from source to target systems effectively and securely.


Data Migration Architectural Plan: [Project Name/Client Name]

Date: October 26, 2023

Prepared For: [Client/Stakeholder Name]

Prepared By: PantheraHive AI Assistant


1. Executive Summary

This document outlines the architectural plan for the upcoming data migration project. The primary objective is to define the high-level strategy, components, and processes necessary to successfully migrate data from designated source systems to the new target environment. This plan covers the extraction, transformation, and loading (ETL) architecture, data quality considerations, security, performance, and a preliminary rollback strategy, ensuring a robust and well-orchestrated migration.

2. Migration Scope and Objectives

2.1. Scope Definition

  • Source Systems: [List specific source databases, applications, file systems, e.g., Legacy CRM (Oracle 12c), ERP System (SAP ECC), Marketing Database (SQL Server 2017), Flat Files (CSV, XML)].
  • Target Systems: [List specific target databases, applications, e.g., New CRM (Salesforce Service Cloud), Cloud ERP (SAP S/4HANA Cloud), Data Warehouse (Snowflake), Data Lake (AWS S3)].
  • Data Entities: [Specify key data entities to be migrated, e.g., Customers, Orders, Products, Employees, Sales Transactions, Inventory Records].
  • Data Volume: [Estimated total volume, e.g., ~5 TB of relational data, ~10 TB of unstructured data, ~500 million records across all entities].
  • Migration Type: [e.g., Full historical data migration, Delta migration for ongoing synchronization, Cutover migration].

2.2. Key Objectives

  • Migrate all in-scope data with 100% accuracy and integrity.
  • Minimize downtime and business disruption during the migration window.
  • Ensure data consistency and quality in the target system.
  • Provide robust validation and rollback capabilities.
  • Establish a scalable and maintainable migration framework.
  • Comply with all relevant data governance, security, and regulatory requirements.

3. Source and Target System Deep Dive

3.1. Source System Characteristics

  • Legacy CRM (Oracle 12c):
    * Data Model: Highly normalized, complex relationships.
    * Data Volume: ~2 TB, ~200 million customer records, 500 tables.
    * Access Methods: JDBC, SQL queries, potentially API for specific modules.
    * Data Quality Issues: Known issues with duplicate customer records, inconsistent address formats.
  • ERP System (SAP ECC):
    * Data Model: SAP proprietary, extensive use of standard tables (e.g., KNA1, MARA).
    * Data Volume: ~3 TB, ~300 million transaction records.
    * Access Methods: SAP ODP (Operational Data Provisioning), ABAP reports, direct table access (with caution).
    * Data Quality Issues: Historical data entry errors, missing mandatory fields in older records.
  • Marketing Database (SQL Server 2017):
    * Data Model: Denormalized for reporting, some redundant data.
    * Data Volume: ~500 GB, ~50 million marketing leads.
    * Access Methods: ODBC, SQL queries.

3.2. Target System Characteristics

  • New CRM (Salesforce Service Cloud):
    * Data Model: Object-oriented, specific API structures for Accounts, Contacts, Cases.
    * Loading Methods: Salesforce Data Loader, Salesforce APIs (SOAP/REST), external ETL tools with Salesforce connectors.
    * Data Constraints: Strict validation rules, unique external IDs required for upserts, API rate limits.
  • Cloud ERP (SAP S/4HANA Cloud):
    * Data Model: Simplified, harmonized data model (Universal Journal), Fiori apps.
    * Loading Methods: SAP Migration Cockpit (LTMC), API-based integration, file uploads.
    * Data Constraints: Strict business rules, referential integrity, mandatory fields.
  • Data Warehouse (Snowflake):
    * Data Model: Star/Snowflake schema for analytical reporting.
    * Loading Methods: Snowpipe, COPY INTO command, Snowflake connectors for ETL tools.
    * Data Constraints: Schema definition, data types.

4. Migration Strategy and Approach

4.1. Overall Strategy: Phased Migration with Incremental Sync

  • Phase 1: Foundation Data: Migrate core static data (e.g., Product Master, Customer Master without transactions, Vendor Master).
  • Phase 2: Historical Transactions: Migrate historical transactional data (e.g., Sales Orders, Invoices, Service Cases).
  • Phase 3: Delta Synchronization: Implement mechanisms for ongoing synchronization of changes from source to target until final cutover.
  • Cutover: Perform a final delta load and switch users to the new system.

4.2. Migration Approach: ETL (Extract, Transform, Load)

  • Leverage a dedicated ETL platform for robust data handling, transformation, and orchestration.
  • Extraction: Extract data from source systems into a staging area.
  • Transformation: Cleanse, standardize, de-duplicate, and enrich data in the staging area.
  • Loading: Load transformed data into the target systems, respecting their specific APIs and constraints.
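The three stages above can be sketched end to end. This is a minimal illustration in Python, not a platform-specific implementation; the field name `customer_id` and the in-memory staging and target structures are assumptions for the sketch:

```python
# Minimal ETL sketch: each stage is a function over lists of dicts.
# A real pipeline would read via JDBC/ODBC and write through target APIs.

def extract(source_rows):
    """Extraction: copy raw source rows into a staging structure."""
    return [dict(row) for row in source_rows]

def transform(staged_rows):
    """Transformation: cleanse and standardize in the staging area."""
    cleaned = []
    for row in staged_rows:
        # Strip stray whitespace from every string field.
        row = {k: (v.strip() if isinstance(v, str) else v) for k, v in row.items()}
        if row.get("customer_id"):  # reject rows missing the key field
            cleaned.append(row)
    return cleaned

def load(target, transformed_rows):
    """Loading: append transformed rows to the target store."""
    target.extend(transformed_rows)
    return len(transformed_rows)

source = [{"customer_id": "C001", "name": " Alice "},
          {"customer_id": "", "name": "ghost"}]
target = []
loaded = load(target, transform(extract(source)))
```

In practice each stage would also emit counts and reject records for the validation and error-handling layers described later in this plan.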

5. Architectural Components

5.1. Data Integration Platform

  • Recommended Tool: [e.g., Informatica PowerCenter, Talend Data Fabric, AWS Glue, Azure Data Factory, Google Cloud Dataflow].
    * Justification: Provides robust connectors for diverse sources/targets, visual ETL development, scheduling, monitoring, and error handling capabilities.
  • Key Features Utilized: Data pipelines, transformation components, scheduling, logging, error handling, metadata management.

5.2. Staging Area

  • Location: [e.g., Dedicated Cloud Storage (AWS S3, Azure Blob Storage, GCP Cloud Storage), Relational Database (PostgreSQL/MySQL on EC2/Azure VM/GCE)].
  • Purpose:
    * Temporary storage for extracted raw data.
    * Intermediate storage for transformed data before loading.
    * Environment for data quality checks and validation.
    * Isolation of source systems from target systems during transformation.
  • Storage Requirements: Scalable, cost-effective storage with high I/O performance.

5.3. Extraction Layer

  • Mechanism:
    * Database Sources: Direct SQL queries (JDBC/ODBC) for bulk extraction, potentially change data capture (CDC) for delta loads.
    * Application APIs: Utilize native APIs (e.g., Salesforce API, SAP ODP) for structured and managed extraction.
    * File-based: SFTP/S3 transfers for flat files.
  • Considerations:
    * Minimize impact on source system performance during extraction.
    * Implement data partitioning and parallelism for large datasets.
    * Ensure data consistency during extraction (e.g., point-in-time snapshots).
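As one way to implement partitioned, low-impact bulk extraction, the sketch below uses keyset pagination so each query touches only a small window of rows. The table and column names (`legacy_customers`, `customer_id`) are illustrative stand-ins for the real source schema, and SQLite stands in for the JDBC/ODBC connection:

```python
import sqlite3

# Keyset-paginated extraction sketch: small batches keep source locks short
# and ETL memory bounded; the last key seen is the resume point.

def extract_in_batches(conn, batch_size=2):
    last_id = ""
    while True:
        rows = conn.execute(
            "SELECT customer_id, name FROM legacy_customers "
            "WHERE customer_id > ? ORDER BY customer_id LIMIT ?",
            (last_id, batch_size),
        ).fetchall()
        if not rows:
            break
        yield rows
        last_id = rows[-1][0]  # resume after the last key in this batch

# Demo source with five rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_customers (customer_id TEXT, name TEXT)")
conn.executemany("INSERT INTO legacy_customers VALUES (?, ?)",
                 [("C%03d" % i, "name%d" % i) for i in range(1, 6)])
batches = list(extract_in_batches(conn))
```

Keyset pagination (filtering on the last key rather than using OFFSET) keeps each page query cheap even deep into a large table, provided the key column is indexed.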

5.4. Transformation Layer

  • Location: Primarily within the chosen Data Integration Platform, leveraging its processing capabilities.
  • Key Processes:
    * Data Cleansing: Removing invalid characters, correcting typos, handling missing values.
    * Data Standardization: Applying consistent formats (e.g., date formats, address formats).
    * Data De-duplication: Identifying and merging duplicate records (e.g., customer records).
    * Data Enrichment: Adding missing information from other sources or reference data.
    * Data Aggregation/Disaggregation: Restructuring data as required by the target.
    * Data Mapping: Applying field-level transformations as per mapping specifications.
    * Data Validation: Implementing business rules and constraints.
    * Key Generation: Generating new primary keys or ensuring proper external ID management for target systems.
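Two of these processes, standardization and de-duplication, can be illustrated with a small Python sketch. The accepted date formats, the `email` match key, and the last-record-wins survivorship rule are assumptions chosen for the example, not fixed project decisions:

```python
from datetime import datetime

def standardize_date(value):
    """Date standardization: accept a few known source formats, emit ISO 8601."""
    for fmt in ("%d/%m/%Y", "%Y-%m-%d", "%m-%d-%Y"):
        try:
            return datetime.strptime(value, fmt).date().isoformat()
        except ValueError:
            continue
    return None  # unparseable: route to reject handling

def deduplicate(rows, key="email"):
    """De-duplication on a normalized natural key; last record wins."""
    survivors = {}
    for row in rows:
        survivors[row[key].strip().lower()] = row
    return list(survivors.values())

rows = [
    {"email": "a@x.com", "created": "31/12/2020"},
    {"email": "A@X.COM ", "created": "2021-01-15"},  # duplicate of the first
    {"email": "b@y.com", "created": "01-20-2021"},
]
deduped = deduplicate(rows)
dates = [standardize_date(r["created"]) for r in deduped]
```

A production rule set would usually prefer a richer survivorship policy (most recent, most complete, or source-priority) over simple last-wins.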

5.5. Loading Layer

  • Mechanism:
    * Target APIs: Utilize native APIs (e.g., Salesforce API, SAP API) for controlled and validated loading into applications.
    * Bulk Load Utilities: Use target system-specific bulk load tools (e.g., Snowflake COPY INTO, Salesforce Data Loader for large volumes).
    * Database Inserts/Updates: Direct SQL for relational databases (with appropriate batching).
  • Considerations:
    * Respect target system API limits and performance characteristics.
    * Implement error logging and retry mechanisms for failed loads.
    * Batch processing for efficiency.
    * Perform post-load validation.
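Batching can be sketched as follows; `fake_target_api` is a stand-in for a real bulk endpoint (for example a Salesforce or Snowflake loader), and the batch size of 3 is arbitrary for the demo:

```python
# Batched loading sketch: group records so each call to the target API
# stays under its batch limit, collecting per-record results as we go.

def batched(records, size):
    for i in range(0, len(records), size):
        yield records[i:i + size]

loaded, results = [], []

def fake_target_api(batch):
    """Stand-in for a bulk insert endpoint returning per-record results."""
    loaded.extend(batch)
    return [{"id": r["customer_id"], "success": True} for r in batch]

records = [{"customer_id": "C%03d" % i} for i in range(1, 8)]
for batch in batched(records, size=3):
    results.extend(fake_target_api(batch))
```

Collecting per-record results rather than a single pass/fail per batch is what makes the reject-management and retry mechanisms described later possible.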

5.6. Orchestration & Control Layer

  • Tool: The Data Integration Platform's scheduler and workflow engine.
  • Functions:
    * Sequencing of extraction, transformation, and loading jobs.
    * Dependency management between tasks.
    * Error handling and notification.
    * Restartability and recovery mechanisms.
    * Monitoring and logging.

6. Data Mapping & Transformation Approach (High-Level)

  • Detailed Field Mapping: A separate artifact will document granular field-to-field mappings from source to target, including data types, lengths, and nullability.
  • Transformation Rules: Each mapping will specify the transformation logic (e.g., concatenation, lookup, conditional logic, data type conversion, default values).
  • Reference Data: Establish a repository for reference data (e.g., country codes, currency codes) to be used during transformations.
  • Master Data Management (MDM): If applicable, integrate with existing MDM solutions or implement a temporary MDM process for critical entities like customers and products during migration.

7. Data Quality & Validation Strategy

7.1. Pre-Migration Data Profiling

  • Analyze source data for completeness, accuracy, consistency, and uniqueness to identify potential issues early.
  • Generate data quality reports to inform transformation rules.
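A profiling pass of this kind can be as simple as per-field completeness and distinct-value counts over a sample of source rows; the field names below are illustrative:

```python
# Profiling sketch: for each field, measure completeness (share of rows with
# a non-empty value) and the number of distinct values seen.

def profile(rows):
    stats = {}
    fields = {k for row in rows for k in row}
    for field in sorted(fields):
        values = [row.get(field) for row in rows]
        present = [v for v in values if v not in (None, "")]
        stats[field] = {
            "completeness": len(present) / len(rows),
            "distinct": len(set(present)),
        }
    return stats

rows = [
    {"customer_id": "C1", "email": "a@x.com"},
    {"customer_id": "C2", "email": ""},        # missing email
    {"customer_id": "C2", "email": "b@y.com"}, # repeated customer_id
]
report = profile(rows)
```

Low completeness flags fields needing default-value rules; a distinct count below the row count on a supposed key field (as with `customer_id` here) flags duplicates to resolve before migration.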

7.2. In-Migration Validation

  • Source-to-Staging (Extraction Validation):
    * Record count verification.
    * Checksums or hash comparisons for critical data blocks.
    * Basic schema validation.
  • Staging (Transformation Validation):
    * Data type validation, format checks.
    * Business rule validation (e.g., mandatory fields, range checks).
    * Referential integrity checks against transformed reference data.
    * De-duplication reports.
  • Staging-to-Target (Loading Validation):
    * Record count verification.
    * Comparison of key fields between staging and target.
    * Error logs from target system APIs.
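Record-count and checksum verification can be sketched as an order-independent content hash over the rows; this is one possible construction for illustration, not a standard algorithm:

```python
import hashlib

# Validation sketch: compare record counts and an order-independent content
# hash between the source extract and staging. Any mismatch flags the batch
# for investigation before loading proceeds.

def content_hash(rows):
    """Hash each row canonically (sorted fields), then hash the sorted digests
    so that row order does not affect the result."""
    digests = sorted(
        hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest()
        for r in rows
    )
    return hashlib.sha256("".join(digests).encode()).hexdigest()

source_rows = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
staged_rows = [{"id": 2, "v": "b"}, {"id": 1, "v": "a"}]  # same data, new order

counts_match = len(source_rows) == len(staged_rows)
hashes_match = content_hash(source_rows) == content_hash(staged_rows)
```

For very large tables the same idea is usually applied per partition or per block, so a mismatch localizes the problem instead of condemning the whole extract.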

7.3. Post-Migration Validation

  • Reconciliation Reports: Compare record counts, sums, and averages of key metrics between source and target systems.
  • Business User Acceptance Testing (UAT): Involve business users to validate data accuracy and usability in the target system.
  • Audit Trails: Maintain logs of all migrated records and any transformation applied.
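A reconciliation report largely reduces to comparing counts and summed key metrics between the two systems; the `amount` metric below is a hypothetical example field:

```python
# Reconciliation sketch: compare record counts and a summed business metric
# between source and target extracts.

def reconcile(source_rows, target_rows, metric="amount"):
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "source_sum": sum(r[metric] for r in source_rows),
        "target_sum": sum(r[metric] for r in target_rows),
    }

report = reconcile(
    [{"amount": 10}, {"amount": 20}],
    [{"amount": 10}, {"amount": 20}],
)
matched = (report["source_count"] == report["target_count"]
           and report["source_sum"] == report["target_sum"])
```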

8. Error Handling & Logging Strategy

  • Centralized Logging: All migration processes will log events, warnings, and errors to a centralized logging system (e.g., ELK stack, Splunk, cloud-native logging services).
  • Error Categorization: Errors will be categorized (e.g., data quality, system, network, application API) for efficient troubleshooting.
  • Notification System: Automated alerts for critical errors to relevant stakeholders (e.g., email, PagerDuty).
  • Reject Management: A mechanism to capture and report on rejected records (e.g., to a dedicated error table or file) with reasons for rejection.
  • Retry Logic: Implement configurable retry mechanisms for transient errors (e.g., network glitches, API rate limits).
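A retry mechanism with exponential backoff might look like the following sketch; the `TransientError` type, attempt cap, and delay parameters are illustrative choices:

```python
import time

# Retry sketch for transient load errors: exponential backoff with a capped
# attempt count. Errors that persist past the cap are re-raised so the record
# flows into reject management.

class TransientError(Exception):
    """Stand-in for a retryable failure, e.g. a rate limit or network glitch."""

def with_retries(fn, max_attempts=4, base_delay=0.01):
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except TransientError:
            if attempt == max_attempts:
                raise
            time.sleep(base_delay * (2 ** (attempt - 1)))  # 1x, 2x, 4x ...

calls = {"n": 0}

def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise TransientError("simulated API rate limit")
    return "loaded"

result = with_retries(flaky_load)
```

Only transient error categories should be retried; data-quality rejections must fail fast to the reject table rather than burn retry attempts.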

9. Security Considerations

  • Data Encryption:
    * Data at Rest: All data in the staging area and target systems will be encrypted using industry-standard encryption (e.g., AES-256).
    * Data in Transit: All data transfers will use secure protocols (e.g., TLS 1.2+, SFTP, HTTPS).
  • Access Control:
    * Least Privilege: Only authorized personnel and systems will have access to migration components and data.
    * Role-Based Access Control (RBAC): Implement RBAC for the ETL platform, staging environment, and target systems.
    * Credential Management: Secure storage and retrieval of credentials (e.g., AWS Secrets Manager, Azure Key Vault).

  • Data Masking/Anonymization: For sensitive data in non-production environments, implement data masking or anonymization techniques.
  • Audit Trails: Maintain comprehensive audit trails of all data access and modifications during the migration process.
  • Compliance: Ensure adherence to relevant regulations (e.g., GDPR, HIPAA, CCPA) throughout the migration lifecycle.

10. Performance Considerations

  • Parallel Processing: Design ETL jobs to run in parallel where possible (e.g., multiple tables, partitioned data).
  • Batching: Optimize data loading by batching records to minimize overhead and respect API limits.
  • Indexing: Ensure appropriate indexing on source tables for efficient extraction and on target tables for efficient loading (if applicable) and post-migration performance.
  • Resource Scaling: Provision scalable infrastructure for the ETL platform and staging area to handle peak loads.
  • Network Bandwidth: Ensure sufficient network bandwidth between source, staging, and target environments.
  • Performance Testing: Conduct performance tests on key migration jobs to identify and resolve bottlenecks before the actual cutover.
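Partition-level parallelism can be sketched with a thread pool; `extract_partition` is a stand-in for a real partition-scoped extract job, and the partition names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

# Parallelism sketch: run independent table partitions concurrently.
# In a real job each call would execute a bounded, partition-scoped query.

def extract_partition(partition):
    return ["%s-row%d" % (partition, i) for i in range(3)]

partitions = ["p1", "p2", "p3", "p4"]
with ThreadPoolExecutor(max_workers=4) as pool:
    # map() preserves partition order in the results.
    results = list(pool.map(extract_partition, partitions))
rows = [r for batch in results for r in batch]
```

The worker count should be tuned against what the source system tolerates; more workers shift load onto the source and can violate the extraction-impact consideration above.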

11. Monitoring & Reporting

  • Real-time Monitoring: Dashboard to track job status, data volumes processed, error rates, and resource utilization.
  • Progress Reports: Regular reports on migration progress, including completed entities, remaining scope, and identified issues.
  • Data Quality Reports: Summaries of data quality issues identified and resolved.
  • Audit Reports: Detailed logs of all migration activities for compliance purposes.

This document outlines a comprehensive plan for your data migration, encompassing field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates. This structured approach ensures a controlled, verifiable, and reversible migration process, minimizing risks and ensuring data integrity.


Data Migration Planner: Detailed Output

Project Overview

This plan details the migration of critical business data from an existing legacy system (Source) to a new, modernized platform (Target). The primary goal is to ensure a complete, accurate, and consistent transfer of data, enabling the new system to operate effectively from day one.

Hypothetical Scenario: Migrating Customers and Orders data from an Old_CRM_DB (MySQL) to a New_CRM_DB (PostgreSQL).


Data Migration Plan: [Project Name/System Name]

Document Version: 1.0

Date: October 26, 2023

Prepared For: [Customer Name]

Prepared By: PantheraHive Solutions Team


1. Executive Summary

This document outlines the comprehensive plan for the data migration from [Source System Name] to [Target System Name]. The objective is to ensure a secure, accurate, and efficient transfer of critical business data, minimizing downtime and data integrity risks. This plan details the scope, methodology, data mapping, transformation rules, validation procedures, rollback strategy, and estimated timeline to guide the successful execution of this migration project.

2. Introduction & Project Scope

This section defines the core parameters of the data migration project.

2.1. Project Overview

  • Goal: To migrate all relevant operational and historical data from [Source System Name] to [Target System Name] to support the launch of [Target System Name/New Business Process].
  • Objective 1: Ensure 100% data integrity and accuracy post-migration.
  • Objective 2: Minimize business disruption and downtime during the migration window.
  • Objective 3: Establish a repeatable and verifiable migration process.
  • Objective 4: Archive or decommission the source system post-migration (if applicable).

2.2. Scope of Migration

  • Source System: [Name of Source System, e.g., Legacy CRM, On-Premise ERP]
    * Database/Technology: [e.g., SQL Server 2012, Oracle 11g, Salesforce Production]
    * Key Modules/Areas: [e.g., Customer Accounts, Sales Orders, Product Catalog]
  • Target System: [Name of Target System, e.g., Salesforce CRM, SAP S/4HANA Cloud, Custom Web Application]
    * Database/Technology: [e.g., Salesforce Cloud, SAP HANA Database, PostgreSQL]
    * Key Modules/Areas: [e.g., Accounts, Opportunities, Products]

  • Data In-Scope: [List specific data entities/tables, e.g., Customer Master, Product Master, Open Orders, Historical Invoices for last 5 years.]
  • Data Out-of-Scope: [List specific data entities/tables NOT being migrated, e.g., Archived data older than 5 years, system logs, user preferences.]

3. Data Inventory & Analysis

A thorough analysis of the source data is critical for a successful migration.

3.1. Source Data Volume & Complexity

  • Estimated Data Volume: [e.g., 500 GB, 10 Million Records across key tables]
  • Number of Tables/Objects: [e.g., 150 tables, 30 Salesforce Objects]
  • Identified Data Quality Issues: [e.g., Duplicate records in Customer Master, inconsistent date formats, missing mandatory fields in Product Catalog.]
  • Data Growth Rate: [e.g., ~10% annually]

3.2. Data Archiving & Decommissioning Strategy

  • Source System Archiving: [e.g., Data to be archived to a read-only database for compliance, specific tables to be retained for historical reporting.]
  • Source System Decommissioning: [e.g., Source system to be shut down 3 months post-migration, hardware to be repurposed.]

4. Data Mapping

This section provides a detailed breakdown of how data fields from the source system will map to the target system.

4.1. Object/Table Mapping

| Source Object/Table Name | Target Object/Table Name | Comments |
| :----------------------- | :----------------------- | :------- |
| Legacy_Customers | Account | Primary customer entity |
| Legacy_Orders | Opportunity | Open orders only |
| Legacy_Products | Product2 | All active products |
| Legacy_Contacts | Contact | Related to Accounts |
| ... | ... | ... |

4.2. Field-Level Mapping & Transformation Rules

For each mapped object/table, a detailed field-level mapping will be developed. This table provides an example structure. The complete mapping document will be maintained as an appendix or a separate linked document.

Example: Mapping for Legacy_Customers to Account

| Source Field Name | Source Data Type | Source Max Length | Target Field Name | Target Data Type | Target Max Length | Transformation Rule |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| customer_id | VARCHAR(50) | 50 | Legacy_Customer_ID__c | Text (External ID) | 50 | Direct map. The Salesforce record Id is system-generated and cannot be loaded directly, so the legacy key is stored in a custom external ID field and used for upsert matching. |
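A field-level mapping like the row above can be expressed as data and applied generically by the transformation layer; the field names and transformation rules in this sketch are illustrative, not a fixed schema:

```python
# Mapping-as-data sketch: each entry names a source field, a target field,
# and a named transformation rule, applied uniformly to every row.

FIELD_MAPPINGS = [
    {"source": "customer_id", "target": "Legacy_Customer_ID__c", "rule": "direct"},
    {"source": "cust_name",   "target": "Name",                  "rule": "title_case"},
]

RULES = {
    "direct": lambda v: v,                     # pass through unchanged
    "title_case": lambda v: v.strip().title(), # trim and normalize casing
}

def apply_mappings(row, mappings=FIELD_MAPPINGS, rules=RULES):
    return {m["target"]: rules[m["rule"]](row[m["source"]]) for m in mappings}

out = apply_mappings({"customer_id": "C001", "cust_name": "  acme CORP "})
```

Keeping mappings as data rather than code means the full mapping appendix can be reviewed by business users and loaded directly into the ETL jobs.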
