Data Migration Planner

Data Migration Planner: Detailed Technical Implementation & Code Generation

This document outlines a comprehensive data migration plan, providing detailed technical specifications, field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates. The accompanying Python code snippets are designed to be modular, extensible, and production-ready, serving as a robust foundation for your data migration project.


1. Introduction to the Migration Framework

The provided code establishes a framework for orchestrating data migration. It emphasizes modularity, allowing easy adaptation to different data sources, targets, and transformation logic. The core components include field mappings, transformation rules, validation scripts, and rollback procedures.


2. Core Migration Script (data_migration_framework.py)

This main script orchestrates the migration process, integrating all components.

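The mapping block itself was not captured in this export. A hypothetical example of the structure the explanation below describes, written as a Python dict — the entity name, column names, and metadata values are all illustrative:

```python
# Hypothetical field-mapping configuration for the users_table entity.
# Column names and metadata values are illustrative, not prescriptive.
FIELD_MAPPINGS = {
    "users_table": {
        # Source column -> target column plus metadata used for schema
        # definition and data type conversion.
        "source_to_target": {
            "user_id": {
                "target_column": "id",
                "data_type": "INTEGER",
                "primary_key": True,
                "unique": True,
                "nullable": False,
                "default": None,
            },
            "email_addr": {
                "target_column": "email",
                "data_type": "VARCHAR(255)",
                "primary_key": False,
                "unique": True,
                "nullable": False,
                "default": None,
            },
        },
        # Reverse mapping, used for post-migration validation when target
        # data is fetched and compared against the source.
        "target_to_source": {
            "id": "user_id",
            "email": "email_addr",
        },
    },
}
```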
**Explanation:**

*   **`users_table`**: The logical entity or table being migrated.
*   **`source_to_target`**: Maps source column names to their corresponding target column names, along with metadata (data type, primary key status, uniqueness, nullability, default values). This is crucial for schema definition and data type conversions.
*   **`target_to_source`**: A reverse mapping, useful for post-migration validation where target data might be fetched and compared against source.

**Loading the mapping in Python:**

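A minimal sketch of loading such a mapping, assuming it is stored as a JSON file — the file name and the validation helper are assumptions, not part of the original framework:

```python
import json
from pathlib import Path


def load_field_mappings(path):
    """Load the field-mapping configuration from a JSON file.

    The expected layout (one entry per entity, each with source_to_target
    and target_to_source sections) follows the structure described above.
    """
    with Path(path).open(encoding="utf-8") as fh:
        mappings = json.load(fh)
    # Fail fast if an entity is missing either direction of the mapping.
    for entity, spec in mappings.items():
        for key in ("source_to_target", "target_to_source"):
            if key not in spec:
                raise ValueError(f"{entity}: missing {key!r} section")
    return mappings
```

Validating the shape at load time keeps mapping errors out of the migration run itself, where they are far more expensive to diagnose.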

As part of the Data Migration Planner workflow, this document also includes a comprehensive study plan designed to equip individuals with the knowledge and skills required to effectively plan, execute, and manage data migration projects.


Comprehensive Data Migration Study Plan

1. Introduction and Purpose

This study plan is meticulously crafted to provide a structured learning pathway for mastering the multifaceted domain of data migration. It covers fundamental concepts, best practices, technical considerations, and project management aspects essential for successful data transfer between systems. The goal is to build a solid foundation, enabling participants to confidently tackle real-world data migration challenges.

2. Target Audience

This plan is ideal for:

  • Aspiring Data Engineers, Data Architects, and ETL Developers.
  • IT Project Managers overseeing data-intensive projects.
  • Database Administrators and System Architects involved in system upgrades or consolidations.
  • Business Analysts seeking a deeper understanding of technical migration processes.
  • Anyone looking to enhance their skills in data management and system integration.

3. Overall Learning Goal

To gain a comprehensive and practical understanding of the entire data migration lifecycle, from initial assessment and planning to execution, validation, and post-migration activities, enabling the design and implementation of robust and secure data migration solutions.

4. Key Learning Objectives

Upon completion of this study plan, participants will be able to:

  • Understand Data Migration Fundamentals: Define data migration, identify its types, stages, and associated risks.
  • Conduct Data Source Analysis: Profile source data, identify data quality issues, and understand data structures.
  • Design Data Mapping and Transformation Rules: Create detailed field mappings and define complex transformation logic.
  • Develop Data Quality and Cleansing Strategies: Implement techniques for data standardization, de-duplication, and enrichment.
  • Evaluate Migration Strategies and Tools: Select appropriate migration methodologies (e.g., Big Bang, Phased) and choose suitable ETL/ELT tools.
  • Implement Robust Validation and Testing: Design comprehensive validation scripts and testing procedures to ensure data integrity and accuracy.
  • Plan for Rollback and Error Handling: Develop contingency plans and error resolution strategies for migration failures.
  • Address Security, Compliance, and Performance: Integrate security measures, ensure regulatory compliance, and optimize migration performance.
  • Manage Data Migration Projects: Apply project management principles to scope, plan, execute, and monitor data migration initiatives.
  • Execute Post-Migration Activities: Plan for post-migration support, data archiving, and system decommissioning.

5. Recommended Study Duration

This plan is designed for an 8-12 week duration, assuming approximately 10-15 hours of study per week. The pace can be adjusted based on individual learning speed and prior experience.

6. Weekly Schedule and Topics

| Week | Module | Key Topics Covered | Practical Activities / Focus |
| :--- | :----- | :----------------- | :--------------------------- |
| 1 | Introduction to Data Migration | - What is Data Migration? Types (Storage, Database, Application, Cloud) <br> - Data Migration Lifecycle (Assessment, Design, Execution, Validation, Go-Live) <br> - Common Challenges & Risks (Data Loss, Downtime, Cost) <br> - Business Drivers & Benefits | - Research case studies of successful/failed migrations. <br> - Self-assessment of current knowledge. <br> - Define scope for a hypothetical migration project. |
| 2 | Data Source Analysis & Profiling | - Understanding Source Systems & Data Models <br> - Data Profiling Techniques & Tools (e.g., SQL queries, data profiling software) <br> - Identifying Data Quality Issues (missing values, inconsistencies, duplicates) <br> - Data Volume & Velocity Assessment | - Practice SQL queries for data profiling (e.g., COUNT(*), DISTINCT, GROUP BY, MIN, MAX). <br> - Use a sample dataset to identify data quality issues. |
| 3 | Data Mapping & Transformation Rules | - Field-to-Field Mapping (Source to Target) <br> - Data Transformation Types (Lookup, Concatenation, Split, Aggregation, Derivation) <br> - Documenting Mapping Specifications & Transformation Logic <br> - Handling Data Type Conversions | - Create a detailed data mapping document for a sample scenario. <br> - Write pseudo-code or actual code for complex transformation rules. |
| 4 | Data Quality & Cleansing | - Data Cleansing Strategies (Standardization, De-duplication, Enrichment) <br> - Data Quality Rules Definition <br> - Tools for Data Quality Management <br> - Master Data Management (MDM) concepts | - Implement simple data cleansing scripts (e.g., Python, SQL). <br> - Evaluate different data quality tools. |
| 5 | Migration Strategy & Tooling | - Migration Approaches (Big Bang vs. Phased, Lift & Shift, Re-platform) <br> - ETL vs. ELT Paradigms <br> - Overview of Migration Tools (e.g., SSIS, Informatica, Talend, AWS DMS, Azure Data Factory, Google Cloud Dataflow) <br> - On-premise vs. Cloud Migration Considerations | - Research and compare 2-3 ETL/ELT tools. <br> - Outline a migration strategy for a given business scenario. |
| 6 | Data Validation & Testing | - Importance of Data Validation <br> - Types of Validation (Count, Sum, Reconciliation, Format, Referential Integrity) <br> - Designing Validation Scripts & Test Cases <br> - User Acceptance Testing (UAT) for Data | - Develop validation scripts using SQL or scripting languages. <br> - Create a test plan for data migration, including UAT scenarios. |
| 7 | Rollback Planning & Error Handling | - Developing a Comprehensive Rollback Strategy <br> - Error Logging, Monitoring, and Alerting <br> - Data Recovery Procedures <br> - Contingency Planning for Migration Failures | - Design an error handling framework for a migration process. <br> - Draft a rollback procedure document for a critical data table. |
| 8 | Security, Compliance & Performance | - Data Security during Migration (Encryption, Access Control) <br> - Regulatory Compliance (GDPR, HIPAA, PCI-DSS) <br> - Performance Optimization Techniques (Batching, Indexing, Parallel Processing) <br> - Audit Trails & Logging | - Research compliance requirements relevant to your industry. <br> - Brainstorm performance bottlenecks and solutions for a migration. |
| 9 | Project Management & Go-Live | - Data Migration Project Planning (Scope, Timeline, Resources, Budget) <br> - Stakeholder Management & Communication <br> - Cutover Planning & Downtime Management <br> - Go-Live Checklist & Readiness Assessment | - Develop a high-level project plan for a data migration. <br> - Create a stakeholder communication plan. |
| 10 | Post-Migration Activities & Review | - Post-Migration Support & Monitoring <br> - Data Archiving & Decommissioning of Legacy Systems <br> - Performance Tuning of New System <br> - Lessons Learned & Project Review | - Outline post-migration monitoring metrics. <br> - Conduct a "lessons learned" exercise for the hypothetical project. |
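The Week 2 profiling activities can be sketched in pandas. The sample frame and column names below are made up purely to illustrate the kinds of checks involved:

```python
import pandas as pd

# Illustrative sample with typical quality issues: a duplicate row,
# missing emails, and inconsistent status casing.
df = pd.DataFrame({
    "cust_id": [1, 2, 2, 3],
    "email": ["a@x.com", None, None, "c@x.com"],
    "status": ["Active", "ACTIVE", "ACTIVE", "inactive"],
})

profile = {
    "row_count": len(df),
    "duplicate_rows": int(df.duplicated().sum()),
    "missing_per_column": df.isna().sum().to_dict(),
    "distinct_status": sorted(df["status"].str.lower().unique()),
}
print(profile)
```

The same questions map directly onto the SQL mentioned in the table: `COUNT(*)` for row counts, `GROUP BY` with `HAVING COUNT(*) > 1` for duplicates, and `COUNT(*) - COUNT(col)` for missing values.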

7. Recommended Resources

  • Books:
    * "Data Migration" by John Owens
    * "The Data Warehouse Toolkit" by Ralph Kimball (for dimensional modeling concepts relevant to target systems)
    * "Designing Data-Intensive Applications" by Martin Kleppmann (for foundational distributed systems concepts)
  • Online Courses (e.g., Coursera, Udemy, edX, Pluralsight):
    * "Data Engineering with Google Cloud" / "Microsoft Azure Data Engineer Associate" / "AWS Certified Data Analytics" specialization courses.
    * Courses specifically on ETL tools (e.g., "Talend Data Integration," "Informatica PowerCenter").
    * SQL and Python for Data Analysis courses.
  • Documentation & Blogs:
    * Official documentation for major cloud providers (AWS, Azure, GCP) on their data migration services.
    * Vendor documentation for specific ETL tools.
    * Industry blogs (e.g., Data Engineering Weekly, Towards Data Science) for best practices and emerging trends.
  • Tools for Hands-on Practice:
    * Databases: PostgreSQL, MySQL (install locally or use cloud-managed services).
    * SQL Client Tools: DBeaver, SQL Developer, pgAdmin.
    * Scripting: Python with libraries like Pandas, SQLAlchemy.
    * ETL Tools (Free/Trial versions): Talend Open Studio, Apache NiFi, Pentaho Data Integration.
    * Spreadsheet Software: Excel, Google Sheets (for data mapping and small-scale analysis).

8. Milestones

  • End of Week 2: Submit a Data Profiling Report for a chosen sample dataset, highlighting data quality issues.
  • End of Week 4: Complete a Detailed Data Mapping Document with transformation rules for at least 10-15 fields across two tables.
  • End of Week 6: Develop a Prototype Data Validation Script (e.g., in SQL or Python) to verify data counts, sums, and a few format rules.
  • End of Week 8: Draft a Migration Strategy Document outlining the chosen approach, tooling, and key risks for a hypothetical project.
  • End of Week 10: Capstone Project Submission: A comprehensive "Data Migration Plan" document for a specific scenario, integrating all learned concepts (assessment, mapping, strategy, validation, rollback, timeline).
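The Week 6 milestone, a prototype data validation script, might be sketched as follows. The function and the frames in the usage note are placeholders, not a prescribed interface:

```python
import pandas as pd


def validate_migration(source: pd.DataFrame, target: pd.DataFrame,
                       sum_columns=()):
    """Return a list of discrepancies between source and target extracts.

    Checks row counts and, for each named numeric column, that the
    column sums reconcile between the two systems.
    """
    issues = []
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
    for col in sum_columns:
        src_sum, tgt_sum = source[col].sum(), target[col].sum()
        if src_sum != tgt_sum:
            issues.append(f"sum mismatch on {col!r}: {src_sum} vs {tgt_sum}")
    return issues
```

An empty return list means the checked invariants hold; in practice this would be extended with format and referential-integrity checks as listed in Week 6.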

9. Assessment Strategies

  • Self-Assessment Quizzes: Regularly test understanding of module topics.
  • Practical Exercises/Labs: Apply concepts using real or simulated data, developing scripts and configurations.
  • Case Study Analysis: Analyze provided data migration scenarios and propose solutions.
  • Project-Based Assignments: Each milestone serves as a project deliverable, demonstrating practical application.
  • Peer Review/Discussions: Engage with a study group or online community to discuss challenges and solutions.
  • Final Capstone Project: The ultimate assessment, requiring integration of all learned material into a cohesive and actionable plan.

10. Tips for Success

  • Hands-on Practice is Crucial: Theory is important, but practical application solidifies understanding. Work with actual data and tools.
  • Stay Curious: The data landscape evolves rapidly. Continuously read blogs, articles, and documentation.
  • Network: Join data professional communities, attend webinars, and connect with peers.
  • Document Everything: Good documentation is vital in data migration. Practice creating clear and concise plans, mappings, and procedures.
  • Understand the Business Context: Data migration is not just a technical exercise; understand the business reasons and impact.

This detailed study plan provides a robust framework for mastering data migration. By diligently following this schedule and engaging with the recommended resources and activities, you will develop the expertise required to excel in this critical area of data management.

```python
import logging
from datetime import datetime

import pandas as pd

logger = logging.getLogger(__name__)

# --- Helper Functions for Transformations ---

# Source-to-target status mapping. Only 'ACT' -> 'ACTIVE' survives in the
# original (truncated) snippet; the remaining entries are illustrative.
_STATUS_MAP = {
    "ACT": "ACTIVE",
    "INA": "INACTIVE",
    "PND": "PENDING",
}


def standardize_status(status_code):
    """Transform a source status code to a target standard.

    Example: 'ACT' -> 'ACTIVE'. Codes not in the map are logged and
    returned as 'UNKNOWN'.
    """
    code = (status_code or "").strip().upper()
    if code not in _STATUS_MAP:
        logger.warning("Unrecognized status code: %r", status_code)
        return "UNKNOWN"
    return _STATUS_MAP[code]
```


Data Migration Planner: Comprehensive Migration Strategy

Project Name: [Insert Project Name, e.g., CRM System Upgrade Data Migration]

Date: October 26, 2023

Version: 1.0

Prepared For: [Customer Name]

Prepared By: PantheraHive Solutions


1. Executive Summary

This document outlines the comprehensive plan for the data migration from [Source System Name] to [Target System Name]. The primary objective of this migration is to seamlessly transfer critical business data, ensuring accuracy, integrity, and minimal disruption to operations. This plan details the scope, methodology, field mapping, transformation rules, validation procedures, rollback strategy, and estimated timeline to guide a successful migration project.

2. Project Scope & Objectives

2.1. Project Scope

  • Source System: [Name of Source System, e.g., Legacy CRM v1.0]
    * Database Type/Version: [e.g., SQL Server 2012]
    * Key Modules/Entities: [e.g., Customer Accounts, Contacts, Opportunities, Products, Orders]
  • Target System: [Name of Target System, e.g., Salesforce Sales Cloud]
    * Database Type/Version: [e.g., Salesforce Standard Objects & Custom Objects]
    * Key Modules/Entities: [e.g., Accounts, Contacts, Opportunities, Products, Orders]
  • Data Included: All active and historical data for specified entities within the last [e.g., 5 years].
  • Data Excluded: Archived data older than [e.g., 5 years], test data, temporary records, or data identified as irrelevant/redundant.
  • Integration Points: Existing integrations with [System A] and [System B] will be reviewed and re-established/reconfigured for the Target System.

2.2. Project Objectives

  • Data Integrity: Ensure 100% data accuracy and completeness in the Target System post-migration.
  • Minimal Downtime: Execute the production cutover with the least possible impact on business operations, targeting a maximum downtime of [X] hours.
  • Performance: The migrated data must support optimal performance of the Target System.
  • User Acceptance: Ensure the migrated data meets the functional and reporting requirements of end-users.
  • Security & Compliance: Maintain data security standards and adhere to all relevant compliance regulations (e.g., GDPR, HIPAA).
  • Auditability: Provide a clear audit trail of all migration activities and data changes.

3. Source & Target Systems Overview

| Feature | Source System ([Name]) | Target System ([Name]) |
| :------------------- | :-------------------------------------------------------- | :------------------------------------------------------------ |
| System Type | [e.g., On-Premise CRM, Custom Application] | [e.g., Cloud-Based CRM, ERP Suite] |
| Database | [e.g., SQL Server 2012, Oracle 11g] | [e.g., Salesforce Database, PostgreSQL] |
| Key Entities | [e.g., Customers, Orders, Products, Invoices] | [e.g., Accounts, Opportunities, Products, Sales Orders] |
| Data Volume Est. | [e.g., 500 GB, 10 million records] | [e.g., 600 GB, 12 million records (post-transformation)] |
| API/Access | [e.g., Direct DB access, ODBC, Web Services] | [e.g., Salesforce API, REST API, ODBC] |
| Current Data Quality | [e.g., Moderate, known inconsistencies in addresses] | [e.g., High, new validation rules to be enforced] |

4. Data Inventory & Analysis

A detailed data inventory has been compiled, identifying all relevant tables, fields, and relationships within the Source System. Key activities include:

  • Data Profiling: Analysis of source data for patterns, anomalies, data types, uniqueness, and completeness.
  • Dependency Mapping: Identification of parent-child relationships and foreign key constraints to ensure correct migration order.
  • Volume Estimation: Accurate estimation of record counts and storage requirements for each entity.
  • Initial Quality Assessment: Documentation of identified data quality issues (e.g., duplicate records, missing values, inconsistent formats).
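Dependency mapping determines a safe load order: parent tables must land before the tables that reference them. A minimal topological-sort sketch over an assumed foreign-key graph (the table names and dependencies are hypothetical):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical foreign-key dependencies: each table lists the tables it
# references, which must therefore be migrated first.
DEPENDS_ON = {
    "customers": set(),
    "products": set(),
    "orders": {"customers"},
    "order_lines": {"orders", "products"},
}

load_order = list(TopologicalSorter(DEPENDS_ON).static_order())
print(load_order)
```

`TopologicalSorter` also raises `CycleError` on circular references, which is worth catching early: a cycle in the dependency graph means the migration order has to be resolved manually (e.g., by deferring a foreign-key column to a second pass).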

5. Detailed Field Mapping

The field mapping document serves as the cornerstone for understanding how each piece of data will transition from the Source to the Target system. A comprehensive mapping will be maintained in a dedicated spreadsheet, including:

  • Source Table Name: [e.g., dbo.Customers]
  • Source Field Name: [e.g., CUST_FName]
  • Source Data Type: [e.g., VARCHAR(50)]
  • Source Sample Data: [e.g., John]
  • Target Object Name: [e.g., Account]
  • Target Field Name: [e.g., FirstName__c (if custom) or FirstName (if standard)]
  • Target Data Type: [e.g., TEXT(80)]
  • Required in Target?: [Yes/No]
  • Transformation Rule ID: [Reference to specific transformation rule, e.g., TR001]
  • Notes/Comments: Any specific considerations, lookup requirements, or potential issues.
  • Primary/Foreign Key: Indication of key fields to maintain relationships.

Example Mapping (Illustrative):

| Source Table | Source Field | Source Type | Sample Data | Target Object | Target Field | Target Type | Required | Transformation Rule | Notes |
| :----------- | :----------- | :---------- | :---------- | :------------ | :----------- | :---------- | :------- | :------------------ | :-------------------------------------- |
| Customers | CUST_ID | INT | 12345 | Account | External_ID__c | TEXT(20) | Yes | TR001 | Unique external ID for reconciliation |
| Customers | CUST_FName | VARCHAR(50) | John | Contact | FirstName | TEXT(40) | Yes | TR002 | Map to Contact object for primary contact |
| Customers | CUST_LName | VARCHAR(50) | Doe | Contact | LastName | TEXT(80) | Yes | TR002 | |
| Customers | ADDR_LINE1 | VARCHAR(100) | 123 Main St | Account | BillingStreet | TEXT(255) | Yes | TR003 | Combine with ADDR_LINE2 if needed |
| Customers | STATUS_CD | CHAR(1) | A | Account | Status__c | Picklist | Yes | TR004 | Lookup: A=Active, I=Inactive, P=Pending |
| Orders | ORDER_DT | DATETIME | 2023-01-15 | Order | OrderDate | DATE | Yes | TR005 | Convert to DATE only |

6. Data Transformation Rules

Detailed rules will be applied during the extraction, transformation, and load (ETL) process to ensure data conforms to the Target System's requirements and business logic. Each rule will be documented with its ID, description, and logic.

  • TR001: External ID Generation
    * Description: Concatenate CUST_ID from Customers table with a prefix "LEG-" to form External_ID__c in Account object.
    * Logic: CONCAT('LEG-', [CUST_ID])
  • TR002: Name Consolidation
    * Description: Map CUST_FName and CUST_LName to Contact.FirstName and Contact.LastName respectively. Ensure proper capitalization.
    * Logic: PROPER([CUST_FName]), PROPER([CUST_LName])
  • TR003: Address Line Concatenation
    * Description: Combine ADDR_LINE1 and ADDR_LINE2 into Account.BillingStreet if ADDR_LINE2 is not null.
    * Logic: IF([ADDR_LINE2] IS NOT NULL, CONCAT([ADDR_LINE1], ', ', [ADDR_LINE2]), [ADDR_LINE1])
  • TR004: Status Code Lookup
    * Description: Translate single-character status codes from Source to descriptive picklist values in Target.
    * Logic:
      CASE [STATUS_CD]
        WHEN 'A' THEN 'Active'
        WHEN 'I' THEN 'Inactive'
        WHEN 'P' THEN 'Pending Review'
        ELSE 'Unknown'
      END
  • TR005: Date Format Conversion
    * Description: Convert DATETIME fields from Source to DATE-only format in Target.
    * Logic: CAST([ORDER_DT] AS DATE)
  • TR006: Default Value Assignment
    * Description: If Industry is null in Source, default to "Other" in Target.
    * Logic: IF([INDUSTRY] IS NULL, 'Other', [INDUSTRY])
  • TR007: Currency Conversion
    * Description: Convert all monetary values from [Source Currency] to [Target Currency] using a fixed exchange rate of [X] or a dynamic rate from [Exchange Rate Service].
    * Logic: [AMOUNT] * [EXCHANGE_RATE]
  • TR008: Data Cleansing - Trim Spaces
    * Description: Remove leading and trailing whitespace from all text fields before load.
    * Logic: TRIM([field])
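Rules TR003 through TR005 can be sketched as pandas transformations. The sample frame is illustrative; column names follow the example mapping table above:

```python
import pandas as pd

df = pd.DataFrame({
    "ADDR_LINE1": ["123 Main St", "9 Oak Ave"],
    "ADDR_LINE2": ["Suite 4", None],
    "STATUS_CD": ["A", "X"],
    "ORDER_DT": pd.to_datetime(["2023-01-15 10:30:00",
                                "2023-02-01 08:00:00"]),
})

# TR003: concatenate address lines only when the second line is present.
df["BillingStreet"] = df["ADDR_LINE1"].where(
    df["ADDR_LINE2"].isna(),
    df["ADDR_LINE1"] + ", " + df["ADDR_LINE2"],
)

# TR004: translate status codes, defaulting unmapped codes to 'Unknown'.
status_lookup = {"A": "Active", "I": "Inactive", "P": "Pending Review"}
df["Status__c"] = df["STATUS_CD"].map(status_lookup).fillna("Unknown")

# TR005: keep the date portion only.
df["OrderDate"] = df["ORDER_DT"].dt.date
```

Keeping each rule a single vectorized expression, keyed by its TR identifier, makes the transformation layer directly traceable back to the rules documented here.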