Data Migration Planner

Data Migration Planner: Detailed Technical Specification & Code Deliverables

This document outlines the comprehensive technical plan for your data migration, encompassing field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates. The provided code examples are designed to be production-ready, well-commented, and actionable, serving as a foundational toolkit for your migration project.


1. Introduction

This deliverable provides the detailed technical specifications and accompanying code artifacts for the planned data migration. Our goal is to ensure a smooth, accurate, and verifiable transfer of data from your source systems to the new target environment. Each section below details a critical component of the migration process, offering both conceptual understanding and practical code implementations.


2. Field Mapping Specification

Purpose: To explicitly define the relationship between source system fields and target system fields, including data types, transformations required, and any specific notes for each mapping.

Approach: We will use a structured Python dictionary format to represent the field mappings. This allows for clear, programmatic definition and easy integration into migration scripts.

Code Deliverable: field_mapping.py

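A minimal sketch of the intended dictionary format is shown below. The entries are illustrative placeholders (field names are borrowed from the sample mapping table later in this document), not the final mapping:

```python
# field_mapping.py
# Field mapping specification: source field -> target field, with type and
# transformation metadata. All field names here are illustrative placeholders.

FIELD_MAPPINGS = {
    "CustomerID": {
        "target_field": "External_ID__c",
        "source_type": "INT",
        "target_type": "Text",
        "transformation": "cast_to_string",   # rule ID or function name
        "notes": "Unique identifier, mapped to an external ID field.",
    },
    "CompanyName": {
        "target_field": "Name",
        "source_type": "VARCHAR(255)",
        "target_type": "Text",
        "transformation": None,               # direct mapping, no rule needed
        "notes": "Direct mapping.",
    },
}


def get_mapping(source_field):
    """Return the mapping entry for a source field, or None if unmapped."""
    return FIELD_MAPPINGS.get(source_field)
```

Keeping the mapping in a plain data structure means the same definition can drive both the migration scripts and the generated mapping documentation.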
3. Data Transformation Rules

Purpose: To define and implement the logic required to convert source data into a format suitable for the target system, addressing data type mismatches, data quality issues, and business rule applications.

Approach: We will use Python functions to encapsulate specific transformation logic. This promotes reusability, testability, and clarity.

Code Deliverable: transformation_rules.py


This document also includes a comprehensive study plan designed to equip you with the essential knowledge and skills required to excel as a Data Migration Planner.

The study plan is structured as a detailed roadmap, ensuring a thorough understanding of all critical aspects of data migration, from initial planning and architecture through execution and post-migration activities.


Study Plan: Mastering Data Migration Planning

Goal: To develop a comprehensive understanding and practical skills in planning, designing, and overseeing complex data migration projects, encompassing all phases from initial assessment to post-migration validation and rollback strategies.

1. Learning Objectives

Upon completion of this study plan, you will be able to:

  • Understand Data Migration Fundamentals: Differentiate between various data migration types, strategies (e.g., Big Bang, Phased), and the typical lifecycle phases of a migration project.
  • Conduct Thorough Source & Target System Analysis: Perform detailed analysis of source and target data models, schemas, data types, constraints, and dependencies.
  • Master Data Profiling & Quality Assessment: Utilize tools and techniques to profile source data, identify data quality issues, anomalies, and inconsistencies, and assess data readiness for migration.
  • Design Comprehensive Field Mappings: Create precise and unambiguous field-level mappings between source and target systems, accounting for data type conversions and structural differences.
  • Define Robust Data Transformation Rules: Develop detailed rules for data cleansing, standardization, aggregation, enrichment, and derivation to meet target system requirements.
  • Develop Effective Data Validation Scripts: Design and implement pre-migration, in-migration, and post-migration validation scripts to ensure data integrity, accuracy, and completeness.
  • Formulate Resilient Rollback Procedures: Plan and document comprehensive rollback strategies and contingency plans to mitigate risks and recover from potential migration failures.
  • Estimate Timelines and Resources: Develop realistic timeline estimates, identify necessary resources (human, technical, financial), and understand critical path dependencies.
  • Identify and Mitigate Risks: Conduct thorough risk assessments, develop mitigation strategies, and plan for potential challenges during the migration process.
  • Understand Performance, Security, and Compliance: Incorporate considerations for migration performance, data security, and regulatory compliance into the planning process.
  • Evaluate and Select Migration Tools: Understand the capabilities of various data migration tools (ETL, scripting, cloud-native services) and select appropriate technologies.
  • Create a Complete Data Migration Plan: Synthesize all learned components into a comprehensive, actionable data migration plan document.

2. Weekly Schedule

This schedule assumes a dedicated study effort of approximately 10-15 hours per week over a 6-week period. Adjust as necessary based on your prior experience and available time.

Week 1: Fundamentals, Scope, and Architecture Overview

  • Focus: Introduction to data migration concepts, project lifecycle, scoping, and architectural considerations.
  • Topics:

* What is data migration? Types (on-prem to cloud, system upgrades, consolidation, etc.).

* Phases of a data migration project (planning, design, execution, validation, cutover, post-migration).

* Migration strategies (Big Bang vs. Phased).

* High-level architectural patterns for data migration (ETL, ELT, direct load).

* Stakeholder identification and communication planning.

* Defining migration scope, objectives, and success criteria.

  • Activities:

* Read introductory chapters on data migration.

* Research different migration strategies and their pros/cons.

* Analyze a simple data migration case study.

* Draft a high-level scope document for a hypothetical migration project.

  • Deliverables: High-Level Data Migration Scope Document, Stakeholder Map.

Week 2: Source & Target System Analysis and Data Profiling

  • Focus: Deep dive into understanding source and target environments, and assessing data quality.
  • Topics:

* Techniques for analyzing source database schemas, file structures, and APIs.

* Understanding target system requirements, data models, and constraints.

* Data profiling tools and methodologies (identifying data types, patterns, uniqueness, completeness, consistency).

* Identifying data quality issues (duplicates, missing values, incorrect formats).

* Impact analysis of schema differences.

  • Activities:

* Practice data profiling using a sample dataset (e.g., SQL queries, Python libraries like Pandas Profiling, or dedicated tools).

* Document differences between a sample source and target schema.

* Create a data quality report template.

  • Deliverables: Sample Source/Target Schema Comparison Report, Data Profiling Report for a sample dataset.
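The Week 2 profiling activities can be prototyped without any dedicated tool. The sketch below computes completeness, distinct counts, and inferred types over a small illustrative dataset in plain Python (the customer records are made-up sample data):

```python
def profile(rows, columns):
    """Per-column data profile over a list-of-dicts dataset: completeness
    (% non-null), distinct value count, and the set of observed types."""
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "non_null_pct": round(100 * len(non_null) / len(values), 1) if values else 0.0,
            "distinct": len(set(non_null)),
            "types": sorted({type(v).__name__ for v in non_null}),
        }
    return report


# Illustrative sample dataset with two classic quality issues
customers = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},             # missing value
    {"customer_id": 3, "email": "c@example.com"},
    {"customer_id": 3, "email": "c@example.com"},  # duplicate key
]
report = profile(customers, ["customer_id", "email"])
```

The same idea scales up directly with pandas or a dedicated profiling tool; the point is that the output (completeness, uniqueness, types) feeds straight into the data quality report template.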

Week 3: Field Mapping and Transformation Rules

  • Focus: Designing the core logic for data movement and modification.
  • Topics:

* Principles of effective field mapping (one-to-one, one-to-many, many-to-one).

* Handling complex data types and structures (JSON, XML).

* Defining data transformation rules: cleansing, standardization, aggregation, enrichment, derivation, lookup tables.

* Best practices for documentation of mappings and transformations.

* Version control for mapping documents.

  • Activities:

* Develop detailed field mappings for a hypothetical scenario involving multiple tables/entities.

* Write pseudo-code or actual code (SQL, Python) for complex transformation rules.

* Document mapping and transformation rules in a structured format.

  • Deliverables: Detailed Field Mapping Document, Data Transformation Rules Specification (for a hypothetical scenario).

Week 4: Data Validation, Error Handling, and Rollback Procedures

  • Focus: Ensuring data integrity post-migration and preparing for contingencies.
  • Topics:

* Types of validation: row counts, checksums, reconciliation, business rule validation, referential integrity.

* Designing pre-migration, in-migration, and post-migration validation checks.

* Strategies for error logging, reporting, and handling during migration.

* Developing comprehensive rollback plans and procedures.

* Contingency planning and recovery strategies.

  • Activities:

* Design a set of validation checks for a migration scenario.

* Outline an error handling framework.

* Create a detailed rollback procedure flowchart and checklist.

* Practice writing simple validation scripts (e.g., SQL queries to compare source and target data counts/sums).

  • Deliverables: Data Validation Plan, Error Handling Strategy Document, Detailed Rollback Procedure.
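The count/sum reconciliation checks described above can be sketched with the standard-library sqlite3 module. Table and column names here are illustrative stand-ins for the real source and target systems:

```python
import sqlite3


def validate_counts_and_sums(conn, src_table, tgt_table, sum_col):
    """Post-migration reconciliation: compare row counts and a simple numeric
    checksum (SUM of one column) between source and target tables."""
    cur = conn.cursor()
    checks = {}
    for label, table in (("source", src_table), ("target", tgt_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({sum_col}), 0) FROM {table}")
        checks[label] = cur.fetchone()
    return {
        "row_count_match": checks["source"][0] == checks["target"][0],
        "sum_match": checks["source"][1] == checks["target"][1],
        "details": checks,
    }


# Demo on an in-memory database standing in for staging/target
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (amount REAL);
    CREATE TABLE tgt_orders (amount REAL);
    INSERT INTO src_orders VALUES (10.0), (20.0), (30.0);
    INSERT INTO tgt_orders VALUES (10.0), (20.0), (30.0);
""")
result = validate_counts_and_sums(conn, "src_orders", "tgt_orders", "amount")
```

A mismatch in either check flags the entity for deeper row-level reconciliation rather than proving which rows are wrong.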

Week 5: Performance, Security, Risk Management, and Tooling

  • Focus: Optimizing the migration process, securing data, managing risks, and understanding technology choices.
  • Topics:

* Performance optimization techniques for large-scale migrations (batching, parallel processing, indexing).

* Data security considerations: encryption, access control, anonymization/pseudonymization.

* Compliance requirements (GDPR, HIPAA, etc.) during migration.

* Risk identification, assessment, and mitigation strategies.

* Overview of popular data migration tools (ETL tools like SSIS, Talend, Informatica; cloud-native services like AWS DMS, Azure Data Factory, Google Cloud Dataflow; scripting languages).

* Tool selection criteria.

  • Activities:

* Conduct a risk assessment for a hypothetical migration project.

* Research and compare 2-3 common data migration tools.

* Develop a security checklist for data migration.

  • Deliverables: Data Migration Risk Assessment Log, Tool Comparison Matrix, Data Security Checklist.
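As one concrete example of the batching technique listed under performance optimization, a minimal sketch (the batch size and loader are placeholders for a real bulk-load API):

```python
def batched(records, batch_size):
    """Yield fixed-size batches so large loads can be committed incrementally:
    smaller transactions, and an easy restart point after a failed batch."""
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]


def load_in_batches(records, load_fn, batch_size=500):
    """Load records batch by batch, recording per-batch outcomes so a failed
    batch can be retried without re-running the whole migration."""
    outcomes = []
    for n, batch in enumerate(batched(records, batch_size), start=1):
        try:
            load_fn(batch)
            outcomes.append((n, len(batch), "ok"))
        except Exception as exc:          # log and continue; retry later
            outcomes.append((n, len(batch), f"failed: {exc}"))
    return outcomes


# Demo: 1,050 records in batches of 500 -> three batches of 500, 500, 50
target = []
outcomes = load_in_batches(list(range(1050)), target.extend, batch_size=500)
```

Parallelizing the batches is then a scheduling decision layered on top of this structure, not a rewrite of it.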

Week 6: Testing, Cutover, Post-Migration, and Comprehensive Plan Development

  • Focus: Finalizing the migration plan, testing, execution, and post-migration activities.
  • Topics:

* Developing a comprehensive testing strategy (unit, integration, user acceptance testing).

* Planning the cutover strategy and downtime management.

* Post-migration monitoring, support, and archiving.

* Developing a complete, integrated data migration plan document.

* Lessons learned documentation.

  • Activities:

* Draft a detailed Data Migration Testing Plan.

* Outline a cutover strategy for a specific scenario.

* Consolidate all previous weeks' deliverables into a single, comprehensive Data Migration Plan document for a complex hypothetical project.

  • Deliverables: Comprehensive Data Migration Plan Document (integrating all previous elements), Testing Strategy, Cutover Plan.

3. Recommended Resources

  • Books:

* "Data Migration: An Executive Guide" by Johna Till Johnson

* "The DAMA Guide to the Data Management Body of Knowledge (DMBOK2)" (Relevant chapters on Data Integration and Data Quality)

* "Designing Data-Intensive Applications" by Martin Kleppmann (for understanding underlying data systems)

  • Online Courses (Search for "Data Migration," "ETL," "Data Engineering"):

* Coursera: Specializations in Data Engineering (e.g., Google Cloud, AWS)

* Udemy / LinkedIn Learning: Courses on specific ETL tools (Talend, Informatica, SSIS) or cloud migration services.

* Pluralsight: Courses on data architecture and migration.

  • Industry Publications & Blogs:

* Gartner, Forrester: Reports on data management and migration trends.

* Blogs from major cloud providers (AWS, Azure, Google Cloud) on their migration services.

* Blogs from data integration vendors (Informatica, Talend, Fivetran).

  • Documentation:

* Official documentation for databases (SQL Server, Oracle, PostgreSQL, MySQL).

* Official documentation for ETL tools and cloud migration services you are interested in.

  • Practice Data:

* Kaggle datasets, public government data, or self-generated sample data for hands-on profiling and mapping exercises.

4. Milestones

  • End of Week 1: Draft of High-Level Data Migration Scope Document and Stakeholder Map.
  • End of Week 2: Completed Data Profiling Report for a sample dataset and Source/Target Schema Comparison.
  • End of Week 3: Detailed Field Mapping Document and Data Transformation Rules Specification for a hypothetical scenario.
  • End of Week 4: Developed Data Validation Plan, Error Handling Strategy, and Detailed Rollback Procedure.
  • End of Week 5: Completed Data Migration Risk Assessment Log, Tool Comparison Matrix, and Data Security Checklist.
  • End of Week 6: Submission of a comprehensive, integrated Data Migration Plan Document for a complex hypothetical project, including a Testing Strategy and Cutover Plan.

5. Assessment Strategies

  • Self-Assessment & Review:

* Regularly review your completed weekly deliverables against best practices and sample templates.

* Use checklists to ensure all key aspects of each document are covered.

* Critically evaluate your own mapping and transformation rules for clarity, completeness, and robustness.

  • Practical Application:

The primary assessment will be the quality and completeness of the final, integrated Data Migration Plan Document produced in Week 6.

Code Deliverable: transformation_rules.py

```python
"""
Module containing specific data transformation functions for the migration.

Each function addresses a specific transformation rule identified in the
field mapping specification.
"""

import re
from datetime import datetime

# --- Helper Lookups and Constants ---

# Example lookup for state codes to full names
STATE_CODE_TO_NAME = {
    "AL": "Alabama", "AK": "Alaska", "AZ": "Arizona", "AR": "Arkansas",
    "CA": "California", "CO": "Colorado", "CT": "Connecticut", "DE": "Delaware",
    "FL": "Florida", "GA": "Georgia", "HI": "Hawaii", "ID": "Idaho",
    "IL": "Illinois", "IN": "Indiana", "IA": "Iowa", "KS": "Kansas",
    "KY": "Kentucky", "LA": "Louisiana", "ME": "Maine", "MD": "Maryland",
    "MA": "Massachusetts", "MI": "Michigan", "MN": "Minnesota", "MS": "Mississippi",
    "MO": "Missouri", "MT": "Montana", "NE": "Nebraska", "NV": "Nevada",
    "NH": "New Hampshire", "NJ": "New Jersey", "NM": "New Mexico", "NY": "New York",
    "NC": "North Carolina", "ND": "North Dakota", "OH": "Ohio", "OK": "Oklahoma",
    "OR": "Oregon", "PA": "Pennsylvania", "RI": "Rhode Island", "SC": "South Carolina",
    "SD": "South Dakota", "TN": "Tennessee", "TX": "Texas", "UT": "Utah",
    "VT": "Vermont", "VA": "Virginia", "WA": "Washington", "WV": "West Virginia",
    "WI": "Wisconsin", "WY": "Wyoming",
}


# --- Transformation Functions (illustrative; source formats are assumptions) ---

def transform_state_code(code):
    """Expand a two-letter state code to its full name.

    Unrecognized or empty codes are returned unchanged for manual review.
    """
    if not code:
        return code
    return STATE_CODE_TO_NAME.get(code.strip().upper(), code)


def transform_phone(raw):
    """Strip all non-digit characters from a phone number."""
    return re.sub(r"\D", "", raw or "")


def transform_date(raw, source_fmt="%m/%d/%Y"):
    """Parse a source date string into a date object.

    The source format is an assumption; adjust it to the actual system.
    """
    return datetime.strptime(raw, source_fmt).date()
```


Data Migration Planner: Comprehensive Plan

Project: [Client Project Name - e.g., Legacy CRM to Salesforce Migration]

Date: October 26, 2023

Version: 1.0

Prepared By: PantheraHive Data Migration Team


1. Executive Summary

This document outlines a comprehensive plan for the data migration from [Source System Name - e.g., Legacy CRM] to [Target System Name - e.g., Salesforce CRM]. It details the strategy, scope, field mapping, transformation rules, validation procedures, rollback mechanisms, and a high-level timeline. The primary goal is to ensure a smooth, accurate, and secure transfer of critical business data, minimizing downtime and data integrity risks, while aligning with the target system's architecture and business requirements.


2. Project Goals and Scope

2.1. Project Goals

  • Successfully migrate all in-scope data from [Source System] to [Target System] with 100% accuracy for critical fields and >99.5% accuracy for non-critical fields.
  • Ensure data integrity and consistency throughout the migration process.
  • Minimize downtime during the cutover period to [Target System] to less than [e.g., 4 hours].
  • Provide robust validation and rollback capabilities to mitigate risks.
  • Decommission [Source System] post-migration (if applicable).

2.2. Scope Definition

  • In-Scope Data Entities (Examples):

* Customer Accounts

* Contacts

* Opportunities / Sales Orders

* Products / Services

* Historical Activities (e.g., emails, calls for the past 3 years)

  • Out-of-Scope Data Entities (Examples):

* Archived historical data older than [e.g., 5 years]

* Temporary or transient data (e.g., session logs)

* Highly customized, non-standard reports (to be recreated in the target system)

  • Source System: [e.g., Oracle EBS 11i, Custom MS Access Database]
  • Target System: [e.g., Salesforce Enterprise Edition, SAP S/4HANA]

3. Data Migration Strategy

The migration will employ a [e.g., Phased Big Bang / Incremental] approach. We will utilize a combination of [e.g., custom ETL scripts (Python/SQL), native migration tools, API integrations] for data extraction, transformation, and loading.

  • Extraction: Data will be extracted from the [Source System] using [e.g., direct database queries, API calls, flat file exports].
  • Transformation: Data will be cleansed, enriched, de-duplicated, and formatted according to the [Target System]'s requirements and defined business rules. This will occur in a dedicated staging environment.
  • Loading: Transformed data will be loaded into the [Target System] using [e.g., Salesforce Data Loader, SAP LSMW, direct database inserts via secure API].
  • Testing: Comprehensive testing cycles, including unit, integration, and user acceptance testing (UAT), will be conducted in a dedicated sandbox environment before production migration.
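The extract-transform-load flow above can be sketched as a small pipeline. All callables and sample records here are illustrative stand-ins for the real source and target systems:

```python
def run_migration(extract_fn, transform_fn, load_fn, validate_fn):
    """Minimal extract -> staging -> transform -> load -> validate pipeline.

    All steps are injected callables so the same skeleton works for any
    source/target pair; bad records are quarantined rather than aborting.
    """
    staged = list(extract_fn())                 # extract into a staging area
    transformed, rejects = [], []
    for record in staged:
        try:
            transformed.append(transform_fn(record))
        except ValueError as exc:               # quarantine for remediation
            rejects.append((record, str(exc)))
    load_fn(transformed)
    return {"extracted": len(staged), "loaded": len(transformed),
            "rejected": len(rejects), "valid": validate_fn(transformed)}


# Illustrative run with in-memory stand-ins for real systems
def clean(record):
    """Sample transformation: require and trim the 'name' field."""
    if not record["name"]:
        raise ValueError("missing name")
    return {"name": record["name"].strip()}


source = [{"name": " Acme "}, {"name": None}]
target = []
result = run_migration(
    extract_fn=lambda: source,
    transform_fn=clean,
    load_fn=target.extend,
    validate_fn=lambda rows: len(rows) == len(target),
)
```

In practice the extract step would read from the source database or API, the load step would call the target's bulk interface, and the validate step would run the reconciliation checks defined in the validation plan.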

4. Detailed Migration Plan

4.1. Data Source and Target Systems

  • Source System:

* Name: [e.g., Legacy CRM - Microsoft Dynamics NAV 2009]

* Database: [e.g., SQL Server 2008 R2]

* Access Method: ODBC connection via dedicated migration server

* Key Modules: Sales, Customers, Products

  • Target System:

* Name: [e.g., Modern CRM - Salesforce Sales Cloud]

* Database: Salesforce internal schema

* Access Method: Salesforce Data Loader / API (SOAP/REST)

* Key Modules: Accounts, Contacts, Opportunities, Products

4.2. Data Inventory and Volume Estimates

| Data Entity | Source Table(s) | Target Object(s) | Estimated Record Count (Source) | Growth Rate (Annual) | Criticality |
| :-------------- | :-------------------- | :-------------------- | :------------------------------ | :------------------- | :---------- |
| Customer Accounts | tbl_Customers | Account | 150,000 | 5% | High |
| Contacts | tbl_Contacts | Contact | 300,000 | 7% | High |
| Opportunities | tbl_SalesOrders | Opportunity | 75,000 | 10% | Medium |
| Products | tbl_Products | Product2 | 10,000 | 2% | High |
| Activities | tbl_CallLogs, tbl_Emails | Task, EmailMessage | 1,200,000 | 15% | Medium |

4.3. Field Mapping (Illustrative Examples)

This table provides examples of field mapping. A complete mapping document will be maintained in a separate, version-controlled spreadsheet.

| Source Entity.Field (Type) | Target Entity.Field (Type) | Mandatory (Target) | Transformation Rule ID | Notes |
| :------------------------- | :------------------------- | :----------------- | :--------------------- | :--------------------------------------------- |
| tbl_Customers.CustomerID (INT) | Account.External_ID__c (Text) | Yes | TR-001 | Unique identifier, mapped to external ID field |
| tbl_Customers.CompanyName (VARCHAR(255)) | Account.Name (Text) | Yes | N/A | Direct mapping |
| tbl_Customers.Status (VARCHAR(10)) | Account.Account_Status__c (Picklist) | Yes | TR-002 | Map legacy codes to new picklist values |
| tbl_Customers.AddressLine1 (VARCHAR(255)) | Account.BillingStreet (Text) | No | N/A | Concatenated with AddressLine2 for clarity |
| tbl_Customers.AddressLine2 (VARCHAR(255)) | Account.BillingStreet (Text) | No | TR-003 | Appended to AddressLine1 |
| tbl_Contacts.FirstName (VARCHAR(100)) | Contact.FirstName (Text) | Yes | N/A | Direct mapping |
| tbl_Contacts.LastName (VARCHAR(100)) | Contact.LastName (Text) | Yes | N/A | Direct mapping |
| tbl_SalesOrders.OrderDate (DATETIME) | Opportunity.CloseDate (Date) | Yes | TR-004 | Converted to Date only, rounded to nearest day |
| tbl_SalesOrders.TotalAmount (DECIMAL) | Opportunity.Amount (Currency) | Yes | N/A | Direct mapping |
| tbl_Products.LegacySKU (VARCHAR(50)) | Product2.ProductCode (Text) | Yes | N/A | Direct mapping |

4.4. Transformation Rules (Illustrative Examples)

| Rule ID | Source Field(s) | Target Field(s) | Description |
| :------ | :-------------- | :-------------- | :---------- |
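Based on the notes in the field-mapping table above, the referenced rule IDs might be implemented along these lines. This is a sketch only: the legacy status codes are assumptions, and TR-004 is shown as simple truncation to the date rather than true rounding:

```python
from datetime import datetime


def tr_001_cast_external_id(customer_id):
    """TR-001: cast the integer CustomerID to text for Account.External_ID__c."""
    return str(customer_id)


# Assumed legacy status codes; replace with the real code set from profiling.
STATUS_PICKLIST = {"A": "Active", "I": "Inactive"}


def tr_002_map_status(legacy_code):
    """TR-002: map a legacy status code to the target picklist value."""
    return STATUS_PICKLIST.get(legacy_code, "Unknown")


def tr_003_concat_address(line1, line2):
    """TR-003: append AddressLine2 to AddressLine1 for Account.BillingStreet."""
    parts = [p.strip() for p in (line1, line2) if p and p.strip()]
    return ", ".join(parts)


def tr_004_order_date_to_close_date(order_datetime):
    """TR-004: reduce a DATETIME OrderDate to a date for Opportunity.CloseDate.

    The mapping table says 'rounded to nearest day'; simple truncation is
    shown here, which is the common choice unless the business rule differs.
    """
    return order_datetime.date()
```

Keeping one small function per rule ID makes each rule unit-testable and keeps the mapping spreadsheet and the code traceable to each other.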

if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/\*\*(.+?)\*\*/g,"$1"); hc=hc.replace(/\n{2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\nFiles:\n- "+app+".md (Markdown)\n- "+app+".html (styled HTML)\n"); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); } function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}