Data Migration Planner

Data Migration Planner: Detailed Implementation Plan

This document outlines a comprehensive plan for your data migration, including detailed field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates. This deliverable provides the foundational code structures and logical frameworks necessary for executing a robust and reliable data migration.


1. Overall Migration Strategy

Our strategy emphasizes a phased approach:

  1. Discovery & Analysis: Deep dive into source and target systems.
  2. Design & Planning: Define mappings, transformations, and procedures (this document).
  3. Development: Implement migration scripts, validation, and rollback mechanisms.
  4. Testing: Thorough unit, integration, and user acceptance testing (UAT) in a staging environment.
  5. Execution: Phased migration in a controlled environment.
  6. Post-Migration Validation: Final checks and cutover.
  7. Decommissioning (Optional): Old system retirement.

This plan focuses on providing the code-centric details for steps 2 and 3.


2. Data Migration Plan Components

2.1. Field Mapping Definition

This section defines the precise mapping between source system fields and target system fields. It includes information on source tables, target tables, data types, and any specific notes for each field. This mapping will serve as the blueprint for data extraction and loading.

Mapping Structure (Conceptual JSON/Python Dictionary):

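The conceptual snippet for this section was collapsed in this rendering. As a stand-in, below is a minimal sketch of one plausible shape for such a mapping dictionary; all table, field, and function names are illustrative assumptions, not recovered from the original.

```python
# Hypothetical mapping structure (illustrative names only; the original
# snippet was not recoverable from this rendering).
FIELD_MAPPINGS = {
    "customers": {
        "source_table": "Customers",
        "target_table": "customers",
        "fields": {
            "CustomerID": {"target": "customer_id", "type": "uuid", "transform": "generate_uuid"},
            "Email": {"target": "email_address", "type": "varchar(255)", "transform": "format_email"},
            "CreationDate": {"target": "created_at", "type": "timestamptz", "transform": "convert_date_to_utc_timestamp"},
        },
    },
}

# Example lookup: which transform applies to the source Email field?
rule = FIELD_MAPPINGS["customers"]["fields"]["Email"]
print(rule["target"], rule["transform"])  # email_address format_email
```

A structure like this lets the extraction and loading scripts stay generic: they iterate over the mapping rather than hard-coding field names.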
2.2. Data Transformation Rules

This section outlines the specific rules and logic for transforming data from its source format into the target system's required format. These rules address data type conversions, concatenations, lookups, defaulting, and other manipulations.

Example Python Module for Transformations:


Detailed Study Plan: Data Migration Planner Proficiency

This document outlines a comprehensive study plan designed to equip professionals with the knowledge and practical skills required to plan, execute, and manage complex data migration projects. It is structured to provide a deep understanding of the data migration lifecycle, methodologies, tools, and best practices, ensuring you can deliver robust and efficient migration solutions.


1. Program Goal & Introduction

Goal: To develop a highly proficient Data Migration Planner capable of leading and executing end-to-end data migration projects, encompassing discovery, design, implementation, validation, and post-migration activities.

Target Audience: This plan is ideal for IT professionals, data engineers, system architects, project managers, and anyone involved in data-intensive projects requiring system transitions or consolidations.

Duration: This structured plan is designed for an 8-week intensive study, with optional extensions for deeper dives or practical project application.


2. Weekly Schedule & Core Topics

Each week focuses on a distinct phase or critical aspect of data migration, building knowledge progressively.

  • Week 1: Fundamentals & Project Initiation

* Topics: Introduction to Data Migration (DM) – drivers, challenges, risks, benefits. Types of migrations (application, database, storage, cloud). DM lifecycle phases. Key roles and responsibilities. Choosing migration methodologies (Waterfall, Agile, Hybrid).

* Activities: Review foundational concepts, research common migration failure points, understand project charter components for DM.

  • Week 2: Data Discovery & Source System Analysis

* Topics: In-depth source system analysis. Data profiling techniques and tools. Schema analysis (ERDs, data dictionaries). Data quality assessment (completeness, accuracy, consistency, uniqueness). Identifying data dependencies and relationships.

* Activities: Practice data profiling on a sample dataset (using SQL or a profiling tool). Document a sample source system's data model.
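As a concrete starting point for the profiling activity, here is a minimal sketch using pandas (a tool assumption; the plan does not mandate a specific profiler). The column names and tiny dataset are illustrative.

```python
import pandas as pd

def profile_dataframe(df):
    """Per-column profile: null percentage, distinct count, and an example value."""
    rows = []
    for col in df.columns:
        series = df[col]
        rows.append({
            "column": col,
            "null_pct": round(series.isna().mean() * 100, 1),
            "distinct": series.nunique(dropna=True),
            "example": series.dropna().iloc[0] if series.notna().any() else None,
        })
    return pd.DataFrame(rows)

# Tiny illustrative dataset with deliberate gaps
sample = pd.DataFrame({
    "email": ["a@example.com", None, "b@example.com"],
    "state": ["NY", "NY", None],
})
print(profile_dataframe(sample))
```

Even this small report surfaces the issues the week focuses on: completeness (null percentages) and uniqueness (distinct counts).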

  • Week 3: Target System Requirements & Data Mapping

* Topics: Defining target system data requirements. Designing the target data model. Field-level data mapping strategies (one-to-one, one-to-many, many-to-one). Documenting mapping specifications. Identifying data transformation needs.

* Activities: Create a detailed data mapping document for a specific business entity, including source fields, target fields, and basic transformation logic.

  • Week 4: Data Transformation Rules & ETL Design

* Topics: Developing complex data transformation rules (cleansing, standardization, aggregation, derivation, enrichment). Principles of ETL (Extract, Transform, Load) design. Choosing appropriate ETL tools (commercial vs. open-source vs. custom scripts). Performance considerations for ETL.

* Activities: Design an ETL flow for a set of transformation rules. Write pseudo-code or actual scripts for complex data transformations.
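A minimal sketch of what the ETL exercise could look like; the record shape and cleansing rules below are illustrative assumptions, not a prescribed solution.

```python
def extract(rows):
    """Extract: yield raw records from the source (an in-memory list here)."""
    yield from rows

def transform(record):
    """Transform: cleanse and standardize a single record."""
    return {
        "name": (record.get("Name") or "").strip().title(),
        "amount": round(float(record.get("Amount") or 0), 2),
    }

def load(records, target):
    """Load: append transformed records to the target (a list standing in for a table)."""
    for rec in records:
        target.append(rec)

source_rows = [{"Name": "  alice SMITH ", "Amount": "19.999"}]
warehouse = []
load((transform(r) for r in extract(source_rows)), warehouse)
print(warehouse)  # [{'name': 'Alice Smith', 'amount': 20.0}]
```

Keeping the three stages as separate functions mirrors real ETL tools: each stage can be tested, swapped, or parallelized independently.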

  • Week 5: Data Quality, Validation & Reconciliation

* Topics: Comprehensive data validation strategies (pre-migration, during migration, post-migration). Developing validation scripts (SQL, Python, PowerShell). Data reconciliation techniques. Error handling and logging in migration processes.

* Activities: Develop a set of validation rules and corresponding SQL queries/scripts to check data integrity post-migration. Practice error identification and logging.
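Row-count reconciliation is the usual first validation check. The sketch below uses an in-memory SQLite database as a stand-in for the real source and target systems; the table names and sample rows are illustrative.

```python
import sqlite3

# In-memory stand-ins for the source and target; real checks would run
# against the actual databases.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_customers (id INTEGER, email TEXT);
    CREATE TABLE target_customers (id TEXT, email TEXT);
    INSERT INTO source_customers VALUES (1, 'a@example.com'), (2, 'b@example.com');
    INSERT INTO target_customers VALUES ('uuid-1', 'a@example.com'), ('uuid-2', 'b@example.com');
""")

def reconcile_counts(conn, source_table, target_table):
    """Row-count reconciliation between a source table and a target table."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

result = reconcile_counts(conn, "source_customers", "target_customers")
print(result)  # {'source': 2, 'target': 2, 'match': True}
```

In practice you would extend this with per-column checksums and sampling-based field comparisons, since equal counts alone do not prove equal content.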

  • Week 6: Migration Strategy, Architecture & Tooling

* Topics: Choosing migration approaches (big bang, phased, trickle, parallel run). Designing the migration architecture (data pipelines, staging areas). Deep dive into specific migration tools (e.g., AWS DMS, Azure Data Factory, Talend, SSIS, custom scripting frameworks). Infrastructure considerations (on-premise, cloud, hybrid).

* Activities: Outline a migration strategy for a hypothetical scenario. Research and compare features of 2-3 prominent migration tools.

  • Week 7: Testing, Cutover & Rollback Planning

* Topics: Developing comprehensive test plans (unit, integration, UAT, performance, security testing). Test data management. Detailed cutover planning (downtime, communication, go/no-go criteria). Designing robust rollback procedures and contingency plans. Backup and recovery strategies.

* Activities: Create a detailed cutover checklist and a rollback plan for a critical system migration. Outline a UAT test case scenario.

  • Week 8: Project Management, Governance & Advanced Topics

* Topics: Project timeline estimation, resource planning, and budget considerations. Risk identification, assessment, and mitigation strategies. Stakeholder communication and change management. Data governance, security, and compliance (GDPR, HIPAA). Post-migration monitoring and optimization.

* Activities: Develop a risk register for a migration project. Analyze a real-world data migration case study focusing on project management aspects.


3. Learning Objectives

Upon successful completion of this study plan, you will be able to:

  • Comprehend Data Migration Fundamentals: Articulate the business drivers, challenges, and benefits of various data migration types and methodologies.
  • Execute Data Discovery & Profiling: Conduct thorough source and target system analysis, including data profiling, schema analysis, and data quality assessment.
  • Design & Document Data Mappings: Create precise field-level data mappings and define complex transformation rules for diverse data types and structures.
  • Develop ETL Solutions: Design efficient ETL processes, select appropriate tooling, and understand performance optimization techniques.
  • Implement Robust Validation: Formulate and implement comprehensive data validation, reconciliation, and error handling strategies across the migration lifecycle.
  • Strategize Migration Execution: Choose optimal migration strategies (e.g., big bang, phased, trickle) and design scalable migration architectures.
  • Plan Testing & Rollback: Develop detailed test plans, cutover strategies, and robust rollback procedures to ensure business continuity and minimize risk.
  • Manage Migration Projects: Effectively manage project timelines, resources, risks, and stakeholder communication, while adhering to data governance and compliance requirements.
  • Utilize Key Tools: Gain familiarity with industry-standard data profiling, ETL, and database management tools.

4. Recommended Resources

  • Books:

* "Data Migration: Strategies and Best Practices" by various authors (e.g., Krish Krishnan, John R. Talburt); focus on recent editions.

* "Designing Data-Intensive Applications" by Martin Kleppmann (for foundational data system concepts).

* "The DAMA Guide to the Data Management Body of Knowledge (DMBOK2)" (for data governance and quality principles).

  • Online Courses & Platforms:

* Coursera/edX: Specializations in Data Engineering, Cloud Data Migrations (AWS, Azure, GCP specific courses).

* Udemy/Pluralsight: Courses on specific ETL tools (e.g., Talend, SSIS, Informatica), SQL for Data Analysis, Python for Data Engineering.

* LinkedIn Learning: Project Management courses, Data Governance, Data Quality.

* Cloud Provider Documentation: AWS Database Migration Service (DMS), Azure Data Factory, Google Cloud Dataflow/Dataproc official documentation.

  • Tools (Hands-on Practice):

* Databases: PostgreSQL, MySQL, SQL Server (Express Edition).

* Data Profiling: OpenRefine, SQL queries, commercial tools (e.g., Informatica Data Quality - if available).

* ETL: Talend Open Studio (free), Apache Nifi, Microsoft SQL Server Integration Services (SSIS - part of SQL Server Developer Edition), Python with Pandas/PySpark.

* Version Control: Git/GitHub.

  • Blogs & Communities:

* DAMA International (Data Management Association)

* Blogs from major cloud providers (AWS, Azure, GCP) on data migration.

* Specific ETL tool user forums and communities.


5. Milestones

Achieving these milestones will signify significant progress and mastery throughout the study plan.

  • Milestone 1 (End of Week 2): Successfully complete a data profiling report for a provided sample dataset, identifying data quality issues and potential challenges.
  • Milestone 2 (End of Week 4): Develop a comprehensive data mapping specification document for a complex business entity, including detailed transformation rules and logic.
  • Milestone 3 (End of Week 6): Outline a complete data migration strategy and architecture for a given scenario, justifying the chosen approach and tools.
  • Milestone 4 (End of Week 7): Create a detailed cutover plan, including a go/no-go checklist, and a robust rollback procedure for a critical system migration.
  • Milestone 5 (End of Study): Successfully complete a capstone project (detailed below) or pass a relevant professional certification exam (e.g., AWS Certified Database - Specialty, Azure Data Engineer Associate).

6. Assessment Strategies

Learning will be assessed through a combination of practical application, theoretical understanding, and project-based work.

  • Weekly Practical Assignments:

* SQL Scripting: Writing queries for data profiling, cleansing, transformation, and validation.

* Tool-Based Exercises: Using OpenRefine for data cleaning, Talend Open Studio for ETL design, or similar tools.

* Documentation: Creating data mapping documents, architecture diagrams, test plans, and risk registers.

  • Quizzes & Short Assessments: Regular short quizzes to test understanding of theoretical concepts and best practices.
  • Case Study Analysis: Analyzing provided real-world data migration scenarios to identify challenges, propose solutions, and evaluate outcomes.
  • Capstone Project: A culminating project where you design a complete data migration plan for a hypothetical business scenario, covering discovery, mapping, transformation, validation, cutover, and rollback planning.

Example Python module (transformations.py), implementing the transformation rules from section 2.2 above:

```python
# transformations.py
import uuid
from datetime import datetime

import pytz  # For timezone handling


def generate_uuid(original_id=None):
    """Generates a UUID. If an original_id is provided, it can be used for
    deterministic UUIDs if needed; otherwise a random UUID is generated."""
    # For migration, old IDs usually need to map to new UUIDs consistently.
    # A common approach is to store the mapping in a temporary table or dictionary.
    # For simplicity here, we generate a new random UUID.
    return str(uuid.uuid4())


def format_email(email_str):
    """Cleans and validates an email address. Defaults to a placeholder if
    the value is invalid or None."""
    if email_str is None or not isinstance(email_str, str) or "@" not in email_str:
        return "unknown@example.com"
    return email_str.strip().lower()


def combine_address_fields(addr1, addr2, addr3=None):
    """Combines multiple address lines into a single comma-separated string."""
    parts = [addr1, addr2, addr3]
    return ", ".join(filter(None, parts))


def convert_date_to_utc_timestamp(date_obj, source_timezone='America/New_York'):
    """Converts a datetime (or ISO date string) to a UTC ISO 8601 timestamp."""
    if date_obj is None:
        return None
    if not isinstance(date_obj, datetime):
        # Attempt to parse if it's a string in ISO format
        try:
            date_obj = datetime.fromisoformat(date_obj)
        except (ValueError, TypeError):
            # Fallback for other formats; log an error in production code
            print(f"Warning: Could not parse date string: {date_obj}. Returning None.")
            return None
    # Assume naive source dates represent the source_timezone
    local_tz = pytz.timezone(source_timezone)
    if date_obj.tzinfo is None:
        localized_date = local_tz.localize(date_obj)
    else:
        localized_date = date_obj  # Already timezone-aware
    utc_date = localized_date.astimezone(pytz.utc)
    return utc_date.isoformat()  # ISO 8601 timestamp with timezone offset


def map_state_code_to_name(state_code):
    """Maps a two-letter state code to its full name."""
    if not state_code:
        return state_code  # Guard against None/empty input
    state_map = {
        "NY": "New York",
        "CA": "California",
        "TX": "Texas",
        # ... add more mappings
    }
    return state_map.get(state_code.upper(), state_code)  # Return original if not found


def default_boolean_value(value, default=True):
    """Provides a default boolean value if the source value is None or invalid."""
    if value is None:
        return default
    if isinstance(value, str):
        return value.lower() in ['true', '1', 'yes']
    return bool(value)


# Centralized transformation logic (example for a 'Customer' record)

def transform_customer_record(source_record, id_mapping_cache):
    """Applies all necessary transformations to a single customer record.

    id_mapping_cache is a dictionary or service that stores
    old_customer_id -> new_uuid mappings.
    """
    transformed_record = {}

    # Primary key (CustomerID to UUID), kept consistent via the cache
    old_customer_id = source_record.get('CustomerID')
    if old_customer_id not in id_mapping_cache:
        id_mapping_cache[old_customer_id] = generate_uuid()
    transformed_record['customer_id'] = id_mapping_cache[old_customer_id]

    # Basic string fields
    transformed_record['first_name'] = source_record.get('FirstName', '').strip()
    transformed_record['last_name'] = source_record.get('LastName', '').strip()

    # Email with validation/defaulting
    transformed_record['email_address'] = format_email(source_record.get('Email'))

    # Address fields
    transformed_record['street_address'] = combine_address_fields(
        source_record.get('AddressLine1'),
        source_record.get('AddressLine2')
    )
    transformed_record['city'] = source_record.get('City', '').strip()
    transformed_record['state_province'] = map_state_code_to_name(source_record.get('State', ''))
    transformed_record['postal_code'] = source_record.get('ZipCode', '').strip()

    # Date field
    transformed_record['created_at'] = convert_date_to_utc_timestamp(source_record.get('CreationDate'))

    # New field defaulting (source may carry IsActive; default to True)
    transformed_record['is_active'] = default_boolean_value(source_record.get('IsActive', True))

    return transformed_record


# Example for an 'Order' record (demonstrating FK lookup). Orders and customers
# get separate caches so that overlapping numeric IDs cannot collide.

def transform_order_record(source_record, customer_id_cache, order_id_cache):
    """Applies transformations to an order record, including FK lookup."""
    transformed_record = {}

    # Primary key (OrderID to UUID)
    old_order_id = source_record.get('OrderID')
    if old_order_id not in order_id_cache:
        order_id_cache[old_order_id] = generate_uuid()
    transformed_record['order_id'] = order_id_cache[old_order_id]

    # Foreign key (CustomerID lookup)
    old_customer_id = source_record.get('CustomerID')
    if old_customer_id in customer_id_cache:
        transformed_record['customer_id'] = customer_id_cache[old_customer_id]
    else:
        # Handle a missing FK: log an error, skip the record, or map to a
        # designated 'orphan' customer.
        print(f"Warning: CustomerID {old_customer_id} not found in mapping cache "
              f"for OrderID {old_order_id}. Skipping FK.")
        transformed_record['customer_id'] = None  # Or a special 'orphan' customer UUID

    # Date field
    transformed_record['order_date'] = convert_date_to_utc_timestamp(source_record.get('OrderDate'))

    # Numeric field
    transformed_record['total_amount'] = source_record.get('TotalAmount')

    return transformed_record
```

Example usage within a migration script:

```python
from transformations import transform_customer_record, transform_order_record
from data_migration_config import FIELD_MAPPINGS

customer_id_map = {}  # Stores old_customer_id -> new_customer_uuid
order_id_map = {}     # Stores old_order_id -> new_order_uuid

# Imagine source_customer_data is a list of dicts fetched from the source system
for source_cust in source_customer_data:
    transformed_cust = transform_customer_record(source_cust, customer_id_map)
```


Data Migration Planner: Comprehensive Migration Strategy

Project: [Project Name - e.g., CRM to ERP Data Migration]

Date: October 26, 2023

Version: 1.0

Prepared For: [Customer Name/Organization]

Prepared By: PantheraHive Solutions Team


1. Executive Summary

This document outlines the comprehensive plan for the data migration from [Source System Name] to [Target System Name]. The primary objective is to facilitate a seamless and accurate transfer of critical business data, ensuring data integrity, minimal downtime, and successful integration with the new system. This plan details the scope, field mappings, transformation rules, validation procedures, rollback strategy, and an estimated timeline to guide the migration process from initiation to post-migration support.

2. Project Scope and Objectives

2.1. Project Scope

  • Source System: [e.g., Legacy CRM v3.2, Microsoft SQL Server]
  • Target System: [e.g., SAP S/4HANA v2022, SAP HANA Database]
  • Data Entities In-Scope:

* Customers (Accounts, Contacts)

* Products/Services

* Sales Orders (Headers, Line Items)

* Invoices (Headers, Line Items)

* Historical Transactions (Last 3 years)

* [Add any other specific entities, e.g., Opportunities, Leads, Employees]

  • Data Entities Out-of-Scope:

* Archived data older than 3 years

* System configuration data

* User preferences

* [Add any other specific entities not being migrated]

2.2. Project Objectives

  • Achieve 100% data integrity and accuracy for migrated in-scope data.
  • Ensure all critical business processes function correctly with the migrated data in the target system.
  • Minimize business disruption and downtime during the cutover period.
  • Provide a robust rollback strategy in case of unforeseen issues.
  • Complete the migration within the defined timeline and budget.
  • Enhance data quality and consistency in the target system through defined transformation rules.

3. Source and Target System Details

3.1. Source System Details

  • Name: [e.g., Legacy CRM]
  • Version: [e.g., v3.2]
  • Database Type: [e.g., Microsoft SQL Server 2016]
  • Key Modules Involved: Customer Management, Sales Order Processing, Product Catalog
  • Access Method: ODBC/JDBC connection, API access
  • Estimated Data Volume: [e.g., 500GB, 10M customer records, 20M order records]

3.2. Target System Details

  • Name: [e.g., SAP S/4HANA]
  • Version: [e.g., 2022]
  • Database Type: [e.g., SAP HANA]
  • Key Modules Involved: SD (Sales & Distribution), MM (Materials Management), FI (Financial Accounting), Master Data Management
  • Access Method: SAP IDoc, BAPI, Direct database inserts (if permitted)
  • Data Model Considerations: Specific data structures, mandatory fields, referential integrity rules.

4. Data Inventory and Scope

The following table summarizes the key data entities identified for migration, along with estimated record counts and initial data quality assessment notes.

| Data Entity | Source Table(s) | Target Table(s) | Estimated Record Count (Source) | Initial Data Quality Notes |
| :--- | :--- | :--- | :--- | :--- |
| Customers | CRM.Customers | SAP.KNA1 (General) | 500,000 | High duplicate rate, inconsistent address formats. |
| Contacts | CRM.Contacts | SAP.KNVK (Contact) | 1,200,000 | Missing email/phone for ~15% of records. |
| Products | CRM.Products | SAP.MARA (General) | 15,000 | Inconsistent product categories, some missing descriptions. |
| Sales Orders | CRM.Orders, CRM.OrderLines | SAP.VBAK, SAP.VBAP | 2,500,000 (Headers) | Complex status mapping required, historical orders need archiving. |
| Invoices | CRM.Invoices | SAP.VBRK, SAP.VBRP | 1,800,000 (Headers) | Requires currency conversion for older records. |
| Payment Terms | CRM.PaymentTerms | SAP.T052 (Terms) | 20 | Direct mapping, no transformations needed. |
| Sales Reps | CRM.Users | SAP.PA0001 (Org Assignment) | 200 | Role mapping to SAP required. |

5. Data Field Mapping

Detailed field-level mapping will be documented in a separate Data Mapping Specification document (DMS), with a high-level example provided below. Each entry in the DMS will include: Source Field, Source Data Type, Target Field, Target Data Type, Transformation Rule Reference, and Notes.

Example: Customer Data Mapping

| Source Table.Field (Data Type) | Target Table.Field (Data Type) | Transformation Rule ID | Notes |
| :--- | :--- | :--- | :--- |
| CRM.Customers.CustomerID (VARCHAR(50)) | SAP.KNA1.KUNNR (CHAR(10)) | TR-CUST-001 | Auto-generate new SAP Customer ID, store old ID in SAP.KNA1.KUNN2 |
| CRM.Customers.CompanyName (VARCHAR(255)) | SAP.KNA1.NAME1 (CHAR(35)) | TR-CUST-002 | Truncate if > 35 chars. |
| CRM.Customers.Address1 (VARCHAR(255)) | SAP.KNA1.STRAS (CHAR(35)) | TR-CUST-003 | Split Address1 if it contains an apartment/suite number. |
| CRM.Customers.City (VARCHAR(100)) | SAP.KNA1.ORT01 (CHAR(35)) | N/A | Direct map. |
| CRM.Customers.State (CHAR(2)) | SAP.KNA1.REGIO (CHAR(3)) | TR-CUST-004 | Map 2-letter state code to 3-letter SAP region code. |
| CRM.Customers.ZipCode (VARCHAR(10)) | SAP.KNA1.PSTLZ (CHAR(10)) | N/A | Direct map. |
| CRM.Customers.Status (VARCHAR(20)) | SAP.KNA1.LOEVM (CHAR(1)) | TR-CUST-005 | Map 'Active' to blank, 'Inactive' to 'X' (deletion flag). |
| CRM.Customers.CreationDate (DATETIME) | SAP.KNA1.ERDAT (DATS) | TR-DATE-001 | Convert to SAP date format (YYYYMMDD). |
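Rules such as TR-CUST-002 (truncation), TR-CUST-005 (status-to-deletion-flag), and TR-DATE-001 (SAP date format) from the table above could be sketched as follows. The rule bodies are inferred from the Notes column and are assumptions, not an official specification.

```python
from datetime import datetime

def tr_cust_002(company_name):
    """TR-CUST-002: truncate CompanyName to the 35-character limit of KNA1.NAME1."""
    return (company_name or "")[:35]

def tr_cust_005(status):
    """TR-CUST-005: map 'Active' to blank and 'Inactive' to 'X' (deletion flag)."""
    return "X" if (status or "").strip().lower() == "inactive" else ""

def tr_date_001(dt):
    """TR-DATE-001: render a datetime in SAP DATS format (YYYYMMDD)."""
    return dt.strftime("%Y%m%d")

print(tr_cust_005("Inactive"), tr_date_001(datetime(2023, 10, 26)))  # X 20231026
```

Keeping each rule as a small named function tied to its Rule ID makes the Data Mapping Specification directly traceable to code.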

6. Data Transformation Rules

All data transformation rules will be formally documented and referenced in the Data Mapping Specification. Below are examples of common transformation types.

| Rule ID | Entity.Field(s) | Description |

if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/\*\*(.+?)\*\*/g,"$1"); hc=hc.replace(/\n{2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+"\n\nGenerated by PantheraHive BOS.\n\nFiles:\n- "+app+".md (Markdown)\n- "+app+".html (styled HTML)\n"); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); } function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}