Data Migration Planner
Run ID: 69cb76f461b1021a29a894b0 · 2026-03-31 · Development
PantheraHive BOS
BOS Dashboard

Plan a complete data migration with field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates.

Comprehensive Study Plan: Mastering Data Migration Planning

This document outlines a detailed 12-week study plan designed to equip professionals with the knowledge and practical skills required to plan and execute complex data migrations successfully. This plan emphasizes a structured approach, combining theoretical understanding with practical application, culminating in the ability to confidently manage data migration projects.


1. Introduction and Overall Learning Objectives

Purpose: To provide a structured, actionable roadmap for individuals aiming to become proficient in data migration planning, covering all critical aspects from initial assessment to post-migration validation and project management.

Target Audience: IT professionals, project managers, data architects, database administrators, and developers looking to specialize or enhance their skills in data migration.

Expected Outcome: Upon completion, the learner will possess a comprehensive understanding of data migration methodologies, best practices, tools, and the ability to design robust migration strategies, manage risks, and ensure data integrity throughout the process.

Overall Learning Objectives:

  • Understand the full lifecycle of a data migration project.
  • Master techniques for data profiling, analysis, and discovery.
  • Develop robust field mapping and data transformation rules.
  • Design and implement effective data validation and quality assurance procedures.
  • Formulate comprehensive rollback and contingency plans.
  • Evaluate and select appropriate data migration tools and technologies.
  • Estimate project timelines, resources, and manage associated risks.
  • Communicate effectively with stakeholders throughout the migration process.
  • Lead or contribute significantly to successful data migration initiatives.

2. Weekly Schedule and Detailed Learning Objectives

This 12-week schedule breaks down the learning journey into manageable modules, each with specific objectives.

Week 1: Data Migration Fundamentals & Project Initiation

  • Learning Objectives:

* Define data migration, its types, and common triggers.

* Understand the key phases of a data migration lifecycle.

* Identify common challenges and risks in data migration.

* Learn how to define project scope, objectives, and success criteria.

* Understand stakeholder identification and initial communication strategies.

* Introduction to data governance and compliance considerations in migration.

  • Activities: Read foundational articles, watch introductory videos, participate in a discussion forum on common migration pitfalls.

Week 2: Source System Analysis & Data Profiling

  • Learning Objectives:

* Master techniques for analyzing source system architecture and data models.

* Learn to perform comprehensive data profiling to understand data characteristics (volume, velocity, variety, veracity).

* Identify data quality issues (duplicates, inconsistencies, missing values) within the source.

* Understand the importance of data dictionaries and metadata management.

* Gain proficiency in using SQL queries and basic data profiling tools.

  • Activities: Practice SQL queries for data profiling, analyze a sample dataset, document findings.
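
For the profiling activity, a minimal sketch using pandas; the file name and column names are illustrative stand-ins for whatever sample dataset is used:

```python
import pandas as pd

# Load a sample extract of the source table (path is illustrative).
df = pd.read_csv("customers_extract.csv")

# Volume: row and column counts.
print(f"Rows: {len(df)}, Columns: {len(df.columns)}")

# Completeness: percent of missing values per column, worst first.
print((df.isna().mean() * 100).round(2).sort_values(ascending=False))

# Uniqueness: fully duplicated rows and distinct values per column.
print(f"Duplicate rows: {df.duplicated().sum()}")
print(df.nunique())

# Validity spot-check: values outside an expected domain (column is assumed).
if "CUST_STATUS" in df.columns:
    bad = df.loc[~df["CUST_STATUS"].isin(["A", "I"]), "CUST_STATUS"]
    print(bad.value_counts())
```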

Week 3: Target System Design & Schema Mapping

  • Learning Objectives:

* Understand the process of designing or adapting the target system schema.

* Learn to identify and resolve discrepancies between source and target data models.

* Develop strategies for handling new fields, deprecated fields, and data type changes.

* Understand the impact of target system constraints (e.g., referential integrity, unique keys) on migration.

* Introduction to data warehousing concepts if applicable to the target.

  • Activities: Compare a source schema to a target schema, propose schema adjustments, create an initial schema mapping document.
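
To support the schema-comparison activity, a small sketch that diffs two schemas held as {column: type} dictionaries; both schemas and the draft mapping are invented for illustration:

```python
# Hypothetical source and target schemas as {column_name: data_type}.
source = {"CUSTOMER_ID": "INT", "CUST_NAME": "VARCHAR(250)", "CUST_STATUS": "CHAR(1)"}
target = {"ExternalId__c": "TEXT", "Name": "TEXT", "Status__c": "PICKLIST", "Industry": "TEXT"}

# Draft source-to-target mapping (still subject to business review).
mapping = {"CUSTOMER_ID": "ExternalId__c", "CUST_NAME": "Name", "CUST_STATUS": "Status__c"}

unmapped_source = set(source) - set(mapping)           # need a mapping decision
unfilled_target = set(target) - set(mapping.values())  # new fields needing defaults
type_changes = {s: (source[s], target[t])              # candidate transformation rules
                for s, t in mapping.items() if source[s] != target[t]}

print("Unmapped source fields:", unmapped_source)
print("Target fields with no source:", unfilled_target)
print("Type changes to document:", type_changes)
```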

Week 4: Field Mapping & Data Transformation Logic

  • Learning Objectives:

* Master the creation of detailed field-level mapping documents.

* Develop complex data transformation rules (e.g., concatenation, splitting, lookup, aggregation, conditional logic).

* Understand the implications of data cleansing and enrichment during transformation.

* Learn to document transformation logic clearly and unambiguously.

* Introduction to ETL (Extract, Transform, Load) concepts and tools.

  • Activities: Create detailed field mapping and transformation rules for a complex business entity, write pseudo-code for transformations.
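
In place of pseudo-code, transformation rules can also be drafted as small, testable Python functions. A sketch of two rule shapes mentioned above (splitting and conditional logic); the field names and thresholds are invented:

```python
def split_full_name(full_name: str) -> tuple[str, str]:
    """Splitting rule: FULL_NAME -> (FirstName, LastName)."""
    parts = full_name.strip().split()
    if not parts:
        return ("", "")
    if len(parts) == 1:
        return (parts[0], "")
    # Everything before the final token is treated as the first name.
    return (" ".join(parts[:-1]), parts[-1])

def derive_segment(annual_revenue: float | None) -> str:
    """Conditional rule with a default value for missing data."""
    if annual_revenue is None:
        return "Unknown"
    return "Enterprise" if annual_revenue >= 1_000_000 else "SMB"

assert split_full_name("Ada Lovelace") == ("Ada", "Lovelace")
assert derive_segment(None) == "Unknown"
```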

Week 5: Data Quality, Cleansing & Enrichment

  • Learning Objectives:

* Deep dive into data quality dimensions (accuracy, completeness, consistency, timeliness, validity, uniqueness).

* Learn various data cleansing techniques and best practices.

* Understand the role of data governance in maintaining quality post-migration.

* Explore strategies for data enrichment (e.g., geocoding, third-party data).

* Understand the trade-offs between automated and manual cleansing.

  • Activities: Apply data cleansing rules to a dirty dataset, document a data quality improvement plan.
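
For the dirty-dataset exercise, a minimal cleansing sketch with pandas; the rules and field names are illustrative, not prescriptive:

```python
import pandas as pd

df = pd.DataFrame({
    "name":  ["  Acme Corp ", "acme corp", "Globex  "],
    "phone": ["(555) 123-4567", "555.123.4567", "n/a"],
})

# Standardize whitespace and casing so near-duplicates become comparable.
df["name"] = (df["name"].str.strip().str.title()
              .str.replace(r"\s+", " ", regex=True))

# Keep digits only; treat too-short results as missing rather than guessing.
df["phone"] = df["phone"].str.replace(r"\D", "", regex=True)
df.loc[df["phone"].str.len() < 10, "phone"] = pd.NA

# Deduplicate on the cleansed key, keeping the first occurrence.
df = df.drop_duplicates(subset=["name"], keep="first")
print(df)
```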

Week 6: Migration Strategy & Tooling Selection

  • Learning Objectives:

* Understand different migration strategies (e.g., Big Bang, Phased, Trickle, Coexistence).

* Evaluate the pros and cons of various strategies based on project context.

* Learn to select appropriate data migration tools (e.g., ETL tools, custom scripts, cloud migration services).

* Understand criteria for tool selection (cost, performance, features, support, skill set).

* Introduction to concepts like data virtualization and replication.

  • Activities: Research and compare 3-4 data migration tools, propose a migration strategy for a given scenario.

Week 7: Data Validation & Testing

  • Learning Objectives:

* Design comprehensive data validation scripts and test cases.

* Understand different types of validation (e.g., count validation, checksums, reconciliation, business rule validation).

* Learn to define acceptance criteria for migrated data.

* Develop a robust data migration testing plan (unit, integration, user acceptance testing).

* Understand the importance of test data management.

  • Activities: Write sample SQL validation scripts, create a data migration test plan with scenarios and expected outcomes.
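
A count-and-checksum sketch for the validation activity, using two in-memory SQLite databases to stand in for source and target; the table and data are invented for practice:

```python
import sqlite3

src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
    db.executemany("INSERT INTO customers VALUES (?, ?)",
                   [(1, "Acme"), (2, "Globex")])

def row_count(db):
    return db.execute("SELECT COUNT(*) FROM customers").fetchone()[0]

def checksum(db):
    # Order-independent fingerprint; hash() is per-process, so use
    # hashlib instead if results must be compared across runs.
    rows = db.execute("SELECT id, name FROM customers").fetchall()
    return sum(hash(r) for r in rows)

assert row_count(src) == row_count(tgt), "Count validation failed"
assert checksum(src) == checksum(tgt), "Checksum reconciliation failed"
print("Counts and checksums reconcile")
```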

Week 8: Rollback & Contingency Planning

  • Learning Objectives:

* Understand the critical importance of a well-defined rollback strategy.

* Learn to design detailed rollback procedures for various failure points.

* Identify potential migration risks and develop contingency plans.

* Understand data backup and recovery strategies in the context of migration.

* Develop a communication plan for migration failures.

  • Activities: Draft a rollback plan for a specific migration phase, identify top 5 risks and their mitigation strategies.
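
One rollback pattern worth drafting: snapshot the target table before loading, then restore the snapshot on failure. A SQLite sketch; production systems would lean on native backup/restore tooling instead:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (id INTEGER, name TEXT)")
db.execute("INSERT INTO accounts VALUES (1, 'Existing')")

# 1. Snapshot the target table before the load.
db.execute("CREATE TABLE accounts_backup AS SELECT * FROM accounts")

try:
    # 2. Attempt the load (this batch deliberately fails part-way).
    db.execute("INSERT INTO accounts VALUES (2, 'Migrated')")
    raise RuntimeError("simulated mid-load failure")
except Exception:
    # 3. Rollback procedure: restore the pre-load snapshot.
    db.execute("DELETE FROM accounts")
    db.execute("INSERT INTO accounts SELECT * FROM accounts_backup")

assert db.execute("SELECT COUNT(*) FROM accounts").fetchone()[0] == 1
print("Target restored to its pre-migration state")
```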

Week 9: Performance, Security & Compliance

  • Learning Objectives:

* Optimize migration performance (e.g., batching, indexing, parallel processing).

* Understand data security considerations during migration (encryption, access control).

* Learn about compliance requirements (GDPR, HIPAA, PCI-DSS) and their impact on migration.

* Implement auditing and logging mechanisms for migration activities.

* Understand data archiving strategies post-migration.

  • Activities: Research compliance requirements for a specific industry, propose performance optimization techniques for a large dataset.
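
For the performance exercise, a batching sketch: load in fixed-size chunks inside explicit transactions instead of committing row-by-row. The batch size and row counts are illustrative and should be tuned against the real system:

```python
import sqlite3
from itertools import islice

def batched(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")

rows = ((i, i * 9.99) for i in range(100_000))  # simulated extract
for chunk in batched(rows, 5_000):
    with db:  # one transaction per batch: fewer commits, bounded memory
        db.executemany("INSERT INTO orders VALUES (?, ?)", chunk)

print(db.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```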

Week 10: Cutover Execution & Post-Migration Activities

  • Learning Objectives:

* Plan and manage the cutover phase, including downtime minimization.

* Understand the steps involved in go-live and immediate post-migration monitoring.

* Learn about post-migration data reconciliation and reporting.

* Develop a plan for decommissioning old systems.

* Understand hypercare period management and issue resolution.

  • Activities: Create a detailed cutover checklist, outline post-migration monitoring metrics.

Week 11: Project Management & Communication

  • Learning Objectives:

* Apply project management principles to data migration (scope, schedule, budget, resources).

* Develop effective communication strategies for diverse stakeholders.

* Learn to manage expectations and report progress.

* Understand change management principles in the context of migration.

* Apply risk management and issue-tracking best practices.

  • Activities: Develop a stakeholder communication plan, create a basic project schedule (Gantt chart concept).

Week 12: Capstone Project & Review

  • Learning Objectives:

* Synthesize all learned concepts into a comprehensive data migration plan.

* Present a well-articulated migration strategy to a mock stakeholder group.

* Identify areas for continuous improvement in data migration processes.

* Review key concepts and prepare for advanced topics or certifications.

  • Activities: Work on a comprehensive data migration plan for a provided case study, prepare and deliver a presentation.

3. Recommended Resources

This section lists essential resources to support the learning journey.

Books:

  • "Data Migration: The Essential Guide" by Johny S. Maalouf
  • "Data Migration and Related Technologies" by Christian H. N. Schaller
  • "The DAMA Guide to the Data Management Body of Knowledge (DMBOK2)" - for broader data management context.
  • "Designing Data-Intensive Applications" by Martin Kleppmann - for understanding underlying data systems.

Online Courses & Platforms:

  • Coursera/edX: Courses on Data Warehousing, ETL, Database Management, Cloud Data Engineering (e.g., Google Cloud Data Engineering, AWS Data Analytics).
  • Udemy/Pluralsight: Practical courses on specific ETL tools (e.g., Talend, Informatica, SSIS), SQL, Python for data manipulation.
  • LinkedIn Learning: Courses on Project Management, Change Management, specific database technologies.
  • Cloud Provider Documentation: AWS Database Migration Service (DMS), Azure Database Migration Service, Google Cloud Dataproc/Dataflow documentation.

Articles & Whitepapers:

  • Gartner, Forrester Reports: Industry insights on data migration tools and trends.
  • Vendor Whitepapers: Specific guidance from major database vendors (Oracle, Microsoft, IBM) on migrating to/from their platforms.
  • Industry Blogs: Data management, ETL, cloud computing blogs for best practices and case studies.

Tools for Practice:

  • Databases: PostgreSQL, MySQL (free and widely used for practice).
  • SQL Client Tools: DBeaver, SQL Developer, pgAdmin.
  • ETL Tools (Community/Trial Editions): Talend Open Studio, Pentaho Data Integration (Kettle), Apache NiFi.
  • Data Profiling Tools: OpenRefine, Pandas (Python library).
  • Project Management Tools: Asana, Trello, Jira (for understanding concepts).
  • Spreadsheets: Microsoft Excel, Google Sheets (for field mapping, smaller datasets).

Communities:

  • Stack Overflow: For specific technical questions.
  • LinkedIn Groups: Data Management, Data Migration Professionals.
  • Local Meetups/User Groups: Database user groups, cloud computing meetups.

4. Milestones

Key checkpoints to track progress and reinforce learning throughout the 12-week program.

  • End of Week 2: Source Data Analysis Report: Submission of a detailed report on a sample source dataset, including data profiling results and identified quality issues.
  • End of Week 4: Field Mapping & Transformation Document: Completion of a comprehensive field-level mapping and transformation logic document for a given business scenario.
  • End of Week 6: Migration Strategy Proposal: Presentation of a chosen migration strategy and tool selection justification for a hypothetical project, including pros and cons.
  • End of Week 8: Rollback & Validation Plan: Draft of a detailed rollback plan and a set of data validation scripts for a specific migration phase.
  • End of Week 11: Project Communication Plan: Submission of a stakeholder communication and risk management plan for a data migration project.
  • End of Week 12: Capstone Data Migration Plan & Presentation: Final comprehensive data migration plan and a professional presentation to a mock stakeholder group.

5. Assessment Strategies

Various methods will be employed to assess understanding and practical application of the learned material.

  • Weekly Quizzes/Exercises: Short online quizzes or practical exercises at the end of each week to test understanding of core concepts and immediate application.
  • Practical Assignments: Hands-on tasks such as writing SQL queries for data profiling, designing schema mappings, or developing transformation rules using a tool or pseudo-code.
  • Case Study Analysis: Evaluation of the learner's ability to analyze complex migration scenarios, identify challenges, and propose appropriate solutions.
  • Document Review: Assessment of the clarity, completeness, and accuracy of submitted documentation (e.g., mapping documents, test plans, rollback procedures).
  • Capstone Project Evaluation: Comprehensive assessment of the final data migration plan, including its completeness, realism, risk mitigation, and the learner's ability to articulate and defend their strategy during a presentation.
  • Peer Review (Optional): Engagement in peer review of assignments and plans to foster collaborative learning and critical feedback.
  • Self-Assessment: Encouragement of regular self-reflection on learning progress and identification of areas requiring further study or practice.

This detailed study plan provides a robust framework for anyone committed to mastering the complexities of data migration planning. Consistent effort, hands-on practice, and engagement with the recommended resources will be key to achieving proficiency.

Gemini Output

This deliverable provides a comprehensive, detailed, and professional data migration plan structured as executable Python code. This code defines the necessary data structures (classes) to represent all aspects of a migration plan, including field mappings, transformation rules, validation scripts, rollback procedures, and timeline estimates. It also includes functionality to generate a human-readable report from the plan and convert it to a JSON format.

The generated code is designed for clarity, maintainability, and extensibility, serving as a robust foundation for your data migration project.


Data Migration Planner Code Deliverable

This Python script (data_migration_planner.py) encapsulates the entire data migration plan. It defines classes for each component of the plan and a main DataMigrationPlan class to orchestrate them. An example usage section at the end of the script demonstrates how to assemble a sample plan and render it as both a readable report and JSON.
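
A condensed sketch of the structures such a script might define; the names follow the description above, but the exact API shown here is an assumption, not the delivered code:

```python
from dataclasses import asdict, dataclass, field
import json

@dataclass
class FieldMapping:
    source_field: str
    target_field: str
    rule_id: str = "DIRECT"  # references a TransformationRule

@dataclass
class TransformationRule:
    rule_id: str
    description: str

@dataclass
class DataMigrationPlan:
    name: str
    mappings: list[FieldMapping] = field(default_factory=list)
    rules: list[TransformationRule] = field(default_factory=list)

    def to_json(self) -> str:
        return json.dumps(asdict(self), indent=2)

    def report(self) -> str:
        lines = [f"Migration Plan: {self.name}"]
        lines += [f"  {m.source_field} -> {m.target_field} [{m.rule_id}]"
                  for m in self.mappings]
        return "\n".join(lines)

plan = DataMigrationPlan("CRM Upgrade")
plan.mappings.append(FieldMapping("CUST_NAME", "Account.Name"))
print(plan.report())
print(plan.to_json())
```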

Gemini Output

Data Migration Planner: Comprehensive Migration Plan

Document Version: 1.0

Date: October 26, 2023

Prepared For: [Customer Name]

Prepared By: PantheraHive Solutions


1. Executive Summary

This document outlines a comprehensive plan for the data migration project from [Source System Name(s)] to [Target System Name(s)]. It details the strategic approach, defines key phases, and provides actionable steps for field mapping, data transformation, validation, and rollback procedures. The goal is to ensure a smooth, accurate, and secure transfer of data, minimizing downtime and mitigating risks, while maintaining data integrity and business continuity.

2. Project Overview

  • Project Name: [e.g., CRM System Upgrade Data Migration]
  • Source System(s): [e.g., Legacy CRM (SQL Server 2012), ERP System (Oracle 11g)]

* Key Data Entities: [e.g., Customers, Contacts, Orders, Products, Accounts]

  • Target System(s): [e.g., Salesforce Sales Cloud, SAP S/4HANA]

* Key Data Entities: [e.g., Accounts, Contacts, Opportunities, Products, Orders]

  • Migration Goal: To successfully migrate all relevant historical and active data from the specified source systems to the new target system, enabling full operational capability of the new platform by [Target Go-Live Date].
  • Migration Objectives:

* Achieve 100% data accuracy and completeness in the target system for in-scope data.

* Minimize business disruption and downtime during the migration window.

* Ensure data integrity and compliance with new system requirements.

* Provide a robust rollback strategy in case of unforeseen issues.

* Establish clear documentation for future reference and audits.

3. Data Migration Strategy

The proposed strategy for this migration is a Phased Migration approach, focusing on critical data entities first, followed by less critical or historical data. This allows for iterative testing, validation, and refinement of the migration process, reducing overall risk.

  • Phase 1: Critical Business Data: Migrate core operational data (e.g., active customers, open orders, current product catalog) to enable essential business functions in the new system.
  • Phase 2: Historical & Ancillary Data: Migrate historical records, archived data, and less frequently accessed information.
  • Cutover Strategy: A "Big Bang" cutover will be employed for each phase, where data will be frozen in the source system, migrated, validated, and then the target system will go live for that data subset. This minimizes data synchronization complexities.

4. Detailed Migration Plan Components

4.1. Field Mapping (Source to Target)

The field mapping process is critical for defining how data elements from the source system will correspond to the data elements in the target system. This will be documented in a comprehensive "Data Mapping Specification" document.

Process:

  1. Discovery & Analysis: Identify all relevant fields in both source and target systems.
  2. Initial Mapping: Propose direct mappings where fields align perfectly.
  3. Complex Mapping Identification: Identify fields requiring transformation, aggregation, splitting, or lookup.
  4. Business Review: Present proposed mappings to business stakeholders for verification and sign-off.
  5. Technical Review: Review mappings with development and architecture teams for feasibility and performance.

Example Mapping Structure (Extract from Data Mapping Specification):

| Source System | Source Field Name | Source Data Type | Source Description | Target System | Target Field Name | Target Data Type | Target Description | Mapping Logic / Transformation Rule ID | Notes |
| :------------ | :---------------- | :--------------- | :----------------- | :------------ | :---------------- | :--------------- | :----------------- | :------------------------------------- | :---- |
| Legacy CRM | CUSTOMER_ID | INT | Unique Customer ID | Salesforce | Account.ExternalId__c | Text(255) | External ID for Account | Direct Map (Type Conversion) | Used for upsert operations |
| Legacy CRM | CUST_NAME | VARCHAR(250) | Customer Full Name | Salesforce | Account.Name | Text(255) | Account Name | Direct Map | |
| Legacy CRM | CUST_ADDR1 | VARCHAR(100) | Customer Address Line 1 | Salesforce | Account.BillingStreet | Text(255) | Billing Street | Concatenate ADDR1, ADDR2, ADDR3 (TR-001) | |
| Legacy CRM | CUST_STATUS | CHAR(1) | A=Active, I=Inactive | Salesforce | Account.Status__c | Picklist | Account Status | Lookup (TR-002) | Map 'A' to 'Active', 'I' to 'Inactive' |
| Legacy CRM | LAST_MOD_DT | DATETIME | Last Modified Date | Salesforce | Account.LastModifiedDate | DateTime | Last Modified Date | Direct Map | |

4.2. Data Transformation Rules

Data transformation rules define how source data will be modified to fit the target system's structure, format, and business logic. Each rule will be detailed in the "Data Transformation Rules Specification."

Common Transformation Types:

  • Data Type Conversion: Changing a field's data type (e.g., INT to TEXT, DATETIME to DATE).
    Example: CUSTOMER_ID (INT) in Legacy CRM to Account.ExternalId__c (Text) in Salesforce.

  • Concatenation: Combining multiple source fields into a single target field.
    Example (TR-001): CUST_ADDR1, CUST_ADDR2, CUST_ADDR3 from Legacy CRM combined into Account.BillingStreet in Salesforce, separated by commas.

  • Splitting: Dividing a single source field into multiple target fields.
    Example: FULL_NAME from source split into FirstName and LastName in target.

  • Lookup / Mapping: Converting source values to target-specific values based on a predefined mapping table or logic.
    Example (TR-002): CUST_STATUS ('A'/'I') from Legacy CRM mapped to Account.Status__c ('Active'/'Inactive') in Salesforce.

  • Default Values: Assigning a default value to a target field if the source field is null or empty.
    Example: If SOURCE_REGION is null, set TARGET_REGION to 'Unknown'.

  • Aggregation / Calculation: Performing calculations or aggregations on source data.
    Example: Summing ORDER_LINE_ITEM_AMOUNT to populate TOTAL_ORDER_VALUE.

  • Data Cleansing: Removing invalid characters, trimming whitespace, standardizing formats (e.g., phone numbers).
    Example: Remove all non-numeric characters from the PHONE_NUMBER field.
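
A sketch of how TR-001 and TR-002 above might be implemented; the rule bodies follow the logic stated in the mapping table, while the function names and error handling are assumptions:

```python
STATUS_MAP = {"A": "Active", "I": "Inactive"}  # TR-002 lookup table

def tr_001_billing_street(addr1, addr2, addr3):
    """TR-001: concatenate address lines into Account.BillingStreet."""
    parts = [p.strip() for p in (addr1, addr2, addr3) if p and p.strip()]
    return ", ".join(parts)

def tr_002_account_status(cust_status):
    """TR-002: map CUST_STATUS codes to Account.Status__c values."""
    try:
        return STATUS_MAP[cust_status]
    except KeyError:
        # Unknown codes go to an exception report, not a silent default.
        raise ValueError(f"Unmapped CUST_STATUS code: {cust_status!r}")

assert tr_001_billing_street("12 High St", "Suite 4", None) == "12 High St, Suite 4"
assert tr_002_account_status("A") == "Active"
```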

Documentation Format (Extract from Data Transformation Rules Specification):

| Rule ID | Rule Name | Source Field(s) | Target Field | Transformation Logic |
| :------ | :-------- | :-------------- | :----------- | :------------------- |
| TR-001 | Address Concatenation | CUST_ADDR1, CUST_ADDR2, CUST_ADDR3 | Account.BillingStreet | Concatenate non-empty address lines, separated by commas |
| TR-002 | Status Lookup | CUST_STATUS | Account.Status__c | Map 'A' to 'Active' and 'I' to 'Inactive' |
