Data Migration Planner

This document presents the output of the "Data Migration Planner" workflow: a comprehensive plan for migrating data from "Test Source System" to "Test Target System", incorporating detailed field mapping, transformation rules, validation scripts, rollback procedures, and timeline estimates.


Data Migration Plan: Test Source System to Test Target System

Project ID: DMM-TST-001

Date: October 26, 2023

Prepared for: PantheraHive Operations Team

1. Executive Summary

This document outlines a comprehensive plan for the migration of data from the "Test Source System" to the "Test Target System". The objective is to ensure a secure, accurate, and efficient transfer of data, minimizing downtime and mitigating risks. This plan details the critical phases of the migration, including data profiling, field mapping, transformation logic, robust validation strategies, contingency rollback procedures, and a high-level timeline. Given the generic nature of the data_types and constraints inputs, this plan provides illustrative examples based on common enterprise data scenarios (e.g., Customer, Product, Order data) which would be refined upon receipt of actual system schemas and data profiles.

2. Project Overview

The migration will be judged successful against the following criteria:

* 100% data completeness (all in-scope records migrated).

* Data accuracy maintained as per defined transformation rules.

* Referential integrity preserved in the target system.

* Minimal downtime during the cutover period.

* Successful post-migration validation by business users.

3. Assumed Data Inventory & Scope

Based on the request for a comprehensive output, we assume the migration will involve common business entities. For this plan, we will illustrate with examples from Customer, Product, and Order data.

4. Field Mapping

This section details the mapping of fields from the Test Source System to the Test Target System. It includes data types and notes on required transformations.

Entity: Customer

| Source Field Name | Source Data Type | Source Constraints | Target Field Name | Target Data Type | Target Constraints | Mapping Notes / Transformation Required |

| :---------------- | :--------------- | :----------------- | :---------------- | :--------------- | :----------------- | :-------------------------------------- |

| CustID | INT | PK, NOT NULL | CustomerID | UUID | PK, NOT NULL | Generate UUID from Source CustID (if UUID required), else direct map. |

| FirstName | VARCHAR(50) | NOT NULL | FirstName | VARCHAR(100) | NOT NULL | Direct Map. Handle empty strings. |

| LastName | VARCHAR(50) | NOT NULL | LastName | VARCHAR(100) | NOT NULL | Direct Map. Handle empty strings. |

| Email | VARCHAR(100) | UNIQUE | EmailAddress | VARCHAR(150) | UNIQUE | Direct Map. Validate email format. |

| Phone | VARCHAR(20) | | PhoneNumber | VARCHAR(25) | | Cleanse (remove non-digits). Add country code prefix if missing. |

| AddressLine1 | VARCHAR(100) | | StreetAddress | VARCHAR(200) | | Concatenate AddressLine2 if it exists. |

| City | VARCHAR(50) | | City | VARCHAR(100) | | Direct Map. |

| State | VARCHAR(2) | | StateProvince | VARCHAR(50) | | Map abbreviations to full names (e.g., "CA" -> "California"). |

| ZipCode | VARCHAR(10) | | PostalCode | VARCHAR(10) | | Direct Map. Validate format (e.g., NNNNN or NNNNN-NNNN). |

| AccountStatus | CHAR(1) | | CustomerStatus | VARCHAR(20) | NOT NULL | Value mapping: 'A'->'Active', 'I'->'Inactive', 'P'->'Pending'. Default to 'Pending' if null. |

| CreatedDate | DATETIME | NOT NULL | CreationDate | TIMESTAMP | NOT NULL | Direct Map. Convert to UTC. |

Entity: Product

| Source Field Name | Source Data Type | Source Constraints | Target Field Name | Target Data Type | Target Constraints | Mapping Notes / Transformation Required |

| :---------------- | :--------------- | :----------------- | :---------------- | :--------------- | :----------------- | :-------------------------------------- |

| ProdID | INT | PK, NOT NULL | ProductID | UUID | PK, NOT NULL | Generate UUID from Source ProdID. |

| SKU | VARCHAR(20) | UNIQUE, NOT NULL | SKU | VARCHAR(30) | UNIQUE, NOT NULL | Direct Map. Trim whitespace. |

| ProdName | VARCHAR(100) | NOT NULL | ProductName | VARCHAR(200) | NOT NULL | Direct Map. Capitalize first letter. |

| Description | TEXT | | ProductDesc | TEXT | | Direct Map. Clean HTML tags if present. |

| Price | DECIMAL(10,2) | NOT NULL | UnitPrice | DECIMAL(12,4) | NOT NULL, >0 | Direct Map. Ensure positive value. |

| Category | VARCHAR(50) | | ProductCategory | VARCHAR(100) | | Value mapping: 'ELEC'->'Electronics', 'CLO'->'Clothing'. Default to 'Miscellaneous'. |

| IsActive | BIT | | IsActive | BOOLEAN | NOT NULL | Direct Map (0->false, 1->true). Default to true. |

Entity: Order

| Source Field Name | Source Data Type | Source Constraints | Target Field Name | Target Data Type | Target Constraints | Mapping Notes / Transformation Required |

| :---------------- | :--------------- | :----------------- | :---------------- | :--------------- | :----------------- | :-------------------------------------- |

| OrderID | INT | PK, NOT NULL | OrderID | UUID | PK, NOT NULL | Generate UUID from Source OrderID. |

| CustID_FK | INT | FK to Customers | CustomerID | UUID | FK to Customers | Map to new CustomerID UUID. |

| OrderDate | DATETIME | NOT NULL | OrderTimestamp | TIMESTAMP | NOT NULL | Direct Map. Convert to UTC. |

| TotalAmount | DECIMAL(12,2) | NOT NULL | OrderTotal | DECIMAL(14,4) | NOT NULL, >0 | Direct Map. Ensure positive value. |

| OrderStatus | VARCHAR(10) | | Status | VARCHAR(20) | NOT NULL | Value mapping: 'P'->'Pending', 'C'->'Completed', 'X'->'Cancelled'. Default to 'Pending'. |

| ShippingAddress | VARCHAR(200) | | ShippingAddress | VARCHAR(250) | | Direct Map. |

5. Transformation Rules

This section details the specific rules applied to data during the ETL (Extract, Transform, Load) process.

  1. Data Type Conversion:

* Source INT to Target UUID (Customer, Product, Order IDs): For primary keys, generate a new UUID in the target system. A lookup table will be created to map the source INT ID to the new target UUID for referential integrity.

* Source DATETIME to Target TIMESTAMP (Dates): Convert all date/time fields to a consistent YYYY-MM-DD HH:MM:SS.sss format and store them in UTC.

* Source BIT to Target BOOLEAN (Flags): Map 0 to FALSE and 1 to TRUE.

  2. Value Mapping / Lookups:

* AccountStatus (Customer):

* 'A' (Active) -> 'Active'

* 'I' (Inactive) -> 'Inactive'

* 'P' (Pending) -> 'Pending'

* Any other value or NULL -> 'Pending' (Default)

* State (Customer): Use a lookup table (e.g., State_Abbreviations_to_Full_Name) to convert 2-letter abbreviations to full state names. If no match, keep original or flag for manual review.

* Category (Product):

* 'ELEC' -> 'Electronics'

* 'CLO' -> 'Clothing'

* NULL or other -> 'Miscellaneous' (Default)

* OrderStatus (Order):

* 'P' -> 'Pending'

* 'C' -> 'Completed'

* 'X' -> 'Cancelled'

* NULL or other -> 'Pending' (Default)

  3. Data Cleansing & Standardization:

* Phone (Customer): Remove all non-numeric characters. If the result is a 10-digit national number (i.e., the country code is missing), prepend the default country code (e.g., '+1').

* Email (Customer): Convert to lowercase. Trim leading/trailing whitespace. Validate against a regex pattern for basic email format. Flag invalid emails.

* Text Fields (Description, ProductName): Trim leading/trailing whitespace. Remove common special characters or HTML tags if present. Capitalize first letter of ProductName.

* AddressLine1 & AddressLine2 (Customer): Concatenate AddressLine1 and AddressLine2 into StreetAddress, separated by a comma or space, ensuring no double separators.

  4. Derivation / Aggregation:

* No complex derivations or aggregations are defined at this stage based on the sample data. If line items were included for orders, TotalAmount might be derived by summing line item totals.

  5. Referential Integrity:

* After migrating parent entities (e.g., Customers, Products), populate the lookup table (SourceID -> TargetUUID).

* When migrating child entities (e.g., Orders), use the lookup table to replace the source foreign key (CustID_FK) with the target foreign key (CustomerID UUID).

  6. Default Value Assignment:

* Assign default values for target fields that are NOT NULL but might be null or missing in the source (e.g., CustomerStatus defaults to 'Pending', IsActive defaults to TRUE).
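As an illustration of how the rules above combine, the sketch below applies the UUID generation, status mapping, phone and email cleansing, and default-value rules to a single Customer record. The helper name, the UUID namespace, and the '+1' country code are assumptions for illustration, not part of the plan:

```python
import re
import uuid

# Fixed namespace so the same source ID always yields the same UUID;
# the namespace value itself is an illustrative assumption.
MIGRATION_NS = uuid.uuid5(uuid.NAMESPACE_DNS, "migration.example.com")

# Value mapping from rule 2; unmapped or NULL statuses default to 'Pending'.
STATUS_MAP = {"A": "Active", "I": "Inactive", "P": "Pending"}

EMAIL_RE = re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")

def transform_customer(src: dict, id_lookup: dict) -> dict:
    """Apply the illustrative Customer rules; only a subset of fields is shown."""
    # INT -> UUID: deterministic UUID, recorded in the lookup table
    # for later foreign-key remapping of child entities.
    new_id = str(uuid.uuid5(MIGRATION_NS, f"customer:{src['CustID']}"))
    id_lookup[src["CustID"]] = new_id

    # Phone cleansing: strip non-digits; assume a bare 10-digit number
    # is a national number missing its country code.
    digits = re.sub(r"\D", "", src.get("Phone") or "")
    phone = f"+1{digits}" if len(digits) == 10 else (digits or None)

    # Email: lowercase, trim, basic format check.
    email = (src.get("Email") or "").strip().lower() or None
    if email and not EMAIL_RE.fullmatch(email):
        email = None  # in a real run this row would go to the error log

    return {
        "CustomerID": new_id,
        "EmailAddress": email,
        "PhoneNumber": phone,
        "CustomerStatus": STATUS_MAP.get(src.get("AccountStatus"), "Pending"),
    }
```

Because the UUIDs are derived deterministically from the source keys, re-running the transformation is idempotent, which simplifies retries after a partial failure.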

6. Validation Scripts

Validation is critical at multiple stages: pre-migration (source data quality), during migration (ETL process), and post-migration (target data quality and completeness).

**A. Pre-Migration Validation (Source System)**

*   **Objective:** Assess source data quality and surface issues before extraction begins.
*   **Scripts/Checks:**
    *   **Primary Key Uniqueness:** Confirm primary key values (e.g., `CustID`, `ProdID`, `OrderID`) are unique and non-null in each source table.
    *   **Email Format:** Validate `Email` values against a basic format pattern and flag invalid entries.
    *   **Referential Integrity:** Check for orphaned child records (e.g., orders referencing non-existent customers).
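A primary key uniqueness check of this kind might be sketched as follows; SQLite stands in for the source database here, and the table and column names are illustrative:

```python
import sqlite3

def find_duplicate_keys(conn, table: str, key: str) -> list:
    """Return key values that appear more than once (the result should be empty)."""
    sql = f"SELECT {key}, COUNT(*) FROM {table} GROUP BY {key} HAVING COUNT(*) > 1"
    return [row[0] for row in conn.execute(sql)]

# Illustrative setup with an intentional duplicate.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (CustID INT, FirstName TEXT)")
conn.executemany("INSERT INTO Customers VALUES (?, ?)",
                 [(1, "Ada"), (2, "Ben"), (2, "Ben duplicate")])
print(find_duplicate_keys(conn, "Customers", "CustID"))  # [2]
```

Any keys reported here would be resolved (or the rows quarantined) before extraction starts.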
**B. During Migration Validation (ETL Process)**

*   **Objective:** Monitor the ETL process for errors, data type mismatches, and transformation failures.
*   **Scripts/Checks (Integrated into ETL tool/scripts):**
    *   **Row Count Comparison:** Compare the number of extracted rows from source to the number of loaded rows in target for each entity.
        *   `Source Count (Customers) = SELECT COUNT(*) FROM SourceDB.Customers;`
        *   `Target Count (Customers) = SELECT COUNT(*) FROM TargetDB.Customers;`
    *   **Data Type Conformity:** Log errors for any data that fails to convert to the target data type.
    *   **Rejected Records:** Capture and report records that fail transformation rules (e.g., invalid email, negative price) into a dedicated error log table.
    *   **Transformation Logic Verification:** Sample data checks to ensure transformations (e.g., status mapping, date conversion) are applied correctly.

**C. Post-Migration Validation (Target System)**

*   **Objective:** Verify the completeness, accuracy, and integrity of the migrated data in the target system.
*   **Scripts/Checks:**
    *   **Target Primary Key Uniqueness:** Verify primary keys are unique and non-null in each migrated target table.
    *   **Row Count Reconciliation:** Re-compare source and target record counts once the load completes.
    *   **Referential Integrity:** Confirm every foreign key in child tables resolves to an existing parent row.
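A post-migration count reconciliation of this kind might be sketched as follows; SQLite stands in for both databases, and the table names are illustrative:

```python
import sqlite3

def reconcile_counts(src_conn, trg_conn, table_pairs):
    """Return {target_table: (source_count, target_count, match)} per pair."""
    report = {}
    for src_table, trg_table in table_pairs:
        src_n = src_conn.execute(f"SELECT COUNT(*) FROM {src_table}").fetchone()[0]
        trg_n = trg_conn.execute(f"SELECT COUNT(*) FROM {trg_table}").fetchone()[0]
        report[trg_table] = (src_n, trg_n, src_n == trg_n)
    return report

# Illustrative in-memory databases standing in for source and target.
src = sqlite3.connect(":memory:")
trg = sqlite3.connect(":memory:")
src.execute("CREATE TABLE Customers (CustID INT)")
trg.execute("CREATE TABLE Customers (CustomerID TEXT)")
src.executemany("INSERT INTO Customers VALUES (?)", [(1,), (2,), (3,)])
trg.executemany("INSERT INTO Customers VALUES (?)", [("a",), ("b",), ("c",)])
print(reconcile_counts(src, trg, [("Customers", "Customers")]))
```

A mismatch for any pair would trigger investigation of the rejected-records log before sign-off.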

7. Rollback Procedures

A robust rollback plan is essential to mitigate risks and ensure business continuity in case of critical failure during or immediately after migration.

  1. Pre-Migration Full Backup:

* Source System: Perform a full database backup of the Test Source System immediately before starting the migration. This ensures the source system can be fully restored to its pre-migration state.

* Target System: Perform a full database backup of the Test Target System (if it contains existing data) or a snapshot of the empty database structure.

  2. Migration Transaction Management:

* Where possible, wrap loading operations in transactions. If an error occurs during a batch load, the entire batch can be rolled back.

  3. Data Deletion/Truncation in Target:

* Procedure: If a rollback is initiated, truncate all tables in the Test Target System that were populated during the migration. This ensures no partial or corrupted data remains.

* Order: Truncate child tables first (e.g., Orders, OrderLineItems), then parent tables (e.g., Customers, Products) to avoid foreign key violations.

  4. Restore Target System (if applicable):

* If the target system had existing data before migration, restore it from the pre-migration backup.

* If the target system was empty, simply truncating the tables effectively rolls it back to an empty state.

  5. Source System Restoration (Extreme Case):

* In the event of severe data corruption or loss in the source system (highly unlikely with read-only extraction), restore the source system from its pre-migration backup.

  6. Communication Plan:

* Clearly define who declares a rollback, who executes it, and how stakeholders are notified.

* Establish clear criteria for triggering a rollback (e.g., >5% data accuracy errors, critical business function failure).

  7. Post-Rollback Actions:

* Analyze the root cause of the failure.

* Rectify issues in the ETL process, data quality, or infrastructure.

* Re-plan and re-execute the migration.
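The child-first truncation described above can be sketched as a single transactional routine. SQLite stands in for the target database (it has no `TRUNCATE`, so `DELETE` is used), and the table ordering is an illustrative assumption that would be derived from the actual target schema:

```python
import sqlite3

# Child tables listed before their parents so deletes never violate
# foreign keys; this ordering is illustrative.
TRUNCATION_ORDER = ["Orders", "Products", "Customers"]

def rollback_target(conn):
    """Remove migrated rows child-first inside one transaction."""
    with conn:  # commits on success, rolls back automatically on error
        for table in TRUNCATION_ORDER:
            conn.execute(f"DELETE FROM {table}")

# Illustrative demonstration against an in-memory target.
conn = sqlite3.connect(":memory:")
for table in TRUNCATION_ORDER:
    conn.execute(f"CREATE TABLE {table} (id INTEGER)")
    conn.execute(f"INSERT INTO {table} VALUES (1)")
rollback_target(conn)
print([conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
       for t in TRUNCATION_ORDER])  # [0, 0, 0]
```

Wrapping the deletes in one transaction means a failed rollback attempt leaves the target unchanged rather than half-emptied.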

8. Timeline Estimates

This timeline is a high-level estimate. Actual durations will depend on data volume, complexity, resource availability, and the number of iterations required for testing and validation.

Phase 1: Planning & Analysis (Estimated: 2-3 Weeks)

  • Detailed Data Profiling & Assessment: Analyze source data quality, identify anomalies, confirm data types, volumes, and constraints. (1 week)
  • Finalize Field Mapping: Refine mapping based on profiling, confirm target schema. (0.5 week)
  • Define Transformation Rules: Document all transformation logic in detail. (0.5 week)
  • Develop Validation Strategy & Rollback Plan: Finalize scripts and procedures. (0.5 week)
  • Environment Setup: Prepare development, test, and production environments. (0.5 week)

Phase 2: Development & ETL Implementation (Estimated: 4-6 Weeks)

  • ETL Script/Tool Development: Implement extraction, transformation, and loading logic. (3-4 weeks)
  • Data Cleansing Routines: Develop scripts/processes for data quality improvement. (0.5-1 week)
  • Lookup Table Generation: Create and populate lookup tables for ID mapping, value conversions. (0.5 week)
  • Validation Script Development: Code pre- and post-migration validation checks. (0.5-1 week)

Phase 3: Testing (Estimated: 4-5 Weeks)

  • Unit Testing: Test individual ETL components and transformation rules with sample data. (1 week)
  • Integration Testing: Test the end-to-end migration process with a small subset of realistic data. (1 week)
  • Performance Testing: Evaluate ETL process performance with representative data volumes. (0.5 week)
  • User Acceptance Testing (UAT): Business users validate migrated data in a test environment. Includes functional testing of target system with migrated data. (1.5-2 weeks)
  • Regression Testing: Ensure existing target system functionalities are not impacted. (0.5 week)

Phase 4: Execution (Estimated: 1-2 Weeks)

  • Pilot Migration (Optional but Recommended): Migrate a small, non-critical subset of data to the production target environment. (0.5-1 week)
  • Final Data Freeze & Pre-Migration Backups: Lock source system for changes and perform final backups. (1-2 days)
  • Full Production Migration: Execute the complete ETL process. (1-3 days, depending on volume and complexity)
  • Post-Migration Validation: Execute all post-migration validation scripts. (1-2 days)
  • Business Sign-off: Formal acceptance of migrated data by business stakeholders. (1 day)

Phase 5: Post-Migration Support & Monitoring (Estimated: 2-4 Weeks)

  • Hypercare Period: Intensive monitoring of target system and business operations. (2-4 weeks)
  • Issue Resolution: Address any data-related issues or discrepancies.
  • Documentation & Knowledge Transfer: Finalize documentation and conduct training.

Total Estimated Duration: 13-20 Weeks

9. Resource Requirements

  • Personnel:

* Project Manager

* Data Architects / ETL Developers

* Database Administrators (DBAs)

* Business Analysts (for requirements, mapping, UAT)

* Quality Assurance / Testers

* System Administrators (for environment setup)

  • Tools:

* ETL Tool (e.g., SSIS, Talend, Informatica, custom scripts in Python/SQL)

* Database Management Tools

* Version Control System (e.g., Git)

* Project Management Software

* Data Profiling Tools

  • Infrastructure:

* Dedicated servers/VMs for ETL processing

* Sufficient storage for temporary data, backups, and logs

* Network connectivity between source and target systems

10. Risk Assessment & Mitigation

| Risk | Mitigation Strategy |

| :------------------------------------- | :---------------------------------------------------------------------------------------------------------------------------------------------------------------------- |

| Data Quality Issues | Thorough data profiling, pre-migration cleansing, robust transformation rules, comprehensive validation scripts, error logging and reporting. |

| Data Loss or Corruption | Full pre-migration backups of both source and target, transactional loading, detailed rollback procedures, comprehensive validation. |

| System Downtime Exceeds Window | Accurate volume estimation, performance testing of ETL, optimize ETL process, phased migration, clear communication channels. |

| Performance Bottlenecks | Early performance testing, optimized database queries, scalable ETL infrastructure, incremental loading strategies if applicable. |

| Referential Integrity Violations | Careful sequencing of data loads (parent before child), robust foreign key mapping logic (lookup tables), and post-migration referential integrity checks. |

| Inaccurate Data Transformation | Detailed mapping documentation, unit testing of transformation logic, UAT with business users, sample data comparison. |

| Security & Compliance Violations | Adherence to data privacy regulations (e.g., GDPR, CCPA), secure data transfer protocols, access control to migration environments, anonymization/tokenization where required. |

| Lack of Business User Acceptance | Early and continuous involvement of business stakeholders in planning, mapping, and UAT phases. Clear communication of changes. |

| Unexpected Source System Changes | Regular communication with source system owners, change management process, flexible ETL design. |

11. Next Steps

  1. Review and Feedback: This plan requires review by all relevant stakeholders (Business, IT, Operations).
  2. Detailed Data Profiling: Initiate in-depth data profiling of the Test Source System to gather actual schemas, data types, volumes, and quality metrics.
  3. Refine Field Mapping and Transformation Rules: Update sections 4 and 5 with precise details based on profiling results and specific business requirements.
  4. Resource Allocation: Confirm availability of required personnel and infrastructure.
  5. Tool Selection: Finalize ETL tools and technologies.
  6. Detailed Project Schedule: Develop a granular project schedule with specific tasks and dependencies.

This comprehensive plan provides a solid foundation for the data migration project. It is designed to be adaptable and will be refined with more specific details as the project progresses and actual data characteristics become fully known.

Step 2: projectmanager

Data Migration Planner: Comprehensive Plan

1. Project Overview & Scope

Project Title: Data Migration from Test Source System to Test Target System

Date: October 26, 2023

Prepared By: PantheraHive AI Assistant

Source System: Test Source System

Target System: Test Target System

Objective: To successfully migrate selected data from the Test Source System to the Test Target System, ensuring data integrity, accuracy, and minimal downtime. This plan outlines the comprehensive strategy, including field mapping, transformation rules, validation procedures, rollback mechanisms, and timeline estimates.

Scope Inclusions:

  • Migration of core business entities (e.g., Customers, Products, Orders, Invoices).
  • Field-level mapping and data type alignment.
  • Data cleansing and transformation as per business rules.
  • Validation of data integrity and completeness in the target system.
  • Development of robust rollback procedures.
  • Performance testing of the migration process.

Scope Exclusions:

  • Migration of historical archive data not actively used.
  • Migration of system configuration settings (to be manually configured).
  • Development of new features in the target system.

2. Data Migration Scope: Entities and Data Types

Based on the request for comprehensive output, we will assume a typical business scenario involving customer, product, and order data for detailed planning.

Key Entities Identified for Migration:

  • Customer: Customer details, contact information.
  • Product: Product catalog, pricing.
  • Order: Order headers, line items.
  • Invoice: Invoice details, payment status.

Representative Data Types in Scope (Source & Target):

| Data Type Category | Source System Examples | Target System Examples | Description |

| :----------------- | :--------------------- | :--------------------- | :---------- |

| Strings | VARCHAR(255), TEXT | NVARCHAR(4000), TEXT | Names, addresses, descriptions, codes |

| Integers | INT, BIGINT | INT, BIGINT | IDs, quantities, counts |

| Decimals/Money | DECIMAL(18,2), MONEY | NUMERIC(19,4), DECIMAL(18,2) | Prices, amounts, rates |

| Dates/Times | DATETIME, DATE | DATETIME2, DATE | Creation dates, modification timestamps, order dates |

| Booleans | BIT, TINYINT | BIT, BOOLEAN | Flags (e.g., IsActive, IsPaid) |

| Unique IDs | GUID, VARCHAR(36) | UNIQUEIDENTIFIER, VARCHAR(36) | Globally unique identifiers |

3. Field Mapping

This section details the mapping of fields from the Test Source System to the Test Target System, including any necessary data type adjustments or specific notes.

Table: Customer

| Source Table.Field (Data Type) | Target Table.Field (Data Type) | Mapping Type | Notes/Transformation Required |

| :----------------------------- | :----------------------------- | :----------- | :---------------------------- |

| SRC_Customers.CustomerID (INT) | TRG_Customers.CustomerID (BIGINT) | Direct | PK. Load source IDs explicitly (e.g., with identity insert enabled) so foreign key references remain stable; restore auto-increment behaviour after migration. |

| SRC_Customers.FirstName (VARCHAR(50)) | TRG_Customers.FirstName (NVARCHAR(100)) | Direct | Target allows more characters. |

| SRC_Customers.LastName (VARCHAR(50)) | TRG_Customers.LastName (NVARCHAR(100)) | Direct | Target allows more characters. |

| SRC_Customers.EmailAddress (VARCHAR(100)) | TRG_Customers.Email (NVARCHAR(255)) | Direct | Target allows more characters. Unique constraint in Target. |

| SRC_Customers.Phone (VARCHAR(20)) | TRG_Customers.PhoneNumber (NVARCHAR(50)) | Direct | Target allows more characters. |

| SRC_Customers.Address1 (VARCHAR(100)) | TRG_Customers.StreetAddress (NVARCHAR(200)) | Direct | |

| SRC_Customers.City (VARCHAR(50)) | TRG_Customers.City (NVARCHAR(100)) | Direct | |

| SRC_Customers.StateCode (VARCHAR(2)) | TRG_Customers.State (NVARCHAR(2)) | Direct | Standardize to 2-letter codes. |

| SRC_Customers.Zip (VARCHAR(10)) | TRG_Customers.PostalCode (NVARCHAR(10)) | Direct | |

| SRC_Customers.JoinDate (DATETIME) | TRG_Customers.MemberSince (DATETIME2) | Direct | Precision difference, no functional impact. |

| SRC_Customers.IsActive (BIT) | TRG_Customers.AccountStatusID (INT) | Lookup | Map 1 to 1 (Active), 0 to 2 (Inactive). |

| SRC_Customers.CustomerType (VARCHAR(10)) | TRG_Customers.CustomerTypeID (INT) | Lookup | Map RES to 1 (Residential), BUS to 2 (Business). |

Table: Product

| Source Table.Field (Data Type) | Target Table.Field (Data Type) | Mapping Type | Notes/Transformation Required |

| :----------------------------- | :----------------------------- | :----------- | :---------------------------- |

| SRC_Products.ProductID (INT) | TRG_Products.ProductID (BIGINT) | Direct | PK. Load source IDs explicitly (e.g., with identity insert enabled) to preserve references; restore auto-increment behaviour after migration. |

| SRC_Products.SKU (VARCHAR(20)) | TRG_Products.SKU (NVARCHAR(50)) | Direct | Unique constraint in Target. |

| SRC_Products.ProductName (VARCHAR(100)) | TRG_Products.Name (NVARCHAR(255)) | Direct | |

| SRC_Products.Description (TEXT) | TRG_Products.Description (TEXT) | Direct | |

| SRC_Products.UnitPrice (MONEY) | TRG_Products.Price (NUMERIC(19,4)) | Direct | Data type precision change. |

| SRC_Products.Category (VARCHAR(50)) | TRG_Products.CategoryID (INT) | Lookup | Map categories to TRG_Categories table IDs. |

| SRC_Products.IsAvailable (TINYINT) | TRG_Products.IsActive (BIT) | Direct | Map 1 to 1, 0 to 0. |

Table: Order

| Source Table.Field (Data Type) | Target Table.Field (Data Type) | Mapping Type | Notes/Transformation Required |

| :----------------------------- | :----------------------------- | :----------- | :---------------------------- |

| SRC_Orders.OrderID (INT) | TRG_Orders.OrderID (BIGINT) | Direct | PK. Load source IDs explicitly (e.g., with identity insert enabled) to preserve references; restore auto-increment behaviour after migration. |

| SRC_Orders.CustomerID (INT) | TRG_Orders.CustomerID (BIGINT) | Direct | FK to TRG_Customers.CustomerID. |

| SRC_Orders.OrderDate (DATETIME) | TRG_Orders.OrderDate (DATETIME2) | Direct | |

| SRC_Orders.OrderStatus (VARCHAR(10)) | TRG_Orders.OrderStatusID (INT) | Lookup | Map PEND to 1 (Pending), COMP to 2 (Completed), CANC to 3 (Cancelled). |

| SRC_Orders.TotalAmount (MONEY) | TRG_Orders.TotalAmount (NUMERIC(19,4)) | Direct | Data type precision change. |

| SRC_Orders.ShippingAddress (VARCHAR(200)) | TRG_Orders.ShippingAddress (NVARCHAR(400)) | Direct | |

4. Data Transformation Rules

This section outlines the specific rules for transforming data during migration, addressing differences in data types, formats, and business logic.

| Transformation ID | Source Field(s) | Target Field | Rule Description | Example (Source -> Target) |

| :---------------- | :-------------- | :----------- | :--------------- | :------------------------- |

| TRN-CUST-001 | SRC_Customers.IsActive | TRG_Customers.AccountStatusID | Map Boolean (BIT) to Integer ID. If IsActive = 1 then AccountStatusID = 1 (Active), else AccountStatusID = 2 (Inactive). | 1 -> 1, 0 -> 2 |

| TRN-CUST-002 | SRC_Customers.CustomerType | TRG_Customers.CustomerTypeID | Lookup mapping for customer types. If CustomerType = 'RES' then CustomerTypeID = 1 (Residential), if CustomerType = 'BUS' then CustomerTypeID = 2 (Business). Default to 3 (Other) if unknown. | 'RES' -> 1, 'UNK' -> 3 |

| TRN-CUST-003 | SRC_Customers.EmailAddress | TRG_Customers.Email | Standardize email addresses to lowercase. Handle NULLs by setting to NULL in target. | 'Test@Example.com' -> 'test@example.com' |

| TRN-PROD-001 | SRC_Products.Category | TRG_Products.CategoryID | Lookup mapping for product categories. Map source category string to corresponding ID in TRG_Categories lookup table. If category not found, default to 99 (Uncategorized). | 'Electronics' -> 101, 'Books' -> 102 |

| TRN-PROD-002 | SRC_Products.UnitPrice | TRG_Products.Price | Convert MONEY to NUMERIC(19,4). Ensure correct precision and rounding (round half-even). | 19.995 -> 19.9950 |

| TRN-ORDER-001 | SRC_Orders.OrderStatus | TRG_Orders.OrderStatusID | Lookup mapping for order status. If OrderStatus = 'PEND' then OrderStatusID = 1 (Pending), OrderStatus = 'COMP' then OrderStatusID = 2 (Completed), OrderStatus = 'CANC' then OrderStatusID = 3 (Cancelled). Default to 4 (Unknown) if not matched. | 'PEND' -> 1, 'INV' -> 4 |

| TRN-GEN-001 | All VARCHAR fields | All NVARCHAR fields | Trim leading/trailing whitespace. Handle empty strings by converting to NULL where applicable (e.g., optional address lines). | ' Value ' -> 'Value', '' -> NULL (if nullable) |

| TRN-GEN-002 | All Date/Time fields | All Date/Time fields | Ensure UTC conversion if source dates are local time and target requires UTC. (Assumption: Both systems use local time for this plan, no conversion needed unless specified). | (No conversion) |
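TRN-PROD-002 (half-even rounding to four decimal places) and TRN-GEN-001 (trim and empty-to-NULL) can be expressed directly with Python's `decimal` module and string handling; the helper names below are illustrative:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def to_numeric_19_4(amount) -> Decimal:
    """TRN-PROD-002: MONEY -> NUMERIC(19,4) using banker's (half-even) rounding."""
    return Decimal(str(amount)).quantize(Decimal("0.0001"), rounding=ROUND_HALF_EVEN)

def clean_varchar(value):
    """TRN-GEN-001: trim whitespace; empty strings become NULL (None) when nullable."""
    v = (value or "").strip()
    return v or None

print(to_numeric_19_4("19.995"))   # 19.9950
print(to_numeric_19_4("1.00005"))  # 1.0000 (ties round to the even digit)
print(clean_varchar("  Value  "))  # Value
```

Passing the source value through `Decimal(str(...))` rather than `float` avoids binary floating-point error on money amounts, which is why the rule specifies exact precision.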

5. Data Validation Scripts

Robust validation is critical to ensure the migrated data is accurate, complete, and consistent. This section outlines the validation scripts and procedures.

5.1. Pre-Migration Data Quality Checks (Source System)

These scripts identify potential issues in the source data before migration, allowing for cleansing or issue resolution.

| Validation Script Name | Description | Expected Outcome/Metric | Responsibility |

| :--------------------- | :---------- | :---------------------- | :------------- |

| SRC_Customer_PK_Unique | Check for duplicate CustomerID in SRC_Customers. | 0 duplicates | Data Steward, Developer |

| SRC_Customer_Email_Format | Validate email format in SRC_Customers.EmailAddress. | >95% valid emails | Data Steward |

| SRC_Customer_FK_Integrity | Check for orphaned orders (orders with non-existent CustomerID). | 0 orphans | Developer |

| SRC_Product_SKU_Unique | Check for duplicate SKU in SRC_Products. | 0 duplicates | Data Steward, Developer |

| SRC_Order_Total_Consistency | Sum of OrderLineItem amounts matches OrderHeader.TotalAmount for SRC_Orders. | >98% consistency | Developer |

| SRC_Data_Type_Compliance | Verify source data fits expected types (e.g., numeric fields contain only numbers). | 0 type mismatches | Developer |

5.2. Post-Migration Data Validation (Target System)

These scripts verify the migrated data in the target system against the source and expected business rules.

| Validation Phase | Validation Script Name | Description | Expected Outcome/Metric | Responsibility |

| :--------------- | :--------------------- | :---------- | :---------------------- | :------------- |

| Phase 1: Record Counts | TRG_Count_Customers | Compare record count of TRG_Customers with SRC_Customers. | TRG_Customers count = SRC_Customers count | Developer, QA |

| | TRG_Count_Products | Compare record count of TRG_Products with SRC_Products. | TRG_Products count = SRC_Products count | Developer, QA |

| | TRG_Count_Orders | Compare record count of TRG_Orders with SRC_Orders. | TRG_Orders count = SRC_Orders count | Developer, QA |

| Phase 2: Data Integrity | TRG_Customer_PK_Unique | Verify CustomerID is unique and not null in TRG_Customers. | 0 duplicates, 0 nulls | QA |

| | TRG_Customer_FK_Integrity | Verify CustomerID in TRG_Orders references existing TRG_Customers. | 0 FK violations | QA |

| | TRG_Product_SKU_Unique | Verify SKU is unique and not null in TRG_Products. | 0 duplicates, 0 nulls | QA |

| | TRG_Order_Status_Lookup | Verify OrderStatusID in TRG_Orders exists in TRG_OrderStatusLookup. | 0 FK violations | QA |

| Phase 3: Data Accuracy & Transformation | TRG_Sample_Data_Verification | Manual/scripted comparison of 5% random sample of records across all entities for field-level accuracy. | 100% match for selected fields | QA, Business User |

| | TRG_Customer_Status_Mapping | Verify TRG_Customers.AccountStatusID correctly reflects SRC_Customers.IsActive. (e.g., SELECT COUNT(*) WHERE TRG.AccountStatusID=1 AND SRC.IsActive=0 should be 0) | 0 mismatches | QA |

| | TRG_Product_Price_Precision | Verify TRG_Products.Price maintains correct precision as per TRN-PROD-002. | Data matches within defined precision | QA |

| Phase 4: Business Logic | TRG_Reporting_Validation | Run key reports in Target System and compare aggregates/totals with Source System reports. | Totals match exactly, or within a pre-agreed variance threshold | Business User, QA |

| | TRG_Application_Smoke_Test | Perform critical business transactions (e.g., create new customer, place order) in the target application. | Application functions correctly | Business User, QA |
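A referential-integrity check such as `TRG_Customer_FK_Integrity` might be sketched as an anti-join; SQLite stands in for the target database, and the table and column names follow the mapping tables:

```python
import sqlite3

def orphaned_rows(conn, child, fk, parent, pk) -> int:
    """Count child rows whose foreign key has no matching parent row."""
    sql = (f"SELECT COUNT(*) FROM {child} c "
           f"LEFT JOIN {parent} p ON c.{fk} = p.{pk} "
           f"WHERE p.{pk} IS NULL")
    return conn.execute(sql).fetchone()[0]

# Illustrative setup: one valid order, one orphaned order.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE TRG_Customers (CustomerID INTEGER)")
conn.execute("CREATE TABLE TRG_Orders (OrderID INTEGER, CustomerID INTEGER)")
conn.execute("INSERT INTO TRG_Customers VALUES (1)")
conn.executemany("INSERT INTO TRG_Orders VALUES (?, ?)", [(10, 1), (11, 99)])
print(orphaned_rows(conn, "TRG_Orders", "CustomerID",
                    "TRG_Customers", "CustomerID"))  # 1
```

The expected outcome for every FK check in the table above is a count of zero; any positive count blocks sign-off.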

6. Rollback Procedures

A comprehensive rollback plan is essential to mitigate risks in case of migration failure or critical issues detected post-migration.

| Rollback Step | Description | Responsible Team | Estimated Time | Trigger Condition |
| :------------ | :---------- | :--------------- | :------------- | :---------------- |
| RLB-001: Halt Migration Process | Immediately stop all ongoing migration scripts and ETL jobs. | Migration Team | 15 minutes | Any critical error during execution, major data integrity failure. |
| RLB-002: Isolate Target System | Disconnect the target system from any external interfaces or user access to prevent further data changes. | IT Operations | 30 minutes | RLB-001 completed. |
| RLB-003: Restore Target Database | Restore the target database from the pre-migration backup, returning the target system to its exact state before the migration attempt. | DBA Team | 2-4 hours (depending on DB size) | Decision to roll back (e.g., failed validation, critical business impact). |
| RLB-004: Validate Target Restoration | Verify the target database has been successfully restored to its pre-migration state (e.g., check table counts, sample data). | DBA Team, QA | 1 hour | RLB-003 completed. |
| RLB-005: Re-enable Target System Access | Reconnect target system interfaces and restore user access (if applicable and safe to do so for continued operations). | IT Operations | 30 minutes | RLB-004 successful. |
| RLB-006: Analyze Failure & Rerun | Conduct a root cause analysis of the migration failure, apply fixes to scripts/data, and plan a re-execution of the migration. | Migration Team, Developers | 1-3 days | RLB-004 successful. |
| RLB-007: Source System Cleanup (if applicable) | If the migration process modified the source system (e.g., marking records as migrated), revert these changes. Less common for this type of migration. | Developer, DBA Team | 1 hour | Only if the source system was modified. |

Rollback Strategy: The primary approach is a full database restore of the target system from a point-in-time backup taken immediately before migration execution; this is the most reliable way to return to a clean slate.
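The restoration check in RLB-004 can be sketched as a row-count comparison against a snapshot taken just before migration. This is a minimal illustration using an in-memory SQLite database as a stand-in for the target; a real restore verification would also compare checksums and sample records, and the `Customers` table is hypothetical.

```python
import sqlite3

def snapshot_row_counts(conn):
    """Record per-table row counts, e.g. just before migration starts."""
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    # Table names come from the catalog itself, so interpolation is safe here.
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in tables}

def verify_restoration(conn, baseline):
    """RLB-004 style check: counts after restore must equal the baseline."""
    current = snapshot_row_counts(conn)
    return {t: (baseline.get(t), current.get(t))
            for t in set(baseline) | set(current)
            if baseline.get(t) != current.get(t)}  # empty dict = clean restore

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE Customers (Id INTEGER)")
db.executemany("INSERT INTO Customers VALUES (?)", [(i,) for i in range(3)])
baseline = snapshot_row_counts(db)
print(verify_restoration(db, baseline))  # {} -> restoration verified
```

Any non-empty result would be escalated to the DBA Team before RLB-005 re-enables access.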

7. Timeline Estimates

This section provides a phased timeline for the data migration project, including key activities and estimated durations.

Project Start Date (Relative): Day 0 (e.g., Kick-off)

| Phase | Key Activities | Estimated Duration | Start Date (Relative) | End Date (Relative) | Dependencies |
| :---- | :------------- | :----------------- | :-------------------- | :------------------ | :----------- |
| Phase 1: Planning & Analysis | Define scope, gather requirements, document source/target schemas, initial field mapping, resource allocation. | 2 weeks | Day 0 | Day 10 | None |
| Phase 2: Design & Development | Finalize field mapping & transformation rules, design ETL architecture, develop ETL scripts, develop validation scripts. | 4 weeks | Day 11 | Day 30 | Phase 1 completion |
| Phase 3: Testing & Refinement | Unit testing of ETL scripts, data quality testing (pre-migration), integration testing, performance testing, UAT with business users, refine scripts based on feedback. | 5 weeks | Day 31 | Day 55 | Phase 2 completion, test environment setup |
| Phase 4: Pre-Migration Activities | Target system setup, production readiness checks, full database backup of Target, source data freeze communication. | 1 week | Day 56 | Day 60 | Phase 3 completion, production environment ready |
| Phase 5: Migration Execution | Execute full data migration, real-time monitoring, initial post-migration validation checks. | 2-3 days | Day 61 | Day 63 | Phase 4 completion, minimal system downtime window |
| Phase 6: Post-Migration & Go-Live | Comprehensive post-migration validation, business user acceptance, cutover to target system, decommissioning of source (if applicable). | 1 week | Day 64 | Day 68 | Phase 5 completion |
| Phase 7: Hypercare & Monitoring | Monitor target system performance and data integrity, address any immediate post-go-live issues. | 2 weeks | Day 69 | Day 78 | Phase 6 completion |

Total Estimated Project Duration: Approximately 14-16 Weeks

8. Resources and Responsibilities

| Role/Team | Key Responsibilities |
| :-------- | :------------------- |
| Project Manager | Overall project oversight, stakeholder communication, risk management, timeline adherence. |
| Data Architects | Define data models, ensure data integrity, approve mapping and transformation rules. |
| ETL Developers | Design, develop, and test ETL scripts; implement transformation and validation logic. |
| DBA Team | Database setup, performance tuning, backup/restore procedures, data security. |
| QA Team | Develop and execute test plans, validate data accuracy and completeness, manage UAT. |
| Business Analysts | Gather and clarify business requirements, assist with field mapping and UAT. |
| Data Stewards | Source data quality assessment, data cleansing, data ownership. |
| IT Operations | Infrastructure provisioning, system monitoring, deployment support. |
| Business Users | Provide business context, perform UAT, sign off on migrated data. |

9. Risks and Mitigation

| Risk ID | Risk Description | Mitigation Strategy | Severity (H/M/L) | Likelihood (H/M/L) |
| :------ | :--------------- | :------------------ | :--------------- | :----------------- |
| R-DM-001 | Data quality issues in source system impact target data integrity. | Comprehensive pre-migration data profiling and cleansing; strict validation rules; business sign-off on data quality reports. | H | M |
| R-DM-002 | Performance issues during migration lead to extended downtime. | Performance testing with representative data volumes; optimize ETL scripts; staged migration approach if feasible. | H | M |
| R-DM-003 | Incorrect field mapping or transformation rules cause data inaccuracies. | Thorough review and sign-off on mapping documents by Data Architects and Business Users; extensive unit and integration testing. | H | M |
| R-DM-004 | Rollback procedures fail or are ineffective. | Regular testing of rollback procedures; ensure robust, verified backups; clear communication channels during incidents. | H | L |
| R-DM-005 | Unexpected data volume or complexity exceeds estimates. | Detailed data profiling early in the project; iterative development and testing; contingency buffer in timeline/resources. | M | M |
| R-DM-006 | Lack of stakeholder alignment or communication breakdown. | Regular project updates, dedicated communication plan, clear roles and responsibilities, frequent review meetings. | M | M |
| R-DM-007 | Target system constraints (e.g., storage, performance) not adequately identified. | Early engagement with target system architects/DBAs; thorough review of target system documentation; load testing. | M | L |

10. Communication Plan

Effective communication is vital for the success of the migration project.

| Audience | Communication Method | Frequency | Content | Owner |
| :------- | :------------------- | :-------- | :------ | :---- |
| Project Team | Daily Stand-ups, Weekly Project Meetings | Daily, Weekly | Progress updates, roadblocks, task assignments, risk review. | Project Manager |
| Steering Committee / Executives | Monthly Steering Committee Meetings, Email Updates | Monthly, Ad-hoc | High-level progress, key milestones, major risks, budget status. | Project Manager |
| Business Users | Bi-weekly Updates, UAT Sessions | Bi-weekly, Ad-hoc | Impact on business processes, UAT progress, training needs, Go-Live readiness. | Project Manager, Business Analyst |
| IT Operations | Weekly Syncs, Technical Design Reviews | Weekly, Ad-hoc | Infrastructure requirements, deployment plans, monitoring, incident response. | Project Manager, Technical Lead |
| External Vendors (if applicable) | As Needed | As Needed | Coordination of system integrations, data exchange, support. | Project Manager |
| General Stakeholders | Project SharePoint Site, Email Newsletters | Monthly | General project updates, key achievements, upcoming milestones. | Project Manager |

This comprehensive plan provides a solid framework for the successful data migration from the Test Source System to the Test Target System. Adherence to these steps, coupled with diligent execution and proactive risk management, will ensure a smooth transition.
