Data Migration Planner

Data Migration Planner: Code Generation & Implementation Guide

This document provides detailed code examples and architectural guidance for a complete data migration. It covers field mapping, transformation rules, validation scripts, rollback procedures, and a structured approach to timeline estimates. The code examples are commented and structured so they can be adapted directly to your data migration project.


1. Field Mapping Definition

Field mapping is the foundational step, defining how each source field translates to a target field. This structure allows for clear documentation and programmatic access during the migration process.

Concept: A structured dictionary representing the mapping between source and target systems, including data types, descriptions, and any specific notes.

Code Example (Python): field_mapping.py

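The module itself is not reproduced here, so the following is a minimal sketch consistent with the explanation below; the entity names, field names, and rule names are illustrative assumptions, not the project's actual mapping:

```python
# field_mapping.py -- illustrative sketch; entities, fields, and rules are examples only.

FIELD_MAPPING = [
    {
        "entity": "Customer",
        "source_field": "CRM.Customer.Name",
        "target_field": "SFDC.Account.Name",
        "source_type": "VARCHAR(255)",
        "target_type": "Text(255)",
        "nullable": False,
        "transformation_rule": "capitalize_string",
    },
    {
        "entity": "Customer",
        "source_field": "CRM.Customer.Email",
        "target_field": "SFDC.Contact.Email",
        "source_type": "VARCHAR(255)",
        "target_type": "Email",
        "nullable": True,
        "transformation_rule": "lowercase_string",
    },
]


def get_field_mapping(entity: str) -> list:
    """Return all mapping entries for a given entity."""
    return [m for m in FIELD_MAPPING if m["entity"] == entity]


if __name__ == "__main__":
    # Demonstrate programmatic access to the mapping.
    for entry in get_field_mapping("Customer"):
        print(f"{entry['source_field']} -> {entry['target_field']} "
              f"(rule: {entry['transformation_rule']})")
```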
**Explanation:**
*   The `FIELD_MAPPING` list of dictionaries provides a structured, human-readable, and machine-parsable definition of the migration scope.
*   Each field entry specifies source and target names, data types, nullability, and a reference to a `transformation_rule`.
*   The `get_field_mapping` function allows for easy retrieval of specific entity mappings.
*   The `if __name__ == "__main__":` block demonstrates how to use this mapping and hints at generating DDL (Data Definition Language) statements for the target database directly from this configuration.

---

2. Transformation Rules

Transformation rules define the logic applied to source data to conform it to the target system's requirements. This section provides a central module for these rules.

Concept: A collection of Python functions, each representing a specific transformation logic. The field_mapping.py module references these functions by name.

Code Example (Python): transformation_rules.py




Data Migration Architecture Plan

This document outlines the architectural plan for a complete data migration, encompassing key phases, components, and considerations to ensure a successful, secure, and efficient transition of data from source to target systems.

1. Introduction and Objectives

Purpose: To define the technical architecture, processes, and governance for migrating data from existing source systems to new target systems. This plan serves as a foundational blueprint for subsequent execution phases.

Key Objectives:

  • Ensure data integrity, accuracy, and completeness throughout the migration.
  • Minimize downtime and business disruption.
  • Establish clear field mapping and transformation rules.
  • Implement robust validation and error handling mechanisms.
  • Define comprehensive rollback procedures for contingency planning.
  • Provide realistic timeline estimates and resource requirements.
  • Adhere to security and compliance standards.

2. Scope and Systems Overview

In-Scope Data: [Specify data sets, e.g., Customer Master Data, Order History, Product Catalog, Financial Transactions, etc.]

Out-of-Scope Data: [Specify any data sets explicitly excluded, e.g., Archived data older than X years, temporary logs.]

Source Systems:

  • System Name: [e.g., Legacy CRM (SAP ECC), Oracle EBS, Custom PostgreSQL DB]
  • Technology: [e.g., SAP ABAP, Oracle SQL, PostgreSQL]
  • Data Volume: [e.g., 5TB, 100M records]
  • Key Data Entities: [e.g., Customer, Product, Order]

Target Systems:

  • System Name: [e.g., Salesforce Sales Cloud, Microsoft Dynamics 365, Snowflake Data Warehouse]
  • Technology: [e.g., Salesforce Apex/SOQL, C#, SQL, Snowflake SQL]
  • Expected Data Volume: [e.g., 4TB, 90M records (after cleansing/de-duplication)]
  • Key Data Entities: [e.g., Account, Contact, Opportunity, Product, Order]

3. Data Inventory and Assessment

Data Types: Structured (relational databases), Semi-structured (XML, JSON), Unstructured (documents, images, videos).

Data Volume: Detailed breakdown by source system, table/object, and estimated record counts.

Data Quality: Initial assessment of data quality issues (duplicates, incompleteness, inconsistencies, outdated records). This will inform transformation rules.

Data Sensitivity: Identification of PII, PCI, PHI, or other sensitive data requiring special handling, encryption, and access controls.

4. Migration Strategy

Chosen Strategy: [Select one and justify]

  • Big Bang: All data migrated simultaneously within a defined cutover window. (High risk, minimal parallel operation complexity)
  • Phased (Incremental): Data migrated in stages (e.g., by module, by business unit, by data type). (Lower risk, higher complexity in managing interim states)
  • Parallel Run: Both source and target systems run concurrently for a period, data synced, then cutover. (Highest cost/effort, lowest risk)

Rationale: [Explain why the chosen strategy is most suitable for the project's specific constraints, risks, and business impact.]

5. Architectural Design

5.1. Data Extraction (E)

  • Methodologies:
      * Direct Database Access: SQL queries, stored procedures.
      * API Calls: REST/SOAP APIs for SaaS sources.
      * Flat File Exports: CSV, XML, JSON from source systems.
      * Change Data Capture (CDC): For continuous or incremental extraction.
  • Tools/Technologies: [e.g., SQL Server Integration Services (SSIS), Apache NiFi, Talend, Informatica PowerCenter, custom scripts (Python, Java)]
  • Extraction Frequency: [e.g., One-time full extraction, daily incremental, real-time for specific data sets.]
  • Staging Area: A temporary, secure storage location (e.g., S3 bucket, data lake, dedicated database schema) for raw extracted data before transformation.
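The extract-and-stage step above can be sketched in a few lines. This is a minimal illustration only: SQLite stands in for the real source system, a local CSV file stands in for the staging area, and the table and function names are hypothetical:

```python
import csv
import sqlite3
import tempfile
from pathlib import Path


def extract_to_staging(conn, query: str, staging_path: Path) -> int:
    """Run an extraction query and land the raw rows in a CSV staging file."""
    cur = conn.execute(query)
    headers = [d[0] for d in cur.description]
    rows = cur.fetchall()
    with staging_path.open("w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(headers)   # header row preserves source column names
        writer.writerows(rows)
    return len(rows)


# Demo: an in-memory SQLite database standing in for the source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

staging = Path(tempfile.gettempdir()) / "customers_staging.csv"
count = extract_to_staging(conn, "SELECT id, name FROM customers", staging)
print(f"staged {count} rows to {staging}")
```

In a real pipeline the query, batch boundaries, and staging location would come from configuration, and the staging area would be access-controlled and encrypted as described in Section 5.6.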

5.2. Data Transformation (T)

  • Rules Engine: Define a robust engine (e.g., within an ETL tool, or custom code) for applying transformation rules.
  • Transformation Types:
      * Data Cleansing: De-duplication, standardization, null handling, error correction.
      * Data Enrichment: Adding external data, deriving new fields.
      * Data Aggregation: Summarizing data.
      * Data Type Conversion: Adapting to target system data types.
      * Data Masking/Tokenization: For sensitive data in non-production environments.
      * Referential Integrity Enforcement: Mapping foreign keys, ensuring consistency.
  • Tools/Technologies: [e.g., Talend, Informatica, SSIS, Apache Spark, Python Pandas, Databricks]
  • Transformation Environment: Isolated environment for processing, ensuring no impact on source or target systems during transformation.
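Two of the transformation types above, cleansing (de-duplication, standardization) and null handling, can be sketched together in one pass. The record shape and field names here are illustrative assumptions:

```python
def dedupe_and_standardize(records):
    """Cleansing pass: trim and title-case names, normalize emails,
    and drop records whose normalized email is blank or already seen."""
    seen = set()
    cleaned = []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()
        if not email or email in seen:
            continue  # blanks and duplicates are skipped in this sketch
        seen.add(email)
        cleaned.append({"name": rec["name"].strip().title(), "email": email})
    return cleaned


rows = [
    {"name": "  acme corp ", "email": "Ops@Acme.com"},
    {"name": "Acme Corp", "email": "ops@acme.com"},   # duplicate after normalization
    {"name": "globex", "email": "info@globex.com"},
]
print(dedupe_and_standardize(rows))
```

A production rules engine would instead route dropped records to a quarantine table (see Section 5.5) rather than silently discarding them.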

5.3. Data Loading (L)

  • Loading Methodologies:
      * Direct Database Inserts/Updates: Bulk inserts, upserts.
      * API Calls: Target system APIs (e.g., Salesforce Data Loader API, Dynamics 365 Web API).
      * Flat File Imports: For systems supporting bulk file uploads.
      * Streaming: For real-time or near real-time data flows.
  • Loading Order: Define dependencies and sequence of loading (e.g., master data first, then transactional data).
  • Batching Strategy: Optimize batch sizes for performance and error handling.
  • Tools/Technologies: [e.g., Target system's native loaders, Talend, Informatica, custom scripts, Kafka for streaming]
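The batching strategy above can be sketched generically: split records into fixed-size batches, load each through a pluggable loader, and set failed batches aside rather than aborting the run. The loader interface and batch size are assumptions for illustration:

```python
from itertools import islice


def batched(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk


def load_in_batches(records, load_fn, batch_size=200):
    """Push records through `load_fn` one batch at a time;
    collect records from failed batches for retry or review."""
    loaded, failed = 0, []
    for batch in batched(records, batch_size):
        try:
            load_fn(batch)
            loaded += len(batch)
        except Exception:
            failed.extend(batch)  # quarantine the whole batch
    return loaded, failed


# Demo: a fake loader that rejects any batch containing a record with no id.
def fake_loader(batch):
    if any(r["id"] is None for r in batch):
        raise ValueError("bad record in batch")


records = [{"id": i} for i in range(1, 6)] + [{"id": None}]
loaded, failed = load_in_batches(records, fake_loader, batch_size=2)
print(loaded, len(failed))
```

Tuning `batch_size` trades throughput against blast radius: larger batches load faster, but one bad record quarantines more of its neighbors.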

5.4. Data Validation and Quality Assurance

  • Pre-migration Validation: Validate extracted data against source system reports/counts.
  • Post-transformation Validation: Validate data in the staging area after transformations.
  • Post-load Validation:
      * Record Count Verification: Compare source, staging, and target record counts.
      * Checksum/Hash Verification: Ensure data integrity at a granular level.
      * Data Type and Format Checks: Confirm adherence to target system schema.
      * Business Rule Validation: Verify transformed data against target system business rules.
      * Sample Data Verification: Manual review of a statistically significant sample set.
      * Reconciliation Reports: Detailed comparison reports between source and target for key fields.
  • Tools/Technologies: [e.g., SQL queries, custom Python/Java scripts, data quality tools (e.g., Informatica Data Quality, Ataccama One)]
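Record count verification and checksum/hash verification from the list above can be combined into one reconciliation pass. This sketch assumes dict-shaped rows keyed by an `id` field; the names are illustrative:

```python
import hashlib


def row_checksum(row: dict) -> str:
    """Stable per-row hash over sorted key/value pairs."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()


def reconcile(source_rows, target_rows, key="id"):
    """Compare record counts and per-row checksums between source and target."""
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "mismatches": [],
    }
    target_by_key = {r[key]: r for r in target_rows}
    for s in source_rows:
        t = target_by_key.get(s[key])
        if t is None or row_checksum(s) != row_checksum(t):
            report["mismatches"].append(s[key])  # missing or drifted record
    return report


src = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "Globex"}]
tgt = [{"id": 1, "name": "Acme"}, {"id": 2, "name": "GLOBEX"}]  # deliberate drift
print(reconcile(src, tgt))
```

In practice the checksum would be computed only over migrated fields (excluding system-populated timestamps), and mismatch keys would feed the reconciliation reports.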

5.5. Error Handling and Logging

  • Error Logging: Centralized logging of all extraction, transformation, and loading errors.
  • Error Categorization: Differentiate between recoverable (e.g., data format issues) and non-recoverable (e.g., system outages) errors.
  • Error Reporting: Automated alerts and reports for critical errors.
  • Retry Mechanisms: Implement logic for retrying transient errors.
  • Quarantine Strategy: Isolate problematic records for manual review and remediation without halting the entire migration.
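The retry and quarantine mechanisms above fit together as a small per-record state machine: transient errors are retried with a delay, non-recoverable errors go straight to quarantine. Exception classes and function names here are assumptions for illustration:

```python
import time


def migrate_record(record, load_fn, retries=3, delay=0.0):
    """Attempt to load one record; retry transient failures,
    quarantine non-recoverable ones. Returns (status, record)."""
    for attempt in range(1, retries + 1):
        try:
            load_fn(record)
            return "loaded", record
        except ValueError:          # non-recoverable: bad data, do not retry
            return "quarantined", record
        except ConnectionError:     # transient: back off and retry
            if attempt == retries:
                return "quarantined", record
            time.sleep(delay)


# Demo loader: record 2 is malformed; record 3 fails once, then succeeds.
calls = {"3": 0}


def loader(rec):
    if rec["id"] == 2:
        raise ValueError("bad data")
    if rec["id"] == 3:
        calls["3"] += 1
        if calls["3"] == 1:
            raise ConnectionError("network blip")


results = [migrate_record(r, loader)[0] for r in ({"id": 1}, {"id": 2}, {"id": 3})]
print(results)
```

Quarantined records would be written to the centralized error log with the exception detail, so the run can continue while they await manual remediation.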

5.6. Security and Compliance

  • Data Encryption: Data at rest (staging, target) and data in transit (between systems).
  • Access Control: Strict role-based access to migration tools, staging environments, and target systems.
  • Audit Trails: Comprehensive logging of all migration activities.
  • Compliance: Adherence to relevant regulations (GDPR, HIPAA, PCI DSS, SOX) through anonymization, pseudonymization, data residency, and consent management.
  • Secure Credential Management: Use of secrets management services (e.g., AWS Secrets Manager, Azure Key Vault) for database and API credentials.

6. Field Mapping and Transformation Rules

This section provides a high-level overview; a detailed matrix will be a separate deliverable.

| Source Field | Target Field | Transformation Rule | Notes |
| :--- | :--- | :--- | :--- |
| CRM.Customer.Name | SFDC.Account.Name | Direct Map | |
| CRM.Customer.Addr1, Addr2, City, State, Zip | SFDC.Account.BillingAddress | Concatenate and Standardize | Use USPS standardization service. |
| CRM.Customer.Status | SFDC.Account.AccountStatus | IF 'Active' THEN 'Open' ELSE 'Closed' | Map legacy statuses to new system. |
| CRM.Order.Amount | SFDC.Opportunity.Amount | Direct Map (Currency Conversion if needed) | Apply conversion rates if source/target currencies differ. |
| CRM.Product.ID | SFDC.Product2.ProductCode | Pad with leading zeros to 10 chars | Example: 123 -> 0000000123 |
| CRM.Customer.CreatedDate | SFDC.Account.CreatedDate | Direct Map | |
| CRM.Customer.LastModified | SFDC.Account.LastModifiedDate | Direct Map | |
| CRM.Customer.Email | SFDC.Contact.Email | Validate format, De-duplicate | Remove invalid email formats; handle multiple contacts with the same email. |
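Two of the rules in the mapping above translate directly into code: the status conditional and the zero-padding of product codes. The function names below are ours, chosen for illustration:

```python
def pad_product_code(raw, width=10):
    """CRM.Product.ID -> SFDC.Product2.ProductCode: left-pad with zeros."""
    return str(raw).zfill(width)


# IF 'Active' THEN 'Open' ELSE 'Closed'
STATUS_MAP = {"Active": "Open"}


def map_account_status(legacy_status):
    """CRM.Customer.Status -> SFDC.Account.AccountStatus."""
    return STATUS_MAP.get(legacy_status, "Closed")


print(pad_product_code(123))         # prints 0000000123
print(map_account_status("Active"))  # prints Open
```

Keeping the status mapping in a dictionary means new legacy statuses can be added without touching the conditional logic.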

7. Validation Scripts

The validation matrix documents each check by Type of Validation, a Description, and an Example Script Snippet (Conceptual).

Code Example (Python): transformation_rules.py

```python
# transformation_rules.py
import re
from datetime import datetime

import pandas as pd

# This dictionary maps transformation rule names (strings) to their corresponding
# Python functions, allowing dynamic lookup and application of rules based on the
# field mapping.
TRANSFORMATION_FUNCTIONS = {}


def register_transformation(func):
    """Decorator to register transformation functions."""
    TRANSFORMATION_FUNCTIONS[func.__name__] = func
    return func


@register_transformation
def capitalize_string(value: str) -> str:
    """Capitalizes the first letter of each word in a string."""
    if pd.isna(value):
        return None
    return str(value).title()


@register_transformation
def lowercase_string(value: str) -> str:
    """Converts a string to lowercase."""
    if pd.isna(value):
        return None
    return str(value).lower()


@register_transformation
def normalize_phone_number(value: str) -> str:
    """
    Normalizes a phone number to a standard E.164 format (e.g., +15551234567).
    Removes non-digit characters and prepends a default country code if missing.
    """
    if pd.isna(value):
        return None
    digits = re.sub(r'\D', '', str(value))  # Remove non-digits
    if not digits:
        return None
    # Example: Assume US numbers if no country code is present
    if len(digits) == 10:  # US 10-digit number
        return f"+1{digits}"
    elif len(digits) > 10 and digits.startswith('1'):  # Already has country code 1
        return f"+{digits}"
    elif len(digits) > 7:
        # Simplification: assume the digits already include a country code.
        # A robust solution might use a dedicated library.
        return f"+{digits}"
    return digits  # Return as-is if unable to normalize effectively


@register_transformation
def validate_date_format(value, input_format="%Y-%m-%d", output_format="%Y-%m-%d"):
    """
    Validates and converts a date string to a consistent format.
    Returns None if validation fails.
    """
    if pd.isna(value):
        return None
    try:
        if isinstance(value, datetime):
            return value.strftime(output_format)
        return datetime.strptime(str(value), input_format).strftime(output_format)
    except ValueError:
        print(f"Warning: Could not parse date '{value}' with format '{input_format}'. Returning None.")
        return None


@register_transformation
def convert_datetime_to_utc(value, source_tz='America/New_York', target_tz='UTC'):
    """
    Converts a datetime object or string from a source timezone to a target
    timezone (default UTC). Requires 'pytz' library: pip install pytz
    """
    try:
        import pytz
    except ImportError:
        print("Error: 'pytz' library not found. Please install it (pip install pytz) for timezone conversions.")
        return value  # Return original value if library is not available
    if pd.isna(value):
        return None
    if isinstance(value, str):
        try:
            dt_obj = pd.to_datetime(value)  # Attempt to parse common datetime formats
        except ValueError:
            print(f"Warning: Could not parse datetime string '{value}'. Returning None.")
            return None
    elif isinstance(value, datetime):
        dt_obj = value
    else:
        print(f"Warning: Unsupported datetime type '{type(value)}'. Returning None.")
        return None
    # If the datetime object is naive, assume it is in source_tz
    if dt_obj.tzinfo is None:
        dt_obj = pytz.timezone(source_tz).localize(dt_obj)
    return dt_obj.astimezone(pytz.timezone(target_tz))
```

Data Migration Planner: Comprehensive Migration Plan

Project: [Customer Project Name - e.g., Legacy CRM to Cloud ERP Migration]

Date: October 26, 2023

Version: 1.0

Prepared For: [Customer Name/Department]

Prepared By: PantheraHive Data Migration Team


1. Executive Summary

This document outlines the comprehensive plan for the data migration from [Source System Name, e.g., Legacy CRM] to [Target System Name, e.g., Cloud ERP]. The objective of this migration is to accurately and efficiently transfer critical business data, ensuring data integrity, minimal downtime, and full operational readiness in the new system. This plan details the strategy, field mapping, transformation rules, validation procedures, rollback protocols, and estimated timelines required to achieve a successful data transfer.

2. Introduction & Scope

This Data Migration Plan serves as the guiding document for the entire migration process. It defines the methodologies, tools, and steps necessary to move specified data sets from the source system to the target system.

2.1. Project Goals:

  • Achieve 100% data integrity and accuracy in the target system.
  • Minimize business disruption during the migration window.
  • Ensure all critical business processes can operate effectively post-migration.
  • Provide a robust rollback mechanism in case of unforeseen issues.
  • Complete the migration within the agreed-upon timeline and budget.

2.2. Scope of Data Migration:

The following key data entities are in scope for migration:

  • Source System: [e.g., Legacy CRM (on-premise)]
  • Target System: [e.g., Salesforce Sales Cloud]
  • Data Entities (Examples):
      * Customers / Accounts
      * Contacts
      * Opportunities
      * Products / Services
      * Historical Sales Orders (limited scope, e.g., last 3 years)
      * User Profiles (relevant subset)

2.3. Out of Scope:

  • Archival of historical data not migrated (separate archiving project).
  • Migration of custom reports or dashboards from the source system (will be rebuilt in target).
  • Application code or custom integrations (will be redeveloped/reconfigured for target).

3. Data Migration Strategy

Our chosen data migration strategy is a Phased Incremental Migration combined with a Big Bang Cutover for the final go-live. This approach allows for iterative testing and refinement while ensuring a clean final transition.

  • Phase 1: Planning & Design: Detailed analysis, mapping, rule definition, and architecture setup.
  • Phase 2: Development & Initial Testing: Build ETL scripts/tools, perform unit testing with sample data.
  • Phase 3: Iterative Mock Migrations & UAT (User Acceptance Testing): Multiple dry runs, data validation, performance testing, and user review in a staging environment. This allows for data cleansing and transformation refinement.
  • Phase 4: Pre-Cutover Data Freeze & Final Delta Migration: Freeze data entry in the source system, migrate the final delta changes.
  • Phase 5: Cutover & Go-Live: Final migration, system switch, and immediate post-migration support.

4. Source & Target Systems Overview

  • Source System:
      * Name: [e.g., Legacy CRM]
      * Database/Platform: [e.g., SQL Server 2012]
      * Key Characteristics: Highly customized, on-premise, data inconsistencies identified.
  • Target System:
      * Name: [e.g., Salesforce Sales Cloud]
      * Database/Platform: [e.g., Salesforce's proprietary database]
      * Key Characteristics: Cloud-based, standardized data model, API-driven integration.

5. Detailed Field Mapping

Field mapping is the cornerstone of a successful migration, documenting the relationship between source and target data fields, including data types and constraints.

5.1. Mapping Methodology:

Each source field will be mapped to its corresponding target field. If no direct match exists, a transformation rule will be applied, or the field will be flagged for review (e.g., to be dropped or populated with a default value).

5.2. Field Mapping Template (Example for 'Customers/Accounts'):

| Source System (Legacy CRM) | Target System (Salesforce Account) | Transformation Rule (if any) | Notes / Comments |
| :--- | :--- | :--- | :--- |
| CustomerID (INT, PK) | External_ID__c (Text, External ID) | Direct Map | Used for linking during updates. |
| CompanyName (VARCHAR(255)) | Name (Text) | Direct Map | Required field in Salesforce. |
| AddressLine1 (VARCHAR(255)) | BillingStreet (Text) | Concatenate with AddressLine2 if not null. | Target has single street field. |
| AddressLine2 (VARCHAR(255)) | BillingStreet (Text) | Part of concatenation | See above. |
| City (VARCHAR(100)) | BillingCity (Text) | Direct Map | |
| State (CHAR(2)) | BillingState (Text) | Lookup: Map 2-letter code to full state name. | e.g., 'CA' -> 'California'. |
| ZipCode (VARCHAR(10)) | BillingPostalCode (Text) | Direct Map | |
| PhoneNumber (VARCHAR(20)) | Phone (Phone) | Format: (XXX) XXX-XXXX | Standardize format. |
| FaxNumber (VARCHAR(20)) | Fax (Phone) | Direct Map | |
| Email (VARCHAR(255)) | PersonEmail (Email) | Direct Map | For Person Accounts. |
| AccountType (VARCHAR(50)) | Type (Picklist) | Map: 'Corp' -> 'Enterprise', 'SMB' -> 'Small Business', 'Individual' -> 'Individual'. | Default to 'Other' if no match. |
| CreationDate (DATETIME) | CreatedDate (DateTime) | Direct Map | Salesforce auto-populates, but we'll override. |
| LastUpdated (DATETIME) | LastModifiedDate (DateTime) | Direct Map | Salesforce auto-populates, but we'll override. |
| SalesRegionID (INT) | Region__c (Picklist) | Lookup: SalesRegion table to Region__c picklist value. | Custom picklist in Salesforce. |
| Notes (TEXT) | Description (Long Text Area) | Truncate if > 32,000 chars. | Salesforce field limit. |
| LegacyStatus (VARCHAR(50)) | (Not Mapped) | N/A | Replaced by Salesforce AccountStatus__c with different logic. |
| AccountOwnerID (INT) | OwnerId (Lookup(User)) | Lookup: Map LegacyUser to SalesforceUser ID. | Ensure owner exists in Salesforce. |
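Three of the mapping rules above (street concatenation, the state-code lookup, and the Notes truncation) can be sketched as simple functions. The function names are ours, and the `STATE_NAMES` excerpt is an assumption standing in for a full lookup table:

```python
# Excerpt only; a full 50-state lookup table is assumed in practice.
STATE_NAMES = {"CA": "California", "NY": "New York"}


def billing_street(line1, line2):
    """Concatenate AddressLine1 and AddressLine2 with a newline when line2 is present."""
    return f"{line1}\n{line2}" if line2 else line1


def billing_state(code):
    """Map a 2-letter state code to its full name; pass unknown codes through for review."""
    return STATE_NAMES.get(code, code)


def description(notes, limit=32000):
    """Truncate Notes to the Salesforce Long Text Area limit."""
    return notes[:limit] if notes else None


print(billing_street("123 Main St", "Suite 100"))
print(billing_state("CA"))  # prints California
```

Passing unknown state codes through unchanged (rather than raising) keeps the load running while flagging the rows for the quarantine/review workflow.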

5.3. Key Mapping Considerations:

  • Primary Keys: How source system unique identifiers will be stored in the target system (e.g., as external IDs).
  • Foreign Keys/Relationships: How relationships between entities (e.g., Account to Contact) will be maintained or re-established.
  • Mandatory Fields: Ensuring all mandatory fields in the target system are populated, either directly from the source or via transformation rules (e.g., default values).
  • Data Types & Lengths: Ensuring compatibility and handling truncation or conversion.
  • Picklist Values: Mapping source system picklist values to target system picklist values.

6. Data Transformation Rules

Data transformation rules are critical for adapting source data to the target system's requirements, ensuring data quality, and aligning with new business processes.

6.1. Common Transformation Types:

  • Data Type Conversion: Changing a number to text, date format adjustments.
  • Concatenation/Splitting: Combining first and last names, splitting address fields.
  • Lookup/Mapping: Translating legacy codes to new system values (e.g., 'NY' to 'New York').
  • Default Value Assignment: Populating fields with a standard value if the source is null.
  • Data Cleansing: Removing invalid characters, standardizing addresses.
  • Aggregation: Summarizing data from multiple source records into one target record.
  • Conditional Logic: Applying rules based on specific data values (e.g., if AccountType is 'Prospect', set Status to 'New Lead').

6.2. Transformation Rule Documentation (Examples):

| Entity/Field | Source Field(s) | Target Field | Transformation Rule Description | Example Input (Source) | Example Output (Target) |
| :--- | :--- | :--- | :--- | :--- | :--- |
| Account.BillingStreet | AddressLine1, AddressLine2 | BillingStreet | Concatenate AddressLine1 and AddressLine2 with a newline character if AddressLine2 is not null. | 123 Main St, Suite 100 | 123 Main St\nSuite 100 |
| Account.Type | AccountType | Type (Picklist) | Lookup mapping: 'Corp' -> 'Enterprise', 'SMB' -> 'Small Business', 'Individual' -> 'Individual'. Default to 'Other' if no match. | Corp | Enterprise |
| Contact.Phone | PhoneNumber | Phone | Reformat phone number to (XXX) XXX-XXXX. Handle missing country codes by assuming US. | 12125551234 | (212) 555-1234 |
| Opportunity.CloseDate | ProjectedCloseDate | CloseDate | If ProjectedCloseDate is in the past, set to current date + 30 days. | 2023-01-15 | 2023-11-25 (assuming current date is 2023-10-26) |
| Product.Active | IsActive (BIT) | IsActive (Boolean) | Convert 1 to TRUE, 0 to FALSE. If null, default to FALSE. | NULL | FALSE |
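The Contact.Phone and Product.Active rules from the table above can be sketched directly; the function names are illustrative:

```python
import re


def format_us_phone(raw):
    """Reformat to (XXX) XXX-XXXX; treat an 11-digit number starting with 1
    as a US number with country code."""
    digits = re.sub(r"\D", "", str(raw))
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]
    if len(digits) != 10:
        return None  # flag for manual review in this sketch
    return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"


def bit_to_bool(value):
    """Product.IsActive: 1 -> True, 0 -> False, NULL -> False."""
    return bool(value) if value is not None else False


print(format_us_phone("12125551234"))  # prints (212) 555-1234
```

Returning `None` for unparseable numbers, instead of guessing, lets the validation stage count and report them explicitly.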

7. Data Validation Strategy & Scripts

Robust data validation is critical to ensure the migrated data is complete, accurate, and consistent. This involves pre-migration, in-migration, and post-migration validation steps.

7.1. Pre-Migration Validation (Source Data Profiling & Cleansing):

  • Purpose: Identify data quality issues, anomalies, and inconsistencies in the source system before migration.
  • Activities:

* Data Profiling: Analyze data types, formats, completeness (null rates), uniqueness, and distributions.

* Duplicate Detection: Identify and resolve duplicate records in the source (e.g., duplicate customer entries).

* Referential Integrity Checks: Verify relationships between tables in the source.

* Data Cleansing: Work with business users to correct identified issues, or define rules for automated cleansing during transformation.

  • Tools/Scripts: SQL queries, data profiling tools (e.g., specialized ETL tools, custom scripts).
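As a sketch of the custom-script option, a few pandas checks cover the core profiling activities (null rates, cardinality, duplicate detection). The column names and sample data here are illustrative only:

```python
import pandas as pd

def profile(df):
    """Per-column completeness, cardinality, and dtype summary for a source extract."""
    return pd.DataFrame({
        "null_rate": df.isna().mean(),
        "distinct": df.nunique(),
        "dtype": df.dtypes.astype(str),
    })

def find_duplicates(df, subset):
    """All rows sharing the same values in `subset` (e.g. duplicate customer entries)."""
    return df[df.duplicated(subset=subset, keep=False)]

# Tiny illustrative extract with one null and one duplicated key.
accounts = pd.DataFrame({
    "CustomerID": [1, 2, 3, 3],
    "CompanyName": ["Acme", None, "Globex", "Globex"],
})
print(profile(accounts))
print(find_duplicates(accounts, ["CustomerID"]))
```

Running the same profile before and after cleansing gives a concrete measure of how much data quality improved ahead of the migration.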

7.2. Post-Migration Validation (Target System):

  • Purpose: Verify that data has been migrated correctly and is usable in the target system.
  • Activities & Validation Scripts (Examples):

* Record Count Verification:

* Script: SELECT COUNT(*) FROM Source.Accounts; vs. SELECT COUNT(*) FROM Target.Account;

* Goal: Total record counts for each entity should match or be within an acceptable variance (e.g., if some records were intentionally excluded).

* Data Integrity & Accuracy Checks:

* Script: Random sampling of records to compare field-level values between source and target.

* Goal: Verify transformed data matches expected output for specific fields.

* Script: SELECT Source.CustomerID, Source.CompanyName, Target.External_ID__c, Target.Name FROM Source.Accounts S JOIN Target.Account T ON S.CustomerID = T.External_ID__c WHERE S.CompanyName <> T.Name; (for direct maps)

* Referential Integrity Checks:

* Script: Verify that child records (e.g., Contacts) are correctly linked to their parent records (e.g., Accounts) in the target system.

* Goal: All lookup fields and relationships are correctly established.

* Script: SELECT COUNT(*) FROM Target.Contact WHERE AccountId IS NULL; (should be 0 for mandatory links)

* Business Rule Validation:

* Script: Run queries based on target system business rules.

* Goal: Ensure data adheres to new system constraints (e.g., all Opportunities have a valid Close Date, all Products have an assigned Category).

* Script: SELECT COUNT(*) FROM Target.Opportunity WHERE CloseDate < TODAY(); (should return 0 if past close dates are disallowed)

* User Acceptance Testing (UAT):

* Activity: Key business users will directly access the migrated data in the target system to perform their daily tasks and validate data correctness from a business perspective.

* Goal: Ensure the data supports business operations and meets user expectations.
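The scripted checks above can be automated in a single pass. A minimal sketch (entity, key, and column names are illustrative) that compares record counts per entity and spot-checks mapped field values across the source/target join:

```python
import pandas as pd

def check_counts(source_counts, target_counts, tolerance=0):
    """Flag entities whose migrated record count drifts beyond the allowed variance."""
    return {entity: (n, target_counts.get(entity, 0))
            for entity, n in source_counts.items()
            if abs(n - target_counts.get(entity, 0)) > tolerance}

def field_mismatches(source, target, src_key, tgt_key, field_pairs):
    """Join source to target on the external ID; return rows whose mapped fields disagree."""
    joined = source.merge(target, left_on=src_key, right_on=tgt_key, how="inner")
    mask = pd.Series(False, index=joined.index)
    for src_col, tgt_col in field_pairs:
        mask |= joined[src_col] != joined[tgt_col]
    return joined[mask]

# Illustrative extracts pulled from each system.
src = pd.DataFrame({"CustomerID": [1, 2], "CompanyName": ["Acme", "Initech"]})
tgt = pd.DataFrame({"External_ID__c": [1, 2], "Name": ["Acme", "Initech Inc."]})
print(field_mismatches(src, tgt, "CustomerID", "External_ID__c",
                       [("CompanyName", "Name")]))
```

In practice the count dictionaries and DataFrames would be populated from live queries against each system; the comparison logic stays the same.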

7.3. Validation Tools:

  • SQL Query Tools
  • ETL/ELT Data Comparison Features
  • Custom Python/PowerShell Scripts
  • Target System Reporting Tools (e.g., Salesforce Reports/Dashboards)

8. Rollback Procedures

A robust rollback plan is essential to mitigate risks and ensure business continuity in the event of a critical failure during or immediately after migration.

8.1. Rollback Scenarios:

  • Partial Failure: Specific data entities fail to migrate or are corrupted.
  • Systemic Failure: Target system becomes unstable or data integrity is compromised across multiple entities.