Database Schema Designer
Run ID: 69b6fa02896970b0894649512026-03-29 (Development)
PantheraHive BOS

Workflow Execution Summary

Workflow Name: Database Schema Designer

Category: Development

Description: Test run

Topic: AI Technology

Execution Time: 5 minutes

Credits Consumed: 100 cr

This workflow has successfully generated a database schema tailored for managing projects, datasets, models, experiments, and their associated metrics within the domain of AI Technology.


1. Introduction to the AI Technology Database Schema

This document outlines a relational database schema designed to track and manage various aspects of AI technology development and experimentation. The schema provides a structured way to store information about AI projects, the datasets used, the models developed, individual experiments conducted, and the performance metrics generated from these experiments.

The primary goal of this schema is to enable:

* End-to-end tracking of AI projects and the datasets and models they produce.
* Reproducibility of experiments through recorded hyperparameters, code versions, and dataset versions.
* Structured comparison of model performance via per-experiment metrics.

This design is flexible and scalable, providing a solid foundation for MLOps platforms, research tracking systems, or internal AI development tools.


2. Entity-Relationship Diagram (Conceptual)

The core entities and their relationships are as follows:

* Projects: One-to-Many with Datasets, Models, and Experiments.
* Datasets: Many-to-One with Projects; One-to-Many with Experiments.
* Models: Many-to-One with Projects; One-to-Many with Experiments.
* Experiments: Many-to-One with Projects, Models, and Datasets; One-to-Many with Metrics.
* Metrics: Many-to-One with Experiments.
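As a sketch of how these relationships are traversed in practice, the query below (using the table and column names defined in Section 3) follows the Many-to-One links from an experiment back to its project, model, and dataset:

```sql
-- Illustrative: list experiments together with their parent entities.
SELECT e.experiment_id,
       e.name   AS experiment_name,
       p.name   AS project_name,
       m.name   AS model_name,
       d.name   AS dataset_name,
       e.status
FROM experiments e
JOIN projects p ON p.project_id = e.project_id
JOIN models   m ON m.model_id   = e.model_id
JOIN datasets d ON d.dataset_id = e.dataset_id;
```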


3. Database Schema (Logical/Physical)

Below are the detailed table definitions, including columns, data types, constraints, and relationships. For data types, we assume a PostgreSQL-compatible environment.

3.1. Table Definitions

Table: projects

| Column Name | Data Type | Constraints | Description |
| :---------- | :--------- | :---------- | :---------- |
| project_id | SERIAL | PRIMARY KEY | Unique identifier for the project. |
| name | VARCHAR(255) | NOT NULL, UNIQUE | Name of the project. |
| description | TEXT | | Detailed description of the project. |
| start_date | DATE | NOT NULL, DEFAULT CURRENT_DATE | Date when the project started. |
| end_date | DATE | | Optional date when the project ended. |
| status | VARCHAR(50) | NOT NULL, DEFAULT 'Active', CHECK | Current status of the project (e.g., 'Planning', 'Active', 'Completed', 'Archived'). |
| created_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of project creation. |
| updated_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of last update. |
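Note that `DEFAULT CURRENT_TIMESTAMP` only fires on INSERT; PostgreSQL does not refresh updated_at automatically on UPDATE. A minimal trigger sketch that could be attached to each table (the function and trigger names here are illustrative, not part of the schema):

```sql
-- Sketch: keep updated_at current on every UPDATE (PostgreSQL).
CREATE OR REPLACE FUNCTION set_updated_at()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at := CURRENT_TIMESTAMP;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_projects_updated_at
BEFORE UPDATE ON projects
FOR EACH ROW
EXECUTE FUNCTION set_updated_at();
```

The same trigger function can be reused for datasets, models, and experiments by creating one trigger per table.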

Table: datasets

| Column Name | Data Type | Constraints | Description |
| :---------- | :--------- | :---------- | :---------- |
| dataset_id | SERIAL | PRIMARY KEY | Unique identifier for the dataset. |
| project_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing projects.project_id. |
| name | VARCHAR(255) | NOT NULL | Name of the dataset. |
| version | VARCHAR(50) | DEFAULT '1.0' | Version of the dataset. |
| description | TEXT | | Detailed description of the dataset. |
| source_url | VARCHAR(512) | | URL or path to the dataset's source. |
| data_type | VARCHAR(100) | NOT NULL, CHECK | Type of data (e.g., 'Image', 'Text', 'Tabular', 'Audio', 'Video', 'Other'). |
| size_bytes | BIGINT | | Size of the dataset in bytes. |
| num_records | BIGINT | | Number of records/samples in the dataset. |
| created_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of dataset record creation. |
| updated_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of last update. |

Table: models

| Column Name | Data Type | Constraints | Description |
| :---------- | :--------- | :---------- | :---------- |
| model_id | SERIAL | PRIMARY KEY | Unique identifier for the model. |
| project_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing projects.project_id. |
| name | VARCHAR(255) | NOT NULL | Name of the model. |
| version | VARCHAR(50) | DEFAULT '1.0' | Version of the model. |
| architecture | VARCHAR(255) | | Model architecture (e.g., 'ResNet-50', 'BERT'). |
| framework | VARCHAR(100) | NOT NULL | AI framework used (e.g., 'PyTorch', 'TensorFlow', 'Scikit-learn'). |
| description | TEXT | | Detailed description of the model. |
| created_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of model record creation. |
| updated_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of last update. |

Table: experiments

| Column Name | Data Type | Constraints | Description |
| :---------- | :--------- | :---------- | :---------- |
| experiment_id | SERIAL | PRIMARY KEY | Unique identifier for the experiment. |
| project_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing projects.project_id. |
| model_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing models.model_id. |
| dataset_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing datasets.dataset_id. |
| name | VARCHAR(255) | NOT NULL | Name of the experiment. |
| description | TEXT | | Detailed description of the experiment. |
| start_time | TIMESTAMP WITH TIME ZONE | NOT NULL, DEFAULT CURRENT_TIMESTAMP | Start time of the experiment. |
| end_time | TIMESTAMP WITH TIME ZONE | | End time of the experiment. |
| status | VARCHAR(50) | NOT NULL, DEFAULT 'Pending', CHECK | Current status (e.g., 'Pending', 'Running', 'Completed', 'Failed', 'Cancelled'). |
| hyperparameters | JSONB | | JSON object storing hyperparameters used. |
| code_version | VARCHAR(255) | | Git commit hash or version of the code used. |
| notes | TEXT | | Any additional notes for the experiment. |
| created_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of experiment record creation. |
| updated_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp of last update. |

Table: metrics

| Column Name | Data Type | Constraints | Description |
| :---------- | :--------- | :---------- | :---------- |
| metric_id | SERIAL | PRIMARY KEY | Unique identifier for the metric. |
| experiment_id | INT | NOT NULL, FOREIGN KEY | Foreign key referencing experiments.experiment_id. |
| name | VARCHAR(100) | NOT NULL | Name of the metric (e.g., 'accuracy', 'precision', 'loss', 'RMSE'). |
| value | DECIMAL(18, 9) | NOT NULL | Numeric value of the metric. |
| step | INT | | Optional: training step/epoch when the metric was recorded. |
| evaluation_set | VARCHAR(50) | NOT NULL, CHECK | The dataset split on which the metric was evaluated (e.g., 'train', 'validation', 'test', 'inference'). |
| recorded_at | TIMESTAMP WITH TIME ZONE | DEFAULT CURRENT_TIMESTAMP | Timestamp when the metric was recorded. |
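To illustrate how the metrics table supports comparison across runs, a sketch query against the schema above retrieves the best test-set accuracy recorded for each model (metric names such as 'accuracy' are examples, not enforced values):

```sql
-- Illustrative: best test-set accuracy per model across all experiments.
SELECT m.name        AS model_name,
       MAX(mt.value) AS best_test_accuracy
FROM metrics mt
JOIN experiments e ON e.experiment_id = mt.experiment_id
JOIN models      m ON m.model_id      = e.model_id
WHERE mt.name = 'accuracy'
  AND mt.evaluation_set = 'test'
GROUP BY m.name
ORDER BY best_test_accuracy DESC;
```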

3.2. Indexes

* idx_projects_name ON projects (name)

* idx_projects_status ON projects (status)

* idx_datasets_project_id ON datasets (project_id)

* idx_datasets_name_version ON datasets (name, version)

* idx_datasets_data_type ON datasets (data_type)

* idx_models_project_id ON models (project_id)

* idx_models_name_version ON models (name, version)

* idx_models_framework ON models (framework)

* idx_experiments_project_id ON experiments (project_id)

* idx_experiments_model_id ON experiments (model_id)

* idx_experiments_dataset_id ON experiments (dataset_id)

* idx_experiments_status ON experiments (status)

* idx_metrics_experiment_id ON metrics (experiment_id)

* idx_metrics_name ON metrics (name)

* idx_metrics_evaluation_set ON metrics (evaluation_set)

3.3. Example SQL DDL (PostgreSQL)

-- Enable UUID generation if desired for primary keys instead of SERIAL
-- CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- Table: projects
CREATE TABLE projects (
    project_id      SERIAL PRIMARY KEY,
    name            VARCHAR(255) NOT NULL UNIQUE,
    description     TEXT,
    start_date      DATE NOT NULL DEFAULT CURRENT_DATE,
    end_date        DATE,
    status          VARCHAR(50) NOT NULL DEFAULT 'Active' CHECK (status IN ('Planning', 'Active', 'Completed', 'Archived')),
    created_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);

-- Table: datasets
CREATE TABLE datasets (
    dataset_id      SERIAL PRIMARY KEY,
    project_id      INT NOT NULL,
    name            VARCHAR(255) NOT NULL,
    version         VARCHAR(50) DEFAULT '1.0',
    description     TEXT,
    source_url      VARCHAR(512),
    data_type       VARCHAR(100) NOT NULL CHECK (data_type IN ('Image', 'Text', 'Tabular', 'Audio', 'Video', 'Other')),
    size_bytes      BIGINT,
    num_records     BIGINT,
    created_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT fk_dataset_project
        FOREIGN KEY (project_id)
        REFERENCES projects (project_id) ON DELETE CASCADE,
    CONSTRAINT uq_dataset_project_name_version UNIQUE (project_id, name, version)
);

-- Table: models
CREATE TABLE models (
    model_id        SERIAL PRIMARY KEY,
    project_id      INT NOT NULL,
    name            VARCHAR(255) NOT NULL,
    version         VARCHAR(50) DEFAULT '1.0',
    architecture    VARCHAR(255),
    framework       VARCHAR(100) NOT NULL,
    description     TEXT,
    created_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT fk_model_project
        FOREIGN KEY (project_id)
        REFERENCES projects (project_id) ON DELETE CASCADE,
    CONSTRAINT uq_model_project_name_version UNIQUE (project_id, name, version)
);

-- Table: experiments
CREATE TABLE experiments (
    experiment_id   SERIAL PRIMARY KEY,
    project_id      INT NOT NULL,
    model_id        INT NOT NULL,
    dataset_id      INT NOT NULL,
    name            VARCHAR(255) NOT NULL,
    description     TEXT,
    start_time      TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT CURRENT_TIMESTAMP,
    end_time        TIMESTAMP WITH TIME ZONE,
    status          VARCHAR(50) NOT NULL DEFAULT 'Pending' CHECK (status IN ('Pending', 'Running', 'Completed', 'Failed', 'Cancelled')),
    hyperparameters JSONB, -- For storing key-value pairs of hyperparameters
    code_version    VARCHAR(255), -- e.g., Git commit hash
    notes           TEXT,
    created_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    updated_at      TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT fk_experiment_project
        FOREIGN KEY (project_id)
        REFERENCES projects (project_id) ON DELETE CASCADE,
    CONSTRAINT fk_experiment_model
        FOREIGN KEY (model_id)
        REFERENCES models (model_id) ON DELETE RESTRICT, -- Prevent deleting models if experiments exist
    CONSTRAINT fk_experiment_dataset
        FOREIGN KEY (dataset_id)
        REFERENCES datasets (dataset_id) ON DELETE RESTRICT -- Prevent deleting datasets if experiments exist
);

-- Table: metrics
CREATE TABLE metrics (
    metric_id       SERIAL PRIMARY KEY,
    experiment_id   INT NOT NULL,
    name            VARCHAR(100) NOT NULL,
    value           DECIMAL(18, 9) NOT NULL,
    step            INT, -- For metrics recorded over training steps/epochs
    evaluation_set  VARCHAR(50) NOT NULL CHECK (evaluation_set IN ('train', 'validation', 'test', 'inference')),
    recorded_at     TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
    CONSTRAINT fk_metric_experiment
        FOREIGN KEY (experiment_id)
        REFERENCES experiments (experiment_id) ON DELETE CASCADE,
    CONSTRAINT uq_metric_experiment_name_step_evalset UNIQUE (experiment_id, name, step, evaluation_set)
);

-- Create Indexes
CREATE INDEX idx_projects_name ON projects (name);
CREATE INDEX idx_projects_status ON projects (status);

CREATE INDEX idx_datasets_project_id ON datasets (project_id);
CREATE INDEX idx_datasets_name_version ON datasets (name, version);
CREATE INDEX idx_datasets_data_type ON datasets (data_type);

CREATE INDEX idx_models_project_id ON models (project_id);
CREATE INDEX idx_models_name_version ON models (name, version);
CREATE INDEX idx_models_framework ON models (framework);

CREATE INDEX idx_experiments_project_id ON experiments (project_id);
CREATE INDEX idx_experiments_model_id ON experiments (model_id);
CREATE INDEX idx_experiments_dataset_id ON experiments (dataset_id);
CREATE INDEX idx_experiments_status ON experiments (status);

CREATE INDEX idx_metrics_experiment_id ON metrics (experiment_id);
CREATE INDEX idx_metrics_name ON metrics (name);
CREATE INDEX idx_metrics_evaluation_set ON metrics (evaluation_set);
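As a usage sketch of the DDL above (all names and values below are invented for illustration), a typical ingestion sequence registers a project, a dataset, and a model, then records an experiment linking the three:

```sql
-- Illustrative data only.
INSERT INTO projects (name, description)
VALUES ('Sentiment Analyzer', 'Classify product reviews')
RETURNING project_id;                      -- suppose this returns 1

INSERT INTO datasets (project_id, name, version, data_type, num_records)
VALUES (1, 'reviews-corpus', '2.1', 'Text', 150000);

INSERT INTO models (project_id, name, version, architecture, framework)
VALUES (1, 'sentiment-bert', '0.3', 'BERT', 'PyTorch');

INSERT INTO experiments (project_id, model_id, dataset_id, name,
                         hyperparameters, code_version)
VALUES (1, 1, 1, 'baseline-run',
        '{"learning_rate": 3e-5, "batch_size": 32}'::jsonb,
        'a1b2c3d');
```

In application code, the id returned by each INSERT (via RETURNING) would be captured and used for the subsequent foreign-key references rather than hard-coded.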

4. Schema Justification and Design Choices

  • Normalization: The schema is normalized to third normal form (3NF) to minimize data redundancy and improve data integrity. Each entity (Project, Dataset, Model, Experiment, Metric) has its own table.
  • Primary Keys: Using SERIAL (auto-incrementing integer) for primary keys is common and efficient for relational databases. For distributed systems or larger scales, UUIDs could be considered (as mentioned in the DDL comments).
  • Foreign Keys: Explicit foreign keys (FOREIGN KEY ... REFERENCES ... ON DELETE ...) enforce referential integrity:
    * ON DELETE CASCADE for project_id in datasets and models means that deleting a project also deletes all associated datasets and models.
    * ON DELETE RESTRICT for model_id and dataset_id in experiments prevents deletion of models or datasets that experiments still reference, ensuring historical data integrity.
    * ON DELETE CASCADE for experiment_id in metrics ensures that all metrics related to an experiment are removed if the experiment itself is deleted.

  • VARCHAR vs. TEXT: VARCHAR(N) is used for fixed-length or shorter, well-defined strings (e.g., names, versions), while TEXT is used for potentially longer descriptive fields (e.g., description, notes) for flexibility.
  • JSONB for hyperparameters: This column in the experiments table allows for flexible storage of varied hyperparameters without requiring schema changes for every new model or experiment type. JSONB in PostgreSQL offers efficient indexing and querying of JSON data.
  • TIMESTAMP WITH TIME ZONE: Recommended for created_at, updated_at, start_time, end_time to handle time across different geographical locations consistently.
  • CHECK Constraints: Used to enforce valid values for status and data type fields, improving data quality.
  • UNIQUE Constraints: Enforce uniqueness where appropriate (e.g., project names, specific dataset/model versions within a project, or a specific metric for an experiment at a given step).
  • Indexes: Created on foreign keys and frequently queried columns to optimize read performance.
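For example, the JSONB column can be filtered directly with PostgreSQL's containment operator; the sketch below (against the experiments table above) finds experiments run with a particular batch size, and an optional GIN index, not part of the DDL above, can support such lookups if they become frequent:

```sql
-- Illustrative: experiments whose hyperparameters contain batch_size = 32.
SELECT experiment_id,
       name,
       hyperparameters ->> 'learning_rate' AS learning_rate
FROM experiments
WHERE hyperparameters @> '{"batch_size": 32}';

-- Optional supporting index for JSONB containment queries:
CREATE INDEX idx_experiments_hyperparameters
    ON experiments USING GIN (hyperparameters);
```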

5. Potential Enhancements and Future Considerations

  • User/Team Management: Add tables for users and teams to associate projects, experiments, and models with specific individuals or groups, along with roles and permissions.
  • Artifacts Storage: Integrate with external storage solutions (e.g., S3, Azure Blob Storage, Google Cloud Storage) for storing actual model files, dataset files, experiment logs, and link their URIs in the database.
  • Code Repository Integration: More robust integration with Git repositories to track specific commits, branches, and author information.
  • Model Registry Features: Expand the models table with more fields like model_path (to stored artifacts), model_type (e.g., 'classification', 'regression'), input_schema, output_schema.
  • Deployment Tracking: Add a table to track model deployments to production environments, including deployment versions, status, and endpoints.
  • Environment Configuration: Store details about the computing environment (e.g., hardware, software dependencies) where experiments were run.
  • Experiment Lineage: Potentially add a table for experiment_runs or model_versions to track granular changes or iterations more explicitly.
  • Tagging System: Implement a flexible tagging system (e.g., tags table and a tag_assignments join table) to categorize projects, datasets, models, or experiments with arbitrary keywords.
  • Audit Logging: Implement more comprehensive audit trails for changes to critical entities.

6. Actionable Recommendations

  1. Database Selection: Choose a relational database system that supports the SQL DDL provided (e.g., PostgreSQL, MySQL, SQL Server). PostgreSQL is recommended for its robust JSONB support.
  2. Implementation:
     * Execute the provided SQL DDL script in your chosen database to create the schema.
     * Consider using an ORM (Object-Relational Mapper) such as SQLAlchemy (Python), Hibernate (Java), or Entity Framework (.NET) to interact with the database in your application code, simplifying data operations.
  3. Data Ingestion Strategy:
     * Develop scripts or APIs to populate these tables. For example, when an AI experiment starts, record an entry in experiments; when it completes, update its status and insert the relevant metrics.
     * Automate the recording of hyperparameters and code_version directly from your training scripts.
  4. Monitoring and Maintenance:
     * Regularly back up your database.
     * Monitor database performance, especially for frequently queried tables like experiments and metrics.
     * Review and optimize indexes as your data grows and query patterns evolve.
  5. Expand as Needed: Use the "Potential Enhancements" section as a roadmap for future development once the core schema is operational. Prioritize features based on your specific MLOps or research needs.
  6. Security: Implement proper database user management and access controls to protect sensitive information. Encrypt data at rest and in transit.
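The ingestion flow described above (record an experiment at start, then update its status and insert metrics on completion) can be sketched in SQL; the experiment_id and metric values here are purely illustrative:

```sql
-- Illustrative lifecycle: mark an experiment finished and record final metrics.
UPDATE experiments
SET status     = 'Completed',
    end_time   = CURRENT_TIMESTAMP,
    updated_at = CURRENT_TIMESTAMP
WHERE experiment_id = 1;

INSERT INTO metrics (experiment_id, name, value, step, evaluation_set)
VALUES (1, 'accuracy', 0.914, 10, 'validation'),
       (1, 'loss',     0.231, 10, 'validation');
```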