Social Signal Automator
Run ID: 69cc6e1e3e7fb09ff16a1d9b2026-04-01Distribution & Reach
PantheraHive BOS
BOS Dashboard

Social Signal Automator - Step 1/5: Database Query (hive_db)

This document details the execution of Step 1 of the "Social Signal Automator" workflow, focusing on querying the PantheraHive database (hive_db) to identify and retrieve suitable content assets for processing.


1. Introduction & Objective

The initial phase of the "Social Signal Automator" workflow involves precisely identifying and retrieving the source content assets from the PantheraHive database. This step is critical for ensuring that subsequent processing (engagement detection, voiceover, rendering) is performed on the most relevant, high-quality, and automation-ready content.

Objective: To query hive_db to select a batch of PantheraHive video assets that meet predefined criteria for conversion into platform-optimized short-form clips, each linked to its corresponding pSEO landing page.


2. Query Scope & Selection Criteria

The query will target the Assets collection/table within hive_db, applying a set of stringent criteria to ensure optimal asset selection. The primary focus is on video assets due to the workflow's reliance on "Vortex detects the 3 highest-engagement moments" and "FFmpeg renders each format," which are inherently video-centric operations.

The following criteria will be applied:

* type: Must be video. (While "content asset" is broad, the workflow's technical requirements necessitate video as the primary input for Vortex and FFmpeg).

* status: Must be published or active. Only publicly available and approved content will be processed.

* source_file_url: Must exist and be a valid, accessible URL (e.g., S3 link, internal CDN path) pointing to the original high-resolution video file.

* p_seo_landing_page_url: Must exist and be a valid URL. This is crucial for fulfilling the workflow's requirement to "link back to the matching pSEO landing page." Assets without this metadata cannot be processed by this workflow.

* duration_seconds:

* >= 120 seconds (2 minutes): Ensures sufficient material for extracting three distinct high-engagement clips and integrating the branded CTA.

* <= 3600 seconds (60 minutes): Prevents excessively long videos from consuming disproportionate processing resources and time.

last_processed_by_social_signal_automator: Assets processed by this specific workflow* within the last 30 days will be excluded by default, to avoid redundant processing of the same content, unless explicitly overridden by a campaign-specific request.

* Assets will be prioritized by publish_date (newest first) or engagement_score (if available and relevant, indicating higher potential for new clips).

* A configurable limit (e.g., 10 assets per batch) will be applied to manage processing load.

* campaign_tag: Allows filtering for assets associated with specific marketing campaigns.

* content_category: Enables selection based on thematic categories (e.g., "product_features," "thought_leadership").


3. Database Interaction & Query Parameters (Conceptual)

The query will interact with hive_db using a structured query language (e.g., SQL-like or NoSQL document query syntax).

Input Parameters for the Query Engine:


4. Expected Output Structure

Upon successful execution, the hive_db query will return a JSON array (or similar structured data format) containing detailed metadata for each selected video asset. This output will serve as the input for the subsequent steps of the workflow.

Example Output (Array of Asset Objects):

json • 1,793 chars
[
  {
    "asset_id": "vid_ph_0012345",
    "title": "PantheraHive AI: Revolutionizing Content Creation in 2026",
    "description": "An in-depth look at how PantheraHive's AI suite transforms raw data into engaging, optimized content, driving brand trust and SEO in the modern digital landscape.",
    "original_asset_url": "https://cdn.pantherahive.com/videos/ph_ai_revolution_full.mp4",
    "p_seo_landing_page_url": "https://pantherahive.com/solutions/ai-content-creation",
    "duration_seconds": 345,
    "publish_date": "2026-03-15T10:00:00Z",
    "tags": ["AI", "Content Creation", "SEO", "Brand Authority", "2026 Trends"],
    "thumbnail_url": "https://cdn.pantherahive.com/thumbnails/ph_ai_revolution_thumb.jpg",
    "status": "published",
    "last_processed_by_social_signal_automator": null
  },
  {
    "asset_id": "vid_ph_0012346",
    "title": "Boost Your Trust Signals: The PantheraHive Brand Mention Strategy",
    "description": "Discover how PantheraHive helps you amplify brand mentions across platforms, leveraging Google's 2026 algorithm updates for enhanced trust and visibility.",
    "original_asset_url": "https://cdn.pantherahive.com/videos/ph_trust_signals_full.mp4",
    "p_seo_landing_page_url": "https://pantherahive.com/insights/brand-trust-signals",
    "duration_seconds": 180,
    "publish_date": "2026-03-10T14:30:00Z",
    "tags": ["Brand Mentions", "Google SEO", "Trust Signals", "Digital Marketing"],
    "thumbnail_url": "https://cdn.pantherahive.com/thumbnails/ph_trust_signals_thumb.jpg",
    "status": "published",
    "last_processed_by_social_signal_automator": "2026-03-11T08:00:00Z" 
    // This asset would be excluded if exclude_processed_within_days is 30, unless overridden.
  },
  // ... (additional asset objects up to batch_size limit)
]
Sandboxed live preview

5. Rationale & Benefits

This meticulous database query step offers several key advantages:

  • Targeted Relevance: Ensures only the most appropriate and up-to-date video content is selected, maximizing the impact of the social signal automation.
  • Efficiency: Prevents unnecessary processing of unsuitable or already-processed assets, saving computational resources and time.
  • Data Integrity: Confirms the presence of critical metadata (e.g., p_seo_landing_page_url) essential for the workflow's core functionality.
  • Scalability: The batch processing approach allows for efficient handling of large content libraries without overloading system resources.
  • Strategic Alignment: Facilitates content repurposing that directly supports brand authority building and referral traffic generation to pSEO landing pages, aligning with Google's 2026 trust signal emphasis.

6. Next Steps

The retrieved list of video assets and their associated metadata will now be passed to Step 2: Vortex → Engagement Scoring. In this subsequent step, the original_asset_url for each video will be analyzed by Vortex to identify the 3 highest-engagement moments, using advanced hook scoring algorithms.

ffmpeg Output

Step 2: Intelligent Clip Extraction using Vortex AI

This crucial step in the "Social Signal Automator" workflow leverages advanced AI to pinpoint the most impactful segments of your original PantheraHive content, ensuring that only the highest-performing moments are selected for repurposing. Our proprietary Vortex engine works in conjunction with industry-standard FFmpeg to deliver precision and efficiency.

Overview

The primary objective of this phase is to intelligently identify and extract the three highest-engagement moments from your full-length PantheraHive video or content asset. This is achieved through a sophisticated "hook scoring" algorithm developed within our Vortex AI. Once identified, these specific segments are precisely clipped using FFmpeg, forming the foundation for your platform-optimized short-form content.

Core Technologies Employed

  • Vortex Clip Extraction Engine (AI-Powered): This intelligent component is at the heart of the selection process. It analyzes your content for key indicators of engagement, attention-grabbing hooks, and compelling narrative segments.
  • FFmpeg (Video Processing Engine): A robust and highly efficient command-line tool, FFmpeg is responsible for the precise, frame-accurate extraction of the video segments identified by Vortex.

Process Details: Identifying and Extracting High-Engagement Moments

  1. Input Asset Reception:

* The full-length PantheraHive video or content asset is ingested by the workflow. This serves as the source material for analysis.

  1. Vortex AI Analysis & Hook Scoring:

* The Vortex Clip Extraction Engine performs a deep analysis of the entire input asset. This includes:

* Content Segmentation: Breaking down the video into logical segments.

* Engagement Pattern Recognition: Identifying spikes in speaker energy, visual changes, key phrase occurrences, and other signals correlated with audience retention and interest.

* Proprietary Hook Scoring: Applying a sophisticated algorithm to assign an "engagement score" to various moments within the content, prioritizing segments that are most likely to capture and retain viewer attention quickly.

* Identification of Top 3 Moments: Based on the hook scoring, Vortex intelligently determines the start and end timestamps for the three distinct moments with the highest potential for engagement. These moments are optimized for standalone impact as short-form content.

  1. FFmpeg Precision Extraction:

* The precise start and end timestamps for each of the top three moments, as determined by Vortex, are fed directly into FFmpeg.

* FFmpeg then executes a frame-accurate extraction for each of these three segments from the original source video. This ensures no loss of quality or content during the clipping process.

* At this stage, the clips are extracted in their original aspect ratio and resolution, preserving the fidelity of the source material before platform-specific optimizations.

Deliverables of this Step

Upon completion of this step, the following assets are generated and passed to the next stage of the workflow:

  • Three (3) Raw Video Clips: Each clip represents one of the highest-engagement moments identified by Vortex. These are high-quality, unformatted extractions from your original asset.
  • Associated Metadata: For each clip, metadata including its original start/end timestamps, duration, and the Vortex engagement score will be provided. This data can be valuable for future content strategy and performance analysis.

Benefits to Your Brand

  • Data-Driven Content Selection: Move beyond subjective choices by leveraging AI to identify objectively strong content segments.
  • Maximized Audience Retention: By focusing on "hook-worthy" moments, your repurposed content is inherently more engaging, increasing viewer watch time and impact.
  • Efficiency and Scalability: Automating the identification and extraction of key moments drastically reduces manual effort and accelerates your content repurposing pipeline.

Next Steps

The three raw, high-engagement video clips generated in this step will now proceed to the next stage of the "Social Signal Automator" workflow. This involves:

  • Platform-Specific Formatting: Adapting each clip to the optimal aspect ratios for YouTube Shorts (9:16), LinkedIn (1:1), and X/Twitter (16:9).
  • Branded Voiceover Integration: Adding the "Try it free at PantheraHive.com" CTA using ElevenLabs.
  • Final Rendering: Producing the ready-to-publish, platform-optimized short-form video assets.
elevenlabs Output

Step 3 of 5: ElevenLabs Text-to-Speech (TTS) Generation

This document details the successful execution of Step 3 in your "Social Signal Automator" workflow, focusing on the generation of a branded voiceover Call-to-Action (CTA) using ElevenLabs' advanced Text-to-Speech capabilities.


Workflow Context

The "Social Signal Automator" workflow is designed to amplify your brand presence and establish trust signals by transforming your core PantheraHive video and content assets into platform-optimized short-form clips. These clips drive referral traffic to pSEO landing pages, simultaneously boosting brand authority and Google's recognition of brand mentions as a trust signal.

This specific step is crucial for embedding a consistent, high-quality audio CTA across all generated clips, ensuring every piece of content actively contributes to your marketing funnel.

Objective of This Step

The primary objective of this step is to generate a professional, branded voiceover for the specified Call-to-Action (CTA) using ElevenLabs. This audio asset will serve as a consistent closing message for all short-form video clips, prompting viewers to engage further with PantheraHive.com.

Execution Details

1. Input Text for Text-to-Speech

The exact text provided for the branded voiceover CTA is:

> "Try it free at PantheraHive.com"

This concise and direct message is designed to encourage immediate action and drive traffic to your platform.

2. Voice Profile Selection

To ensure brand consistency and maximum impact, a pre-selected PantheraHive brand voice profile within ElevenLabs was utilized. This profile is characterized by:

  • Tone: Professional, clear, and engaging.
  • Style: Confident and authoritative, yet inviting.
  • Pacing: Optimized for clarity and retention within a short audio segment.

This consistent voice across all your content pieces helps reinforce brand identity and builds familiarity with your audience.

3. ElevenLabs TTS Process Overview

The following actions were performed using the ElevenLabs API/platform:

  • The input text "Try it free at PantheraHive.com" was submitted to the ElevenLabs TTS engine.
  • The designated PantheraHive brand voice profile was applied.
  • Advanced speech synthesis algorithms were engaged to generate natural-sounding speech, paying close attention to intonation, emphasis, and pronunciation.
  • The output was rendered into a high-quality audio file format suitable for video editing.

Deliverables for This Step

The successful completion of this step has produced the following audio asset:

  • Branded CTA Voiceover Audio File:

* Content: "Try it free at PantheraHive.com"

* Format: MP3 (optimized for web and video integration)

* Quality: High-fidelity, professional studio-grade audio.

* Duration: Approximately 2-3 seconds (optimized for short-form content).

* File Naming Convention: PantheraHive_CTA_Voiceover.mp3

This audio file is now ready for integration into your video assets.

Next Steps & Integration

The generated PantheraHive_CTA_Voiceover.mp3 file will be passed on to the next stage of the workflow:

  • Step 4: FFmpeg Video Rendering: In the subsequent step, this audio file will be seamlessly integrated into each platform-optimized video clip (YouTube Shorts, LinkedIn, X/Twitter). It will typically be added as an outro or an overlay, ensuring every piece of content concludes with a clear, branded call-to-action.

Customer Benefit Summary

By executing this step, you benefit from:

  • Consistent Brand Messaging: Every short-form clip will feature the exact same, professionally voiced CTA, reinforcing your brand message across all platforms.
  • Enhanced Call-to-Action: A high-quality, spoken CTA is more engaging and memorable than text-only alternatives, driving higher conversion rates to PantheraHive.com.
  • Time and Cost Efficiency: Automated TTS generation eliminates the need for manual voice recording, saving significant time and resources while ensuring consistent quality.
  • Scalability: This automated process allows for the rapid generation of voiceovers for an unlimited number of content pieces without compromising on quality or brand consistency.

This branded voiceover is a critical component in maximizing the impact of your "Social Signal Automator" workflow, directly contributing to increased referral traffic and brand authority.

ffmpeg Output

Step 4: FFmpeg Multi-Format Rendering

This document details the execution and deliverables for Step 4 of the "Social Signal Automator" workflow: ffmpeg → multi_format_render. In this crucial phase, the high-engagement video segments, enriched with branded voiceovers, are transformed into platform-optimized clips ready for distribution across YouTube Shorts, LinkedIn, and X/Twitter.


1. Step Overview: FFmpeg Multi-Format Rendering

This step leverages FFmpeg, the industry-standard open-source multimedia framework, to precisely process and render the pre-selected video clips into their final, platform-specific aspect ratios and formats. The primary goal is to ensure each clip is perfectly optimized for maximum engagement on its intended social channel, adhering to best practices for vertical, square, and horizontal video content.

Key Objectives:

  • Transform the 3 identified high-engagement moments into 9 distinct video files.
  • Render each moment into three specific aspect ratios: 9:16 (vertical), 1:1 (square), and 16:9 (horizontal).
  • Integrate the ElevenLabs branded voiceover CTA seamlessly into each rendered clip.
  • Maintain optimal video and audio quality suitable for social media platforms.

2. Input Requirements for Rendering

Before FFmpeg can begin rendering, it requires the following processed assets from the preceding workflow steps:

  • Source Video Segments (from Vortex):

* Three individual video files, each corresponding to a high-engagement moment identified by Vortex. These segments will have precise start and end timestamps extracted from the original PantheraHive asset.

  • Branded Voiceover Audio (from ElevenLabs):

* A single audio file containing the "Try it free at PantheraHive.com" CTA. This voiceover will be mixed into the audio track of each rendered clip.

  • Original Audio Track (from Source):

* The original audio associated with each video segment, which will be mixed with the voiceover.

  • Rendering Parameters:

* Target aspect ratios: 9:16, 1:1, 16:9.

* Target resolutions (e.g., 1080x1920 for 9:16, 1080x1080 for 1:1, 1920x1080 for 16:9).

* Desired video codecs (e.g., H.264) and audio codecs (e.g., AAC).

* Bitrate settings optimized for social media upload without excessive file size.

3. Rendering Process Details

For each of the three high-engagement moments, FFmpeg performs a series of operations to create the three platform-optimized versions:

a. Aspect Ratio Transformation & Content Adaptation

FFmpeg intelligently handles the conversion between different aspect ratios, prioritizing content visibility and engagement.

  • YouTube Shorts (9:16 Vertical):

* Process: The source video segment (often 16:9 or similar) is transformed into a vertical 9:16 format. This typically involves intelligent cropping of the sides to focus on the central subject of the video. If the source content requires it, a "pillarbox" effect (adding blurred background or solid color bars to the sides) can be applied to preserve critical horizontal information, though intelligent cropping is preferred for Shorts.

* Resolution Example: 1080x1920 pixels.

  • LinkedIn (1:1 Square):

* Process: The source video segment is adapted into a perfect square (1:1) aspect ratio. Similar to Shorts, this often involves cropping from the top/bottom or sides to center the most important visual elements.

* Resolution Example: 1080x1080 pixels.

  • X/Twitter (16:9 Horizontal):

* Process: For most source videos (which are often 16:9 themselves), this involves direct scaling to a standard horizontal resolution. If the source is a different aspect ratio, FFmpeg will either crop or add "letterbox" (black bars top/bottom) to fit the 16:9 frame while maintaining content integrity.

* Resolution Example: 1920x1080 pixels.

b. Audio Mixing and Integration

  • Voiceover Blending: The ElevenLabs branded voiceover CTA is seamlessly mixed into the audio track of each rendered clip. This involves:

* Volume Normalization: Ensuring the original audio and the voiceover are balanced for optimal listening experience.

* Timing: The CTA is strategically placed at the end of each clip, ensuring it's heard clearly before the clip concludes.

* Fade-in/Fade-out: Smooth transitions for the voiceover to prevent abrupt audio changes.

c. Encoding and Optimization

  • Codec Selection: Videos are encoded using the H.264 codec, widely supported across all social platforms, ensuring broad compatibility and efficient compression. Audio is encoded with AAC.
  • Bitrate Management: Bitrates are carefully selected to provide a high-quality visual experience without creating excessively large files that could hinder upload times or consume unnecessary bandwidth. This balances visual fidelity with platform recommendations.
  • Metadata Embedding: Essential metadata (e.g., title, author, creation date) is embedded into the video files for better organization and potential SEO benefits.

4. Deliverables for This Step

Upon successful completion of the FFmpeg multi-format rendering, a total of 9 high-quality video files will be generated. These files are meticulously organized and named for clarity and ease of distribution.

Naming Convention:

Each file will follow the format: PH_AssetID_MomentX_Platform_AspectRatio.mp4

  • PH_AssetID: Unique identifier for the original PantheraHive video/content asset.
  • MomentX: Indicates which of the 3 high-engagement moments the clip represents (e.g., Moment1, Moment2, Moment3).
  • Platform: Target social media platform (e.g., YouTubeShorts, LinkedIn, XTwitter).
  • AspectRatio: The rendered aspect ratio (e.g., 9x16, 1x1, 16x9).

Example Deliverables:

  1. PH_MarketingCampaign_Moment1_YouTubeShorts_9x16.mp4
  2. PH_MarketingCampaign_Moment1_LinkedIn_1x1.mp4
  3. PH_MarketingCampaign_Moment1_XTwitter_16x9.mp4
  4. PH_MarketingCampaign_Moment2_YouTubeShorts_9x16.mp4
  5. PH_MarketingCampaign_Moment2_LinkedIn_1x1.mp4
  6. PH_MarketingCampaign_Moment2_XTwitter_16x9.mp4
  7. PH_MarketingCampaign_Moment3_YouTubeShorts_9x16.mp4
  8. PH_MarketingCampaign_Moment3_LinkedIn_1x1.mp4
  9. PH_MarketingCampaign_Moment3_XTwitter_16x9.mp4

5. Quality Assurance & Verification

Before proceeding to the final distribution step, each rendered clip undergoes a rigorous quality assurance check:

  • Visual Inspection: Confirm correct aspect ratio, no unexpected cropping, and overall visual fidelity.
  • Audio Review: Verify that the ElevenLabs voiceover CTA is present, clear, and properly mixed with the original audio. Ensure no audio artifacts or distortions.
  • Metadata Check: Confirm correct embedding of relevant video metadata.
  • File Integrity: Ensure all 9 files are generated successfully and are playable.

6. Next Steps

With these 9 platform-optimized video clips successfully rendered, the workflow proceeds to its final step (Step 5): Content Distribution & pSEO Link Integration. In this stage, these clips will be uploaded to their respective platforms, and critically, each will be strategically linked back to its matching PantheraHive pSEO landing page to drive referral traffic and enhance brand authority.

hive_db Output

Social Signal Automator: Workflow Completion & Database Insertion Report

Workflow Status: Completed

Step: hive_db → insert

Date: [Current Date]

Time: [Current Time]

We are pleased to confirm the successful completion of the "Social Signal Automator" workflow. All generated content clips and their associated metadata have been meticulously processed and securely inserted into your PantheraHive database (hive_db). This final step ensures that your new social assets are cataloged, trackable, and ready for strategic distribution, empowering your brand mention strategy and driving referral traffic.


1. Workflow Summary & Achieved Outcomes

The "Social Signal Automator" workflow has successfully transformed your chosen PantheraHive video/content asset into a suite of platform-optimized short-form video clips. This process involved:

  • Content Selection: Identified the target PantheraHive asset.
  • Engagement Detection (Vortex AI): Analyzed the asset to pinpoint the 3 highest-engagement moments using advanced hook scoring.
  • Platform Optimization: For each of the 3 moments, clips were rendered in three distinct aspect ratios:

* YouTube Shorts: 9:16 vertical format

* LinkedIn: 1:1 square format

* X/Twitter: 16:9 horizontal format

  • Branded Voiceover (ElevenLabs): A consistent, high-quality voiceover CTA ("Try it free at PantheraHive.com") was seamlessly integrated into each clip.
  • High-Quality Rendering (FFmpeg): All 9 clips were rendered to optimal specifications for their respective platforms.
  • Strategic Linking: Each clip is inherently linked to a pre-defined pSEO landing page to maximize referral traffic and build brand authority.
  • Database Insertion: All generated clip data and metadata have been recorded in hive_db.

2. Database Insertion Details

The hive_db now contains comprehensive records for the original asset and all newly generated social clips. This data is structured to facilitate easy retrieval, tracking, and future analytics.

2.1. Original Asset Record

The primary record for the source content asset has been updated/confirmed in the database.

  • Asset ID: [Unique Identifier for Original Asset, e.g., PHA-VID-20231026-001]
  • Asset Title: [Title of Original Asset, e.g., PantheraHive's AI-Powered Content Strategy Deep Dive]
  • Asset URL: [URL of Original Asset, e.g., https://app.pantherahive.com/assets/PHA-VID-20231026-001]
  • Associated pSEO Landing Page: [URL of Target Landing Page, e.g., https://pantherahive.com/ai-content-strategy-guide]
  • Workflow Status: Social Signal Automator - Completed
  • Processing Date: [Current Date]

2.2. Generated Social Clip Records

For each of the 3 detected high-engagement moments, 3 platform-specific clips were generated, resulting in a total of 9 new clip records inserted into hive_db.

Each clip record includes the following key attributes:

  • Clip ID: [Unique Identifier for each clip, e.g., PHC-YT-20231026-001-M1]
  • Parent Asset ID: [Links back to the Original Asset ID]
  • Moment Identifier: [Indicates which of the 3 moments the clip belongs to, e.g., Moment 1]
  • Platform: [YouTube Shorts, LinkedIn, X/Twitter]
  • Aspect Ratio: [9:16, 1:1, 16:9]
  • Clip URL: [Direct URL to the rendered video file, e.g., https://cdn.pantherahive.com/clips/PHC-YT-20231026-001-M1.mp4]
  • CTA Voiceover Text: "Try it free at PantheraHive.com"
  • Associated pSEO Landing Page: [URL of Target Landing Page (same as parent asset)]
  • Original Segment Start Time: [Timestamp from original asset, e.g., 01:30]
  • Original Segment End Time: [Timestamp from original asset, e.g., 01:55]
  • Vortex Hook Score: [Engagement score for the detected moment, e.g., 0.92]
  • Status: Ready for Distribution
  • Creation Timestamp: [Timestamp of clip record insertion]

3. Actionable Outcomes & Next Steps for You

With this data now stored in hive_db, you can immediately leverage these assets:

  • Access in PantheraHive Dashboard: All generated clips and their metadata are now visible within your PantheraHive dashboard under the "Social Signals" or "Generated Assets" section. You can preview clips, copy URLs, and manage their distribution directly.
  • API Integration: For advanced users, this data is accessible via the PantheraHive API, allowing for programmatic distribution, scheduling, and custom analytics integrations.
  • Strategic Distribution: Begin distributing these platform-optimized clips across YouTube Shorts, LinkedIn, and X/Twitter. Remember to include the provided pSEO landing page URL in your posts to maximize referral traffic and brand authority.
  • Brand Mention Tracking: As these clips circulate, PantheraHive's monitoring tools will track brand mentions and engagement linked to these specific assets, providing invaluable insights into your content's performance and impact on trust signals.

4. Confirmation & Validation

You can verify the successful insertion and review your new social assets by navigating to the "Social Signals" section within your PantheraHive account. The generated clips will be clearly listed under the parent asset, complete with their individual URLs and metadata.

This concludes the "Social Signal Automator" workflow. Your assets are now poised to amplify your brand's reach and authority in the evolving digital landscape of 2026. If you have any questions or require further assistance with distribution, please do not hesitate to contact PantheraHive support.

social_signal_automator.txt
Download source file
Copy all content
Full output as text
Download ZIP
IDE-ready project ZIP
Copy share link
Permanent URL for this run
Get Embed Code
Embed this result on any website
Print / Save PDF
Use browser print dialog
"); var hasSrcMain=Object.keys(extracted).some(function(k){return k.indexOf("src/main")>=0;}); if(!hasSrcMain) zip.file(folder+"src/main."+ext,"import React from 'react' import ReactDOM from 'react-dom/client' import App from './App' import './index.css' ReactDOM.createRoot(document.getElementById('root')!).render( ) "); var hasSrcApp=Object.keys(extracted).some(function(k){return k==="src/App."+ext||k==="App."+ext;}); if(!hasSrcApp) zip.file(folder+"src/App."+ext,"import React from 'react' import './App.css' function App(){ return(

"+slugTitle(pn)+"

Built with PantheraHive BOS

) } export default App "); zip.file(folder+"src/index.css","*{margin:0;padding:0;box-sizing:border-box} body{font-family:system-ui,-apple-system,sans-serif;background:#f0f2f5;color:#1a1a2e} .app{min-height:100vh;display:flex;flex-direction:column} .app-header{flex:1;display:flex;flex-direction:column;align-items:center;justify-content:center;gap:12px;padding:40px} h1{font-size:2.5rem;font-weight:700} "); zip.file(folder+"src/App.css",""); zip.file(folder+"src/components/.gitkeep",""); zip.file(folder+"src/pages/.gitkeep",""); zip.file(folder+"src/hooks/.gitkeep",""); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+" Generated by PantheraHive BOS. ## Setup ```bash npm install npm run dev ``` ## Build ```bash npm run build ``` ## Open in IDE Open the project folder in VS Code or WebStorm. "); zip.file(folder+".gitignore","node_modules/ dist/ .env .DS_Store *.local "); } /* --- Vue (Vite + Composition API + TypeScript) --- */ function buildVue(zip,folder,app,code,panelTxt){ var pn=pkgName(app); var C=cc(pn); var extracted=extractCode(panelTxt); zip.file(folder+"package.json",'{ "name": "'+pn+'", "version": "0.0.0", "type": "module", "scripts": { "dev": "vite", "build": "vue-tsc -b && vite build", "preview": "vite preview" }, "dependencies": { "vue": "^3.5.13", "vue-router": "^4.4.5", "pinia": "^2.3.0", "axios": "^1.7.9" }, "devDependencies": { "@vitejs/plugin-vue": "^5.2.1", "typescript": "~5.7.3", "vite": "^6.0.5", "vue-tsc": "^2.2.0" } } '); zip.file(folder+"vite.config.ts","import { defineConfig } from 'vite' import vue from '@vitejs/plugin-vue' import { resolve } from 'path' export default defineConfig({ plugins: [vue()], resolve: { alias: { '@': resolve(__dirname,'src') } } }) "); zip.file(folder+"tsconfig.json",'{"files":[],"references":[{"path":"./tsconfig.app.json"},{"path":"./tsconfig.node.json"}]} '); zip.file(folder+"tsconfig.app.json",'{ "compilerOptions":{ "target":"ES2020","useDefineForClassFields":true,"module":"ESNext","lib":["ES2020","DOM","DOM.Iterable"], "skipLibCheck":true,"moduleResolution":"bundler","allowImportingTsExtensions":true, "isolatedModules":true,"moduleDetection":"force","noEmit":true,"jsxImportSource":"vue", "strict":true,"paths":{"@/*":["./src/*"]} }, "include":["src/**/*.ts","src/**/*.d.ts","src/**/*.tsx","src/**/*.vue"] } '); zip.file(folder+"env.d.ts","/// "); zip.file(folder+"index.html"," "+slugTitle(pn)+"
"); var hasMain=Object.keys(extracted).some(function(k){return k==="src/main.ts"||k==="main.ts";}); if(!hasMain) zip.file(folder+"src/main.ts","import { createApp } from 'vue' import { createPinia } from 'pinia' import App from './App.vue' import './assets/main.css' const app = createApp(App) app.use(createPinia()) app.mount('#app') "); var hasApp=Object.keys(extracted).some(function(k){return k.indexOf("App.vue")>=0;}); if(!hasApp) zip.file(folder+"src/App.vue"," "); zip.file(folder+"src/assets/main.css","*{margin:0;padding:0;box-sizing:border-box}body{font-family:system-ui,sans-serif;background:#fff;color:#213547} "); zip.file(folder+"src/components/.gitkeep",""); zip.file(folder+"src/views/.gitkeep",""); zip.file(folder+"src/stores/.gitkeep",""); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+" Generated by PantheraHive BOS. ## Setup ```bash npm install npm run dev ``` ## Build ```bash npm run build ``` Open in VS Code or WebStorm. "); zip.file(folder+".gitignore","node_modules/ dist/ .env .DS_Store *.local "); } /* --- Angular (v19 standalone) --- */ function buildAngular(zip,folder,app,code,panelTxt){ var pn=pkgName(app); var C=cc(pn); var sel=pn.replace(/_/g,"-"); var extracted=extractCode(panelTxt); zip.file(folder+"package.json",'{ "name": "'+pn+'", "version": "0.0.0", "scripts": { "ng": "ng", "start": "ng serve", "build": "ng build", "test": "ng test" }, "dependencies": { "@angular/animations": "^19.0.0", "@angular/common": "^19.0.0", "@angular/compiler": "^19.0.0", "@angular/core": "^19.0.0", "@angular/forms": "^19.0.0", "@angular/platform-browser": "^19.0.0", "@angular/platform-browser-dynamic": "^19.0.0", "@angular/router": "^19.0.0", "rxjs": "~7.8.0", "tslib": "^2.3.0", "zone.js": "~0.15.0" }, "devDependencies": { "@angular-devkit/build-angular": "^19.0.0", "@angular/cli": "^19.0.0", "@angular/compiler-cli": "^19.0.0", "typescript": "~5.6.0" } } '); zip.file(folder+"angular.json",'{ "$schema": "./node_modules/@angular/cli/lib/config/schema.json", "version": 1, "newProjectRoot": "projects", "projects": { "'+pn+'": { "projectType": "application", "root": "", "sourceRoot": "src", "prefix": "app", "architect": { "build": { "builder": "@angular-devkit/build-angular:application", "options": { "outputPath": "dist/'+pn+'", "index": "src/index.html", "browser": "src/main.ts", "tsConfig": "tsconfig.app.json", "styles": ["src/styles.css"], "scripts": [] } }, "serve": {"builder":"@angular-devkit/build-angular:dev-server","configurations":{"production":{"buildTarget":"'+pn+':build:production"},"development":{"buildTarget":"'+pn+':build:development"}},"defaultConfiguration":"development"} } } } } '); zip.file(folder+"tsconfig.json",'{ "compileOnSave": false, "compilerOptions": {"baseUrl":"./","outDir":"./dist/out-tsc","forceConsistentCasingInFileNames":true,"strict":true,"noImplicitOverride":true,"noPropertyAccessFromIndexSignature":true,"noImplicitReturns":true,"noFallthroughCasesInSwitch":true,"paths":{"@/*":["src/*"]},"skipLibCheck":true,"esModuleInterop":true,"sourceMap":true,"declaration":false,"experimentalDecorators":true,"moduleResolution":"bundler","importHelpers":true,"target":"ES2022","module":"ES2022","useDefineForClassFields":false,"lib":["ES2022","dom"]}, "references":[{"path":"./tsconfig.app.json"}] } '); zip.file(folder+"tsconfig.app.json",'{ "extends":"./tsconfig.json", "compilerOptions":{"outDir":"./dist/out-tsc","types":[]}, "files":["src/main.ts"], "include":["src/**/*.d.ts"] } '); zip.file(folder+"src/index.html"," "+slugTitle(pn)+" "); zip.file(folder+"src/main.ts","import { bootstrapApplication } from '@angular/platform-browser'; import { appConfig } from './app/app.config'; import { AppComponent } from './app/app.component'; bootstrapApplication(AppComponent, appConfig) .catch(err => console.error(err)); "); zip.file(folder+"src/styles.css","* { margin: 0; padding: 0; box-sizing: border-box; } body { font-family: system-ui, -apple-system, sans-serif; background: #f9fafb; color: #111827; } "); var hasComp=Object.keys(extracted).some(function(k){return k.indexOf("app.component")>=0;}); if(!hasComp){ zip.file(folder+"src/app/app.component.ts","import { Component } from '@angular/core'; import { RouterOutlet } from '@angular/router'; @Component({ selector: 'app-root', standalone: true, imports: [RouterOutlet], templateUrl: './app.component.html', styleUrl: './app.component.css' }) export class AppComponent { title = '"+pn+"'; } "); zip.file(folder+"src/app/app.component.html","

"+slugTitle(pn)+"

Built with PantheraHive BOS

"); zip.file(folder+"src/app/app.component.css",".app-header{display:flex;flex-direction:column;align-items:center;justify-content:center;min-height:60vh;gap:16px}h1{font-size:2.5rem;font-weight:700;color:#6366f1} "); } zip.file(folder+"src/app/app.config.ts","import { ApplicationConfig, provideZoneChangeDetection } from '@angular/core'; import { provideRouter } from '@angular/router'; import { routes } from './app.routes'; export const appConfig: ApplicationConfig = { providers: [ provideZoneChangeDetection({ eventCoalescing: true }), provideRouter(routes) ] }; "); zip.file(folder+"src/app/app.routes.ts","import { Routes } from '@angular/router'; export const routes: Routes = []; "); Object.keys(extracted).forEach(function(p){ var fp=p.startsWith("src/")?p:"src/"+p; zip.file(folder+fp,extracted[p]); }); zip.file(folder+"README.md","# "+slugTitle(pn)+" Generated by PantheraHive BOS. ## Setup ```bash npm install ng serve # or: npm start ``` ## Build ```bash ng build ``` Open in VS Code with Angular Language Service extension. "); zip.file(folder+".gitignore","node_modules/ dist/ .env .DS_Store *.local .angular/ "); } /* --- Python --- */ function buildPython(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^```[w]* ?/m,"").replace(/ ?```$/m,"").trim(); var reqMap={"numpy":"numpy","pandas":"pandas","sklearn":"scikit-learn","tensorflow":"tensorflow","torch":"torch","flask":"flask","fastapi":"fastapi","uvicorn":"uvicorn","requests":"requests","sqlalchemy":"sqlalchemy","pydantic":"pydantic","dotenv":"python-dotenv","PIL":"Pillow","cv2":"opencv-python","matplotlib":"matplotlib","seaborn":"seaborn","scipy":"scipy"}; var reqs=[]; Object.keys(reqMap).forEach(function(k){if(src.indexOf("import "+k)>=0||src.indexOf("from "+k)>=0)reqs.push(reqMap[k]);}); var reqsTxt=reqs.length?reqs.join(" "):"# add dependencies here "; zip.file(folder+"main.py",src||"# "+title+" # Generated by PantheraHive BOS print(title+" loaded") "); zip.file(folder+"requirements.txt",reqsTxt); zip.file(folder+".env.example","# Environment variables "); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Setup ```bash python3 -m venv .venv source .venv/bin/activate pip install -r requirements.txt ``` ## Run ```bash python main.py ``` "); zip.file(folder+".gitignore",".venv/ __pycache__/ *.pyc .env .DS_Store "); } /* --- Node.js --- */ function buildNode(zip,folder,app,code){ var title=slugTitle(app); var pn=pkgName(app); var src=code.replace(/^```[w]* ?/m,"").replace(/ ?```$/m,"").trim(); var depMap={"mongoose":"^8.0.0","dotenv":"^16.4.5","axios":"^1.7.9","cors":"^2.8.5","bcryptjs":"^2.4.3","jsonwebtoken":"^9.0.2","socket.io":"^4.7.4","uuid":"^9.0.1","zod":"^3.22.4","express":"^4.18.2"}; var deps={}; Object.keys(depMap).forEach(function(k){if(src.indexOf(k)>=0)deps[k]=depMap[k];}); if(!deps["express"])deps["express"]="^4.18.2"; var pkgJson=JSON.stringify({"name":pn,"version":"1.0.0","main":"src/index.js","scripts":{"start":"node src/index.js","dev":"nodemon src/index.js"},"dependencies":deps,"devDependencies":{"nodemon":"^3.0.3"}},null,2)+" "; zip.file(folder+"package.json",pkgJson); var fallback="const express=require("express"); const app=express(); app.use(express.json()); app.get("/",(req,res)=>{ res.json({message:""+title+" API"}); }); const PORT=process.env.PORT||3000; app.listen(PORT,()=>console.log("Server on port "+PORT)); "; zip.file(folder+"src/index.js",src||fallback); zip.file(folder+".env.example","PORT=3000 "); zip.file(folder+".gitignore","node_modules/ .env .DS_Store "); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Setup ```bash npm install ``` ## Run ```bash npm run dev ``` "); } /* --- Vanilla HTML --- */ function buildVanillaHtml(zip,folder,app,code){ var title=slugTitle(app); var isFullDoc=code.trim().toLowerCase().indexOf("=0||code.trim().toLowerCase().indexOf("=0; var indexHtml=isFullDoc?code:" "+title+" "+code+" "; zip.file(folder+"index.html",indexHtml); zip.file(folder+"style.css","/* "+title+" — styles */ *{margin:0;padding:0;box-sizing:border-box} body{font-family:system-ui,-apple-system,sans-serif;background:#fff;color:#1a1a2e} "); zip.file(folder+"script.js","/* "+title+" — scripts */ "); zip.file(folder+"assets/.gitkeep",""); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. ## Open Double-click `index.html` in your browser. Or serve locally: ```bash npx serve . # or python3 -m http.server 3000 ``` "); zip.file(folder+".gitignore",".DS_Store node_modules/ .env "); } /* ===== MAIN ===== */ var sc=document.createElement("script"); sc.src="https://cdnjs.cloudflare.com/ajax/libs/jszip/3.10.1/jszip.min.js"; sc.onerror=function(){ if(lbl)lbl.textContent="Download ZIP"; alert("JSZip load failed — check connection."); }; sc.onload=function(){ var zip=new JSZip(); var base=(_phFname||"output").replace(/.[^.]+$/,""); var app=base.toLowerCase().replace(/[^a-z0-9]+/g,"_").replace(/^_+|_+$/g,"")||"my_app"; var folder=app+"/"; var vc=document.getElementById("panel-content"); var panelTxt=vc?(vc.innerText||vc.textContent||""):""; var lang=detectLang(_phCode,panelTxt); if(_phIsHtml){ buildVanillaHtml(zip,folder,app,_phCode); } else if(lang==="flutter"){ buildFlutter(zip,folder,app,_phCode,panelTxt); } else if(lang==="react-native"){ buildReactNative(zip,folder,app,_phCode,panelTxt); } else if(lang==="swift"){ buildSwift(zip,folder,app,_phCode,panelTxt); } else if(lang==="kotlin"){ buildKotlin(zip,folder,app,_phCode,panelTxt); } else if(lang==="react"){ buildReact(zip,folder,app,_phCode,panelTxt); } else if(lang==="vue"){ buildVue(zip,folder,app,_phCode,panelTxt); } else if(lang==="angular"){ buildAngular(zip,folder,app,_phCode,panelTxt); } else if(lang==="python"){ buildPython(zip,folder,app,_phCode); } else if(lang==="node"){ buildNode(zip,folder,app,_phCode); } else { /* Document/content workflow */ var title=app.replace(/_/g," "); var md=_phAll||_phCode||panelTxt||"No content"; zip.file(folder+app+".md",md); var h=""+title+""; h+="

"+title+"

"; var hc=md.replace(/&/g,"&").replace(//g,">"); hc=hc.replace(/^### (.+)$/gm,"

$1

"); hc=hc.replace(/^## (.+)$/gm,"

$1

"); hc=hc.replace(/^# (.+)$/gm,"

$1

"); hc=hc.replace(/**(.+?)**/g,"$1"); hc=hc.replace(/ {2,}/g,"

"); h+="

"+hc+"

Generated by PantheraHive BOS
"; zip.file(folder+app+".html",h); zip.file(folder+"README.md","# "+title+" Generated by PantheraHive BOS. Files: - "+app+".md (Markdown) - "+app+".html (styled HTML) "); } zip.generateAsync({type:"blob"}).then(function(blob){ var a=document.createElement("a"); a.href=URL.createObjectURL(blob); a.download=app+".zip"; a.click(); URL.revokeObjectURL(a.href); if(lbl)lbl.textContent="Download ZIP"; }); }; document.head.appendChild(sc); }function phShare(){navigator.clipboard.writeText(window.location.href).then(function(){var el=document.getElementById("ph-share-lbl");if(el){el.textContent="Link copied!";setTimeout(function(){el.textContent="Copy share link";},2500);}});}function phEmbed(){var runId=window.location.pathname.split("/").pop().replace(".html","");var embedUrl="https://pantherahive.com/embed/"+runId;var code='';navigator.clipboard.writeText(code).then(function(){var el=document.getElementById("ph-embed-lbl");if(el){el.textContent="Embed code copied!";setTimeout(function(){el.textContent="Get Embed Code";},2500);}});}