Professional Resume Builder

Here is the tailored cover letter for a Mid-level Data Scientist role in the Tech industry.


Professional Cover Letter: Data Scientist (Mid-Level, Tech Industry)

This cover letter is designed to complement your resume, highlighting your mid-level experience and specific skills relevant to a Data Scientist role in the Tech industry. Remember to customize the bracketed [placeholders] with specific details about the company, the role, and your own achievements.

Actionable Steps for Customization:

  1. Research the Company: Find out the company's mission, recent projects, tech stack, and values. Weave these into your opening and closing paragraphs.
  2. Identify Hiring Manager: If possible, find the name of the hiring manager or the head of the data science department. Addressing the letter directly is always better.
  3. Review Job Description: Pinpoint 3-5 key skills or requirements from the job description and ensure you explicitly address them in your body paragraphs, using examples from your experience.
  4. Quantify Achievements: Replace generic statements with specific numbers, percentages, or metrics that demonstrate the impact of your work.
  5. Proofread: Always proofread for typos and grammatical errors.

Cover Letter Template

[Your Name]
[Your Address] | [Your City, State, Zip Code]
[Your Phone Number] | [Your Email Address]
[Your LinkedIn Profile URL (Optional, but recommended)]

[Date]

[Hiring Manager Name (if known), or "Hiring Team"]
[Hiring Manager Title (if known)]
[Company Name]
[Company Address]
[Company City, State, Zip Code]

**Subject: Application for Data Scientist Position - [Your Name]**

Dear [Mr./Ms./Mx. Last Name of Hiring Manager, or "Hiring Team"],

I am writing to express my enthusiastic interest in the Data Scientist position at [Company Name], as advertised on [Platform where you saw the advertisement, e.g., LinkedIn, company website]. With [Number] years of dedicated experience in leveraging data to drive strategic decisions and optimize processes within dynamic tech environments, I am confident that my analytical skills, machine learning expertise, and passion for innovation align perfectly with the requirements of this role and the pioneering spirit of [Company Name].

During my tenure at [Previous Company Name], I honed my skills in the full data science lifecycle, from data collection and preprocessing to model deployment and interpretation. A key achievement involved [briefly describe a specific project, e.g., "developing a predictive model that improved customer churn prediction accuracy by 15%," or "designing an A/B testing framework that optimized user engagement for a flagship product"]. This project required robust Python programming, advanced statistical analysis, and the application of [mention specific ML algorithms, e.g., "XGBoost and Random Forest classifiers"]. I am adept at working with large, complex datasets and possess strong proficiency in SQL, Python/R, and data visualization tools like Tableau/Power BI.

My experience extends to [mention another relevant skill/area, e.g., "working with cloud platforms such as AWS/Azure/GCP for scalable data solutions," or "contributing to data pipeline development and ETL processes"]. I am particularly drawn to [Company Name]'s work in [mention a specific project, product, or area of innovation at the company, e.g., "AI-driven personalized experiences" or "sustainable tech solutions"], and I am eager to contribute my problem-solving abilities to such impactful initiatives. I thrive in collaborative, fast-paced settings and am committed to translating complex data insights into actionable business strategies that drive tangible growth.

Thank you for considering my application. I have attached my resume for your review, which provides further detail on my qualifications and project experience. I am enthusiastic about the opportunity to discuss how my skills and experience can contribute to [Company Name]'s continued success. I look forward to hearing from you soon.

Sincerely,

[Your Name]

Professional Resume: Data Scientist (Mid-Level, Tech Industry)

Here is a tailored resume draft for a Mid-Level Data Scientist in the Tech industry. This draft is designed to highlight relevant skills, experience, and achievements, making it immediately useful for your job search.


[Your Name]

[Your Phone Number] | [Your Email] | [Your LinkedIn Profile URL] | [Your Portfolio/GitHub URL (Highly Recommended)]


Professional Summary

Highly analytical and results-oriented Data Scientist with 5+ years of experience in the Tech industry, specializing in developing, deploying, and optimizing machine learning models to solve complex business problems. Proven ability to translate data into actionable insights, drive data-driven decision-making, and collaborate effectively with cross-functional teams. Seeking to leverage expertise in predictive modeling, statistical analysis, and big data technologies to contribute to innovative projects at [Target Company Name].


Skills

Programming Languages: Python (Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch), R, SQL, Java (basic)

Machine Learning: Supervised/Unsupervised Learning, Deep Learning, NLP, Time Series Analysis, Reinforcement Learning, A/B Testing, Model Evaluation & Optimization

Data Warehousing & Big Data: SQL, NoSQL (MongoDB, Cassandra), Spark, Hadoop, Kafka, Snowflake

Cloud Platforms: AWS (S3, EC2, SageMaker, Lambda), Google Cloud Platform (BigQuery, AI Platform), Azure

Data Visualization & BI: Tableau, Power BI, Matplotlib, Seaborn, Plotly

Tools & Libraries: Git, Docker, Kubernetes (basic), Jupyter Notebook, Airflow, MLflow

Statistical Analysis: Hypothesis Testing, Regression Analysis, Causal Inference, Bayesian Statistics

Soft Skills: Problem-Solving, Cross-Functional Collaboration, Communication, Data Storytelling, Mentorship, Project Management


Professional Experience

Senior Data Scientist | [Tech Company Name], [City, State]

[Month, Year] – Present

  • Led the development and deployment of a real-time recommendation engine using collaborative filtering and deep learning (TensorFlow), resulting in a 15% increase in user engagement and 10% uplift in conversion rates.
  • Designed and implemented A/B tests to evaluate new product features and marketing strategies, providing data-driven insights that informed product roadmap decisions and led to a 7% improvement in key KPIs.
  • Built and maintained ETL pipelines for large-scale datasets using Spark and Airflow, improving data availability and quality for analytical teams by 20%.
  • Mentored junior data scientists on best practices in model development, code review, and MLOps, fostering a culture of continuous learning and improvement.
  • Presented complex analytical findings and model performance to non-technical stakeholders and executive leadership, facilitating strategic business decisions.
  • Developed predictive models for customer churn using gradient boosting (XGBoost), achieving an AUC score of 0.88 and enabling targeted retention campaigns that reduced churn by 12%.
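
If you cite a quantified bullet like the churn model above, be ready to sketch the workflow in an interview. The following is a minimal, hypothetical illustration, not part of the resume itself: the data is synthetic, and scikit-learn's `GradientBoostingClassifier` stands in for XGBoost to keep the example dependency-light.

```python
# Illustrative churn-model workflow (synthetic data, hypothetical names).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a customer table: 20 behavioral features,
# binary churn label.
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# AUC on held-out data -- the metric quoted in the resume bullet.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC: {auc:.2f}")
```

Being able to walk through training, held-out evaluation, and what the AUC means for a retention campaign is what makes a number like "0.88" credible.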

Data Scientist | [Previous Tech Company Name], [City, State]

[Month, Year] – [Month, Year]

  • Developed and validated machine learning models for fraud detection using Python and Scikit-learn, reducing fraudulent transactions by $X million annually.
  • Performed extensive exploratory data analysis (EDA) on large, unstructured datasets to identify trends, anomalies, and opportunities for product improvement.
  • Collaborated with engineering teams to integrate machine learning models into production systems, ensuring scalability and reliability.
  • Automated data extraction and reporting processes using SQL and Python scripts, saving approximately 10 hours per week for the analytics team.
  • Contributed to the design and analysis of experiments, providing statistical rigor to product development cycles.
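
For the report-automation bullet above, interviewers often ask what "automated with SQL and Python" looked like in practice. A minimal sketch using only the standard library (table and column names are hypothetical):

```python
# Illustrative report automation: aggregate with SQL, export with Python.
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, revenue REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, 9.99), (1, 4.50), (2, 20.00), (3, 0.0)],
)

# The aggregation that previously required manual spreadsheet work.
rows = conn.execute(
    "SELECT user_id, COUNT(*) AS n_events, SUM(revenue) AS total "
    "FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()

# Write the weekly report; scheduling (cron, Airflow) would wrap this script.
with open("weekly_report.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user_id", "n_events", "total_revenue"])
    writer.writerows(rows)

print(rows)
```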

Education

Master of Science in Data Science | [University Name], [City, State]

[Month, Year] – [Month, Year]

  • Relevant Coursework: Machine Learning, Statistical Modeling, Big Data Analytics, Deep Learning, Natural Language Processing

Bachelor of Science in Computer Science | [University Name], [City, State]

[Month, Year] – [Month, Year]

  • Minor: Mathematics

Projects (Optional, but highly recommended for demonstrating practical skills)

[Project Name] | [Link to GitHub Repo/Demo]

  • Developed a [brief description of project, e.g., "sentiment analysis model for social media data"] using [technologies used, e.g., "Python, NLTK, and TensorFlow"].
  • Achieved [quantifiable result, e.g., "an F1-score of 0.85 on unseen data"] and deployed the model as a [e.g., "Flask API"].
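
A quantified result like "an F1-score of 0.85" is easy to back up if you can show how it was computed. A minimal sketch with scikit-learn (the labels below are hypothetical stand-ins for a sentiment model's held-out predictions):

```python
# Illustrative F1 evaluation for a hypothetical sentiment classifier.
from sklearn.metrics import f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # gold sentiment labels
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]   # model output on unseen data

score = f1_score(y_true, y_pred)
print(f"F1: {score:.2f}")
```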

[Project Name] | [Link to GitHub Repo/Demo]

  • Built a [brief description of project, e.g., "predictive maintenance system for industrial machinery"] utilizing [technologies used, e.g., "time series analysis, PyTorch, and AWS SageMaker"].
  • Demonstrated [key learning/impact, e.g., "the ability to predict equipment failures 2 weeks in advance with 90% accuracy"].

Certifications / Awards (Optional)

  • AWS Certified Machine Learning – Specialty
  • [Relevant Kaggle Competition Ranking / Other Industry Certifications]

Recommendations for Customization:

  1. Quantify Everything: Review your past roles and replace "[X]" placeholders with actual numbers. How much did you increase/decrease? By what percentage? Over what period? This is the most critical aspect for a mid-level role.
  2. Tailor to Job Description: For each specific job application, carefully read the job description.

  • Keywords: Integrate keywords from the job posting into your summary, skills, and experience sections.

  • Prioritize: Reorder bullet points under "Professional Experience" to highlight the responsibilities and achievements most relevant to the target role.

  • Company Specifics: Research the company and incorporate their values or specific tech stack into your summary or cover letter.

  3. Action Verbs: Start each bullet point with a strong action verb (e.g., Led, Developed, Implemented, Analyzed, Optimized, Designed, Collaborated).
  4. Proofread: Meticulously check for any typos or grammatical errors. A fresh pair of eyes can be helpful.
  5. Portfolio/GitHub: For a Data Scientist role, a strong online portfolio (GitHub, personal website, Medium articles) demonstrating your coding skills and project work is highly advantageous. Ensure all links are active.
  6. Optional Sections: Only include "Projects" or "Certifications" if they add significant value and are relevant to the target role. For mid-level, strong projects can set you apart.
  7. Conciseness: Aim for a 1-2 page resume. For mid-level, 1.5 pages is often ideal, but prioritize impact over length.
  8. Formatting: Ensure consistent formatting, clear headings, and a readable font. Use a clean, professional template.

Key Strengths Highlighted for Mid-Level Data Scientist (Tech):

  • Hands-on Experience: Emphasizes practical application of data science techniques across the full lifecycle.
  • Technical Proficiency: Explicitly mentions core tools and languages (Python/R, SQL, ML algorithms, visualization tools, cloud platforms).
  • Quantifiable Impact: Encourages the use of metrics to showcase the business value of previous work.
  • Problem-Solving & Innovation: Positions the candidate as someone who can tackle complex challenges and contribute to cutting-edge projects.
  • Industry Alignment: Directly links the candidate's experience and interests to the tech industry's values and specific company initiatives.
  • Collaboration: Indicates an ability to work effectively within a team, common in tech environments.

This resume and cover letter, once customized, will effectively convey your qualifications and enthusiasm for a Data Scientist role within the tech sector.
