AI-powered B2B landing page generator using a local LLM via `@llama-node/core`.
```bash
npm install
```

This project uses `@llama-node/core` and expects a GGML model (typically `.bin`).
Auto-discovery checks these paths first:

- `models/model.bin`
- `models/model.ggml`
- `models/model.GGML`
If multiple model files exist, set an explicit path:

```powershell
# PowerShell
$env:LLM_MODEL_PATH = "models/model.bin"
```

Important:

- GGUF is not supported by this codebase/runtime.
- Some files named `.GGML` are actually GGUF internally; if the app reports `Unsupported model format: GGUF`, replace it with a true GGML `.bin` model.
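To illustrate how the discovery order and format check described above can fit together, here is a minimal sketch (the function names are hypothetical; the real logic lives in `lib/llm.ts`). It relies on the fact that GGUF files begin with the ASCII magic `GGUF`, so a mislabeled `.GGML` file can be caught before loading:

```typescript
import { existsSync, openSync, readSync, closeSync } from "node:fs";

// Candidate paths checked by auto-discovery, in order.
const CANDIDATES = ["models/model.bin", "models/model.ggml", "models/model.GGML"];

// GGUF files start with the ASCII magic "GGUF"; such a file cannot be
// loaded by a GGML-only runtime, whatever its extension claims.
export function looksLikeGguf(path: string): boolean {
  const fd = openSync(path, "r");
  try {
    const magic = Buffer.alloc(4);
    readSync(fd, magic, 0, 4, 0);
    return magic.toString("ascii") === "GGUF";
  } finally {
    closeSync(fd);
  }
}

export function resolveModelPath(): string {
  // An explicit LLM_MODEL_PATH override wins over auto-discovery.
  const explicit = process.env.LLM_MODEL_PATH;
  if (explicit && existsSync(explicit)) return explicit;
  const found = CANDIDATES.find((p) => existsSync(p));
  if (!found) throw new Error("No model file found under models/");
  if (looksLikeGguf(found)) throw new Error("Unsupported model format: GGUF");
  return found;
}
```

A mislabeled file then fails fast at startup instead of producing an opaque loader error.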
```bash
npm run dev
```

Open http://localhost:3000.
- Fill in Company Name and Short Description (required).
- Optionally add Website URL and/or paste content into Website Text / Research Notes.
- Choose a Tone.
- Click Generate Landing Page.
- View the live Preview, inspect the Content JSON, or copy/download the generated HTML.
- Each generation now uses an auto-selected visual variant (different font pair, palette, motion profile, and section treatment).
- Recent visual variants are avoided to reduce back-to-back repetition.
- If a `website_url` is provided, the API now fetches and summarizes key public HTML text (title, meta description, headings, paragraphs, list items) and merges it with pasted notes before prompting the model.
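The website research step above can be sketched roughly as follows (the function name and size cap are assumptions; the actual implementation lives in `lib/research.ts`). A real implementation would use an HTML parser, but naive regex extraction shows the idea:

```typescript
// Extract the plain-text contents of all <name>…</name> elements.
const TAG = (name: string, html: string): string[] =>
  [...html.matchAll(new RegExp(`<${name}[^>]*>([\\s\\S]*?)</${name}>`, "gi"))]
    .map((m) => m[1].replace(/<[^>]+>/g, " ").replace(/\s+/g, " ").trim())
    .filter(Boolean);

export function summarizeHtml(html: string): string {
  const title = TAG("title", html)[0] ?? "";
  const meta =
    html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i)?.[1] ?? "";
  const parts = [
    title && `Title: ${title}`,
    meta && `Description: ${meta}`,
    ...TAG("h1", html), ...TAG("h2", html), ...TAG("h3", html),
    ...TAG("p", html), ...TAG("li", html),
  ].filter(Boolean) as string[];
  // Cap the summary so the merged prompt stays within the local model's context window.
  return parts.join("\n").slice(0, 4000);
}
```

The summary is then concatenated with any pasted Website Text / Research Notes before prompting.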
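The variant rotation mentioned earlier (avoiding recently used visual variants) can be sketched like this; the variant names and history size are illustrative assumptions, and the real selection lives in `lib/showcase-template.ts`:

```typescript
// Hypothetical variant pool; the real project defines its own set of
// font pairs, palettes, motion profiles, and section treatments.
const VARIANTS = ["editorial", "gradient", "mono", "brutalist", "glass"] as const;
type Variant = (typeof VARIANTS)[number];

const RECENT_LIMIT = 2; // skip the last two variants to avoid back-to-back repeats

export function pickVariant(recent: Variant[]): Variant {
  const blocked = new Set(recent.slice(-RECENT_LIMIT));
  const pool = VARIANTS.filter((v) => !blocked.has(v));
  // Fall back to the full pool if everything happens to be blocked.
  const choices = pool.length > 0 ? pool : [...VARIANTS];
  return choices[Math.floor(Math.random() * choices.length)];
}
```

Randomness with a small exclusion window keeps output varied without needing to persist a full generation history.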
```
landingforge/
  app/
    api/generate/route.ts    <- API route: calls local LLM + website research fetch
    layout.tsx
    page.tsx
  components/
    LandingPageAgent.tsx     <- Main UI component
  lib/
    llm.ts                   <- Local LLM wrapper
    prompt.ts                <- System prompt + helpers
    research.ts              <- Website text preprocessing + URL fetch summarizer
    showcase-template.ts     <- Multi-variant landing page HTML renderer
  models/
    model.bin                <- Place your local model here (not in git)
  next.config.mjs
  package.json
  tsconfig.json
```
- Next.js 14 (App Router)
- `@llama-node/core`
- TypeScript
- React 18