Retrieval-grounded prompt refinement as a Python library, CLI, and MCP server
Updated Apr 13, 2026 - Python
Automated prompt refinement pipeline that iteratively optimises LLM classification prompts using a classifier → evaluator → optimiser loop, targeting precision and recall thresholds on Azure OpenAI.
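The classifier → evaluator → optimiser loop described above can be sketched as follows. This is a minimal illustration, not code from the listed repository: the function names (`evaluate`, `refine`), the thresholds, and the callback signatures are all assumptions, and the actual pipeline targets Azure OpenAI rather than the stubbed callables used here.

```python
# Hypothetical sketch of a classifier -> evaluator -> optimiser loop.
# All names and defaults are illustrative, not taken from any listed repo.

def evaluate(predictions, labels):
    """Return (precision, recall) for binary predictions."""
    tp = sum(p and y for p, y in zip(predictions, labels))
    fp = sum(p and not y for p, y in zip(predictions, labels))
    fn = sum((not p) and y for p, y in zip(predictions, labels))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def refine(prompt, classify, optimise, examples, labels,
           min_precision=0.9, min_recall=0.9, max_rounds=5):
    """Iterate until the prompt meets both thresholds or rounds run out.

    classify(prompt, example) -> bool would wrap the LLM classifier;
    optimise(prompt, precision, recall) -> str would wrap the LLM rewrite step.
    """
    for _ in range(max_rounds):
        preds = [classify(prompt, x) for x in examples]
        precision, recall = evaluate(preds, labels)
        if precision >= min_precision and recall >= min_recall:
            break  # both thresholds met; stop refining
        prompt = optimise(prompt, precision, recall)
    return prompt, precision, recall
```

In a real pipeline, `classify` and `optimise` would each be a model call; the loop structure (score on a labelled set, rewrite if below threshold) is the part the tagline describes.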
A universal, client-side AI prompt engineering tool that enhances your prompts using local or cloud-based AI models. Transform basic prompts into detailed, professional-grade instructions without sending your data to third-party servers.
MCP server that refines your coding prompts with real codebase context before your AI agent acts on them.
A collaborative LLM system that uses OpenAI and Google Gemini to iteratively refine prompts until both models agree on the quality. Features file I/O, markdown comparison, and robust error handling.
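The "refine until both models agree" idea above reduces to a simple control loop. The sketch below is an assumption about the shape of such a system, not the repository's actual code: `refine_until_agreement`, the scoring callbacks, and the threshold are all hypothetical, and real model calls are replaced by plain functions.

```python
# Hypothetical sketch of a two-model agreement loop; model calls are stubbed.
# Function names, threshold, and round limit are illustrative assumptions.

def refine_until_agreement(prompt, rewrite, score_a, score_b,
                           threshold=8, max_rounds=4):
    """Rewrite the prompt until both judges score it >= threshold.

    rewrite(prompt) -> str would wrap one LLM's refinement call;
    score_a / score_b -> number would wrap each model's quality rating.
    """
    for _ in range(max_rounds):
        if score_a(prompt) >= threshold and score_b(prompt) >= threshold:
            return prompt  # both models agree the prompt is good enough
        prompt = rewrite(prompt)
    return prompt  # round budget exhausted; return best effort
```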
Claude Code skill that silently refines every prompt for better results (auto mode) and lets you generate optimized prompts as text with /optimize (manual mode).
Collaborative prompt refinement tool: the AI asks questions, you provide answers, and together you build better prompts.
A framework for distilling abstract structures and presenting them with precise language. For precision prompting and communication.
MMPS - Multi-Modal Prompt Structuring System