
LlamaGen

Use Llama.cpp with Neovim

Local LLMs in Neovim: llamagen.nvim

Requires

  • Llama.cpp with any local model (GGUF models are recommended)
  • curl
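For reference, a llama.cpp server can be started like this. This is a minimal sketch: the binary name and flags assume a recent llama.cpp build, the model path is a placeholder, and the port should match the `port` option in the setup below.

	# Start llama.cpp's HTTP server on the port LlamaGen will connect to
	llama-server -m /path/to/model.gguf --host localhost --port 1123

	# Verify the server is reachable
	curl http://localhost:1123/health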

Installation

Install with your favorite plugin manager, e.g. lazy.nvim

Example with Lazy

	-- Custom Parameters (with defaults)
	{
		"MHD-GDev/LlamaGen.nvim",
		dependencies = {
			"nvim-lualine/lualine.nvim",
		},
		config = function()
			require("llamagen").setup({
				quit_map = "q",
				retry_map = "<c-r>",
				accept_map = "<c-cr>",
				host = "localhost",
				port = "1123",
				display_mode = "float",
				show_prompt = true,
				show_model = false,
				no_auto_close = false,
				json_response = true,
				result_filetype = "markdown",
				debug = false,
			})

			-- Key mappings
			vim.keymap.set({ "n", "v" }, "<leader>]", ":Llamagen<CR>")
			vim.keymap.set("n", "<leader>gc", "<CMD>Llamagen Chat<CR>", { noremap = true })
			vim.keymap.set("n", "<leader>gg", "<CMD>Llamagen Generate<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gD", ":'<,'>Llamagen Document_Code<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gx", ":'<,'>Llamagen Explain_Code<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gC", ":'<,'>Llamagen Change_Code<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>ge", ":'<,'>Llamagen Enhance_Code<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gR", ":'<,'>Llamagen Review_Code<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gs", ":'<,'>Llamagen Summarize<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>ga", ":'<,'>Llamagen Ask<CR>", { noremap = true })
			vim.keymap.set("v", "<leader>gF", ":'<,'>Llamagen Fix_Code<CR>", { noremap = true })
			vim.keymap.set("n", "<leader>gl", "<CMD>GenLoadModel<CR>", { noremap = true })
			vim.keymap.set("n", "<leader>gu", "<CMD>GenUnloadModel<CR>", { noremap = true })
		end,
	},

Usage

IMPORTANT NOTE:

  • Before using the plugin, change the default model path in init.lua to the path of your own models directory.

Use the :Llamagen command to generate text based on predefined and customizable prompts.

Example key maps:

vim.keymap.set({ 'n', 'v' }, '<leader>]', ':Llamagen<CR>')

You can also directly invoke it with one of the predefined prompts or your custom prompts:

vim.keymap.set('v', '<leader>]', ':Llamagen Enhance_Grammar_Spelling<CR>')

After a conversation begins, the entire context is sent to the LLM, which lets you ask follow-up questions with

:Llamagen Chat

Once the window is closed, you start a fresh conversation.

For prompts that don't automatically replace the previously selected text (replace = false), press <c-cr> to replace the selection with the generated output.

Note:

Before generating, load a model with :GenLoadModel; unload it with :GenUnloadModel when you are done.

Custom Prompts

All prompts are defined in require('llamagen').prompts; you can modify them or add your own.

Example:

require('llamagen').prompts['Elaborate_Text'] = {
  prompt = "Elaborate the following text:\n$text",
  replace = true
}
require('llamagen').prompts['Fix_Code'] = {
  prompt = "Fix the following code. Only output the result in format ```$filetype\n...\n```:\n```$filetype\n$text\n```",
  replace = true,
  extract = "```$filetype\n(.-)```"
}

You can use the following properties per prompt:

  • prompt: (string | function) Prompt either as a string or a function which should return a string. The result can use the following placeholders:
    • $text: Visually selected text or the content of the current buffer
    • $filetype: File type of the buffer (e.g. javascript)
    • $input: Additional user input
    • $register: Value of the unnamed register (yanked text)
  • replace: true if the selected text should be replaced with the generated output
  • extract: Regular expression used to extract the generated result
  • model: The model to use, default: local-model
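As noted above, a prompt can also be a function returning a string. A minimal sketch (the prompt name and branching logic here are illustrative; only the prompt, replace, and $text placeholder semantics are documented above):

require('llamagen').prompts['Translate_To_English'] = {
  -- Build the prompt string at call time, e.g. based on the current filetype
  prompt = function()
    if vim.bo.filetype == "markdown" then
      return "Translate the following markdown to English, preserving formatting:\n$text"
    end
    return "Translate the following text to English:\n$text"
  end,
  replace = true,
}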

Tip

User selections can be delegated to Telescope with telescope-ui-select.
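A minimal setup sketch for that, assuming telescope.nvim and telescope-ui-select.nvim are installed:

require("telescope").setup({
  extensions = {
    ["ui-select"] = {
      require("telescope.themes").get_dropdown({}),
    },
  },
})
-- Route vim.ui.select (used for plugin selection prompts) through Telescope
require("telescope").load_extension("ui-select")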
