ComfyUI Extension: ComfyUI-Prompt-Manager

Repo Name: ComfyUI-Prompt-Manager
Author: FranckyB
Nodes: 3
Last Updated: 2025-12-19
GitHub Stars: 30

How to Install ComfyUI-Prompt-Manager

Install this extension via the ComfyUI Manager by searching for ComfyUI-Prompt-Manager:

1. Click the Manager button in the main menu.
2. Select Custom Nodes Manager.
3. Enter ComfyUI-Prompt-Manager in the search bar and install it.

After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.

Visit ComfyUI Online for ready-to-use ComfyUI environment

  • Free trial available
  • GPU machines ranging from 16GB to 80GB VRAM
  • 400+ preloaded models/nodes
  • Freedom to upload custom models/nodes
  • 200+ ready-to-run workflows
  • 100% private workspace with up to 200GB storage
  • Dedicated Support

Run ComfyUI Online

ComfyUI-Prompt-Manager Description

ComfyUI-Prompt-Manager is a custom node for ComfyUI that lets you save prompts, organize them into categories, and reuse them easily, improving workflow efficiency.

ComfyUI-Prompt-Manager Introduction

The ComfyUI-Prompt-Manager is a versatile extension designed to enhance your experience with the ComfyUI platform. It serves as a comprehensive toolset for organizing, generating, and enhancing prompts, making it an invaluable resource for AI artists. Initially developed as a simple prompt manager to save and retrieve prompts, it has evolved into a robust solution that also aids in generating prompts. This extension can help streamline your creative process by providing a structured way to manage your prompts and leverage advanced AI models to generate detailed and contextually rich prompts.

How ComfyUI-Prompt-Manager Works

At its core, ComfyUI-Prompt-Manager integrates with the ComfyUI platform to manage and generate prompts. Rather than bundling its own inference stack, it drives an existing installation of llama.cpp, which avoids dependency conflicts with ComfyUI. The extension lets you organize prompts into categories, save and load them easily, and enhance them using local large language models (LLMs). It can analyze images to generate descriptions or expand terse text prompts into detailed ones, and it accepts up to five images at once for analysis, giving you flexibility in how visual inputs feed into prompt generation. A minimal sketch of the request flow against the local server follows.
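
The extension's exact internals aren't published in this overview, but the basic flow can be sketched against llama.cpp's server, which exposes an OpenAI-compatible /v1/chat/completions endpoint. In this minimal sketch, the port, system prompt, and helper name are illustrative assumptions rather than the extension's actual values:

```python
import json
import urllib.request

# Assumed default: a llama-server instance running locally, e.g.
#   llama-server -m Qwen3-4B-Q8_0.gguf --port 8080
# The endpoint below is llama.cpp's OpenAI-compatible chat API.
LLAMA_SERVER_URL = "http://127.0.0.1:8080/v1/chat/completions"

def enhance_prompt(basic_prompt: str) -> str:
    """Ask a local LLM to expand a short prompt into a detailed one."""
    payload = {
        "model": "local",  # llama-server serves whichever model it was started with
        "messages": [
            # Hypothetical system prompt; the extension ships its own defaults.
            {"role": "system",
             "content": "Rewrite the user's prompt as a detailed, vivid image prompt."},
            {"role": "user", "content": basic_prompt},
        ],
        "temperature": 0.7,
    }
    req = urllib.request.Request(
        LLAMA_SERVER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

print(enhance_prompt("a castle at sunset"))
```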

ComfyUI-Prompt-Manager Features

Prompt Manager

  • Category Organization: Organize your prompts into multiple categories for easy management.
  • Save & Load Prompts: Save your favorite prompts with custom names and load them quickly when needed.
  • LLM Input Toggle: Switch between text output from other nodes and your internally saved prompts.
  • Persistent Storage: All prompts are saved in your ComfyUI user folder, so they are always accessible (see the storage sketch after this list).
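
The page doesn't document the on-disk format, so the following is a minimal sketch of one plausible persistence scheme, assuming a single JSON file in the ComfyUI user folder that maps categories to named prompts; the path and schema are illustrative, not the extension's actual layout:

```python
import json
from pathlib import Path

# Hypothetical storage location and schema; the real extension defines its own.
STORE = Path("ComfyUI/user/default/prompt_manager/prompts.json")

def save_prompt(category: str, name: str, text: str) -> None:
    """Save a named prompt under a category, creating the file if needed."""
    data = json.loads(STORE.read_text()) if STORE.exists() else {}
    data.setdefault(category, {})[name] = text
    STORE.parent.mkdir(parents=True, exist_ok=True)
    STORE.write_text(json.dumps(data, indent=2))

def load_prompt(category: str, name: str) -> str:
    """Load a previously saved prompt by category and name."""
    return json.loads(STORE.read_text())[category][name]

save_prompt("landscapes", "castle", "a castle at sunset, golden light")
print(load_prompt("landscapes", "castle"))
```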

Prompt Generator

  • Three Generation Modes: Choose from enhancing text prompts, analyzing images, or analyzing images with custom instructions.
  • Prompt Enhancement: Use local LLMs to transform basic prompts into detailed descriptions.
  • Vision Analysis: Generate detailed image descriptions using Qwen3VL models (see the vision-request sketch after this list).
  • Custom Image Analysis: Provide your own instructions for image analysis.
  • JSON Output: Optionally output structured JSON with a scene breakdown.
  • Thinking Support: Enable deeper generative reasoning with Thinking models.
  • Automatic Server Management: Automatically manage the llama.cpp server, starting and stopping it as needed.
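
Vision analysis plausibly follows the same request pattern with the image attached to the message. Recent llama-server builds accept OpenAI-style multimodal content parts when launched with a vision model and its projector file; everything below (URL, instruction text, helper name) is an illustrative assumption:

```python
import base64
import json
import urllib.request

URL = "http://127.0.0.1:8080/v1/chat/completions"  # assumed local llama-server

def describe_image(image_path: str, instruction: str) -> str:
    """Send an image plus an instruction to a locally served vision model."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("ascii")
    payload = {
        "model": "local",
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": instruction},
                # The image is inlined as a base64 data URI, OpenAI-style.
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }
    req = urllib.request.Request(URL, data=json.dumps(payload).encode("utf-8"),
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

print(describe_image("input.png", "Describe this image as a detailed prompt."))
```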

Prompt Generator Options

  • Model Selection: Choose from local models or download Qwen3, Qwen3VL, and Qwen3VL Thinking models.
  • Auto-Download: Automatically download models and required files for vision models.
  • LLM Parameters: Fine-tune parameters such as temperature, top_k, and context size (a parameter sketch follows this list).
  • Custom Instructions: Override default system prompts for different enhancement styles.
  • Extra Image Inputs: Combine up to five images to generate your prompt.
  • Console Debugging: Output the entire process to the console for debugging.
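
As a rough sketch of how these options map onto llama.cpp: temperature and top_k are per-request sampling fields (top_k being a llama.cpp extension to the core OpenAI schema), while context size is fixed when the server is launched with -c/--ctx-size. The values below are illustrative, not the extension's defaults:

```python
# Illustrative parameter handling, assuming a local llama-server.
# Context size is a server-launch option (llama-server -m model.gguf -c 8192),
# so it cannot be changed per request; sampling parameters can.
request_payload = {
    "model": "local",
    "messages": [{"role": "user", "content": "a castle at sunset"}],
    # Per-request sampling parameters:
    "temperature": 0.8,   # higher = more varied wording
    "top_k": 40,          # llama.cpp extension field, not core OpenAI schema
    "max_tokens": 512,    # cap on the generated prompt length
}
```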

Preference Options

  • Set preferred models for both base and VL modes.
  • Define a new default model location.
  • Specify a custom location for llama.cpp if it is not on the system PATH (see the lookup sketch below).
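
A custom llama.cpp location presumably resolves to the llama-server executable. A minimal lookup sketch, assuming the preference is a plain directory string (the helper and fallback order are illustrative):

```python
import shutil
from pathlib import Path

def find_llama_server(custom_dir: str | None = None) -> str:
    """Locate llama-server: prefer a user-set folder, else fall back to PATH."""
    if custom_dir:
        for name in ("llama-server", "llama-server.exe"):
            candidate = Path(custom_dir) / name
            if candidate.is_file():
                return str(candidate)
    found = shutil.which("llama-server")  # standard PATH lookup
    if found:
        return found
    raise FileNotFoundError(
        "llama-server not found; install llama.cpp or set a custom path")
```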

ComfyUI-Prompt-Manager Models

The extension supports various models, including those listed below; a hedged auto-download sketch follows the list.

  • Qwen3-1.7B-Q8_0.gguf: Fastest, lowest VRAM (~2GB)
  • Qwen3-4B-Q8_0.gguf: Balanced performance (~4GB VRAM)
  • Qwen3-8B-Q8_0.gguf: Best quality, highest VRAM (~8GB)
  • Qwen3VL-4B-Instruct-Q8_0.gguf: Vision model, balanced performance (~5GB VRAM)
  • Qwen3VL-8B-Instruct-Q8_0.gguf: Vision model, best quality (~9GB VRAM)
  • Qwen3VL-4B-Thinking-Q8_0.gguf: Vision model, Thinking variant, balanced performance (~5GB VRAM)
  • Qwen3VL-8B-Thinking-Q8_0.gguf: Vision model, Thinking variant, best quality (~9GB VRAM)
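
The page doesn't say where auto-download pulls from; Qwen publishes official GGUF builds on Hugging Face, so one plausible sketch uses the huggingface_hub client. The repo id and destination folder below are assumptions for illustration:

```python
# A hedged sketch of model auto-download via the huggingface_hub client.
# The repo id and target folder are assumptions; the extension's downloader
# may use different sources and paths.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="Qwen/Qwen3-4B-GGUF",    # assumed upstream GGUF repo
    filename="Qwen3-4B-Q8_0.gguf",   # quantization from the list above
    local_dir="ComfyUI/custom_nodes/ComfyUI-Prompt-Manager/models",
)
print(path)
```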

What's New with ComfyUI-Prompt-Manager

Version 1.8.3

  • Added option to leave the Llama server running when closing ComfyUI.

Version 1.8.2

  • Enhanced model management with a custom model path preference.

Version 1.8.1

  • Added option to set a custom Llama path in preferences.

Version 1.8.0

  • Support for Qwen3VL Thinking model variants.
  • Improved model management and console output options.

Troubleshooting ComfyUI-Prompt-Manager

Problem: Prompts don't appear in the dropdown.

  • Solution: Ensure the category has saved prompts; create a new prompt if necessary.

Problem: Changes aren't saved.

  • Solution: Click the "Save Prompt" button after making changes.

Problem: Can't see LLM output in the node.

  • Solution: Ensure the LLM output is connected to the "llm_input" input and run the workflow.

Problem: "llama-server command not found".

  • Solution: Install llama.cpp and make sure llama-server is available on the command line.

Problem: "No models found".

  • Solution: Place a .gguf file in the models/ folder, or connect the Prompt Generator Options node and select a model size to download.

Problem: Server won't start.

  • Solution: Check that port 8080 is not already in use and close any existing llama-server processes (a quick port check is sketched below).
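
For the last case, here is a quick way to test whether port 8080 is already taken before launching the server; the port and host are the defaults implied above, but your setup may differ:

```python
import socket

def port_in_use(port: int = 8080, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on the given port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) == 0  # 0 means a server accepted

if port_in_use():
    print("Port 8080 is busy; close the existing llama-server process first.")
else:
    print("Port 8080 is free; the server should be able to start.")
```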

Learn More about ComfyUI-Prompt-Manager

For further assistance and resources, see the extension's GitHub repository and the ComfyUI community channels.
