ComfyUI Node: LLM Xml Prompt Formatter

Class Name

LLM_Prompt_Formatter

Category
NewBie LLM Formatter
Author
SuzumiyaAkizuki (Account age: 0 days)
Extension
ComfyUI-NewBie-LLM-Formatter
Last Updated
2026-03-21
GitHub Stars
0.04K

How to Install ComfyUI-NewBie-LLM-Formatter

Install this extension via the ComfyUI Manager by searching for ComfyUI-NewBie-LLM-Formatter:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter ComfyUI-NewBie-LLM-Formatter in the search bar.
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.

LLM Xml Prompt Formatter Description

Formats and processes LLM prompts and responses, extracting reasoning for clarity.

LLM Xml Prompt Formatter:

The LLM_Prompt_Formatter is a specialized node that formats prompts for large language models (LLMs) and processes their responses. Its primary purpose is to interpret LLM output: it identifies any embedded reasoning or thought-process content, separates it from the main response, and returns both parts in a structured form. This is particularly useful with reasoning-capable models, whose raw output interleaves deliberation with the final answer; by splitting the two, the node exposes the model's decision-making process while keeping the response itself clean and actionable.
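The split the node performs can be illustrated with a minimal sketch. This is not the extension's actual code; it assumes reasoning is wrapped in a tag such as `<think>`, while the real node may use different markers:

```python
import re

# Assumption: reasoning is delimited by <think>...</think> tags.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(raw_response: str) -> tuple[str, str]:
    """Return (full_response, reasoning) with reasoning blocks removed."""
    reasoning = "\n".join(m.strip() for m in THINK_RE.findall(raw_response))
    full_response = THINK_RE.sub("", raw_response).strip()
    return full_response, reasoning

text = "<think>The user wants a cat.</think>A fluffy cat sitting on a windowsill."
full, why = split_reasoning(text)
```

Here `full` carries the cleaned answer and `why` carries the extracted deliberation, mirroring the node's two outputs.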

LLM Xml Prompt Formatter Input Parameters:

response

The response parameter is the input from the LLM that the node processes. It contains the full output from the LLM, including the main content, any reasoning or thought processes, and token usage information. This parameter is crucial as it serves as the raw data that the node will format and analyze. There are no specific minimum, maximum, or default values for this parameter, as it directly depends on the LLM's output.

LLM Xml Prompt Formatter Output Parameters:

full_response

The full_response parameter is the main output of the node, representing the formatted and cleaned response from the LLM. It includes the primary content after any embedded reasoning or thought processes have been extracted and removed. This output is essential for users who need a clear and concise response from the LLM without additional reasoning content.

reasoning

The reasoning parameter provides the extracted reasoning or thought process from the LLM's response. This output is particularly valuable for users who wish to understand the underlying logic or considerations that the LLM used to generate its response. It offers insights into the model's decision-making process, which can be crucial for tasks requiring transparency and interpretability.
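When the LLM emits no reasoning markers at all, a formatter of this kind degrades gracefully: the main content passes through unchanged and the reasoning output is empty. A minimal sketch (assuming a `<think>`-style tag, which may differ from the node's real markers):

```python
import re

def extract(raw: str, tag: str = "think") -> tuple[str, str]:
    """Split raw LLM text into (full_response, reasoning)."""
    pat = re.compile(rf"<{tag}>(.*?)</{tag}>", re.DOTALL)
    reasoning = "\n".join(m.strip() for m in pat.findall(raw))
    return pat.sub("", raw).strip(), reasoning

# No reasoning markers: the text passes through and reasoning is empty.
full, why = extract("Just the answer, no reasoning markers.")
```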

LLM Xml Prompt Formatter Usage Tips:

  • Ensure that the response parameter is correctly populated with the LLM's output to allow the node to function effectively.
  • Use the reasoning output to gain insights into the LLM's decision-making process, which can be particularly useful for complex queries or when transparency is required.

LLM Xml Prompt Formatter Common Errors and Solutions:

"LLM API 返回了 NoneType (返回内容为空)。"

  • Explanation: This error occurs when the LLM API returns a NoneType, indicating that the response content is empty.
  • Solution: Verify that the LLM is correctly configured and that the input prompt is valid. If the issue persists, consider retrying the request or checking the LLM's status.
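Retrying the request can be automated with a small wrapper. The helper below is hypothetical (not part of the extension) and simply re-invokes the LLM call when it yields `None`:

```python
import time

def call_llm_with_retry(call, retries=3, delay=2.0):
    """Invoke `call` until it returns non-None, up to `retries` attempts."""
    for attempt in range(retries):
        result = call()
        if result is not None:
            return result
        time.sleep(delay)  # brief back-off before the next attempt
    raise RuntimeError("LLM API returned empty content after all retries")
```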

"LLM API 的回复可能被截断。"

  • Explanation: This error suggests that the LLM's response might have been truncated, potentially due to size limitations or network issues.
  • Solution: Check the network connection and ensure that the LLM is capable of handling the size of the response. If necessary, adjust the prompt to reduce the response size or increase the LLM's output capacity.
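If your LLM backend is OpenAI-compatible, truncation can often be detected from the `finish_reason` field each choice reports: `"length"` means generation stopped at the token limit. The field name is an assumption about your backend, so treat this as a sketch:

```python
# "length" = hit max_tokens (likely truncated); "stop" = completed normally.
def looks_truncated(choice: dict) -> bool:
    return choice.get("finish_reason") == "length"
```

Raising the `max_tokens` setting on the upstream LLM node is the usual fix when this flag fires.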

LLM Xml Prompt Formatter Related Nodes

Go back to the extension to check out more related nodes.
ComfyUI-NewBie-LLM-Formatter