ComfyUI Node: Load Prompt Batch From File

Class Name

ShrugPromptBatchFromFile

Category
Shrug Nodes/Logic
Author
fblissjr (Account age: 4014 days)
Extension
Shrug-Prompter: Unified VLM Integration for ComfyUI
Last Updated
2025-09-30
GitHub Stars
0.02K

How to Install Shrug-Prompter: Unified VLM Integration for ComfyUI

Install this extension via the ComfyUI Manager by searching for Shrug-Prompter: Unified VLM Integration for ComfyUI:
  • 1. Click the Manager button in the main menu
  • 2. Select the Custom Nodes Manager button
  • 3. Enter Shrug-Prompter: Unified VLM Integration for ComfyUI in the search bar
  • 4. Click Install on the matching entry
After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.

Load Prompt Batch From File Description

Facilitates efficient batch loading and management of prompts from a file for streamlined processing.

Load Prompt Batch From File:

The ShrugPromptBatchFromFile node loads and manages batches of prompts from a file, which makes it useful whenever you need to handle large prompt sets efficiently. It reads the specified file, splits its contents on newlines, and outputs a single batch selected by the provided index and batch size. By automating the extraction and organization of prompts, it supports structured batch processing, streamlining workflows so you can focus on creative tasks rather than manual data handling.
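The core behavior described above can be sketched in plain Python. This is an illustration only, not the extension's actual implementation; the function name and exact splitting rules are assumptions:

```python
def load_prompt_batch(text: str, index: int, batch_size: int) -> list[str]:
    """Split file contents into prompts and return one batch.

    Illustrative sketch: prompts are separated by newlines, and the
    batch starts at position index * batch_size (assumed semantics).
    """
    prompts = [line.strip() for line in text.splitlines() if line.strip()]
    start = index * batch_size
    return prompts[start:start + batch_size]

# A file with five prompts, selecting the second batch of two:
text = "a cat\na dog\na bird\na fish\na horse\n"
print(load_prompt_batch(text, index=1, batch_size=2))  # ['a bird', 'a fish']
```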

Load Prompt Batch From File Input Parameters:

filename

The filename parameter specifies the name of the file from which prompts are to be loaded. This can be a relative path within the ComfyUI input directory or an absolute path. The file should contain prompts separated by newlines. Providing the correct file path is crucial as it directly impacts the node's ability to locate and read the prompts. There are no specific minimum or maximum values for this parameter, but it must be a valid string representing a file path.
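The relative-versus-absolute path handling can be sketched as follows. The default input directory used here is a placeholder, not necessarily where ComfyUI keeps its input folder:

```python
import os

def resolve_prompt_path(filename: str, input_dir: str = "ComfyUI/input") -> str:
    """Resolve the filename parameter to a concrete path.

    Sketch under assumptions: absolute paths are used as-is, while
    relative paths are joined onto the ComfyUI input directory
    (its location here is a placeholder).
    """
    if os.path.isabs(filename):
        return filename
    return os.path.join(input_dir, filename)

print(resolve_prompt_path("prompts.txt"))       # joined onto the input directory
print(resolve_prompt_path("/tmp/prompts.txt"))  # used unchanged
```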

index

The index parameter determines the starting point for the batch of prompts to be output. It is an integer value that, when multiplied by the batch_size, specifies the position in the list of prompts from which to begin extracting the batch. This parameter is essential for navigating through the file and selecting the correct subset of prompts for processing. The minimum value is 0, and there is no explicit maximum value, but it should be within the range of available prompts.

batch_size

The batch_size parameter defines the number of prompts to be included in each batch. It is an integer that dictates how many prompts are extracted starting from the position specified by the index. This parameter is vital for controlling the size of each batch, allowing you to tailor the output to your specific needs. The minimum value is 1, and there is no explicit maximum value, but it should be reasonable given the total number of prompts in the file.

Load Prompt Batch From File Output Parameters:

outputs

The outputs value is a tuple containing the prompts extracted from the file for the specified index and batch_size, and it serves as the primary data set for subsequent operations. The tuple holds up to 16 prompts and is padded with empty strings when fewer are available, ensuring a consistent output shape.
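The fixed-width padding behavior can be sketched like this. The width of 16 comes from the description above; the function name is illustrative:

```python
def pad_batch(batch, width=16):
    """Pad a prompt batch with empty strings to a fixed width, so the
    node always emits the same number of outputs (16 per the docs)."""
    padded = list(batch[:width]) + [""] * max(0, width - len(batch))
    return tuple(padded)

result = pad_batch(["a cat", "a dog"])
print(len(result))  # 16
print(result[:3])   # ('a cat', 'a dog', '')
```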

Load Prompt Batch From File Usage Tips:

  • Ensure that the file path provided in the filename parameter is correct and accessible to avoid file not found errors.
  • Adjust the index and batch_size parameters to efficiently navigate through large files and manage prompt batches according to your workflow needs.
  • Use the node in conjunction with other nodes that can process or analyze the extracted prompts to maximize its utility.

Load Prompt Batch From File Common Errors and Solutions:

Prompt file not found at '<file_path>'

  • Explanation: This error occurs when the specified file cannot be located at the given path. It may be due to an incorrect file path or the file not being present in the expected directory.
  • Solution: Verify that the file path is correct and that the file exists in the specified location. Ensure that the path is either relative to the ComfyUI input directory or an absolute path.

No valid prompts found in the file

  • Explanation: This error indicates that the file was read successfully, but no valid prompts were found. This could happen if the file is empty or contains only whitespace.
  • Solution: Check the file to ensure it contains valid prompts separated by newlines. Remove any unnecessary whitespace or empty lines that might be affecting the prompt extraction.
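Both error cases can be guarded against with checks like the following. The error messages mirror those above, but the exception types and function name are assumptions, not the extension's actual code:

```python
import os

def read_prompts(file_path: str) -> list[str]:
    """Read and validate a prompt file, raising errors that mirror
    the two messages documented above (exception types assumed)."""
    if not os.path.isfile(file_path):
        raise FileNotFoundError(f"Prompt file not found at '{file_path}'")
    with open(file_path, encoding="utf-8") as f:
        # Drop empty and whitespace-only lines so they never count as prompts.
        prompts = [line.strip() for line in f if line.strip()]
    if not prompts:
        raise ValueError("No valid prompts found in the file")
    return prompts
```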

Load Prompt Batch From File Related Nodes

Go back to the extension to check out more related nodes.
Shrug-Prompter: Unified VLM Integration for ComfyUI
Copyright 2025 RunComfy. All Rights Reserved.
