
ComfyUI Node: FL AceStep LLM Loader

Class Name

FL_AceStep_LLMLoader

Category
FL AceStep/Loaders
Author
filliptm (Account age: 0 days)
Extension
ComfyUI-FL-AceStep-Training
Last Updated
2026-03-19
GitHub Stars
0.1K

How to Install ComfyUI-FL-AceStep-Training

Install this extension via the ComfyUI Manager by searching for ComfyUI-FL-AceStep-Training:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter ComfyUI-FL-AceStep-Training in the search bar.
After installation, click the Restart button to restart ComfyUI, then manually refresh your browser to clear the cache and load the updated list of nodes.

FL AceStep LLM Loader Description

Loads the ACE-Step 5Hz LLM for audio understanding and auto-labeling tasks.

FL AceStep LLM Loader:

The FL_AceStep_LLMLoader node loads the ACE-Step 5Hz Language Model (LLM) used for audio understanding and auto-labeling. The loaded model automatically generates captions, metadata, and lyrics from audio samples, streamlining dataset preparation for AI artists and improving the accuracy of audio semantic analysis. The node supports multiple model variants, so you can trade off output quality against VRAM and compute requirements to suit your project and hardware.

FL AceStep LLM Loader Input Parameters:

model_name

The model_name parameter specifies which variant of the ACE-Step 5Hz Language Model to load. It offers three options: acestep-5Hz-lm-1.7B, acestep-5Hz-lm-0.6B, and acestep-5Hz-lm-4B. The 1.7B model is the default and provides balanced performance; the 0.6B model is lightweight and suitable for environments with limited resources; the 4B model offers the highest quality but requires the most VRAM. The variant you select directly affects the node's speed, output quality, and resource consumption.
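The trade-off between variants can be sketched as a simple VRAM-based selection helper. The variant names are the node's actual model_name options, but the VRAM figures and the `pick_variant` helper are illustrative assumptions, not published requirements:

```python
# Hypothetical sketch: choosing an ACE-Step 5Hz LLM variant by VRAM budget.
# The VRAM figures below are illustrative assumptions, not measured values.
VARIANTS = {
    "acestep-5Hz-lm-0.6B": 4,   # lightweight; assumed ~4 GB VRAM
    "acestep-5Hz-lm-1.7B": 8,   # default, balanced; assumed ~8 GB VRAM
    "acestep-5Hz-lm-4B": 16,    # highest quality; assumed ~16 GB VRAM
}

def pick_variant(available_vram_gb: float) -> str:
    """Return the largest variant whose assumed VRAM need fits the budget."""
    fitting = [(need, name) for name, need in VARIANTS.items()
               if need <= available_vram_gb]
    if not fitting:
        raise ValueError("Not enough VRAM for any variant")
    return max(fitting)[1]
```

For example, a 10 GB budget would select the default 1.7B variant under these assumed figures.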

device

The device parameter determines the hardware on which the model will be executed. Options include auto, cuda, and cpu. The auto setting automatically selects cuda if a compatible GPU is available, otherwise it defaults to cpu. Choosing the right device can significantly affect the speed and efficiency of model execution, with cuda generally providing faster processing times due to GPU acceleration.
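The auto behavior described above can be sketched as a small resolver. CUDA availability is injected as a parameter here for testability; the real node would presumably call torch.cuda.is_available() instead:

```python
# Minimal sketch of the "auto" device option described above (an assumption
# about the node's internals, with CUDA availability injected for testing).
def resolve_device(device: str, cuda_available: bool) -> str:
    """Return the concrete device: "auto" picks cuda when available, else cpu."""
    if device == "auto":
        return "cuda" if cuda_available else "cpu"
    return device  # explicit "cuda" or "cpu" is passed through unchanged
```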

backend

The backend parameter specifies the computational backend used for model execution, with options pt (PyTorch) and vllm. The pt backend is the default and the most widely supported; the vllm backend may offer faster inference through the vLLM engine, provided it is installed and supported by your hardware.

checkpoint_path

The checkpoint_path parameter allows you to specify a custom directory path for the model checkpoint files. If left empty, the node will automatically download the necessary files to the default models directory. This parameter is useful for users who prefer to manage their model files manually or need to use a specific version of the model stored locally.
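The fallback behavior of checkpoint_path can be sketched as follows. The default directory name here is an illustrative assumption; only the "empty path means auto-download to the default models directory" rule comes from the description above:

```python
# Sketch of the checkpoint_path fallback: an empty path means "download into
# the default models directory"; a non-empty path is used as-is.
# DEFAULT_MODELS_DIR is an assumed location, not the node's documented default.
from pathlib import Path

DEFAULT_MODELS_DIR = Path("models") / "acestep_llm"

def resolve_checkpoint_dir(checkpoint_path: str, model_name: str) -> Path:
    if checkpoint_path.strip():
        return Path(checkpoint_path)  # user-managed local checkpoint
    # Empty path: the node would download model files here on first use.
    return DEFAULT_MODELS_DIR / model_name
```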

FL AceStep LLM Loader Output Parameters:

llm

The llm output parameter is the loaded ACE-Step Language Model instance, ready for use in audio understanding and auto-labeling tasks. Pass it to subsequent nodes to perform semantic analysis and generate descriptive metadata from audio inputs.

FL AceStep LLM Loader Usage Tips:

  • Ensure that your system has sufficient VRAM if you plan to use the acestep-5Hz-lm-4B model, as it requires more resources compared to the other variants.
  • Utilize the auto setting for the device parameter to automatically leverage GPU acceleration if available, which can significantly speed up model processing times.
  • Consider using the checkpoint_path parameter to specify a local directory if you have pre-downloaded model files, which can save time and bandwidth during the model loading process.

FL AceStep LLM Loader Common Errors and Solutions:

Failed to ensure LLM: <status>

  • Explanation: This error occurs when the node is unable to download or verify the specified language model.
  • Solution: Check your internet connection and ensure that the specified checkpoint_path is correct and accessible. If the path is empty, verify that the default models directory is writable and has sufficient space.
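The checks suggested in the solution can be run as a small diagnostic script. The function name and the free-space threshold are illustrative assumptions, not part of the node:

```python
# Hypothetical diagnostic for the "Failed to ensure LLM" case: verify that the
# models directory exists, is writable, and has free space. The min_free_gb
# default is an illustrative assumption.
import os
import shutil

def check_models_dir(path: str, min_free_gb: float = 20.0) -> list:
    problems = []
    if not os.path.isdir(path):
        problems.append(f"{path} does not exist")
        return problems
    if not os.access(path, os.W_OK):
        problems.append(f"{path} is not writable")
    free_gb = shutil.disk_usage(path).free / 1e9
    if free_gb < min_free_gb:
        problems.append(f"only {free_gb:.1f} GB free (< {min_free_gb} GB)")
    return problems
```

An empty list means the directory passed all three checks.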

Failed to load LLM: <error_message>

  • Explanation: This error indicates that there was an issue during the model loading process, possibly due to incompatible hardware or software configurations.
  • Solution: Ensure that your system meets the necessary hardware requirements, such as having a compatible GPU if using cuda. Also, verify that all dependencies, such as PyTorch, are correctly installed and up to date.

FL AceStep LLM Loader Related Nodes

Go back to the extension to check out more related nodes.
ComfyUI-FL-AceStep-Training
RunComfy
Copyright 2025 RunComfy. All Rights Reserved.
