
ComfyUI Node: Load Diffusion Model (BNB 4-bit)

Class Name: BNB4bitUNETLoader
Category: loaders/quantized
Author: silveroxides (Account age: 0 days)
Extension: ComfyUI-QuantOps
Last Updated: 2026-03-22
GitHub Stars: 0.04K

How to Install ComfyUI-QuantOps

Install this extension via the ComfyUI Manager by searching for ComfyUI-QuantOps:
  • 1. Click the Manager button in the main menu
  • 2. Select the Custom Nodes Manager button
  • 3. Enter ComfyUI-QuantOps in the search bar
  • 4. Click Install next to ComfyUI-QuantOps in the results
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.

Visit ComfyUI Online for a ready-to-use ComfyUI environment

  • Free trial available
  • 16GB VRAM to 80GB VRAM GPU machines
  • 400+ preloaded models/nodes
  • Freedom to upload custom models/nodes
  • 200+ ready-to-run workflows
  • 100% private workspace with up to 200GB storage
  • Dedicated Support

Run ComfyUI Online

Load Diffusion Model (BNB 4-bit) Description

Loads UNET/diffusion models quantized to the BNB 4-bit format, reducing memory usage while preserving generation quality and speed.

Load Diffusion Model (BNB 4-bit):

The BNB4bitUNETLoader loads UNET or diffusion models that have been quantized to the BNB (bitsandbytes) 4-bit format, specifically the NF4 and FP4 variants. This node benefits users who need to run large models with reduced memory usage and computational load without a significant loss in output quality. Dequantization is performed in pure PyTorch during the forward pass, so the quantized weights are expanded on the fly rather than stored at full precision. The node automatically detects the model type, such as Flux or Flux2, from the state dictionary keys, allowing for seamless integration and operation. This makes it practical for AI artists to use advanced diffusion models in their creative workflows while keeping resource usage under control.
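The NF4 dequantization step described above can be sketched in plain Python. This is a minimal illustration, not the node's implementation: the real node operates on PyTorch tensors, and while the 16-entry codebook below is the published NF4 table from bitsandbytes, the nibble packing order and block size here are assumptions.

```python
# Illustrative sketch of NF4 dequantization. The real node works on PyTorch
# tensors; the nibble order and block size below are assumptions.

# The 16 NF4 quantization levels (normalized to [-1, 1]), as published
# in the bitsandbytes library.
NF4_CODEBOOK = [
    -1.0, -0.6961928009986877, -0.5250730514526367, -0.39491748809814453,
    -0.28444138169288635, -0.18477343022823334, -0.09105003625154495, 0.0,
    0.07958029955625534, 0.16093020141124725, 0.24611230194568634,
    0.33791524171829224, 0.44070982933044434, 0.5626170039176941,
    0.7229568362236023, 1.0,
]

def dequantize_nf4(packed: bytes, absmax: list[float],
                   block_size: int = 64) -> list[float]:
    """Expand packed 4-bit codes back to floats, one absmax scale per block."""
    out: list[float] = []
    for byte in packed:
        for idx in (byte >> 4, byte & 0x0F):  # two 4-bit codes per byte
            scale = absmax[len(out) // block_size]
            out.append(NF4_CODEBOOK[idx] * scale)
    return out

# Byte 0x0F packs codes 0 (-1.0) and 15 (1.0); with an absmax of 2.0 the
# dequantized pair is [-2.0, 2.0].
print(dequantize_nf4(bytes([0x0F]), [2.0]))
```

Each block of weights stores only 4-bit codebook indices plus a single full-precision absolute-maximum scale, which is where the memory savings come from.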

Load Diffusion Model (BNB 4-bit) Input Parameters:

unet_name

The unet_name parameter specifies the name of the UNET model to load. It is required and is used to locate the model file within the designated diffusion models directory, so it determines which model the node loads and processes. There is no numeric range; the value must exactly match one of the available model filenames in that directory.
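The exact-match requirement can be sketched as a simple directory lookup. This is illustrative only: the function name, directory argument, and error message are hypothetical, and ComfyUI actually resolves model paths through its own folder-path helpers.

```python
import os

def resolve_unet_path(unet_name: str, models_dir: str) -> str:
    """Return the full path for unet_name, requiring an exact filename match.

    Hypothetical helper for illustration; ComfyUI uses its own path
    resolution internally.
    """
    available = sorted(os.listdir(models_dir))
    if unet_name not in available:
        raise FileNotFoundError(
            f"{unet_name!r} not found in {models_dir}; available: {available}"
        )
    return os.path.join(models_dir, unet_name)
```

A near-miss name (wrong extension, stray whitespace) fails the lookup outright, which is why the usage tips below stress matching the filename exactly.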

model_type_override

The model_type_override parameter allows you to manually specify the type of model to be loaded, overriding the automatic detection feature. This is an optional parameter with several options: "auto", "flux2", "flux", "chroma", "chroma_radiance", and "chroma_radiance_x0". By default, it is set to "auto", which enables the node to detect the model type based on the state dictionary keys. This parameter is useful if you have specific knowledge about the model type and want to ensure the correct configuration is applied.
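The automatic detection can be imagined as prefix checks on the state dictionary keys. The key patterns below are hypothetical examples of how such detection could work, not the loader's actual rules:

```python
def detect_model_type(state_dict_keys) -> str:
    """Guess the model family from state-dict key patterns.

    The key prefixes below are hypothetical illustrations, not the
    loader's actual detection logic.
    """
    keys = set(state_dict_keys)
    if any(k.startswith("distilled_guidance_layer.") for k in keys):
        return "chroma"
    if any(k.startswith("double_blocks.") for k in keys):
        return "flux"
    return "auto"  # undetermined; leave it to model_type_override
```

When detection like this is ambiguous for a given checkpoint, setting model_type_override explicitly is the safer choice.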

Load Diffusion Model (BNB 4-bit) Output Parameters:

MODEL

The output parameter MODEL represents the loaded and configured diffusion model. This output is crucial as it provides the AI artist with a ready-to-use model that has been optimized for performance through BNB 4-bit quantization. The model can then be used in various creative applications, leveraging its capabilities for generating high-quality outputs while maintaining efficient resource usage.

Load Diffusion Model (BNB 4-bit) Usage Tips:

  • Ensure that the unet_name matches exactly with the model file name in the diffusion models directory to avoid loading errors.
  • Use the model_type_override parameter if you have specific knowledge about the model type to ensure the correct configuration is applied, especially if automatic detection might not be reliable for your specific model.

Load Diffusion Model (BNB 4-bit) Common Errors and Solutions:

Failed to import HybridBNB4bitOps

  • Explanation: This error occurs when the node is unable to import the HybridBNB4bitOps module, which is essential for the operation of the node.
  • Solution: Ensure that the HybridBNB4bitOps module is correctly installed and accessible in the environment. Check for any installation issues or path errors that might prevent the module from being imported.

Load Diffusion Model (BNB 4-bit): missing keys

  • Explanation: This warning indicates that some expected keys were not found in the state dictionary during the model loading process.
  • Solution: Verify that the model file is complete and not corrupted. Ensure that the model was correctly quantized and saved in the expected format.

Load Diffusion Model (BNB 4-bit): unexpected keys

  • Explanation: This warning suggests that there are additional keys in the state dictionary that were not expected by the loader.
  • Solution: Check if the model file is compatible with the loader and that it was quantized using the correct method. If necessary, re-quantize the model using the appropriate tools and settings.

Load Diffusion Model (BNB 4-bit) Related Nodes

Go back to the extension to check out more related nodes.
ComfyUI-QuantOps
Copyright 2025 RunComfy. All Rights Reserved.