ComfyUI Node: LLM Provider Selector (LLMToolkit)

Class Name: LLMToolkitProviderSelector
Category: llm_toolkit
Author: comfy-deploy (account age: 706 days)
Extension: ComfyUI LLM Toolkit
Last Updated: 2025-10-01
GitHub Stars: 0.08K

How to Install ComfyUI LLM Toolkit

Install this extension via the ComfyUI Manager by searching for ComfyUI LLM Toolkit:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter ComfyUI LLM Toolkit in the search bar.
After installation, click the Restart button to restart ComfyUI. Then manually refresh your browser to clear the cache and load the updated list of nodes.


LLM Provider Selector (LLMToolkit) Description

A versatile node for selecting and configuring LLM providers in ComfyUI, streamlining integration and enhancing workflow flexibility.

LLM Provider Selector (LLMToolkit):

The LLMToolkitProviderSelector is a versatile node for selecting and configuring Language Model (LLM) providers within the ComfyUI framework. It lets you choose from a wide array of supported LLM providers and models, and ensures that the necessary configuration, such as API keys and network settings, is correctly set up for seamless integration with downstream nodes. The node is particularly useful when you need to switch between providers or models: it validates API keys and generates a configuration dictionary that other nodes in the workflow can consume. By offering a centralized point for provider and model selection, the LLMToolkitProviderSelector enhances the flexibility and efficiency of your AI art generation projects.
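The selection-and-configuration flow described above can be sketched roughly as follows. This is an illustrative sketch only; the function name, supported-provider set, and dictionary keys are hypothetical stand-ins, not the node's actual internals.

```python
import os

# Hypothetical sketch of how a provider selector might assemble its
# configuration dictionary; names and keys are illustrative only.
SUPPORTED_PROVIDERS = {"transformers", "openai", "huggingface", "ollama", "llamacpp"}

def build_provider_config(llm_provider, llm_model,
                          base_ip="localhost", port="11434",
                          external_api_key=""):
    if llm_provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Provider not supported: {llm_provider}")

    # An explicitly supplied key overrides anything set in the environment.
    api_key = external_api_key or os.environ.get(
        f"{llm_provider.upper()}_API_KEY", "")

    config = {
        "provider": llm_provider,
        "model": llm_model,
        "api_key": api_key,
    }
    # Local providers need a network endpoint rather than a hosted API.
    if llm_provider in {"ollama", "llamacpp"}:
        config["base_url"] = f"http://{base_ip}:{port}"
    return config
```

Downstream nodes would then read the returned dictionary instead of each re-resolving keys and endpoints, which is the centralization benefit the node provides.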

LLM Provider Selector (LLMToolkit) Input Parameters:

llm_provider

The llm_provider parameter allows you to select from a list of supported LLM providers, such as "transformers", "openai", "huggingface", and more. This selection determines which provider's models and configurations will be used in your project. The default value is "transformers", and choosing the right provider is crucial as it impacts the available models and the required configurations.

llm_model

The llm_model parameter specifies the model to be used from the selected provider. It is a string input that updates dynamically based on the chosen provider and connection status. The default value is "Provider not selected or models not fetched", indicating that a model needs to be selected once the provider is set.

base_ip

The base_ip parameter is used to specify the IP address for local providers. It is particularly relevant for providers that require a local setup, such as "ollama" or "llamacpp". The default value is "localhost", and it is essential for ensuring that the node can communicate with the local provider correctly.

port

The port parameter defines the port number for local providers. Similar to base_ip, this is crucial for establishing a connection with providers that operate locally. The default port is "11434", and it should be adjusted according to the specific requirements of the local provider setup.
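Because a misconfigured base_ip or port is a common source of silent failures with local providers, a quick reachability check before running the workflow can save debugging time. The helper below is an illustrative sketch, not part of the node; it simply tests whether anything is listening at the given address and port (e.g. Ollama's default 11434).

```python
import socket

# Illustrative helper: returns True if a TCP server is listening at
# base_ip:port (e.g. a local Ollama instance on its default port 11434).
def endpoint_reachable(base_ip="localhost", port=11434, timeout=1.0):
    try:
        with socket.create_connection((base_ip, int(port)), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unresolvable hosts.
        return False
```

If this returns False for your local provider, verify that the server process is running and that base_ip and port in the node match its actual bind address.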

external_api_key

The external_api_key parameter allows you to provide an API key directly, which can override the key set in the environment or .env file. This is optional but useful if you need to use a different API key for specific tasks or providers. It is a string input and should be kept secure.
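The override behavior described here amounts to a simple precedence rule: an explicit key beats whatever the environment (or a loaded .env file) provides. A minimal sketch of that rule, with an illustrative function name and environment-variable name:

```python
import os

# Illustrative key-resolution precedence: an explicit external_api_key
# wins; otherwise fall back to an environment variable (which a .env
# loader would typically have populated).
def resolve_api_key(external_api_key, env_var="OPENAI_API_KEY"):
    if external_api_key:
        return external_api_key
    return os.environ.get(env_var, "")
```

This is why supplying external_api_key is handy for testing or multi-account setups: it changes the key for one workflow run without touching your environment.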

context

The context parameter is a flexible input that can accept various types of data. It is designed to provide additional context or configuration details that might be needed by the node or downstream processes. This parameter enhances the node's adaptability to different use cases.

LLM Provider Selector (LLMToolkit) Output Parameters:

context

The context output parameter provides a configuration dictionary that contains all the necessary settings and credentials for the selected LLM provider and model. This output is crucial for downstream nodes, as it ensures they have the correct information to interact with the chosen provider and model effectively. The context output encapsulates the provider's configuration, making it easier to manage and integrate into larger workflows.
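To make the idea concrete, the context output can be pictured as a nested dictionary like the one below. The key names here are illustrative assumptions, not the toolkit's actual schema; the point is that everything a downstream node needs travels in one object.

```python
# Illustrative shape of the context dictionary passed downstream;
# actual key names in the toolkit may differ.
context = {
    "provider_config": {
        "provider": "openai",
        "model": "gpt-4o-mini",
        "api_key": "sk-...",   # placeholder; resolved from input or environment
        "base_url": None,      # set only for local providers like ollama
    }
}

# A downstream node reads only the settings it needs:
provider = context["provider_config"]["provider"]
model = context["provider_config"]["model"]
```

Carrying configuration this way means downstream nodes stay provider-agnostic: they read from the context instead of hard-coding credentials or endpoints.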

LLM Provider Selector (LLMToolkit) Usage Tips:

  • Ensure that you select the correct llm_provider and llm_model to match your project's requirements, as this will affect the models and capabilities available to you.
  • Use the external_api_key parameter to quickly switch between different API keys without modifying your environment settings, which can be useful for testing or using multiple accounts.
  • When working with local providers, double-check the base_ip and port settings to ensure that the node can establish a connection successfully.

LLM Provider Selector (LLMToolkit) Common Errors and Solutions:

Failed to import llmtoolkit_utils

  • Explanation: This error occurs when the node cannot import necessary utility functions from the llmtoolkit_utils module, possibly due to missing files or errors in the module.
  • Solution: Verify that the llmtoolkit_utils module exists in the correct directory and is free of errors. Ensure that the module is included in the system path.
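One way to address the "not on the system path" part of this fix is to add the extension's folder to sys.path before the import runs. The directory name below is an example only; adjust it to wherever the module actually lives in your ComfyUI install.

```python
import os
import sys

# Example path; substitute the real location of llmtoolkit_utils in
# your ComfyUI custom_nodes directory.
toolkit_dir = os.path.join("ComfyUI", "custom_nodes", "comfyui-llm-toolkit")
if toolkit_dir not in sys.path:
    sys.path.insert(0, toolkit_dir)
```

After the path is registered, retry the import; if it still fails, the module itself likely contains an error, which its traceback will point to.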

Provider not supported

  • Explanation: This error indicates that the selected llm_provider is not among the supported providers listed in the node.
  • Solution: Check the list of supported providers and select one that is available. If you need a specific provider, ensure it is correctly integrated into the system.

Invalid API key

  • Explanation: This error suggests that the provided API key is incorrect or not authorized for the selected provider.
  • Solution: Double-check the API key for accuracy and ensure it has the necessary permissions for the provider. If using an environment variable, verify that it is set correctly.

LLM Provider Selector (LLMToolkit) Related Nodes

Go back to the extension to check out more related nodes.
ComfyUI LLM Toolkit