
ComfyUI Node: IF LLM List Models πŸ“š

Class Name

IF_LLM_ListModels

Category
ImpactFramesπŸ’₯🎞️/IF_LLM
Author
impactframes (Account age: 3185 days)
Extension
IF_LLM
Last Updated
2025-04-09
Github Stars
0.12K

How to Install IF_LLM

Install this extension via the ComfyUI Manager by searching for IF_LLM:
  1. Click the Manager button in the main menu.
  2. Select the Custom Nodes Manager button.
  3. Enter IF_LLM in the search bar.
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.


IF LLM List Models πŸ“š Description

Retrieve a comprehensive list of models from Large Language Model (LLM) providers for easy selection and exploration.

IF LLM List Models πŸ“š:

The IF_LLM_ListModels node returns a comprehensive list of the models available from a specified Large Language Model (LLM) provider. It is particularly useful for AI artists and developers who want to explore and compare models from providers such as OpenAI, Anthropic, and Ollama without manually querying each provider's API. The node handles API key management automatically and formats the output in a user-friendly way, giving you a clear overview of the available options and supporting informed decisions when integrating or experimenting with different LLMs in your projects.

IF LLM List Models πŸ“š Input Parameters:

llm_provider

The llm_provider parameter specifies the provider from which you want to list available models. It accepts a list of predefined provider names such as "ollama", "llamacpp", "kobold", "lmstudio", "textgen", "groq", "gemini", "openai", "anthropic", "mistral", "transformers", "xai", and "deepseek". The default value is "ollama". This parameter is crucial as it determines the source of the models being listed, and selecting the correct provider ensures that you receive the relevant models for your needs.

base_ip

The base_ip parameter defines the IP address of the server hosting the LLM provider's API. It is a string value with a default of "localhost". This parameter is important for establishing a connection to the correct server, especially when the API is hosted on a local or remote server. Ensuring the correct IP address is specified will facilitate successful communication with the provider's API.

port

The port parameter specifies the port number used to connect to the LLM provider's API. It is a string value with a default of "11434". This parameter is essential for directing the connection to the correct service endpoint on the server. Using the correct port number is necessary to ensure that the API requests are routed correctly and that the node can retrieve the model information.

external_api_key

The external_api_key parameter allows you to provide an API key for authenticating requests to the LLM provider. It is a string value with a default of an empty string. This parameter is vital for accessing providers that require authentication. If not provided, the node attempts to retrieve the API key from the environment variables. Supplying a valid API key ensures that you can access the models from providers that enforce security measures.
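As a rough sketch of this fallback behavior (not the node's actual source; the `PROVIDER_API_KEY` environment-variable naming and the helper name are assumptions):

```python
import os

def resolve_api_key(external_api_key="", llm_provider="openai"):
    """Prefer an explicitly supplied key; otherwise fall back to an
    environment variable (naming convention is an assumption)."""
    if external_api_key:
        return external_api_key
    return os.getenv(f"{llm_provider.upper()}_API_KEY", "")
```

With this logic, leaving external_api_key empty lets the node pick up credentials you have already exported in your shell.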

refresh

The refresh parameter is a boolean value that indicates whether to refresh the list of models. It has a default value of False. This parameter is useful when you want to ensure that the list of models is up-to-date, especially if there have been recent changes or updates to the models offered by the provider. Setting this parameter to True forces the node to fetch the latest model information.
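To illustrate how the connection parameters above fit together, here is a minimal sketch that queries an Ollama server for its installed models. Ollama exposes its model list at the `/api/tags` route on port 11434 by default; the helper names are hypothetical and the node's internal implementation may differ:

```python
import json
import urllib.request

def build_tags_url(base_ip="localhost", port="11434"):
    # Mirror the node's defaults: Ollama usually listens on port 11434.
    return f"http://{base_ip}:{port}/api/tags"

def parse_tags_response(data):
    # Ollama's /api/tags returns {"models": [{"name": "..."}, ...]}.
    return [m["name"] for m in data.get("models", [])]

def list_ollama_models(base_ip="localhost", port="11434"):
    # Network call; only succeeds when an Ollama server is reachable.
    with urllib.request.urlopen(build_tags_url(base_ip, port), timeout=10) as resp:
        return parse_tags_response(json.load(resp))
```

Other providers in the llm_provider list use different endpoints and authentication, which is why the node abstracts this away behind a single selection.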

IF LLM List Models πŸ“š Output Parameters:

model_list

The model_list output parameter is a string that contains a formatted list of available models from the specified LLM provider. This output is crucial as it provides a clear and organized view of the models you can choose from, helping you make informed decisions about which models to use in your projects. The list includes model names and is saved to a file for easy reference, ensuring that you have access to this information even after the node execution is complete.
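A formatted output like this could be assembled along the following lines (a sketch only; the node's exact layout and header wording are assumptions):

```python
def format_model_list(provider, models):
    """Join model names into the single newline-separated string the
    node emits (header wording and bullet style are assumptions)."""
    header = f"Available models for {provider}:"
    return "\n".join([header] + [f"- {m}" for m in models])
```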

IF LLM List Models πŸ“š Usage Tips:

  • Ensure that you select the correct llm_provider to get the relevant models for your needs. This will help you avoid unnecessary API calls and ensure you are working with the right set of models.
  • If you encounter issues with model retrieval, check that the base_ip and port are correctly set to match the server hosting the LLM provider's API. This will help in establishing a successful connection.
  • Use the external_api_key parameter to provide a valid API key if the provider requires authentication. This will ensure that you have the necessary permissions to access the models.
  • Set the refresh parameter to True if you suspect that the list of models has changed or if you want to ensure you have the most current information.

IF LLM List Models πŸ“š Common Errors and Solutions:

Error fetching models for <llm_provider>: <error_message>

  • Explanation: This error occurs when there is an issue retrieving the models from the specified LLM provider. The error message provides additional details about the specific problem encountered, such as network issues, incorrect API key, or server errors.
  • Solution: Verify that the llm_provider, base_ip, and port parameters are correctly set. Ensure that the API key is valid and correctly provided. Check your network connection and try again. If the problem persists, consult the provider's documentation for troubleshooting tips.
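The documented error format suggests a wrapper along these lines (a hypothetical sketch; `safe_list_models` and the `fetch` callable are illustration only, not the node's API):

```python
def safe_list_models(provider, fetch):
    """Run a fetch callable and surface any failure in the documented
    'Error fetching models for <provider>: <message>' form."""
    try:
        return fetch()
    except Exception as e:
        return f"Error fetching models for {provider}: {e}"
```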

IF LLM List Models πŸ“š Related Nodes

Go back to the extension to check out more related nodes.
IF_LLM
Copyright 2025 RunComfy. All Rights Reserved.