Retrieve a comprehensive list of models from Large Language Model providers for easy selection and exploration.
The IF_LLM_ListModels node provides a comprehensive list of available models from a specified Large Language Model (LLM) provider. It is particularly useful for AI artists and developers who need to explore and select from the various models offered by different LLM providers. With this node you can retrieve and view the models available from providers such as OpenAI, Anthropic, and others without manually querying each provider's API; the node handles API key management automatically and formats the output in a user-friendly manner. This is essential when integrating or experimenting with different LLMs, as a clear overview of the available options aids informed decision-making.
The llm_provider parameter specifies the provider from which you want to list available models. It accepts a list of predefined provider names such as "ollama", "llamacpp", "kobold", "lmstudio", "textgen", "groq", "gemini", "openai", "anthropic", "mistral", "transformers", "xai", and "deepseek". The default value is "ollama". This parameter is crucial as it determines the source of the models being listed, and selecting the correct provider ensures that you receive the relevant models for your needs.
The base_ip parameter defines the IP address of the server hosting the LLM provider's API. It is a string value with a default of "localhost". This parameter is important for establishing a connection to the correct server, especially when the API is hosted on a local or remote server. Ensuring the correct IP address is specified will facilitate successful communication with the provider's API.
The port parameter specifies the port number used to connect to the LLM provider's API. It is a string value with a default of "11434". This parameter is essential for directing the connection to the correct service endpoint on the server. Using the correct port number is necessary to ensure that the API requests are routed correctly and that the node can retrieve the model information.
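As an illustration of how base_ip and port combine into a request URL, here is a minimal sketch for the default "ollama" provider, whose local server lists installed models at the /api/tags endpoint (the default port "11434" is Ollama's; the function names are hypothetical, not the node's actual internals):

```python
import json
import urllib.request

def build_models_url(base_ip="localhost", port="11434"):
    """Assemble the model-listing URL from the node's base_ip and port inputs."""
    return f"http://{base_ip}:{port}/api/tags"  # Ollama's model-listing route

def list_ollama_models(base_ip="localhost", port="11434"):
    """Fetch the names of models installed on a local Ollama server."""
    with urllib.request.urlopen(build_models_url(base_ip, port), timeout=10) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]
```

If the server is running elsewhere, e.g. `list_ollama_models("192.168.1.5", "8080")`, the same code targets that host and port instead.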
The external_api_key parameter allows you to provide an API key for authenticating requests to the LLM provider. It is a string value with a default of an empty string. This parameter is vital for accessing providers that require authentication. If not provided, the node attempts to retrieve the API key from the environment variables. Supplying a valid API key ensures that you can access the models from providers that enforce security measures.
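The fallback from an explicit key to the environment can be sketched as follows (the env-variable mapping and function name are illustrative assumptions, not the node's actual internals):

```python
import os

# Conventional environment-variable names for a few hosted providers
# (illustrative mapping; extend as needed).
ENV_KEYS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "mistral": "MISTRAL_API_KEY",
    "groq": "GROQ_API_KEY",
}

def resolve_api_key(llm_provider, external_api_key=""):
    """Prefer the explicitly supplied key; otherwise fall back to the environment."""
    if external_api_key:
        return external_api_key
    return os.environ.get(ENV_KEYS.get(llm_provider, ""), "")
```

Local backends such as "ollama" typically need no key, which is why an empty string is an acceptable default.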
The refresh parameter is a boolean value that indicates whether to refresh the list of models. It has a default value of False. This parameter is useful when you want to ensure that the list of models is up-to-date, especially if there have been recent changes or updates to the models offered by the provider. Setting this parameter to True forces the node to fetch the latest model information.
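The effect of the refresh flag can be sketched with a small cache (hypothetical names; the real node's caching details may differ):

```python
import time

_cache = {}  # provider name -> (fetch timestamp, model list)

def get_models(provider, fetch_fn, refresh=False, max_age=300.0):
    """Return cached models unless refresh is True or the cache entry is stale."""
    entry = _cache.get(provider)
    if not refresh and entry and time.time() - entry[0] < max_age:
        return entry[1]          # serve from cache, skipping the API call
    models = fetch_fn(provider)  # hit the provider's API for a fresh list
    _cache[provider] = (time.time(), models)
    return models
```

With refresh left at False, repeated executions reuse the cached list and avoid redundant API calls; passing refresh=True bypasses the cache.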
The model_list output parameter is a string that contains a formatted list of available models from the specified LLM provider. This output is crucial as it provides a clear and organized view of the models you can choose from, helping you make informed decisions about which models to use in your projects. The list includes model names and is saved to a file for easy reference, ensuring that you have access to this information even after the node execution is complete.
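Producing and persisting such a formatted string could look like the following sketch (function names, layout, and file path are illustrative assumptions):

```python
def format_model_list(provider, models):
    """Build a human-readable listing: a header line followed by indented model names."""
    lines = [f"Available models for {provider}:"] + [f"  - {m}" for m in models]
    return "\n".join(lines)

def save_model_list(text, path="model_list.txt"):
    """Write the formatted listing to disk so it survives the node execution."""
    with open(path, "w", encoding="utf-8") as f:
        f.write(text)
```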
Usage tips:
- Select the correct llm_provider to get the relevant models for your needs. This will help you avoid unnecessary API calls and ensure you are working with the right set of models.
- Make sure base_ip and port are correctly set to match the server hosting the LLM provider's API. This will help in establishing a successful connection.
- Use the external_api_key parameter to provide a valid API key if the provider requires authentication. This will ensure that you have the necessary permissions to access the models.
- Set the refresh parameter to True if you suspect that the list of models has changed or if you want to ensure you have the most current information.

Common errors:
<llm_provider>: <error_message>
Solution: Verify that the llm_provider, base_ip, and port parameters are correctly set. Ensure that the API key is valid and correctly provided. Check your network connection and try again. If the problem persists, consult the provider's documentation for troubleshooting tips.