
ComfyUI Node: 🌐 Remote Text Model Config (Ollama/Nexa/LM Studio)

Class Name

RemoteAPIConfig

Category
🤖 GGUF-VLM/💬 Text Models
Author
walke2019 (Account age: 2,560 days)
Extension
Qwen2.5-VL GGUF Nodes
Last Updated
2025-12-17
GitHub Stars
0.03K

How to Install Qwen2.5-VL GGUF Nodes

Install this extension via the ComfyUI Manager by searching for Qwen2.5-VL GGUF Nodes:
  • 1. Click the Manager button in the main menu
  • 2. Select the Custom Nodes Manager button
  • 3. Enter Qwen2.5-VL GGUF Nodes in the search bar
After installation, click the Restart button to restart ComfyUI. Then, manually refresh your browser to clear the cache and access the updated list of nodes.


🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Description

The RemoteAPIConfig node streamlines remote API setup for text models served by platforms such as Ollama, Nexa, and LM Studio.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio):

The RemoteAPIConfig node configures remote APIs for text models, supporting Nexa SDK, Ollama, and LM Studio. It maps the selected API type to its corresponding service, establishes a connection to the chosen endpoint, and packages the result as a configuration that downstream nodes can consume. This lets you switch between models and services without extensive technical knowledge, making it easier to use a variety of text models in AI-driven workflows.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Input Parameters:

base_url

The base_url parameter specifies the root URL of the remote API service you wish to connect to. It is crucial for establishing a connection to the correct server and ensuring that requests are directed to the appropriate endpoint. There are no minimum or maximum values for this parameter, but the value must be a valid URL. There is no default, as it depends on the service you are connecting to.
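A quick way to sanity-check a base_url before wiring it into the node is to verify it has a scheme and a host. The helper below is illustrative only, not part of the node's actual implementation:

```python
from urllib.parse import urlparse

def is_valid_base_url(base_url: str) -> bool:
    """Rough sanity check: the URL must have an http(s) scheme and a host."""
    parsed = urlparse(base_url)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

# "http://localhost:11434" (Ollama's documented default) passes;
# "localhost:1234" fails because the scheme is missing.
```

A value that fails this check will never connect, so it is worth validating before troubleshooting anything else.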

api_type

The api_type parameter determines the type of API service you are configuring. It maps to specific services such as "Nexa SDK," "Ollama," and "LM Studio," allowing the node to adjust its configuration accordingly. This parameter is essential for ensuring compatibility with the chosen service. The available options are "Nexa SDK," "Ollama," and "LM Studio," with "Ollama" being the default if not specified.
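Conceptually, the api_type choice selects a service and its default endpoint. The mapping below is a hypothetical sketch; the Ollama and LM Studio ports are their documented defaults, the Nexa SDK port is an assumption, and the actual keys used inside RemoteAPIConfig may differ:

```python
# Assumed defaults -- verify against your local setup.
DEFAULT_ENDPOINTS = {
    "Nexa SDK": "http://localhost:8000",   # assumption
    "Ollama": "http://localhost:11434",    # Ollama's documented default port
    "LM Studio": "http://localhost:1234",  # LM Studio's documented default port
}

def resolve_base_url(api_type: str, base_url: str = "") -> str:
    """Fall back to a per-service default when no base_url is supplied."""
    if api_type not in DEFAULT_ENDPOINTS:
        raise ValueError(f"Invalid API type: {api_type}")
    return base_url or DEFAULT_ENDPOINTS[api_type]
```

An explicit base_url always wins; the defaults only matter when the field is left empty.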

model

The model parameter allows you to select the specific model you wish to use within the chosen API service. This parameter is dynamic and can be updated by refreshing the model list, ensuring you have access to the latest models available. There are no predefined minimum or maximum values, as the options depend on the models supported by the selected API service.
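Refreshing the model list amounts to querying the service's model endpoint and extracting names from the response. Ollama's GET /api/tags returns a "models" array keyed by "name", while OpenAI-compatible servers such as LM Studio return a "data" array keyed by "id" from GET /v1/models. The sketch below assumes those response shapes; verify them against your server:

```python
def extract_model_names(payload: dict, api_type: str) -> list:
    """Pull model names out of a model-list response (assumed shapes)."""
    if api_type == "Ollama":
        # Ollama: {"models": [{"name": "qwen2.5:7b"}, ...]}
        return [m["name"] for m in payload.get("models", [])]
    # OpenAI-compatible (LM Studio): {"data": [{"id": "..."}, ...]}
    return [m["id"] for m in payload.get("data", [])]
```

Either way, the extracted names are what populate the node's model dropdown after a refresh.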

system_prompt

The system_prompt parameter is an optional input that allows you to provide a system-level prompt or instruction to the model. This can be used to guide the model's behavior or output in a specific direction. The default value is an empty string, and it supports multiline input to accommodate more complex instructions.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Output Parameters:

model_config

The model_config output parameter provides a comprehensive configuration object for the selected remote API model. It includes details such as the mode of operation, base URL, API type, model name, system prompt, and service availability status. This output is crucial for verifying that the configuration is correct and that the service is ready for use.
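The description above suggests a configuration object along these lines. The field names here are assumptions inferred from the listed contents, not the node's exact schema:

```python
def build_model_config(base_url, api_type, model,
                       system_prompt="", available=True):
    """Assemble a model_config-style dict (illustrative field names)."""
    return {
        "mode": "remote",            # mode of operation
        "base_url": base_url,        # root URL of the service
        "api_type": api_type,        # "Nexa SDK" | "Ollama" | "LM Studio"
        "model": model,              # selected model name
        "system_prompt": system_prompt,
        "available": available,      # service availability status
    }
```

Downstream nodes would read this object rather than reconnecting to the service themselves.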

status_info

The status_info output parameter offers a textual summary of the configuration status, indicating whether the connection to the remote API service was successful. It provides valuable feedback on the availability and readiness of the service, helping you quickly identify any issues that may need attention.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Usage Tips:

  • Ensure that the base_url is correctly set to the API service you intend to use, as an incorrect URL will prevent successful connection.
  • Regularly refresh the model list to access the latest models available for your chosen API service, ensuring you are working with the most up-to-date options.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Common Errors and Solutions:

Service unavailable: {base_url}

  • Explanation: This error indicates that the remote API service at the specified base URL is not accessible or not running.
  • Solution: Verify that the service is operational and that the base URL is correct. Ensure that there are no network issues preventing access to the service.
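One way to verify that a service is operational is a minimal reachability probe: try an HTTP request against the base URL and treat any response, even an error status, as proof that something is listening. This is a rough sketch, not the node's actual health check:

```python
import urllib.error
import urllib.request

def service_available(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if anything answers at base_url within the timeout."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # refused, unreachable, or timed out
```

If this returns False for a service you believe is running, check that it is bound to the expected host and port and that no firewall is blocking the connection.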

Invalid API type: {api_type}

  • Explanation: The specified API type does not match any of the supported services, leading to a configuration error.
  • Solution: Check the api_type parameter to ensure it matches one of the supported options: "Nexa SDK," "Ollama," or "LM Studio." Adjust the parameter as needed to align with a valid API type.

🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Related Nodes

Go back to the extension to check out more related nodes.
Qwen2.5-VL GGUF Nodes
RunComfy
Copyright 2025 RunComfy. All Rights Reserved.
