🌐 Remote Text Model Config (Ollama/Nexa/LM Studio):
The RemoteAPIConfig node configures remote APIs for text models, supporting Nexa SDK, Ollama, and LM Studio. It maps the selected API type to its corresponding internal key and establishes a connection to the chosen service, so you can switch between different models and backends without extensive technical knowledge. The node's goal is to streamline remote API configuration: it provides a simple interface for selecting a service, picking a model, and verifying that the service is reachable, making it easier to build and manage AI-driven workflows.
🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Input Parameters:
base_url
The base_url parameter specifies the root URL of the remote API service you wish to connect to. It is crucial for establishing a connection to the correct server and ensuring that requests are directed to the appropriate endpoint. There are no specific minimum or maximum values for this parameter, but it must be a valid URL format. The default value is not set, as it depends on the service you are connecting to.
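As a rough guide, each supported backend has a typical local default address. The ports below are common defaults, not values confirmed by this node's source, so treat them as assumptions and verify against your own server setup:

```python
# Typical local default base URLs for each supported service.
# Ports are common defaults and may differ on your machine;
# the Nexa SDK port in particular is an assumption.
DEFAULT_BASE_URLS = {
    "Ollama": "http://localhost:11434",
    "LM Studio": "http://localhost:1234",
    "Nexa SDK": "http://localhost:8000",  # assumed default; check your Nexa server
}

def normalize_base_url(url: str) -> str:
    """Strip a trailing slash so endpoint paths can be appended cleanly."""
    return url.rstrip("/")
```

A trailing slash in base_url is a common source of malformed request URLs, which is why a normalization step like this is worth doing before appending endpoint paths.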
api_type
The api_type parameter determines the type of API service you are configuring. It maps to specific services such as "Nexa SDK," "Ollama," and "LM Studio," allowing the node to adjust its configuration accordingly. This parameter is essential for ensuring compatibility with the chosen service. The available options are "Nexa SDK," "Ollama," and "LM Studio," with "Ollama" being the default if not specified.
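The mapping from the displayed option to an internal key might look like the following sketch. The key names here are hypothetical, chosen only to illustrate the validation behavior described above:

```python
# Hypothetical mapping from the human-readable api_type option to an
# internal service key, mirroring the validation the node performs.
API_TYPE_KEYS = {
    "Nexa SDK": "nexa",
    "Ollama": "ollama",
    "LM Studio": "lmstudio",
}

def resolve_api_type(api_type: str = "Ollama") -> str:
    """Return the internal key for a supported api_type, or raise."""
    if api_type not in API_TYPE_KEYS:
        raise ValueError(f"Invalid API type: {api_type}")
    return API_TYPE_KEYS[api_type]
```

Note how the default argument reflects the node's documented default of "Ollama", and an unrecognized value produces the "Invalid API type" error covered in the errors section below.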
model
The model parameter allows you to select the specific model you wish to use within the chosen API service. This parameter is dynamic and can be updated by refreshing the model list, ensuring you have access to the latest models available. There are no predefined minimum or maximum values, as the options depend on the models supported by the selected API service.
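Refreshing the model list typically means querying the running server for its installed models. Ollama exposes a `GET /api/tags` endpoint for this; LM Studio serves an OpenAI-compatible `GET /v1/models` listing (assumed here for Nexa SDK as well). A minimal sketch, not the node's actual implementation:

```python
import json
import urllib.request

def parse_model_list(payload: dict, api_type: str) -> list:
    """Extract model names from a server's model-listing response."""
    if api_type == "Ollama":
        # Ollama's /api/tags returns {"models": [{"name": ...}, ...]}
        return [m["name"] for m in payload.get("models", [])]
    # OpenAI-compatible /v1/models returns {"data": [{"id": ...}, ...]}
    return [m["id"] for m in payload.get("data", [])]

def list_models(base_url: str, api_type: str) -> list:
    """Fetch available model names from a running local server."""
    path = "/api/tags" if api_type == "Ollama" else "/v1/models"
    with urllib.request.urlopen(base_url.rstrip("/") + path, timeout=5) as resp:
        return parse_model_list(json.load(resp), api_type)
```

Separating the parsing step from the network call keeps the response-format assumptions in one testable place.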
system_prompt
The system_prompt parameter is an optional input that allows you to provide a system-level prompt or instruction to the model. This can be used to guide the model's behavior or output in a specific direction. The default value is an empty string, and it supports multiline input to accommodate more complex instructions.
🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Output Parameters:
model_config
The model_config output parameter provides a comprehensive configuration object for the selected remote API model. It includes details such as the mode of operation, base URL, API type, model name, system prompt, and service availability status. This output is crucial for verifying that the configuration is correct and that the service is ready for use.
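Based on the fields described above, the configuration object might look like the following sketch; the exact key names in the actual node may differ:

```python
# Illustrative shape of the model_config output, assembled from the
# fields the documentation lists; key names are assumptions.
model_config = {
    "mode": "remote",                         # mode of operation
    "base_url": "http://localhost:11434",     # root URL of the service
    "api_type": "Ollama",                     # selected backend
    "model": "llama3",                        # chosen model name
    "system_prompt": "",                      # optional system-level prompt
    "available": True,                        # service availability status
}
```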
status_info
The status_info output parameter offers a textual summary of the configuration status, indicating whether the connection to the remote API service was successful. It provides valuable feedback on the availability and readiness of the service, helping you quickly identify any issues that may need attention.
🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Usage Tips:
- Ensure that the base_url is correctly set to the API service you intend to use, as an incorrect URL will prevent a successful connection.
- Regularly refresh the model list to access the latest models available for your chosen API service, ensuring you are working with the most up-to-date options.
🌐 Remote Text Model Config (Ollama/Nexa/LM Studio) Common Errors and Solutions:
Service unavailable: {base_url}
- Explanation: This error indicates that the remote API service at the specified base URL is not accessible or not running.
- Solution: Verify that the service is operational and that the base URL is correct. Ensure that there are no network issues preventing access to the service.
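To rule out network issues quickly, a simple reachability probe like the one below can confirm whether anything is answering at the base URL. This is a generic HTTP check, not the node's own availability test:

```python
import urllib.error
import urllib.request

def check_service(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP server answers at base_url at all.

    This only probes reachability; it does not verify that the
    server speaks the expected API.
    """
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, even if with an error status, so it's up.
        return True
    except (urllib.error.URLError, OSError):
        return False
```

A `False` result usually means the service is not running, the port is wrong, or a firewall is blocking the connection.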
Invalid API type: {api_type}
- Explanation: The specified API type does not match any of the supported services, leading to a configuration error.
- Solution: Check the api_type parameter to ensure it matches one of the supported options: "Nexa SDK," "Ollama," or "LM Studio." Adjust the parameter as needed to align with a valid API type.
