Ollama Chat Model node common issues#

Here are some common errors and issues with the Ollama Chat Model node and steps to resolve or troubleshoot them.

Processing parameters#

The Ollama Chat Model node is a sub-node. Sub-nodes behave differently than other nodes when processing multiple items using expressions.

Most nodes, including root nodes, take any number of items as input, process these items, and output the results. You can use expressions to refer to input items, and the node resolves the expression for each item in turn. For example, given an input of five name values, the expression `{{ $json.name }}` resolves to each name in sequence.

In sub-nodes, the expression always resolves to the first item. For example, given an input of five name values, the expression `{{ $json.name }}` always resolves to the first name.
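The sketch below models this difference in plain TypeScript. It isn't Localmind Automate source code, just an illustration of how the same expression resolves against a list of input items in a regular node versus a sub-node:

```typescript
// Illustrative only: the shape of an input item carrying a json payload.
type Item = { json: { name: string } };

const items: Item[] = [
  { json: { name: "Alice" } },
  { json: { name: "Bob" } },
  { json: { name: "Carol" } },
];

// Regular node: {{ $json.name }} resolves once per input item.
const regularNode = items.map((item) => item.json.name);
// ["Alice", "Bob", "Carol"]

// Sub-node: {{ $json.name }} always resolves against the first item.
const subNode = items.map(() => items[0].json.name);
// ["Alice", "Alice", "Alice"]

console.log(regularNode, subNode);
```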

Can't connect to a remote Ollama instance#

The Ollama Chat Model node is designed to connect only to a locally hosted Ollama instance. It doesn't include the authentication features you'd need to connect to a remotely hosted Ollama instance.

To use the Ollama Chat Model, follow the Ollama credentials instructions to set up Ollama locally and configure the instance URL in Localmind Automate.
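If you're unsure whether your local instance is running before you configure the credentials, you can probe it directly. This minimal check assumes Node.js 18+ (for the global `fetch` API) and Ollama's default port 11434; `/api/tags` is Ollama's endpoint for listing locally pulled models:

```typescript
// Probe the local Ollama instance before configuring credentials in
// Localmind Automate. Assumes Ollama's default base URL.
const baseUrl = "http://127.0.0.1:11434";

async function checkOllama(): Promise<void> {
  // GET /api/tags lists the models available on the local instance.
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status}`);
  }
  console.log(await res.json());
}

checkOllama().catch((err) => console.error("Ollama is not reachable:", err));
```

If this fails on the same machine where Localmind Automate runs, fix the local Ollama setup first; no credential configuration will work until the instance is reachable.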

Error: connect ECONNREFUSED ::1:11434#

This error occurs when your computer has IPv6 enabled, but Ollama is listening on an IPv4 address: the `localhost` alias resolves to the IPv6 loopback address `::1`, where nothing is listening.

To fix this, change the base URL in your Ollama credentials to `http://127.0.0.1:11434`. This connects to `127.0.0.1`, the IPv4-specific loopback address, instead of the `localhost` alias, which can resolve to either IPv4 or IPv6.
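To confirm this is the cause rather than a stopped server, you can probe both addresses. This diagnostic sketch assumes Node.js 18+ (global `fetch`) and the default port; if the IPv4 loopback succeeds while `localhost` fails with `ECONNREFUSED ::1:11434`, the base-URL fix above applies:

```typescript
// Probe a URL and report either the HTTP status or the connection error.
async function probe(url: string): Promise<void> {
  try {
    const res = await fetch(url);
    console.log(`${url} -> HTTP ${res.status}`);
  } catch (err) {
    console.log(`${url} -> ${(err as Error).message}`);
  }
}

async function main(): Promise<void> {
  await probe("http://127.0.0.1:11434"); // IPv4 loopback, where Ollama listens
  await probe("http://localhost:11434"); // may resolve to the IPv6 loopback ::1
}

main();
```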