
tOllamaClient

Availability note: Beta
Accesses the LangChain4j Ollama API. With this component, you can use Ollama models and their different capabilities.

tOllamaClient Standard properties

These properties are used to configure tOllamaClient running in the Standard Job framework.

The Standard tOllamaClient component belongs to the AI family.

Information note:
  • This component is available only when you have installed the 8.0.1-R2025-02 Talend Studio monthly update or a later one delivered by Talend. For more information, check with your administrator.
  • As of now, the component only supports Llama 3.x models.

Prerequisites

Before using this component, you need to:

  • Set up the Ollama environment following the Ollama README instructions.
  • Install a Llama 3.x model in the Ollama environment you set up by running the ollama run llama3.x command (replace 3.x with the version of your choice), as described in the Ollama README instructions.

Basic settings

Property type Either Built-in or Repository.
  • Built-in: No property data stored centrally.
  • Repository: Select the repository file in which the properties are stored. The fields that follow are completed automatically using the data retrieved.
Schema and Edit Schema A schema is a row description; it defines the number of fields to be processed and passed on to the next component. The schema is either Built-in or stored remotely in the Repository.
  • Built-in: You create and store the schema locally for this component only. For more information about a component schema in its Basic settings tab, see Basic settings tab.

  • Repository: You have already created the schema and stored it in the Repository. You can reuse it in various projects and Job designs. For more information about a component schema in its Basic settings tab, see Basic settings tab.

Click Edit schema to make changes to the schema. If the current schema is of the Repository type, three options are available:

  • View schema: choose this option to view the schema only.

  • Change to built-in property: choose this option to change the schema to Built-in for local changes.

  • Update repository connection: choose this option to change the schema stored in the repository and decide whether to propagate the changes to all the Jobs upon completion.

    If you just want to propagate the changes to the current Job, you can select No upon completion and choose this schema metadata again in the Repository Content window.

Host

Type in the URL of the Ollama API server you want to access. By default, it is http://localhost:11434/.
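Under the hood, calls to this host go to Ollama's standard REST API. As a rough sketch of the kind of request involved (this illustrates the public Ollama /api/generate endpoint, not the component's internal code; the model name is an example):

```python
def build_generate_request(host: str, model: str, prompt: str):
    """Build the URL and JSON-serializable payload for Ollama's /api/generate endpoint."""
    url = host.rstrip("/") + "/api/generate"
    payload = {
        "model": model,    # e.g. "llama3.1"
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response rather than a token stream
    }
    return url, payload

url, payload = build_generate_request(
    "http://localhost:11434/", "llama3.1", "Say hello in one word."
)
```

Serializing the payload with json.dumps and POSTing it to the URL with any HTTP client returns the model's completion in the response's "response" field.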
Model To select one of the available Llama models, click the [...] button next to the field. In the dialog box displayed, select the model that will be used or select the Use custom value check box and specify the model name in the Custom value field.

As of now, the component supports Llama 3.x models.
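The models offered by the [...] button correspond to those installed on the Ollama server, which you can also list yourself through Ollama's public /api/tags endpoint. A minimal sketch (the helper and host constant are illustrative, assuming the default host):

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434/"  # assumed default; match your Host field

def parse_model_names(tags_response: dict) -> list:
    """Extract installed model names from the JSON returned by GET /api/tags."""
    return [m["name"] for m in tags_response.get("models", [])]

try:
    with urllib.request.urlopen(OLLAMA_HOST.rstrip("/") + "/api/tags", timeout=2) as resp:
        print(parse_model_names(json.load(resp)))
except OSError:
    print("Ollama server not reachable")  # e.g. the server is not running
```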

Prompt Enter your instructions in this field. For more information on prompt writing best practices, read How to prompt Code Llama.

Example: List the top 10 cities corresponding to the given countries, only include city names in the answer.
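How the instruction in the Prompt field is combined with incoming data is a design detail of your Job, but the idea can be pictured as simple string composition. The helper below is purely illustrative (it is not the component's internal code) and assumes the input rows carry country names:

```python
def compose_prompt(instruction: str, countries: list) -> str:
    """Append per-row input values to the fixed instruction from the Prompt field."""
    return instruction + "\nCountries: " + ", ".join(countries)

final_prompt = compose_prompt(
    "List the top 10 cities corresponding to the given countries, "
    "only include city names in the answer.",
    ["France", "Japan"],
)
```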

Advanced settings

tStatCatcher Statistics

Select this check box to gather the Job processing metadata at a Job level and at each component level.

Global Variables

ERROR_MESSAGE: the error message generated by the component when an error occurs. This is an After variable and it returns a string. This variable functions only if the Die on error check box is cleared, if the component has this check box.

Usage

Usage rules This component cannot handle dynamic columns.
The component's performance depends on the following factors:
