tVerticaBulkExec Standard properties
These properties are used to configure tVerticaBulkExec running in the Standard Job framework.
The Standard tVerticaBulkExec component belongs to the Databases family.
The component in this framework is available in all Talend products.
Basic settings
Database |
Select the desired database type from the list and click Apply. |
Property Type |
Select the way the connection details will be set. |
Use an existing connection |
Select this check box and, from the Component List drop-down list, select the desired connection component to reuse the connection details you already defined. When a Job contains a parent Job and a child Job and you need to share an existing connection between the two levels (for example, to share the connection created by the parent Job with the child Job), you have to register the shared connection at the parent level and reuse it at the child level. For an example of how to share a database connection across Job levels, see Sharing a database connection. |
Host |
The IP address or hostname of the database. |
Port |
The listening port number of the database. |
Database |
The name of the database. |
Schema |
The schema of the database. |
Username and Password |
The database user authentication data. To enter the password, click the [...] button next to the password field, enter the password in double quotes in the pop-up dialog box, and click OK to save the settings. |
Table |
The name of the table into which data will be written. |
Action on table |
Select an operation to be performed on the table defined. |
Use existing dynamic schema |
Select this option to use a dynamic schema generated by a tSetDynamicSchema component. This option requires at least one tSetDynamicSchema component. From the Dynamic schema drop-down list, select the component that generates the desired dynamic schema. |
Schema and Edit schema |
A schema is a row description. It defines the number of fields (columns) to be processed and passed on to the next component. When you create a Spark Job, avoid the reserved word line when naming the fields.
When the schema to be reused has default values that are integers or functions, ensure that these default values are not enclosed within quotation marks. If they are, you must remove the quotation marks manually. For more information, see Retrieving table schemas. Click Edit schema to make changes to the schema. If you make changes, the schema automatically becomes built-in. |
Use schema columns for Copy |
Select this check box to use the column option in the COPY statement so that you can restrict the load to one or more specified columns in the table. For more information, see the Vertica COPY SQL Statement. |
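As an illustration of what this option changes, the sketch below (not Talend-generated code; table, column, and file names are hypothetical) shows how a column list restricts a Vertica COPY statement to specific target columns:

```java
// Sketch: building a Vertica COPY statement with an optional column list.
// With a column list, only the listed columns are loaded; the other columns
// of the table fall back to their defaults.
public class CopyColumnsSketch {
    static String buildCopy(String table, String[] columns, String file) {
        String cols = columns.length == 0 ? "" : " (" + String.join(", ", columns) + ")";
        return "COPY " + table + cols + " FROM LOCAL '" + file + "' DELIMITER ';'";
    }

    public static void main(String[] args) {
        // Restrict the load to two columns of a hypothetical "sales" table.
        System.out.println(buildCopy("sales", new String[] {"id", "amount"}, "/tmp/sales.csv"));
    }
}
```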
File Name |
The path to the file from which data will be loaded. The file should be located on the same machine where Talend Studio is installed or where the Job using this component is deployed. This property is available only when there is no input flow. |
Compression mode |
Select the compression mode for the file from which data will be loaded. This property is available only when you are using Vertica 6.0 and later. |
Advanced settings
Additional JDBC Parameters | Specify additional JDBC parameters for the database connection created. This property is not available when the Use an existing connection check box in the Basic settings view is selected. |
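To illustrate where these parameters end up, the sketch below (not Talend-generated code; host, port, database, and parameter values are hypothetical examples) shows how key=value pairs are appended to a Vertica JDBC connection URL:

```java
// Sketch: appending additional JDBC parameters to a Vertica connection URL.
// Parameters are key=value pairs after '?', joined by '&'.
public class JdbcUrlSketch {
    static String buildUrl(String host, int port, String db, String extraParams) {
        String url = "jdbc:vertica://" + host + ":" + port + "/" + db;
        return (extraParams == null || extraParams.isEmpty()) ? url : url + "?" + extraParams;
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("localhost", 5433, "mydb",
                "ConnectionLoadBalance=1&Label=bulk_load"));
    }
}
```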
Action on data |
Select an action to be performed on the data of the table defined. |
Stream name |
The stream name of a load, which helps identify a particular load. This property is available only when you are using Vertica 6.0 and later. |
Write to ROS (Read Optimized Store) |
Select this check box to store data in a physical storage area in order to optimize read performance, as the data is compressed and pre-sorted. |
Exit Job on no rows loaded |
Select this check box to stop the Job automatically if no rows have been loaded. |
Missing columns as null |
Select this check box to insert NULL values for the missing columns when there is insufficient data to match the columns specified in the schema. This property is available only when you are using Vertica 6.0 and later. |
Skip Header |
Select this check box and in the field displayed next to it, specify the number of records to skip in the file. This property is available only when you are using Vertica 6.0 and later. |
Record terminator |
Select this check box and in the field displayed next to it, specify the literal character string used to indicate the end of each record in the file. This property is available only when you are using Vertica 6.0 and later. |
Enclosed by character |
Select this check box to set the character within which data is enclosed. This property is available only when you are using Vertica 6.0 and later. |
Escape char |
Select this check box and, in the field displayed, specify the escape character to be used when loading data into Vertica. By default, this check box is selected and the escape character is \. |
Fields terminated by |
The character, string, or regular expression used to separate fields. |
Null String |
The string used to indicate that a value is null. |
Reject not fitted values |
Select this check box to reject rows containing char, varchar, binary, or varbinary values that are too long to fit the corresponding columns of the target table. This property is available only when you are using Vertica 6.0 and later. |
Maximum number of rejected records |
Select this check box and in the field displayed next to it, specify the maximum number of records that can be rejected before a load fails. This property is available only when you are using Vertica 6.0 and later. |
Stop and rollback if any row is rejected |
Select this check box to stop and roll back a load without loading any data if any row is rejected. This property is available only when you are using Vertica 6.0 and later. |
Don't commit |
Select this check box to perform a bulk load transaction without committing the results automatically. This is useful if you want to execute multiple bulk loads in a single transaction. This property is available only when you are using Vertica 6.0 and later. |
Rejected data file |
Specify the file into which rejected rows will be written. This property is available only when Bulk insert is selected from the Action on data drop-down list. |
Exception log file |
Specify the file into which the exception log will be written. This log explains why each rejected row was rejected. This property is available only when Bulk insert is selected from the Action on data drop-down list. |
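To show where these two files appear in the underlying load, the sketch below (not Talend-generated code; table and file paths are hypothetical) builds a Vertica COPY statement with its REJECTED DATA and EXCEPTIONS clauses:

```java
// Sketch: a Vertica COPY statement naming the rejected-data file and the
// exception log file that explains why each row was rejected.
public class RejectFilesSketch {
    static String buildCopy(String table, String file,
                            String rejectedFile, String exceptionFile) {
        return "COPY " + table + " FROM LOCAL '" + file + "'"
                + " REJECTED DATA '" + rejectedFile + "'"
                + " EXCEPTIONS '" + exceptionFile + "'";
    }

    public static void main(String[] args) {
        System.out.println(buildCopy("sales", "/tmp/sales.csv",
                "/tmp/sales.rejected", "/tmp/sales.exceptions"));
    }
}
```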
tStatCatcher Statistics |
Select this check box to gather the Job processing metadata at the Job level as well as at each component level. |
Global Variables
ACCEPTED_ROW_NUMBER |
The number of rows loaded into the database. This is an After variable and it returns an integer. |
REJECTED_ROW_NUMBER |
The number of rows rejected. This is an After variable and it returns an integer. |
ERROR_MESSAGE |
The error message generated by the component when an error occurs. This is an After variable and it returns a string. |
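A minimal sketch of how a downstream component (for example, a tJava) could read these After variables from the Job's globalMap. The component name tVerticaBulkExec_1 and the values are hypothetical; in a real Job, Talend Studio populates globalMap after the component finishes, whereas here a plain HashMap stands in for it:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: reading tVerticaBulkExec After variables from the globalMap.
public class GlobalVarsSketch {
    // Returns the integer stored under the given key, or 0 if it is absent.
    static int rowCount(Map<String, Object> globalMap, String key) {
        Object value = globalMap.get(key);
        return value == null ? 0 : (Integer) value;
    }

    public static void main(String[] args) {
        Map<String, Object> globalMap = new HashMap<>();
        // Hypothetical values; Talend fills these in after the component runs.
        globalMap.put("tVerticaBulkExec_1_ACCEPTED_ROW_NUMBER", 120);
        globalMap.put("tVerticaBulkExec_1_REJECTED_ROW_NUMBER", 3);

        int accepted = rowCount(globalMap, "tVerticaBulkExec_1_ACCEPTED_ROW_NUMBER");
        int rejected = rowCount(globalMap, "tVerticaBulkExec_1_REJECTED_ROW_NUMBER");
        System.out.println("Loaded " + accepted + " rows, rejected " + rejected);
    }
}
```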
Usage
Usage rule |
Talend Studio combined with the Vertica database enables very fast and affordable data warehouse and data mart applications. For more information about how to configure Talend Studio to connect to Vertica, see Talend and HP Vertica Tips and Techniques. You can use this component in either of two ways to write data into Vertica: as a standalone component that loads data from the file specified in the File Name field, or with an input flow that provides the data to be loaded. |
Dynamic settings |
Click the [+] button to add a row in the table and fill the Code field with a context variable to choose your database connection dynamically from multiple connections planned in your Job. This feature is useful when you need to access database tables having the same data structure but in different databases, especially when you cannot change your Job settings, for example, when your Job has to be deployed and executed independently of Talend Studio. The Dynamic settings table is available only when the Use an existing connection check box is selected in the Basic settings view. Once a dynamic parameter is defined, the Component List box in the Basic settings view becomes unusable. For examples of using dynamic parameters, see Reading data from databases through context-based dynamic connections and Reading data from different MySQL databases using dynamically loaded connection parameters. For more information on Dynamic settings and context variables, see Dynamic schema and Creating a context group and define context variables in it. |