In the Integration perspective of Talend Studio, create an empty Spark Batch Job from the Job Designs node in the Repository tree view.
In the workspace, enter the name of each component to be used and select it from the list that appears. In this scenario, the components are tHDFSConfiguration, tMongoDBConfiguration, tFixedFlowInput, tMongoDBOutput, tMongoDBInput, and tLogRow.
The tFixedFlowInput component is used to load the sample data into the data flow. In real-world practice, you can use other components such as tFileInputDelimited, alone or with a tMap, in place of tFixedFlowInput to design a more sophisticated process for preparing your data.
Connect tFixedFlowInput to tMongoDBOutput using the Row > Main link.
Connect tMongoDBInput to tLogRow using the Row > Main link.
Connect tFixedFlowInput to tMongoDBInput using the Trigger > OnSubjobOk link.
Leave tHDFSConfiguration and tMongoDBConfiguration unconnected; they provide configuration only and do not need a link to the other components.
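With this design, the first subjob writes the sample data to MongoDB and, because of the OnSubjobOk trigger, the second subjob reads the data back and prints it to the console only after the write has succeeded. The following sketch illustrates the equivalent logic in plain Spark (Java). It is not the code Talend Studio generates; it assumes the MongoDB Spark connector 10.x is on the classpath, and the connection URI, database name, collection name, schema, and sample rows are placeholders chosen for illustration only.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

import java.util.Arrays;
import java.util.List;

public class MongoRoundTripSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("mongo_round_trip_sketch")
                .master("local[*]")  // placeholder; in the Job, the cluster is set in the Run view
                // Placeholder URIs: in the Job, tMongoDBConfiguration holds the connection details.
                .config("spark.mongodb.write.connection.uri", "mongodb://localhost:27017")
                .config("spark.mongodb.read.connection.uri", "mongodb://localhost:27017")
                .getOrCreate();

        // Equivalent of tFixedFlowInput: a small, hard-coded sample dataset (placeholder schema and rows).
        StructType schema = DataTypes.createStructType(Arrays.asList(
                DataTypes.createStructField("id", DataTypes.IntegerType, false),
                DataTypes.createStructField("name", DataTypes.StringType, false)));
        List<Row> sample = Arrays.asList(
                RowFactory.create(1, "sample record one"),
                RowFactory.create(2, "sample record two"));
        Dataset<Row> input = spark.createDataFrame(sample, schema);

        // Equivalent of tMongoDBOutput: write the sample rows to a collection (placeholder names).
        input.write().format("mongodb")
                .option("database", "exampleDb")
                .option("collection", "exampleCollection")
                .mode("append")
                .save();

        // Equivalent of tMongoDBInput followed by tLogRow: read the collection back and print it.
        // Running this only after the write completes mirrors the OnSubjobOk trigger.
        Dataset<Row> output = spark.read().format("mongodb")
                .option("database", "exampleDb")
                .option("collection", "exampleCollection")
                .load();
        output.show();

        spark.stop();
    }
}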