Managing data flows
What you can do with a data flow depends on the type of space where the data flow is located. In your personal space, you can manage your own data flows. In a shared space, depending on your permissions in that space, you can manage both your own data flows and those of other users. In managed spaces, there are fewer options for managing data flows as access is more governed. For more information about what you can do, see Data flow roles and permissions.
From your catalog, and depending on the space, you can perform the actions described in the sections below.
Adding data flows to collections
Collections enable you to group content such as data flows, apps, charts, notes, and links. To add a data flow to a collection, open the data flow's options menu and select Add to collection.
By default, collections are private. Tenant admins and analytics admins can make collections public to share them with all other members of the tenant. Public collections do not change members' access to content in spaces: members can only see content in public collections to which they already have access. To make a collection public, open the collection's options menu, select Make public, and click Make public. To make a public collection private again, open the options menu, select Make private, and click Make private.
Moving data flows
You can move data flows between shared spaces as well as between a shared space and a personal space.
If you move a data flow with a run schedule between spaces, the run schedule is deleted. Recreate the schedule in the new space if required.
Do the following:
- Click on the data flow and select Move.
- Select the new space from Space.
- To open the new space, select Navigate to space.
- Click Move.
Editing data flow details
You can change the titles and descriptions of your data flows. Click on the data flow that you want to edit and then select Edit details. You can also change the tags used with the data flow. Tags are used to group data flows with related scripts and apps. Tags are shared in the Qlik Cloud Analytics activity center, but you can only see tags you have created yourself or that are used with apps, data flows and scripts to which you have access.
Publishing data flows
You can publish data flows from a personal space or a shared space to a managed space. Publishing creates a copy of the data flow in the managed space. You must have the Owner or Can publish permission in the managed space to which you are publishing the data flow.
You can view the spaces that a data flow has been published to in Details.
Data flows indicate when they were last published to a managed space in the publish dialog.
Do the following:
- Click on the data flow and select Publish.
- Select a managed space.
- Optionally, change the Published data flow name, Description, and Tags of the data flow.
- Click Publish.
Duplicating data flows
You can duplicate any data flow you have access to by clicking on the data flow and then selecting Duplicate. The duplicate can be modified.
Duplicating data flows has several uses. For example, you can duplicate a data flow as a form of version control and store older versions in a space created for archival purposes.
Duplicated data flows keep the original data sources. If you did not create the original data sources used in the data flow or if they are not available in the space, you will have to recreate data sources in the data flow.
Reloading data flows
You can manually run your data flow from the catalog without opening the Editor view. The run uses the latest configuration of the flow along with the latest data from the data sources. If the run is successful, a new output is generated.
Do the following:
- Click on the data flow and select Reload now.
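Runs can also be triggered programmatically over Qlik Cloud's REST API. The sketch below only constructs the HTTP request rather than sending it; the tenant URL, endpoint path, and data flow ID are placeholders assumed for illustration, not confirmed API details. Check the API reference on qlik.dev for the actual data flow run endpoint.

```python
from urllib.request import Request

# All values below are hypothetical placeholders, not real endpoints.
TENANT = "https://your-tenant.us.qlikcloud.com"  # assumed tenant URL
API_KEY = "YOUR_API_KEY"                          # assumed API key
DATAFLOW_ID = "abc123"                            # assumed data flow ID

def build_run_request(tenant: str, dataflow_id: str, api_key: str) -> Request:
    """Construct (but do not send) a hypothetical data flow run request."""
    return Request(
        url=f"{tenant}/api/v1/dataflows/{dataflow_id}/actions/run",  # assumed path
        method="POST",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_run_request(TENANT, DATAFLOW_ID, API_KEY)
print(req.get_method(), req.full_url)
```

Sending the request (for example with `urllib.request.urlopen`) would then trigger the run, subject to the permissions described above.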
Scheduling data flow runs
You can create a schedule for running a data flow. The schedule can be based on frequency, date, or time.
Do the following:
- Click on the data flow and select Reload schedule.
- Click Add schedule.
- Select any of the time options in the configuration window.
When you create a time-based schedule, you can choose:
- The frequency and interval of the run
- The timezone and time of day
- How long the schedule will be in effect
Repeating runs can be set at the following intervals:
- Daily: Set the times per day, timezone, and time of day.
- Weekly: Set the days of the week, times per day, timezone, and time of day.
- Monthly: Set the days of the month, times per day, timezone, and time of day.
- Yearly: Set the months, days of the month, times per day, timezone, and time of day.
By default, schedules will run continuously, with no end date. You can choose to set a start date, an end date, or to only run the schedule between two dates.
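To make the scheduling options concrete, the following sketch (not Qlik code) shows how a weekly schedule's settings map to run times: days of the week, a timezone, and a time of day. The days, times, and timezone used here are example values only.

```python
from datetime import datetime, timedelta, time
from zoneinfo import ZoneInfo

def next_runs(start: datetime, weekdays: set[int], run_at: time,
              tz: str, count: int) -> list[datetime]:
    """Return the next `count` run times for a weekly schedule.

    weekdays uses Python's convention: Monday = 0 ... Sunday = 6.
    """
    zone = ZoneInfo(tz)
    day = start.astimezone(zone)
    runs = []
    while len(runs) < count:
        candidate = datetime.combine(day.date(), run_at, tzinfo=zone)
        # Keep only configured weekdays, and only times after the start.
        if day.weekday() in weekdays and candidate > start:
            runs.append(candidate)
        day += timedelta(days=1)
    return runs

# Example: run every Monday (0) and Thursday (3) at 06:00 Stockholm time,
# starting from Wednesday 2024-05-01 at noon.
start = datetime(2024, 5, 1, 12, 0, tzinfo=ZoneInfo("Europe/Stockholm"))
for run in next_runs(start, {0, 3}, time(6, 0), "Europe/Stockholm", 3):
    print(run.isoformat())
```

An end date, as described above, would simply cut this sequence off once a candidate run falls after it.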
Configuring data flow notifications
You can open the notifications configuration panel of a data flow to set or modify which events trigger a notification and through which channels you receive it.
Do the following:
- Click on the data flow and select Notifications.
Viewing data flow details
This option opens the Overview page of the data flow, where you can see general information about your data flow. See Overview.
Do the following:
- Click on the data flow and select Details.
Viewing lineage
Lineage tracks data and data transformation backwards from the data flow to its original sources. This provides an interactive representation of the origins of the data that this data flow processes. For more information on lineage, see Analyzing lineage for apps, scripts, and datasets.
Do the following:
- Click on the data flow and select Lineage.
Viewing impact analysis
Impact analysis shows the downstream dependencies of a data flow's outputs. It answers questions about which apps, scripts, or other resources would be directly or indirectly impacted by changes to the data flow. For more information on impact analysis, see Analyzing impact analysis for apps, scripts, and datasets.
Do the following:
- Click on the data flow and select Impact analysis.
Exporting data flows
You can export data flows from the Analytics activity center as .dfw files. These can be uploaded into other Qlik Cloud Analytics tenants or Qlik Sense versions.
The exported data flow is saved in the default download folder of your web browser.
Do the following:
- Click on the data flow you want to export and select Export.
Converting data flows to script
You can convert data flows into scripts. The conversion duplicates the data flow and converts the duplicate into a script, leaving the original data flow unchanged.
Do the following:
- Click on the data flow you want to convert and select Duplicate and convert to script.
- Click Confirm.
Deleting data flows
Do the following:
- Click on the data flow you want to delete and select Delete.
- Click Delete.