Zoho DataPrep allows you to trigger your pipeline using Zoho Flow. Zoho Flow is an integration platform which connects cloud applications. It helps you set up workflows to automate information exchange among the apps you use.
As an example, let's walk through how to recreate the following flow: when a new file is added in Zoho WorkDrive, the flow triggers a pipeline that imports the newly added WorkDrive file.
1. Log in to your Zoho Flow account. In the My Flows section, click Create Flow in the top right corner, or the + button on the left.
2. Enter the flow name and a description (optional). Click Create to be directed to your builder screen.
4. Pick an app to trigger your flow and click Next.
5. Choose the event in the app to trigger your flow and click Next.
Different apps have different types of triggers.
Info: For DataPrep, only the Pipeline job completion trigger is available.
6. If you have not created any connections already, click Connect and authenticate your connection. Note that different apps have different authentication methods.
7. After configuring a connection, the variable name will be auto-filled. You can edit the name if needed.
Note: When renaming the variable:
All names must start with a letter and can contain only alphanumeric characters and underscores; other characters, such as spaces, are not accepted.
Names must be unique to avoid naming clashes. If you use the same variable name for multiple actions, the variable stores the result of the action executed last.
Examples: new_deal, createTask
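The naming rules above can be expressed as a simple pattern. The following sketch is our own illustration (the regex and helper function are not part of Zoho Flow):

```python
import re

# Assumed encoding of the rules above: a name starts with a letter,
# followed only by letters, digits, or underscores.
VALID_NAME = re.compile(r"^[A-Za-z][A-Za-z0-9_]*$")

def is_valid_variable_name(name: str) -> bool:
    """Return True if the name follows the variable-naming rules."""
    return bool(VALID_NAME.match(name))

print(is_valid_variable_name("new_deal"))    # True
print(is_valid_variable_name("createTask"))  # True
print(is_valid_variable_name("2nd_task"))    # False: must start with a letter
print(is_valid_variable_name("new deal"))    # False: spaces are not accepted
```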
8. Select the required data from the drop-down and click Done.
Note: When you connect to Zoho DataPrep, the following variables are available as filter criteria:
Organization ID
Workspace ID
Pipeline ID
Job executed status
Job completed time in Unix timestamp
Job ID
Creator email ID
Data interval end time in Unix timestamp
Data interval start time in Unix timestamp
Job created time in Unix timestamp
Did the export stop due to invalid data?
Job re-run count
Job type
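Several of these variables are Unix timestamps (seconds since the epoch). As an illustration, here is a sketch that converts them to readable times; the payload and its field names are hypothetical, not the actual keys Zoho Flow exposes:

```python
from datetime import datetime, timezone

# Hypothetical job payload; field names mirror the variables listed
# above, but the real keys provided by Zoho Flow may differ.
job = {
    "job_executed_status": "Success",
    "job_created_time": 1700000000,      # Unix timestamp, in seconds
    "job_completed_time": 1700000180,
}

created = datetime.fromtimestamp(job["job_created_time"], tz=timezone.utc)
completed = datetime.fromtimestamp(job["job_completed_time"], tz=timezone.utc)
duration = completed - created

print(created.isoformat())   # 2023-11-14T22:13:20+00:00
print(duration)              # 0:03:00
```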
10. In the configuration window that opens, choose the connection and enter the data in the fields that appear.
If you have not created any connections already, click Connect and authenticate your connection. Note that different apps have different authentication methods.
a) Click Connect and give a name to your connection.
b) Click Authorize and accept the pop-up that appears.
c) Your Zoho DataPrep account is now connected to Zoho Flow.
11. After configuring a connection, the variable name will be auto-filled. You can edit the name if needed.
12. Select the Organization ID, Workspace ID, and Pipeline ID from the drop-down and click Done.
Note: You can insert variables when drafting your variable name or message, depending on the app you choose.
DataPrep provides the following variables:

Variable | Description
Organization ID | Organization ID of the pipeline.
Workspace ID | Workspace ID of the pipeline.
Pipeline ID | ID of the chosen pipeline.
Job executed status | Success or failure status of the job.
Job completed time in Unix timestamp | The time when the job was completed.
Job ID | ID of the executed job.
Creator email ID | The email ID of the user who ran the pipeline.
Data interval end time in Unix timestamp | The end time of the data interval covered by the pipeline run.
Data interval start time in Unix timestamp | The start time of the data interval covered by the pipeline run.
Job created time in Unix timestamp | The time when the job was created.
Did the export stop due to invalid data? | Yes if the Stop export if data has invalid values setting is enabled and the pipeline contained invalid data during export; otherwise No.
Job re-run count | The number of times the job was re-run.
Job type | The type of job that triggered the action: Schedule, Manual, Webhook, Zoho Flow, or Backfill.
Important: When you add an action to run a pipeline, the pipeline must be configured in Zoho DataPrep.
To configure Zoho Flow in DataPrep, follow the steps below:
2. Click the Schedule drop-down icon and click the Zoho Flow option.
Stop export if data has invalid values: Enabling this will stop the export if your data still has invalid values.
Order exports: This option determines the order in which data is exported to your destinations when you have configured multiple destinations. If not enabled, exports run in the default order.
Note: This option will be visible only if you have added more than one destination in your pipeline.
To rearrange the order of your export destinations:
1) Click the Order exports toggle.
2) Drag and drop the destinations to change their order, and then click Save.
3. While configuring Zoho Flow, the import configuration must be set up for all sources; without it, the Zoho Flow configuration cannot be saved.
Click the Edit here link to set the import configuration for Zoho WorkDrive.
The import configuration differs for each source. Click here to learn more about setting up the import configuration for various sources.
4. After you complete the Zoho Flow configuration, click Save.
13. In Zoho Flow, your flow is autosaved by default. After you switch on your flow, the draft is published live and saved as a version. Further changes are saved as a draft; click Apply changes for them to be reflected in the flow.
15. Whenever a new file is uploaded to Zoho WorkDrive, Zoho Flow automatically triggers and runs the pipeline configured in Zoho DataPrep.
Each Zoho Flow run is saved as a job in DataPrep. When a pipeline is run, the data will be fetched from your data sources, prepared using the series of transforms you have applied in each of the stages, and then data will be exported to your destination at regular intervals. This complete process is captured in the job history.
16. To go to the jobs list of a particular pipeline, click the ellipsis icon in the pipeline builder and select the Job history menu to check the job status of your pipeline.
17. Click the required job ID on the Job history page to navigate to the Job summary of a particular job.
The Job summary shows the history of a job executed in a pipeline flow.
You can view the status of the Zoho Flow run on the Job summary page. There are three different statuses for a job in DataPrep: Running, Success, or Failure. Click here to know more.
18. When the Zoho Flow run is completed, the data prepared in your pipeline will be exported to the configured destinations.
After you set your Zoho Flow, you can choose to Edit or Remove using the Zoho Flow Active option in the pipeline builder.
a) Click Connect and give a name to your connection.
b) Click Authorize and accept the pop-up that appears.
c) Your Zoho DataPrep account is now connected to Zoho Flow.
d) After configuring a connection, the variable name will be auto-filled. You can edit the name if needed.
e) Select the Organization ID, Workspace ID, and Pipeline ID from the drop-down.
f) You can also configure the conditions that trigger the flow using the following Filter criteria and click Done.
DataPrep provides the following variables:

Variable | Description
Organization ID | Organization ID of the pipeline.
Workspace ID | Workspace ID of the pipeline.
Pipeline ID | ID of the pipeline.
Job executed status | Success or failure status of the job.
Job completed time in Unix timestamp | The time when the job was completed.
Job ID | ID of the executed job.
Creator email ID | The email ID of the user who ran the pipeline.
Data interval end time in Unix timestamp | The end time of the data interval covered by the pipeline run.
Data interval start time in Unix timestamp | The start time of the data interval covered by the pipeline run.
Job created time in Unix timestamp | The time when the job was created.
Did the export stop due to invalid data? | Yes if the Stop export if data has invalid values setting is enabled and the pipeline contained invalid data during export; otherwise No.
Job re-run count | The number of times the job was re-run.
Job type | The type of job that triggered the pipeline: Schedule, Manual, Webhook, or Backfill.
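Conceptually, the filter criteria act as a predicate evaluated against these variables before the flow fires. The sketch below is only an illustration; the field names and the predicate are our own, not Zoho Flow's API:

```python
# Illustrative filter: trigger only for successful, scheduled runs of a
# specific pipeline. Field names are hypothetical stand-ins for the
# variables listed above.
def should_trigger(job: dict) -> bool:
    return (
        job.get("job_executed_status") == "Success"
        and job.get("job_type") == "Schedule"
        and job.get("pipeline_id") == "pipeline_1"
    )

print(should_trigger({"job_executed_status": "Success",
                      "job_type": "Schedule",
                      "pipeline_id": "pipeline_1"}))  # True
print(should_trigger({"job_executed_status": "Failure",
                      "job_type": "Schedule",
                      "pipeline_id": "pipeline_1"}))  # False
```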
After setting Pipeline job completion as the trigger (for pipeline 1), set Run a pipeline as an action (for pipeline 2), and then set Zoho Cliq's Message as Bot on channel as a second action.
2. Set "Run a pipeline" as an action
Note: You can insert variables to draft your variable name or message.
DataPrep provides the following variables:

Variable | Description
Organization ID | Organization ID of the pipeline.
Workspace ID | Workspace ID of the pipeline.
Pipeline ID | ID of the chosen pipeline.
Job executed status | Success or failure status of the job.
Job completed time in Unix timestamp | The time when the job was completed.
Job ID | ID of the executed job.
Creator email ID | The email ID of the user who ran the pipeline.
Data interval end time in Unix timestamp | The end time of the data interval covered by the pipeline run.
Data interval start time in Unix timestamp | The start time of the data interval covered by the pipeline run.
Job created time in Unix timestamp | The time when the job was created.
Did the export stop due to invalid data? | Yes if the Stop export if data has invalid values setting is enabled and the pipeline contained invalid data during export; otherwise No.
Job re-run count | The number of times the job was re-run.
Job type | The type of job that triggered the action: Schedule, Manual, Webhook, Zoho Flow, or Backfill.
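Inserting variables into a channel message works like template substitution. The following sketch shows the idea with Python's `string.Template`; the template text and variable names are hypothetical, not Zoho Flow's actual insert syntax:

```python
from string import Template

# Hypothetical Cliq message template using $-placeholders that mirror
# the variables listed above.
template = Template(
    "Pipeline $pipeline_id finished with status $job_executed_status "
    "(job $job_id, re-run count: $job_rerun_count)."
)

message = template.substitute(
    pipeline_id="sales_cleanup",
    job_executed_status="Success",
    job_id="J-1042",
    job_rerun_count=0,
)
print(message)
# Pipeline sales_cleanup finished with status Success (job J-1042, re-run count: 0).
```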
4. Save and run your flow
In Zoho Flow, your flow will be autosaved. Click the slider at the top to switch on your flow in Zoho Flow and let it work with actual data.
When Pipeline 1 completes successfully, Zoho Flow triggers Pipeline 2, and a message is sent to a Zoho Cliq channel.
Important: The action and trigger for a flow should not be the same pipeline, as this would result in the pipeline looping.
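To make the looping hazard concrete, here is an illustrative safeguard that rejects a flow whose trigger and action reference the same pipeline. This check is our own sketch, not something Zoho Flow performs through this API:

```python
# Illustrative guard: a flow whose trigger and action point at the same
# pipeline would re-trigger itself on every completion, looping forever.
def validate_flow(trigger_pipeline_id: str, action_pipeline_id: str) -> None:
    if trigger_pipeline_id == action_pipeline_id:
        raise ValueError(
            "Trigger and action must reference different pipelines; "
            "using the same pipeline would cause an infinite loop."
        )

validate_flow("pipeline_1", "pipeline_2")  # OK: no exception raised
try:
    validate_flow("pipeline_1", "pipeline_1")
except ValueError as e:
    print("Rejected:", e)
```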
SEE ALSO
Export destinations in DataPrep
If you'd like a personalized walk-through of our data preparation tool, please request a demo and we'll be happy to show you how to get the best out of Zoho DataPrep.