Pipeline options
The following options are available for your pipeline from the ellipses icon.

Overview
Go to the ellipses icon in the pipeline builder and click the Overview menu. The pipeline overview shows details such as the pipeline name, created by, created on, last modified date and time, last modified user, workspace name, stages, data sources, destinations, shared users, schedule time, last job run, and next job run. You can also rename, delete, and share the pipeline with support from the pipeline overview.

Version history
Go to the ellipses icon in the pipeline builder and click the Version history menu. The pipeline undergoes multiple updates during drafting; once it is marked as ready, the changes are made live and a new version is created.
Note: You can access the Version history from the Last saved link at the top of the pipeline builder once the draft has been published live.
Version history helps you keep track of all the actions done in a pipeline. You can view the list of the published older versions of the pipeline and view which version is currently live.

This helps in switching to a particular version to view its content and keep track of the changes made. You can also restore the pipeline to an earlier version if needed. After restoring a version, the pipeline will be in the draft state. Mark it as ready for the version to be published and made live.
Note: If you restore a version, modify it, and then switch to another version without making your changes live, the initial version will be lost and won’t appear in the version history. Drafts are not saved in the version history unless they are made live.
The history view by default displays all actions of a pipeline in chronological order. You can use filters to view the history by version types, user, or date as needed.

Draft
Any changes made to the pipeline are saved as a draft version and will not be reflected in the active schedule. Click Mark as ready for the changes to reflect in the schedule.
Note: Manual run and backfill will be executed on the latest draft irrespective of the live or draft state.
Job history
Go to the ellipses icon in the pipeline builder and click the Job history option to view all the jobs of a particular pipeline.
Click on a job in the Jobs page to navigate to the Job summary. The Job summary shows the details of a job in three different tabs: Overview, Stages, and Output. Click here to know more.

Personal data overview
Go to the ellipses icon in the pipeline builder and click the Personal data overview menu to review how personal data is protected in your pipeline. The name of each stage and the number of columns containing PII and ePHI data are displayed in this section. Click here to know how to mark a column as personal data in DataPrep.
Info: You can also review the personal data in all your pipelines and workspaces under the Personal data audit in the Manage privacy section under the Settings page.

Share pipeline
Go to the ellipses icon in the pipeline builder and click the Share pipeline menu.
Enter the user email or group name to share with users or groups.
You can select the Send email notification to users checkbox to notify the users through email.

You can also Pick users/groups from your previously shared contact list.
Click Share to share your pipelines with the users or groups.

Click the button if you want to stop sharing your pipeline with a user.
To create a new group:
1. Click the Create new group button from within the Share pipeline dialog box.
2. Enter a name for the new group. Add users to the group by entering their email addresses.
3. Click Create to create your new group.

Click here to know more about sharing your pipeline.
Save as template
1. Open the required pipeline and click the ellipses icon on the Pipeline builder page.

2. Click the Save as template option and choose Save as pipeline template. The pipeline will be saved as a pipeline template in the templates library.
You can also download the pipeline template using the Download pipeline as .zdpt option.
3. In the Workspace tab, navigate to the Pipeline template and click Use now to create a new pipeline using the saved template.
Share with support
Open the required pipeline and click the ellipses icon on the Pipeline builder page.
Click the Share with support option. When you share the pipeline with our support team:
- You grant full access to our support team to view and modify all the data in your pipeline, including columns with personal data. If you want to proceed, click the Share option.
- Our support team will automatically lose access after 2 weeks. If you want to withdraw access before then, you can manually revoke it using this menu option again.

Reload data
You can manually refresh your stage with the latest data by reloading data from your data source.
1. Click the ellipses icon on the Pipeline builder page and choose the Reload data option.
2. Provide the data interval and import configuration to reload data.
Click here to learn how to reload data.
Move to trash
Use this option to delete your pipeline and move it to trash. Deleted pipelines can be found under the Trash tab on the Workspace page, where you can restore them or permanently delete them. Items in the trash are permanently deleted after 180 days.
Zoom in, Zoom out - You can zoom in and out of the pipeline using this option. The zoom range is 60% to 100%.
Scroll back to content - If your pipeline moves out of view, click this to scroll back to the top content.
Note: Return to the workspace by clicking the Back icon or the workspace name. You can also rename the pipeline.
Undo and Redo
You can revert or reapply changes made in the pipeline using the undo and redo buttons in the pipeline builder.
Import data
You can click the Import data icon at the top of the pipeline builder to bring data from multiple sources into the pipeline.
Elements in a pipeline
Change source
1. Right-click the source icon and choose the Change source option. You will be redirected to the import page, from where you can import data from any source as required.
Note: You can also rename the imported file by clicking on its name below the data source.
Data source details
2. Right-click the data source icon and choose the Datasource details option, or click the data source icon once to view the data source details.

Click the Change source option if you'd like to change the data source. You can also edit the data source details using the Edit option.
You can also edit the File path and fetch data from a different location. The above is a snapshot of WorkDrive; similarly, you can edit the data source details of all connectors.
Note: Changing the file format from CSV or TSV to XLSX or HTML is not supported.
Copy to
3. To copy a flow from one pipeline to another, right-click the data source or data destination icon and select the Copy to option. Alternatively, you can click the icon or right-click the stage linked to your data source or any other stage.

Select an existing pipeline from the dropdown where you want to copy the selected flow, or choose the Create New Pipeline option, enter the pipeline name, and click Copy.
The whole upstream and downstream of the selected flow will be copied and added to the new pipeline.
Note: A copy of the data source is also created when a flow is copied from one pipeline to another.
4. Click the icon or right-click the stage linked to your data source and choose one of the following options:
Add Stage - To add a new stage using the imported data. Stages are nodes created for processing data while applying data flow transforms. Every dataset imported from your data source has a stage created by default.
Remove from pipeline - You can use this option to remove the imported data and the flow from the pipeline.

Dataset details
5. Click the stage icon once to view the dataset details: Overview and Ruleset.
Overview of dataset: The overview shows details such as the dataset name, created by, created on, last modified by, last modified, columns, rule count, data quality, and a data preview.

You can also choose to Prepare Data to open the DataPrep Studio page and explore more data transforms.
Ruleset: Each transform applied on the dataset is stored in DataPrep as a rule, in order of their execution. The ordered list of these rules is called a Ruleset. You can access the Ruleset from the Ruleset pane, where you can add, edit, or remove rules that have been applied on the stage.

Stage options
6. Click the icon or right-click the stage and choose from the following options.
Prepare data
You can choose to Prepare Data to open the DataPrep Studio page and explore more data transforms.
Join
You can join two datasets on common columns using the Join transform. DataPrep offers all four types of joins: inner join, left join, right join, and outer join. Click here to know more about how to Join.
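Join types are easiest to reason about with a small example. The pandas sketch below is only an illustration of the concept (pandas is not part of DataPrep, and the tables are made up); it shows how the four join types differ on the same pair of datasets sharing a common column:

```python
import pandas as pd

# Two small tables sharing the common column "cust_id".
orders = pd.DataFrame({"order_id": [1, 2, 3], "cust_id": ["A", "B", "D"]})
customers = pd.DataFrame({"cust_id": ["A", "B", "C"], "name": ["Ann", "Ben", "Cam"]})

# Inner join: only keys present in both tables (A, B).
inner = orders.merge(customers, on="cust_id", how="inner")
# Left join: every order; unmatched customer fields become NaN (D has no match).
left = orders.merge(customers, on="cust_id", how="left")
# Right join: every customer; unmatched order fields become NaN (C has no match).
right = orders.merge(customers, on="cust_id", how="right")
# Outer join: the union of keys from both tables (A, B, C, D).
outer = orders.merge(customers, on="cust_id", how="outer")
```

The same intuition applies when you pick a join type in the Join transform.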
Append
In DataPrep, you can append one dataset with another and create a new dataset using the Append transform. Click here to know more about append transform.
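Appending stacks one dataset below another with matching columns. A minimal pandas illustration of the idea (sample data is hypothetical, not DataPrep code):

```python
import pandas as pd

jan = pd.DataFrame({"date": ["2024-01-05"], "sales": [120]})
feb = pd.DataFrame({"date": ["2024-02-03"], "sales": [90]})

# Append: rows of the second dataset are added below the first.
combined = pd.concat([jan, feb], ignore_index=True)
```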
Pivot
A pivot table distributes data for easy consumption. It spreads out data from long, winding tables by converting categories into columns. A pivot can be created by selecting the Column, Row, and Data fields. Click here to know more about the Pivot transform.
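For intuition, here is what choosing Row = region, Column = quarter, and Data = sales does to a long table, sketched in pandas (an illustration only; the data and field names are hypothetical):

```python
import pandas as pd

long_table = pd.DataFrame({
    "region": ["North", "North", "South", "South"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "sales": [100, 150, 80, 120],
})

# Pivot: the "quarter" categories become columns, with one row per region.
wide = long_table.pivot_table(
    index="region", columns="quarter", values="sales", aggfunc="sum"
)
```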
Unpivot
Unpivot converts columns to rows. The Unpivot transform is useful for condensing data, and the result is often exported to analytics software for creating reports and dashboards. The result is saved as a new dataset when the transform is applied. Click here to know more about the Unpivot transform.
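Unpivot is the reverse of pivot: column headers become values in a new column. A pandas sketch of the idea (illustration only, with made-up data):

```python
import pandas as pd

wide = pd.DataFrame({"region": ["North", "South"], "Q1": [100, 80], "Q2": [150, 120]})

# Unpivot (melt): the Q1/Q2 columns collapse into "quarter"/"sales" rows.
long_table = wide.melt(id_vars="region", var_name="quarter", value_name="sales")
```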
Add stage
You can add a new stage to the pipeline using this option. Stages are nodes created for processing data while applying data flow transforms. Every dataset imported from your data source has a stage created by default.
Add destination
Once you are done creating your data flow and applying necessary transforms in your stages, you can right-click a stage and add a destination to complete your data flow.
Data destination is a place where you want to export your data to. It can be a cloud database, business applications like Zoho CRM, Zoho Analytics, etc. You can choose your preferred destination out of 50+ data destinations in DataPrep to export your prepared data to.
Overview
The overview shows details such as the dataset name, created by, created on, last modified by, last modified, columns, rule count, data quality, and a data preview.
You can also choose to Prepare Data to open the DataPrep Studio page and explore more data transforms.
View ruleset
Each transform applied on the dataset is stored in DataPrep as a rule, in order of their execution. The ordered list of these rules is called a Ruleset. You can access the Ruleset from the Ruleset pane, where you can add, edit, or remove rules that have been applied on the stage.

Remove from pipeline
You can use this option to remove the stage from the pipeline.
Copy to
To copy a flow from one pipeline to another, click the icon on any stage or right-click the stage linked to your data source and select the Copy to option. Alternatively, you can right-click the data source or data destination icon.
Select an existing pipeline from the dropdown where you want to copy the selected flow, or choose the Create New Pipeline option, enter the pipeline name, and click Copy.
The whole upstream and downstream of the selected flow will be copied and added to the new pipeline.
Note: A copy of the data source is also created when a flow is copied from one pipeline to another.
Destination details
7. Click the destination once to view the details such as Connection details, Folder path, File export option, File name, File format, Delimiter, Text qualifier, Row Separator, File encoding, Compress as a .zip file, Personal data etc. Click the Edit option to edit the destination details.
You can also edit the File path and export data to a different location. The above is a snapshot of Google Drive; similarly, you can edit the destination details of all connectors.
Note: Changing the file format from CSV or TSV to XLSX or HTML is not supported.
Right-click the destination and you'll find the following options:
Edit destination - You can change the destination of your pipeline using this option. Click the Edit destination option and you will be redirected to the Choose your destination dialog box, from where you can choose your preferred destination out of the 50+ data destinations in DataPrep.
Remove destination - You can use this option to remove the destination added to your pipeline.

Schedule
The Schedule option is used to schedule the import from various sources, apply the transforms, and export to the destinations in your pipeline at regular intervals. You can also import incremental data from various sources like cloud storage, CRM, Creator, etc. while scheduling. Incremental data import is a method used to import new or modified records after the previous sync. Click here to know more about how to schedule a pipeline.
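Conceptually, an incremental import keeps only the records added or changed after the previous sync. A minimal Python sketch of that idea (the record layout and field names here are hypothetical, not DataPrep's internals):

```python
from datetime import datetime, timezone

# Hypothetical source records, each carrying a last-modified timestamp.
records = [
    {"id": 1, "modified": datetime(2024, 1, 10, tzinfo=timezone.utc)},
    {"id": 2, "modified": datetime(2024, 3, 5, tzinfo=timezone.utc)},
]
last_sync = datetime(2024, 2, 1, tzinfo=timezone.utc)

# Incremental import: keep only records new or modified since the previous sync.
new_or_modified = [r for r in records if r["modified"] > last_sync]
```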
Run
Manual run
A manual run executes the pipeline on demand. Manual runs are executed on the latest draft, irrespective of the live or draft state.
Backfill run
Backfill run is an option used to export the data you have prepared to your destination. It is mainly used to catch up on failed schedules, to import data for a specific timeframe, or to manually fetch incremental data without scheduling a job. A data interval needs to be defined before triggering a backfill run. Click here to know more about how to execute a backfill run in a pipeline.
Webhooks
Zoho DataPrep allows you to trigger your pipeline using Webhooks. A webhook is a user-defined HTTP callback that is triggered when a particular event occurs at the source site. When the event occurs, the source site makes an HTTP request to the URL specified. Click here to know more about Webhooks.
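Since a webhook trigger is just an HTTP request to the pipeline's webhook URL, any HTTP client can fire it. The sketch below uses Python's standard library; the URL and payload are placeholders, not real DataPrep endpoints - use the URL shown in your pipeline's Webhooks dialog:

```python
import json
import urllib.request

# Placeholder URL - substitute the webhook URL generated for your pipeline.
WEBHOOK_URL = "https://example.com/dataprep/webhook/abc123"

# Hypothetical JSON payload; the actual body depends on your webhook setup.
payload = json.dumps({"event": "source_updated"}).encode("utf-8")
request = urllib.request.Request(
    WEBHOOK_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment to actually send the request and trigger the pipeline:
# urllib.request.urlopen(request)
```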
Zoho Flow
Mini map
It is located at the bottom-right corner of the pipeline builder, next to the zoom in/out controls. The mini map offers a compact visual representation of the entire pipeline. This feature is especially helpful for navigating and understanding the layout of large or complex pipelines.
