Pipelines are fundamental entities that let you create ETL pipelines / data flows in DataPrep, with multiple data stages and various flow-level transforms. You can create a new pipeline from scratch (for your data integration and data movement needs) from the following places:
Home page
Pipelines tab
Workspaces tab
To create a pipeline from the Home page
1. On the Home page, click the Create New button. A pipeline gets created in a new workspace, and the Pipeline builder opens.

2. Click the Add data option to import data from your required data source. This begins the extract phase of ETL. After data is imported, you will be redirected to the pipeline builder where you can see your data source, your dataset, and a stage linked to it.
Stages are nodes created for processing data while applying data flow transforms. Every dataset imported from your data source will have a stage created by default.
Zoho DataPrep can also import, transform, export, and automate your data based on natural-language prompts to Ask Zia. Using the Ask Zia pipeline builder, you can enter the required prompt, and the output will be auto-generated for you by Zia. Click here to know more about the Ask Zia pipeline builder.
3. Right-click the stage to apply data flow transforms. These transforms form your data integration logic, shaping and cleaning data as needed.
4. Once you are done creating your data flow and applying the necessary transforms in your stages, right-click a stage and add a destination to complete the flow. The destination is where the data will be loaded (the load phase of ETL), finalizing the data movement.
5. A data destination is where you want to export your prepared data. It can be a cloud database or a business application such as Zoho CRM or Zoho Analytics. You can choose your preferred destination from the 50+ data destinations in DataPrep.
6. Now that you have added a destination, try executing your pipeline with a manual run first. Once you make sure the manual run works, you can set up a schedule to automate the pipeline. The schedule automates repeated data movement and data integration tasks. Learn about the different types of runs here.
7. Each run is saved as a job. When a pipeline run is executed, data is fetched from your data sources, prepared using the series of transforms applied in each stage, and then exported to your destination. This complete ETL process is captured in the job history.
8. Go to the ellipses icon and click the Job history menu to check your job status. All runs captured as jobs are saved under the Jobs page.

Info: You can also view the overall job status of all pipelines under the Jobs tab.

To create a pipeline from the Pipelines tab
1. In the Pipelines tab, click the Create New button. A pipeline gets created in a new workspace and the Pipeline builder opens.

2. Click the Add data option to import data from the required data source. Learn more about the 50+ data sources.
After data is imported, you will be redirected to the pipeline builder where you can see your data source and a stage linked to it. Proceed with applying transforms, setting up your ETL flow, adding a destination, and so on.
Click here to know the other steps to build and execute the pipeline.
To create a pipeline from the Workspaces tab
1. On the Workspaces page, click the + Import data button and import data from your favorite data source. Learn more about the 50+ data sources. Once you import data, a pipeline gets created in a new workspace and the Pipeline builder opens.
Note: You can also create a new ETL / data integration pipeline within an existing workspace using the Create new pipeline option inside a workspace.

Click here to know the other steps to build and execute the pipeline. You can also build your ETL pipeline without any hassle using the Ask Zia ETL pipeline builder.
SEE ALSO
How to create a pipeline from predefined templates?