You can import data from the following local databases into DataPrep using Zoho Databridge:
- MySQL
- MS SQL Server
- Oracle
- PostgreSQL
- MariaDB
- Pervasive SQL
- Sybase
- DB2
- Exasol
- SQLite
- Actian Vector
- Greenplum
- Denodo
- Progress OpenEdge
- YugabyteDB
- Microsoft Access
- SAP HANA
- Connect using JDBC URL
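For the Connect using JDBC URL option, you provide a standard JDBC connection string instead of separate host and port fields. A JDBC URL follows the pattern jdbc:<subprotocol>://<host>:<port>/<database>. For example, a hypothetical MySQL URL (the host and database names below are placeholders):
    jdbc:mysql://db.internal.example.com:3306/sales_db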
To import data from local databases
1. Open an existing pipeline, or create a pipeline from the Home Page, Pipelines tab, or Workspaces tab, and click the Add data option.
Info: You can also click the Import data icon at the top of the pipeline builder to bring data from multiple sources into the pipeline.
2. In the next screen, choose the required database or click the Databases category from the left pane.
3. Select New connection from the Connection drop-down. If you have existing connections, you can choose the required connection from the drop-down instead.
4. Give your connection a name under the Connection name section.
5. Download Zoho Databridge, a tool that facilitates importing data from local databases. Databridge is mandatory for importing data from a local network.
Note: If this is the first time you are downloading Databridge, see how to install it here.
6. Once you have installed Databridge on your machine, select your Databridge from the Databridge drop-down.
Note: Select the Databridge which is installed in the same network as the database you want to import the data from.
7. Select your Database type and enter the Database server host name and Port number.
8. Enter your Database name and provide the username and password if authentication is required.
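For reference, a typical set of values for steps 7 and 8 might look like the following. The host, database name, and credentials are hypothetical placeholders; 3306 is the standard MySQL default port (PostgreSQL commonly uses 5432, MS SQL Server 1433, and Oracle 1521):
    Database type : MySQL
    Host name     : db.internal.example.com
    Port number   : 3306
    Database name : sales_db
    Username      : dataprep_reader
    Password      : ********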
9. Save your database configuration and connect to the database using Connect.
Note: The connection configuration will be saved automatically for future imports from this database. Credentials are securely encrypted and stored.
10. Select the tables that need to be imported. Alternatively, you can use a SQL query to import data.
The incremental fetch option is not available when data is imported from databases using a query. Click here to know more about incremental fetch from local databases.
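If you import using a SQL query instead of selecting tables, the query defines exactly which rows and columns are fetched. A hypothetical example (the table and column names are placeholders):
    -- Import only completed orders placed since the start of 2024
    SELECT order_id, customer_id, order_total, created_at
    FROM orders
    WHERE status = 'COMPLETED'
      AND created_at >= '2024-01-01';
Because the query result is fetched as a whole each time, incremental fetch does not apply to query-based imports, as noted above.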
11. Click the Import button.
12. Once you have completed importing data, the Pipeline builder page will open, where you can start applying transforms. You can also right-click the stage and choose the Prepare data option to prepare your data in the DataPrep Studio page. Click here to know more about the transforms.
13. Once you are done creating your data flow and applying the necessary transforms in your stages, you can right-click a stage and add a destination to complete your data flow.
Note: After adding a destination to the pipeline, try executing your pipeline with a manual run first. Once you confirm the manual run works, you can set up a schedule to automate the pipeline. Learn about the different types of runs here. While configuring Schedule, Backfill, Manual reload, Webhooks, or Zoho Flow, the import configuration must be set up for all the sources; without it, the run cannot be saved.
Click here to know more about how to set up import configuration.
14. After configuring a run, a pipeline job will be created at run time. You can view the status of a job with granular details in the Job summary. Click here to know more about the job summary.
To edit the database connection
DataPrep saves your data connections to avoid the hassle of keying in credentials every time you need to connect to a data source or destination. You can always edit a saved data connection and update it with new parameters or credentials using the Edit connection option.
1. While creating a new dataset, click Saved data connections from the left pane in the Choose a data source section.
2. You can manage your saved data connections right from the data import screen. Click the ellipsis (3 dots) icon to share, edit, view the connection overview, or remove the connection.
3. Click the Edit connection option to update the saved connection with new parameters or credentials.