Zoho DataPrep supports importing data from Oracle Cloud, a cloud computing service offered by Oracle Corporation. It provides servers, storage, networking, applications, and services through a global network of Oracle-managed data centers.
Import data from Oracle Cloud
1. Create a pipeline or open an existing pipeline from the Home Page, Pipelines tab, or Workspaces tab and click the Add data option. You can also click the Import data option under the Workspaces tab to import data.
Info: You can also click the Import data icon at the top of the pipeline builder to bring data from multiple sources into the pipeline.
2. Choose the Cloud databases category from the left pane and click Oracle Cloud. You can also search for Oracle Cloud in the search box.
Note: If you have already added an Oracle connection, choose the required connection under the Saved connections category and proceed to import. To learn more about Saved connections, click here.
3. Select Oracle Cloud in the Database service name dropdown.
4. Enter the Endpoint, Port, Oracle SID/Service Name, Username, and Password to authenticate the database connection (these parameters are illustrated in the sketch below).
5. You can also select the Use SSL checkbox if your database server has been set up to serve encrypted data through SSL.
6. Enter a unique name for your connection under Connection name and click Connect.
Note: The connection configuration will be saved for importing data in the future. Credentials are securely encrypted and stored.
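Info: For reference, here is a minimal sketch of how these connection details typically fit together outside DataPrep, using the open-source python-oracledb driver. The endpoint, port, service name, and credentials below are placeholders rather than values from your account, and this is not how DataPrep connects internally; it only illustrates the parameters collected in steps 4 to 6.

```python
import oracledb  # pip install oracledb

# Placeholder values -- substitute the endpoint, port, service name,
# and credentials of your own Oracle Cloud database.
HOST = "mydb.example.oraclecloud.com"   # Endpoint
PORT = 1521                             # Port (1521 is the common default)
SERVICE_NAME = "ORCLPDB1"               # Oracle SID / Service Name
USER = "dataprep_user"
PASSWORD = "********"

# Easy-connect style DSN: host:port/service_name.
# If your server is set up for SSL (the Use SSL checkbox), the connect
# string typically uses the tcps protocol instead of the default tcp.
dsn = f"{HOST}:{PORT}/{SERVICE_NAME}"

with oracledb.connect(user=USER, password=PASSWORD, dsn=dsn) as connection:
    with connection.cursor() as cursor:
        # List the tables visible to this user -- the same tables DataPrep
        # would offer for selection in the next step.
        cursor.execute("SELECT table_name FROM user_tables")
        for (table_name,) in cursor:
            print(table_name)
```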
7. Select the tables that you want to import.
8. You can also use a SQL query to select and import data (a sample query is sketched below).
The incremental fetch option is not available when data is imported from databases using a query. Click here to know more about incremental fetch from cloud databases.
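Info: As a rough illustration, a query entered in the SQL query option might look like the one below. The schema, table, and column names are invented for the example; adapt them to your own data, and optionally test the statement with the same python-oracledb driver before pasting it into DataPrep.

```python
import oracledb

# Hypothetical query -- replace the schema, table, and column names
# with ones from your own database.
IMPORT_QUERY = """
    SELECT order_id, customer_id, order_total, order_date
    FROM sales.orders
    WHERE order_date >= DATE '2024-01-01'
"""

# Same placeholder connection details as in the earlier sketch.
with oracledb.connect(
    user="dataprep_user",
    password="********",
    dsn="mydb.example.oraclecloud.com:1521/ORCLPDB1",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(IMPORT_QUERY)
        for row in cursor.fetchmany(10):  # preview a few rows
            print(row)
```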
9. Click Import to begin importing data from your Oracle Cloud account.
10. Once you complete importing data, the Pipeline builder page opens and you can start applying transforms. You can also right-click the stage and choose the Prepare data option to prepare your data using various transforms in the DataPrep Studio page. Click here to know more about the transforms.
11. Once you are done creating your data flow and applying the necessary transforms in your stages, you can right-click a stage and add a destination to complete your data flow.
Note: After adding a destination to the pipeline, you can try executing your pipeline using a manual run at first. Once you make sure the manual run works, you can then set up a schedule to automate the pipeline. Learn about the different types of runs here. While configuring the Schedule, Backfill, Manual reload, Webhooks, or Zoho Flow, the import configuration must be set up for all the sources. Without setting up the import configuration, the run cannot be saved.
Click here to know more about how to set up import configuration.
12. After configuring a run, a pipeline job will be created at run time. You can view the status of a job with granular details in the Job summary. Click here to know more about the job summary.

Note: You can choose to schedule the import using the Schedule import option available for datasets in your workspace or from the Import menu in the top bar of the DataPrep Studio page. Click here to know more.
To edit the Oracle Cloud connection
DataPrep saves your data connections to avoid the hassle of keying in the credentials every time you need to connect to a data source or destination. Credentials are securely encrypted and stored. You can always edit a saved data connection and update it with new parameters or credentials using the Edit connection option.
1. Choose Saved connections from the left pane in the Choose your data source box while creating a new dataset.
2. You can manage your saved data connections right from the data import screen. Click the ellipsis (...) icon to share, edit, view the connection overview, or remove the connection.
3. Click the Edit connection option to update the saved connection with new parameters or credentials.
4. Click Update to save your connection with updated parameters.
SEE ALSO