Import data from cloud databases
You can import data from the following cloud databases:
- Amazon RDS - MySQL
- Amazon RDS - MS SQL Server
- Amazon RDS - Oracle
- Amazon RDS - PostgreSQL
- Amazon RDS - MariaDB
- Amazon RDS - Amazon Aurora MySQL
- Amazon RDS - Amazon Aurora PostgreSQL
- Amazon Redshift
- Amazon Athena
- Microsoft Azure - MySQL
- Microsoft Azure - PostgreSQL
- Microsoft Azure - MariaDB
- Microsoft Azure - SQL Database
- Microsoft Azure - SQL Data Warehouse
- Google Cloud SQL - MySQL
- Google Cloud SQL - PostgreSQL
- Snowflake
- Oracle Cloud
- IBM Cloud - DB2
- Heroku PostgreSQL
- Rackspace Cloud - MySQL
- Rackspace Cloud - MariaDB
- Panoply
- MySQL
- MS SQL Server
- Oracle
- PostgreSQL
- MariaDB
- MemSQL
- DB2
With these connectors, you can streamline your ETL workflows, enable smooth data movement, and simplify data integration between various cloud databases and Zoho DataPrep.
To import data from a cloud database
1. Create a pipeline or open an existing pipeline from the Home Page, Pipelines tab, or Workspaces tab, and click the Add data option. You can also click the Import data option under the Workspaces tab to import data.
Info: You can also click the Import data icon at the top of the pipeline builder and bring data from multiple sources into the pipeline.
2. Select the Cloud databases category from the left pane and choose the required cloud database. You can also search cloud databases in the search box.
3. Choose your Database service name and Database type.
4. Enter your Database server host.
5. Enter your Database name, username, and password if authentication is required.
6. You can also select the Use SSL checkbox if your database server has been set up to serve encrypted data through SSL.
Note: The Connection name must be unique for each connection.
7. Click Connect.
Note: The connection configuration will be saved for importing data in the future. Credentials are securely encrypted and stored.
8. Select the tables that need to be imported. You can also use an SQL query to select and import data.
9. Click Import to begin importing data from your cloud database service.
10. Once you complete importing data, the Pipeline builder page opens and you can start applying transforms to the ETL pipeline. You can also right-click the stage and choose the Prepare data option to prepare your data using various transforms in the DataPrep Studio page.
Click here to know more about the transforms.
11. Once you are done creating your data flow and applying necessary transforms in your stages, you can right-click a stage and add a destination to complete your data flow.
Note: After adding a destination to your ETL pipeline, you can choose to schedule the import and export using the Schedule option available in the pipeline builder to automate the data movement. Click here to know more.
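If you choose the SQL query route in step 8, the query you paste is typically a single read-only SELECT statement. As a purely illustrative sketch (this is not DataPrep's actual validation logic, and the table and column names below are hypothetical), you can sanity-check a query locally before pasting it into the import dialog:

```python
# Illustrative sketch only -- not part of Zoho DataPrep. Checks that a
# query is a single statement that starts with SELECT, the usual shape
# of an import query. Table/column names below are placeholders.
import re

def looks_like_select(query: str) -> bool:
    """Return True if the query is one read-only SELECT statement."""
    stripped = query.strip().rstrip(";")
    if ";" in stripped:
        # A remaining semicolon suggests multiple statements.
        return False
    return re.match(r"(?is)^select\b", stripped) is not None

# Hypothetical import query for step 8:
example_query = """
SELECT id, customer_name, order_total
FROM orders
WHERE order_date >= '2024-01-01'
"""

print(looks_like_select(example_query))  # → True
```

This is only a convenience check on your side; the database itself remains the authority on whether the query is valid.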
To edit the cloud database connection
DataPrep saves your data connections to avoid the hassle of keying in credentials every time you need to connect to a data source or destination. You can always edit a saved data connection and update it with new parameters or credentials using the Edit connection option.
1. Click Saved data connections from the Choose a data source box while creating a new dataset.
2. You can manage your saved data connections right from the data import screen. Click the ellipsis (3 dots) icon to share, edit, view the connection overview, or remove the connection.
3. Click the Edit connection option to update the saved connection with new parameters or credentials.
Troubleshooting
If you're unable to connect to the cloud database:
- Verify that the required IP addresses are whitelisted on your database server. Refer to Zoho DataPrep IP address for the list of addresses to allow.
- Ensure that you are using SQL authentication, as Zoho DataPrep only supports this authentication method.
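To help isolate whitelisting or firewall issues, you can first confirm that the Database server host and port are reachable at all. The sketch below (not part of Zoho DataPrep; the hostname is a placeholder) uses only the Python standard library. Note that it checks TCP reachability only -- database SSL is usually negotiated inside the database protocol, so this does not test SSL itself:

```python
# Illustrative pre-check, not part of Zoho DataPrep: verify that the
# Database server host and port are reachable from your network.
import socket

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused connections, timeouts, and DNS failures all land here --
        # typical symptoms when the required IPs are not whitelisted.
        return False

# Placeholder host; common default ports: MySQL/MariaDB 3306,
# PostgreSQL 5432, SQL Server 1433, Oracle 1521, Redshift 5439.
if __name__ == "__main__":
    print(is_reachable("mydb.example.com", 5432))
```

If this returns False from the network DataPrep connects from, fix the whitelist or firewall rule before revisiting the connection settings.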