Experian Data Quality Community
-
Databricks JDBC driver for read and write workloads
Aperture Data Studio supports both the JDBC v2 (Simba / legacy line) and JDBC v3 (Databricks OSS / current and future line) drivers. These are third-party drivers that must be downloaded separately, but will have some configuration support in the user interface. Databricks (via the JDBC driver) is also a supported system for…
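For context, a bare-bones connection through a Databricks JDBC driver looks like the sketch below. The URL keys (httpPath, AuthMech, UID/PWD) follow the Simba-style connection-string scheme, and the host, HTTP path, and token are placeholders, so treat this as an illustration rather than the thread's exact configuration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class DatabricksJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder workspace host, SQL warehouse HTTP path, and personal
        // access token; the driver jar must be on the classpath.
        String url = "jdbc:databricks://<workspace-host>:443"
                + ";httpPath=<sql-warehouse-http-path>"
                + ";AuthMech=3;UID=token;PWD=<personal-access-token>";

        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1)); // connectivity smoke test
            }
        }
    }
}
```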
-
Using ❄️ Snowflake with Aperture Data Studio
Snowflake is a popular cloud-based data warehouse used by many organizations to store, process, and analyze large volumes of diverse data. Connecting Snowflake to Aperture Data Studio: Aperture Data Studio can import data and metadata from file stores, CRMs, and databases, including Snowflake. To connect to Snowflake…
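As a rough illustration of what such a connection involves at the JDBC level (outside Data Studio), the sketch below uses the standard Snowflake JDBC driver; the account identifier, credentials, and context properties are all placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class SnowflakeJdbcSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("user", "<user>");           // placeholder credentials
        props.put("password", "<password>");
        props.put("warehouse", "<warehouse>");  // optional session context
        props.put("db", "<database>");
        props.put("schema", "<schema>");

        // Placeholder account identifier in the standard Snowflake URL form.
        String url = "jdbc:snowflake://<account>.snowflakecomputing.com/";
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: "
                    + conn.getMetaData().getDatabaseProductName());
        }
    }
}
```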
-
Connecting to an Amazon RDS MySQL Database
A user recently asked about connecting Data Studio to their AWS RDS MySQL database. The first thing to note is that MySQL on RDS uses the MySQL Community edition, which is not supported by Data Studio's native JDBC driver. When attempting to connect, you'll receive this error: "Connections to MySQL Community Server are not…
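One commonly suggested workaround (an assumption here, since the post is truncated) is to register an alternative driver that speaks the MySQL wire protocol, such as MariaDB Connector/J. A minimal sketch with a placeholder RDS endpoint and credentials:

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class RdsMySqlSketch {
    public static void main(String[] args) throws Exception {
        // MariaDB Connector/J can talk to a MySQL Community Server instance,
        // which makes it a candidate substitute for the bundled driver.
        String url = "jdbc:mariadb://<rds-endpoint>.rds.amazonaws.com:3306/<database>";
        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>")) {
            System.out.println("Connected to "
                    + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}
```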
-
Connecting to a SAP HANA database using Data Studio
From the Data Studio v1.6.1 release onwards, we've made it easier to connect to a SAP HANA database directly from Data Studio. With the SAP HANA JDBC connection you can load data from HANA into Data Studio, or export it back. First, you'll need to locate the SAP HANA driver. The driver (ngdbc.jar) is installed as part of the…
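Once ngdbc.jar is on the classpath, a bare connection test looks roughly like the sketch below; the host, port, and credentials are placeholders, and DUMMY is HANA's built-in one-row table.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HanaJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder host and SQL port; ngdbc.jar supplies the driver.
        String url = "jdbc:sap://<host>:<port>/";
        try (Connection conn = DriverManager.getConnection(url, "<user>", "<password>");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM DUMMY")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // DUMMY returns a single 'X'
            }
        }
    }
}
```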
-
SSL certificate error connecting to SQL Server after applying entitlement CA cert
This discussion was created from comments split from: ⚠️ licensing certificates issue
-
Salesforce import fails
Hi, this is one of the error messages. In some cases there is no error message in the log file even though no rows have been imported. "Caused by: com.experian.aperturedatastudio.jdbc.sforce.phoenix.sql.ae: Socket Error: Read timed out Caused by: java.net.SocketTimeoutException: Read timed out" Do you know what can cause…
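The quoted trace bottoms out in java.net.SocketTimeoutException, i.e. the Salesforce endpoint took longer to respond than the client was willing to wait. A small, generic sketch (plain Java, not Data Studio code) of walking an exception's cause chain to confirm that a failed import is timeout-related:

```java
import java.net.SocketTimeoutException;
import java.sql.SQLException;

public class TimeoutDiagnosis {
    // Walk the cause chain to see whether a failure ultimately
    // bottoms out in a socket read timeout.
    static boolean causedByReadTimeout(Throwable t) {
        for (Throwable cause = t; cause != null; cause = cause.getCause()) {
            if (cause instanceof SocketTimeoutException) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Simulated exception with the same shape as the one in the post.
        SQLException outer = new SQLException("Socket Error: Read timed out",
                new SocketTimeoutException("Read timed out"));
        System.out.println(causedByReadTimeout(outer)); // prints true
    }
}
```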
-
Enabling hard delete using the Salesforce connector
When deleting data from a Salesforce object, you can tell the API to hard delete the data so that it does not go into the recycle bin and therefore can't be recovered. It's also possible to do this with DELETEs in the Export step using the native Salesforce JDBC driver. Here are the steps: In your Salesforce External…
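For illustration only, the DELETE itself is ordinary JDBC; the sketch below shows the shape of such a statement with a placeholder connection URL and record Id, while the hard-delete switch itself lives in the (truncated) External System configuration steps above.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class SalesforceDeleteSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: the real connection details are configured in
        // Data Studio's External System settings, including hard delete.
        String url = "jdbc:<salesforce-driver-url>";
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     "DELETE FROM Contact WHERE Id = ?")) {
            ps.setString(1, "<record-id>"); // placeholder Salesforce record Id
            int deleted = ps.executeUpdate();
            System.out.println(deleted + " row(s) deleted");
        }
    }
}
```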
-
Include Metadata in the Export step
What problem are you facing? I would like the Export step (specifically the JDBC option) to include metadata about the target system and table (e.g. table name, DBMS, credentials used, etc.), as this would help users/DBAs debug and resolve any issues, or be included in a point-in-time audit…
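Much of the requested metadata is already exposed by the standard java.sql.DatabaseMetaData interface. As a sketch of what such an audit record could contain (not an existing Data Studio feature; the target URL is a placeholder):

```java
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.DriverManager;

public class ExportAuditSketch {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:<target-system-url>"; // placeholder export target
        try (Connection conn = DriverManager.getConnection(url)) {
            DatabaseMetaData md = conn.getMetaData();
            // The kind of point-in-time audit record the idea asks for:
            System.out.println("target.url    = " + md.getURL());
            System.out.println("target.dbms   = " + md.getDatabaseProductName()
                    + " " + md.getDatabaseProductVersion());
            System.out.println("target.user   = " + md.getUserName());
            System.out.println("target.driver = " + md.getDriverName()
                    + " " + md.getDriverVersion());
        }
    }
}
```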