Learn, collaborate and solve problems with like-minded data quality enthusiasts.
The Data Quality Community is a great place to collaborate, seek answers, and learn about Aperture Data Studio. Register for free to join the discussions.
Welcome! Find out how to get the most from The Community.
This is the place to ask a question or to start a discussion.
Get help and keep learning and expanding your knowledge.
Find out about new features, download the latest releases, and read case studies.
Find out what's coming up, suggest ideas, and take part in exciting research with us.
Hi all, I hope you have had a great start to the week. Thank you to everyone who has signed up as a Product Advisor; your contribution is invaluable to us. The community of Advisors has grown massively from 20 to over 120 members in just over a year, but we are always striving to further diversify the opinions and voices…
A user recently asked about connecting Data Studio to their AWS RDS MySQL Database. The first thing to note is that MySQL on RDS uses the MySQL community edition, which is not supported by Data Studio's native JDBC driver. When attempting to connect, you'll receive this error: "Connections to MySQL Community Server are not…
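The post above is truncated, so the eventual fix isn't shown here, but a useful first step when you hit that error is to confirm the RDS instance itself is reachable and really is running the community edition, so you know the problem is the driver restriction rather than networking or credentials. A minimal sketch in Python (the endpoint, user, password, and database names below are hypothetical placeholders, not values from the post):

```python
# Sketch: connect to the RDS MySQL instance outside Data Studio and report the
# server version, to isolate the "Connections to MySQL Community Server are not
# supported" error from any network/credential issue.
import pymysql  # pip install pymysql

conn = pymysql.connect(
    host="my-instance.abc123xyz.eu-west-1.rds.amazonaws.com",  # hypothetical RDS endpoint
    port=3306,
    user="datastudio_user",      # hypothetical credentials
    password="********",
    database="customers",        # hypothetical schema
    connect_timeout=10,
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT VERSION()")
        print("Connected, server version:", cur.fetchone()[0])
finally:
    conn.close()
```

If this connects cleanly and reports a community-edition version string, the issue lies with the driver Data Studio is using rather than with the RDS instance.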
This in-person event will offer valuable insights from industry leaders, networking opportunities, and key takeaways from our recent Data Governance Survey, featuring THE Data Governance Coach – Nicola Askham! Here's what you can look forward to: Expert insights from Coman Wakefield, CTO of Experian Data Quality Industry…
I am on Aperture version 2.8.8.27 and am trying to connect to Google Sheets through BigQuery. BigQuery is set up with a link to the Google Sheet, and the connection to BigQuery from Aperture seems to work OK. The External System connection is OK, using a Service Account with a .json private key file. But when I try to add a dataset in Aperture…
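One way to narrow this down is to check, outside Aperture, that the same service account key can query the Sheets-backed BigQuery table at all. A hedged sketch follows; the key file path, project, dataset, and table names are hypothetical, and note that external tables backed by Google Sheets typically also need the Drive scope and the sheet shared with the service account's email address:

```python
# Sketch: verify the service account key can read the Sheets-backed BigQuery table.
from google.oauth2 import service_account
from google.cloud import bigquery

creds = service_account.Credentials.from_service_account_file(
    "aperture-service-account.json",  # hypothetical .json private key file
    scopes=[
        "https://www.googleapis.com/auth/bigquery",
        "https://www.googleapis.com/auth/drive.readonly",  # often required for Sheets-backed tables
    ],
)
client = bigquery.Client(credentials=creds, project="my-gcp-project")  # hypothetical project

rows = client.query(
    "SELECT * FROM `my-gcp-project.my_dataset.sheet_table` LIMIT 5"
).result()
for row in rows:
    print(dict(row))
```

If this query fails with a permission or scope error, the dataset-add step in Aperture will fail for the same reason, independent of the External System connection test passing.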
What problem are you facing? As a Find duplicates user I want to be able to manually evaluate and change clusters of records where I think the rules have matched records incorrectly e.g. I've got a husband and wife in my data, both called M. Smith and they share the same postal address, email address and loyalty card…
We had a recent scenario where 1k jobs were sent to the queue, which flagged up a couple of UI/UX suggestions for the 'Jobs' page: Is it possible to multi-select to cancel jobs? We could only see the option to delete one job at a time in the queue. It would be good to be able to select all jobs, multiple jobs, or a single…
Has the process for updating the SSL certificate changed in the newer version? Previously I used to do this and hit save; now I see this. Does the section for the public certificate need to be empty? Your guidance would be really appreciated, as our certificate expires in a month or so on our dev server. Br, HS
What problem are you facing? When exporting data to Excel, I like to have each worksheet named according to the function or type of data being presented. To do this, the prior step must be named, and that name gets passed to the export step. However, if the workflow is changed and new steps are added prior to the export, the…
Hi, I thought I had this set up correctly, but testing reveals otherwise. I have a reusable workflow that is set up to send data passed to it to an external destination (as an Excel file). However, I only want to run this when the data being fed in has rows. The reusable workflow is fine when used in a workflow - it uses the…
Hi Team, The Aperture pre-production server database service stopped abruptly. 2025-03-04 22:00:48,784 INFO c.e.d.j.d.JobDatastoreImpl [jobControllerSystem-pekko.actor.default-dispatcher-2129] The repository service shutdown has already been invoked, nothing to log out. 2025-03-04 22:00:48,802 INFO…
You can easily get your data from Data Studio into Excel or Tableau using OData. OData (odata.org) is “an open protocol that allows the consumption of data via a simple and standard RESTful API”. If you choose to turn on OData in Data Studio (v2.2.3), you will be able to link your datasets (including snapshots) to any BI…
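Because OData is just a RESTful feed, you can also inspect it with any HTTP client before pointing Excel or Tableau at it. A minimal sketch, assuming an endpoint of this general shape (the URL, credentials, and dataset name below are hypothetical placeholders, not the actual Data Studio feed layout):

```python
# Sketch: browse a Data Studio OData feed with plain HTTP requests.
import requests

base_url = "https://datastudio.example.com/odata"  # hypothetical endpoint
auth = ("api_user", "api_token")                    # hypothetical credentials

# The OData service document lists the entity sets (datasets/snapshots) exposed by the feed.
service = requests.get(base_url, auth=auth, headers={"Accept": "application/json"})
print([entity["name"] for entity in service.json()["value"]])

# Pull the first few rows of one exposed dataset as JSON using the standard $top option.
rows = requests.get(
    f"{base_url}/CustomerSnapshot?$top=5",  # hypothetical entity set name
    auth=auth,
    headers={"Accept": "application/json"},
)
for record in rows.json()["value"]:
    print(record)
```

Excel (via Get Data > From OData Feed) and Tableau's OData connector consume the same base URL, so once this responds you can wire the BI tool straight to it.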
Hi, I want to explore using Rulesets, but the naming of rules is stumping me. I want my rule name to be the mapped column I am using, followed by a short description: Column=V_ID and description=not null. The rule name created, though, is "V_ID-Input - not null"; it is pulling in the input column from the ruleset definition…
I'm setting up a Dataset that automatically refreshes with data from files dropped into an External System dropzone (SFTP in this case). I only want to load specific files, determined by the file name. To do this, I was going to use the "Starts with file pattern" config option, but I couldn't find any information about the…