Best approach to profiling a large dataset (20 million records)
Hello,
I need to profile tables that have more than 20 million records; the tables are also updated with new records every week.
1) Is it right to profile such a large dataset, and will I run into any performance issues?
2) As the table is updated with new records on a weekly basis, is it possible to refresh my dataset in Experian by adding only the new records (an incremental load)?
Kindly suggest the best approach to profiling tables like this.
Comments
Hello,
Customers regularly profile volumes larger than this, so I doubt you will see any performance issues, but it does depend on the spec of the machine you are running Aperture on.
You probably do not need incremental loads in this case, but there are some previous discussions on the topic, such as:
1.
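For anyone who does want to go the incremental route, the usual pattern outside the tool is a simple watermark: remember the highest ID (or latest timestamp) from the previous run, extract only the rows above it, and feed that increment in as a new file. Below is a minimal sketch in Python/pandas under that assumption; the connection string, table, and column names (`customer_records`, `record_id`) are hypothetical examples, and none of this is an Aperture Data Studio API.

```python
# Minimal sketch of a watermark-based incremental extract, assuming a
# relational source with a monotonically increasing record_id column.
# Table, column, and connection details are hypothetical, not Aperture APIs.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://user:pass@host/db")  # hypothetical DSN

def extract_new_records(last_seen_id: int) -> pd.DataFrame:
    """Fetch only the rows added since the previous weekly load."""
    query = text("SELECT * FROM customer_records WHERE record_id > :last_id")
    return pd.read_sql(query, engine, params={"last_id": last_seen_id})

# Usage: persist the max record_id after each run and pass it back the
# following week, so only the new increment is extracted and handed to
# the profiling tool as a file.
new_rows = extract_new_records(last_seen_id=20_000_000)
new_rows.to_csv("weekly_increment.csv", index=False)
print(f"Extracted {len(new_rows)} new records")
```

The design choice here is just to keep the increment logic in the extraction step, so the profiling tool only ever sees the new weekly file rather than re-reading all 20 million rows.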