Cache, performance, and processes
Hi! We have just started using Aperture Data Studio. We tried to profile a hefty table (2 million+ records, 350+ columns). The process ran for a long time and got stuck at 95%. During that time, every other action we want to perform in the tool takes an eternity. We can't even open a value preview on a 4-row table because it loads forever.
We deleted the 2 million+ record profiling job, but everything still runs very slowly.
Our questions are:
1. Could this potentially be a cache problem, and can we clear the cache?
2. Can we somehow check what processes are running in the background and kill them?
3. Has anyone had a similar experience and found a good solution?
Answers
-
Hello, 2M records is a small number, but 350 columns is quite a wide dataset. You haven't mentioned any timings, though: how long did it take to even get to 95%?
Running processes are shown on the Jobs screen, and the resources being used are detailed in the 'Monitor.log' file. This can also help highlight where your VM/server might need additional resources if you run into something similar in the future.
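If you want to check Monitor.log from the command line rather than opening it in an editor, a quick sketch like the one below can surface recent resource-related entries. The log path here is an assumption (installs vary), so adjust it to wherever your Data Studio instance writes its logs:

```shell
#!/bin/sh
# Hypothetical log location — substitute your actual Data Studio install path.
LOG="${DATASTUDIO_LOG:-/opt/ApertureDataStudio/logs/Monitor.log}"

if [ -f "$LOG" ]; then
  # Show the most recent lines mentioning memory or CPU usage.
  grep -iE "memory|cpu" "$LOG" | tail -n 20
else
  echo "Log not found at $LOG - check your installation directory."
fi
```

Running it against a live log while a profiling job is stuck can show whether the server is starved for memory or CPU, which would support adding resources to the VM.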
In the next release of Aperture Data Studio there is actually a significant enhancement to profiling performance, so keep an eye out for this and test again in the future.
