A default value for "automatically delete data after N days"
What problem are you facing?
Almost all the data loaded into my Data Studio instance has a mandated 90-day retention period: 90 days after load, we want to delete any batches that remain. Most data is loaded into new Datasets (in new Spaces) because we typically work on short-term, one-off projects.
There are exceptions to this (e.g. static domains / lookup tables).
While Data Studio has an "Automatically delete older batches after N days" option, it must be explicitly set on each new Dataset that is created. Users often forget to set the "delete after N days" value on every Dataset created for a given project.
What impact does this problem have on you/your business?
Users have to set reminders either to manually review all Datasets and check that the correct "delete after N days" value is set, or to manually delete batches as required. This is time-consuming and error-prone. There is a risk that data is not deleted when it should be, and in turn that data retention policies are not met.
Do you have any existing workarounds? If so, please describe those.
As described above, extensive manual review to ensure data is deleted as required.
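To reduce the manual review effort in the meantime, the check could be scripted. The sketch below is a minimal illustration, assuming each Dataset can be represented as a dictionary with `name`, `delete_after_days` (None if the option is unset), and a list of `batches` with load timestamps; these field names are assumptions for illustration, not the actual Data Studio API, and in practice the Dataset list would be retrieved from the instance (e.g. via its REST API) rather than hard-coded.

```python
from datetime import datetime, timedelta, timezone

def find_unprotected_datasets(datasets, max_age_days=90):
    """Flag Datasets with no 'delete after N days' value set, and any
    of their batches already older than the retention window."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    missing_retention = []
    overdue_batches = []
    for ds in datasets:
        if ds.get("delete_after_days") is None:
            missing_retention.append(ds["name"])
            # With no automatic policy, old batches linger; flag them too.
            for batch in ds.get("batches", []):
                if batch["loaded_at"] < cutoff:
                    overdue_batches.append((ds["name"], batch["id"]))
    return missing_retention, overdue_batches

# Hand-made sample data standing in for a real API response.
datasets = [
    {"name": "ProjectA", "delete_after_days": 90, "batches": []},
    {"name": "ProjectB", "delete_after_days": None,
     "batches": [{"id": 7,
                  "loaded_at": datetime(2020, 1, 1, tzinfo=timezone.utc)}]},
]
missing, overdue = find_unprotected_datasets(datasets)
print(missing)   # ['ProjectB']
print(overdue)   # [('ProjectB', 7)]
```

A script like this could run on a schedule and alert the project team, but it remains a workaround; a true default, as requested below, would remove the need for it entirely.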
Do you have any suggestions to solve the problem? Feel free to add images if this helps.
My suggestion is a default value for "automatically delete data after N days", settable at the global, environment, or Space level (I am not sure which makes most sense). Rather than having to set the "delete after N days" value manually per Dataset, I would like to set a default that is applied to every new Dataset, and turn it off only for specific Datasets as exceptions, so that data retention requirements are met by default.
Comments
Hi Henry,
Thanks for the suggestion. I will review this and be in touch for further details. Thanks.
A related request is an option to prevent the default value from being overridden at the Dataset level.