Best Of
Re: Is it possible to export a file as an XML please?
Hi Charles
This is not possible 'out of the box' in Data Studio, but it would be possible to write a custom step to generate an XML file or call an API directly:
Coming soon, it will be possible to write a custom exporter for the Export step.
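To illustrate the kind of logic such a custom step or external script would contain (this is a plain-Python sketch, not the Data Studio SDK, and the column names are invented for the example), tabular rows can be serialised to XML like this:

```python
import xml.etree.ElementTree as ET

def rows_to_xml(rows, root_tag="Records", row_tag="Record"):
    """Serialise a list of dict rows (one dict per record) to an XML string."""
    root = ET.Element(root_tag)
    for row in rows:
        record = ET.SubElement(root, row_tag)
        for column, value in row.items():
            # One child element per column; values are coerced to text
            ET.SubElement(record, column).text = str(value)
    return ET.tostring(root, encoding="unicode")

xml = rows_to_xml([{"Id": 1, "Name": "Charles"}])
# -> '<Records><Record><Id>1</Id><Name>Charles</Name></Record></Records>'
```

A real custom step would read its rows from the workflow input rather than a Python list, but the element-per-column mapping is the core of it.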
Re: Grouping with TOTALS row?
I wish the cell style would persist when exporting to Excel - would be nice to keep the colours.
Re: Grouping with TOTALS row?
Hi James
You are correct, a Group then Union is how I would achieve this:
- Group
- Delete default 'Count' aggregate and add a new aggregate
- Select 'Multiple' and set Name to blank
- Select all (numeric) columns
- Set Aggregate to Sum and apply
- Then Union this with the existing data
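Outside Data Studio, the same Group-then-Union idea can be sketched in a few lines of Python (the column names here are invented for illustration):

```python
def append_totals_row(rows, numeric_columns, label_column, label="TOTALS"):
    """Return rows plus a final row summing the numeric columns.

    This mirrors the Group (Sum aggregates) then Union pattern: the totals
    row shares the same schema, with a label in the non-numeric column.
    """
    totals = {col: sum(row[col] for row in rows) for col in numeric_columns}
    totals[label_column] = label  # non-numeric column carries the row label
    return rows + [totals]

data = [
    {"Region": "North", "Sales": 120, "Units": 3},
    {"Region": "South", "Sales": 80, "Units": 2},
]
data = append_totals_row(data, ["Sales", "Units"], "Region")
# last row -> {"Sales": 200, "Units": 5, "Region": "TOTALS"}
```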
Optionally, depending on what you are doing with these totals, you could format / style the totals row using a Transform before the Union. For example, comma-separating these likely large numbers and colouring the totals row to make it stand out:
SetCellStyle(
    RegexReplace(value, '(?:^|\G)(\d{1,3})(?=(\d{3})+(?:\.\d+)?$)', '$1,', false),
    'Success'
)
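As a rough Python equivalent of the comma-grouping (Python's re module has no \G continuation anchor, so a zero-width lookaround pattern is used instead):

```python
import re

def format_thousands(value):
    """Insert thousands separators, leaving any decimal part untouched."""
    # Match every position that is preceded by a digit and followed by
    # groups of three digits running to the end of the integer part.
    return re.sub(r"(?<=\d)(?=(?:\d{3})+(?:\.\d+)?$)", ",", value)

format_thousands("1234567.89")   # -> "1,234,567.89"
```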
Re: Creating Solution- Selected all spaces
@Mahulima Thanks for confirming. Yes, I think this behaviour is what is causing it to become very slow.
I've raised this and will get it resolved as soon as possible.
Regards,
Ian
Re: Creating Solution- Selected all spaces
@Ian Hayden Hi Ian, I have access to 30+ spaces and by default they are all selected. It may be that the default setting of selecting all spaces is making it slower.
🛢️ Loading from OData using the Autonomous REST Connector
The Autonomous REST Connector JDBC driver packaged with Aperture allows you to load (and refresh) the results returned from REST API calls as Datasets.
OData is a standardized REST interface, which means that any data that can be queried via an OData API can be loaded into Aperture.
This article will demonstrate how to use the Autonomous REST Connector to retrieve data from OData, using the OData V4 example TripPin service.
Create a model with the Autonomous REST Composer
The first step is to use the Composer to create the configuration (a "model" file) that the driver will use. To launch the Composer, locate the DSautorest.jar file (likely to be in D:\ApertureDataStudio\drivers\jdbc) and double-click it.
This will launch a browser window. From here, navigate to arc.html, and create a new model with base URL https://services.odata.org/TripPinRESTierService.
Configure connection options
Next, we configure the model's connection options. The TripPin example service is quite simple: it uses SSL but does not require authentication, so we can leave most of the Composer settings as default. In the Security tab, switch Validate Server Certificate to false to avoid certificate validation issues in the browser as you build the model (you can switch this on later when using the connection in Aperture).
View data in the Composer
In the Configure Endpoints tab, we can check things are working by querying the API's /people endpoint. This should return normalized data based on the connector's sampling of the API:
Notice that as part of the normalization, the sampling has generated separate linked tables for FEATURES, ADDRESSINFO and EMAILS, which are arrays in the PEOPLE object that can contain multiple values. These tables are linked to the relevant row of PEOPLE using the POSITION / VALUE_POSITION key:
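The normalization the driver performs can be sketched as follows (hand-rolled Python over a simplified People payload; the exact columns the connector generates may differ):

```python
def normalise(people):
    """Split an array-valued field out of each record into a linked child
    table, keyed back to its parent row by POSITION / VALUE_POSITION."""
    people_table, emails_table = [], []
    for position, person in enumerate(people, start=1):
        emails = person.pop("Emails", [])
        people_table.append({**person, "POSITION": position})
        for email in emails:
            # VALUE_POSITION links each email back to its PEOPLE row
            emails_table.append({"VALUE_POSITION": position, "EMAILS": email})
    return people_table, emails_table

people, emails = normalise([
    {"UserName": "russellwhyte", "Emails": ["a@example.com", "b@example.com"]},
])
# people -> [{"UserName": "russellwhyte", "POSITION": 1}]
# emails -> two rows, both with VALUE_POSITION 1
```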
Download the .rest file and use in Aperture
You're ready to download the model to use in Aperture.
- Download the .rest file from the Composer and copy it to a location accessible by Aperture.
- In Aperture, create a new External System with the following settings:
- Type: JDBC
- DBMS: Autonomous REST 6.0
- REST configuration file: full path to the .rest file you just downloaded, for example "C:\ApertureDataStudio\drivers\jdbc\RESTfiles\TripPin.rest"
- You will need to create credentials, but these can be empty (name: "Not Required", username: "", password: "")
Now you can create a new Dataset using this External System. The available tables should appear as they did in the Composer:
Configure and load these Datasets:
Finally, use a workflow to join the tables together using the keys (e.g. join PEOPLE to EMAILS on POSITION = VALUE_POSITION):
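The join in this final workflow step can be sketched like so (plain Python over the normalized tables, assuming the key columns shown earlier):

```python
def join_on_position(people, emails):
    """Inner-join PEOPLE to EMAILS where PEOPLE.POSITION = EMAILS.VALUE_POSITION."""
    return [
        {**person, **email}          # merged row carries columns from both tables
        for person in people
        for email in emails
        if email["VALUE_POSITION"] == person["POSITION"]
    ]

joined = join_on_position(
    [{"UserName": "russellwhyte", "POSITION": 1}],
    [{"VALUE_POSITION": 1, "EMAILS": "Russell@example.com"}],
)
# -> one row containing UserName, POSITION, VALUE_POSITION and EMAILS
```

In the workflow itself this is just a Join step with those two columns as the key.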
Refining the configuration
The steps above provide the basics to get started. There are other things you'll likely need to configure for your OData API, including authentication, paging, and modifying data types or table names. You can make these changes in the Composer UI or by manually editing the .rest file.
Re: Workflow is not in an executable state but no errors exist
Thank you so much Ian, that has worked! Thanks for raising this as a bug too.
Re: Workflow is not in an executable state but no errors exist
Hi Sarah,
Josh forwarded me your workflow and I've taken a look at it.
The problem is that the StudentInitiative_1 source was changed and one of its fields was renamed (by the looks of it, INITIATIVEID changed to STUINITID). This field is used in its downstream Transform step and the "Filter For Coverage" step, which invalidates these steps and the workflow. This check is happening at publish/run time, but not when you open the workflow. When you open the workflow or change a source, Data Studio should detect that the source has changed and prompt you to remap the columns. I will raise a bug to look into why this has not happened in this case.
The easiest workaround is to switch the source to StudentInitiative (which shares the same schema), which will detect the change and prompt to remap the columns, then change it back to StudentInitiative_1 (which does not need remapping again). The workflow should become valid again.
Best Regards,
Ian
Re: Refresh dataset notification
To solve this you should be able to use the Stop execution when refresh failed option on the Source steps in your workflow (available from Aperture 2.8.4 onwards). This has the effect of causing the workflow to fail if auto-refresh on the source from the External System fails for any reason.
With the setting enabled, my job execution will fail when the External System cannot be reached:
Without the setting enabled (the default), the job will complete with a warning in the execution details:
You can also add an Automation to notify / alert (or carry out some other action) when any of these failures or warnings occur, for example:
This will send the following notification:
Note that the "Send notification" action and notifications inbox in my screenshot above are new in version 2.12, but in prior versions you can still send emails or trigger workflows / schedules.