Connecting to the ServiceNow REST API using OAuth2
The Autonomous REST Connector JDBC driver allows you to load (and refresh) the results returned from REST API calls as Data Studio Datasets, by translating SQL statements to REST API requests.
In this article I'll show how to call ServiceNow's REST API (using OAuth2 authentication) to bring response data into Data Studio. There are several steps:
- Register your OAuth app in ServiceNow
- Retrieve the OAuth token
- Configure the External System in Data Studio
- Load data from parameterized paths
Register your OAuth app in ServiceNow
In ServiceNow, navigate to System OAuth > Application Registry > New, and then select Create an OAuth API endpoint for external clients. The Client ID and Client Secret fields will populate when the app is saved.
Retrieve the OAuth token
There are lots of different ways to get the OAuth refresh token, which we will use to authenticate requests; I use Postman. Follow these steps to set up Postman to get the OAuth token(s) via the initial authorization flow:
- The callback URL should match what you have configured in the ServiceNow app
- Use the "Send client credentials in body" option in Postman
- Auth URL: https://{instancename}.service-now.com/oauth_auth.do
- Access token URL: https://{instancename}.service-now.com/oauth_token.do
- Scope: useraccount
- State: code
Clicking "Get New Access token" should prompt you to allow the connection
Make a note of the refresh_token.
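If you want to sanity-check the refresh token outside Postman, you can call the token endpoint directly. Here's a minimal curl sketch using the standard OAuth2 refresh_token grant; the instance name, client ID/secret and refresh token are placeholders you'd substitute with your own values:

curl -X POST "https://{instancename}.service-now.com/oauth_token.do" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=refresh_token" \
  -d "client_id={clientId}" \
  -d "client_secret={clientSecret}" \
  -d "refresh_token={refreshToken}"

A successful response is a JSON body containing a fresh access_token, which confirms the refresh token the driver will rely on is valid.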
Configure the External System in Data Studio
The attached ServiceNow.rest model file maps the ServiceNow REST API endpoints to tables.
To use it, remove the ".txt" extension, replace "{instancename}" in line 2 with your own SNOW instance name, and then place the file on the Data Studio server.
Now create a new External System in Data Studio:
- authURI=https://{instancename}.service-now.com/oauth_auth.do
- clientId=<Client ID from the SNOW app>
- tokenURI=https://{instancename}.service-now.com/oauth_token.do
- clientSecret=<Client Secret from the SNOW app>
- refreshtoken=<refresh_token from the Postman token generation>
You will need to create a Credential, but the username and password for this can be left blank.
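For example, with an instance called "acme" (every value below is made up purely for illustration), the filled-in properties would look something like:

authURI=https://acme.service-now.com/oauth_auth.do
clientId=1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d
tokenURI=https://acme.service-now.com/oauth_token.do
clientSecret=examplesecret
refreshtoken=xYz123exampleRefreshToken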
Load data from parameterized paths
The final step is to use the External System to load data from an endpoint. In this example we'll use the GET /now/table/{tableName}
method from the Table API. Note that this (like most ServiceNow API endpoints) is a parameterized request, and we will need to provide the tableName as part of the query.
In Data Studio we do this by creating a Dataset from a SQL query. Create a new Dataset, select the ServiceNow External System, and select Database query:
The query SELECT * FROM AUTOREST.TABLERECORDS WHERE TABLENAME = 'incident'
will bring back data into my new dataset from the GET Table API endpoint for the table named "incident".
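Behind the scenes, the driver translates the WHERE clause into the path parameter, so the query above is issued as a request along these lines (illustrative only; paging query parameters such as sysparm_limit are omitted):

GET https://{instancename}.service-now.com/api/now/table/incident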
Comments
-
Thank you for giving detailed steps. I just followed the steps and tried to establish the connection, but I am getting this message when I try:
- My Aperture settings: the refresh token I received from Postman; the rest of the information is the same as what I used in Postman, where I got a response.
- For the .rest file, I used yours, changed the extension from ".txt" so that it's a .rest file, and changed the instance name in the file.
-
@Mahulima The error suggests a problem with the .rest file (or the driver's ability to read it), since the schema for TABLERECORDS is defined there rather than in ServiceNow.
Your .rest file formatting may be the cause. Can you double-check it's been downloaded correctly? It should look like this (the section which defines the GET table → TABLERECORDS mapping):
-
I see this
-
Hi @Mahulima
Can you try removing the schema name (AUTOREST) from the query, so just run:
SELECT * FROM TABLERECORDS WHERE TABLENAME = 'incident'
Also, could you try to load a table that models a REST API call that is not parameterized: CINOTALLOWEDACTION. For this, you should be able to just select the table name rather than specifying a SQL query.
-
@Henry Simms I tried without the schema and the result is the same.
Also, I don't see that table in my table list.
I see this kind of table only. So is it because of parsing the .rest file? Instead of the table names I am getting Path, Path_1 and so on.
I think we are very close to setting up the connection :) Please help us a bit more for a successful completion.
-
@Mahulima It does look like the .rest file is not being read correctly. Here's the list of tables you should see, as defined by the file I shared:
Can you try using the same External System connection, but change the .rest file to this small "test" one that just contains a single table definition (which is not parameterized):
This should show one table (INCIDENTTABLE) which can be loaded.
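Since the attachment isn't visible inline: a minimal non-parameterized table entry would look roughly like the sketch below, modelled on the cmdb_ci_appl example later in this thread. The exact column list is an assumption for illustration.

"incidenttable": {
  "#path": "/now/table/incident /result",
  "#maximumPageSize": 100,
  "#pageSizeParameter": "sysparm_limit",
  "#rowOffsetParameter": "sysparm_offset",
  "sys_id": "VarChar(64),#key",
  "sys_created_by": "VarChar(64)",
  "sys_created_on": "VarChar(64)",
  "number": "VarChar(64)",
  "short_description": "VarChar(256)"
}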
-
I tested with the above one and there also I see this.
But with the big .rest file, if I point it at Progress DataDirect as a model, and select both OAuth2 and Basic as the Auth Type, then I am able to get the API response in Progress DataDirect, which I was not able to do earlier.
-
Another observation: with the attached .rest file, in my ADS I saw only _HTTP coming through as a table. I tested adding [ before and after Account, like below,
and then I see that I can get that as a table.
-
@Mahulima and I have been able to resolve this, but for future reference: this unusual behaviour (table names showing as "_HTTP", "_PATH_1", "_PATH_2" etc.) is what happens when the .rest file path is entered into the REST endpoint field in the External System config, rather than into the REST config file field:
Easy mistake to make, and an unexpected result!
-
@Henry Simms Thank you so much for all your help! :) The connection is successful now for us. :)
-
I have now tried to fetch information from other tables and am able to get records from those as well. The only problem now is that I am getting only 21 columns from each table.
Is it because of some permission issue? In the Postman response I am getting all columns of those tables, but through the External System I am getting only 21 columns for each table.
-
A few more observations: for all tables it is only getting specific system columns, for example –
SYS_ID
SYS_CREATED_BY
SYS_CREATED_ON
SYS_DOMAIN_LINK
SYS_DOMAIN_VALUE
Also, I tried querying the columns which are part of that table, as I am getting those column values in the Postman response, but those columns also throw an error.
-
@Josh Boxer @Henry Simms @henry_
The issue is still there. I have checked: for all tables it is only fetching records from some system columns, and not the actual column records. In fact my Postman response, which has the actual column records, does not contain these system columns. I have cross-verified the columns from the Postman response vs the columns from the External System.
Please help here!!
-
Hi @Henry Simms @henry_ @Josh Boxer @Seamus Kyle
Any update regarding the limited number of columns being fetched in the SNOW connection? We are very close to a successful connection, so we are eagerly waiting for your further response/guidance to get it up and running.
-
@Henry Simms @henry_ @Josh Boxer @Seamus Kyle Is there any update on this? This is very critical for us and we are blocked here. Please help.
-
Sorry Mahulima, but I am not an expert with this driver and often have to use some trial and error to get it working as required. If you have a Consultant to work with, they should be able to help you get it configured to your exact specifications.
-
Hi @Josh Boxer It is not about the exact configuration on my side of the system. The example shared in the Experian community also does not have all the details; it has only 21 columns, and in my case I am getting exactly the same number of columns. So my configuration is giving me exactly the same result as what is shared here in the community. Do you know someone from the Experian side who can help us here?
-
Hi @Mahulima , just following up on your question:
"I have checked for all tables its only fetching records from some system columns and not the actual column records"
The reason for this is that each table returned from the GET /now/table/{tableName} method (Table API) has a different schema, i.e. a different JSON structure for the returned table. The default ServiceNow.rest file included above was generated by sampling, and only includes 21 columns (all "sys_" values) because these are the only values that are consistently present across all tables.
The sampling process does not know what actual tableName values are available, so to successfully return the remaining columns you would need to generate an entry in the .rest file for each table.
Using the example of the cmdb_ci_appl table, we see that a response includes many fields:
To have these returned via the AutoREST driver, an entry for this specific table, including the mapping for these fields, needs to be in the .rest file. For example (returning just a sub-set of the values in this table):
"cmdb_ci_appl_table": {
"#path": [
"IDENTITY /now/table/cmdb_ci_appl/{table_sys_id} /result",
"/now/table/cmdb_ci_appl /result"
],
"#maximumPageSize": 100,
"#pageSizeParameter": "sysparm_limit",
"#rowOffsetParameter": "sysparm_offset",
"sys_id": "VarChar(64),#key",
"cost_cc": "VarChar(64)",
"install_status": "VarChar(64)",
"name": "VarChar(64)",
"used_for": "VarChar(64)",
"sys_created_by": "VarChar(64)",
"maintenance_schedule": {
"link": "VarChar(256)",
"value": "VarChar(64)"
},
"attested": "VarChar(64)"
},

This will then display "cmdb_ci_appl_table" as a Dataset and allow the selected cmdb_ci_appl values to be returned:
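As a usage sketch (assuming the default AUTOREST schema, with identifiers normalized to upper case by the driver), the new table can then be queried like any other:

SELECT SYS_ID, NAME, INSTALL_STATUS, USED_FOR
FROM AUTOREST.CMDB_CI_APPL_TABLE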
To return more values, or values for other tables, new entries will need to be made in the .rest file. This can be done manually but the process can be sped up significantly by using the Autonomous REST Composer.
I've attached below the modified .rest file I'm using which includes the additional entry (lines 49-68) for the cmdb_ci_appl table.
-
@Henry Simms Thank you for the response. Is it possible to give me one example of how this can be done using the Autonomous REST Composer? Manually adding so many fields might be a very difficult task, and the chance of human error is higher.
-
The Autonomous REST Composer is built into the JDBC driver and can help you build out your connection, along with the "Getting started with Autonomous REST" tutorial guide created by Progress DataDirect, who provide the driver.
To run the composer, either:
- Double-click on the driver file DSautorest.jar, which you will find as part of the Aperture Data Studio installation, e.g. in D:\ApertureDataStudio\drivers\jdbc\ or similar. This will start a web app, which you can open by entering arc.html in the browser, or
- Run javaw -jar DSautorest.jar --design
From here you can import your existing model, use your OAuth creds to authenticate, and then in the Configure Endpoints tab generate your REST file config. You can find more on this process in these articles: