It is common to back up Qlik Sense Client Managed servers at a server level, often with a virtual machine snapshot. Applications can also be backed up by keeping copies of them in different streams or work areas on the server. Apps can be exported one at a time using the Export function, which is time consuming, or a file level backup can be made by copying the repository, but the apps are then hard to identify as they all have a GUID for a name. This post looks at a simple way that all apps can be copied to a different location as QVF files, with sensible file names.

API Connections in Qlik Sense Client Managed

Back Up All Qlik Sense Applications

Straight out of the box with a clean install, Sense has a number of REST connections created for you to explore your environment. It's well worth exploring what these connections offer, particularly for looking at users and usage.

For this post though we are going to be looking at the monitor_apps_REST_app connection.

Create a new app and go to the Data Load Editor. If you can see the monitor_apps_REST_app connection we are good to go. If it’s not there you will need to grant yourself rights in the QMC under the Data Connections settings, find the correct connector and select Security Rules.

Qlik Connector Settings

From here you can grant yourself access to the one connection by your name or user id:

User Permissions Connection

Or alternatively, if you are a RootAdmin on the QMC you may want to grant the RootAdmin role access to all connections on the server. This means that administrators can explore all of the default connections and also keep an eye on what users are creating. Do this by replacing the GUID of the connection in the Resource Filter with an asterisk. Obviously, you can tailor the permissions to be whatever you need.

RootAdmin Permissions Connection
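
For illustration only (this GUID is made up), the Resource Filter on the rule might go from something like this:

DataConnection_9f2c1b3a-7d4e-4a21-bc55-1234567890ab

to this:

DataConnection_*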

When you go back to the Data Load Editor in another tab, the connections will not have appeared automatically. You will need to refresh the browser with Shift-F5, or click the refresh button in the browser toolbar.

Loading Data From REST API Connection

Now you have access to the correct REST connection, you can use the Select Data icon next to the monitor_apps_REST_app connection to look at the data. Alternatively, you can use the script here to pull the information that you need.

Before we connect to the data though, we are going to set up some variables which describe where the apps currently reside and where we want to back them up to. The apps will be in your repository, the location of which will have been set when Sense was installed. There will be a share, most likely on the Sense server, pointing to this location. In here find the Apps folder, which contains a file for each app, with a GUID for a name and no extension. The destination is simply where you want to copy to. Here I create a dated folder for the backup:

// Set up locations of Sense Repository and a backup location
// Used in the backup script field
let vSrcFolder = 'c:\QlikSenseRepository\Apps\';
let vDestFolder = 'c:\SenseBackups\AppBackup' & date(today(), 'YYYYMMDD') & '\';

A connection then needs to be made to the REST API in order to load data from it:

// Connect to the REST connection
LIB CONNECT TO 'monitor_apps_REST_app';

With this in place we can then pull the data from the REST connection, using this code:

// Load raw JSON Data
RawRESTData:
SQL SELECT
"id" AS "id_u1",
"createdDate",
"modifiedDate",
"modifiedByUserName",
"name" AS "name_u1",
"publishTime",
"published",
"description",
"fileSize",
"lastReloadTime",
"thumbnail",
"savedInProductVersion",
"availabilityStatus",
"schemaPath",
"__KEY_root",
(SELECT
    "userId",
    "userDirectory",
    "name",
    "__FK_owner"
FROM "owner" FK "__FK_owner"),
(SELECT
    "name" AS "name_u0",
    "__FK_stream"
FROM "stream" FK "__FK_stream")
FROM JSON (wrap on) "root" PK "__KEY_root";

As with loading data from just about any REST connection, the data will come down in a single table, even though there are three logical tables in the data model. Each row of data will have values in the columns of one of the three sets (Application Details, User Details and Stream); the other columns will be null. Alongside these values are primary and foreign keys to join things up correctly.

Using Select Data, the Qlik engine will create some code which loads three separate tables into the end data model. In this case, however, we know there is a one-to-one relationship between an Application, its Owner and the Stream it is in (if the application is published). We can therefore create a cleaner and flatter data model with ApplyMap.
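
For comparison, the code generated by Select Data typically splits the single REST table into separate linked tables with RESIDENT loads, roughly along these lines (a sketch only, and not what we will use here):

// Rough sketch of the generated multi-table approach
[owner]:
LOAD
[userId],
[userDirectory],
[name],
[__FK_owner] AS [__KEY_root]
RESIDENT RawRESTData
WHERE NOT IsNull([__FK_owner]);

// ...with similar RESIDENT loads for the stream and root (application) fields,
// all associating on __KEY_root, before the raw table is dropped.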

There is much more we can do here too, to make it easier to select the apps we want to back up and to do further analysis on them. The same data feed can be used to help us identify apps which are not being maintained or have become bloated.

Cleaning Up The Application Table

Now that we have all of the data in a temporary table, we need to clean it up and augment it. This will be done with a number of RESIDENT loads and a couple of Preceding Loads.

First we are going to create some mapping tables, for use with ApplyMap, to bring in the Owner name, Owner login and Stream. These are keyed on the foreign keys in the flat table.

// Create mapping tables from parts of the REST result
Map_Owner:
MAPPING LOAD
[__FK_owner],
[name]
RESIDENT RawRESTData
WHERE NOT IsNull([__FK_owner]);

Map_OwnerLogin:
MAPPING LOAD
[__FK_owner],
[userDirectory] & '\' & [userId] as Login
RESIDENT RawRESTData
WHERE NOT IsNull([__FK_owner]);

Map_Stream:
MAPPING LOAD
[__FK_stream],
[name_u0]
RESIDENT RawRESTData
WHERE NOT IsNull([__FK_stream]);

Next we create some variables which act as functions in the subsequent RESIDENT load. These all make use of the fact that a SET statement does not evaluate the code that it puts into a variable (it copies the code verbatim), combined with the ability to pass parameters into a variable when it is expanded.
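
As a quick illustration of the difference between SET and LET (these variable names are just for demonstration and are not used elsewhere):

// LET evaluates the expression and stores the result
let vLoadDate = today();    // holds today's date
// SET stores the text exactly as written, ready to be expanded later
set vLoadExpr = today();    // holds the literal text: today()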

// Create parameterised variables for repeated tasks
set vTime = Date(if(Date#(left($1, 10), 'YYYY-MM-DD') < 0, null(),
Date#(left(replace($1, 'T', ' '), 19), 'YYYY-MM-DD hh:mm:ss')),
'DD MMM YYYY hh:mm:ss') as [$2 Date Time];
set vDP = Date(DayStart([$1 Date Time]), 'DD MMM YYYY') as [$1 Date],
Date(MonthStart([$1 Date Time]), 'MMM-YYYY') as [$1 Month],
Month([$1 Date Time]) as [$1 Month Name],
Year([$1 Date Time]) as [$1 Year],
WeekDay([$1 Date Time]) as [$1 Day],
today() - DayStart([$1 Date Time]) as [Days Since $1];
set vNS = if($1 = '', 'Not Specified', $1);

The first function here receives the name of a date field and checks whether it holds a valid value; if it does, it returns that value converted to a date and formatted correctly. The second takes the date time field created by the previous variable (available thanks to a preceding load) and creates various date parts from it. The final variable checks a given field and returns Not Specified if the value is empty; otherwise it returns the value.
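
To make that concrete, when the script later calls $(vTime(createdDate, Created)), the dollar-sign expansion substitutes createdDate for $1 and Created for $2, so the engine effectively sees:

Date(if(Date#(left(createdDate, 10), 'YYYY-MM-DD') < 0, null(),
Date#(left(replace(createdDate, 'T', ' '), 19), 'YYYY-MM-DD hh:mm:ss')),
'DD MMM YYYY hh:mm:ss') as [Created Date Time]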

You will see all of these variables used in the next block of code.

We can then do our final RESIDENT load to bring through all of the original data, with the various transformations applied.

Applications:
LOAD
1 as ApplicationCount,
*,
RangeMin([Days Since Created],
[Days Since Modified],
[Days Since Published],
[Days Since Last Reload]) as [Days Since Last Touch],
// Export this column to create a script to backup your apps from the Repository
'copy $(vSrcFolder)' & GUID &
' "$(vDestFolder)' & Stream & ' -- ' &
Application & '.qvf"' as [App Backup Script]
;
LOAD
*,
// Create all required date parts
$(vDP(Created)),
$(vDP(Modified)),
$(vDP(Published)),
$(vDP(Last Reload))
;
LOAD
id_u1 as GUID,
// Convert all required dates
$(vTime(createdDate, Created)),
$(vTime(modifiedDate, Modified)),
$(vTime(lastReloadTime, Last Reload)),
$(vTime(publishTime, Published)),
[modifiedByUserName] as [Modified By],
[name_u1] as Application,
[published] as [Is Published],
$(vNS(description)) as [Application Description],
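// Convert the file size (reported in bytes) to megabytes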
fileSize / 1000000 as [File Size],
$(vNS(thumbnail)) as Thumbnail,
[savedInProductVersion] as [Product Version],
[availabilityStatus] as Availability,
ApplyMap('Map_Owner', [__KEY_root], 'Not Known') as Owner,
ApplyMap('Map_OwnerLogin', [__KEY_root], 'Not Known') as [Owner Login],
ApplyMap('Map_Stream', [__KEY_root], 'None') as Stream
RESIDENT RawRESTData
WHERE NOT IsNull([__KEY_root]);

Then, to tidy things up and avoid any strange joins, we can remove the temporary table which was created when the data was pulled from the REST connection.

DROP TABLE RawRESTData;

And that is our load script done.

Unpacking The Application Back Up Code

There is quite a lot going on in that last step of the code. Hopefully you can work out what is going on with most, if not all, of it. Much of it is there to give us some extra fields for analysing the applications on our server (more on that later), but for now let's focus on the creation of the App Backup Script field.

'copy $(vSrcFolder)' & GUID & ' "$(vDestFolder)' &
Stream & ' -- ' & Application & '.qvf"' as [App Backup Script]

Here we start with the copy command we wish to use (you can replace this with XCopy or add parameters if you wish), add in the two variables we set up right at the top of the script, and pull out the GUID, which is the source filename for our copy. The output filename is then built and the whole statement is placed in a field.

You can also alter the construction of the output filename and path if you wish. I have gone for the stream name (which is set to None for items that are not published), a double hyphen (as apps and streams may already have hyphens in their names), the application name and finally the QVF extension.

This builds code that we can run in a Command Prompt on our Qlik Sense server to copy the apps to their backup location.
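
As an illustration, with the variables set at the top of the script a single row of the App Backup Script field might look something like this (the GUID, stream and application name here are made up):

copy c:\QlikSenseRepository\Apps\2b1e8a4c-5f6d-4e7a-9c3b-0d1e2f3a4b5c "c:\SenseBackups\AppBackup20240607\Finance -- Sales Dashboard.qvf"

Note that copy will not create the destination folder for you, so create the dated folder (for instance with mkdir) before running the generated commands.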

Creating and Using The Back Up Script

Now that our script is complete, click Load Data to bring all the data in from the REST connection. Once this has finished loading, click Analyse Sheet and change to Edit mode. Grab a Table object from the Charts panel and drop it on the sheet. Click to Add Dimension and add in the App Backup Script dimension.

Add App Backup Dimension

Come out of Edit mode and you can then right-click to export the table to Excel. Open this spreadsheet, copy and paste the code into a new Notepad (or other text editor) window, and save the script as a .bat file. This file can then be copied onto the server and executed to perform the backup.

Back Up Script Running

There is a slight risk of a file being corrupted if it is copied whilst a write is in progress, so it is best practice to stop all Sense services before taking the backup. In practice, I have never hit a snag with this; alternatively, you could check the date stamps on the copied files, see whether any were updated at around the time you took your backup, and copy those again.

So there you have a relatively quick and painless way of getting QVFs of all of your Qlik Sense Client Managed Server applications.

I hope you find the technique useful. Please let us know how you get on in the comments below.

Further Analysis of Your Application Data

As well as the information required to filter and export the backup script, there were various other data points brought in about the applications on the Sense server which you may want to analyse. Finding which applications have not been touched in over a year, for example.
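
Using the fields created in the load script above, a chart measure along these lines would count the applications that have not been touched for over a year:

Count({<[Days Since Last Touch] = {">365"}>} ApplicationCount)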

Fortunately we have your back here, with our Instant Sense Application. This is an application template which ships with a number of pre-written load scripts (including the one above) and metadata files, so that you can re-purpose the application to use your own data sources. Out of the box you can view a number of KPIs over this application data:

Some Application KPIs

It also allows you to break the various measures down over different dimensions and plot those different measures against each other, by a selected dimension:

Application Scatter Chart

Finally there is the user defined table, where the user can select a number of dimensions and measures and view these in a table which can then be exported as required:

Application Measure Table

Everything in the application is driven from a metadata spreadsheet, which means that you can configure the app to be used over whatever data source you like:

Instant Application Metadata

All in all there are twelve sheets of user configurable visualisations, just waiting for you to bring your own data. The best thing is that it is free to download and use.

Find out more about the application, and download it, from here:

https://www.quickintelligence.co.uk/instant-qlik-sense-app/

More technical details, including how the application is built, can be found in this Instant Sense Application blog post.

If you have any questions or comments about anything above, please do drop us a comment below.

If you would like to share a load script and metadata spreadsheet for the Instant Sense Application then we would love to hear from you.