There are a number of monitoring apps that ship with Qlik Sense Enterprise on Premise, some of which you need to upload to your site. All of these are black-boxes to an extent though. What if you want to build your own analysis over the Sense log files? Well, this blog post has you covered.
Logging on Qlik Sense Enterprise
With the default settings, logs in Sense Enterprise get written to two locations: one for the log file currently being written to, and an archive, to which the current log is moved periodically. By loading from both of those sources you get a good picture of what is going on with usage on your site. Connections are created to these two locations, named ServerLogFolder and ArchivedLogsFolder. There is a reasonable chance that you won’t see these connections in the Data Load Editor, so you will need to first go into the QMC and grant yourself rights – be careful not to grant more rights than you intend to when doing this!
Once you have access to these connections you can dig around and find out quite a lot about your system and how it is performing. This post, however, just focuses on the most important information: who has done what, and when.
Sense Session Log Load Script
Once you have permissions to the connections set up you can enter the Data Load Editor and start building your script.
First you need to set a couple of variables, which form part of the path to the session files. The first of these is generally the machine name, so can be left as is. The second tends to be a combination of the machine name and the local domain. You will need to click on the Select Data button underneath the connection name to confirm the path if it’s not what is expected.
// Set variables which identify this server and form part of the log names
// Check in the ServerLogFolder and ArchivedLogsFolder to find the values for your installation
let vServerName = ComputerName();
let vFullServerName = 'servername.domain.local';   // Usually FQDN
The next set of variables are then constructed from the connection name (which is usually the same in each installation) and the two variables you set above.
// Build out file paths and file masks based on the variables above
let vCurLog = 'lib://ServerLogFolder/';
let vLogFolder = 'lib://ArchivedLogsFolder/$(vFullServerName)/';
let vCurFile = '$(vCurLog)Engine/Trace/$(vServerName)_Session_Engine.txt';
let vArchiveFiles = '$(vLogFolder)Engine/Trace/$(vServerName)_Session_Engine_*.log';
As you would expect, there are a number of dates in the log file. Because the log is in a text file, rather than a database table, the dates are in plain text. This means they each need to be converted to dual-datatype dates so they can be used. The code for each of the conversions would need to be repeated, so here is a parameterised variable to do the job. If you have not used variables with parameters before you may want to see this previous blog post.
// Create parameterised variables for repeated tasks
set vDT = Date(Date#(left(replace($1, 'T', ' '), 15), 'YYYYMMDD hhmmss'), 'DD MMM YYYY hh:mm:ss');
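To see what this variable is doing, it helps to look at how a call to it is expanded at reload time. The field name passed in is substituted for the $1 placeholder, so a line like the first one below is expanded to the second before the script runs:

```qlik
// A call like this in a LOAD statement...
$(vDT(Timestamp)) as [Session End],

// ...expands at reload time to:
Date(Date#(left(replace(Timestamp, 'T', ' '), 15), 'YYYYMMDD hhmmss'),
     'DD MMM YYYY hh:mm:ss') as [Session End],
```

Note that a set statement (rather than let) is required here, so that the expression is stored as-is rather than being evaluated when the variable is created.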
The filename for the current log file has been built into a variable. Just in case we are between log files at this point, we are going to check first that it exists. If it does, we can bring in the fields that we want from the file. If you are a completist you may want to look at the source file and bring through the fields I have dropped. This is pretty much everything that is interesting, though.
Note that I am excluding sessions relating to reloads by the Sense Distribution service. If you are looking for log-jams on CPU use at specific times of day you may want to include these rows, but for most analysis they are not useful.
// Load the raw data from the current file
if alt(FileSize('$(vCurFile)'), 0) > 0 then

  Temp_Sessions:
  LOAD
     ActiveUserId,
     [App Title],
     [Session Start],
     [Exit Reason],
     Level,
     Hostname,
     ServiceUser,
     ActiveUserDirectory,
     Timestamp,
     [Exe Version],
     [Server Started],
     AppId,
     [Doc Timestamp],
     [Authenticated User],
     [Secure Protocol],
     [Session Duration],
     [CPU Spent (s)],
     [Bytes Received],
     [Bytes Sent],
     Calls,
     Selections
  FROM [$(vCurFile)]
  (txt, utf8, embedded labels, delimiter is '\t', msq)
  // Ignoring reload sessions. Remove next line if you want all sessions
  WHERE ActiveUserId <> 'sa_scheduler'
    and ActiveUserId <> 'sa_repository'
  ;

end if
For the archive log files we need to loop over all of the files in the archive folder. Again, we already have the file mask in a variable, so can just use this. You may be aware that a load can be done directly from a filename with an asterisk in it, to load all matching files. Personally, I would always use a for each loop, as it makes debugging and testing easier. It also lets you exclude files (such as files with a tilde in the name when loading Excel files) and works with the CROSSTABLE statement, which fails when used with an asterisk in the file name.
// Concatenate onto the temporary table the raw data from all of the archive files
// Strongly suggest building an archive of QVD files for historical logs and
// concatenating these at the end of the process - this is not done in this example though
for each vFile in FileList('$(vArchiveFiles)')

  Temp_Sessions:
  LOAD
     ActiveUserId,
     [App Title],
     [Session Start],
     [Exit Reason],
     Level,
     Hostname,
     ServiceUser,
     ActiveUserDirectory,
     Timestamp,
     [Exe Version],
     [Server Started],
     AppId,
     [Doc Timestamp],
     [Authenticated User],
     [Secure Protocol],
     [Session Duration],
     [CPU Spent (s)],
     [Bytes Received],
     [Bytes Sent],
     Calls,
     Selections
  FROM [$(vFile)]
  (txt, utf8, embedded labels, delimiter is '\t', msq)
  // Ignoring reload sessions. Remove next line if you want all sessions
  WHERE ActiveUserId <> 'sa_scheduler'
    and ActiveUserId <> 'sa_repository'
  ;

next
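The comment above suggests keeping historical logs in QVD files, which load much faster than raw text. A minimal sketch of that approach is below; the lib://QVDs connection and file name are assumptions (point them at a folder you have a connection to), and the incremental logic needed to avoid re-loading the same archive files twice is deliberately omitted here:

```qlik
// After the loop above, store what has been loaded to a QVD
// lib://QVDs and the file name are illustrative - use your own connection
STORE Temp_Sessions INTO [lib://QVDs/SessionLogHistory.qvd] (qvd);

// On subsequent reloads, the history can be pulled back in from the QVD
// and only newer archive files concatenated on top of it
Temp_Sessions:
LOAD * FROM [lib://QVDs/SessionLogHistory.qvd] (qvd);
```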
You should now have a single table with all of the log entries that are needed for analysis. There are a number of transformations and some general tidying up that are required. This is done with a RESIDENT load from our raw-data temporary table.
// Format all raw values, calculate new values and rename fields
Sessions:
LOAD
   1 as SessionCount,
   *,
   Date(DayStart([Session Start]), 'DD MMM YYYY') as Date,
   Date(MonthStart([Session Start]), 'MMM YYYY') as Month,
   Year([Session Start]) as Year,
   Month([Session Start]) as [Month Name],
   Hour([Session Start]) as Hour,
   WeekDay([Session Start]) as Day,
   Time(Round([Session Start] - DayStart([Session Start]), (1/288)), 'hh:mm') as Time,  // nearest 5 mins
   Floor([Session Start] - [Server Started]) as [Days Since Server Start],
   Floor(([Session End] - [Document Reloaded]) * 24) as [Hours Since Document Reload],
   Round([Bytes Received] / 1024, 1) as [KB Received],
   Round([Bytes Sent] / 1024, 1) as [KB Sent]
   ;
LOAD
   ActiveUserId as User,
   [App Title] as Application,
   $(vDT([Session Start])) as [Session Start],
   [Exit Reason],
   Level,
   Hostname,
   ServiceUser as [Service User],
   ActiveUserDirectory as [User Directory],
   $(vDT(Timestamp)) as [Session End],
   [Exe Version] as [Server Version],
   $(vDT([Server Started])) as [Server Started],
   AppId,
   $(vDT([Doc Timestamp])) as [Document Reloaded],
   [Authenticated User],
   [Secure Protocol],
   RangeMin([Session Duration], 0.5) as [Session Duration],  // clip as error can cause incorrect duration
   [CPU Spent (s)],
   [Bytes Received],
   [Bytes Sent],
   Calls,
   Selections
RESIDENT Temp_Sessions
;
There’s quite a bit going on in the script above, so to take it one bit at a time… Field names are being renamed to make them more ‘friendly’ for end users. The date variable created above is being applied to all date fields. I found the session duration occasionally has duff values (if a session looks to have been resumed), so those are capped at 12 hours. A number of new fields are added in the preceding load, including a counter (something I always do to make things more efficient), all of the date parts and some derived values for further calculations.
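The reason the counter field helps with efficiency is that it lets the front end use a straight sum rather than a distinct count. For example, a KPI or chart measure for total sessions could be written either way; the key field in the second expression is purely illustrative:

```qlik
// Total sessions - a simple sum over the counter field
Sum(SessionCount)

// ...is generally cheaper on large tables than a distinct count
// over a constructed key (illustrative field combination)
Count(DISTINCT User & '|' & [Session Start])
```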
As should always be done at the end of load scripts, there is then some tidying up of tables and variables to do.
// Drop the temporary table
DROP TABLE Temp_Sessions;

// Clean up all the variables we have used
let vServerName = ;
let vFullServerName = ;
let vCurLog = ;
let vLogFolder = ;
let vCurFile = ;
let vArchiveFiles = ;
let vFile = ;
let vDT = ;
And there you have it. Your Sense Enterprise log file, loaded, cleaned and ready to analyse.
So, How Do I Analyse My Sense Logs?
Obviously, you can build your own charts and tables over these data, but wouldn’t it be simpler if there was a pre-canned dashboard you could use? Well, we have your back on that one. Our Instant Sense Application can load any data set, along with some metadata about it, and present it in a user-configurable dashboard. There is a short video of me presenting the application on YouTube, here: https://www.youtube.com/watch?v=jN3Vj93v9x4.
The application is driven by a spreadsheet which defines the fields to be used for dimensions (in filters, charts and tables) and also the measures to calculate over those dimensions.
If you already have the latest version of the Instant Sense Application you can simply download the zip file with the load script and metadata spreadsheet from here:
If you have yet to download the app you can do so for free, we just ask for a few contact details so we can keep you up to date with new releases and fixes. This can be done from here:
Once you have downloaded the app you will need to upload it to your Sense server (via the QMC). At this point you can rename it to Sense Session Logs or similar. Open the app in your Work area and go to the Data Load Editor. Now find the SenseSessions folder under DataSets in the zip file. Open the load script text file and copy and paste the script over the script in the Main Data section of the load script. The SessionLogLookups.xlsx file needs to be copied to a folder you have a connection to in Sense, and then the path to it needs to be set in the variable on the Main sheet of the load script.
Once this has been done it should just be a case of reloading and you have your Sense Session Log Analyser.
There are a number of other example scripts included with the Instant Sense Application which you might like to explore. If you are interested in how the application was built, that is all explained in one of our previous blog posts.
If you have any comments or suggestions related to this post please post in the comments below.