Seq Reporter - Turn your structured logs into scheduled reports!

Uhh ... You want what?

So, you have all your apps logging to Seq, perhaps you have monitoring and alerting using apps like the Seq OpsGenie client, and maybe you're even using Event Timeout to detect events that didn't happen in time. Things are going great, except ...

Well, management now want SLA reports to track how things are going, and you have this great structured logging server with most, or even all, of the data you need for that report. How are you going to get it out?

You could perform queries against Seq and manually export the results to CSV or JSON using the inbuilt export functions. You could, perhaps, take screenshots of the pretty dashboard you've built, and use those. You could even plot your query data as a time series, bar chart, or pie chart and download the result as a PNG file. All quite doable - but all manual.

You could also get fancy and use seqcli to run a query and export to CSV. That has some potential for automation, and it formed the starting point for this effort.

I wanted to achieve, essentially, what seqcli does, but with some enhanced flexibility around the start and end times for the query, so that (for example) I could have a scheduled monthly report. I had a few other criteria in mind, too - so to build the basic list of requirements:

  1. Time expressions for scheduling - for example, the ability to configure a report starting from 1 month ago at midnight, ending at midnight today.
  2. Multiple reports - an easy way to set up different reports with different queries and periods, as needed.
  3. Integrated logging, to tie in with the monitoring and alerting that we've set up in our Seq implementation.
  4. Integrated email of reports.

Creating Seq Reporter

To reproduce the seqcli functionality, I would need to get to grips with the Seq API. This wasn't too difficult - I've done plenty of work with Seq, and the API is well defined. There was even an example app that showed how to essentially reproduce the seqcli functionality, and the API has a built-in method to return your results as a string in CSV format.

I initially tried simply adapting this code, but decided I wanted more control. If await connection.Data.QueryCsvAsync(query, rangeStartUtc, rangeEndUtc) encounters an error, it will simply return the error in the string, which doesn't bode well for reliably detecting and alerting failures.

I therefore elected to parse the data returned from connection.Data.QueryAsync(), which returns a QueryResultPart that has Columns and Rows properties for parsing and, importantly, also includes an Error string.

That makes detecting an error trivial, and we can log that as an Error. Logging it as an error means I can watch for that error in Seq, and send it off to Jira or OpsGenie to ensure it's alerted.

To parse the results, I elected to make use of an old favourite, CsvHelper, which makes it trivial to create CSV files. My favourite feature is being able to simply send a list of a given class to it and have an instant CSV file - but in this case, the Columns and Rows properties didn't readily lend themselves to that, so I resorted to writing individual columns as fields, with each row an individual record. Not a major drama.
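Since QueryResultPart exposes Columns, Rows, and an Error string, the logic reduces to: fail fast on Error, write the columns as a header, then each row as a record. My implementation is C# with CsvHelper, but here's a minimal Python sketch of that shape - the dict standing in for QueryResultPart is an assumption for illustration, not the real Seq API type:

```python
import csv
import io


def result_to_csv(result):
    """Convert a QueryResultPart-style dict (Columns, Rows, Error) to CSV text.

    Raises on Error so the failure can be logged and alerted, rather than
    silently embedding the error message in the report output.
    """
    if result.get("Error"):
        raise RuntimeError(f"Seq query failed: {result['Error']}")
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(result["Columns"])  # each column becomes a field
    writer.writerows(result["Rows"])    # each row becomes a record
    return out.getvalue()


# A result shaped like the Seq API's QueryResultPart
result = {
    "Columns": ["Date", "XfrTime", "Server"],
    "Rows": [["2021-07-01", "04:00:05", "SERVER01"]],
    "Error": None,
}
print(result_to_csv(result))
```

The key point is the explicit Error check up front - that's what QueryCsvAsync's string-with-embedded-error return doesn't give you.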

Which brings us to logging and alerting. My choices there are probably obvious - Lurgle.Logging and Lurgle.Alerting were brought to bear.

Lurgle.Logging made it easy to develop a scheme that would both read from Seq using the API, and log to Seq using the underlying Serilog. To allow for this, my Seq API key is created with Ingest and Read permissions, so that just a single Seq config is needed. I'm in the habit of using the old logging patterns for Lurgle.Logging, but obviously could have made use of the new logging patterns that I recently enabled.

Lurgle.Alerting gives me the ability to simply send the report, with a nicely formatted template that is easily changeable. The app benefits from the recent addition of plain text alternates, and of course I make use of the enhancement to the send results, to properly log email failures.

This all comes together nicely for a single purpose app ... but I did mention creating an easy way to setup multiple reports.

Multiple configs

We have an internal app that I developed for critical file transfers some years back. When I originally created it, it was for a single-purpose file transfer ... but I built in a bunch of features that made it usable for multiple transfers. Fast forward a few years, and that little app now supports a majority of our most critical SLA file transfers ... but it had a problem.

You see, every time someone needed a scheduled file transfer, they'd deploy a new copy of the app with a new config. All the config was in the app.config (appname.exe.config), so this was necessary. We ended up with multiple versions of the app all over the place, with old versions not benefitting from bug fixes and new enhancements, such as additions to logging, which I touched on in the original Lurgle.Logging blog post.

So I baked in an ability to specify an alternate config file, which meant that a single instance of the app could be used to perform any transfers. That helped to arrest proliferation of the app, with focus shifting instead to creating and maintaining different configs.

With the addition of common logging and alerting libraries (in the form of Lurgle), I further enhanced this capability. No longer would the transfer config be contained in the app.config - only in the transfer config files. The app.config would now contain only the logging and alerting "global" settings.

On top of that, I made it possible for each transfer to override logging and alerting settings, so (for example) a transfer could be set to use different email recipients, or log to a different folder, or use a different Seq API key. 

When it came to Seq.Client.Reporter, then, it was obvious that this approach would be ideal for achieving multiple reports without a proliferation of reporter versions. Hence - a single instance of Seq.Client.Reporter can be used with the command line:

seq.client.reporter.exe -config=<path to report config>

The Seq.Client.Reporter.exe.config file

Your Seq Reporter instance needs configuration for logging and alerting, and this can be set in a single spot - Seq.Client.Reporter.exe.config. You can set your logs and alerts once, and only add them to your report configs when you need to override the "global" config.

The log and alert settings have previously been covered in the Lurgle blog posts - the configuration opportunities are extensive, and should be largely self-explanatory.

  <add key="EnableMethodNameProperty" value="true" />
  <add key="EnableSourceFileProperty" value="true" />
  <add key="EnableLineNumberProperty" value="true" />

  <add key="AppName" value="Seq.Client.Reporter" />
  <add key="LogType" value="Console,File,Seq" />

  <add key="LogMaskProperties" value="" />
  <add key="LogMaskPolicy" value="None" />
  <add key="LogMaskPattern" value="XXXXXX" />
  <add key="LogMaskCharacter" value="X" />
  <add key="LogMaskDigit" value="*" />

  <add key="LogConsoleTheme" value="Literate" />
  <add key="LogFolder" value="C:\TEMP\TEMP\log" />
  <add key="LogName" value="Reporter" />
  <add key="LogExtension" value=".log" />
  <add key="LogFileType" value="Text" />
  <add key="LogDays" value="31" />
  <add key="LogFlush" value="5" />
  <add key="LogShared" value="false" />
  <add key="LogBuffered" value="true" />

  <add key="LogEventSource" value="Reporter" />
  <add key="LogEventName" value="Application" />

  <add key="LogSeqServer" value="" />
  <add key="LogSeqApiKey" value="" />

  <add key="LogLevel" value="Verbose" />
  <add key="LogLevelConsole" value="Verbose" />
  <add key="LogLevelFile" value="Information" />
  <add key="LogLevelEvent" value="Warning" />
  <add key="LogLevelSeq" value="Verbose" />

  <add key="LogFormatConsole" value="{Message}{NewLine}" />
  <add key="LogFormatEvent" value="({ThreadId}) {Message}{NewLine}{NewLine}{Exception}" />
  <add key="LogFormatFile" value="{Timestamp:yyyy-MM-dd HH:mm:ss}: ({ThreadId}) [{Level}] {Message}{NewLine}" />

  <add key="MailRenderer" value="Liquid" />
  <add key="MailSender" value="MailKit" />
  <add key="MailTemplatePath" value="" />
  <add key="MailHost" value="mail" />
  <add key="MailPort" value="25" />
  <add key="MailTestTimeout" value="3" />
  <add key="MailUseAuthentication" value="false" />
  <add key="MailUsername" value="" />
  <add key="MailPassword" value="" />
  <add key="MailUseTls" value="true" />
  <add key="MailTimeout" value="60" />
  <add key="MailFrom" value="[email protected]" />
  <add key="MailTo" value="[email protected]" />
  <add key="MailDebug" value="[email protected]" />
  <add key="MailSubject" value="Alert!" />

The Report config file

So we come to configuring a report. The Test.config file included with the distribution includes comments to provide some guidance, but in short:

  • You may override log and alert settings, but some are grouped - if you want to override something like the file log folder or mail host, all components of the group must be specified.
  • The config includes an IsDebug value which, if set to true, emails the report to MailDebug instead of MailTo (preventing data leakage). It also controls whether we log the query used.
  • The config includes the Seq query that you've worked out within Seq. Note that the app doesn't validate that the query is suitable for CSV output - you need to do the work to create and test a working query before you put it in.
    • Note - because this is within an XML value string, you need to escape special characters. I'll discuss this a little more in a later section. The query can be on multiple lines.
  • The query timeout can be configured.
  • You can configure a signal to query against, and I recommend this - signals are indexed and will provide the best performance.
  • You can express TimeFrom and TimeTo as time expressions, which I'll also discuss further on.
  • Optionally, you can enable and configure proxy settings for the Seq API. This is most useful for disabling the proxy during testing, or for a scheduled app that could otherwise pick up proxy settings from policy.


<?xml version="1.0" encoding="utf-8"?>

  <add key="AppName" value="Scheduled Transfer Report" />
  <add key="LogType" value="Console,File,Seq" />
  <add key="LogFolder" value="C:\TEMP\TEMP\Log" />
  <add key="LogName" value="Transfer" />
  <add key="LogFileType" value="Json" />
  <add key="LogDays" value="31" />
  <add key="LogFlush" value="5" />
  <add key="LogShared" value="false" />
  <add key="LogBuffered" value="true" />
  <add key="LogLevelConsole" value="Verbose" />
  <add key="LogLevelFile" value="Information" />
  <add key="LogLevelEvent" value="Warning" />
  <add key="LogLevelSeq" value="Verbose" />

  <add key="MailFrom" value="[email protected]" />
  <add key="MailTo" value="[email protected],[email protected]" />
  <add key="MailDebug" value="[email protected]" />

  <add key="IsDebug" value="false" />
  <!-- Specify the valid Seq query you want to run. Multi-line is okay, but you must escape special characters per below-->
  <!--  Ampersand       &amp;   &
        Less-than       &lt;    <
        Greater-than    &gt;    >
        Quotes          &quot;  "
        Apostrophe      &apos;  '-->
  <add key="Query"
       value="SELECT
Substring(ToIsoString(@Timestamp + OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp)), 0, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;)) AS Date,
Substring(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;) + 1, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;.&apos;) - IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;) - 1) AS XfrTime,
log4net_HostName AS Server
FROM Stream LIMIT 10000" />

  <!-- Query timeout in minutes-->
  <add key="QueryTimeout" value="10" />
  <!-- Perform the query against one or more signals. Recommended since signals are indexed -->
  <add key="Signal" value="signal-503" />
  <!-- TimeFrom and TimeTo can be a Time, Date Expression, or Hybrid Expression-->
  <!-- Date expressions: {Int}s|m|h|d|w|M, where s=seconds, m=minutes, h=hours, d=days, w=weeks, M=months-->
  <!-- Sample date expression: 1M (1 month)-->
  <!-- Hybrid expressions - date expression plus time - examples: 1M 4:00 or 1d 04:00:00-->
  <add key="TimeFrom" value="4:00" />
  <add key="TimeTo" value="5:00" />

  <add key="UseProxy" value="false" />
  <add key="ProxyServer" value="" />
  <add key="BypassProxyOnLocal" value="" />
  <add key="BypassList" value="" />
  <add key="ProxyUser" value="" />
  <add key="ProxyPassword" value="" />

Overriding logging and alerting settings

If you override logging settings, you need to be aware that some properties are grouped. Many Lurgle.Logging properties can be overridden on a per-report basis, but file logging overrides are enabled by specifying LogFolder - and once that's done, any of the following properties that you don't also specify will revert to their defaults. This is most likely to bite with LogName, LogExtension, and LogFileType.

  • LogName (default Lurgle)
  • LogExtension (default .log)
  • LogFileType (default Text)
  • LogDays (default 31)
  • LogFlush (default 5)
  • LogShared (default false)
  • LogBuffered (default true)
  • LogLevelFile (default Verbose)
  • LogFormatFile (default {Timestamp:yyyy-MM-dd HH:mm:ss}: ({ThreadId}) [{Level}] {Message}{NewLine})

Console logging properties are grouped as well, but the defaults are likely what you'd use anyway.

While log masking is unlikely to be needed for this, the LogMaskPolicy, LogMaskPattern, LogMaskCharacter, and LogMaskDigit properties are similarly grouped.

For alerting, you can freely override the MailFrom, MailTo, and MailDebug settings - however, if you want to override the mail host, you'll also need to specify any of the following properties whose defaults aren't what you want. This is most likely to matter with MailRenderer.

  • MailRenderer (default Replace)
  • MailSender (default MailKit)
  • MailPort (default 25)
  • MailTestTimeout (default 3)
  • MailUseAuthentication (default false)
  • MailUsername (default empty)
  • MailPassword (default empty)
  • MailUseTls (default false)
  • MailTimeout (default 60)
  • MailTemplatePath (default <exe path>\Templates)

Adding a query

The Test.config has part of a query that I've used in production for a Seq Reporter config. I have an app which uses Log4net that I've retrofitted to log to Seq, and I have a scheduled report to extract some really useful data from the XML logging that this app sends. I haven't included some of the meatier parts of my query, but I did leave the logic that carves @Timestamp into a date and separate time column ... Seq functions can allow for some really funky capabilities in queries 😁

The query as performed in Seq is:

SELECT Substring(ToIsoString(@Timestamp + OffsetIn('Australia/Sydney',@Timestamp)), 0, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn('Australia/Sydney',@Timestamp),'hour',0h))), 'T')) AS Date, 
Substring(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn('Australia/Sydney',@Timestamp),'hour',0h))), IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn('Australia/Sydney',@Timestamp),'hour',0h))), 'T') + 1, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn('Australia/Sydney',@Timestamp),'hour',0h))), '.') - IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn('Australia/Sydney',@Timestamp),'hour',0h))), 'T') - 1) AS XfrTime, 
log4net_HostName AS Server 
FROM Stream LIMIT 10000

which returns a nice little table with the date, time, message, and server name. 

To add that to my report config, I need to escape the apostrophes so that the Query value can be treated as a string:

    <add key="Query"
         value="SELECT
Substring(ToIsoString(@Timestamp + OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp)), 0, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;)) AS Date, 
Substring(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;) + 1, LastIndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;.&apos;) - IndexOf(ToIsoString(TimeOfDay(@Timestamp, DatePart(OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp),&apos;hour&apos;,0h))), &apos;T&apos;) - 1) AS XfrTime, 
log4net_HostName AS Server 
FROM Stream LIMIT 10000" />

This is a simple find-and-replace operation - search for ' and replace with &apos;.
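If you'd rather script the escaping than do it in an editor, a few lines cover it. This is a generic sketch, not part of Seq Reporter itself - note that ampersands must be replaced first, or you'll double-escape the entities you've just introduced:

```python
# Order matters: escape ampersands first so we don't double-escape
# the entities we introduce for the other characters.
XML_ESCAPES = [("&", "&amp;"), ("<", "&lt;"), (">", "&gt;"),
               ('"', "&quot;"), ("'", "&apos;")]


def escape_for_xml_value(query):
    """Escape a Seq query so it can sit inside an XML attribute value."""
    for char, entity in XML_ESCAPES:
        query = query.replace(char, entity)
    return query


print(escape_for_xml_value("OffsetIn('Australia/Sydney',@Timestamp)"))
# OffsetIn(&apos;Australia/Sydney&apos;,@Timestamp)
```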

The Test.config includes a comment indicating which characters need to be escaped, but the table below also shows them:

  Name          Character   Escaped character
  Ampersand     &           &amp;
  Less-than     <           &lt;
  Greater-than  >           &gt;
  Quotes        "           &quot;
  Apostrophe    '           &apos;

Once you're aware of this - it's simple to get your query in place.

Report time range and Date Expressions

You control the report time range with the TimeFrom and TimeTo configs.

You can, of course, simply specify times in H:mm or H:mm:ss format, and this is what the Test.config file shows - a report on events from 4:00am to 5:00am every day.

  • TimeFrom = "4:00"
  • TimeTo = "5:00"

But to make it as powerful as possible, I added date expressions - a very simple scheme for the TimeFrom and TimeTo configs.

These are expressed as the numeric value, followed by a character indicating the period.

Periods available are:

  • s (seconds)
  • m (minutes)
  • h (hours)
  • d (days)
  • w (weeks)
  • M (months).

You can also specify "now" which simply uses the current date and time. This can't be used as a hybrid expression with a time, obviously, but it's a useful shorthand for, well, now - the time you run the report.

So to specify the last hour in a time expression, you would put "1h". These are applied as past values, so "1h" means 1 hour ago.

If you wanted a report for the past hour, you would specify:

  • TimeFrom = "1h"
  • TimeTo = "now"

But perhaps we want to be even more specific, such as for a monthly report. I also made it possible to perform "hybrid" expressions - a time expression plus a time value in H:mm or H:mm:ss format.

To perform a report that will be scheduled on the 1st of every month, you would specify:

  • TimeFrom = "1M 0:00"
  • TimeTo = "0:00"

which means that on the 1st of July, we will report on all events matching the query between 12am on the 1st of June and 12am on the 1st of July - a full month.

It is also possible to configure specific dates and times in TimeFrom and TimeTo - Seq.Client.Reporter will make a best effort attempt to parse them into meaningful time ranges.
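Seq.Client.Reporter's actual parser isn't shown here, but to illustrate the scheme, here's a rough Python approximation of how "now", plain times, date expressions, and hybrid expressions could resolve to a timestamp. The month arithmetic is deliberately crude and just for illustration - the real app may handle edge cases (like the 31st of the month) differently:

```python
import re
from datetime import datetime, timedelta


def parse_time_expression(expr, now=None):
    """Best-effort sketch of the TimeFrom/TimeTo scheme: 'now', a time
    (H:mm or H:mm:ss), a date expression ({int}s|m|h|d|w|M), or a hybrid
    such as '1M 0:00'. Date expressions are applied as past values."""
    now = now or datetime.now()
    expr = expr.strip()
    if expr.lower() == "now":
        return now
    match = re.fullmatch(
        r"(?:(\d+)([smhdwM]))?\s*(?:(\d{1,2}):(\d{2})(?::(\d{2}))?)?", expr)
    if not match or not any(match.groups()):
        raise ValueError(f"Unrecognised time expression: {expr}")
    value, unit, hh, mm, ss = match.groups()
    result = now
    if unit:
        n = int(value)
        if unit == "M":
            # Crude month arithmetic, purely for the sketch
            month = (result.month - n - 1) % 12 + 1
            year = result.year + (result.month - n - 1) // 12
            result = result.replace(year=year, month=month)
        else:
            seconds = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}
            result -= timedelta(seconds=n * seconds[unit])
    if hh is not None:
        # A hybrid expression pins the time of day after the date shift
        result = result.replace(hour=int(hh), minute=int(mm),
                                second=int(ss or 0), microsecond=0)
    return result


now = datetime(2021, 7, 1, 9, 30)
print(parse_time_expression("1M 0:00", now))  # 2021-06-01 00:00:00
print(parse_time_expression("1h", now))       # 2021-07-01 08:30:00
```

Running "1M 0:00" against a notional run time of 9:30am on 1 July gives midnight on 1 June - exactly the monthly report window described above.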

Customising the email

The use of a Liquid template for Seq Reporter means that you can readily customise the email to suit your purposes. The templates within the Templates folder can simply be edited to suit your desired text / format.


<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8"/>
  <title>{{ReportName}} Report for {{Date}}</title>
  <style type="text/css">
      p, td {
          font-family: "Calibri", sans-serif;
          font-size: 11.0pt;
      }
  </style>
</head>
<body>
<p>Please find the <span style="font-weight: bold">{{ReportName}}</span> for <span style="font-weight: bold;">{{Date}}</span> attached.</p>
<p>Report period: <span style="font-weight: bold">{{From}}</span> to <span style="font-weight: bold">{{To}}</span></p>
<p><span style="font-weight: bold">{{RecordCount}}</span> records were returned.</p>
</body>
</html>


The plain text alternate template is similarly simple:

Please find the {{ReportName}} for {{Date}} attached.
Report period: {{From}} to {{To}}
{{RecordCount}} records were returned.

Scheduling reports

Seq Reporter is deliberately designed as a console app, which means you can add it to Scheduled Tasks in Windows - which I'm certainly doing. You do need to make sure you allow it to run whether or not a user is logged on, and save the password, so that it can access the network.
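As an illustration only - the paths, task name, and service account below are placeholders - a monthly run on the 1st at 6am could be registered from an elevated prompt with schtasks; /RP * prompts for the password so that it is saved with the task:

```
schtasks /Create /TN "Seq Monthly Transfer Report" ^
  /TR "\"C:\Apps\Seq.Client.Reporter\Seq.Client.Reporter.exe\" -config=C:\Apps\Reports\Monthly.config" ^
  /SC MONTHLY /D 1 /ST 06:00 /RU MYDOMAIN\svc-reports /RP *
```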

You could also run it as a scheduled app from other services, such as scheduled SQL Server Integration Services agent jobs. I'm not using that in this case, but we certainly do use it with the scheduled transfer app, which is also implemented as a console application.

In short - if you have a way to schedule an app with parameters, Seq Reporter should work for the purpose.

Get Seq Reporter!

I'm really pleased with how this app came together, and it makes a huge difference to be able to reliably get reporting out of Seq as an automated process. It benefitted a lot from prior work, and made it really easy to get my reports reliably logged and emailed.

A possible future enhancement might be to apply some of the code from the transfer app, to enable sending the file via SFTP.

As always, it's available to download from the fancy links below!

Latest version of Seq Reporter
Total downloads of Seq Reporter


