Gaining insights into your Nintex Workflow Data

Working with workflows in SharePoint is always a challenging task, especially when you are an administrator and people start complaining about performance issues, workflows that don’t start or resume, and so on. Using Nintex Workflow doesn’t change that, but it does give you a bit more leverage.

The first thing I always do is try to get some insight into the size of the issue. So we’re talking about data: how much workflow data is involved? When Nintex is involved, there are 2 places where I’m going to look (a quick PowerShell check for the SharePoint side follows the list):

  • Workflow History Lists (SharePoint)
  • WorkflowProgress table (SQL Database)
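
For the SharePoint side, a few lines of PowerShell already tell you how big those history lists are. Below is a minimal sketch, assuming it runs on a SharePoint server with farm-level permissions and that the history lists use the standard Workflow History base template; the output file name is just an example.

```powershell
# Minimal sketch: list the workflow history lists in every web with their item count.
# Assumption: run on a SharePoint server with farm-level permissions.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Get-SPSite -Limit All | ForEach-Object {
    $site = $_
    foreach ($web in $site.AllWebs) {
        # BaseTemplate 'WorkflowHistory' covers both the default Workflow History list
        # and the NintexWorkflowHistory list (assumption: no custom history lists).
        $web.Lists | Where-Object { $_.BaseTemplate -eq 'WorkflowHistory' } | ForEach-Object {
            [PSCustomObject]@{
                WebUrl    = $web.Url
                List      = $_.Title
                ItemCount = $_.ItemCount
            }
        }
        $web.Dispose()
    }
    $site.Dispose()
} | Sort-Object ItemCount -Descending | Export-Csv -Path .\WorkflowHistoryLists.csv -NoTypeInformation
```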

Nintex has an article on both topics which I can definitely recommend:

https://support.nintex.com/SharePoint/Forms/SharePoint_Maintenance_for_Nintex_Workflow

This article links to a PDF which explains how to maintain the WorkflowProgress table. You can find that PDF here.

In this PDF, you will find a SQL query that joins this table with the WorkflowInstance table to give you an insight into the sheer amount of data that is stored and that might be the root cause of your issues with workflows.

Just for you lazy people, here’s the query:
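
The authoritative version is in the PDF; what follows is a minimal sketch of what it boils down to, assuming the standard Nintex Workflow content database schema (dbo.WorkflowInstance joined to dbo.WorkflowProgress on an InstanceID column). Column names can differ between Nintex versions, so double-check against the PDF before you run it.

```sql
-- Sketch only: run against each Nintex Workflow content database.
-- Assumed schema: dbo.WorkflowInstance (InstanceID, SiteID, WebID, ListID, ItemID, State)
-- and dbo.WorkflowProgress (InstanceID, one row per recorded workflow action).
SELECT
    i.SiteID,
    i.WebID,
    i.ListID,
    i.ItemID,
    i.State,                            -- numeric state (running / completed / cancelled / error)
    COUNT(p.InstanceID) AS ActionCount  -- number of WorkflowProgress records for this instance
FROM dbo.WorkflowInstance AS i
INNER JOIN dbo.WorkflowProgress AS p
    ON p.InstanceID = i.InstanceID
GROUP BY i.InstanceID, i.SiteID, i.WebID, i.ListID, i.ItemID, i.State
ORDER BY ActionCount DESC;
```

The result is one row per workflow instance, which is exactly what makes the pivot table below possible: count the rows to get the number of instances, sum ActionCount to get the number of WorkflowProgress records.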

This gives you a lot of information which you can use in your remediation strategy.
Just export the results to a CSV, load them into Excel, and you have the following information at hand in no time.

(Screenshot: workflow data pivot table)

This is a table which gives you, per Nintex content database, the number of records in the WorkflowProgress table (ActionCount) and the number of workflow instances involved. And it splits up this information by the different states workflows can be in (running, completed, canceled, error). Whoever came up with the concept of pivot tables… thank you! 🙂

While this query is great for getting that information and reporting on it, it’s not that useful for the people who are responsible for creating or maintaining those workflows. Look at the query: it gives you the site, web, list and even the item that’s involved, but those are only IDs. You still need to resolve them to a URL or the title of a list for them to be useful to anyone who isn’t a SharePoint administrator and doesn’t know their way around PowerShell.

You could customize this query and join in the SharePoint content databases to get the needed information, but you know you shouldn’t touch those databases directly in SQL! So that’s not an option.

I decided to make my life a little less complicated and solve it with PowerShell.
The script below executes the query above (with some minor adjustments to get some extra information) and, after getting the results, retrieves the corresponding URLs for the site and web, and the title of the list (if needed). All of this information is exported to a CSV.
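
The full script is longer, but a condensed sketch of the approach looks roughly like this. The SQL instance name, the database name and the column names are assumptions (see the query sketch above), and it uses hashtables as the lookup cache; everything else is standard SharePoint and SQL Server PowerShell.

```powershell
# Condensed sketch, not the full script. Assumptions: the SQL instance name,
# the Nintex content database name(s) and the column names from the query sketch above.
# Invoke-Sqlcmd comes with the SQL Server PowerShell module (SQLPS / SqlServer).
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sqlInstance = 'SQL01'                  # assumption: your SQL Server instance
$databases   = @('NintexWF_Content')    # assumption: your Nintex content database(s)

$query = @"
SELECT i.SiteID, i.WebID, i.ListID, i.ItemID, i.State,
       COUNT(p.InstanceID) AS ActionCount
FROM dbo.WorkflowInstance AS i
INNER JOIN dbo.WorkflowProgress AS p ON p.InstanceID = i.InstanceID
GROUP BY i.InstanceID, i.SiteID, i.WebID, i.ListID, i.ItemID, i.State
"@

# Lookup caches so every site, web and list GUID is resolved against SharePoint only once.
$siteCache = @{}
$webCache  = @{}
$listCache = @{}

$report = foreach ($db in $databases) {
    foreach ($row in (Invoke-Sqlcmd -ServerInstance $sqlInstance -Database $db -Query $query)) {
        $siteId = [Guid]$row.SiteID
        $webId  = [Guid]$row.WebID
        # ListID can be empty or NULL for site workflows.
        $listId = if ($row.ListID -is [Guid]) { [Guid]$row.ListID } else { [Guid]::Empty }

        # Resolve the site collection URL once per SiteID.
        if (-not $siteCache.ContainsKey($siteId)) {
            $site = Get-SPSite -Identity $siteId -ErrorAction SilentlyContinue
            if ($site) { $siteCache[$siteId] = $site.Url; $site.Dispose() }
            else       { $siteCache[$siteId] = '(site not found)' }
        }

        # Resolve the web URL once per WebID.
        if (-not $webCache.ContainsKey($webId)) {
            $webCache[$webId] = '(web not found)'
            $site = Get-SPSite -Identity $siteId -ErrorAction SilentlyContinue
            if ($site) {
                try {
                    $web = $site.OpenWeb($webId)
                    if ($web.Exists) { $webCache[$webId] = $web.Url }
                    $web.Dispose()
                } catch { }   # web no longer exists
                $site.Dispose()
            }
        }

        # Resolve the list title once per ListID.
        if ($listId -ne [Guid]::Empty -and -not $listCache.ContainsKey($listId)) {
            $listCache[$listId] = '(list not found)'
            $web = Get-SPWeb -Identity $webCache[$webId] -ErrorAction SilentlyContinue
            if ($web) {
                $list = $web.Lists | Where-Object { $_.ID -eq $listId }
                if ($list) { $listCache[$listId] = $list.Title }
                $web.Dispose()
            }
        }

        [PSCustomObject]@{
            Database    = $db
            SiteUrl     = $siteCache[$siteId]
            WebUrl      = $webCache[$webId]
            ListTitle   = $(if ($listId -ne [Guid]::Empty) { $listCache[$listId] } else { '' })
            ItemID      = $row.ItemID
            State       = $row.State
            ActionCount = $row.ActionCount
        }
    }
}

$report | Export-Csv -Path .\NintexWorkflowData.csv -NoTypeInformation
```

Point it at your own SQL instance and Nintex content database(s) and you end up with a CSV that contains readable URLs and list titles instead of GUIDs.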

This not only gives me information I can work with immediately, but it also allows me to do it all from my SharePoint server. I no longer need SQL Server Management Studio or physical access to the database server.

Depending on the number of rows returned from SQL, the execution of the script can take some time though. I tried to minimize the calls to SharePoint by using arrays in which I store the URLs and titles for GUIDs that have already been resolved. This reduces the stress on SharePoint a bit, but it still takes some time to go through each row.

Using this list, you can prioritize what to focus on during a cleanup. It also gives you the ability to predict the impact of a purge of that table with specific parameters.
