Last week I came across an issue where a number of our paginated report subscriptions at work failed to go out. When we checked in the Portal these subscriptions had a status of “Pending”. This happened on our production Power BI Report Server instance which is currently running the Sep 2019 release.
This was very strange as these subscriptions had been running for months without issue and then just stopped all of a sudden. It was even more confusing because on some reports with multiple subscriptions, some of the subscriptions were still working while others were stuck with a status of “Pending”.
With no other information to go on in the Portal I started looking through the log files. This was complicated in our case as we have over 1,500 users and hundreds of report subscriptions. We also have two instances of PBIRS (Power BI Report Server) behind a load balancer, which means two sets of log files to search through.
So I started by going to one of the pending subscriptions and clicking on the edit option.
When you do this you will see the URL change so that it ends with a GUID, which is the SubscriptionId.
By default PBIRS writes a number of log files out to a folder at C:\Program Files\Microsoft Power BI Report Server\PBIRS\LogFiles. If you look in this folder you will see a bunch of different files with date stamps on the end. In this case, because the issue is with a paginated report, we need to look in the ReportingServicesService*.log files.
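With this many files to go through, a quick PowerShell search saves a lot of time. The sketch below is roughly the approach, assuming the default install path shown above; with two PBIRS instances behind a load balancer the same search needs to be run on each node.

# Search the paginated report logs for a given SubscriptionId
# (the GUID here is the one taken from the edit URL of the stuck subscription)
$subscriptionId = 'a743db7f-bbbe-4c45-9da1-2e2e286992dd'
$logFolder = 'C:\Program Files\Microsoft Power BI Report Server\PBIRS\LogFiles'

Get-ChildItem -Path $logFolder -Filter 'ReportingServicesService*.log' |
    Select-String -Pattern $subscriptionId |
    Select-Object Filename, LineNumber, Line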
What I found in one of the log files when I searched for the SubscriptionId was the following error:
Error processing data driven subscription a743db7f-bbbe-4c45-9da1-2e2e286992dd: Microsoft.ReportingServices.Diagnostics.Utilities.UnknownUserNameException: The user or group name '<domain>\<contractor user>' is not recognized.
Where the <domain> was our company’s AD domain and <contractor user> was the login that had been used by a short-term contractor who had worked with us to build the report in question.
In our case the account in question had been disabled when the contractor left the organization, so clearly nothing was checking the enabled state of the account.
But as it turns out, our AD team had done one of their periodic clean-ups the day before, where they actually deleted a whole group of disabled accounts. So it appeared that this was somehow related to the account no longer existing in AD.
We already knew from years of working with Reporting Services that when a subscription is executed, the Owner of the subscription is checked to make sure a valid account is specified. (I believe this is a security measure to stop people setting up a schedule that keeps sending reports somewhere after they have left a company.) However, we had already had the contractor set the subscription owner to one of our service accounts when they created the subscription, precisely to prevent this scenario from happening.
In fact searching through all the properties for the subscription in the portal showed no sign of the <contractor user> account anywhere.
At this point I decided to open up a PowerShell window and use one of the tools from the ReportingServicesTools PowerShell module to see if that could shed any more light on this issue.
When I ran the Get-RsSubscription cmdlet I noticed the following:
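The command itself is nothing fancy; a rough equivalent of what I ran is sketched below. The server URI and report path are made-up placeholders, and the exact parameter names can vary a little between versions of the ReportingServicesTools module.

# List the subscriptions on a report and show who last modified each one
Import-Module ReportingServicesTools

Get-RsSubscription -ReportServerUri 'https://myserver/ReportServer' -RsItem '/Finance/Monthly Sales' |
    Select-Object SubscriptionId, Owner, ModifiedBy, Description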
Sitting in the ModifiedBy property of the subscription object was a reference to our <contractor user> which we were seeing in the error in the log file.
When running Get-RsSubscription against a report where some subscriptions were working and others were stuck in a “Pending” state I could see that the working subscriptions had a ModifiedBy of an account belonging to someone who still had an active account in AD.
My guess as to what is happening here is that Report Server is attempting to populate some of the properties of a user object from Active Directory, is failing now that the user has been physically deleted, and this is throwing an exception that prevents the entire subscription from continuing with its execution.
So if you only have a handful of subscriptions stuck in a pending state like this you can just edit them in the portal and make some non-functional change like adding a full stop to the end of the subscription name. This will set the ModifiedBy to your user account and the subscription will start working again.
In our case we took a backup of the ReportServer database and then ran an update statement to set the guid of the ModifiedBy to the guid of our service account user. This is not a supported activity and something you would do at your own risk. But in our case it did allow us to quickly fix numerous “broken” subscriptions that would have taken hours to fix through the UI.
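For completeness, the statement we ran was along the lines of the sketch below. The account names are placeholders, and the column names are simply what we found in our version of the ReportServer catalog, so verify them against your own database before even considering something like this, and take a backup first.

# UNSUPPORTED: directly repoints the ModifiedBy of the broken subscriptions
# from the deleted contractor account to an existing service account.
Import-Module SqlServer

$sql = @"
UPDATE s
SET    s.ModifiedByID = svc.UserID
FROM   dbo.Subscriptions AS s
CROSS JOIN dbo.Users AS svc
WHERE  svc.UserName = N'MYDOMAIN\svc-reports'   -- placeholder service account
AND    s.ModifiedByID = (SELECT UserID FROM dbo.Users
                         WHERE UserName = N'MYDOMAIN\contractor')  -- the deleted user
"@

Invoke-Sqlcmd -ServerInstance 'MySqlServer' -Database 'ReportServer' -Query $sql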
Power BI Report Server needs an Admin Portal
I think one area where Power BI Report Server could do with some more work is administrator tooling. At the moment, if a report fails to render because of an error you have to wait for a user to report it. And if a subscription fails to send, there is no central place where you can see these issues and easily take steps to correct them.
The 2.11.0 release of DAX Studio is now available and brings with it the following new features and fixes.
New Preview Features
There are 2 new preview features this month, so if you want to use them you need to go into Options > Advanced and enable them.
Query Builder
When enabled, the Query Builder appears as a button in the main ribbon.
It lets you drag and drop columns and measures to build up a query which can include basic filters. You can also add custom measures or override the expression of a measure from your data model.
You can either run the content of the query builder directly or you can click the “Edit Query” button to send the text for the query to the main editor window where you can run it or further customize it.
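To give a feel for what this produces, here is the kind of query you might build up this way, written out by hand. The column and measure names ('Product'[Color], [Internet Total Sales]) are just sample-model examples, and the exact query pattern the builder generates may differ slightly from this hand-written equivalent.

EVALUATE
SUMMARIZECOLUMNS(
    'Product'[Color],
    FILTER( ALL( 'Product'[Color] ), 'Product'[Color] <> "Silver" ),
    "Internet Total Sales", [Internet Total Sales]
)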
Query Benchmark
The Query Benchmark tool appears as a button on the Advanced ribbon. It allows you to run a given query a number of times against both a cold and a warm cache. This is useful because, even on a quiet development server, there can be a number of factors that cause variability in the server timings.
The Benchmark feature makes use of the Server Timings functionality to record detailed information about each query execution.
You get to choose how many cold-cache and warm-cache runs to execute (and by default these two numbers are linked).
The output of a Benchmark run shows a summary view with the Avg, StdDev, Min and Max of the Total Duration and the Storage Engine Duration for both the cold and warm cache runs.
The detailed output shows the timings of every single query execution.
New Features
In addition to the two big preview features above, there are a number of smaller features that have been added in 2.11.0:
Added full filename tooltip to tabs (thanks @dmarkle)
Promoted View Metrics (Vertipaq Analyzer) from preview status to general availability
Promoted Export Data feature from preview status to general availability
Documentation Updates:
Added license page, including a section on SmartScreen issues in Win10 (thanks to Gilbert at fourmoo.com)
Updated syntax highlighting to align with DaxFormatter.com
Added a note in the Database tooltip that the Database Id can be copied using a right-click
Added formatting to shorten asazure: and powerbi: server names in the status bar so that the key information is visible
Added a partitions tab to the Model Metrics views
Added a sample of any missing keys to the relationships tab in the Model Metrics (these keys are not saved for privacy reasons when exporting to a vpax file)
Fixes
fixed cancelling of exports to SQL Server
improved keyboard navigation by adding IsDefault/IsCancel properties to dialog buttons (thanks @dmarkle)
fixed an issue with intellisense not re-enabling after reconnecting (thanks @dmarkle)
fixed an issue with Query History pane not updating the “Current” Database filter when changing databases
disabled external tools when connected to PowerPivot
#290 updated all URL references to use https (thanks @warthurton)
Have you ever run a DAX query from DAX Studio (or using a DAX window in SSMS) and wondered why the format you set on a measure does not always seem to get applied?
Let’s start with the following simple DAX query, which lists the month numbers from 1 to 12 along with a measure.
EVALUATE
ADDCOLUMNS(
    VALUES('date'[Month]),
    "Internet Total Sales", [Internet Total Sales]
)
If we run this in DAX Studio you will see the following:
Note how the Format of the measure is correctly applied, returning the dollar sign, the thousands separator and only 2 decimal places.
Now let’s run the same query against the same model using SQL Server Management Studio (SSMS):
Now we have no currency symbol, no thousands separator, and 3 or 4 digits after the decimal place. What is going on here?
Well I’m going to let you in on a little secret about DAX queries:
The results in a DAX query are always returned unformatted.
You may well ask “Why is the formatting working in DAX Studio then?“. The answer is simple: I’ve specifically added code that looks at the column names returned by a query, then looks for a measure with the same name and applies any format string it finds.
You’ll notice in the example query that I’m setting the column name to the same name as the measure. If I change the column name to “AAA” you will see the following output.
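For reference, that version of the query is simply the original with the column renamed, and its results come back as plain unformatted numbers:

EVALUATE
ADDCOLUMNS(
    VALUES('date'[Month]),
    "AAA", [Internet Total Sales]
)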
Which is the same “raw” format we see from SSMS.
And if we exploit this for evil purposes we can even change the column name in the output to match a completely different measure. In the example below I am applying the “Margin %” format to the [Internet Total Sales] measure, so that it gets a percentage sign, one decimal place, and the decimal point shifted two places to the right. I can’t think of a practical use for this behaviour, but you may see it occasionally if you are editing a query and change the measure reference without updating the column name.
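The query behind that example is just the same pattern again, with the column renamed to match the other measure:

EVALUATE
ADDCOLUMNS(
    VALUES('date'[Month]),
    "Margin %", [Internet Total Sales]
)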
You usually never see this in a client tool like Power BI because it builds the DAX queries internally, so it knows which measure maps to each column in the result set and can apply the formatting appropriately.
If you’ve been following along with these example queries, there is one other formatting feature in DAX Studio which you may have run into: the “Automatically Format Results” setting under File > Options.
This is off by default, but if you switch it on DAX Studio will apply some basic formatting based on the data type of the column, in an attempt to make the results easier to read (see the short example after this list):
If the column is an integer, use the format string “#,0” (this should include the appropriate thousands separator based on the language settings of your PC)
If the column is a decimal, use the format “#,0.00”
If the column is a decimal number AND the name includes “pct” or “%”, use the format “0.00%”
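If you want a quick feel for what those three format strings produce, you can run some arbitrary sample values through the DAX FORMAT function:

EVALUATE
ROW(
    "Integer", FORMAT( 1234567, "#,0" ),
    "Decimal", FORMAT( 1234.5678, "#,0.00" ),
    "Percent", FORMAT( 0.4567, "0.00%" )
)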
This formatting of results is just one of the many small ways that I try to improve the user experience when working with queries in DAX Studio.