Darren Gosbell


Power BI: Hacking a bar chart into a table

I was working on a report with a business user, and one of the outputs they wanted to see was a list of over 100 tasks that needed to be performed on specified groups of equipment.

One of the initial views they wanted to see was a list of these tasks with about 7-8 descriptive columns, so a table visual made the most sense. However, each task applied to 1 or more pieces of equipment, and each piece of equipment could be in 1 of 4 states depending on the last time the task was performed.

The states that a piece of equipment could be in are:

  • Compliant (the task was performed before the due date)
  • Due Now (the due date has passed, but each task has a certain amount of tolerance and we are within that tolerance)
  • Overdue (the task is past the due date and tolerance)
  • Missing (cannot find any historic occurrence of this task)

The above information is just for context, but the key takeaway is that we had a table and 4 percentages that added up to 100% across each row.

The problem was that the information was very text heavy and it was difficult to see which tasks were OK and which ones needed attention without reading each row carefully one at a time. I initially tried adding some conditional formatting of the background colours and that helped a little bit, but it still was not clear.

What would have been really nice was to use a 100% stacked bar chart, but we had too many descriptive columns to use one of these visuals on its own and we had too many rows to be able to just position one next to the table.

If you are interested in seeing an implementation of each of the approaches outlined below, a copy of the PBIX file I used for the images in this blog post is linked at the end of this post.

Approach 1: Using an SVG Measure

What would have been ideal would be to use a measure to generate an SVG image; there are a number of blog posts available that show how you can generate SVG strings in DAX. Unfortunately, all images in tables and matrices are forced to be square, so you cannot specify a rectangle that is wider than it is high.
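The SVG-measure technique works by having the measure return a data-URI string that the visual renders as an image. Here is a rough sketch of the kind of string such a measure assembles, written in Python rather than DAX purely for illustration (the widths and colours below are my own examples, not from the post):

```python
# Build the kind of "data:image/svg+xml" string an SVG measure returns:
# one <rect> per status, laid out side by side as a 100% stacked bar.
def svg_bar(percentages, colours, width=100, height=20):
    x, rects = 0.0, []
    for p, c in zip(percentages, colours):
        w = p * width
        rects.append(f'<rect x="{x}" y="0" width="{w}" height="{height}" fill="{c}"/>')
        x += w
    body = "".join(rects)
    return ('data:image/svg+xml;utf8,'
            f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">{body}</svg>')

uri = svg_bar([0.5, 0.2, 0.2, 0.1],
              ["#8BC34A", "#CCCCCC", "#F69647", "#FD625E"])
```

The string itself is easy to produce; the problem described above is purely that the table visual renders it into a square cell.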

If you would like to see the limitation removed that forces images to have the same width and height, please vote for this idea.

I did do a quick test to confirm that this restriction was an issue, and it produced output like the following, which was no good as it forced each row to be too tall.

At this point I reached out to some friends to see if anyone else had some ideas for solving this issue or whether I would need to consider a different design for the report. This ended up in a great 3-way conversation between myself, Daniel Marsh-Patrick ( b | t ) and Kerry Kolosko ( b | t )

Approach 2: Using a Measure with extended Unicode characters

After explaining the issue and the approaches I had already considered, Daniel quickly came up with the idea of using extended Unicode characters to approximate a bar chart. If you scroll down to the bottom of this page, which shows some of the extended Unicode characters, you can see that there are some characters which are coloured squares.

There are only 7 different colours (and a black and white square) but that is enough to build a simple bar chart.

Using the following measure we are able to generate a series of coloured squares to represent each of our percentages. (Note: you should see different coloured squares in the code below; if you don't, keep reading for an explanation of this.)

Unicode Chart = 
    VAR purple = "🟪" 
    VAR brown  = "🟫" 
    VAR green  = "🟩" 
    VAR red    = "🟥" 
    VAR orange = "🟧" 
    VAR yellow = "🟨" 
    VAR blue   = "🟦" 
    VAR white  = "⬜" 
    VAR black  = "⬛" 
    VAR Scale = 10
    VAR Compliant = COALESCE(MROUND([Compliant %], (1/Scale)) * Scale,0)
    VAR DueNow = COALESCE(MROUND([Due Now %], (1/Scale)) * Scale, 0)
    VAR Overdue = COALESCE(MROUND([Overdue %], (1/Scale)) * Scale, 0)
    VAR Missing = COALESCE(MROUND([Missing %], (1/Scale)) * Scale, 0)
    RETURN 
        IF (HASONEVALUE('Tasks'[Task]), 
                REPT(green, Compliant) & 
                REPT(white, DueNow) &
                REPT(yellow, Overdue) &
                REPT(red, Missing)
                , BLANK()
                )

I selected a scale value of 10, which builds a simplistic bar chart where each square represents 10%. This chart uses the following colour coding:

And this produces a result like the following, which is really cool since all of the logic is contained in a single measure:

Unicode bar chart

However, you will notice that there are a couple of issues with the bar charts above. This is because I have specifically chosen the amounts I used in the test data to highlight some of the problems with this sort of approach.

  • Task 1: appears to be 100% compliant, but is actually only 99%
  • Task 2: has 11 squares due to multiple categories being rounded up
  • Task 3: has only 9 squares due to multiple categories being rounded down
  • Task 4: has 12 squares due to multiple categories being rounded up
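The root cause of all of these issues is that each percentage is rounded to the nearest 10% independently. A quick sketch of the arithmetic (in Python for illustration; only Task 1's 99% figure comes from the post, the other numbers are made up):

```python
# Each category is rounded to the nearest 1/scale on its own, so the
# block counts need not total exactly `scale`.
def blocks(percentages, scale=10):
    return [round(p * scale) for p in percentages]

# 99% compliant rounds up to 10 green squares: it *looks* 100% compliant.
blocks([0.99, 0.01, 0.0, 0.0])    # -> [10, 0, 0, 0]

# Several categories rounding up at once yields 11 squares.
blocks([0.28, 0.28, 0.28, 0.16])  # -> [3, 3, 3, 2]
```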

In my specific circumstance, since the goal is to get every task to a status of 100% compliant, the issue with Task 1 was actually the most critical: I did not want something to appear to be 100% compliant if it was not actually at 100%. While annoying from an aesthetics point of view, the fact that there were not always exactly 10 squares was not a deal breaker.

However, we did find one deal breaker with this approach: on some machines it appears as follows, with just a series of white squares on every row:

I think this is probably because Power BI uses the fonts from the client machine when rendering text, so maybe there were different versions of the standard fonts. But even on some machines supposedly running the identical version of Windows 10 we still saw instances of this issue. This was pretty much a deal breaker for this approach, as there was no way to guarantee that the necessary extended characters would be available on all the client machines.

Approach 3: A creative use of conditional formatting

It was at this point that Kerry came up with the inspired suggestion of creating 10 columns and using conditional background colours to simulate the bar chart.

I started off by creating the first measure using the following expression.

Status 01 = 
var block = 1    // the number of the current block
var scale = 10   // controls the number of blocks
// cumulative thresholds (in blocks) for each state
var comp    = MROUND([Compliant %], 1/scale) * scale
var dueNow  = comp + (MROUND([Due Now %], 1/scale) * scale)
var overdue = dueNow + (MROUND([Overdue %], 1/scale) * scale)
var missing = overdue + (MROUND([Missing %], 1/scale) * scale)
// return the state (1-4) whose cumulative threshold covers this block
var result = SWITCH(TRUE(),
    comp >= block, 1,
    dueNow >= block, 2,
    overdue >= block, 3,
    missing >= block, 4,
    BLANK()
)
return result
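Each of the 10 measures effectively asks "which state's cumulative threshold covers my block number?". Sketching that logic outside DAX (Python purely for illustration; the example percentages are made up):

```python
# Mirror of the SWITCH(TRUE(), ...) logic: compute cumulative block
# boundaries for each state and return 1-4 for the given block number.
def block_status(block, comp, due_now, overdue, missing, scale=10):
    c = round(comp * scale)          # blocks covered by Compliant
    d = c + round(due_now * scale)   # ... plus Due Now
    o = d + round(overdue * scale)   # ... plus Overdue
    m = o + round(missing * scale)   # ... plus Missing
    if c >= block: return 1
    if d >= block: return 2
    if o >= block: return 3
    if m >= block: return 4
    return None                      # the DAX measure returns BLANK() here

# A row that is 30% compliant, 20% due now, 30% overdue, 20% missing:
[block_status(b, 0.3, 0.2, 0.3, 0.2) for b in range(1, 11)]
# -> [1, 1, 1, 2, 2, 3, 3, 3, 4, 4]
```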

I set up all the key components as variables, so I just had to copy/paste the code 9 more times, changing only the name and the value of the "block" variable. (You can see these 10 measures in the report linked at the bottom of this post.)

If you drop these 10 measures into a table you get output like the following:

By renaming each of the measures so they are just a single digit from 1 to 0, we can shrink down the table so that it easily fits on the page without horizontal scroll bars.

In my first pass I started off creating rules for the background colour. This involved creating 4 rules, one for each colour and doing this 10 times (once for each column). You can see the rules for column 1 below.

After doing a couple of columns this got a bit tedious. The technique was working, but it was taking a lot of clicks per column which made me wonder if there was a better way. But you can see from the image below how the rules were working.

If you look at the options for conditional formatting, instead of using a rules-based approach you can use the value coming from a column or measure. So instead of returning a numeric value between 1-4 for use in rules, I created 10 measures like the following which return the background colour as a hex string.

Status Colour 01 = 
var block = 1
var scale = 10
var compliant = MROUND([Compliant %], 1/scale) * scale
var dueNow    = compliant + (MROUND([Due Now %], 1/scale) * scale)
var overdue   = dueNow + (MROUND([Overdue %], 1/scale) * scale)
var missing   = overdue + (MROUND([Missing %], 1/scale) * scale)
var compliantColour = "#8BC34A"
var dueNowColour    = "#CCCCCC"
var overdueColour   = "#F69647"
var missingColour   = "#FD625E"
var result = SWITCH(TRUE(),
    compliant >= block, compliantColour,
    dueNow >= block, dueNowColour,
    overdue >= block, overdueColour,
    missing >= block, missingColour,
    BLANK()
)
return result

This meant I could change the conditional formatting settings to look as follows which just required me to change the Format Style to “Field value” and then pick the relevant measure.

This was a much simpler approach which took much less time to configure, and once I had configured both the background and foreground colours it produced output like the following.

The only remaining issue was the 1-0 column headings, but I “fixed” that by positioning a text box with a white background over the top of these.

What about other options like Charticulator or the HTML Content custom visual?

I’m sure there are many of you wondering if this approach of hacking a table was the best option and what other approaches might have worked. Below are some of the other options which I considered and subsequently chose not to pursue.

Create a Custom Visual – I have dabbled a bit with coding custom visuals and could potentially have built my own, but it would have taken days rather than hours. And then there is the requirement to maintain and update the custom visual over time, which adds an extra burden that makes this approach unfeasible for visuals with limited re-use potential.

Charticulator – I have not used this very much so there may be an approach I missed (and I’m more than happy to be proven wrong if someone wants to have a go at implementing this). But from what I could see there is no way to force the individual text glyphs to be constrained to a given column size so if some of the description fields were longer than the available space they would overlap with the next “column” instead of wrapping like they do with a table.

HTML Content Visual – I actually got really close to the original concept for this report with the HTML Content Visual, but one key bit of functionality that I needed to support was the ability to drill through from the table to a detail page. Unfortunately the HTML Content Visual does not currently support any way of configuring drillthrough.

Download the sample file

If you are interested in playing with the example file used in this post you can get it from here.

Building a Bump Chart in Power BI using Deneb

Bump charts / Slope charts are an interesting way to show changes in rankings over time, but Power BI does not currently have a good built-in chart for this sort of visualization.

Example of a bump chart

I saw an interesting example of one of these charts recently where the circles in the bump chart also change size based on the relative value of the metric.

There is a video on the Curbal channel showing how you can get pretty close with a line chart, but the circles are all the same size.

And Kerry Kolosko also has an example using Deneb on her site. While the circles are a constant size in her example, I was pretty sure that with a little tweaking I could make Deneb produce the sort of output I wanted.

I started with a copy of the data set from Curbal's video. I used the existing [Rank bump chart] measure for the rankings, and I created a simple Percent of Year measure using the following expression for the labels and the sizing of the circles.

Percent of Year = DIVIDE([Sales], CALCULATE([Sales], ALL(), VALUES('Calendar'[Year])))
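In words, the measure divides each category's sales by the total sales for that category's year. The equivalent calculation sketched in Python (illustrative data, not from the sample file):

```python
# Percent of Year: a category's share of its year's total sales.
sales = {("Beverages", 2021): 30, ("Produce", 2021): 70,
         ("Beverages", 2022): 40, ("Produce", 2022): 60}

def percent_of_year(category, year):
    year_total = sum(v for (c, y), v in sales.items() if y == year)
    return sales[(category, year)] / year_total

percent_of_year("Beverages", 2021)  # -> 0.3
```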

And after a bit of experimenting and reading through the Vega and Vega-Lite documentation to figure out how to specify the size encoding for the circles I ended up with the following result which I was pretty pleased with.

Final Bump Chart

Below is the Deneb specification for the above chart

{
  "$schema": "https://vega.github.io/schema/vega-lite/v5.json",
  "usermeta": {
    "deneb": {
      "build": "1.4.0.0",
      "metaVersion": 1,
      "provider": "vegaLite",
      "providerVersion": "5.4.0"
    },
    "interactivity": {
      "tooltip": true,
      "contextMenu": true,
      "selection": false,
      "highlight": false,
      "dataPointLimit": 50
    },
    "information": {
      "name": "Bump Chart",
      "description": "[No Description Provided]",
      "author": "Darren Gosbell",
      "uuid": "3416d242-19c2-4c88-b369-63d3f8ca302e",
      "generated": "2022-12-09T07:05:41.633Z"
    },
    "dataset": [
      {
        "key": "__0__",
        "name": "Year",
        "description": "This is the column for the X-Axis",
        "type": "numeric",
        "kind": "column"
      },
      {
        "key": "__1__",
        "name": "CategoryName",
        "description": "This is the column for the Y-Axis",
        "type": "text",
        "kind": "column"
      },
      {
        "key": "__2__",
        "name": "Rank bump chart",
        "description": "This is the ranking measure",
        "type": "numeric",
        "kind": "measure"
      },
      {
        "key": "__3__",
        "name": "Percent of Year",
        "description": "This is the value to display",
        "type": "numeric",
        "kind": "measure"
      }
    ]
  },
  "config": {
    "view": {"stroke": "transparent"},
    "font": "Segoe UI",
    "arc": {},
    "area": {
      "line": true,
      "opacity": 0.6
    },
    "bar": {},
    "line": {
      "strokeWidth": 3,
      "strokeCap": "round",
      "strokeJoin": "round"
    },
    "path": {},
    "point": {
      "filled": true,
      "size": 75
    },
    "rect": {},
    "shape": {},
    "symbol": {
      "strokeWidth": 1.5,
      "size": 50
    },
    "text": {
      "font": "Segoe UI",
      "fontSize": 12,
      "fill": "#666666"
    },
    "axis": {
      "ticks": false,
      "grid": false,
      "domain": false,
      "labelColor": "#666666",
      "labelFontSize": 12,
      "titleFont": "wf_standard-font, helvetica, arial, sans-serif",
      "titleColor": "#070f25",
      "titleFontSize": 16,
      "titleFontWeight": "normal"
    },
    "axisQuantitative": {
      "tickCount": 3,
      "grid": true,
      "gridColor": "#C8C8C8",
      "gridDash": [1, 5],
      "labelFlush": false
    },
    "axisX": {"labelPadding": 5},
    "axisY": {"labelPadding": 10},
    "header": {
      "titleFont": "wf_standard-font, helvetica, arial, sans-serif",
      "titleFontSize": 16,
      "titleColor": "#070f25",
      "labelFont": "Segoe UI",
      "labelFontSize": 13.333333333333332,
      "labelColor": "#666666"
    },
    "legend": {
      "titleFont": "Segoe UI",
      "titleFontWeight": "bold",
      "titleColor": "#666666",
      "labelFont": "Segoe UI",
      "labelFontSize": 13.333333333333332,
      "labelColor": "#666666",
      "symbolType": "circle",
      "symbolSize": 75
    }
  },
  "data": {"name": "dataset"},
  "encoding": {
    "x": {
      "field": "__0__",
      "type": "nominal"
    },
    "y": {
      "field": "__2__",
      "type": "nominal",
      "axis": null
    },
    "order": {
      "field": "__0__",
      "type": "nominal"
    }
  },
  "layer": [
    {
      "name": "Lines Layer",
      "mark": {
        "type": "line",
        "point": false,
        "interpolate": "monotone"
      },
      "encoding": {
        "color": {
          "field": "__1__",
          "legend": false,
          "scale": {
            "scheme": "pbiColorNominal"
          }
        }
      }
    },
    {
      "name": "Circle Layer",
      "mark": {
        "type": "circle",
        "tooltip": true,
        "opacity": 1
      },
      "encoding": {
        "color": {
          "field": "__1__",
          "legend": false
        },
        "size": {
          "field": "__3__",
          "type": "quantitative",
          "scale": {
            "range": [200, 2000]
          },
          "legend": null
        }
      }
    },
    {
      "mark": {
        "type": "text",
        "color": "white"
      },
      "encoding": {
        "text": {
          "field": "__3__",
          "format": "0%",
          "formatType": "pbiFormat"
        }
      }
    },
    {
      "transform": [
        {
          "window": [
            {"op": "rank", "as": "rank"}
          ],
          "sort": [
            {
              "field": "__0__",
              "order": "descending"
            }
          ]
        },
        {"filter": "datum.rank === 1"}
      ],
      "mark": {
        "type": "text",
        "align": "left",
        "baseline": "middle",
        "dx": 25,
        "fontWeight": "bold",
        "fontSize": 12
      },
      "encoding": {
        "text": {
          "field": "__1__",
          "type": "nominal"
        },
        "color": {
          "field": "__1__",
          "scale": {
            "scheme": "pbiColorNominal"
          },
          "legend": false
        }
      }
    },
    {
      "transform": [
        {
          "window": [
            {"op": "rank", "as": "rank"}
          ],
          "sort": [
            {
              "field": "__0__",
              "order": "ascending"
            }
          ]
        },
        {"filter": "datum.rank === 1"}
      ],
      "mark": {
        "type": "text",
        "align": "right",
        "baseline": "middle",
        "dx": -25,
        "fontWeight": "bold",
        "fontSize": 12
      },
      "encoding": {
        "text": {
          "field": "__1__",
          "type": "nominal"
        },
        "color": {
          "field": "__1__",
          "scale": {
            "scheme": "pbiColorNominal"
          },
          "legend": false
        }
      }
    }
  ]
}

If you prefer the look of the straight lines between the data points you can simply remove the "interpolate": "monotone" from the lines layer and you end up with a chart like the following one.

Power BI Report Server – an unexpected error occurred (Nov 2021)

Earlier this week we started getting reports of the following error on our production Power BI Report Server (PBIRS) instance for any reports that were using a live connection back to one of our Analysis Services models.

All it basically said was “an unexpected error occurred. Please try again later”. And if you looked at the details it showed the message above saying that “We could not connect to the Analysis Services server”.

Some other fun facts about this issue are:

  • It was not affecting all users (at least in our environment)
  • There had been no recent updates or configuration changes on PBIRS, SQL or SSAS
  • The same issue appeared to be affecting our test environment but not our development environment.
  • It was happening for both pbix and paginated reports
  • It was affecting reports that sourced data from SQL Server as well as SSAS
  • When running a profiler trace against SSAS listening for the session events I could see some sessions with usernames and others where the user was listed as “ANONYMOUS USER”

Putting all of the above evidence together pointed to some sort of Kerberos issue. In our environment the SQL and SSAS instances are on different servers from our PBIRS instance, so we had to configure Kerberos with constrained delegation to allow the user credentials to flow from client > to PBIRS > to SSAS.

Kerberos can be a little fiddly to set up initially, but once it's configured I had never had an issue with it until the one this week. When we initially configured Kerberos I had distilled the generic instructions from the Microsoft documentation down into a series of specific steps for our production environment, listing the specific setspn commands that needed to be run and how the constrained delegation settings needed to be configured, and saved these in a document. This made it very easy to cross-check that all these configurations were still correct.

At this point we knew that the Kerberos configuration appeared to be unchanged, but we still had an issue where it worked for some users and not for others.

So the next thing I did was get some of our users to run the klist command from a command prompt. This command lists all the Kerberos tickets that are cached on the current machine. This led us to a pattern: everyone that was getting the error was getting their Kerberos tickets from a specific list of domain controllers, while people that did not have the issue were getting theirs from a different list of domain controllers. Our IT admins then noticed that all the failing servers were in a group which had had security patches applied earlier that day.
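For reference, these are the kinds of Windows commands involved in this sort of check (the server and account names below are placeholders, not our actual configuration):

```shell
# List the Kerberos tickets cached on the current machine; the ticket
# details show which domain controller issued them.
klist

# Register an SPN for an SSAS instance running under a domain service
# account (placeholder names).
setspn -S MSOLAPSvc.3/ssasserver.contoso.com CONTOSO\ssasSvcAccount

# List the SPNs registered for that account, to cross-check against
# your configuration document.
setspn -L CONTOSO\ssasSvcAccount
```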

The specific patch that had been applied was KB5007206, and after raising a support ticket with Microsoft they were able to point us to a hotfix (KB5008602) for this update. It fixed an issue with Kerberos authentication when using constrained delegation and protocol transition, which is exactly how Kerberos needs to be configured when using PBIRS or SSRS (or SSRS in SharePoint integrated mode, for people with on-prem SharePoint environments).

Hopefully this post will help anyone else that runs into this issue.

Fixing incorrect date formats in Power BI Report Server when using Chrome or Firefox

Reposted from 1 Mar 2019 from my old blog since geekswithblogs.net is down again at the moment: http://geekswithblogs.net/darrengosbell/archive/2019/03/01/fixing-incorrect-date-formats-in-power-bi-report-server-when.aspx

We are using Power BI Report Server to publish Power BI reports internally. And being in Australia we use a "sensible" date format of dd/mm/yyyy.

However, at a recent showcase some of our end users commented that our date format was "backwards", i.e. mm/dd/yyyy.

If you are using US regional settings on your PC you probably have not come across this issue, but for other regions this can be a problem.

Upon getting back to my desk I checked the report in Power BI Desktop and the date format was correctly picking up my local regional settings. I then opened the report in IE (which is what most of our End Users have as a default browser) and the report was still using the Australian format (dd/mm/yyyy):

DateSlicer-IE

However when opening the same report in Chrome or Firefox the date was using US regional settings (mm/dd/yyyy):

DateSlicer-Chrome

After some googling/binging I found that both Chrome and Firefox have their own internal regional settings, and they both default to US settings.

If you go into the Settings menu in Chrome and search for "Language" you will find the following. It's not just the default display language that matters; the order is also important. Moving "English (Australia)" to the top of the list, then deleting cached images and files, fixed this issue.

RegionalSettings-Firefox

Clicking on the 3 dots to the right of the language gives you an option to move that language to the top.

Firefox has a very similar language dialog and the same fix works for it too.

What this does behind the scenes is change the "Accept-Language" header that the browser sends to PBIRS, making your preferred language first in the list. This in turn alters the regional formatting JavaScript file that is sent down to the browser. Before this change I could see a file called angular-locale_en-us.js being used; after the change I now see angular-locale_en-au.js.
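The server-side behaviour amounts to walking the Accept-Language list in order and using the first locale it supports, which is why moving "English (Australia)" to the top fixes the formatting. A simplified sketch of that selection (Python for illustration; PBIRS's actual implementation will differ):

```python
# Pick the first supported locale from an Accept-Language header,
# falling back to en-us, mirroring the behaviour described above.
def pick_locale(accept_language, supported, default="en-us"):
    for part in accept_language.split(","):
        tag = part.split(";")[0].strip().lower()  # drop any ;q= weight
        if tag in supported:
            return tag
    return default

pick_locale("en-US,en;q=0.9", {"en-us", "en-au"})     # -> "en-us"
pick_locale("en-AU,en-US;q=0.9", {"en-us", "en-au"})  # -> "en-au"
```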

DAX Studio 2.15 Released

This release includes a lot of small fixes, but I think I may have finally tracked down and fixed what was causing the code completion to randomly stop working.

You can get it from https://daxstudio.org

New Features

  • #559 Added a toolbar button and right-click menu to copy Server Timing metrics
  • #556 Added an option to play a sound after long running operations (this can be enabled in the Options menu)
  • Added “Duplicate Measure” right-click menu to Query Builder
  • Added Error events to All Queries trace, this can be helpful if you have another client tool that is generating invalid DAX or MDX
  • #568 Added right-click menu option in the editor to go to dax.guide for functions
  • #569 Added Function tooltip display when hovering the mouse over functions in the editor
  • Added setup cmdline parameter docs
  • Added SQLBI videos to docs
  • Added discussions link to website navigation
  • Updated Multiple Query Detection to work with // Dax Query in addition to // SQL Query comments

Fixes

  • Adding extra checks around QueryEnd timeout in Server Timings
  • Close the Trace Layout ribbon tab when all documents are closed
  • Made ‘Add New’ disabled in the Query Builder if the current model has no tables
  • Fixed #458 added docs and tooltip to explain memory sizes
  • Fixed #517 updated clipboard bindings to allow for copying from the View Metrics tabs
  • Fixed #539 error starting traces when using Roles or EffectiveUserName
  • Fixed bug in retry for ViewMetrics when there is an error reading statistical information.
  • Added configuration to read statistical data for VertiPaq Analyzer. Disable reading statistical data for legacy versions of SSAS. Added retry logic and log of warning if there is an error reading statistical data.
  • Fixed #525 F1 and Context Menu to open dax.guide for functions
  • Fixed server timings trace for PowerPivot
  • Fixed drag drop issues with QueryBuilder
  • Fixed #514 Server Trace not working in Excel
  • Fixed #520 option to not truncate tables was not working when exporting to SQL Server
  • Fixed #511 Excessive metadata refreshes were causing the metadata pane to collapse
  • Fixed #512 fixed Table tooltip warnings and added measure count
  • Fixed #546 corrected wording in error message
  • Fixed #549 correctly escape connection string values containing special characters
  • Fixed #557 delimiter error on Clear Cache when using non-US delimiter style
  • Fixed a bug with the Roles parameter in the advanced connection properties not accepting spaces
  • Fixed #563 metadata incorrectly resetting when using a combination of mouse selection and query builder hotkeys
  • Fixed #564 Application crashes when clicking on a ribbon tab while the ribbon is collapsed
  • Fixed crash when resetting key bindings
  • Fixed #575 autocomplete sometimes stops working
  • Disabled the Add Measure button when connected to a model with no tables
  • Fixed #576 Editing an existing Measure in Query Builder against PowerPivot causes a crash

Fixing “Attempted to read or write protected memory” Refresh Errors in Power BI Report Server

Warning: I am just sharing what worked for me in our environment in the hope that it may help others who hit this issue. But you should make sure to test changes like this yourself, and that you have good backups and a rollback strategy before changing your own production environment. Any changes you make to your servers are done at your own risk.

After a fairly extensive round of testing we recently updated the production environment for our Power BI Report Servers to the Jan 2021 release. Unfortunately, during the verification testing after the upgrade I found an issue on the production servers that had not occurred in any of our other environments.

The issue was that a scheduled refresh of a pbix file linked to an Access file was failing with the error "We're sorry, an error occurred during evaluation.; Attempted to read or write protected memory", and searching for this error message only brought back hits from 2-3 years ago which were not very helpful.

The error reporting in the Report Server UI is often not very comprehensive, but even checking the RSPowerBI* logs in the Report Server log folder did not help all that much. (I've included the relevant lines from an example of this error at the end of this post.)

Initially I tried running a repair operation by re-running the Power BI Report Server installer, I could see that one of the things it checked was the Access 2016 Engine, but unfortunately even after doing a reboot the issue persisted.

Upon checking "Add or Remove Programs" I found that the servers had both the Access 2010 and Access 2016 Engine runtimes installed. I thought this was a little odd, since the 2016 version should supersede the 2010 version, and these servers only run Power BI Report Server, so it was not as if other software may have had specific requirements for the 2010 engine. So I did wonder if something had gone wrong with the registration of these 2 versions, and somehow the server was expecting to run using the 2016 version but was actually running the 2010 version.

However, upon checking our non-prod environments I found that they all had both versions installed. But by watching the installer I could see that the Jan 2021 release was definitely only installing the Access 2016 runtime. So, as a test, on one of our non-production environments I tried uninstalling both runtimes and then re-installing only the Access 2016 Engine runtime (which I found by searching for "access 2016 engine download").

I was not prompted to reboot after re-installing the Access 2016 engine, so I simply did a stop and start of Power BI Report Server using the Report Server Configuration Manager. After doing this, refreshing one of my test reports using an Access file data source still worked. Since production was already in a broken state as far as Access refreshes were concerned, I tried this same process on the production environment and it fixed our issue, so we now have a fully operational install of the Jan 2021 release of Power BI Report Server.

For reference the full error message from the RSPowerBI log files is below:

2021-02-24 18:49:46.7357|ERROR|19|Error Processing Data Model Refresh: SessionId: b3c84c62-2cf0-4df0-b22c-fde90857c4b8, Status: Error Failed to refresh the model, Exception Microsoft.AnalysisServices.OperationException: Failed to save modifications to the server. Error returned: 'We're sorry, an error occurred during evaluation.;Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Attempted to read or write protected memory. This is often an indication that other memory is corrupt.. The exception was raised by the IDbCommand interface.
'.
at Microsoft.AnalysisServices.Tabular.Model.SaveChangesImpl(SaveOptions options)
at Microsoft.PowerBI.ReportServer.AsServer.TOMWrapper.RefreshModel(Database database)
at Microsoft.PowerBI.ReportServer.AsServer.AnalysisServicesServer.RefreshDatabase(String databaseName, IEnumerable`1 dataSources, IEnumerable`1 parameters, String clientSessionId)
at Microsoft.PowerBI.ReportServer.WebHost.EventHandler.AnalysisServicesDataRefresh.RefreshDatabase(AsDatabaseInfo asDatabaseInfo)
at Microsoft.PowerBI.ReportServer.WebHost.EventHandler.DataRefreshScope.<>c__DisplayClass34_0.b__0()
at Microsoft.PowerBI.ReportServer.WebHost.EventHandler.DataRefreshScope.d__33`1.MoveNext()

DAX Studio 2.14 Release

Updates

The following are some of the new features and improvements that you will find in v2.14 of DAX Studio

Blank Canvas Help Text

Help text has been added when the edit control is empty. This is designed to help people that are new to writing DAX queries. There are links to an article with information on writing DAX queries as well as a link to display the query builder. This help text will automatically fade away as soon as you start typing and once you no longer need it you can turn it off.

New Privacy Settings

There are new privacy settings now available in the options. We don’t recommend enabling these for general use, but if you are working with data of a highly sensitive nature and you need to ensure that DAX Studio does not send any information externally you can enable these options.

Note: DAX Studio never sends any data from your model externally.

However, if the text of your queries contains sensitive information such as customer names or account numbers and you choose to use the Format DAX option, that query text gets sent to daxformatter.com. And if you happen to suffer a fatal crash, the crash reporter includes the option to send a screenshot; if you don’t untick that option, some data or query text may be visible in the screenshot.

So you, as the user, have complete control over any information that is sent externally. But if you work in a highly secure environment and need to ensure that no accidental information leakage can occur, you can enable some or all of these options.

Enabled Discussions on the Github site

Discussions are a new feature on Github and this now gives us a place where people can ask questions or make suggestions without having to log a formal issue.

Multiple Queries Detected dialog

The “Multiple Queries Detected” dialog appears when you paste in queries captured by Power BI Performance Analyzer from a composite model. Such captures sometimes contain 2 queries, where the second query is prefixed with a // SQL Query comment. The query after this comment is usually SQL; however, if your model is a composite model over Azure AS or Power BI, it might be DAX.
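
The detection described above can be sketched roughly as follows. This is an illustrative Python sketch only, not DAX Studio's actual implementation; the helper name and the assumption that the literal // SQL Query comment is the split point are mine.

```python
# Hypothetical sketch: split pasted Performance Analyzer text at the
# "// SQL Query" marker mentioned above. Names are illustrative and
# not DAX Studio's real code.

MARKER = "// SQL Query"

def split_pasted_queries(pasted: str):
    """Return (outer_query, inner_query_or_None)."""
    if MARKER in pasted:
        outer, inner = pasted.split(MARKER, 1)
        return outer.strip(), inner.strip()
    return pasted.strip(), None

outer, inner = split_pasted_queries(
    "EVALUATE VALUES('Date')\n// SQL Query\nSELECT [Date] FROM dim_date"
)
```

Note that the second part cannot be classified by the marker alone, which is why the dialog asks the user: the text after the comment may be SQL or DAX depending on the model.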

New Server Timings Options

There are new options for Server Timings that make the xmSQL text easier to understand. These processing options are enabled by default; they strip out internal IDs, aliases and lineage GUIDs, and they highlight Formula Engine callback operations, all of which makes these events easier to read.
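
The kind of cleanup described above can be sketched with a couple of regular expressions. The patterns below are purely illustrative, in the spirit of stripping numeric IDs and GUIDs from an xmSQL-like string; they are not DAX Studio's actual rules.

```python
import re

# Illustrative sketch only: strip "(nnn)" ID suffixes and GUIDs from an
# xmSQL-like string, similar in spirit to the new Server Timings options.
ID_SUFFIX = re.compile(r" \(\d+\)")  # e.g. 'Sales (42)' -> 'Sales'
GUID = re.compile(r"[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}", re.I)

def clean_xmsql(text: str) -> str:
    text = ID_SUFFIX.sub("", text)
    return GUID.sub("", text)

cleaned = clean_xmsql("SELECT 'Sales (42)'[Amount (7)] FROM 'Sales (42)'")
# -> "SELECT 'Sales'[Amount] FROM 'Sales'"
```

The point of the feature is exactly this: the raw storage-engine text carries bookkeeping identifiers that add noise without adding meaning for a human reader.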

Other Miscellaneous Updates

  • Help Ribbon updates
    • Updating link in help ribbon to Analysis Services forum
    • Adding link to Discussions https://github.com/daxstudio/daxstudio/discussions
  • Query Builder moved out of preview status
  • Benchmarking moved out of preview status
  • Adding Query Builder documentation to daxstudio.org
  • Updating AdomdClient and TOM references to v19.14
  • Added retry logic when fetching Database and Model lists
  • #450 Database Last Updated Datetime is now shown in local time instead of UTC (and the timezone information is displayed)
  • Added keyboard short cuts in metadata pane to work with Query Builder
  • #492 added description to function tooltip

Fixes

  • Fixed crash when doing Replace All with an invalid RegEx expression
  • Fixed crash when clicking Refresh Metadata after connection has dropped
  • Fixed #469 Server Timings busy spinner was activating incorrectly when doing Clear Cache
  • Fixed #467 Database Name was being incorrectly injected into all new connections when DAX Studio was opened from External Tools in Power BI Desktop
  • Fixed #466 PBI XMLA connections not being parsed correctly
  • Fixed #457 crashing on startup after setting view hidden objects = false in options
  • Fixed #356 Clear Cache throwing errors against PBI XMLA endpoint
  • Fixed #449 Connection lost during Benchmarking
  • Fixed a trace timeout issue when connecting to PBI XMLA endpoint (sometimes it is still necessary to increase the default timeout to 90 seconds or more)
  • Query Builder
    • Fixed a bug when filtering dates that reported an error parsing an empty string
    • Added the ability to drag KPI components into the Query Builder
    • Fixed a crash when trying to alter the expression for a dynamic format string for a calculation group item
  • Fixed hang when pasting in code with extremely long lines (tens of thousands of characters)
  • Fixed #451 up/down keys not working consistently
  • Fixed #446 Aggregate rewrite events causing an error
  • Fixed #444 to use correct culture in csv export
  • Fixed #445 defaulting the selected model to the base model not the first perspective
  • Fixed an issue with the trace layout options not displaying
  • Fixed a bug with swap delimiters feature and block comments
  • Fixed drag/drop of text not working within editor
  • Fixed #475 proxy auth error when entering AAD credentials
  • Fixed #485 Swap delimiters incorrectly affecting functions with periods in the name
  • Fixed #486 Define Measure not using Separator specified in Options
  • Fixed #497 Define Measures not working when connected to a model with active translations
  • Fixed #499 “Show Objects that Reference this Column” feature did not include table name.
  • Fixed #501 avoiding bug with EXCEPT function in Query Builder

Miscellaneous Crash fixes

In addition to the fixes listed above, a number of other changes should address issues that came in via crash reports. The majority of those reports were submitted anonymously, which means that for many of them it is impossible to verify the fix has worked: we have no steps to reproduce the issue, and the fix is based on our best guess at the root cause.

If you are experiencing a regular crash or issue, I would encourage you to open a bug on GitHub so that I can help resolve it.

DAX Studio 2.12.1 Released

There are just a few minor updates to DAX Studio in this release

New Features

  • Pasting a connection string into the server name of the Connection dialog will parse out the Data Source and Initial Catalog settings
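
Parsing those settings out of a pasted connection string can be sketched as below. This is a minimal illustration of the idea, assuming standard semicolon-delimited key=value syntax; the function name is hypothetical and this is not DAX Studio's actual code.

```python
# Hedged sketch: pull Data Source and Initial Catalog out of a pasted
# connection string (illustrative only, not DAX Studio's implementation).

def parse_connection_string(cs: str) -> dict:
    pairs = (p.split("=", 1) for p in cs.split(";") if "=" in p)
    return {k.strip().lower(): v.strip() for k, v in pairs}

props = parse_connection_string(
    "Data Source=localhost\\tab17;Initial Catalog=AdventureWorks;Provider=MSOLAP"
)
server = props.get("data source")        # "localhost\tab17"
database = props.get("initial catalog")  # "AdventureWorks"
```

In the real dialog the remaining properties are simply ignored; only the server and initial database fields are filled in.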

Fixes

  • #391 – issues when copying Power BI Performance data and pasting into Excel
  • #393 – application crashed if you attempted to open the Initial Catalog list before specifying a server
  • #394 – output to text file was not working for multiple results sets
  • #397 – “All Queries” trace was not working against PowerPivot connections

DAX Studio 2.12 release

DAX Studio 2.12 has just been released and I wanted to take some time to walk through some of the significant new features in this release.

The Installer is now signed with EV Certificate!

The fact that the installer is now signed with an EV certificate might sound like technobabble, but it should prevent the issues and warnings people have been getting from Windows SmartScreen when attempting to download and install DAX Studio.

A HUGE thank-you goes out to the guys at SQLBI.com. Acquiring an EV certificate not only requires a financial outlay, you also need a legal company entity with a physical office, financial records and things like that. These requirements make it pretty much impossible for a free/open source tool to obtain an EV certificate on its own without assistance.

Note: Even though Windows may show SQLBI as the publisher, this is because they funded the code signing certificate, and it is a condition of an EV certificate that it bears the legal name of the entity that purchased it. DAX Studio is still an open source tool and all support requests still go through our GitHub issue register.

Initial Catalog option added to Connection Dialog

An Initial Catalog setting has been added to the Advanced Options in the connection dialog. If you have a very large model or a slow connection (such as Azure AS or a VPN), you can now choose the initial database you connect to. This saves you having to wait for the metadata of the default database to load, then change to another database and wait for a second metadata refresh.

In future I also want to look at improving the ability for DAX Studio to automatically re-connect to dropped connections. But that will have to wait for a future release.

Format Query button updated

The Format Query button has been changed into a split-button so that you can choose the style of formatting by clicking on the bottom half of the button.

Clicking the top half of the button will still trigger the default formatting style (which can be changed in the Options menu)

Portable version

DAX Studio now has a portable download option. The portable version is just a zip file which you can unzip then run by clicking on DaxStudio.exe. This is helpful if you need to run DAX Studio on a machine where you may not have admin rights or where you do not want to run the installer.

The portable version has the following limitations:

  • The Excel addin is not included (as we need to write to the registry with admin rights to register the addin with Excel)
  • The Power BI Desktop External Tools feature will not work (as it requires admin rights to add the pbitools.json file under c:\program files\…)
  • The ability to double click on a .dax file to open DAX Studio will not work (this also requires making registry changes to the machine)

Apart from the above limitations, the portable version is identical to the installed version. The only other difference is that settings, query history, autosave files and error logs are all stored in local files and folders underneath the folder from which daxstudio.exe is launched.

Other New Features

  • Added Roles and IsAdmin properties to Database tooltip that can be useful when testing security settings.

Fixes

  • #378 Power BI External Tools – installer not correctly updating path in daxstudio.pbitools.json file if a non-default install location is used
  • #377 exporting tables with no rows would throw an error
  • #376 autosave not recovering open files for all types of crashes
  • #374 improved tooltip sampling to prevent full column materialization
  • Fixed crash when attempting to export using invalid manual connection string
  • #381 re-connecting to a pbix file that has been closed and re-opened after launching from External Tools does not work

DAX Studio 2.11.2 Released

I’ve just pushed out a minor update for DAX Studio. It fixes a few issues I’ve received from crash reports, as well as the following items:

Updates

  • Adds support for the “External Tools” ribbon in Power BI Desktop

Fixes

  • #373 Fixes “Define Measure” when connected to a SSAS instance without admin rights
  • #372 Duplicate MyMeasures created when adding more than 1 custom measure in the Query Builder
  • #367 Measure information duplicated in VPAX files
  • Fixed crash when highlighting
  • Fixed crash when closing last tab and connection had been lost
« Older posts