Another BISM Normalizer Use Case

As I have said for (almost) three years, the use cases for BISM Normalizer go beyond helping migrate from Power Pivot to SSAS. Apart from merging code bases, deploying large partitioned tabular models, and so on, one thing I probably have not emphasized enough is the ease of deployment through dev, test and prod environments from Visual Studio. Let’s be honest: while they may not admit it, I suspect a large proportion of projects deploy from Visual Studio using right click > Deploy in Solution Explorer.

BISM Normalizer allows piecemeal deployment of individual features through environments from Visual Studio, which is valuable for many projects. Right click > Deploy in Solution Explorer obviously doesn’t achieve this. Apart from being an all-or-nothing deployment technique, it doesn’t take the state of each environment into account.

For example, BISM Normalizer makes it possible to pick specific features that have been signed off in a user-acceptance-testing environment and deploy them to production – even if the SSAS objects involved have been modified in the lower environments.

Benefit: easy deployment that is more responsive to the customer.

BISM Normalizer: Version 1.3.13 Released!

Download it from the BISM Normalizer Visual Studio Gallery page.

Enhancements in Version 1.3.13.1

  • Expiry date extended to December 31st 2014. It is unclear what will happen to BISM Normalizer after this time.
  • Tested with SQL 2012 SP2.

SSAS Locking: CommitTimeout and ForceCommitTimeout

There are already plenty of good posts out there on this topic.

As mentioned by Andrew Calvett, it is possible to set the CommitTimeout for a Process command.  Here is how to do it.

<Execute xmlns="urn:schemas-microsoft-com:xml-analysis">
  <Command>

    <Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Type>ProcessFull</Type>
      <Object>
        <DatabaseID>AdventureWorks</DatabaseID>
        <DimensionID>Date_d2c7ec3d-c72c-435d-bd43-8283714cc2dd</DimensionID>
      </Object>
    </Process>

  </Command>
  <Properties>
    <PropertyList>
      <CommitTimeout>20000</CommitTimeout>
    </PropertyList>
  </Properties>
</Execute>

 

Setting CommitTimeout to 20 seconds (20000 milliseconds) means it will kick in before the server-level default ForceCommitTimeout of 30 seconds. To try this out, run the following query, which takes about 50 seconds on my laptop.  As soon as the query starts running, execute the Process command.  The Process command should be rolled back, allowing the query to run to completion.

DEFINE
MEASURE 'Date'[Bad Measure] =
   COUNTROWS(GENERATE(ALL('Internet Sales'), VALUES('Date'[Day Of Year])))
EVALUATE
ADDCOLUMNS(
   VALUES('Date'[Calendar Year])
   ,"Bad Measure", [Bad Measure]
)

 

Change CommitTimeout to 40 seconds (40000 milliseconds) and the default ForceCommitTimeout of 30 seconds will kick in instead.  The query will fail – instead of the Process command – “because of locking conflicts”.
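
For reference, the only change needed in the earlier script is the property value – everything else stays the same:

  <Properties>
    <PropertyList>
      <CommitTimeout>40000</CommitTimeout>
    </PropertyList>
  </Properties>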

BISM Normalizer: Version 1.3.12 Released!

Download it from the BISM Normalizer Visual Studio Gallery page.

Enhancements in Version 1.3.12.3

  • Support for Visual Studio 2013 and SQL Server 2014. To run Visual Studio 2010/2012/2013 on the same machine, point them at the same local Addin folder in Visual Studio options.
  • Using a new (very simple) InstallShield installer. This will hopefully resolve the issue with the installation path pointing at a remote Addin folder based on target machine settings. I am unable to test this.

Update 4/4/2014: fixes in 1.3.12.4

  • Fix for the BISM Normalizer window re-initializing when it loses focus (Visual Studio 2012 and 2013 only).

Video of BISM Normalizer

Revised video of BISM Normalizer.

BISM Normalizer: Version 1.3.11 Released!

Download it from the BISM Normalizer Visual Studio Gallery page.

Enhancements in Version 1.3.11.1

Support for tabular objects created by BIDS Helper.

  • Actions are first-class objects listed in the differences grid.
  • Display folders are attributes of other objects (tables with columns/hierarchies, measures) and visible in those objects’ definitions.
  • Translations are attributes of other objects (tables with columns/hierarchies, measures, perspectives, actions) and visible in those objects’ definitions.

It is necessary to enable the features in the BISM Normalizer options dialog (accessible from the Connections dialog).

BISM Normalizer Options

BISM Normalizer is 2 Years Old!

BISM Normalizer is 2 years old today. I launched it on Christmas Day 2011 (way before SQL Server 2012 hit RTM). Its arrival was marked by a wise man writing a blog post.

Having done various presentations about BISM Normalizer at the PASS Community Summit, SQLBits and other events, I gradually became more aggressive in selling its benefits – especially challenging the SSAS Deployment Wizard.

I have considered opening up the source code on Codeplex. I also had discussions about selling the source code (the executable is obfuscated), but they didn’t work out. I don’t know the future of BISM Normalizer, but the most likely outcome is that I continue to give it away for free on the VS Gallery as I have done till now. The benefit of giving it away (for me) is that it raises my profile both as a consultant and in the SQL community.

So consider BISM Normalizer a Christmas gift of enterprise code management for Analysis Services, facilitated deployment, and promotion of a “single version of the truth” for business definitions covered by BI models.

Merry Christmas – or Happy Holidays – to you and your family!

BISM Normalizer: Version 1.3.10 Released!

Download it from the BISM Normalizer Visual Studio Gallery page.

Enhancements in Version 1.3.10.3

  • Measure/column formats included in object definitions for comparison.  Note: column datatypes have always been included.
  • Measure/column/table visibility included in object definitions for comparison.
  • Expiration date extended to June 30th 2014. It is unclear what will happen to BISM Normalizer after this time.

Multidimensional or Tabular

This post is not a list of multidimensional features unsupported by tabular. That has already been documented by various reliable sources, including blog posts and PASS presentations.

The first thing I would like to say is that I agree there is lots of work for tabular to do to catch up with the feature-rich multidimensional. The tabular-model designer inherited from Power Pivot is sluggish for models with lots of tables. The Excel-like DAX formula bar is, to put it politely, annoying. Without saying anything too controversial, MS corporate BI has been playing second fiddle lately.

However, tabular does make sense for many customers today. For most customers, having fast performance is more important than the unsupported features – which invariably either have “workarounds” or are fringe use cases.

On the workarounds: if the same functionality can be delivered to the business, they don’t care whether we technical people see it as a “workaround” because it’s not delivered the way we are used to. And the business people are the ones who matter. This applies to many-to-many relationships, parent-child hierarchies, role-playing dimensions (we can create multiple instances of the same table), and various other items.
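
As a rough illustration of the parent-child workaround, the hierarchy can be flattened into calculated columns using PATH, PATHITEM and LOOKUPVALUE. This is only a sketch – the Employee table and its EmployeeKey, ParentEmployeeKey and Name columns are hypothetical:

HierarchyPath = PATH ( Employee[EmployeeKey], Employee[ParentEmployeeKey] )

Level1 =
LOOKUPVALUE (
    Employee[Name],
    Employee[EmployeeKey],
    PATHITEM ( Employee[HierarchyPath], 1, INTEGER )
)

Level2 =
LOOKUPVALUE (
    Employee[Name],
    Employee[EmployeeKey],
    PATHITEM ( Employee[HierarchyPath], 2, INTEGER )
)

The Level columns can then be arranged into a regular user hierarchy in the model.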

For what I’m calling the fringe use cases, the supportability of some of these features does not make sense for many customers. Hand a support team a solution that uses MDX stored procs and extensive scoped-cell assignments, and they will struggle. How many implementations use these features because the developer thought they were cool rather than to meet a real business need? I think quite a few.

Other features, like unary operators and writeback, may be genuine showstoppers – but not for the majority of implementations.

Scoped-cell assignments are on the potential-showstopper list too. In most cases, though, calculation logic belongs in the ETL layer – unless it plays to the strengths of the cube/tabular model, such as aggregated-level calculations or calculations that would cause a data-explosion problem at the relational level – so the formula engine is avoided where possible. For the calculations that do belong in the model, DAX is a pretty capable and powerful language.
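
For example, a calculation that in multidimensional might be handled with a scoped assignment – say, showing the average of the monthly totals at higher levels – can be written as a plain DAX measure. A minimal sketch; the 'Date'[Month] and 'Internet Sales'[Sales Amount] columns are assumed and would need adjusting for the actual model:

Avg Monthly Sales :=
AVERAGEX (
    VALUES ( 'Date'[Month] ),
    CALCULATE ( SUM ( 'Internet Sales'[Sales Amount] ) )
)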

On the tabular memory limitation, many customers I’ve talked to are worried they won’t fit into memory when they are actually nowhere near the upper limit of what can relatively easily be put in a server (especially when limited to the required data). And more memory will only become more viable in the future. For a multi-terabyte data warehouse implementation like a Yahoo.com, then yes, memory may be a genuine constraint – but again this is a fringe use case.

As noted elsewhere,

  • Once a project chooses multidimensional or tabular, it is not possible to change your mind without starting development again.
  • I think it is widely accepted that Microsoft is more likely to build new features and put future development investment into tabular than multidimensional.

It is a valid statement that a project may need some of the features unsupported by tabular at a later date, which could be a problem. Conversely, a project may encounter performance issues with multidimensional that were not anticipated at the start of the project.  One could also argue that, as new features are built into tabular, a project could be stuck with multidimensional and unable to leverage better tabular capabilities in the future.

Update Nov 13 2013 – post from Marco Russo: Updates about Multidimensional vs Tabular

Deploy Tabular Models and Retain Partitions

Kasper de Jonge asked a question on Twitter that I would like to answer here, because a) it gives the topic more exposure, and b) I pretty quickly ran out of 140 characters in my Twitter reply.

I initially tweeted: Problem when deploy tabular models to Dev server from SSDT & lose dynamic partitions? Deploy with BISM Normalizer & retain partitions

Kasper tweeted: Hey Christian, what do you mean by dynamic partitions?

And here is my response …

Hi Kasper,

We often partition the large tables in a tabular model to speed up processing times (as I’m sure you know).  For example, we might partition by month and then process only the most recent couple of months every night – which is a lot quicker than processing the whole table.  Normally this is done with AMO code called from an SSIS package – so the ETL can manage it, incrementing/dropping partitions, etc.  The version of the model in source control does not contain the partition objects.
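
To illustrate, the XMLA the nightly ETL effectively sends for each recent partition looks something like the sketch below. The DatabaseID is reused from the earlier example; the CubeID, MeasureGroupID and PartitionID are hypothetical and would normally be generated by the AMO/SSIS code:

<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>AdventureWorks</DatabaseID>
    <CubeID>Model</CubeID>
    <MeasureGroupID>Internet Sales</MeasureGroupID>
    <PartitionID>Internet Sales 2014-01</PartitionID>
  </Object>
</Process>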

While developing in SSDT, we frequently deploy to a dev server (obviously not test/production).  This is necessary because all the dev reports are pointing at the server – not the workspace database on the developer machine.  If the dev ETL has built partitions, they are lost upon deployment from SSDT (using right-click deploy) – and they take a long time to rebuild/reprocess.

When creating a BISM Normalizer difference comparison, we have the option to “include partitions in table definitions”.  If we leave this unchecked, partitions will not be considered when comparing tables.  BISM Normalizer will treat such tables as equal and will not mess with them – thereby avoiding reprocessing.

Include Partitions Option

The SSAS Deployment Wizard is the only other deployment method for tabular models that supports “retain partitions” functionality.  However, you would not use it to deploy from SSDT to the dev server when making a change to a DAX calc or something.  BIDS Helper’s Deploy MDX Script feature serves a similar purpose for multidimensional.

For regular Test/Production deployments, we can use the deployment wizard. Alternatively, we can use BISM Normalizer to (more easily) create an xmla script for the release to apply on Test/Production.

Another use case is bug fixes and partial deployments.  BISM Normalizer can create a script to apply on Test/Production – which updates only the calculation(s) that need to be fixed without a full release cycle – and does not mess with everything else that is already there.  The deployment wizard only supports “all-or-nothing” deployments, which are not appropriate for bug fixes and partial deployments.

Cheers!

Christian
