
Master Your Power BI Environment with Tabular Models

The world of business analytics is evolving rapidly. The size and scope of business databases have grown as ERP functionality has evolved, businesses have increased their adoption of CRM and marketing automation, and collaboration networks have become more common. As ERP moves to the cloud, software vendors are developing more sophisticated, interconnected ways of gathering, organizing, and analyzing business data.

Microsoft’s launch of the Power BI platform several years ago marked the company’s entrance into an important space within the business applications market. As the Microsoft Dynamics ERP products transition to a cloud-first model, Microsoft has positioned Power BI as the future of business intelligence for its Dynamics family of products.


Power BI provides users with powerful dashboarding capabilities. Unfortunately, it also introduces a mountain of complexity into the reporting process. Most organizations are looking for sophisticated reporting and analytics, but they have little appetite for managing the highly complicated infrastructure that goes with it.

Fortunately, there is a way to have the best of both worlds.

OLAP Cubes vs. Tabular Models

Let’s begin with an overview of how data analytics works for most business applications. The simplest model for reporting ERP data involves a direct connection to the database – executing queries against the transactional database in real time, summarizing and organizing selected data points, and displaying a report that you can print, display on screen, or save as a data file. In the world of Microsoft Dynamics 365, that option is going away, or at the very least, it will look somewhat different than it once did. As a security measure, Microsoft is closing off direct database access to live Microsoft Dynamics ERP data. The company is pointing customers to several other options, including “BYOD” (which stands for “bring your own database”) and Microsoft Azure data lakes.

This leads to the second option, which is a data warehouse. In this scenario, data are periodically extracted from the source transactional system and loaded into a dedicated database, against which you can perform reporting and analytics.

Within the data warehouse paradigm, there are two divergent approaches. The first is an OLAP model. That stands for “Online Analytical Processing,” and it’s a paradigm that goes back more than two decades, to a time when database performance and computational power were far less robust than they are today.

To perform multidimensional analysis on large data sets, OLAP data were organized into “cubes.” The process required substantial pre-processing of information, so all of the computational heavy lifting was done in advance. That was often an overnight process, which meant that today’s reports always reflected yesterday’s reality.

In other words, the OLAP approach was largely a workaround to make up for the fact that processing power was simply not up to the task of crunching lots and lots of data in real time.

Fast-forward to 2020. Moore’s law marches on, databases are more powerful, and processing speeds are faster than ever. The era of big data has arrived.

Part of the performance improvement arises from the development of in-memory database technology. By loading large data sets into random-access memory (RAM), software engineers have made it possible to query very large data sets and render results with extraordinary speed.

This leads to the newest paradigm for reporting and analytics—tabular models. Tabular models within SQL Server Analysis Services (SSAS) run as an in-memory process, which means they’re very fast. It seems likely at this point that Microsoft will focus more of its resources on developing the more modern tabular models, which don’t require pre-processing and are therefore much better at providing near real-time information.
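The contrast between the two paradigms can be sketched in a few lines of Python, using entirely hypothetical sales figures: an OLAP-style “cube” is pre-aggregated in a nightly batch and goes stale as new transactions arrive, while a tabular-style query scans the in-memory rows on demand and always reflects the latest loaded data.

```python
from collections import defaultdict

# Hypothetical raw sales rows (store, month, amount) -- illustration only.
sales = [
    ("Store A", "2020-01", 120.0),
    ("Store A", "2020-02", 150.0),
    ("Store B", "2020-01", 200.0),
    ("Store B", "2020-02", 180.0),
]

# OLAP-style: pre-aggregate every cell and rollup into a "cube" in a
# nightly batch. Queries become instant lookups, but reflect yesterday.
def build_cube(rows):
    cube = defaultdict(float)
    for store, month, amount in rows:
        cube[(store, month)] += amount   # pre-computed cell
        cube[(store, "ALL")] += amount   # rollup by store
        cube[("ALL", month)] += amount   # rollup by month
    return cube

cube = build_cube(sales)                 # the overnight process
print(cube[("Store A", "ALL")])          # instant lookup: 270.0

# Tabular-style: keep raw rows in memory and aggregate on demand.
sales.append(("Store A", "2020-02", 30.0))  # a new transaction arrives
fresh = sum(a for s, _, a in sales if s == "Store A")
print(fresh)                             # on-demand scan sees it: 300.0
print(cube[("Store A", "ALL")])          # stale cube still says 270.0
```

The trade-off is exactly the one described above: the cube buys query speed by paying the aggregation cost up front, which only made sense when hardware could not afford the on-demand scan.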

Building a Roadmap to Your Data

Because it’s apparent that tabular models are the future, it’s important that companies understand the practical implications for their IT investments and the total cost of ownership.

In the past, the people who developed and modified reports usually needed to understand the underlying data structure of the ERP system. In the new paradigm, an additional step is required: Someone needs to develop the tabular models that serve as a kind of roadmap or translation layer between the ERP data and reporting tools.

This process requires some very specialized expertise. Moreover, it can result in a map that is confusing and potentially contradictory.

Let’s say, for example, that you want to report on year-over-year comparative same-store sales. One programmer might develop a tabular model that compares current year sales to the prior year for all stores in the organization. Another might exclude new locations, or locations that have closed in the current year, or both. Reports built on those two (or more) models will produce very different results.
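To make that divergence concrete, here is a small Python sketch using hypothetical store revenue figures. Both functions are reasonable readings of “year-over-year same-store sales,” yet from the same records one shows sales declining and the other shows sales growing.

```python
# Hypothetical per-store revenue for the prior and current year;
# None means the store did not operate in that year.
revenue = {
    "Store A": {"prior": 1_000_000, "current": 1_100_000},
    "Store B": {"prior": 800_000,   "current": 750_000},
    "Store C": {"prior": None,      "current": 400_000},  # opened this year
    "Store D": {"prior": 600_000,   "current": None},     # closed this year
}

# Model 1: compare totals across ALL stores, treating missing years as zero.
def yoy_all_stores(rev):
    cur = sum(v["current"] or 0 for v in rev.values())
    pri = sum(v["prior"] or 0 for v in rev.values())
    return cur, pri

# Model 2: compare only stores open in BOTH years.
def yoy_same_stores(rev):
    both = [v for v in rev.values()
            if v["prior"] is not None and v["current"] is not None]
    return sum(v["current"] for v in both), sum(v["prior"] for v in both)

print(yoy_all_stores(revenue))    # (2250000, 2400000): sales look down
print(yoy_same_stores(revenue))   # (1850000, 1800000): sales look up
```

Neither model is wrong; they simply encode different business definitions. When those definitions live buried in separately built tabular models, two dashboards can contradict each other without anyone being able to say why.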

One of the top complaints that we hear from C-suite executives is that they often receive conflicting information about what is happening in their organization. When the VP of Sales announces that year-to-date revenue is on target at $20 million, but the CFO’s revenue number only shows $18 million, it’s hard to have a meaningful conversation about sales performance.


Businesses need a single source of truth. In and of themselves, tabular models are not necessarily problematic in this way, but when multiple people within the organization develop them independently, or when requirements are not clearly documented, the results can be chaotic.

The result can be a bit like a collection of Excel spreadsheets created by someone who has left the company, and edited by a half dozen people over the years. Information gets lost. Formulas get out of sync. Someone needs to bring order to the chaos.

Ongoing Maintenance

To make matters worse, tabular models require ongoing maintenance. Whenever the ERP system is upgraded (or CRM, or some other system that is incorporated into business reports), the data model inevitably changes. That means tabular models need to be revisited.

If you have a substantial library of existing reports and dashboards, reviewing and testing them all will take time. Each problem that is uncovered as part of that review will prompt a rewrite of the code underlying the tabular models that drive those reports.

If you are using Power BI without any external reporting tools, that maintenance effort must be undertaken with every upgrade you perform to your systems. It dramatically increases your total cost of ownership.

Bringing Order to the Chaos

There is a surprisingly simple solution to many of the problems discussed here: by working with a pre-built data model instead of building customized one-off tabular models, businesses can have instant access to their ERP data, without all the upfront work or the ongoing effort and expense associated with custom-built SSAS tabular models.

Jet Analytics is a product from insightsoftware that integrates with the family of Microsoft Dynamics ERP products and other business applications. Jet Analytics provides the tabular models for you, the roadmap to your ERP information, as an off-the-shelf product that lowers your costs and vastly simplifies reporting and analytics. You can also create your own custom tabular models from the same pre-built data model to ensure consistent results.

Jet Analytics updates each time there is a data model change to the Microsoft Dynamics database, which eliminates the problem of rewriting large amounts of code every time you upgrade.

With built-in data lineage and documentation options, everything links together in one platform and is documented on the fly, making it easy to track and trace data within your business intelligence environment.

For customers upgrading from one of the legacy Microsoft Dynamics products to Microsoft Dynamics 365 Business Central (D365 BC) or Microsoft Dynamics 365 Finance & Supply Chain (D365 F&SC), Jet Analytics makes life even easier because it solves a critical problem involving reports that include historical data.

Typically, when an organization changes from one ERP system to another, that involves a tedious and expensive data migration process. For practical reasons, many companies choose to limit the amount of data that they migrate. In the process, much of the detailed transactional history is left behind.

That means your reports will no longer include historical data alongside current information. It creates a break in continuity that most business leaders will find unacceptable.

With Jet Analytics, companies can connect to their legacy ERP system and their new ERP system at the same time. That means you can have a complete picture of your organization, both before and after the migration, without sacrificing visibility into historical data and without requiring an expensive project to migrate historical data to the new system.

Jet Analytics puts the power of reporting and analytics into the hands of end-users, enabling greater business agility and responsiveness.

For over two decades, insightsoftware has provided over 27,000 organizations with world-class financial reporting and enterprise performance management tools. We enable finance teams with reporting, analytics, budgeting, forecasting, consolidation, and tax solutions to provide them with increased productivity, visibility, accuracy, and compliance. Download our Jet Solutions brochure, built for Microsoft Dynamics 365, or talk with one of our experts about how insightsoftware can help you provide better business intelligence at a lower total cost of ownership.

Rick de Jong

Finance and IT specialist with over 20 years’ experience who loves helping organizations build insight by creating high-quality management reporting mechanisms.