Data Integration, PowerShell

Objections to PowerShell for Data Integration

In our previous post, we discussed why we recommend that you use PowerShell for data integration.  If you’re a data professional, you may have some objections.  (Indeed, if you’re not a technical person, you should probably skip this post).

First, PowerShell isn’t great for data transformation.

For example, if you’ve used SSIS or a similar tool, you may be used to mapping data in your integration tool. That, indeed, is how the ETL (Extract/Transform/Load) integration model works.  You take data from the source (extract), reshape it in your tool (transform), and push it into the target (load).

We don’t recommend this paradigm.  Instead, we prefer ELT (Extract/Load/Transform).

In this model, all PowerShell does is pull the data and push it into another database. Once the data is loaded, we use SQL to transform it for its final use.

The point: we don’t do transformations in intermediate tools.  We do them in the source database or in the target database.
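The ELT pattern above can be sketched in a few lines. This is an illustration in Python with in-memory SQLite (in practice the glue would be PowerShell and the databases would be real servers); the table and column names are hypothetical:

```python
import sqlite3

# Two in-memory databases stand in for the real source and target systems.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Pretend this data already lives in the source system.
source.execute("CREATE TABLE customers (id INTEGER, name TEXT, amount_cents INTEGER)")
source.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                   [(1, "Acme", 125000), (2, "Globex", 99500)])

# Extract: pull the rows as-is -- no reshaping in the integration layer.
rows = source.execute("SELECT id, name, amount_cents FROM customers").fetchall()

# Load: push the raw rows into a staging table in the target database.
target.execute("CREATE TABLE stg_customers (id INTEGER, name TEXT, amount_cents INTEGER)")
target.executemany("INSERT INTO stg_customers VALUES (?, ?, ?)", rows)

# Transform: let SQL in the target database do the actual work.
target.execute("""
    CREATE TABLE dim_customer AS
    SELECT id, UPPER(name) AS name, amount_cents / 100.0 AS amount_dollars
    FROM stg_customers
""")
```

Note that the integration layer never touches the shape of the data; all the business logic lives in that final SQL statement, where it is easy to test and easy to hand off.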

Speed may be your next objection.

Purpose-built tools can be faster than PowerShell.

But our typical customer is a solid mid-market company with revenue anywhere from $50 to $500 million.  And customers like you don’t have “big data”. It’s rare that any of your integrations process more than 100,000 records at a time.

A decent server can process this quantity of data within minutes.  And in the world of REST APIs, the speed constraint is most often the API, not PowerShell. Again, as a mid-market company, you’re not looking for instantaneous, 24×7 performance.  Ease of support and training are much more important.

If you do have tens of millions of records to load at a time, or you need instantaneous updates, we’ll agree that PowerShell is not the answer. And we’ll admit that we’re not the right partner for you.

Data Integration, PowerShell

Why PowerShell for Data Integration

While there are many dedicated tools for data integration (Boomi, SSIS, Informatica), we think you should choose PowerShell.  Whether your data source is an Excel spreadsheet, a text file, a REST API, or a software-specific reporting tool, PowerShell can be the glue that binds them all together.  There are three reasons we think you should choose this direction.

First, if you run Windows, you have PowerShell.  Yes, we’ll almost certainly have to install some new modules.  But you have it.  And there’s no easier investment than no investment.

Second, you can find people to support and maintain what we build together. Which means you won’t become dependent on us.  We certainly hope that you become a long-term client.  But we don’t believe in consultant dependency syndrome.   You should work with us because you want to, not because you’re afraid of firing us.

Third, there are lots and lots of resources available to learn about PowerShell.  With their MVP program, Microsoft has done a wonderful job encouraging expert content creation.   Blogs, videos, and conferences provide multiple ways to learn about PowerShell. And of course, we’re happy to train.

That said, we know that PowerShell is not THE answer to every data integration question. We’ll discuss some objections in our next post.

Adaptive Process

Collecting Data About How You Work with Data

A friend of mine is an executive at a smaller non-profit healthcare provider. We’ve been speaking informally about the various data and reporting challenges his agency is facing.

Here are the key facts:

  1. The organization relies on multiple software systems, some in the cloud and others server-based.
  2. The organization has several folks who develop reports in the different systems. Each person develops reports as they see fit, generally upon request from others in the organization.
  3. There is no CIO or CDO. Various other executives (including my friend, who has no technical background) have responsibility for managing technical folks.
  4. An executive with extensive experience in larger, more sophisticated organizations wants the organization to become more sophisticated. He would like them to develop a data warehouse to centralize all their data so they can better develop metrics that would drive the business forward.
  5. The organization has not been successful in the past when they have attempted to execute large projects.

What do I recommend? Should they be strategic, develop a data warehouse, create metrics and distribute? Or should they go tactical and identify all current data demands and knock them out one by one?

Read more…
Dynamics NAV

Building A NAV Date Table for Power BI

A date table is essential to data work, whether you’re working in SQL, Tabular Modeling for Power Pivot, Power BI or SSAS. 

Because if you work with any kind of data, and particularly accounting data, you’ll always have to ask for the appropriate time period.

You can find lots of good examples of how to build a date table, both in SQL and in Power BI, if you operate on a calendar year or your fiscal months are calendar months.

Google is your friend here, so if the links above don’t help, you can find others.

But, if you’re a retailer or another business that operates on some variety of a 4-5-4 calendar (with a 53rd week showing up every five or six years), that’s not as easy to find.
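The 4-5-4 mechanics can be sketched in a few lines. This is an illustration in Python, assuming you already know your fiscal year’s start date and whether it has 52 or 53 weeks (both are inputs your own calendar supplies, not something this sketch derives):

```python
from datetime import date, timedelta

PATTERN = [4, 5, 4, 4, 5, 4, 4, 5, 4, 4, 5, 4]  # weeks per fiscal period

def fiscal_calendar(year_start: date, weeks_in_year: int):
    """Yield (week_start, fiscal_week, fiscal_period) rows for one fiscal year."""
    boundaries = []
    total = 0
    for weeks in PATTERN:
        total += weeks
        boundaries.append(total)  # cumulative last week number of each period
    for week in range(1, weeks_in_year + 1):
        # A 53rd week, when it occurs, is folded into the final period here;
        # your business may assign it differently.
        period = next((i + 1 for i, b in enumerate(boundaries) if week <= b), 12)
        yield year_start + timedelta(weeks=week - 1), week, period
```

The same cumulative-boundary lookup translates directly into a SQL date table or a Power Query step; the only moving parts are the pattern and the year start.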

Read more…
Business Intelligence

Why I Love Self-Service BI

Self-service BI (or self-service business intelligence) has been a buzzword for a while. If you Google “self-service BI,” you get 383,000,000 results.

(Interestingly, if you Google it without the hyphen, as “self service BI,” you get 409,000,000 results. I can’t tell you the significance of this, although Word tells me it needs a hyphen.)

But you don’t need Google search results to tell you that users don’t want to rely on IT to get the “data they need from the software they already have.”

So, as a consultant, why would I want users to do more of the work?

Read more…
Dynamics NAV

NAV Object, Field, and Option Numbers in the SQL Server Database

When writing SQL queries for NAV, you can run into a problem.   

While column names are generally pretty self-explanatory, sometimes the data within those columns requires decoding.  

Specifically: 1) Table Numbers, 2) Field Numbers, and 3) Option Numbers.

I’ve already covered this in my posts (and presentation) on Understanding the NAV Database.

Here I’m going to repeat myself — and then give you the code you need to get these numbers into SQL.  

Read more…
Dynamics NAV

Accessing Dynamics Business Central (NAV) Cloud Data, Part 5 – Going Back to SQL Server

So far in this series, I’ve spent a lot of time showing how we can work with various OData web service APIs to get different sets of data from Business Central.

And as you may recall, the purpose of this series is to describe how you can combine data from Business Central with other applications that may not be in the cloud and may not be using Power BI.  

In this post, we’re ready to talk about this specifically.

Read more…
Dynamics NAV

Accessing Dynamics Business Central (NAV) Cloud Data, Part 4 – Going Deeper into the API

So far, we’ve seen how we can extract an entire table from the cloud, in this case the Chart of Accounts. However, we often don’t want to do that (especially on tables like G/L Entries). 

Thus, we need to filter.

The complete reference for OData filtering for Business Central/NAV can be found here:
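To make the idea concrete, here is a sketch of what a filtered OData call might look like. The base URL, company name, and field names below are placeholders for illustration, not real endpoints:

```python
from urllib.parse import urlencode

# Placeholder endpoint -- substitute your own Business Central service URL.
base = "https://api.example.com/ODataV4/Company('CRONUS')/G_L_Entries"

# Only pull entries posted on or after a cutoff date, instead of the whole table.
params = {"$filter": "Posting_Date ge 2024-01-01", "$top": "1000"}
url = f"{base}?{urlencode(params)}"

# The request itself would then be a plain authenticated GET, e.g. with
# Invoke-RestMethod in PowerShell or requests.get() in Python.
```

The point is that the filtering happens server-side, in the query string, so a table like G/L Entries never has to cross the wire in full.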

Read more…
