So far, we’ve seen how we can extract an entire table from the cloud, in this case the Chart of Accounts. However, we often don’t want to do that (especially on tables like G/L Entries). 

Thus, we need to filter.

The complete reference for ODATA Filtering for Business Central/NAV can be found here:

Filtering – Choosing Only Income Statement Accounts

Let’s start with a simple example and say we only want to see income statement accounts. So, we need to know two things: the field name to filter on and the value to filter for.

Looking at the Postman File

To do this, we look up the file I saved in part 3 of this series, the .json file produced by Postman.  

Opening the file in Visual Studio, we can search for an example:

The field name is called Income_Balance and the value is “Income Statement.”

Two things to note if you’ve worked with SQL Server and NAV before:

  1. The field names don’t have the special characters that drove us crazy in old, on-premises NAV. Here, spaces have been replaced with underscores, and periods have generally disappeared.
  2. The field types are not always the same. The best example I can think of (which drove me crazy) was Posting Date on G/L Entries. I figured out it was Posting_Date. However, in SQL it’s defined as a DATETIME, while in ODATA it comes across as a Date. Ugh.
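The renaming pattern above can be sketched as a small helper. This is just my reading of the observed convention (spaces become underscores, periods disappear), not Microsoft’s documented algorithm, so always check the service’s $metadata for the real names:

```python
def odata_field_name(nav_field: str) -> str:
    # Approximate the NAV-to-ODATA naming pattern observed above:
    # periods are dropped, then spaces become underscores.
    # A sketch of the observation, not an official mapping.
    return nav_field.replace(".", "").replace(" ", "_")

print(odata_field_name("Posting Date"))  # Posting_Date
print(odata_field_name("Document No."))  # Document_No
```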

Looking at Excel

While I’ve said that I’m not looking to use Excel for this exercise, it is a nice way to view the data quickly. I’ve described how to do that in part 2A of this series.

Here’s what Excel output of ODATA looks like:

We can see that the column name is the field we want to filter on, and the value is still “Income Statement.”

Entering Filters in Postman

Let’s go back to Postman and look at our request again.

The first thing we’re going to do is Save As, so we can have one version with parameters and one without.

We click the arrow next to Save and choose Save As:

And we name it ODATA-Accounts with Filter:

Back to Postman.

There’s a lot of stuff we can do with filters. This Microsoft document describes many of the options:

However, we’re going to focus on getting income statement accounts.

Let’s look at the Params tab underneath the request. We enter the parameter name Microsoft expects ($filter) and the string to pass: Income_Balance eq 'Income Statement'.

Note that the string Income Statement must be in single quotes.

When we send the request, we can see the result set starts with 40000, the first income statement account:

We highlight and copy the string and paste it into the browser: 

When we do, we see that the spaces (and some other special characters) have been replaced. 

Keep in mind that when you use other tools, you may need to do these replacements yourself.

Going More in Depth — Working with G/L Entries

By the end of this series, I want to arrive at a place where we have a complete solution with working data. So, I’m going to use a different ODATA web service for G/L entries. 

G/L entries power a large percentage of the reports that I build. So, accessing them is pretty important.

Let’s start again by going back to Business Central and finding the Web Services entry for G/L Entries:

Here’s the link: {YourGUIDHere}/Production/ODataV4/Company('CRONUS%20USA%2C%20Inc.')/G_LEntries

You’ll note that the only thing that’s changed is the last part of the URL — the part which tells us which Page we’re calling.

Next, let’s create a new request in Postman:

We then paste the URL into the GET area and add a filter to only take entries with an Entry_No le 10:

When we scroll through the results, we see that we’ve only received 10 entries. (I can’t prove this with a screen shot as there are too many rows for each entry.)

Let’s make one more adjustment — filtering for a range of numbers. (I’m going to use this in my complete example in PowerShell.)
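A range filter is just two comparisons joined with and, using the ODATA operators ge (greater than or equal) and le (less than or equal). A minimal sketch, with the bounds chosen purely as an example:

```python
from urllib.parse import quote, urlencode

def range_filter(field: str, low: int, high: int) -> str:
    # ODATA comparison operators: ge (>=), le (<=), joined with 'and'.
    return f"{field} ge {low} and {field} le {high}"

flt = range_filter("Entry_No", 1, 10)
print(flt)  # Entry_No ge 1 and Entry_No le 10

# Percent-encode it for use in a request URL.
query = urlencode({"$filter": flt}, quote_via=quote)
```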

This brings me to another parameter that you may want to use.

Selecting Only Certain Fields

Up to now, we’ve been receiving every field available on the page, which is often what we want when pulling NAV data into another system.  

But let’s say we only want Entry, Account, Posting Date and Amount. We can do that with a Select parameter. 

The parameter entry looks like this:

When we run the selection, we see we only have 10 records with four fields each:
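Combining $filter and $select in code looks like this. The four field names below are my assumption based on the underscore convention described earlier; verify them against your own page’s $metadata:

```python
from urllib.parse import quote, urlencode

# Field names assumed from the naming convention above -- check your
# service's $metadata for the exact names exposed by your page.
params = {
    "$filter": "Entry_No le 10",
    "$select": "Entry_No,G_L_Account_No,Posting_Date,Amount",
}
query = urlencode(params, quote_via=quote)
print(query)
```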

Selecting by Date

Let’s finish with one more selection, this time by date rather than by Entry_No.

Let’s choose all entries with a Posting_Date of January 31st, 2018.

So we replace the filter Entry_No with the filter Posting_Date:

And it works:

I wanted to use this example for a reason. The ODATA page here exposes Posting Date as a Date, not a DateTime. I, being used to SQL, thought it should be a DateTime and wasted lots of time trying to pass one. Hopefully I’ve saved you from the same mistake.
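In ODATA v4, a Date literal is a bare ISO date (yyyy-MM-dd) with no quotes and no time component, which is exactly what tripped me up. A small sketch of building that filter:

```python
from datetime import date

def date_filter(field: str, d: date) -> str:
    # ODATA v4 Edm.Date literals are bare ISO dates: no quotes,
    # no time component (unlike a SQL DATETIME).
    return f"{field} eq {d.isoformat()}"

print(date_filter("Posting_Date", date(2018, 1, 31)))  # Posting_Date eq 2018-01-31
```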

One Last Note on Postman – The Value of Variables

The challenge of APIs is that the strings get to be quite long. As we saw above, much of the string is the same for each request. Further, I hard coded the company name into each string. If I wanted to test this across companies, I’d have to go into each request and make the change — which isn’t something I want to do.  

Enter the power of Postman collections.

When we first set up our Postman collection, I showed how you can store your authentication values at the collection level. 

Turns out, we can do more than just that with our collection.  

Let’s go back to our collection and choose Edit from the context menu:

Now choose the Variables Tab:

We’re going to create two variables, one for base_url, and one for bc_company (in this case CRONUS US).

The base_url is everything before the company segment, and bc_company is the company segment itself.

So, given the string we used before for G/L Entries: {YourGUIDHere}/Production/ODataV4/Company('CRONUS%20USA%2C%20Inc.')/G_LEntries

Our variables would look like this:


bc_company:  Company('CRONUS%20USA%2C%20Inc.')/

Please note: The company variable must be correct for the URL, with spaces and commas replaced by %20 and %2C (at least in my testing).

Make sure you have the right number of slashes in your variables. As a rule, I always keep the trailing slash in my variables.
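Postman’s {{variable}} substitution is simple enough to mimic, which also makes the slash rule easy to see: with a trailing slash kept in each variable, the pieces concatenate cleanly. The values below are placeholders, not a real tenant:

```python
# A rough stand-in for Postman's {{variable}} substitution.
# Both values are placeholders -- substitute your own tenant URL
# and company. Note each variable keeps its trailing slash.
variables = {
    "base_url": "https://example.invalid/Production/ODataV4/",
    "bc_company": "Company('CRONUS%20USA%2C%20Inc.')/",
}

def expand(template: str) -> str:
    for name, value in variables.items():
        template = template.replace("{{" + name + "}}", value)
    return template

url = expand("{{base_url}}{{bc_company}}G_LEntries")
print(url)
```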

Here’s what it looks like on the screen:

Now that we have our variables, we can create another request using them.

Here’s what the request looks like:

And when we run it, here are the results:

Getting this Code

I’m going to save a version of this collection in my GitHub repository. All queries will include the variables. 

And in my next blog post in this series, I’ll talk about how to get this data into SQL Server.
