Dealing With JSON Data (a Primer)


Recently one of the members of our Facebook group has been trying to work with JSON data and posted a few questions. And while a lot of the answers were correct, they also seemed a bit dated, and led me to believe that perhaps not everyone has fully embraced just how well PCSOFT has done over the last few releases at integrating JSON natively. So read on and I will attempt to cover some of the new methods we use when dealing with JSON data.

Here at wxPerts we work with a lot of JSON data, partially because we work with a lot of webservices, and RESTful webservices with JSON data have definitely become the standard for webservices. In fact, I cringe when I see a SOAP and XML webservice these days, as I find working with RESTful and JSON so much easier.

But that isn’t the only way we work with JSON data. Andy does most of our mobile work and he often uses JSON when he needs to store some data on the local device. There are a lot of reasons for this: in most mobile projects we are going to be working with some kind of a backend database, and that communication will be via RESTful and JSON, so if the data that the mobile app is sending and receiving is already JSON, it just makes sense to keep it in that format. It also means we don’t need to rely on installing Hyperfile on the mobile device or using SQLite.
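Just to illustrate that pattern (the settings.json file name and the fields below are made up for illustration, not from one of Andy's actual apps), keeping a little local data as JSON can be a sketch as simple as this:

// A minimal sketch of keeping local app data as JSON on the device
// (settings.json and the field names are hypothetical)
localData is JSON
localData.LastSync = DateSys()
localData.UserName = "uncle.pete"
fSaveText(CompleteDir(fDataDir()) + "settings.json", JSONToString(localData))

// and reading it back in later
localData = fLoadText(CompleteDir(fDataDir()) + "settings.json")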

In recent years we have also been using JSON data in our databases to provide flexibility and customization. The beauty of JSON data is that it doesn’t have to be predefined; you can add a new field to it without breaking anything, you can mix and match fields as needed, etc. In fact we are currently working on a project that has several database tables because of extra fields needed for different regions and/or clients. We have created a methodology to store those extra fields in a JSON column in the database, so the data can be different for each region/client, and a new field can be added on the fly very quickly and easily without any database updates/conversions. Since we are using PostgreSQL as our backend database we can even directly query the individual fields in the JSON via SQL statements, but that is getting a bit outside the scope of this post. I promise we will be talking more about some of this methodology as the project progresses.
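Just to give a flavor of that without going too deep (the table, column and field names below are invented for illustration, and MyConnection stands in for whatever connection you use), PostgreSQL lets you pull an individual field out of a JSON column right in the SQL:

// A minimal sketch of querying a field inside a PostgreSQL JSON column
// (the Orders table, ExtraFields column and Region field are hypothetical)
sSQL is string = [
	SELECT OrderID, ExtraFields->>'Region' AS Region
	FROM Orders
	WHERE ExtraFields->>'Region' = 'EMEA'
]
dsOrders is Data Source
HExecuteSQLQuery(dsOrders, MyConnection, hQueryWithoutCorrection, sSQL)
FOR EACH dsOrders
	Trace(dsOrders.OrderID, dsOrders.Region)
END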

So all of that was just to say we seem to live, breathe and eat JSON around here. In fact we were doing it back when there was little to no support for JSON inside WINDEV/WEBDEV, so when it became obvious over the last few releases that PCSOFT had made a major commitment to natively support JSON, we were very happy.

Much of the advice given in the post concerning JSON mentioned using a Variant variable type, structures, classes, etc. All of which we have used in the past to work with JSON data, and many of which we have written about or taught in webinars. But they were all workarounds, and had issues.

For example, putting your JSON data into structures lets you work with it very easily in the WL language; however, as soon as it is in a structure you have lost the flexibility of the dynamic nature of JSON. With a structure every field defined exists, and you can’t add a new field without changing the structure definition. You can also run into issues when you attempt to serialize/deserialize the JSON data in/out of the structures if the two don’t match perfectly. There are ways to handle all those issues, but again, with the latest JSON functionality we just haven’t found much of a need for them.

Note: In a very oversimplified way, classes are just structures with some added functionality, so although they do give some extra ability for dealing with the above mentioned issues, they are still just a workaround. Not that I am saying classes themselves aren’t great, heck we pretty much do everything as a class these days, and all of that methodology I mentioned above for custom fields in a database via JSON is handled via classes. I am just saying that we use classes to work with the native JSON data, not to convert it into a WL format to work with it.

Moving JSON into a Variant data type was definitely a great stepping stone on our path to native JSON support, and in many ways the new JSON datatype works very similarly to the Variant datatype. For example we can use the ..Exist property to see if a field exists. We can add a field to the variable by simply using it (MyVariant.NewField = 123). But moving the JSON in and out of a Variant datatype just isn’t needed anymore, so why go through the extra step, processing time, etc.? And as you will soon see, the JSON datatype adds some new features that the Variant datatype doesn’t offer!
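As a quick side-by-side sketch (the member names here are hypothetical), the two types behave almost identically for this kind of dynamic usage:

// A minimal sketch contrasting the Variant and JSON types
MyVariant is Variant
MyVariant.NewField = 123		// member created on the fly
IF MyVariant.NewField..Exist THEN
	Trace("Variant member exists")
END

MyData is JSON
MyData.NewField = 123			// same dynamic behavior, no conversion step needed
IF MyData.NewField..Exist THEN
	Trace("JSON member exists")
END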

As is the Uncle Pete style, I have rambled on enough; it’s time to start looking at working with JSON natively. For this example we are going to use some JSON data that comes back from our QuickBooks Online API classes. This is the Items query, and I told it to only return 3 results, which will let us explore arrays without having too large of a JSON object. So here is the native JSON:

{
	"Item":
	[
		{
			"Name":"Design",
			"Description":"Custom Design",
			"Active":true,
			"FullyQualifiedName":"Design",
			"Taxable":false,
			"UnitPrice":75,
			"Type":"Service",
			"IncomeAccountRef":
			{
				"value":"79",
				"name":"Sales of Product Income"
			},
			"PurchaseCost":0,
			"TrackQtyOnHand":false,
			"domain":"QBO",
			"sparse":false,
			"Id":"4",
			"SyncToken":"2",
			"MetaData":
			{
				"CreateTime":"2020-03-26T10:41:38-07:00",
				"LastUpdatedTime":"2020-05-16T14:42:06-07:00"
			}
		},
		{
			"Name":"Gardening",
			"Description":"Weekly Gardening Service",
			"Active":true,
			"FullyQualifiedName":"Gardening",
			"Taxable":false,
			"UnitPrice":0,
			"Type":"Service",
			"IncomeAccountRef":
			{
				"value":"45",
				"name":"Landscaping Services"
			},
			"PurchaseCost":0,
			"TrackQtyOnHand":false,
			"domain":"QBO",
			"sparse":false,
			"Id":"6",
			"SyncToken":"0",
			"MetaData":
			{
				"CreateTime":"2020-03-26T10:43:14-07:00",
				"LastUpdatedTime":"2020-03-26T10:43:14-07:00"
			}
		},
		{
			"Name":"Hours",
			"Active":true,
			"FullyQualifiedName":"Hours",
			"Taxable":false,
			"UnitPrice":0,
			"Type":"Service",
			"IncomeAccountRef":
			{
				"value":"1",
				"name":"Services"
			},
			"PurchaseCost":0,
			"TrackQtyOnHand":false,
			"domain":"QBO",
			"sparse":false,
			"Id":"2",
			"SyncToken":"0",
			"MetaData":
			{
				"CreateTime":"2020-03-21T14:42:05-07:00",
				"LastUpdatedTime":"2020-03-21T14:42:05-07:00"
			}
		}
	],
	"startPosition":1,
	"maxResults":3
}

So let’s put the above JSON data into a native JSON variable type so we can do some experimenting. Normally I would do something like this to populate a string with a long fixed value; however the curly brackets in JSON complicate that:

myJSON = [
xxx
xxx
]

So instead I saved the JSON to a text file and then used fLoadText like this

myJSON is JSON = fLoadText("C:\temp\example.json")

And here comes my first tip when working with JSON data, or any complex data for that matter, in WINDEV/WEBDEV… the debugger is your friend. We can get some great hints about how to handle the data by simply looking at it in the debugger, so after loading the variable, I set a breakpoint and then examine the variable in the debugger.

When we choose to display as text we see it in a structured tree. I feel that the Display options are a bit backwards and this should really be the JSON view, but I digress 🙂

Choosing JSON from the Display combo shows us the variable in its native JSON text format

There is also a Hexadecimal display option, but I don’t really see that as beneficial unless you happen to be a robot 🙂

So back in the Text format, notice that when I click on name in the second item, the Expression line at the top actually shows me exactly how to address it via code.

So let’s give that a try via this code:

Info(myJSON.Item[2].Name)

And as you see we get the correct value. Note this also gives us a tip on how to deal with the data, as we can see that Item is an array, which means we can write code to work with it similar to the way we would work with other arrays in WL code.

Of course JSON can contain complex variables inside other complex variables and even arrays inside arrays, but again it is no issue to address them, and the debugger helps us figure out the correct code.

For example IncomeAccountRef is a complex variable that contains 2 fields of its own (name and value). But the debugger shows exactly how we could address those fields as well.
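Following the same pattern the debugger gives us, this line (based on the sample data above) would display "Landscaping Services":

Info(myJSON.Item[2].IncomeAccountRef.name)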

So we know Item is an array, so let’s expand our code a bit to loop through the array and show the name of the IncomeAccountRef for each Item returned (3 in this case). Here’s the code:

retString is string
FOR EACH recItem OF myJSON.Item
	retString += [CR] + recItem.IncomeAccountRef.name
END
Info(retString)

And here is the result

As you see we are using the FOR EACH loop syntax that we use for stepping through any other WL array. recItem gets declared as a JSON variable of its own with just a single item in it. Let’s look at recItem in the debugger, and as you can see it once again shows us exactly how to reference the field we wanted to see.

Note: If you are more comfortable working with arrays using a counter/index loop, that also works just fine. Here is how the code would read for that:

FOR x = 1 TO myJSON.Item..Count
	retString += [CR] + myJSON.Item[x].IncomeAccountRef.name
END
Info(retString)

Here are a couple of other quick tips and examples, then we are going to really get fancy!

As mentioned above, the JSON variable type is dynamic; we can add fields to it on the fly. So if we execute the following code

myJSON.NewField1 = "We just added this"

We see that it is now part of the JSON when we look in the debugger

And if we want to save the JSON back out to a file, we can simply do this, and it will write out a correctly formatted JSON file with our new field included

fSaveText("C:\temp\example2.json",JSONToString(myJSON,psdFormatting))

Note: The psdFormatting option formats the JSON in human-readable format with indents and carriage returns, but if you are OK with your JSON being on one line, or need it as compact as possible, you can use the psdMinified option instead. But if that is what you want, you can do it even more simply and not use the JSONToString function at all.

fSaveText("C:\temp\example3.json",myJSON)

We can just as easily insert complex variables, such as arrays, and do so on specific elements of the arrays (a specific Item). And since everything is dynamic, we don’t even have to have the same fields on each. Let’s look at this code

myJSON.Item[1].NewArray[1].Field1 = "Abc"
myJSON.Item[1].NewArray[1].Field2 = "Def"
myJSON.Item[1].NewArray[2].Field1 = "Hij"
myJSON.Item[2].NewArray[1].Field3 = "Klm"

And here are the results in the debugger. As you can see we managed to create a new array and even have different fields in the array for different Item records. Very powerful stuff, and we use this to our advantage often for returning custom fields, errors, etc.

As mentioned above, we can also use the ..Exist property, similar to a Variant, to see if a field exists or not, which can sometimes be handy for exception code. So after adding our new data from the last example, if we execute this code

retString is string
IF myJSON.Item[1].NewArray..Exist THEN
	retString += [CR] + "Yes"
ELSE
	retString += [CR] + "No"	
END
IF myJSON.Item[3].NewArray..Exist THEN
	retString += [CR] + "Yes"
ELSE
	retString += [CR] + "No"	
END
IF myJSON.Item[1].NewArray[1].Field2..Exist THEN
	retString += [CR] + "Yes"
ELSE
	retString += [CR] + "No"	
END
IF myJSON.Item[2].NewArray[1].Field1..Exist THEN
	retString += [CR] + "Yes"
ELSE
	retString += [CR] + "No"	
END
Info(retString)

We will get this result: Yes, No, Yes, No.

Before moving on to some even more advanced topics, I should mention another quick tip when working with JSON, or XML for that matter: use Notepad++, which has extensions for formatting the JSON and even viewing it in a tree format.

Now for a couple of things that nearly blew my mind when I discovered them.

First, remember that just like JavaScript, JSON is case sensitive, so myJSON.Field1 and myJSON.field1 are 2 completely separate fields, which can really cause you problems in your coding if you don’t get it perfect. But PCSOFT has even come up with a way to help us with that issue!!!
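Here is a quick sketch of that pitfall (the member names are hypothetical):

// JSON members are case sensitive - these are two separate fields
myJSON.Field1 = "upper case F"
myJSON.field1 = "lower case f"
Trace(myJSON.Field1)	// displays "upper case F"
Trace(myJSON.field1)	// displays "lower case f"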

Some of you may know that when working with XML documents you can import an XSD or example XML document into your project to use as an external description, which enables intellisense for that document. Well, guess what, we can also import an example JSON document!!!

When working with APIs, often the documentation will provide you with an example of the JSON that will be returned. For example, I was able to get this example of the Items JSON from the QuickBooks Online documentation.

{
	"Item": [
		{
			"Name": "",
			"Description": "",
			"Active": true,
			"FullyQualifiedName": "",
			"Taxable": false,
			"UnitPrice": 0,
			"Type": "",
			"IncomeAccountRef": {
				"value": "",
				"name": ""
			},
			"PurchaseDesc": "",
			"PurchaseCost": 0,
			"ExpenseAccountRef": {
				"value": "",
				"name": ""
			},
			"AssetAccountRef": {
				"value": "",
				"name": ""
			},
			"TrackQtyOnHand": false,
			"QtyOnHand": 0,
			"InvStartDate": "",
			"domain": "",
			"SubItem": false,
			"ParentRef": {
				"name": "",
				"value": ""
			},
			"sparse": false,
			"Id": "",
			"SyncToken": "",
			"MetaData": {
				"CreateTime": "2020-03-26T10:36:03-07:00",
				"LastUpdatedTime": "2020-03-29T12:47:47-07:00"
			}
		}
	],
	"startPosition": 0,
	"maxResults": 0
}

I save that as Items.json and then import it into my project as an external description

Which gives me this

Now if we drag that definition into our code we get this

Which I can then adjust to be the same as our original code for loading the JSON file like this

myJSON is JSON <description="Items"> = fLoadText("C:\temp\example.json")

The difference is that now intellisense knows the makeup of our JSON and can assist us, so if I type myJSON. it begins prompting me, and you can see maxResults and startPosition, along with their exact case format, are available, and it even shows me that Item is an array.

If I take it a step further and type myJSON.Item[1]. intellisense provides me with a list of the fields in Item

And this just keeps going for however deep we may need. For example

But wait, don’t order yet, there’s more! 🙂 Instead of dragging the entire Items description onto our code area, just drag Item

And we get this

That’s right, we can use part of the definition, so now our code for looping through the array looks like this, and as you can see intellisense is once again able to help us. Note: for this code to work, recItem can’t be declared in the global section, as the FOR EACH loop won’t let you use a predefined global variable for some reason.
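For reference, here is a sketch of what that loop reads like with the typed recItem (assuming the Items.Item description name from the import above, and recItem declared locally as just noted):

// A sketch of the loop using the partial external description
recItem is JSON <description="Items.Item">
retString is string
FOR EACH recItem OF myJSON.Item
	retString += [CR] + recItem.IncomeAccountRef.name
END
Info(retString)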

That is all well and good, but now really prepare for a mind-expanding experience.

If we define our recItem in the global event for the page, instead of the initializing event, then the variables are available to be bound to our controls. So if I put this code in the global event of the page

myJSON		is JSON	<description="Items">	= fLoadText("C:\temp\example.json")
recItem		is JSON	<description="Items.Item"> = myJSON.Item[1]
FileToPage()

And we bind the name to an edit control

You can see that we are now able to move JSON data directly in and out of control fields without any additional code and without ever changing it into any other data type!!!!!
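And to complete the round trip, something like this (a sketch; the file name is made up and I am assuming a button's click event) would pull any edits back out of the controls and save them:

// Copy the edited control values back into the bound recItem variable
PageToFile()
// recItem now reflects the edits, so save it back out as JSON
fSaveText("C:\temp\item1.json", JSONToString(recItem, psdFormatting))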

If you are like me, at this point you are really excited, and the first thing you want to do now is try to use myJSON.Item as the source for a table or looper control. However, although we are able to set the source

And even bind the columns to the individual fields

I have not been able to get it to actually work and display the values. I believe the issue is that the control isn’t fully seeing it as an array. For example, when I look at a table control where we are using a class as the source, notice on the binding that the top line shows as an array of OrderHeader

However that portion is empty when we use the JSON variable

So unfortunately, for now, we either have to move our JSON into a structured array or class in order to populate a table control or looper control without code, or we have to manually populate them using TableAdd or LooperAdd. I am hoping that PCSOFT can address that in a future release, which would mean that we would be able to completely code a project using native JSON data the same as we do any other data!
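The manual route is only a few lines anyway; here is a sketch assuming a hypothetical table control named TABLE_Items with Name and UnitPrice columns:

// A minimal sketch of filling a Table control from the JSON array by code
// (TABLE_Items and its column layout are hypothetical)
TableDeleteAll(TABLE_Items)
FOR EACH anItem OF myJSON.Item
	TableAddLine(TABLE_Items, anItem.Name, anItem.UnitPrice)
END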

Hopefully this has given you a pretty good primer on using JSON natively in WINDEV/WEBDEV and convinced you that you no longer need to move it to variants, structures or classes for most uses. I should note that I believe everything I have shown in this article was actually available in version 25. Version 26 has added some additional functions for moving JSON data into Hyperfile record structures, but I haven’t personally used those features. There were also some significant performance improvements with the JSON datatype in v26.
