
Extended JSON parsing

gyorokpeter
New Contributor

Apparently .j.k loses precision with large enough numbers:

q)`long$.j.k"1471220573128024107"
1471220573128024064

However, I'm trying to interface with a program that expects the number to round-trip: if I export the JSON again, it should contain exactly 1471220573128024107, so I can't use .j.k to import the file, as it loses precision. Parsers for other languages can keep a number as a long integer and only fall back to a float when the JSON contains a decimal point or exponent.
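
As far as I can tell, .j.k returns a float (type -9h) for every number, even when there is no decimal point:

q)type .j.k"1471220573128024107"
-9h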

Is there any existing library that can do the same for q?

1 REPLY

rocuinneagain
Contributor III

A user has written qrapidjson, which implements .j.j but not .j.k.

The underlying rapidjson C++ library does offer a parse flag, kParseNumbersAsStringsFlag, which parses all numbers as strings:

http://rapidjson.org/namespacerapidjson.html#a81379eb4e94a0386d71d15fda882ebc9

Although in a quick test the Python rapidjson module already appears to parse numbers without decimal places as longs by default:

q)rapidjson:.p.import`rapidjson
q)rapidjson[`:loads]["1471220573128024107"]`
1471220573128024107
q)type rapidjson[`:loads]["1471220573128024107"]`
-7h

q)type each rapidjson[`:loads]["[1471220573128024107, 1471220573128024107.0]"]`
-7 -9h
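
For comparison, the built-in .j.k parses both elements of that array as floats:

q)type each .j.k"[1471220573128024107, 1471220573128024107.0]"
-9 -9h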

//Python's default json library also parses whole numbers as longs
q)json:.p.import`json
q)json[`:loads]["1471220573128024107"]`
1471220573128024107
q)type json[`:loads]["1471220573128024107"]`
-7h
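
So parsing through embedPy and serialising back out with .j.j (built-in or qrapidjson's) should round-trip the digits exactly; a quick sketch along the same lines:

q).j.j json[`:loads]["1471220573128024107"]`
"1471220573128024107"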