fitdump.pl: JSON output #7
Comments
I agree that the JSON version of .fit data could be very helpful, but there are two possible problems:
The main routine used in
What if it just followed the FIT spec? A FIT file has a header, an array/list of record objects, and a CRC. I've just started writing a parser in Go, and my FIT file structure looks very similar to what is proposed below:

```json
{
  "crc": 49512,
  "header": {
    "crc": 23380,
    "dataSize": 4430,
    "profileVersion": "20.13",
    "protocolVersion": "1.0",
    "size": 14
  },
  "records": []
}
```

Note: the CRC values above are in decimal (int) format instead of hex. This would still let you have the varying data/length in the data/records section while still providing some of the FIT file details in other portions of the structure. Thoughts?

Note: my current work, and thus my proposed JSON format, does not account for the scenario where you have multiple FIT files concatenated into one FIT file (which is permissible).
I added a first version of JSON output to fitdump.pl. You can try it with the command-line switch "-print_json=1". If you want to include the units, which are switched off by default for JSON, add the command-line switch "-without_unit=0". The format is the following (trying to cover concatenated .fit files, although I have no such example):
I also don't output invalid values, because with this simpler format there's no way to distinguish them from the valid ones.
From what I see, it looks good. I'll give it a spin soon. If you don't hear from me, feel free to ping me.
I've been using this to build some tools in node.js. I noticed that the JSON output loses the message_number value from the original file. I've hacked my local version for my needs for now, but this would be a great addition.
It's not quite clear to me where we would put the message number while retaining easy accessibility of the data. Can you give me an example? We could also add an additional mapping between message names and numbers at the beginning.
+1 for message mapping. Including the other message information would be useful too. It's lost in translation right now.
Here's an example of what I'm seeing in the library.
Right now, I have modified this to at least consistently apply the message number to the message name. To make it easier to parse, I've also delimited it with "__".
NOTE: "unknown[message_number]" and "xxx[message_number]" appear to already be reserved prefixes for those message types. Another option, though it won't be consistent with the raw FIT output, would be to append a "message_number" attribute.
Rethinking: the current JSON output is actually not so much a repack of fitdump output as a converter, which provides only the minimum (useful) information. Including message numbers, lengths, data types, etc. would require a different JSON structure. Besides, message numbers always map to the same message names (see SDK/FIT.pm), so this would not add any additional information.
Interesting. Since I'm using the JSON output to programmatically construct fitsed.pl expressions to modify the .fit file, not being able to retrieve the message number from all messages in the JSON is a problem. The fact that
I'm trying to do some data analysis of `.fit` files and I love `Garmin::FIT` for helping me with this. But not being a Perl developer by trade, it would be really slick if `fitdump.pl` had a `--json` option or some way to output the data as JSON. Ideally, you'd output basically an array of objects, and each object would correspond with the `type` of data it represents. I will look into submitting a PR for this in the meantime, but I figured I'd start here to get your opinion, and possibly save time by having an expert knock it out. Thanks again for this awesome library.