Working with huge JSON files
Mark Olson
There are various tools out there for working with JSON files (I’ll selfishly plug JsonTools, but others exist). If you’re working with very large (say 50+ MB) JSON files that put a lot of pressure on the editor, though, you might want to check out the latest version of the HugeFiles plugin, which can now break a JSON file into syntactically valid chunks without holding any persistent objects in memory for longer than absolutely necessary.
The parser is still pretty slow right now (maybe 100-200 seconds/MB), but once the file is chunked you can page through it quickly without consuming much memory or risking a crash.
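The core idea behind this kind of chunking can be sketched in a few lines: scan the text while tracking string/escape state and bracket depth, and cut only at top-level element boundaries, so every chunk parses on its own. Here is a minimal Python sketch for the simplest case, a top-level JSON array; the function name and details are my own illustration, not HugeFiles’ actual implementation:

```python
def chunk_json_array(text, max_chunk_size):
    """Split the text of a JSON array into syntactically valid sub-arrays
    of roughly max_chunk_size characters each.

    Scans character by character, tracking string/escape state and bracket
    depth, so element boundaries are found without materializing the whole
    document as parsed objects.
    """
    body = text.strip()
    assert body.startswith("[") and body.endswith("]"), "expects a JSON array"
    chunks = []
    elements = []   # completed element strings in the current chunk
    current = []    # characters of the element being scanned
    depth = 0
    in_string = False
    escaped = False
    size = 0

    def flush_element():
        nonlocal size
        elem = "".join(current).strip()
        if elem:
            elements.append(elem)
            size += len(elem)

    for ch in body[1:-1]:           # drop the outer [ ]
        if in_string:
            current.append(ch)
            if escaped:
                escaped = False
            elif ch == "\\":
                escaped = True
            elif ch == '"':
                in_string = False
        elif ch == '"':
            in_string = True
            current.append(ch)
        elif ch in "[{":
            depth += 1
            current.append(ch)
        elif ch in "]}":
            depth -= 1
            current.append(ch)
        elif ch == "," and depth == 0:
            flush_element()          # comma at top level: element boundary
            current = []
            if size >= max_chunk_size:
                chunks.append("[" + ",".join(elements) + "]")
                elements, size = [], 0
        else:
            current.append(ch)
    flush_element()
    if elements:
        chunks.append("[" + ",".join(elements) + "]")
    return chunks
```

Each returned chunk is itself a well-formed JSON array, so any JSON tool can consume a chunk in isolation; a real implementation would of course stream from disk rather than take the whole text as a string.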
If anybody who works with this sort of thing is interested in helping improve parser performance, I’d welcome the help.
HugeFiles can also break files into newline-delimited chunks; this is very fast and doesn’t need any real help from a performance perspective.
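It’s easy to see why newline-delimited chunking is so much faster: it needs no parsing at all, just a seek to each target offset and a scan to the next newline. A rough Python sketch, with function and parameter names that are illustrative rather than HugeFiles’ actual API:

```python
import os

def newline_chunk_offsets(path, target_size):
    """Return (start, length) byte spans that each end on a newline
    (or at EOF), so every chunk contains only whole lines."""
    offsets = []
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)       # find the file size
        file_size = f.tell()
        start = 0
        while start < file_size:
            f.seek(min(start + target_size, file_size))
            f.readline()             # advance past the next newline (or hit EOF)
            end = f.tell()
            offsets.append((start, end - start))
            start = end
    return offsets
```

Since only offsets are recorded, the file is never held in memory; a viewer can later seek to any `(start, length)` span and read just that chunk.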