I have a JSON file that is roughly 36 GB, and I need to access it more efficiently. I've been using RapidJSON's SAX-style API in C++, but a full parse still takes about two hours. So here's my question: should I split the big file into millions of small files? Is there another approach I should take? Thanks.
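For context, this is a minimal sketch of the kind of SAX setup I mean, assuming a handler that just counts events and a placeholder file name `big.json`; the real handler does per-record work, but even a do-nothing pass over the file takes on the order of hours:

```cpp
#include <cstdio>
#include "rapidjson/reader.h"
#include "rapidjson/filereadstream.h"

using namespace rapidjson;

// Minimal SAX handler: counts events instead of building a DOM,
// so memory use stays constant regardless of file size.
struct CountingHandler : public BaseReaderHandler<UTF8<>, CountingHandler> {
    size_t events = 0;
    bool Default() { ++events; return true; }  // invoked for every event not overridden
};

int main() {
    // "big.json" is a placeholder for the 36 GB file.
    FILE* fp = std::fopen("big.json", "rb");
    if (!fp) return 1;

    char buffer[65536];                        // 64 KiB read buffer
    FileReadStream is(fp, buffer, sizeof(buffer));

    CountingHandler handler;
    Reader reader;
    reader.Parse(is, handler);                 // streams the whole file through the handler

    std::fclose(fp);
    std::printf("events: %zu\n", handler.events);
    return 0;
}
```

Since parsing is streaming already, I suspect the cost is the repeated full scan itself, which is why I'm wondering about restructuring the data rather than tuning the parser.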