I created a local web application to display and interact with data that I have in a big JSON file (around 250 MB). I have several functions that display the data in different ways, but currently each one starts by reading and parsing the JSON file:
$string = file_get_contents("myFile.json");
$json_a = json_decode($string, true);
The problem is that this is quite slow (about 4 seconds) because the file is big. So I would like to parse the JSON file once and for all and store the parsed data in memory, so that each function can use it:
session_start();
$string = file_get_contents("myFile.json");
$_SESSION['json_a'] = json_decode($string, true);
and then use $_SESSION['json_a'] in my functions. But I get an error when a function accesses this $_SESSION['json_a'] variable:
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 195799278 bytes)
I assume it is because the file is too big, but why does it crash when the variable is used, and not when it is built? And why does my first solution (parsing on every call) work if the data is too big for memory?
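For completeness: I could presumably just raise the limit before decoding, but that feels like a workaround rather than a real fix (a sketch, assuming my host allows overriding memory_limit with ini_set; 512M is an arbitrary value I picked):

```php
<?php
// Raise the per-request memory limit before decoding (value is a guess).
ini_set('memory_limit', '512M');

$string = file_get_contents("myFile.json");
$json_a = json_decode($string, true);
```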
And finally, my real question: how can I optimise this? (I know an SQL database would be much better, but I happen to have JSON data.)
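The best I have managed so far is caching the parsed array in a static variable, which at least avoids re-parsing when several functions run in the same request (a sketch; get_json_data is just a helper name I made up, and this obviously does not persist anything across requests):

```php
<?php
// Parse the file at most once per request; later calls reuse the cached array.
function get_json_data(): array
{
    static $data = null;
    if ($data === null) {
        $data = json_decode(file_get_contents("myFile.json"), true);
    }
    return $data;
}

// Each display function then works on the same parsed array:
$json_a = get_json_data();
```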