I am currently working with a server that holds many JSON cache files which need to be read each time the user requests more data. The files are ordered most-recent-first. Here is how the whole system should work: I need to open several files (say 3) at the same time, compare the timestamps of their items, then build a JSON response and send it to JavaScript to be processed and printed to the page.

The dilemma I am having is that after I fetch data the first time, second time, and so on, I need to save the last position I read in each file, so I can continue loading the next most recent data when the user scrolls to the bottom of the content already loaded. The method I have come up with is saving each location (a number) in the user's session. The problem is that the user's next request still has to decode the entire set of JSON files just to continue where they left off. This seems very inefficient and will only get worse as the number of JSON files grows.
For clarification, here are the steps I came up with:
PHP section:
- decode each JSON file (having to reload every file on every request feels very inefficient)
- loop until 20 items are collected: compare the times of the current item in each file, add the most recent one to an array (or some other structure), and increment that file's position counter
- save the current positions and file names in session variables (reading could run over from filea1.JSON into filea2.JSON)
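The merge step above could be sketched roughly like this. This assumes each cache file decodes to a newest-first array of items with a `time` field; the function name, the field name, and the session key are all assumptions, not working code from my project:

```php
<?php
// Merge the next $limit most-recent items from several decoded,
// newest-first JSON arrays, starting from the saved $offsets.
function mergePage(array $decoded, array $offsets, int $limit): array
{
    $page = [];
    while (count($page) < $limit) {
        $best = null;
        foreach ($decoded as $i => $items) {
            $item = $items[$offsets[$i]] ?? null;
            // Keep the file whose next unread item is the most recent.
            if ($item !== null &&
                ($best === null ||
                 $item['time'] > $decoded[$best][$offsets[$best]]['time'])) {
                $best = $i;
            }
        }
        if ($best === null) {
            break; // every file is out of data
        }
        $page[] = $decoded[$best][$offsets[$best]++];
    }
    return ['page' => $page, 'offsets' => $offsets];
}

// Usage sketch: $offsets would come from $_SESSION before the call and
// be written back afterwards, so the next request resumes where this
// one stopped.
```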
JavaScript:
- parse the JSON and print all of the data to the page
- if the user scrolls to the bottom of the data, make another PHP call
Repeat until the files run out of data or the user leaves the page.
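The JavaScript side might look something like the sketch below. The `/more.php` endpoint name is hypothetical, and I am assuming it returns the next page as a JSON array (empty when the files are exhausted):

```javascript
// Decide whether the user has scrolled close enough to the bottom;
// the 200px margin is an arbitrary assumption.
function isNearBottom(viewportHeight, scrollY, pageHeight, margin = 200) {
  return viewportHeight + scrollY >= pageHeight - margin;
}

let loading = false;

if (typeof window !== 'undefined') { // browser-only wiring
  window.addEventListener('scroll', () => {
    if (loading || !isNearBottom(window.innerHeight, window.scrollY,
                                 document.body.offsetHeight)) {
      return;
    }
    loading = true; // block duplicate requests while one is in flight
    fetch('/more.php')
      .then((res) => res.json())
      .then((items) => {
        for (const item of items) {
          const div = document.createElement('div');
          div.textContent = JSON.stringify(item); // print each item
          document.body.appendChild(div);
        }
        // An empty page means the files are out of data; leave
        // `loading` set so no further requests are made.
        if (items.length > 0) loading = false;
      });
  });
}
```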
The loading of the files is the part I need to make efficient; the rest of the steps are straightforward for me. The best idea I can come up with is to keep the decoded JSON persistent in PHP, so I do not need to load all the data again on every request. Either that, or I need a way to load the data continuing from where I left off.
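For the persistence idea, one direction I have seen is a shared in-memory cache such as APCu, so the decoded arrays survive between requests. This is only a sketch of that idea, and the cache key prefix and the 300-second TTL are assumptions:

```php
<?php
// Return the decoded contents of a JSON cache file, decoding it only
// when it is not already in the APCu shared cache.
function loadCacheFile(string $file): array
{
    $key = 'jsoncache:' . $file;      // hypothetical key scheme
    $data = apcu_fetch($key, $hit);
    if (!$hit) {
        $data = json_decode(file_get_contents($file), true);
        apcu_store($key, $data, 300); // assumed 5-minute TTL
    }
    return $data;
}
```

This requires the APCu extension to be installed and enabled, which may not be available on every host, so I am not sure it fits my setup.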
I could be completely wrong about the data needing to persist, but I am relatively new to PHP and don't know many of its tricks.