I am creating a PHP script that parses a string of text (typically 500-750 characters) using regular expressions to find 5-6 substrings. The extracted info is manipulated a bit and then stored in a database. Once implemented, the script will be executed every few seconds on average throughout the day. Will an average web server be able to handle this level of usage? I'm not sure how memory-intensive this type of script would be at the frequency it's executed.
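For scale, the extraction step might look something like the sketch below. The field names and patterns (`name`, `amount`, `date`) are hypothetical stand-ins for whatever the real text contains; a handful of `preg_match` calls on a string this short is very cheap.

```php
<?php
// Minimal sketch of the regex-extraction step. The patterns and field
// names here are hypothetical examples, not the real format.
function extractFields(string $text): array
{
    $patterns = [
        'name'   => '/Name:\s*([A-Za-z ]+?)\s*$/m',
        'date'   => '/Date:\s*(\d{4}-\d{2}-\d{2})/',
        'amount' => '/Amount:\s*([\d.]+)/',
    ];

    $fields = [];
    foreach ($patterns as $key => $pattern) {
        if (preg_match($pattern, $text, $m)) {
            $fields[$key] = trim($m[1]);
        }
    }
    return $fields;
}

$sample = "Name: Jane Doe\nDate: 2024-01-15\nAmount: 42.50";
print_r(extractFields($sample));
```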
I was thinking through possible ways of reducing the load on the server and came up with an idea, though I'm not sure it's any better. Instead of parsing the raw text the instant it's received, I could store it in a database to be parsed later. Throughout the day, the server could then process the backlog in manageable batches during periods of low site traffic. If this is a reasonable solution, would the algorithm below be an okay way to approach it?
- Select a few text records which have yet to be parsed from a database
- Extract the info from the text and add it to the database
- Update the text records to indicate they've been parsed
- Have the script reload itself using `<meta http-equiv="refresh">`
- Repeat x times
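The steps above can be sketched roughly as follows. The table and column names (`raw_texts`, `parsed_fields`, `parsed`) and the extraction regexes are hypothetical, and an in-memory SQLite database is used so the example is self-contained; in practice you'd connect PDO to your real database.

```php
<?php
// One batch pass over stored raw text, following the steps in the list.
// Schema, names, and patterns are hypothetical placeholders.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE raw_texts (id INTEGER PRIMARY KEY, body TEXT, parsed INTEGER DEFAULT 0)');
$pdo->exec('CREATE TABLE parsed_fields (raw_id INTEGER, field TEXT, value TEXT)');
$pdo->exec("INSERT INTO raw_texts (body) VALUES ('Name: Jane Doe Amount: 42.50')");

$batchSize = 50; // tune so each run stays cheap

// 1. Select a few records that have yet to be parsed
$rows = $pdo->query("SELECT id, body FROM raw_texts WHERE parsed = 0 LIMIT $batchSize")
            ->fetchAll(PDO::FETCH_ASSOC);

$insert = $pdo->prepare('INSERT INTO parsed_fields (raw_id, field, value) VALUES (?, ?, ?)');
$mark   = $pdo->prepare('UPDATE raw_texts SET parsed = 1 WHERE id = ?');

foreach ($rows as $row) {
    // 2. Extract info from the text and add it to the database
    if (preg_match('/Name:\s*([A-Za-z ]+?)\s+Amount:/', $row['body'], $m)) {
        $insert->execute([$row['id'], 'name', $m[1]]);
    }
    if (preg_match('/Amount:\s*([\d.]+)/', $row['body'], $m)) {
        $insert->execute([$row['id'], 'amount', $m[1]]);
    }
    // 3. Mark the record as parsed
    $mark->execute([$row['id']]);
}
```

One design note: instead of reloading the page with `<meta http-equiv="refresh">` (which only runs while a browser is open), a script like this is usually run from cron on a schedule, so the batches keep processing with no client involved.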
Any advice on how to approach this would be appreciated, thanks!