I have a column named 'other' in 'table_one' (MySQL) that contains a JSON string.
The table has millions of records.
I want to (A) iterate through 'table_one' and read the JSON out of 'other', then (B) INSERT each $value into its corresponding $key column in 'table_two' -- something like foreach ($rows as $row), I assume (see the sketch after the sample JSON below).
I've already created the columns in 'table_two'.
I'd rather not (I think) export a static file (/path/some.json) from 'table_one' first, as that seems inefficient compared to reading straight from the database.
Some values in the JSON will be null, and some key/value pairs will be missing entirely. So picking up the $keys and $values "dynamically" -- without erroring out when they're absent -- is critical.
Sample of JSON in 'other':
{"firstName": "Bob", "lastName": "Jones", "email": "bob.jones@yahoo.com", "address": "7206 maplehurst drive", "city": "PORT RICHEY", "state": "FL", "zip": "34668", "ipAddress": "208.54.85.233", "gender": "M", "employer_name": "bobs auto", "months_address": "2", "years_address": "12"}
You'd (I'd) think by now there would be standard functionality to address this; if there is, my apologies for having missed it.
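For what it's worth, the closest thing I've found to "standard functionality" is MySQL's native JSON functions (5.7+), as in the sketch below -- though it still hard-codes every key, and I'm not sure how JSON nulls come through (they might need a NULLIF(..., 'null')):

```sql
-- Assuming MySQL 5.7+: JSON_EXTRACT works on a string column holding
-- valid JSON, and a missing key simply comes back as SQL NULL.
INSERT INTO table_two (firstName, lastName, email /* ...all 45 columns... */)
SELECT
    JSON_UNQUOTE(JSON_EXTRACT(other, '$.firstName')),
    JSON_UNQUOTE(JSON_EXTRACT(other, '$.lastName')),
    JSON_UNQUOTE(JSON_EXTRACT(other, '$.email'))
    -- ...one JSON_EXTRACT per key...
FROM table_one;
```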
Currently, I'm using an ETL tool to perform a simple but monstrous trim/split/map process. Each JSON string has 45 pairs, and any slight tweak to the table structure requires re-mapping the whole thing.
*Comfortable with PHP, MySQL, JavaScript, jQuery.