Here is your problem:
foreach ($json['items'] as $row) {
    ...
    $videoId  = $row['id']['videoId'];
    $videoId1 = $row['id']['videoId'];
    $videoId2 = $row['id']['videoId'];
    $sql = "INSERT INTO users(channelid, videoId, videoId1, videoId3) VALUES('$channelid', '$videoId', '$videoId1', '$videoId2')
            ON DUPLICATE KEY UPDATE channelid='$channelid', videoId='$videoId', videoId1='$videoId1', videoId2='$videoId2';
    ";
    ...
}
...
}
Basically, here is what happens with your JSON data (I'll focus just on the `VALUES` part):
//iteration 1
VALUES('UCiP20o92_7jUYwj-ulVLifw', 'kHPqpxDE50c', 'kHPqpxDE50c', 'kHPqpxDE50c')
//iteration 2
VALUES('UCiP20o92_7jUYwj-ulVLifw', 'DbAlCl3kk-M', 'DbAlCl3kk-M', 'DbAlCl3kk-M')
//iteration 3
VALUES('UCiP20o92_7jUYwj-ulVLifw', '7EDPROQa4i0', '7EDPROQa4i0', '7EDPROQa4i0')
Now think about what happens. Because of how your table is set up (improperly, I might add), I'd be willing to bet that `channelid` is unique. So each iteration overwrites the IDs from the previous one, and you end up with three copies of the last videoId, `7EDPROQa4i0`.
Which, by the way, is exactly what you see happen.
To fix it, you have to insert/update after the JSON loop, by building an array of video IDs that you can insert in one query. You don't need one query per iteration of the JSON loop (the way the table is currently set up).
Like this:
mysqli_query($conn, "SET NAMES utf8"); // set the connection charset once, not per row

$loop = mysqli_query($conn, "SELECT channelid, videoId FROM users ORDER BY id")
    or die(mysqli_error($conn));

while ($row = mysqli_fetch_array($loop)) {
    $channelid = $row['channelid'];
    $url = 'https://www.googleapis.com/youtube/v3/search?order=date&part=snippet&channelId=' . $channelid . '&maxResults=3&key=KEY';
    $content = file_get_contents($url);
    $json = json_decode($content, true);
    if (!isset($json['items'])) continue; // skip if no items

    $videos = ['videoId' => '', 'videoId1' => '', 'videoId2' => ''];
    $i = 0;
    foreach ($json['items'] as $items) {
        $key = 0 == $i ? 'videoId' : 'videoId' . $i;
        $videos[$key] = $items['id']['videoId'];
        ++$i;
    }

    // Note: your original column list said videoId3, which doesn't exist; it's videoId2
    $sql = "INSERT INTO users(channelid, videoId, videoId1, videoId2) VALUES('$channelid', '{$videos['videoId']}', '{$videos['videoId1']}', '{$videos['videoId2']}')
            ON DUPLICATE KEY UPDATE channelid=VALUES(channelid), videoId=VALUES(videoId), videoId1=VALUES(videoId1), videoId2=VALUES(videoId2);
    ";
    if (!mysqli_query($conn, $sql)) {
        die('Error: ' . mysqli_error($conn));
    }
}
That takes care of the immediate issue. However, there is a much bigger issue here: how your table is set up and how you are using it. The way you have it, you are limited to only three videos, and even if that is all you need now, at some point you might want more. Everyone always wants more, it's a given. Going to four or five or six would require adding more columns, which is bad.
Deeper Issues
A better way is to use one row per video. So how do we do that?
The first thing to do is export this table (so you have a backup).
Then change the index on `channelid` to a compound (composite) unique index covering two fields, `channelid` and `videoId`. Both of these have to match for a row to be considered a duplicate.
Next, add a simple (non-unique) index back on `channelid`. MySQL can use the leftmost prefix of a compound index, so a query filtering on `channelid` alone will generally still use the compound one, but a dedicated index makes the intent clear, and you can always run `EXPLAIN` to verify which index is used. Either way, you want `channelid` indexed, because we are going to use it for lookups, and indexing lookup columns is good.
Next, get rid of `videoId1` and `videoId2` completely. If they hold data you don't want to lose, you can write a migration script that pulls the values out of those two fields and creates a new row for each.
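The schema change and migration above might look something like this in SQL. This is a sketch: the index names are illustrative, the old index name on `channelid` is a guess (check `SHOW INDEX FROM users` first), and it assumes your table really is called `users` with those columns.

```sql
-- Back up first! Then:

-- 1) Drop the old unique index on channelid (name is a guess)
ALTER TABLE users DROP INDEX channelid;

-- 2) Add the compound unique index on (channelid, videoId),
--    plus a plain index on channelid for lookups
ALTER TABLE users
    ADD UNIQUE INDEX uq_channel_video (channelid, videoId),
    ADD INDEX idx_channel (channelid);

-- 3) Migrate the extra columns into their own rows
--    (INSERT IGNORE skips rows that would violate the new unique index)
INSERT IGNORE INTO users (channelid, videoId)
    SELECT channelid, videoId1 FROM users WHERE videoId1 <> '';
INSERT IGNORE INTO users (channelid, videoId)
    SELECT channelid, videoId2 FROM users WHERE videoId2 <> '';

-- 4) Drop the now-redundant columns
ALTER TABLE users DROP COLUMN videoId1, DROP COLUMN videoId2;
```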
Then onto the code.
mysqli_query($conn, "SET NAMES utf8"); // once, before the loop

// With DISTINCT, MySQL requires the ORDER BY column to be in the select list,
// so we order by channelid instead of id
$loop = mysqli_query($conn, "SELECT DISTINCT channelid FROM users ORDER BY channelid")
    or die(mysqli_error($conn));

while ($row = mysqli_fetch_array($loop)) {
    $channelid = $row['channelid'];
    $url = 'https://www.googleapis.com/youtube/v3/search?order=date&part=snippet&channelId=' . $channelid . '&maxResults=3&key=KEY';
    $content = file_get_contents($url);
    $json = json_decode($content, true);
    if (!isset($json['items'])) continue; // skip if no items

    foreach ($json['items'] as $items) {
        $videoId = $items['id']['videoId'];
        $sql = "INSERT INTO users(channelid, videoId) VALUES('$channelid', '$videoId')
                ON DUPLICATE KEY UPDATE channelid=VALUES(channelid), videoId=VALUES(videoId);
        ";
        if (!mysqli_query($conn, $sql)) {
            die('Error: ' . mysqli_error($conn));
        }
    }
}
This is much closer to your original, but we select only one field in the first query and use `DISTINCT`. The rest should be self-explanatory.
When you pull the data back out, you'll just have to loop through the results, as each row will be one video. It will make things simpler in the long run, promise.
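For example, reading all the videos for one channel back out might look like this (a sketch; it assumes the same `$conn` mysqli connection, and the channel ID is just the one from your sample data):

```php
// Fetch every video row for a single channel; one row per video.
$channelid = 'UCiP20o92_7jUYwj-ulVLifw';
$result = mysqli_query(
    $conn,
    "SELECT videoId FROM users WHERE channelid = '" . mysqli_real_escape_string($conn, $channelid) . "'"
) or die(mysqli_error($conn));

$videoIds = [];
while ($row = mysqli_fetch_assoc($result)) {
    $videoIds[] = $row['videoId'];
}
// $videoIds now holds however many videos that channel has -- no column limit.
```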
Other things
You also have SQL injection issues. It's less of a risk here, since the data comes from the YouTube API rather than from users, but why risk it? Any time you concatenate data into a query, you open the door to SQL injection. Even if the data is 100% safe, you lose nothing by always preparing your queries. There are plenty of tutorials on how to do this.
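As an illustration, the per-video insert could be written with a prepared statement (a sketch using mysqli; PDO works just as well):

```php
// Prepare once, before the loops; placeholders replace the concatenated values.
$stmt = mysqli_prepare(
    $conn,
    "INSERT INTO users(channelid, videoId) VALUES (?, ?)
     ON DUPLICATE KEY UPDATE videoId = VALUES(videoId)"
) or die(mysqli_error($conn));

// Then, inside the JSON loop, bind the current values and execute:
mysqli_stmt_bind_param($stmt, 'ss', $channelid, $videoId);
mysqli_stmt_execute($stmt) or die(mysqli_stmt_error($stmt));
```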
You can use `VALUES(column)` in the `ON DUPLICATE KEY UPDATE` part, which I did. This saves us from having to repeat the variable (which shouldn't be in the query at all), and even once you prepare the query, it keeps things simpler.
Naming conventions, such as `channelid` vs `videoId` vs `channelId`: it may seem like not a big deal, but it really is. While everyone has their own way of doing it, the most important part is "be consistent". I prefer no caps, underscores for spaces, and plural table names, so for me these would be `channel_id` and `video_id`. By knowing this and being consistent, I don't have to look back at the schema of a table I made a year ago and wonder if it's `channelid`, `ChannelId`, `channelID`, `channelId`, or `channel_id`. As you can see, just looking there, the cased versions allow more variation than the lowercase ones (4+ vs 2). Spaces in column names are a non-starter in MySQL, as you'd have to escape them with backticks, so that's never a question.
You were overwriting the `$row` variable in the inner loop (the one for the JSON), so I changed that to `$items` instead. You also don't need the `channelid` from the JSON, because you already have it in the query (and the URL).
Enjoy!