I am creating a simple web service that accepts a video upload, runs multiple different encodes on the video (mp4, webm, ogv), and then uploads the newly created files to our video host.
Let's say I have multiple commands:
shell_exec('ffmpeg -i input.mp4 -f mp4 -c:v libx264 -preset slow -crf 24 -s 1280x720 -c:a libfdk_aac -profile:a aac_he -ar 22050 -b:a 64k -movflags +faststart output-1280x720.mp4');
shell_exec('ffmpeg -i input.mp4 -f mp4 -c:v libx264 -preset slow -crf 24 -s 1920x1080 -c:a libfdk_aac -profile:a aac_he -ar 22050 -b:a 64k -movflags +faststart output-1920x1080.mp4');
shell_exec('ffmpeg -i input.mp4 -f ogg -c:v libtheora -q:v 5 -s 1280x720 -c:a libvorbis -ar 22050 -b:a 64k output-1280x720.ogv');
shell_exec('ffmpeg -i input.mp4 -f ogg -c:v libtheora -q:v 5 -s 1920x1080 -c:a libvorbis -ar 22050 -b:a 64k output-1920x1080.ogv');
In summary, I want to...
- Print an immediate response: {success: true}
- Kick off the ffmpeg jobs in the background, running them sequentially (one at a time, not in parallel).
- After each job completes, send a POST to another server (one POST for each shell_exec).
It would also be nice to only send the POST if the job was successful, but I could easily work around that by checking whether the output file exists on the server.
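Here is a sketch of the approach I'm imagining, in case it helps clarify the requirements. The upload handler responds immediately and hands the queue to a single background worker; the worker runs the encodes one at a time with exec() (which, unlike shell_exec, exposes the exit code) and only POSTs on success. The worker filename and callback URL are placeholders:

```php
<?php
// upload.php — respond immediately, then launch a detached worker.
echo json_encode(['success' => true]);
shell_exec('php worker.php > /dev/null 2>&1 &');
```

```php
<?php
// worker.php — hypothetical background worker (name and URL are assumptions).
// Maps each encode command to its expected output file.
$jobs = [
    'ffmpeg -i input.mp4 /* full flags as above */ output-1280x720.mp4' => 'output-1280x720.mp4',
    // ...one entry per encode
];

foreach ($jobs as $cmd => $outFile) {
    // exec() blocks until ffmpeg exits, so the jobs run sequentially,
    // and $exitCode tells us whether the encode actually succeeded.
    exec($cmd . ' 2>&1', $output, $exitCode);

    if ($exitCode === 0 && file_exists($outFile)) {
        // Notify the other server; URL is a placeholder.
        $ch = curl_init('https://example.com/encode-complete');
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, ['file' => $outFile]);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);
        curl_close($ch);
    }
}
```

But I'm not sure this is the cleanest way to structure it, hence the question below.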
I know that I can force shell_exec to run in the background by appending >/dev/null 2>/dev/null & to each command, which would let me print a response immediately. But that approach has two problems: all of the jobs would run in parallel, and since the output is discarded, I get no real callback when a job completes.
Any ideas??