I'd like to use Jenkins for my Laravel project, and I use a Pipeline for that. As you know, Laravel doesn't need a build step, and I don't run tests either, so I just want a single stage('deploy').
When a push is made to the repository, the Jenkins host gets notified, pulls the project, and runs the Jenkins Pipeline that resides in the Laravel project. Since my Jenkins host and my Laravel API host are different machines, this is where I'm facing issues.
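For reference, the Jenkinsfile I have in mind is nothing more than a deploy-only declarative Pipeline; a minimal sketch (the deploy step itself is the open question below):

```groovy
pipeline {
    agent any
    stages {
        stage('deploy') {
            steps {
                // placeholder: how to deploy to the separate Laravel API host?
                echo 'deploying to the Laravel API host...'
            }
        }
    }
}
```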
As I said, I don't have build and test stages, only deploy in the pipeline. But to run the Laravel project, before using Jenkins I had a bash script of about 10 lines (changing permissions, setting the file owner to my user, running composer install, running docker-compose, and so on). Since the Laravel project lives on another host, I have two options:
1) In stage('deploy'), I can transfer the build.sh file from the Jenkins host to the Laravel API host, then SSH into the Laravel API host and run the file there. What I don't like about this is: what if setting permissions or some other step in build.sh fails halfway through? That would leave my project in an undesired state, I wouldn't even get notified, and I'd have a broken project in production.
2) I can do all the work in a stage('build'): run composer install and the other steps, then build a Docker image from the result (including the vendor folder), push that image to Docker Hub, and in stage('deploy') notify the remote host and pass it a script that pulls the latest image from Docker Hub and runs it. This way I won't end up in an undesired state, since the image already contains the vendor folder, already has the right permissions, and so on. The issue with this is that building an image for every push to the repository will eat a lot of disk space.
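For what it's worth, the "I won't even get notified" part of option 1 seems avoidable: the `sh` step fails the stage on any non-zero exit code, so if build.sh is run under `bash -euo pipefail`, the first failing line aborts the script and the SSH exit code marks the Jenkins build as failed. A minimal sketch of that stage (the user, host, and paths are hypothetical placeholders):

```groovy
stage('deploy') {
    steps {
        // copy the script to the API host, then run it so that any
        // failing line aborts it and the non-zero exit code is
        // propagated back through ssh, failing this stage
        sh '''
            scp build.sh deploy@laravel-api-host:/tmp/build.sh
            ssh deploy@laravel-api-host 'bash -euo pipefail /tmp/build.sh'
        '''
    }
}
```

Note this only fixes the notification problem, not the half-applied state: if line 6 of build.sh fails, the effects of lines 1-5 remain on the host.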
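Option 2 could look roughly like the sketch below; the image name, SSH target, and restart command are placeholder assumptions. On disk usage: Docker stores images as layers and unchanged layers are shared between successive builds, so per-push images cost less than a full image each, and a `docker image prune -f` after each deploy discards old unreferenced ones.

```groovy
stage('build') {
    steps {
        // bake vendor/ and permissions into the image, tagged per commit
        sh '''
            composer install --no-dev
            docker build -t myuser/laravel-api:$GIT_COMMIT .
            docker push myuser/laravel-api:$GIT_COMMIT
        '''
    }
}
stage('deploy') {
    steps {
        // pull the prebuilt image on the API host, restart, clean up old images;
        // $GIT_COMMIT is expanded on the Jenkins side before the ssh command runs
        sh '''
            ssh deploy@laravel-api-host "
                docker pull myuser/laravel-api:$GIT_COMMIT &&
                docker compose up -d &&
                docker image prune -f
            "
        '''
    }
}
```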
What do you suggest I should do?