dongpaipu8394 2018-04-29 03:00

Can a Docker image be "built" ("compiled") so that it doesn't need Docker to run?

I have a very small amount of experience with Docker for experimentation and development, and zero experience with Docker when it comes to staging and deployment - so forgive anything that sounds naive.

The main question

Suppose I have a Docker image (or even a docker-compose.yml file consisting of several images and services) which, when run, sets up the environment for my app and runs my app - allowing for connections on a publicly open port and responding to requests.

In order to run this image in production (and therefore in order to run my app in production) the production server must have Docker installed. This feels like a violation of The Twelve-Factor App design. Particularly when you consider the Port Binding tenet:

The twelve-factor app is completely self-contained

Just as an app should not rely on Apache or nginx to be installed, should an app also not rely on Docker to be installed?

This led me to wonder if there were a way to "package", "build", or otherwise "compile" the Docker runtime and the image into an executable binary. Something that could be deployed to any server and run as a single process without the need to install Docker first.

Now, it's possible I'm just thinking about this entirely wrong. For that reason, I've detailed the specifics of the concerns and issues I'm having below.

What brought this up

I have a web application project that I have previously been developing using Cloud9. When I push this project to production, I manually log into the production server via SSH and perform git pull, composer update, npm install, and gulp. A bit of a hassle, but for the very small scale I'm working at this has been sufficient, and it's a hell of a lot better than uploading all of my dependencies via FTP.
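Those manual steps could at least be collected into one script, which is roughly what a "deploy" phase would automate. A minimal sketch (the `run` wrapper and `DRY_RUN` flag are my own additions for safety; the steps themselves are the ones from the post):

```shell
#!/bin/sh
# Sketch of a deploy script collecting the manual steps described above.
set -e  # stop on the first failing step

# run: execute a step, or just print it when DRY_RUN is set.
run() {
  if [ -n "$DRY_RUN" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# deploy: the manual steps from the post, in order.
deploy() {
  run git pull
  run composer update
  run npm install
  run gulp
}
```

With `DRY_RUN=1 deploy` you can preview the steps without touching the server; without it, the script performs the same sequence you'd type over SSH.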

I've occasionally run into challenges with external dependencies, however. Something works fine in development, then when I push it to production I realize the production server has an outdated version of MySQL. Or the version of pngquant installed on the production server has a bug. Or the nginx config on the server doesn't match the nginx config in development exactly, and it's causing some edge case when routing malformed requests.

All of these problems hit at once today when I tried to load up my project in CodeAnywhere instead of Cloud9. I had to ensure:

  • The PHP version was updated
  • NodeJS was updated
  • NPM was updated
  • cURL was installed
  • All of the required PHP extensions were installed
  • Several GNU libraries were installed
  • etc

I spent hours trying to get this code running -- and it's code I wrote.

Having all of these problems reminded me of The Twelve-Factor App design. So I hopped over to the website and did some thinking to figure out what I was doing wrong.

Note: I don't just develop solo and then deploy to production directly. I actually have this project set up in BitBucket, I use a ticketing system to track changes, a branch is created per ticket, and branches are checked out in a staging environment before being merged into master. So I've created a pretty robust system for managing changes to avoid bugs slipping into production and to allow for agile development. However, when it comes to checking out a branch in staging or production it's the same manual crap: git pull, composer update, npm install, gulp.

What I like about Docker

The ability to define my working environment in a source-controlled config file would eliminate the bulk of my issues. Never again would I need to ensure PHP was up to date, ensure NodeJS was up to date, ensure cURL was installed, etc. If the Docker image has all of the dependencies, then it will still have those dependencies when deployed in staging or production. Consistency of environment between all stages of development would make my life a lot easier.
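For illustration, a source-controlled environment definition for a stack like this might look something like the following. This is a hypothetical sketch, not a tested Dockerfile for this project; the base image tag, package names, and extension list are all assumptions:

```dockerfile
# Hypothetical sketch of a Dockerfile pinning the PHP + Node toolchain
# described above. Versions and package names are assumptions.
FROM php:7.2-apache

# System packages and PHP extensions the app needs
RUN apt-get update \
    && apt-get install -y curl git pngquant nodejs npm \
    && docker-php-ext-install pdo_mysql

# Composer for PHP dependencies
COPY --from=composer:latest /usr/bin/composer /usr/bin/composer

WORKDIR /var/www/html
COPY . .

RUN composer install --no-dev \
    && npm install \
    && npx gulp
```

Because the file lives in version control next to the code, "ensure PHP is up to date" becomes "change one line and rebuild," and every stage runs the identical environment.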

Also, I haven't yet played around with anything this advanced, but I'm given to understand that it's easy with Docker to set up automated deployment. If I could click on a branch in BitBucket, then click "send to staging", and a minute later have it deployed and ready to test -- that would save me hours of time each week. If I could similarly have code automatically deployed to production when it was merged to master, that would not only save me time but would avoid the risk of finished features languishing in BitBucket and never getting in front of a client.
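BitBucket's built-in CI (Pipelines) can drive exactly this kind of flow. A hypothetical sketch of a `bitbucket-pipelines.yml` -- the image name and the deploy script are assumptions, not this project's real config:

```yaml
# Hypothetical bitbucket-pipelines.yml: build on merge to master,
# then hand off to a deploy step. Names below are placeholders.
image: php:7.2

pipelines:
  branches:
    master:
      - step:
          script:
            - composer install --no-dev
            - npm install
            - npx gulp
            - ./deploy-to-production.sh   # hypothetical deploy script
```

The point is that the merge itself triggers the build and deploy, so nothing sits finished-but-undeployed in BitBucket.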

Finally, and this may eventually be a moot point, I'm given to understand that Docker makes blue/green deployment much easier. Currently when I push a new change to production, the production server goes offline briefly. Usually only for 15-20 seconds, but once it was an entire hour. During this 15-20 second window I'm running composer update, npm install, and gulp. The first two commands usually don't need to do anything (since my dependencies don't change often) and gulp usually completes within 15 seconds. However, when dependencies do change or when there are larger issues (like needing to upgrade MySQL), the site can go down for an entire hour. If I could slowly and calmly deploy to a secondary production server, then flip the switch in milliseconds once I verified it was working correctly, this would mean less downtime and happier customers.
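The "flip the switch" part doesn't strictly require Docker; with a reverse proxy it can be a one-line config change. A hypothetical nginx sketch (ports and names are made up for illustration):

```nginx
# Hypothetical blue/green switch: two copies of the app listen on
# different ports; cutting over is one edit plus `nginx -s reload`.
upstream app_live {
    server 127.0.0.1:8081;   # "blue" release currently serving traffic
    # server 127.0.0.1:8082; # "green" release; swap the comments and
                             # reload to cut over in milliseconds
}

server {
    listen 80;
    location / {
        proxy_pass http://app_live;
    }
}
```

Docker makes this easier mainly because starting the "green" copy is just running a second container from the new image, rather than re-provisioning a second server.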

Of course the last one may be a moot point because I'm currently not utilizing a "build" step (another part of the twelve-factor app), and all of these steps should be part of the "build" phase -- not the "deploy" phase.

What I don't like about Docker

It's yet one more tool to learn. In order to understand and develop for my app you already need to understand:

  • PHP
  • Composer
  • Symfony
  • Laravel
  • NodeJS
  • NPM
  • Gulp
  • Bootstrap
  • VueJS
  • (probably many other things I can't think of right now)

Adding "Docker" to that list just means this project gets that much harder to train someone on if I were to ever hand it off to another developer. I want fewer dependencies, not more.

Also, Docker doesn't come by default with any operating system that I'm aware of. So it's not like cURL where, while it's technically a third-party dependency, you can generally expect people to have it. Instead it's a whole beast that has to be installed separately.

The former issue I can't really circumvent. If I choose to use Docker, it means adding one more tool to my toolbox for this app. However the latter issue could be avoided if Docker images can somehow be compiled to stand-alone binaries.


2 answers

  • dqkxo44488 2018-10-12 08:08

    Docker images do indeed require Docker to be installed on the machine, but that's a much smaller issue than having to set up all the other dependencies you mentioned.

    While it's probably possible to create some kind of self-contained Docker image, it would likely be less portable than a regular Docker image, since the resulting binary would be OS-dependent.

    Also consider that if you use a cloud provider, they offer the ability to deploy Docker images "directly on the cloud" - that is, you don't have to manage the underlying servers yourself.

