Running `celery -A tasks worker --loglevel=info` fails with `from kombu.matcher import match` → ImportError: No module named 'kombu.matcher'

The error in the shell is as follows:

```
  File "/usr/local/lib/python3.5/dist-packages/celery/utils/imports.py", line 55, in instantiate
    return symbol_by_name(name)(*args, **kwargs)
  File "/home/tarena/.local/lib/python3.5/site-packages/kombu/utils/imports.py", line 56, in symbol_by_name
    module = imp(module_name, package=package, **kwargs)
  File "/usr/lib/python3.5/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 665, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "/usr/local/lib/python3.5/dist-packages/celery/app/control.py", line 12, in <module>
    from kombu.matcher import match
ImportError: No module named 'kombu.matcher'
```

Celery won't start at all — any help would be much appreciated.
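Note that the traceback loads celery from `/usr/local/lib/python3.5/dist-packages` but kombu from `/home/tarena/.local/lib/python3.5/site-packages` — two separate installations. `kombu.matcher` only exists in newer kombu releases, so a Celery that expects it paired with an older kombu fails exactly like this. A small diagnostic sketch (standard library only) to see which copies actually get imported:

```python
# Print where celery/kombu would be imported from and whether kombu.matcher
# exists; mismatched install locations usually mean a version conflict.
import importlib.util

def where(name):
    """Return the file a module would be imported from, or None if absent."""
    try:
        spec = importlib.util.find_spec(name)
    except ModuleNotFoundError:  # the parent package itself is missing
        spec = None
    return spec.origin if spec else None

for name in ("celery", "kombu", "kombu.matcher"):
    print(name, "->", where(name))
```

If `kombu.matcher` prints `None`, the usual fix is upgrading kombu (e.g. `pip install -U kombu`) in the *same* environment that owns the celery install, or pinning celery and kombu to mutually compatible versions.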

1 answer

Other related questions
Error when starting a Celery worker

The error is as follows:

```
Traceback (most recent call last):
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/app/utils.py", line 228, in find_app
    found = sym.app
AttributeError: module 'celeryDemo' has no attribute 'app'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/xuxiaolong/anaconda3/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/bin/celery.py", line 80, in main
    cmd.execute_from_commandline(argv)
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/bin/celery.py", line 723, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/bin/base.py", line 301, in execute_from_commandline
    argv = self.setup_app_from_commandline(argv)
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/bin/base.py", line 431, in setup_app_from_commandline
    self.app = self.find_app(app)
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/bin/base.py", line 451, in find_app
    return find_app(app, symbol_by_name=self.symbol_by_name)
  File "/home/xuxiaolong/anaconda3/lib/python3.6/site-packages/celery/app/utils.py", line 233, in find_app
    found = sym.celery
AttributeError: module 'celeryDemo' has no attribute 'celery'
```

Directory layout:

```
celeryDemo
--__init__.py
--tasks.py
--CeleryConf.py
--config.py
```

**config.py:**

```
from __future__ import absolute_import

CELERY_RESULT_BACKEND = 'redis://127.0.0.1:6379/5'
BROKER_URL = 'redis://127.0.0.1:6379/6'
```

**celeryconf.py:**

```
from __future__ import absolute_import
from celery import Celery

app = Celery('celeryDemo', include=['celeryDemo.tasks'])
app.config_from_object('celeryDemo.config')

if __name__ == '__main__':
    app.start()
```

**tasks.py:**

```
from __future__ import absolute_import
from celeryDemo.celeryConf import app
# from celeryDemo.scrapyLijia import scrapyProcess

@app.task
def add(x, y):
    return x + y

"""
@app.task
def spider(region):
    q = []
    process = scrapyProcess(region, q, "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:31.0) Gecko/20100101 Firefox/31.0", None, 5)
    process.start()
"""
```

Could someone take a look? I've been searching for ages and can't figure out what's wrong.
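The traceback shows what `celery -A celeryDemo` does: it imports the `celeryDemo` package and then looks for an attribute named `app`, then `celery`, on it — and both lookups fail because the app object lives in the `celeryConf` module, not on the package. Either point `-A` at the module directly (`celery -A celeryDemo.celeryConf worker`) or re-export the app from the package. A sketch of the latter (assuming the module name `celeryConf`, matching the import in tasks.py; note the listing shows `CeleryConf.py` while tasks.py imports `celeryDemo.celeryConf` — on a case-sensitive filesystem that mismatch alone breaks the import):

```python
# celeryDemo/__init__.py — re-export the Celery app so that
# `celery -A celeryDemo worker` can find `celeryDemo.app`.
from __future__ import absolute_import
from .celeryConf import app  # noqa: F401  (exposes `app` on the package)
```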

Running celery from cmd gives a permission error

I put together a small Celery test, but running `celery -A test1 worker --loglevel=info` fails. ![screenshot](https://img-ask.csdn.net/upload/201711/08/1510104156_348863.png) ![screenshot](https://img-ask.csdn.net/upload/201711/08/1510104241_549134.png)

Trouble getting started with Celery: the server hangs at task.delay() and never seems to enter the task

As a Celery beginner, I'm trying to add asynchronous tasks to my simple Flask web project. After installing and starting Celery, Redis, and the related pip packages, I got stuck at `task.delay()` right at the start. Very frustrating — can anyone help?

The Flask project has app.py and tasks.py. (At first everything lived in app.py; I then split the Celery object out, but it didn't help.)

tasks.py:

```
from celery import Celery

celery = Celery('app',
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/1')

@celery.task
def modeling_task():
    print('yes')
```

app.py:

```
@app.route('/train', methods=['GET', 'POST'])
def train():
    if request.method == "GET":
        return render_template('train.html')
    else:  # when request is POST
        # check if the post request has the file part
        if 'file' not in request.files:
            return jsonify({"code": 500, "status": 'No file is uploaded!'})
        if file and allowed_file(file.filename):
            print(request.files['file'].filename)
            print('before task')
            task = modeling_task.delay()
            print('after task')
            return jsonify({"code": 200, "status": 'model training', "task_id": task.id})
        return 'Uploading file is wrong'
```

In theory, when the server receives the uploaded file, the console should print:

```
Car_TEST.csv
before task
yes
after task
```

But what it actually prints is:

```
 * Detected change in 'C:\\Users\\headmaster\\Desktop\\WEB\\tasks.py', reloading
 * Restarting with stat
 * Debugger is active!
 * Debugger PIN: 308-608-393
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
127.0.0.1 - - [22/Apr/2020 11:39:22] "GET /train HTTP/1.1" 200 -
Car_TEST.csv
before task
```

and then the server hangs there.

It looks as if `task = modeling_task.delay()` never executes. Or does it execute, and "yes" gets printed somewhere else? Why does it hang there?

Here is the output of `celery -A tasks report`:

```
software -> celery:4.4.2 (cliffs) kombu:4.6.8 py:3.7.4
            billiard:3.6.3.0 redis:3.4.1
platform -> system:Windows arch:64bit, WindowsPE
            kernel version:10 imp:CPython
loader   -> celery.loaders.app.AppLoader
settings -> transport:redis results:redis://localhost:6379/1

broker_url: 'redis://localhost:6379/0'
result_backend: 'redis://localhost:6379/1'
```

The worker's startup output:

```
(base) C:\Users\headmaster\Desktop\WEB>celery worker -A tasks.celery --loglevel=info

 -------------- celery@LAPTOP-KLKJCK2F v4.4.2 (cliffs)
--- ***** -----
-- ******* ---- Windows-10-10.0.18362-SP0 2020-04-22 10:55:13
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         app:0x2526977ac08
- ** ---------- .> transport:   redis://localhost:6379/0
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . tasks.modeling_task

[2020-04-22 10:55:15,501: INFO/SpawnPoolWorker-1] child process 6692 calling self.run()
[2020-04-22 10:55:15,511: INFO/SpawnPoolWorker-2] child process 17736 calling self.run()
[2020-04-22 10:55:15,518: INFO/SpawnPoolWorker-3] child process 16320 calling self.run()
[2020-04-22 10:55:15,571: INFO/SpawnPoolWorker-4] child process 5608 calling self.run()
[2020-04-22 10:55:15,972: INFO/MainProcess] Connected to redis://localhost:6379/0
[2020-04-22 10:55:20,332: INFO/MainProcess] mingle: searching for neighbors
[2020-04-22 10:55:27,369: INFO/MainProcess] mingle: all alone
[2020-04-22 10:55:35,402: INFO/MainProcess] celery@LAPTOP-KLKJCK2F ready.
```

Redis's startup output (ASCII banner trimmed):

```
PS D:\Applications\Redis> redis-server redis.conf
Redis 3.2.100 (00000000/0) 64 bit
Running in standalone mode
Port: 6379
PID: 21420
[21420] 22 Apr 10:38:22.506 # Server started, Redis version 3.2.100
[21420] 22 Apr 10:38:22.508 * DB loaded from disk: 0.000 seconds
[21420] 22 Apr 10:38:22.508 * The server is now ready to accept connections on port 6379
[21420] 22 Apr 10:43:23.025 * 10 changes in 300 seconds. Saving...
[21420] 22 Apr 10:43:23.030 * Background saving started by pid 6204
[21420] 22 Apr 10:43:23.231 # fork operation complete
[21420] 22 Apr 10:43:23.232 * Background saving terminated with success
[21420] 22 Apr 10:51:18.099 * 10 changes in 300 seconds. Saving...
[21420] 22 Apr 10:51:18.103 * Background saving started by pid 10116
[21420] 22 Apr 10:51:18.305 # fork operation complete
[21420] 22 Apr 10:51:18.306 * Background saving terminated with success
[21420] 22 Apr 10:56:19.022 * 10 changes in 300 seconds. Saving...
[21420] 22 Apr 10:56:19.026 * Background saving started by pid 11748
[21420] 22 Apr 10:56:19.227 # fork operation complete
[21420] 22 Apr 10:56:19.227 * Background saving terminated with success
```
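Two things worth checking here (a sketch based on common reports, not a confirmed diagnosis). First, `print('yes')` would never appear in Flask's console: `.delay()` only enqueues the task, and the task body runs in the *worker's* console window. Second, Celery 4's default prefork pool is known to misbehave on Windows — workers receive tasks but never execute them. The solo pool, which runs tasks in the worker's main process, is a common workaround:

```python
# tasks.py (sketch, reusing the question's setup). Either start the worker
# with the flag:
#     celery worker -A tasks.celery --loglevel=info --pool=solo
# or set the pool in configuration, which has the same effect:
from celery import Celery

celery = Celery('app',
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/1')
celery.conf.worker_pool = 'solo'   # equivalent to passing --pool=solo

@celery.task
def modeling_task():
    print('yes')   # prints in the worker's console, not Flask's
```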

Celery connecting to RabbitMQ: ConnectionResetError: [Errno 104]

Directory layout:

```
task/
    celeryStart.py
    celeryApp/
        __init__.py
        celeryconfig.py
        celerytask.py
```

`__init__.py` creates the `app = Celery()` object, and celeryStart.py imports `app` to send a task. `celery -A celeryApp worker -l info` connects to RabbitMQ fine, but `celery -A celeryStart worker -l info` raises a warning: ![screenshot](https://img-ask.csdn.net/upload/201812/29/1546074603_73952.png)

Celery: can the result backend be set to Kafka for storing task results?

I'd like to use Kafka to store task results in Celery. Writing results locally inside the worker is fast enough, but with Kafka every task opens a new connection to Kafka, which makes the tasks far too slow. Thanks, everyone!
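Celery has no built-in Kafka result backend, so results usually get written from inside the task. The per-task slowness described here typically comes from opening a fresh Kafka connection for every task; the standard fix is one long-lived producer per worker process, created lazily on first use. A minimal sketch of the pattern (the `make_producer` factory stands in for something like kafka-python's `KafkaProducer(bootstrap_servers=...)` — an assumption, not code from the question):

```python
# Lazy per-process singleton: the expensive connection is made once and
# then reused by every task that runs in this worker process.
_producer = None

def get_producer(make_producer):
    """Return the shared producer, creating it on the first call only."""
    global _producer
    if _producer is None:
        _producer = make_producer()
    return _producer

# Inside a task you would then do, e.g.:
#   producer = get_producer(lambda: KafkaProducer(bootstrap_servers="..."))
#   producer.send("celery-results", result_bytes)
```

Since each prefork worker is a separate process, this gives one connection per process, not one per task.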

celery + redis fails to start: attribute 'hash_randomization'

I've hit a problem: starting with `celery -A Redis worker --loglevel=info` fails with `AttributeError: 'sys.flags' object has no attribute 'hash_randomization'`, and I can't find where the problem is.

My code:

```
# coding: utf-8
from celery import Celery

broker = 'redis://localhost:6379'
backend = 'redis://localhost:6379'

# "Redis" is the app name (matches the current file name)
app = Celery("Redis", broker=broker, backend=backend)

@app.task()
def redis_main(x):
    print "Hello %s!" % x
```

The full error:

```
 -------------- celery@DESKTOP-DK5Q8NF v4.2.0 (windowlicker)
---- **** -----
--- * *** * -- Windows-post2008Server-6.2.9200 2018-06-13 15:29:30
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         Redis:0x5a88198
- ** ---------- .> transport:   redis://localhost:6379//
- ** ---------- .> results:     redis://localhost:6379/
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . Redis.redis_main

[2018-06-13 15:29:31,095: CRITICAL/MainProcess] Unrecoverable error: AttributeError("'sys.flags' object has no attribute 'hash_randomization'",)
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\celery-4.2.0-py2.7.egg\celery\worker\worker.py", line 205, in start
    self.blueprint.start(self)
  File "C:\Python27\lib\site-packages\celery-4.2.0-py2.7.egg\celery\bootsteps.py", line 119, in start
    step.start(parent)
  File "C:\Python27\lib\site-packages\celery-4.2.0-py2.7.egg\celery\bootsteps.py", line 369, in start
    return self.obj.start()
  File "C:\Python27\lib\site-packages\celery-4.2.0-py2.7.egg\celery\concurrency\base.py", line 131, in start
    self.on_start()
  File "C:\Python27\lib\site-packages\celery-4.2.0-py2.7.egg\celery\concurrency\prefork.py", line 112, in on_start
    **self.options)
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\pool.py", line 1007, in __init__
    self._create_worker_process(i)
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\pool.py", line 1116, in _create_worker_process
    w.start()
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\process.py", line 124, in start
    self._popen = self._Popen(self)
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\context.py", line 383, in _Popen
    return Popen(process_obj)
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\popen_spawn_win32.py", line 55, in __init__
    pipe_handle=rhandle)
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\spawn.py", line 147, in get_command_line
    opts = util._args_from_interpreter_flags()
  File "C:\Python27\lib\site-packages\billiard-3.5.0.3-py2.7-win-amd64.egg\billiard\util.py", line 36, in _args_from_interpreter_flags
    v = getattr(sys.flags, flag)
AttributeError: 'sys.flags' object has no attribute 'hash_randomization'
```

How do I write a distributed Python Master-Worker setup across multiple machines?

I followed the distributed example in Liao Xuefeng's tutorial and tinkered with it for a while, but starting the Master on one machine and workers on three other machines doesn't work properly. Could an expert in this area share working code for me to study? Thanks!!

Celery says the redis version is too low when starting a worker

![screenshot](https://img-ask.csdn.net/upload/201903/12/1552394168_669843.png) I'm using Celery with Django, with Redis as the broker. Starting the worker keeps failing, saying my redis version is 3.1 — but the Redis I started is 3.2.1. Why is that?
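One possibility worth checking (a guess based on similar reports, not a confirmed diagnosis): the version Celery complains about is the redis *Python client* (redis-py), not the Redis server — recent Celery releases require redis-py >= 3.2, and a server at 3.2.1 says nothing about the client library. A small sketch of the comparison being made, plus how to check your client:

```python
# Compare a version string against the minimum redis-py that Celery wants.
def too_old(version, minimum=(3, 2)):
    """True if `version` (e.g. "3.1.0") is below `minimum`."""
    parts = tuple(int(p) for p in version.split(".")[:len(minimum)])
    return parts < minimum

print(too_old("3.1"))    # the client version from the error  -> True
print(too_old("3.2.1"))  # the *server* version — not what is checked -> False
```

To see the installed client, run `python -c "import redis; print(redis.__version__)"`, and if it is below 3.2, `pip install -U "redis>=3.2"` in the same environment usually resolves the error.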

How to run tasks distributed across machines with Celery

I've written the app and its tasks on my machine, and I want to run those tasks on another machine, which has everything needed to run them. The two machines are on the same subnet, say 192.168.1.196/24 and 192.168.1.197/24. The broker defined on this machine is amqp://test:test@192.168.1.196:5672/celery_test, and the app is called celery_test_app. How do I start a celery worker on the other machine to run these tasks?
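A sketch of the usual approach (assuming the project code is copied or installed on the second machine): workers don't have to live where tasks are sent — they only need the task code importable locally and network access to the broker. On 192.168.1.197, install the same dependencies and project, make sure RabbitMQ on 192.168.1.196 accepts remote connections for the `test` user on the `celery_test` vhost, and start a worker pointed at the same broker:

```python
# app module on the second machine — identical app/broker definition, so
# both machines share one queue. (Names reproduce the question's setup.)
from celery import Celery

app = Celery(
    "celery_test_app",
    broker="amqp://test:test@192.168.1.196:5672/celery_test",
)

# On 192.168.1.197 then run:
#   celery -A <module_name> worker -l info
# Tasks sent via .delay() on 192.168.1.196 get picked up by this worker.
```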

How do I get Celery to dispatch tasks to workers installed on other servers? Newbie here — much obliged!

In a cluster, I use Celery to send tasks and collect the results. How do RabbitMQ or Redis receive the messages and deliver them to the other workers?

Error when Celery starts tasks: Thread 'ResultHandler' crashed: ValueError('invalid file descriptor 13',)

I run scheduled tasks with the Celery beat scheduler, and this error appears at startup:

```
[2019-10-22 09:13:30,334: INFO/MainProcess] Connected to redis://127.0.0.1:6379/14
[2019-10-22 09:13:30,361: INFO/MainProcess] mingle: searching for neighbors
[2019-10-22 09:13:30,532: INFO/Beat] beat: Starting...
[2019-10-22 09:13:31,072: ERROR/Beat] Thread 'ResultHandler' crashed: ValueError('invalid file descriptor 13',)
Traceback (most recent call last):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 899, in body
    for _ in self._process_result(1.0):  # blocking
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 864, in _process_result
    ready, task = poll(timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 1370, in _poll_result
    if self._outqueue._reader.poll(timeout):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 285, in poll
    return self._poll(timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 463, in _poll
    r = wait([self], timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 996, in wait
    return _poll(object_list, timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 976, in _poll
    raise ValueError('invalid file descriptor %i' % fd)
ValueError: invalid file descriptor 13

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 504, in run
    return self.body()
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 904, in body
    self.finish_at_shutdown()
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 953, in finish_at_shutdown
    if not outqueue._reader.poll():
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 285, in poll
    return self._poll(timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 463, in _poll
    r = wait([self], timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 991, in wait
    return _poll(object_list, 0)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 976, in _poll
    raise ValueError('invalid file descriptor %i' % fd)
ValueError: invalid file descriptor 13
[2019-10-22 09:13:31,408: INFO/MainProcess] mingle: all alone
[2019-10-22 09:13:31,423: INFO/MainProcess] celery@iZwz9h41nalpsqzz57x4tmZ ready.
```

Restarting a few times makes the error disappear, but after the worker has been running for a while, the next time beat dispatches a task the error comes back and the task fails:

```
[2019-10-22 07:00:00,000: INFO/Beat] Scheduler: Sending due task monitoring_auto_run (run.monitoring_auto_run)
[2019-10-22 07:00:00,008: INFO/MainProcess] Received task: run.monitoring_auto_run[469ee195-3e1a-4bf0-a7cb-783232e8d0bc]
[2019-10-22 07:00:00,105: ERROR/ForkPoolWorker-11] Thread 'ResultHandler' crashed: ValueError('invalid file descriptor 14',)
Traceback (most recent call last):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 504, in run
    return self.body()
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 899, in body
    for _ in self._process_result(1.0):  # blocking
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 864, in _process_result
    ready, task = poll(timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 1370, in _poll_result
    if self._outqueue._reader.poll(timeout):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 285, in poll
    return self._poll(timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 463, in _poll
    r = wait([self], timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 996, in wait
    return _poll(object_list, timeout)
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/connection.py", line 976, in _poll
    raise ValueError('invalid file descriptor %i' % fd)
ValueError: invalid file descriptor 14
[2019-10-22 07:00:00,768: ERROR/MainProcess] Process 'ForkPoolWorker-11' pid:15068 exited with 'exitcode 1'
[2019-10-22 07:00:11,186: ERROR/MainProcess] Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 1.',)
Traceback (most recent call last):
  File "/pyenvs/spider/lib64/python3.6/site-packages/billiard/pool.py", line 1267, in mark_as_worker_lost
    human_status(exitcode)),
billiard.exceptions.WorkerLostError: Worker exited prematurely: exitcode 1.
```

Is there any Celery-like alternative (as used with Django/Python) for the Symfony 2 framework?

I am a big fan of [Celery](http://en.wikipedia.org/wiki/Celery_Task_Queue) for executing scheduled tasks in Django. I am now using [Symfony](http://en.wikipedia.org/wiki/Symfony) 2 and see that it is almost similar to the Django framework.

I wonder if there is something similar to Celery in Symfony for scheduling task queues.

How do I find the tasks that have failed in Celery?

I am using celery to process some tasks. I can see how many are active or scheduled etc., but I am not able to find any way to see the tasks that have failed. Flower does show me the status, but only if it was running when the task was started and failed. Is there any command to get all the tasks that have failed (STATUS: FAILURE)?

I do have the task id from when the task was created. But there are millions of them, so I can't check them one by one even if there is a way to check by task ID. But if there is such a command, please let me know.

Using flask + celery + flask_mail to send email asynchronously, Celery runs into application-context problems — can anyone solve this? Everything I found online is vague.

```
import random
from flask import Flask, request, jsonify, render_template, current_app
from celery import Celery
from flask_mail import Message, Mail
import settings

app = Flask(__name__)
app.config.from_object(settings.BaseConfig)
mail = Mail(app)

celery = Celery(app.name, broker=app.config['CELERY_BROKER_URL'])
celery.conf.update(app.config)

@celery.task
def send_async_email(subject, to, expire_minute):
    app = current_app._get_current_object()
    msg = Message(subject=subject,
                  sender=app.config['MAIL_DEFAULT_SENDER'],
                  recipients=[to])
    msg.body = 'This is a test email sent from a background Celery task.'
    vcode = ''.join([str(i) for i in random.sample(range(0, 10), 4)])
    context = {
        'subject': subject,
        'expire_minute': expire_minute,
        'vcode': vcode,
        # 'debug': get_config_partial('DEBUG'),
        # 'debug': True,
    }
    msg.html = render_template(app.config['EMAIL_TEMPLATE'] + '.html', **context)
    # with app.app_context():
    mail.send(msg)

@app.route('/send_email', methods=['POST'])
def send_email():
    try:
        subject = request.form.get("subject")
        to = request.form.get("to")
        expire_minute = request.form.get("expire_minute")
    except:
        return jsonify({'msg': '参数有误'})
    # send_async_email.delay(app, msg)
    send_async_email.apply_async(args=[subject, to, expire_minute])
    message = '邮件发送成功'
    return jsonify(message=message)

if __name__ == '__main__':
    app.run()
```

Here is my settings file:

```
class BaseConfig(object):
    CELERY_BROKER_URL = 'amqp://guest:guest@localhost:5672//'
    BACKEND = None
    RABBITMQ_QUEUE = 'hello'
    MAIL_SERVER = 'smtp.163.com '
    MAIL_PORT = 25
    MAIL_USE_TLS = True
    MAIL_USERNAME = 'mini@ 163.com'
    MAIL_PASSWORD = '1314AC'
    MAIL_DEFAULT_SENDER = 'mini @ 163.com'
    DEFAULT_FROM_EMAIL = 'mini @ 163.com'
    EMAIL_TEMPLATE = 'email'
```
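A common resolution for exactly this symptom (a sketch, not a guaranteed fix): the Celery worker process has no Flask application context of its own, so `current_app` and `render_template` fail inside the task. Instead of reading `current_app`, use the real `app` object and push a context explicitly — the commented-out `with app.app_context():` was close, but it has to wrap `render_template` as well, not just `mail.send`:

```python
# Revised task body, reusing the names from the code above.
@celery.task
def send_async_email(subject, to, expire_minute):
    with app.app_context():   # gives render_template/mail.send a context
        msg = Message(subject=subject,
                      sender=app.config['MAIL_DEFAULT_SENDER'],
                      recipients=[to])
        msg.body = 'This is a test email sent from a background Celery task.'
        vcode = ''.join([str(i) for i in random.sample(range(0, 10), 4)])
        context = {'subject': subject,
                   'expire_minute': expire_minute,
                   'vcode': vcode}
        msg.html = render_template(app.config['EMAIL_TEMPLATE'] + '.html',
                                   **context)
        mail.send(msg)
```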

How can Celery upload images asynchronously to local disk or Qiniu Cloud? delay() can't take objects, so what should I use instead?

How can Celery upload images asynchronously to local disk or Qiniu Cloud? I've found that delay() can't take objects as arguments — what method should I use? Does Django's celery integration have a way to pass images asynchronously?
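`delay()` serializes its arguments (JSON by default in Celery 4), which is why a file/upload object can't be passed. The usual workarounds are to save the upload to disk first and pass the *path*, or to pass the raw bytes as a JSON-safe string. A sketch of the bytes route (standard library only; the task itself and the Qiniu upload call are assumptions, not established API):

```python
import base64

def encode_upload(raw_bytes):
    """Caller side: turn image bytes into a JSON-serializable string."""
    return base64.b64encode(raw_bytes).decode("ascii")

def decode_upload(payload):
    """Task side: recover the original bytes before writing to disk/Qiniu."""
    return base64.b64decode(payload.encode("ascii"))

# e.g. in the view:   upload_task.delay(encode_upload(f.read()), f.name)
# and in the task:    data = decode_upload(payload)
```

For large images the path-passing variant is usually preferable, since base64 payloads travel through the broker and grow the message by about a third.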

How can I cancel/revoke a delayed task from a language other than Python?

We are using Celery as a task queue. It serves both our Django application and another app written in Go.

To create a Celery task from Go, I can post a message directly into RabbitMQ with something like this: https://github.com/bsphere/celery

But our Go app also needs to cancel a delayed task before it is executed. How can I do this?

Django: synchronous send_email works, but Celery async send_email fails with Connection unexpectedly closed

```
@shared_task
def sendEmail(content):
    try:
        send_mail('主题', content, 'abc@qq.com', ['1001@qq.com', '1002@qq.com'])
        print('发送成功')
    except SMTPException as e:
        print(f'发送失败{e}')

# views.py
# ……
# sendEmail('同步发送信息')      # synchronous send — works fine
sendEmail.delay('异步发送信息')  # async send via celery — fails with
                                 # "Connection unexpectedly closed"
# ……
```

[No C-coins to offer — any kind expert?] Celery errors the moment it processes a task; the error mentions "no reverse match" — how do I fix this?

1. Deployment environment: Tencent Cloud, Python 3, Ubuntu 18.04
2. Stack: nginx 1.8.1, uwsgi, redis 4.x, django 2.1, celery, mysql 5.7
3. Description: Celery errors while processing a task, saying there is no reverse match — but I'm pretty sure the URL exists, and the uploaded image is only a few tens of KB, which isn't big.

The error:

```
raise NoReverseMatch(msg)
django.urls.exceptions.NoReverseMatch: Reverse for 'order' with no arguments not found. 1 pattern(s) tried: ['user/order/(?P<page>\\d+)$']
```

**Screenshot of the original url configuration:**
![screenshot](https://img-ask.csdn.net/upload/201905/14/1557818222_583693.png)

The url config in `user` is the same as the answer below (except that my `order/` also takes an extra page-number parameter):
![screenshot](https://img-ask.csdn.net/upload/201905/14/1557818513_355803.png)

**Detailed screenshot:**
![screenshot](https://img-ask.csdn.net/upload/201905/14/1557817706_503027.png)

Dockerised Nginx and PHP-FPM with a local Sentry server

Currently I've encountered a problem when using a local Sentry server. I'm using Lumen 5.4 and the sentry-laravel package to write logs to the Sentry server, but it fails.

[Screenshot](https://i.stack.imgur.com/n9ZwI.png)

Here is my docker-compose.yml:

```
version: '2'
services:
  ##
  # Autodiscovery : Consul
  ##
  autodiscovery:
    build: ./autodiscovery/
    mem_limit: 128m
    expose:
      - 53
      - 8300
      - 8301
      - 8302
      - 8400
      - 8500
    ports:
      - 8500:8500
    dns:
      - 127.0.0.1
  ##
  # Book Microservice
  ##
  microservice_book_fpm:
    build: ./microservices/book/php-fpm/
    volumes_from:
      - source_book
    links:
      - autodiscovery
      - microservice_book_database
    expose:
      - 8080
    environment:
      - BACKEND=microservice-book-nginx
      - CONSUL=autodiscovery
  microservice_book_nginx:
    build: ./microservices/book/nginx/
    volumes_from:
      - source_book
    links:
      - autodiscovery
      - microservice_book_fpm
    environment:
      - BACKEND=microservice-book-fpm
      - CONSUL=autodiscovery
    ports:
      - 8443:443
      - 8081:80
      - 9091:9090
  microservice_book_database:
    build: ./microservices/book/database/
    environment:
      - CONSUL=autodiscovery
      - MYSQL_ROOT_PASSWORD=bookr_pwd
      - MYSQL_DATABASE=bookr
      - MYSQL_USER=bookr_usr
      - MYSQL_PASSWORD=bookr_pwd
    ports:
      - 6666:3306
  ##
  # Sentry
  ##
  sentry_redis:
    image: redis
    expose:
      - 6379
  sentry_postgres:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=sentry
      - POSTGRES_USER=sentry
    volumes:
      - /var/lib/postgresql/data
    expose:
      - 5432
  sentry:
    image: sentry
    links:
      - sentry_redis
      - sentry_postgres
    ports:
      - 9876:9000
    environment:
      SENTRY_SECRET_KEY: 'b837e5a087c0a8727b1279bcdbe5a8a1'
      SENTRY_POSTGRES_HOST: sentry_postgres
      SENTRY_REDIS_HOST: sentry_redis
      SENTRY_DB_USER: sentry
      SENTRY_DB_PASSWORD: sentry
  sentry_celery_beat:
    image: sentry
    links:
      - sentry_redis
      - sentry_postgres
    command: sentry run cron
    environment:
      SENTRY_SECRET_KEY: 'b837e5a087c0a8727b1279bcdbe5a8a1'
      SENTRY_POSTGRES_HOST: sentry_postgres
      SENTRY_REDIS_HOST: sentry_redis
      SENTRY_DB_USER: sentry
      SENTRY_DB_PASSWORD: sentry
  sentry_celery_worker:
    image: sentry
    links:
      - sentry_redis
      - sentry_postgres
    command: sentry run worker
    environment:
      SENTRY_SECRET_KEY: 'b837e5a087c0a8727b1279bcdbe5a8a1'
      SENTRY_POSTGRES_HOST: sentry_postgres
      SENTRY_REDIS_HOST: sentry_redis
      SENTRY_DB_USER: sentry
      SENTRY_DB_PASSWORD: sentry
  ##
  # Telemetry: prometheus
  ##
  telemetry:
    build: ./telemetry/
    links:
      - autodiscovery
    expose:
      - 9090
    ports:
      - 9090:9090
  ##
  # Source containers
  ##
  source_book:
    image: nginx:stable
    volumes:
      - ../source/book:/var/www/html
    command: "true"
```

Please help me figure out why I couldn't connect to the local Sentry server from a Docker container. Many thanks, and sorry for my bad English :(
