
python3 APScheduler job pool error: /opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/jobsto

Posted: 2023-08-08 12:11:23

Error message:


(venv) root@VM-8-7-ubuntu:/opt/www/taskPools1# python main.py
Traceback (most recent call last):
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/jobstores/mongodb.py", line 86, in add_job
self.collection.insert_one({
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/collection.py", line 639, in insert_one
self._insert_one(
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/collection.py", line 579, in _insert_one
self.__database.client._retryable_write(acknowledged, _insert_command, session)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/mongo_client.py", line 1493, in _retryable_write
return self._retry_with_session(retryable, func, s, None)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/mongo_client.py", line 1360, in _retry_with_session
return self._retry_internal(retryable, func, session, bulk)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/_csot.py", line 106, in csot_wrapper
return func(self, *args, **kwargs)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/mongo_client.py", line 1401, in _retry_internal
return func(session, sock_info, retryable)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/collection.py", line 577, in _insert_command
_check_write_command_response(result)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/helpers.py", line 230, in _check_write_command_response
_raise_last_write_error(write_errors)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/pymongo/helpers.py", line 202, in _raise_last_write_error
raise DuplicateKeyError(error.get("errmsg"), 11000, error)
pymongo.errors.DuplicateKeyError: E11000 duplicate key error collection: jxkServer.task_pools_jobs index: _id_ dup key: { _id: "test" }, full error: {'index': 0, 'code': 11000, 'errmsg': 'E11000 duplicate key error collection: jxkServer.task_pools_jobs index: _id_ dup key: { _id: "test" }', 'keyPattern': {'_id': 1}, 'keyValue': {'_id': 'test'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "main.py", line 27, in <module>
scheduler.start()
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/flask_apscheduler/scheduler.py", line 105, in start
self._scheduler.start(paused=paused)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/schedulers/gevent.py", line 21, in start
BaseScheduler.start(self, *args, **kwargs)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/schedulers/base.py", line 167, in start
self._real_add_job(job, jobstore_alias, replace_existing)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/schedulers/base.py", line 871, in _real_add_job
store.add_job(job)
File "/opt/www/taskPools1/venv/lib/python3.8/site-packages/apscheduler/jobstores/mongodb.py", line 92, in add_job
raise ConflictingIdError(job.id)
apscheduler.jobstores.base.ConflictingIdError: 'Job identifier (test) conflicts with an existing job'

 

Cause: when APScheduler persists jobs to MongoDB, `MongoDBJobStore.add_job` does a bare `insert_one`, using the job id as the document's `_id`. If a job with the same id is already stored in the collection (here `jxkServer.task_pools_jobs`), the insert violates the unique `_id_` index, pymongo raises `DuplicateKeyError` (E11000), and the jobstore re-raises it as `ConflictingIdError` when the scheduler starts.
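The failure mode is an ordinary unique-index violation. As a rough standard-library analogy (using sqlite3 in place of MongoDB, purely for illustration), inserting the same primary key twice raises an integrity error, just as MongoDB raises E11000 on the `_id_` index:

```python
import sqlite3

# In-memory table where "id" plays the role of MongoDB's _id field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE jobs (id TEXT PRIMARY KEY, job_state BLOB)")

conn.execute("INSERT INTO jobs (id, job_state) VALUES (?, ?)", ("test", b"..."))

try:
    # Second insert with the same id -> unique-key violation,
    # analogous to pymongo's DuplicateKeyError (E11000).
    conn.execute("INSERT INTO jobs (id, job_state) VALUES (?, ?)", ("test", b"..."))
except sqlite3.IntegrityError as exc:
    print(f"duplicate key: {exc}")
```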

Before the change (the function at line 84 of apscheduler/jobstores/mongodb.py):


def add_job(self, job):
    try:
        self.collection.insert_one({
            '_id': job.id,
            'next_run_time': datetime_to_utc_timestamp(job.next_run_time),
            'job_state': Binary(pickle.dumps(job.__getstate__(), self.pickle_protocol))
        })
    except DuplicateKeyError:
        raise ConflictingIdError(job.id)

After the change:


def add_job(self, job):
    try:
        # Only insert when no document with this job id exists yet.
        if not self.collection.find_one({'_id': job.id}):
            self.collection.insert_one({
                '_id': job.id,
                'next_run_time': datetime_to_utc_timestamp(job.next_run_time),
                'job_state': Binary(pickle.dumps(job.__getstate__(), self.pickle_protocol))
            })
    except DuplicateKeyError:
        raise ConflictingIdError(job.id)
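Note that the patched `add_job` silently skips the insert when a job with the same id already exists, leaving the stored job definition untouched. APScheduler also offers a supported way to get similar behavior without editing site-packages: pass `replace_existing=True` when adding the job (e.g. `scheduler.add_job(func, 'interval', seconds=30, id='test', replace_existing=True)`), which overwrites the stored job instead of raising. The check-then-insert logic of the patch can be sketched with a plain dict standing in for the MongoDB collection (an assumption for illustration only):

```python
# A dict stands in for the MongoDB collection: keys play the role of _id.
collection = {}

class ConflictingIdError(Exception):
    pass

def add_job_original(job_id, job_state):
    # Original behavior: unconditional insert, error on duplicate _id.
    if job_id in collection:
        raise ConflictingIdError(job_id)
    collection[job_id] = job_state

def add_job_patched(job_id, job_state):
    # Patched behavior: insert only if the id is not already stored.
    if job_id not in collection:
        collection[job_id] = job_state

add_job_patched("test", "state-1")
add_job_patched("test", "state-2")   # silently ignored, no error raised
print(collection["test"])            # still "state-1"
```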

 

From: https://www.cnblogs.com/jxkshu/p/17613831.html
