Source: https://www.shanhubei.com/archives/23601.html
1. Sequential execution (launch a single spider from a script):

from scrapy.cmdline import execute

# Equivalent to running "scrapy crawl httpbin" on the command line.
# Note: execute() ends by calling sys.exit(), so code after it will not run.
execute(['scrapy', 'crawl', 'httpbin'])
2. Concurrent execution (run several spiders in the same process):

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
process = CrawlerProcess(settings)

didntWorkSpider = ['sample']  # spiders to skip
workSpider = ['gochinaz', 'gochinaz2', 'gochinaz3', 'gochinaz4', 'gochinaz5', 'gochinaz6', 'gochinaz7', 'gochinaz8']

print("Running...")
# spider_loader.list() returns the names of all spiders in the project
# (the older process.spiders attribute is deprecated).
for spider_name in process.spider_loader.list():
    if spider_name in workSpider:
        print("Running spider %s" % spider_name)
        process.crawl(spider_name)
process.start()  # the reactor starts here; all scheduled spiders run concurrently
From: https://www.cnblogs.com/shanhubei/p/18066700