
Running Multiple Scrapy Spiders (the Easy Way) in Python

Updated: 2022-03-12 09:37:27

Here is the easy way: save the following script in the same directory as scrapy.cfg (my Scrapy version is 1.3.3):

from scrapy.utils.project import get_project_settings
from scrapy.crawler import CrawlerProcess

settings = get_project_settings()
process = CrawlerProcess(settings)

# spider_loader.list() returns the name of every spider in the project
# (process.spiders was the deprecated pre-1.0 spelling of the same thing)
for spider_name in process.spider_loader.list():
    print("Running spider %s" % spider_name)
    # "query" is a custom keyword argument forwarded to the spider
    process.crawl(spider_name, query="dvh")

# start() blocks until every scheduled spider has finished
process.start()
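
For reference, the extra keyword arguments passed to process.crawl() end up on the spider itself: Scrapy forwards them to the spider's __init__, and the default Spider.__init__ stores them as instance attributes. A minimal sketch of a spider consuming that query argument (the spider name and URL below are made up for illustration):

import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"  # hypothetical spider, for illustration only

    def __init__(self, query=None, *args, **kwargs):
        super(ExampleSpider, self).__init__(*args, **kwargs)
        self.query = query  # will be "dvh" when launched by the script above

    def start_requests(self):
        # use the argument, e.g. to build a search URL (the URL is made up)
        url = "http://example.com/search?q=%s" % self.query
        yield scrapy.Request(url, callback=self.parse)

    def parse(self, response):
        self.logger.info("Fetched %s for query %r", response.url, self.query)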

Then run the script. That's it!
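
Note that CrawlerProcess starts all the spiders concurrently in the same process. If you want them to run one after another instead, the Scrapy docs describe a CrawlerRunner pattern with Twisted deferreds; a rough sketch, reusing the same custom query argument:

from twisted.internet import reactor, defer
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

configure_logging()
settings = get_project_settings()
runner = CrawlerRunner(settings)

@defer.inlineCallbacks
def crawl_sequentially():
    # yield-ing each crawl's deferred makes the loop wait for it to finish
    for spider_name in runner.spider_loader.list():
        print("Running spider %s" % spider_name)
        yield runner.crawl(spider_name, query="dvh")
    reactor.stop()

crawl_sequentially()
reactor.run()  # blocks here until reactor.stop() is called

Either way, run the file with plain python (e.g. python run_all.py, if you saved it under that name) from the project root, so get_project_settings() can locate scrapy.cfg.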