Python crawling with Scrapy: how to run multiple Scrapy crawl jobs

1. Sequential execution:

from scrapy.cmdline import execute

# Equivalent to running `scrapy crawl httpbin` on the command line.
# Note: execute() blocks and exits the process when the spider finishes,
# so any code after this call will not run.
execute(['scrapy', 'crawl', 'httpbin'])
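Because `execute()` exits the process when the spider finishes, it can only launch one spider per run. One common way to run several spiders strictly one after another is to launch each `scrapy crawl` in its own subprocess; this is a sketch, and the spider names are placeholders:

```python
import subprocess


def build_crawl_command(spider_name):
    """Build the `scrapy crawl <name>` command line for one spider."""
    return ["scrapy", "crawl", spider_name]


def crawl_sequentially(spider_names, runner=subprocess.run):
    """Run each spider in its own process, one after another.

    Each subprocess blocks until it finishes before the next starts,
    and a crash in one spider does not stop the rest.
    """
    for name in spider_names:
        runner(build_crawl_command(name), check=False)


# crawl_sequentially(["httpbin", "another_spider"])  # hypothetical spider names
```

Injecting `runner` also makes the scheduling logic easy to test without actually invoking Scrapy.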

2. Concurrent execution:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

setting = get_project_settings()
process = CrawlerProcess(setting)
didntWorkSpider = ['sample']  # spiders to skip
workSpider = ['gochinaz', 'gochinaz2', 'gochinaz3', 'gochinaz4', 'gochinaz5', 'gochinaz6', 'gochinaz7', 'gochinaz8']

print("Running...")
# spider_loader.list() returns the names of all spiders in the project
# (process.spiders is deprecated in modern Scrapy versions)
for spider_name in process.spider_loader.list():
    if spider_name in workSpider:
        print("Running spider %s" % spider_name)
        process.crawl(spider_name)
process.start()  # blocks until every scheduled spider has finished
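The loop above only schedules spiders whose names appear in `workSpider`; the `didntWorkSpider` list is never consulted. That selection logic can be factored into a small pure helper, which is a sketch with illustrative names:

```python
def select_spiders(available, work, skip=()):
    """Return the spider names to schedule.

    Keeps the names that are both discovered in the project (`available`)
    and requested (`work`), while dropping anything in `skip`.
    """
    work_set, skip_set = set(work), set(skip)
    return [name for name in available if name in work_set and name not in skip_set]


# For each selected name, call process.crawl(name), then process.start() once.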