Updated: 2022-11-26 10:05:09
Just let logging do the job. Try using PythonLoggingObserver instead of DefaultObserver: configure two loggers (one for INFO and one for ERROR messages) directly in Python, via fileConfig, or via dictConfig (see the logging docs), and start the observer in the spider's __init__:
from twisted.python import log

def __init__(self, name=None, **kwargs):
    # TODO: configure logging, e.g. logging.config.fileConfig("logging.conf")
    # Bridge Twisted's log messages into the stdlib logging module.
    observer = log.PythonLoggingObserver()
    observer.start()
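A minimal sketch of the "two loggers via dictConfig" approach mentioned above, using only the stdlib; the handler names, format string, and file paths are illustrative assumptions:

```python
import logging
import logging.config

# One handler per file: spider.log gets INFO and above,
# spider_error.log gets ERROR and above.
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "plain": {"format": "%(levelname)s %(name)s: %(message)s"},
    },
    "handlers": {
        "info_file": {
            "class": "logging.FileHandler",
            "filename": "spider.log",
            "level": "INFO",
            "formatter": "plain",
        },
        "error_file": {
            "class": "logging.FileHandler",
            "filename": "spider_error.log",
            "level": "ERROR",
            "formatter": "plain",
        },
    },
    "root": {"level": "INFO", "handlers": ["info_file", "error_file"]},
}

logging.config.dictConfig(LOGGING)
logging.info("crawl started")    # written to spider.log only
logging.error("request failed")  # written to both files
```

With PythonLoggingObserver running, Twisted/Scrapy log messages are routed through this same configuration.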
Let me know if you need help with configuring loggers.
Another option is to start two file log observers in __init__.py:
import logging

from scrapy.log import ScrapyFileLogObserver
from scrapy.spider import BaseSpider

class MySpider(BaseSpider):
    name = "myspider"

    def __init__(self, name=None, **kwargs):
        # spider.log receives INFO and above; spider_error.log only ERROR and above.
        ScrapyFileLogObserver(open("spider.log", 'w'), level=logging.INFO).start()
        ScrapyFileLogObserver(open("spider_error.log", 'w'), level=logging.ERROR).start()
        super(MySpider, self).__init__(name, **kwargs)
    ...
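For reference, the same two-file split can be sketched with stdlib logging alone, with no Twisted observer involved (modern Scrapy uses stdlib logging anyway); the file names mirror the example above and are otherwise assumptions:

```python
import logging

def setup_spider_logging(info_path="spider.log", error_path="spider_error.log"):
    # Attach two file handlers to the root logger with different thresholds.
    root = logging.getLogger()
    root.setLevel(logging.INFO)

    info_handler = logging.FileHandler(info_path, mode="w")
    info_handler.setLevel(logging.INFO)       # INFO and above
    root.addHandler(info_handler)

    error_handler = logging.FileHandler(error_path, mode="w")
    error_handler.setLevel(logging.ERROR)     # ERROR and above only
    root.addHandler(error_handler)

setup_spider_logging()
logging.getLogger("myspider").info("crawl started")   # spider.log only
logging.getLogger("myspider").error("parse failed")   # both files
```

Because child loggers propagate to the root by default, every logger in the project (including Scrapy's own) ends up split across the two files.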