I'm using scrapy_redis as the deduplication backend for my Scrapy crawler. Everything worked fine under Python 2.x, but after upgrading to Python 3.5 I get the following error:

[2016-06-10 19:44:35,905]CRITICAL:Unhandled Error
Traceback (most recent call last):
  File "d:\python3.5.1\lib\site-packages\scrapy\commands\crawl.py", line 58, in run
    self.crawler_process.start()
  File "d:\python3.5.1\lib\site-packages\scrapy\crawler.py", line 280, in start
    reactor.run(installSignalHandlers=False) # blocking call
  File "d:\python3.5.1\lib\site-packages\twisted\internet\base.py", line 1194, in run
    self.mainLoop()
  File "d:\python3.5.1\lib\site-packages\twisted\internet\base.py", line 1203, in mainLoop
    self.runUntilCurrent()
--- <exception caught here> ---
  File "d:\python3.5.1\lib\site-packages\twisted\internet\base.py", line 825, in runUntilCurrent
    call.func(*call.args, **call.kw)
  File "d:\python3.5.1\lib\site-packages\scrapy\utils\reactor.py", line 41, in __call__
    return self._func(*self._a, **self._kw)
  File "d:\python3.5.1\lib\site-packages\scrapy\core\engine.py", line 134, in _next_request
    self.crawl(request, spider)
  File "d:\python3.5.1\lib\site-packages\scrapy\core\engine.py", line 209, in crawl
    self.schedule(request, spider)
  File "d:\python3.5.1\lib\site-packages\scrapy\core\engine.py", line 215, in schedule
    if not self.slot.scheduler.enqueue_request(request):
  File "d:\python3.5.1\lib\site-packages\scrapy_redis\scheduler.py", line 78, in enqueue_request
    self.queue.push(request)
  File "d:\python3.5.1\lib\site-packages\scrapy_redis\queue.py", line 83, in push
    self.server.zadd(self.key, **pairs)
builtins.TypeError: zadd() keywords must be strings
I'm new to Python and Scrapy and don't know how to fix this.
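
Reading the last frames of the traceback, I suspect the problem is in scrapy_redis/queue.py: it calls self.server.zadd(self.key, **pairs), and under Python 3 the serialized request (the dict key in pairs) is bytes, while keyword argument names must be str. A tiny snippet like this (my own illustration, not the real scrapy_redis code) seems to reproduce the same TypeError:

    # Hypothetical stand-in for redis-py's zadd, just to show why
    # **-expanding a bytes-keyed dict fails on Python 3.
    def zadd(key, **pairs):
        print(key, pairs)

    data = b"serialized request"   # scrapy_redis encodes the request to bytes
    pairs = {data: 0}              # member -> score, key is bytes, not str

    zadd("myspider:requests", **pairs)
    # Python 2: works (bytes and str are the same type)
    # Python 3: TypeError: zadd() keywords must be strings

If that is the cause, is the right fix to upgrade scrapy_redis to a release that supports Python 3, or to patch queue.py myself so it stops using keyword expansion? Something like the sketch below (my assumption, using redis-py's execute_command to pass the score and member positionally):

    # Sketch of a possible workaround in scrapy_redis/queue.py, not the official fix.
    def push(self, request):
        data = self._encode_request(request)
        score = -request.priority
        # ZADD key score member -- positional args avoid the bytes-as-keyword problem
        self.server.execute_command('ZADD', self.key, score, data)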