On Ubuntu 16.04 LTS I installed Scrapy via Anaconda2. A program that ran fine on Windows now fails, and running the scrapy shell command fails with the same error. I've googled for ages without finding an answer. Any help appreciated, thanks!

bigbearme · 2016-04-28 22:31:28 +08:00 · 4742 views
Traceback (most recent call last):
  File "/home/peter/anaconda2/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 143, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/cmdline.py", line 150, in _run_command
    cmd.run(args, opts)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/commands/shell.py", line 61, in run
    crawler.engine = crawler._create_engine()
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/crawler.py", line 83, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/core/engine.py", line 69, in __init__
    self.scraper = Scraper(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/core/scraper.py", line 70, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 56, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
    mw = mwcls.from_crawler(crawler)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/media.py", line 33, in from_crawler
    pipe = cls.from_settings(crawler.settings)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/images.py", line 57, in from_settings
    return cls(store_uri)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/files.py", line 160, in __init__
    self.store = self._get_store(store_uri)
  File "/home/peter/anaconda2/lib/python2.7/site-packages/scrapy/pipelines/files.py", line 180, in _get_store
    store_cls = self.STORE_SCHEMES[scheme]
KeyError: 'd'
11 replies · last reply 2016-04-29 17:25:47 +08:00
#1 · skyrem · 2016-04-29 00:20:54 +08:00
If even scrapy shell errors out, it's probably a problem with scrapy itself. Reinstall it with pip.
#2 · bigbearme (OP) · 2016-04-29 06:33:45 +08:00 · via iPhone
@skyrem I've reinstalled it twice and it made no difference. I wonder if it's a problem with this Ubuntu version.
#3 · wlsnx · 2016-04-29 10:24:18 +08:00
#4 · wlsnx · 2016-04-29 10:37:49 +08:00
It just occurred to me: you're not trying to save files to the D: drive, are you? Linux doesn't have a D: drive.
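This guess lines up with the last two frames of the traceback. Scrapy's file storage (scrapy/pipelines/files.py) picks a backend by parsing a scheme out of the store URI, and its STORE_SCHEMES mapping only knows schemes like '' and 'file' (local filesystem) and 's3'. On Linux, a Windows path such as d:\some\file\path parses to scheme 'd', which is exactly the missing key. A minimal sketch of that lookup logic, mirroring Scrapy ~1.x (the store_uri value is a hypothetical example):

    # Why a "d:\..." store path works on Windows but raises KeyError: 'd' on Linux.
    # Mirrors the scheme detection in scrapy/pipelines/files.py (_get_store), Scrapy ~1.x.
    import os

    try:
        from urlparse import urlparse         # Python 2, as in the anaconda2 traceback
    except ImportError:
        from urllib.parse import urlparse     # Python 3

    store_uri = "d:\\some\\file\\path"        # hypothetical Windows-style setting

    if os.path.isabs(store_uri):
        # On Windows, "d:\..." is an absolute path, so Scrapy treats it as local storage.
        scheme = 'file'
    else:
        # On Linux it is not absolute, so "d:" gets parsed as a URL scheme.
        scheme = urlparse(store_uri).scheme

    print(scheme)   # 'file' on Windows; 'd' on Linux -> STORE_SCHEMES['d'] -> KeyError: 'd'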
#5 · bigbearme (OP) · 2016-04-29 11:20:51 +08:00 · via iPhone
@wlsnx But I get this same error just from running scrapy shell. I do know the basic fact that Linux has no D: drive...
#6 · leavic · 2016-04-29 12:43:03 +08:00
If even scrapy shell errors out, then scrapy definitely isn't installed properly. That has nothing to do with your code.
#7 · pc10201 · 2016-04-29 13:16:58 +08:00
I once hit a pitfall where installing scrapy via pip left me without a scrapy.exe; in the end I downloaded the source by hand and installed it with python setup.py install.
#8 · bigbearme (OP) · 2016-04-29 13:22:05 +08:00 · via iPhone
@leavic Yeah, I also figure it's unrelated to the code; probably a bad install. Getting scrapy installed is really wearing.
#9 · wlsnx · 2016-04-29 14:16:21 +08:00
I don't know why scrapy shell errors out, but since the error is KeyError: 'd', you should check whether settings.py has something like FILES_STORE = "d:\some\file\path" in it.
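For anyone hitting the same thing: given the images.py frame in the traceback, the setting involved is more likely IMAGES_STORE than FILES_STORE, though both go through the same scheme lookup. It also explains why scrapy shell fails too: run inside the project directory, it loads the project's settings.py and instantiates the item pipelines (the shell.py -> middleware.py -> pipelines chain in the traceback), so the bad store path blows up before any spider code runs. A hedged sketch of the fix, with hypothetical paths:

    # settings.py -- illustrative fix; both paths below are hypothetical.

    # Broken on Linux: "d:" parses as a URL scheme -> KeyError: 'd'
    # IMAGES_STORE = "d:\\some\\file\\path"

    # Works on Linux: a plain absolute path (empty scheme -> local filesystem store)
    IMAGES_STORE = "/home/peter/scrapy_images"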
#10 · Neveroldmilk · 2016-04-29 15:51:38 +08:00
16.04 is too new. Wait a while before trying again.
#11 · bigbearme (OP) · 2016-04-29 17:25:47 +08:00 · via iPhone
@wlsnx OK, I'll take a look when I get back. Thanks!