Error installing Scrapy package in Python

2

I'm trying to install Scrapy through pip, but I keep getting errors like:

    running build_ext
    building 'lxml.etree' extension
    error: Microsoft Visual C++ 10.0 is required (Unable to find vcvarsall.bat).

I already have Visual C++ 10 installed. I'm using Python 3.4.4 on Windows 7. Can anyone help me?

    
asked by anonymous 18.11.2016 / 18:44

3 answers

2

Many Python packages are annoying to install with pip on Windows because they need a compiler toolchain to build.

I would try installing a precompiled build instead:

  • Find the desired package here: link. Usually you would need to choose the right version, Python 3 vs. Python 2, 32-bit vs. 64-bit, but in this case it looks like there is only one build.
  • Download the wheel (.whl) file from that page.
  • Install it with pip: pip install c:\path\to\Scrapy-1.2.1-py2.py3-none-any.whl
  • This is how I install NumPy, matplotlib, and others on Windows.
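If it helps, you can check which wheel you need before downloading. Platform-specific wheels (like lxml's, which is what is failing to build here) encode the Python version and architecture in their filename; a small sketch to print the tags that match your interpreter:

```python
import struct
import sys

# A wheel's filename encodes the CPython version tag (e.g. "cp34" for
# Python 3.4) and the platform ("win32" for 32-bit, "win_amd64" for
# 64-bit). Pure-Python wheels like Scrapy's use "py2.py3-none-any"
# and work everywhere.
python_tag = "cp%d%d" % (sys.version_info[0], sys.version_info[1])
bits = struct.calcsize("P") * 8  # pointer size in bits: 32 or 64

print("Python tag:", python_tag)
print("Architecture: %d-bit" % bits)
```

Match those two values against the wheel filenames on the download page to pick the right file.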

        
    05.12.2016 / 21:51
    1

    The most convenient way is to install it from the command line with the conda package manager from Anaconda, e.g. conda install -c conda-forge scrapy.

    27.07.2018 / 00:24
    0

    See this link:

    However, if you are targeting Python 3.3 and 3.4 (and did not have access to Visual Studio 2010), building is slightly more complicated. You will need to open the Visual Studio Command Prompt (selecting the x64 version if using 64-bit Python) and run set DISTUTILS_USE_SDK=1 before calling pip install.

    In short, you need to open a Visual Studio command prompt and run pip install from inside it.
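    The quoted steps amount to something like the following, typed in the Visual Studio Command Prompt (x64 if your Python is 64-bit); this is a sketch assuming pip is on your PATH:

    ```shell
    rem Tell distutils to use the SDK toolchain instead of
    rem searching for vcvarsall.bat, then install as usual.
    set DISTUTILS_USE_SDK=1
    pip install Scrapy
    ```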

        
    19.11.2016 / 01:41