Jupyter notebook and Pyspark


The problem is this: when I start a Jupyter notebook with the command $ pyspark, create a new notebook, and type print(sc), it returns the following error:

NameError Traceback (most recent call last)
<ipython-input-1-01cf0a34ba78> in <module>()
----> 1 print(sc)

NameError: name 'sc' is not defined
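For reference, the usual way to make pyspark open Jupyter is a set of exports like the ones below (the paths are only an example based on the directory listings further down, not copied from my machine):

# Example shell configuration commonly used to launch Jupyter via pyspark
# (paths are illustrative; adjust to your own install)
export SPARK_HOME=/opt/Spark
export PATH=$SPARK_HOME/bin:$PATH
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook"
export PYSPARK_PYTHON=~/anaconda3/bin/python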

I have no idea how to solve it. Here is some information about my environment that I think will be useful for diagnosing the problem:

Required Programs

~$ java -version
java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+140)
Java HotSpot(TM) 64-Bit Server VM (build 9-ea+140, mixed mode)

~$ ls /opt/Spark/
bin   data      jars     licenses  python  README.md  sbin
conf  examples  LICENSE  NOTICE    R       RELEASE    yarn

~$ ls anaconda3/
bin         include      mkspecs      README.rst    var
conda-meta  lib          phrasebooks  sbin
doc         libexec      pkgs         share
envs        LICENSE.rst  plugins      ssl
etc         LICENSE.txt  qml          translations

    
asked by anonymous 09.01.2017 / 20:28

1 answer


I managed to resolve it. For some reason PySpark does not work with Java 9, so I went back to Java 8.
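In practice that means pointing JAVA_HOME at a Java 8 installation before starting pyspark. A minimal sketch, assuming an Oracle JDK 8 installed under /usr/lib/jvm (the path is just an example; use wherever Java 8 lives on your system):

# Make Spark use Java 8 instead of Java 9
# (the JAVA_HOME path below is an example; adjust to your JDK 8 location)
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH=$JAVA_HOME/bin:$PATH

java -version    # should now report a 1.8.x build
pyspark          # sc should be defined in the notebook again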

    
answered 09.01.2017 / 21:45