
Running a Python Spark application via an API call: on submitting the application, the response reported a failure, so I SSH'd into the worker to investigate.

My Python application lives at

/root/spark/work/driver-id/wordcount.py

The error can be found in

/root/spark/work/driver-id/stderr

which shows the following traceback:

Traceback (most recent call last):
  File "/root/wordcount.py", line 34, in <module>
    main()
  File "/root/wordcount.py", line 18, in main
    sc = SparkContext(conf=conf)
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 115, in __init__
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 172, in _do_init
  File "/root/spark/python/lib/pyspark.zip/pyspark/context.py", line 235, in _initialize_context
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/java_gateway.py", line 1064, in __call__
  File "/root/spark/python/lib/py4j-0.9-src.zip/py4j/protocol.py", line 308, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.io.FileNotFoundException: File file:/tmp/spark-events does not exist.
  at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:402)
  at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:255)
  at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:549)
  at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
  at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
  at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
  at py4j.Gateway.invoke(Gateway.java:214)
  at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
  at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
  at py4j.GatewayConnection.run(GatewayConnection.java:209)
  at java.lang.Thread.run(Thread.java:745)

It indicates that /tmp/spark-events does not exist, which is true. However, wordcount.py contains only:

from pyspark import SparkContext, SparkConf

... few more lines ...

def main():
    conf = SparkConf().setAppName("MyApp").setMaster("spark://ec2-54-209-108-127.compute-1.amazonaws.com:7077")
    sc = SparkContext(conf=conf)
    sc.stop()

if __name__ == "__main__":
    main()
1 comment:

  • Disabling event logging might help: SparkConf().set("spark.eventLog.enabled","false")...
    – asmaier, Jun 23, 2017
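If the history server isn't needed, the switch the comment above mentions can also be set cluster-wide in $SPARK_HOME/conf/spark-defaults.conf. A sketch; note this disables event logging rather than fixing the missing directory:

```
spark.eventLog.enabled   false
```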

5 Answers

Answer (score: 45)

/tmp/spark-events is the location where Spark stores event logs. Just create this directory on the master machine and you're set.

$ mkdir /tmp/spark-events
$ sudo /root/spark-ec2/copy-dir /tmp/spark-events/
RSYNC'ing /tmp/spark-events to slaves...
ec2-54-175-163-32.compute-1.amazonaws.com
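A minimal sketch of the same fix on a single machine (the copy-dir step above is specific to spark-ec2 clusters):

```shell
# Create the default Spark event-log directory and confirm it exists.
mkdir -p /tmp/spark-events
test -d /tmp/spark-events && echo "event log dir ready"
```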
3 comments:
  • Tried creating the directory on the master machine and retried the API call; the error persists. – Jul 13, 2016
  • OK, so I had to replicate the same directory across the workers. The error was resolved. Thanks! – Jul 13, 2016
  • Where is "/tmp/spark-events" set? I put the following attributes in $SPARK_HOME/conf/spark-defaults.conf:
    spark.eventLog.enabled true
    spark.eventLog.dir hdfs://host:port/user/root/spark-logs
    spark.history.fs.logDirectory hdfs://host:port/user/root/spark-logs
    – Peter, May 22, 2019
Answer (score: 10)

While trying to set up the Spark history server on my local machine, I hit the same 'File file:/tmp/spark-events does not exist.' error. I had customized my log directory to a non-default path. To resolve this, I needed to do two things:

  1. Edit $SPARK_HOME/conf/spark-defaults.conf and add these two lines:
     spark.history.fs.logDirectory /mycustomdir
     spark.eventLog.enabled true
  2. Create a link from /tmp/spark-events to /mycustomdir:
     ln -fs /tmp/spark-events /mycustomdir

Ideally, step 1 would have solved my issue entirely, but I still needed to create the link, so I suspect there is one other setting I missed. Anyhow, once I did this, I was able to run my history server and see new jobs logged in the web UI.
2 comments:
  • Worked for me, but I had to create the symlink with the arguments in reverse order: ln -fs /mycustomdir /tmp/spark-events
    – george, Jan 3, 2018
  • I added spark.eventLog.dir=/mycustomdir to spark-defaults.conf, so there is no need for the soft link created with ln.
    – bpelhos, Aug 7, 2019
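To make the comment's point concrete: with ln -s the first argument is the target and the second is the link name, so the path Spark reads must come second. A throwaway sketch with demo paths (not the real directories):

```shell
# ln -sfn TARGET LINK_NAME: creates LINK_NAME pointing at TARGET.
mkdir -p /tmp/mycustomdir-demo
ln -sfn /tmp/mycustomdir-demo /tmp/spark-events-demo
readlink /tmp/spark-events-demo    # prints /tmp/mycustomdir-demo
```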
Answer (score: 5)

Use spark.eventLog.dir for the client/driver program:

spark.eventLog.dir=/usr/local/spark/history

and use spark.history.fs.logDirectory for the history server:

spark.history.fs.logDirectory=/usr/local/spark/history

as mentioned in: How to enable spark-history server for standalone cluster non hdfs mode

This applies at least as of Spark 2.2.1.
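Putting the two settings together, a minimal spark-defaults.conf might look like this (the directory path is illustrative, and it must exist before the driver starts):

```
spark.eventLog.enabled            true
spark.eventLog.dir                /usr/local/spark/history
spark.history.fs.logDirectory     /usr/local/spark/history
```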

Answer (score: 1)

I just created /tmp/spark-events on the {master} node and then distributed it to the other nodes in the cluster:

mkdir /tmp/spark-events
rsync -a /tmp/spark-events {slaves}:/tmp/spark-events

my spark-defaults.conf:

spark.history.ui.port=18080
spark.eventLog.enabled=true
spark.history.fs.logDirectory=hdfs:///home/elon/spark/events
Answer (score: 0)

I edited the two files spark-defaults.conf and spark-env.sh, and the history server started.

spark-defaults.conf:

spark.eventLog.enabled           true
spark.history.ui.port=18080
spark.history.fs.logDirectory={host}:{port}/directory

spark-env.sh:

export SPARK_HISTORY_OPTS="
-Dspark.history.ui.port=18080
-Dspark.history.fs.logDirectory={host}:{port}/directory
-Dspark.history.retainedApplications=30"
