ibm bluemix - How to list jars in the DSX Spark environment and the jars loaded into the Spark JVM?


I'm hitting issues trying to use Spark packages, for example:

java.lang.ClassNotFoundException: Failed to find data source: com.mongodb.spark.sql.DefaultSource
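For context, this is the kind of read that raises the error. A minimal sketch, assuming the spark SparkSession that DSX notebooks predefine; the host, database, and collection in the URI are placeholders:

# Hypothetical read that fails with ClassNotFoundException when the
# mongo-spark-connector jar is not visible on Spark's classpath.
df = spark.read \
    .format("com.mongodb.spark.sql.DefaultSource") \
    .option("uri", "mongodb://example-host:27017/mydb.mycollection") \
    .load()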

I have listed the files in the lib dir:

!find ~/data/libs/ 

I can see that the jars are installed:

/gpfs/fs01/user/xxxx/data/libs/
/gpfs/fs01/user/xxxx/data/libs/scala-2.11
/gpfs/fs01/user/xxxx/data/libs/scala-2.11/mongo-spark-connector_2.11-2.0.0.jar
/gpfs/fs01/user/xxxx/data/libs/scala-2.11/mongo-java-driver-3.2.2.jar
/gpfs/fs01/user/xxxx/data/libs/pixiedust.jar
/gpfs/fs01/user/xxxx/data/libs/spark-csv_2.11-1.3.0.jar

However, the error suggests that Spark is unable to see the jar.

How can I list the jars available to Spark?

The classpath is held in the environment variable SPARK_DIST_CLASSPATH. Executing the following snippet in a Python notebook yields, besides duplicates and non-jars, the jars on the classpath.

!ls $(printenv SPARK_DIST_CLASSPATH | sed -e 's/:/ /g')
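If you prefer to do the filtering in Python, here is a minimal sketch that deduplicates the entries and keeps only jars (note that wildcard entries such as /path/* do not end in .jar and are therefore filtered out):

import os

# Split SPARK_DIST_CLASSPATH on ':', drop duplicates, keep only .jar entries.
entries = os.environ["SPARK_DIST_CLASSPATH"].split(":")
for jar in sorted({e for e in entries if e.endswith(".jar")}):
    print(jar)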

Note that the classpath depends on the selected Spark version.
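To see what the driver JVM actually has on its classpath, rather than what the environment variable advertises, you can also ask the JVM directly. A sketch, assuming the sc SparkContext that DSX notebooks predefine, and relying on the internal _jvm py4j gateway (not a public API):

# Read java.class.path from the driver JVM behind the notebook's SparkContext.
classpath = sc._jvm.java.lang.System.getProperty("java.class.path")
for entry in sorted(set(classpath.split(":"))):
    print(entry)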

