apache spark - Exception in thread "main" java.lang.NoClassDefFoundError: org/deeplearning4j/nn/conf/layers/Layer -


I am trying to build an application on Spark using the Deeplearning4j library. I have a cluster where I am going to run my jar (built using IntelliJ) with the spark-submit command. Here's my code:

package com.spark.examples

import scala.collection.mutable.ListBuffer
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.canova.api.records.reader.RecordReader
import org.canova.api.records.reader.impl.CSVRecordReader
import org.deeplearning4j.nn.api.OptimizationAlgorithm
import org.deeplearning4j.nn.conf.MultiLayerConfiguration
import org.deeplearning4j.nn.conf.NeuralNetConfiguration
import org.deeplearning4j.nn.conf.layers.DenseLayer
import org.deeplearning4j.nn.conf.layers.OutputLayer
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork
import org.deeplearning4j.nn.weights.WeightInit
import org.deeplearning4j.spark.impl.multilayer.SparkDl4jMultiLayer
import org.nd4j.linalg.lossfunctions.LossFunctions

object FeedForwardNetworkWithSpark {
  def main(args: Array[String]): Unit = {
    val recordReader: RecordReader = new CSVRecordReader(0, ",")
    val conf = new SparkConf()
      .setAppName("FeedForwardNetwork-Iris")
    val sc = new SparkContext(conf)
    val numInputs: Int = 4
    val outputNum = 3
    val iterations = 1
    val multiLayerConfig: MultiLayerConfiguration = new NeuralNetConfiguration.Builder()
      .seed(12345)
      .iterations(iterations)
      .optimizationAlgo(OptimizationAlgorithm.STOCHASTIC_GRADIENT_DESCENT)
      .learningRate(1e-1)
      .l1(0.01).regularization(true).l2(1e-3)
      .list(3)
      .layer(0, new DenseLayer.Builder().nIn(numInputs).nOut(3).activation("tanh").weightInit(WeightInit.XAVIER).build())
      .layer(1, new DenseLayer.Builder().nIn(3).nOut(2).activation("tanh").weightInit(WeightInit.XAVIER).build())
      .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).weightInit(WeightInit.XAVIER)
        .activation("softmax")
        .nIn(2).nOut(outputNum).build())
      .backprop(true).pretrain(false)
      .build
    val network: MultiLayerNetwork = new MultiLayerNetwork(multiLayerConfig)
    network.init
    network.setUpdater(null)
    val sparkNetwork: SparkDl4jMultiLayer = new SparkDl4jMultiLayer(sc, network)
    val nEpochs: Int = 6
    val listBuffer = new ListBuffer[Array[Float]]()
    (0 until nEpochs).foreach { i =>
      val net: MultiLayerNetwork = sparkNetwork.fit("/user/iris.txt", 4, recordReader)
      listBuffer += net.params.data.asFloat().clone()
    }
    println("Parameters vs. iteration output: ")
    (0 until listBuffer.size).foreach { i =>
      println(i + "\t" + listBuffer(i).mkString)
    }
  }
}

Here is my build.sbt file:

name := "hwapp"  version := "0.1"  scalaversion := "2.12.3"  librarydependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.0" % "provided" librarydependencies += "org.apache.spark" % "spark-mllib_2.10" % "1.6.0" % "provided" librarydependencies += "org.deeplearning4j" % "deeplearning4j-nlp" % "0.4-rc3.8" librarydependencies += "org.deeplearning4j" % "dl4j-spark" % "0.4-rc3.8" librarydependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.4-rc3.8" librarydependencies += "org.nd4j" % "nd4j-x86" % "0.4-rc3.8" % "test" librarydependencies += "org.nd4j" % "nd4j-api" % "0.4-rc3.8" librarydependencies += "org.nd4j" % "nd4j-jcublas-7.0" % "0.4-rc3.8" librarydependencies += "org.nd4j" % "canova-api" % "0.0.0.14" 

When I look at the code in IntelliJ, it does not show any error, but when I execute the application on the cluster I got this:

Exception in thread "main" java.lang.NoClassDefFoundError: org/deeplearning4j/nn/conf/layers/Layer

I don't know what it wants from me. Any help would be appreciated. Thanks.

I'm not sure how you came up with that list of versions (I'm assuming you picked them randomly? Please don't do that.)

You are using a 1.5-year-old version of the DL4J dependencies, plus dependencies a year older than that which don't exist anymore.

Start from scratch and follow our getting-started guide and examples, like you would for any other open source project.

Those can be found here: https://deeplearning4j.org/quickstart

With example projects here: https://github.com/deeplearnin4j/dl4j-examples

A few more things: Canova doesn't exist anymore and has been renamed DataVec for more than a year.
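As a minimal sketch of what that rename means for your code (the DataVec package paths below are assumed from the current DataVec API, so double-check them against the examples):

// Old (Canova, no longer published):
// import org.canova.api.records.reader.RecordReader
// import org.canova.api.records.reader.impl.CSVRecordReader

// New (DataVec) -- package paths assumed, verify against the dl4j-examples project:
import org.datavec.api.records.reader.RecordReader
import org.datavec.api.records.reader.impl.csv.CSVRecordReader

val recordReader: RecordReader = new CSVRecordReader(0, ",")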

All DL4J, DataVec, ND4J, ... versions must be the same.

If you are using any of our Scala modules like Spark, they must also have the same Scala version.

So mixing a Scala 2.12 project with Scala 2.10 dependencies is a Scala no-no (that's not DL4J specific).

DL4J supports Scala 2.11 at most, because Hadoop distros like CDH and Hortonworks don't support Scala 2.12 yet.

Edit: Another thing to watch out for that is DL4J specific is how we handle Spark versions. Spark 1 and 2 are supported. The artifact ID should be:

dl4j-spark_${your scala version} (usually 2.10 or 2.11), with a dependency version like: 0.9.1_spark_${your version of spark}
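As a concrete sketch in sbt (assuming Scala 2.11 and a Spark 2.x cluster; adjust both suffixes to match your environment):

// assuming Scala 2.11 and Spark 2.x
libraryDependencies += "org.deeplearning4j" % "dl4j-spark_2.11" % "0.9.1_spark_2"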

This is applicable to our NLP modules as well.

Edit for more folks who haven't followed our getting started (please do that, it's kept up to date): you need an ND4J backend, normally nd4j-native-platform, or maybe CUDA if you are using GPUs, with: nd4j-cuda-${your cuda version}-platform
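Putting the above points together, a minimal build.sbt along these lines might work for a CPU cluster (the Spark, Scala, and 0.9.1 DL4J versions here are assumptions for illustration; pick the ones matching your cluster and keep all DL4J/DataVec/ND4J versions identical):

// NOTE: versions below are assumptions for illustration, not the only valid combination
name := "hwapp"

version := "0.1"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0" % "provided"
libraryDependencies += "org.deeplearning4j" % "deeplearning4j-core" % "0.9.1"
libraryDependencies += "org.deeplearning4j" % "dl4j-spark_2.11" % "0.9.1_spark_2"
libraryDependencies += "org.datavec" % "datavec-api" % "0.9.1"
libraryDependencies += "org.nd4j" % "nd4j-native-platform" % "0.9.1"

For GPUs you would swap nd4j-native-platform for the nd4j-cuda-${your cuda version}-platform artifact mentioned above.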

