Apache Spark - PySpark streaming job unable to read messages from RabbitMQ queue


import sys

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from mqtt import MQTTUtils

if __name__ == "__main__":
    if len(sys.argv) != 3:
        print >> sys.stderr, "Usage: mqtt_wordcount.py <broker url> <topic>"
        exit(-1)

    sc = SparkContext(appName="PySparkStreamingRabbitMQWordCount")
    ssc = StreamingContext(sc, 10)

    brokerUrl = sys.argv[1]
    topic = sys.argv[2]

    lines = MQTTUtils.createStream(ssc, brokerUrl, topic)
    counts = lines.flatMap(lambda line: line.split(" ")) \
        .map(lambda word: (word, 1)) \
        .reduceByKey(lambda a, b: a + b)
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()
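For reference, the flatMap → map → reduceByKey chain above computes per-word counts for each batch. The same logic on a plain Python list (no Spark needed, just to show the expected output shape) looks like this:

```python
from collections import defaultdict

def word_count(lines):
    # flatMap: split each line into individual words
    words = [w for line in lines for w in line.split(" ")]
    # map + reduceByKey: emit (word, 1) and sum the 1s per word
    counts = defaultdict(int)
    for w in words:
        counts[w] += 1
    return dict(counts)

print(word_count(["hello world", "hello rabbitmq"]))
# {'hello': 2, 'world': 1, 'rabbitmq': 1}
```

So if messages were arriving, each 10-second batch printed by `counts.pprint()` should contain pairs like these.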

The following command runs the Spark job:

spark2-submit --master yarn --deploy-mode client  --executor-cores 2 --executor-memory 1g --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.2.0 sparkrabbitmq.py tcp://<hostname>:1883 test 

This code connects to the RabbitMQ server and should print messages on the console, and it runs without errors. I created a new queue named test and am sending text messages from the RabbitMQ web UI; the guest user has admin permissions. But when I run the Spark streaming job I don't see any output on the console, and there are no errors either. The job just keeps running.
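One way to isolate whether the problem is on the Spark side or the broker side (assuming the mosquitto command-line clients are installed on some machine that can reach the broker) is to subscribe to the same topic directly:

```shell
# subscribe to topic "test" on the MQTT port; -v also prints the topic name
mosquitto_sub -h <hostname> -p 1883 -t test -v
```

If messages published from the web UI show up here, the broker side is fine and the issue is in the Spark job; if nothing appears here either, the messages are not reaching the MQTT topic in the first place.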

I have enabled the RabbitMQ plugin as well.
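For completeness, these are the checks I would expect to pass on the broker host (command names assume a default RabbitMQ install with the MQTT plugin):

```shell
# enable the MQTT plugin (no-op if it is already enabled)
rabbitmq-plugins enable rabbitmq_mqtt

# confirm the plugin is listed as enabled
rabbitmq-plugins list | grep mqtt

# confirm something is listening on the default MQTT port 1883
netstat -tlnp | grep 1883
```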

Environment: CDH 5.12, Spark 2

