Spark Kafka security with Kerberos


I am trying to use Kafka (0.9.1) in secure mode. To read data from Spark, I must pass the JAAS config file to the JVM. I use this command to start the job:

    /opt/spark/bin/spark-submit -v --master spark://master1:7077 \
      --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.conf=kafka_client_jaas.conf" \
      --files "./conf/kafka_client_jaas.conf,./conf/kafka.client.1.keytab" \
      --class kafka.ConsumerSasl ./kafka.jar --topics test

I still get the same error:

    Caused by: java.lang.IllegalArgumentException: Must pass java.security.auth.login.config in secure mode.
        at org.apache.kafka.common.security.kerberos.Login.login(Login.java:289)
        at org.apache.kafka.common.security.kerberos.Login.<init>(Login.java:104)
        at org.apache.kafka.common.security.kerberos.LoginManager.<init>(LoginManager.java:44)
        at org.apache.kafka.common.security.kerberos.LoginManager.acquireLoginManager(LoginManager.java:85)
        at org.apache.kafka.common.network.SaslChannelBuilder.configure(SaslChannelBuilder.java:55)

I think Spark is not injecting the parameter -Djava.security.auth.login.conf into the JVM!

The main cause of the issue is that you have used the wrong property name. It should be java.security.auth.login.config, not -Djava.security.auth.login.conf. If you are using a keytab file, make sure it is available on the executors via the --files argument of spark-submit. If you are using a Kerberos ticket cache instead, make sure KRB5CCNAME is set on the executors using SPARK_YARN_USER_ENV.
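With that fix applied, the submit command from the question would look roughly like this (a sketch: the file names and class are taken from the question, and the ./ prefix assumes --files places the JAAS file and keytab in each executor's working directory; the driver option is only needed if the driver also creates a consumer):

```shell
/opt/spark/bin/spark-submit -v --master spark://master1:7077 \
  --conf "spark.executor.extraJavaOptions=-Djava.security.auth.login.config=./kafka_client_jaas.conf" \
  --conf "spark.driver.extraJavaOptions=-Djava.security.auth.login.config=./conf/kafka_client_jaas.conf" \
  --files "./conf/kafka_client_jaas.conf,./conf/kafka.client.1.keytab" \
  --class kafka.ConsumerSasl ./kafka.jar --topics test
```

Note the two corrections versus the original command: the camel-cased spark.executor.extraJavaOptions key, and the java.security.auth.login.config system property name.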

If you are using an older version of Spark (1.6.x or earlier), there are known issues where the integration does not work, and you have to write a custom receiver.

For Spark 1.8 and later, you can see the configuration here.
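For reference, the configuration mainly amounts to a JAAS file plus a couple of consumer properties. A minimal sketch of what the kafka_client_jaas.conf from the question might contain, assuming keytab-based login (the principal is a hypothetical example; keyTab points at the file shipped with --files):

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="./kafka.client.1.keytab"
  principal="kafka-client@EXAMPLE.COM";  // hypothetical principal, replace with your own
};
```

The Kafka consumer itself then also needs security.protocol=SASL_PLAINTEXT (or SASL_SSL) and sasl.kerberos.service.name=kafka set in its properties so that it attempts the SASL/Kerberos handshake.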

In case you need to create a custom receiver, you can see this.

