Kafka-Connect-Hdfs - Couldn't start HdfsSinkConnector


I've downloaded Kafka Connect from the Confluent quickstart: http://docs.confluent.io/2.0.0/quickstart.html#quickstart

I'm trying to run the HDFS connector. Here are my settings:

connect-standalone.properties:

bootstrap.servers=lvpi00658.s:9092,lvpi00659.s:9092,lvpi00660.s:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.storage.StringConverter
offset.storage.file.filename=/tmp/connect.offsets
# flush faster than normal, useful for testing/debugging
offset.flush.interval.ms=10000
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer

and

quickstart-hdfs.properties:

name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=eightball-stuff11
hdfs.url=hdfs://localhost:9000
flush.size=3

I run the HDFS connector like this:
cd /home/fclvappi005561/confluent-3.0.0/bin
./connect-standalone ../etc/kafka-connect-hdfs/connect-standalone.properties ../etc/kafka-connect-hdfs/quickstart-hdfs.properties

but I get this error:

[2016-09-12 17:19:28,039] INFO Couldn't start HdfsSinkConnector: (io.confluent.connect.hdfs.HdfsSinkTask:72)
org.apache.kafka.connect.errors.ConnectException: org.apache.hadoop.security.AccessControlException: Permission denied: user=lvpi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:202)
    at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:64)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:207)
    at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:139)
    at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:140)
    at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:175)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=fclvappi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
    at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2755)
    at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
    at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
    at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1817)
    at io.confluent.connect.hdfs.storage.HdfsStorage.mkdirs(HdfsStorage.java:61)
    at io.confluent.connect.hdfs.DataWriter.createDir(DataWriter.java:369)
    at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:170)
    ... 10 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=fclvappi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
    at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
    at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
    at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
    at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
    at org.apache.hadoop.ipc.Client.call(Client.java:1468)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy47.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy48.mkdirs(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
    ... 20 more

I should mention that I'm running a Docker image of Hadoop locally at 127.0.0.1:
docker run -d -p 9000:9000 sequenceiq/hadoop-docker:2.7.1
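
To see what HDFS actually contains there, I can look inside that container. This is just a sketch; it assumes the sequenceiq image keeps Hadoop under /usr/local/hadoop, and <container-id> is a placeholder:

docker ps                               # find the id of the hadoop-docker container
docker exec -it <container-id> bash
/usr/local/hadoop/bin/hdfs dfs -ls /    # check the owner and permissions Connect is running into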

What is this permission denied error I'm seeing? It refers to a different host than the ones listed under bootstrap.servers.

The permission denied error is on the HDFS side. The directory "/topics" is owned by "root" (root:supergroup, drwxr-xr-x), so the user running Kafka Connect doesn't have write access to it.
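
One way to get past it, as a rough sketch: create the directories the connector writes to and open up their permissions from inside the Hadoop container. This assumes the sequenceiq image from above (Hadoop under /usr/local/hadoop, docker exec running as root, which is the HDFS superuser here), that <container-id> is a placeholder, and that /topics and /logs correspond to the connector's default topics.dir and logs.dir:

docker exec -it <container-id> bash
# inside the container, as the HDFS superuser:
/usr/local/hadoop/bin/hdfs dfs -mkdir -p /topics /logs
/usr/local/hadoop/bin/hdfs dfs -chmod -R 777 /topics /logs   # or chown them to the user running Connect

Alternatively, since this Hadoop setup uses simple (non-Kerberos) authentication, the client identity is taken from the HADOOP_USER_NAME environment variable, so exporting HADOOP_USER_NAME=root before starting connect-standalone should also let the writes go through as root.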

