Divolte not writing to HDFS #271
Can you be a bit more specific about what isn't working? Do you see any errors?
Here is the error:
Then I added fs.file.impl = "org.apache.hadoop.fs.LocalFileSystem" and fs.hdfs.impl = "org.apache.hadoop.hdfs.DistributedFileSystem" to my hdfs configuration in Divolte (see the sketch below for where these properties sit), and now I'm getting a different error:
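For reference, a minimal sketch of where those two Hadoop client properties would sit in divolte-collector.conf, assuming the Divolte 2.x configuration layout (the `enabled` flag is shown only for context):

```hocon
# Sketch only: Hadoop client properties are passed through to the HDFS client
# from divolte.global.hdfs.client, assuming the Divolte 2.x configuration layout.
divolte.global.hdfs {
  enabled = true
  client {
    # fs.defaultFS goes in this same block (see the fuller sketch further down).
    fs.file.impl = "org.apache.hadoop.fs.LocalFileSystem"
    fs.hdfs.impl = "org.apache.hadoop.hdfs.DistributedFileSystem"
  }
}
```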
According to this thread (https://stackoverflow.com/questions/45460909/accessing-hdfs-in-java-throws-error) there is an issue with dependency versions in Divolte, but I'm not sure how to change that in Divolte...
This pull request may help: #244
My Kafka sink is working but my HDFS sink is not. I'm using HDFS 2.0, so that might be why? I've got Divolte running in a Docker container and a Hadoop cluster running in the same docker-compose network, which I got from https://github.com/big-data-europe/docker-hadoop
Here are the relevant parts of my divolte-collector.conf (some parts stripped for brevity):
For fs.DEFAULT_FS, I've tried hdfs://localhost:9870 and hdfs://namenode:9870 (namenode is the name of the HDFS namenode container running in the same Docker network).
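As a point of comparison, here is a minimal sketch of an HDFS sink in divolte-collector.conf, assuming the Divolte 2.x configuration layout; the directory names, mapping name, and NameNode address are illustrative assumptions rather than values from the original config. Note that 9870 is typically the NameNode web UI (HTTP) port, while the Hadoop property (spelled fs.defaultFS) is normally pointed at the NameNode RPC port, commonly 8020 or 9000 in Docker setups:

```hocon
# Sketch only, assuming the Divolte 2.x configuration layout.
# All paths, names, and the NameNode address are illustrative assumptions.
divolte {
  global {
    hdfs {
      enabled = true
      client {
        # Use the NameNode RPC port here, not the 9870 web UI port.
        fs.defaultFS = "hdfs://namenode:8020"
      }
    }
  }

  sources {
    browser {
      type = browser
    }
  }

  sinks {
    hdfs {
      type = hdfs
      file_strategy {
        working_dir = "/divolte/inflight"
        publish_dir = "/divolte/published"
        roll_every = 60 minutes
        sync_file_after_records = 1000
        sync_file_after_duration = 30 seconds
      }
    }
  }

  mappings {
    my_mapping {
      sources = [browser]
      sinks = [hdfs, kafka]
    }
  }
}
```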