Hello,

After executing the kylin-deploy-single command from the ambari-functions file, I can access the Ambari interface via http://<container_ip>:8080, but when starting the services I get this error at the DataNode start step:

```
Fail: Execution of 'ulimit -c unlimited; su -s /bin/bash - hdfs -c 'export HADOOP_LIBEXEC_DIR=/usr/lib/hadoop/libexec && /usr/lib/hadoop/sbin/hadoop-daemon.sh --config /etc/hadoop/conf stop datanode'' returned 127. -bash: /usr/lib/hadoop/sbin/hadoop-daemon.sh: No such file or directory
```

The file hadoop-daemon.sh does not exist at that path; it is actually under /usr/hdp//hadoop/. The same problem occurs for all the other services: YARN, Hive, HBase, etc.

How can I fix this so that Ambari points to the right HDP location?
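One untested workaround I can think of is to symlink the legacy /usr/lib paths that Ambari expects to the actual HDP install tree (assuming the standard HDP 2.x layout under /usr/hdp/<version>), for example:

```bash
# Untested workaround sketch: link the legacy /usr/lib/<component> paths
# that Ambari's scripts hardcode to the HDP 2.x install tree under /usr/hdp.
# The component list is illustrative, not exhaustive; extend it as needed.
HDP_VERSION=$(ls /usr/hdp | grep -v current | head -n 1)  # e.g. a version dir
for component in hadoop hadoop-hdfs hadoop-yarn hadoop-mapreduce hive hbase; do
  ln -sfn "/usr/hdp/${HDP_VERSION}/${component}" "/usr/lib/${component}"
done
```

But I would prefer a proper fix that tells Ambari where HDP is actually installed.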
Thanks.
Hi imanis, are you using the Docker image of Kylin 0.6? That image is out of date; we are now building a new image with the Kylin 0.7.2 release, but I ran into an Ambari problem (the Hive server could not start up due to a MySQL security issue). Once the new image is ready, we will announce it to users. In the meantime, if you want to try Kylin, an HDP Sandbox VM is much easier to adopt than Docker; please refer to https://kylin.incubator.apache.org/docs/install/index.html. Thanks!