hadoop · bigdata · ambari

Install DataNode by Ambari


I have

OS: Red Hat Enterprise Linux Server release 7.4 (Maipo)
Ambari Version: 2.5.1.0
HDP: 2.6

After the deployment of components finished, 2 DataNodes could not start. Attempting to start them returned this error:

  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
    raise ExecutionFailed(err_msg, code, out, err)
resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh --config /usr/hdp/current/hadoop-client/conf start datanode'' returned 127. -bash: /usr/hdp/current/hadoop-client/sbin/hadoop-daemon.sh: No such file or directory

I tried deleting the component and reinstalling it through Ambari.

The install completed without errors:

2018-02-27 20:47:31,550 - Execute['ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`'] {'only_if': 'ls -d /usr/hdp/2.6*'}
2018-02-27 20:47:31,554 - Skipping Execute['ambari-sudo.sh /usr/bin/hdp-select set all `ambari-python-wrap /usr/bin/hdp-select versions | grep ^2.6 | tail -1`'] due to only_if
2018-02-27 20:47:31,554 - FS Type: 
2018-02-27 20:47:31,554 - XmlConfig['core-site.xml'] {'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {u'final': {u'fs.defaultFS': u'true'}}, 'owner': 'hdfs', 'only_if': 'ls /usr/hdp/current/hadoop-client/conf', 'configurations': ...}
2018-02-27 20:47:31,568 - Generating config: /usr/hdp/current/hadoop-client/conf/core-site.xml
2018-02-27 20:47:31,569 - File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
2018-02-27 20:47:31,583 - Could not load 'version' from /var/lib/ambari-agent/data/structured-out-3374.json

Command completed successfully!

BUT starting it again shows the same error. I checked the folder /usr/hdp/current/hadoop-client/ and the new files, for example sbin/hadoop-daemon.sh, did not appear.
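A quick way to check this on the node is a hedged sketch like the following; the helper name is mine, and the path comes from the error message above:

```shell
# Hypothetical helper: report whether a hadoop-client directory actually
# contains the daemon script the start command is looking for.
check_daemon_script() {
  if [ -x "$1/sbin/hadoop-daemon.sh" ]; then
    echo "present"
  else
    echo "missing"
  fi
}

# On the affected node (path taken from the error message):
# check_daemon_script /usr/hdp/current/hadoop-client
# ls -l /usr/hdp/current/hadoop-client   # is "current" a symlink, and to where?
```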

How can I redeploy the DataNode component through Ambari?


Solution

  • I'd guess the issue is caused by broken symlinks under /usr/hdp. You could even try to fix them manually; the structure is simple enough. Though the issue does not sound like a common one after a plain stack deployment.

    Are you running Ambari with a non-root/custom user? Maybe Ambari does not have sufficient permissions? See https://docs.hortonworks.com/HDPDocuments/Ambari-2.6.0.0/bk_ambari-security/content/how_to_configure_ambari_server_for_non-root.html

    Ambari 2.5.1.0 is considerably outdated, so it would make sense to upgrade Ambari and see whether that helps. Also, if you want to wipe everything out, see https://github.com/hortonworks/HDP-Public-Utilities/blob/master/Installation/cleanup_script.sh

    Also, it may be more productive to ask Ambari-related questions at https://community.hortonworks.com/
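The manual symlink fix from the first bullet can be sketched like this. This is a hedged sketch: the helper name is mine, and the version string 2.6.1.0-129 is only an example — normally `hdp-select set all <version>`, as seen in the install log above, does this relinking for you:

```shell
# Hypothetical helper: repoint the "current" symlink of one component to a
# versioned install directory, mimicking what hdp-select does.
relink_hdp_component() {
  # $1 = HDP root (normally /usr/hdp), $2 = stack version,
  # $3 = versioned component dir, $4 = link name under current/
  ln -sfn "$1/$2/$3" "$1/current/$4"
}

# Example on the broken node (take the real version from: ls /usr/hdp):
# relink_hdp_component /usr/hdp 2.6.1.0-129 hadoop hadoop-client
```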