Sahara Spark cluster creation fails with keytool error

asked 2017-01-13 03:33:03 -0600

mj1001

updated 2017-01-24 14:31:16 -0600

rbowen

When attempting to create a cluster, creation fails with a keytool error:

Creating cluster failed for the following reason(s): An error occurred in thread 'configure-ssl-cert-35092d42-9149-41f2-9c83-6765df9bf004':
RemoteCommandException: Error during command execution:
"sudo su - -c "keytool -import -alias sahara-0 -keystore `cut -f2 -d \"=\" /etc/profile.d/99-java.sh | head -1`/lib/security/cacerts -file /tmp/cert.pem -noprompt -storepass changeit""
Return code: 1
STDERR: stdin: is not a tty
STDOUT: keytool error: java.lang.Exception: Input not an X.509 certificate
Error ID: fd44fc55-ab1f-4e20-985f-edcb432af353
Error ID: 428aebc9-8faf-48cd-bdef-352a204c92ee

Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/sahara/context.py", line 172, in _wrapper
    func(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/sahara/swift/swift_helper.py", line 101, in _install_ssl_certs
    r.execute_command(register_cmd % idx)
  File "/usr/lib/python2.7/dist-packages/sahara/utils/ssh_remote.py", line 748, in execute_command
    get_stderr, raise_when_error)
  File "/usr/lib/python2.7/dist-packages/sahara/utils/ssh_remote.py", line 820, in _run_s
    return self._run_with_log(func, timeout, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/sahara/utils/ssh_remote.py", line 671, in _run_with_log
    return self._run(func, *args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/sahara/utils/ssh_remote.py", line 816, in _run
    return procutils.run_in_subprocess(self.proc, func, args, kwargs)
  File "/usr/lib/python2.7/dist-packages/sahara/utils/procutils.py", line 57, in run_in_subprocess
    raise exceptions.SubprocessException(result['exception'])
SubprocessException: RemoteCommandException: Error during command execution:
"sudo su - -c "keytool -import -alias sahara-0 -keystore `cut -f2 -d \"=\" /etc/profile.d/99-java.sh | head -1`/lib/security/cacerts -file /tmp/cert.pem -noprompt -storepass changeit""
Return code: 1
STDERR: stdin: is not a tty
STDOUT: keytool error: java.lang.Exception: Input not an X.509 certificate
Error ID: fd44fc55-ab1f-4e20-985f-edcb432af353
Error ID: 428aebc9-8faf-48cd-bdef-352a204c92ee
Error ID: ca5361f3-2bcc-4388-9017-ac9248478830
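The "Input not an X.509 certificate" message means keytool could not parse /tmp/cert.pem on the instance, which usually indicates the file Sahara copied there is not a PEM or DER certificate at all (for example an empty file or an HTML error page). A minimal sketch of how to check this, using a throwaway self-signed certificate as a stand-in for /tmp/cert.pem (the /tmp/demo.* paths and the CN=demo subject are illustrative assumptions, not from my setup):

```shell
# Create a throwaway self-signed cert to demonstrate the check; on the
# failing instance you would point openssl at /tmp/cert.pem instead.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /tmp/demo.key \
  -out /tmp/demo.pem -days 1 -subj "/CN=demo" 2>/dev/null

# A well-formed X.509 certificate parses cleanly and prints its subject;
# a file that is not actually a certificate fails here the same way it
# fails inside keytool.
openssl x509 -in /tmp/demo.pem -noout -subject
```

If /tmp/cert.pem fails this check on the instance, inspecting its actual contents (e.g. `head /tmp/cert.pem`) should show what was delivered in place of the certificate.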

I have been unable to find anything on Google that relates to this error.

To replicate this, add the Spark image available at http://sahara-files.mirantis.com/imag.... Create two node group templates: the first with both the namenode and spark master processes, the second with both the datanode and spark worker processes. Next, create a cluster template that spins up 1 instance of the namenode/spark master template and 5 instances of the datanode/spark worker template.

Note that I get the same error when creating a vanilla Hadoop cluster.
