Enable Spark Plugin
I am trying to enable the Spark plugin, without much luck. Updating the "plugins" variable in sahara.conf doesn't seem to have any effect. Running "tox -e venv -- sahara-api --config-file etc/sahara/sahara.conf --debug" after adding "spark" to the default array in plugins/base.py fails with the following error:
2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'vanilla' loaded (sahara.plugins.vanilla.plugin:VanillaProvider)
2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'idh' loaded (sahara.plugins.intel.plugin:IDHProvider)
2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'hdp' loaded (sahara.plugins.hdp.ambariplugin:AmbariPlugin)
2014-08-11 21:48:37.285 19329 CRITICAL sahara [-] RuntimeError: Plugins couldn't be loaded: spark
2014-08-11 21:48:37.285 19329 TRACE sahara Traceback (most recent call last):
2014-08-11 21:48:37.285 19329 TRACE sahara File ".tox/venv/bin/sahara-api", line 14, in <module>
2014-08-11 21:48:37.285 19329 TRACE sahara sys.exit(main())
2014-08-11 21:48:37.285 19329 TRACE sahara File "/home/navesta/OpenStack/sahara/sahara/cli/sahara_api.py", line 60, in main
2014-08-11 21:48:37.285 19329 TRACE sahara app = server.make_app()
2014-08-11 21:48:37.285 19329 TRACE sahara File "/home/navesta/OpenStack/sahara/sahara/main.py", line 107, in make_app
2014-08-11 21:48:37.285 19329 TRACE sahara plugins_base.setup_plugins()
2014-08-11 21:48:37.285 19329 TRACE sahara File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 139, in setup_plugins
2014-08-11 21:48:37.285 19329 TRACE sahara PLUGINS = PluginManager()
2014-08-11 21:48:37.285 19329 TRACE sahara File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 87, in __init__
2014-08-11 21:48:37.285 19329 TRACE sahara self._load_cluster_plugins()
2014-08-11 21:48:37.285 19329 TRACE sahara File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 112, in _load_cluster_plugins
2014-08-11 21:48:37.285 19329 TRACE sahara % ", ".join(requested_plugins - loaded_plugins))
2014-08-11 21:48:37.285 19329 TRACE sahara RuntimeError: Plugins couldn't be loaded: spark
2014-08-11 21:48:37.285 19329 TRACE sahara
ERROR: InvocationError: '/home/navesta/OpenStack/sahara/.tox/venv/bin/sahara-api --config-file etc/sahara/sahara.conf --debug'
_____________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR: venv: commands failed
Running "tox -e venv -- sahara-all --config-file etc/sahara/sahara.conf --debug" returns the following error:
venv develop-inst-nodeps: /home/navesta/OpenStack/sahara
venv runtests: commands[0] | sahara-all --config-file etc/sahara/sahara.conf --debug
Traceback (most recent call last):
File ".tox/venv/bin/sahara-all", line 6, in <module>
from sahara.cli.sahara_all import main
File "/home/navesta/OpenStack/sahara/sahara/cli/sahara_all.py", line 19, in <module>
AttributeError: 'module' object has no attribute 'patch_all'
ERROR: InvocationError: '/home/navesta/OpenStack/sahara/.tox/venv/bin/sahara-all --config-file etc/sahara/sahara.conf --debug'
_____________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR: venv: commands failed
I am on the icehouse branch of the git repository:
navesta@UbuntuMaas:~/OpenStack/sahara$ tox -e venv -- sahara-api --version
venv develop-inst-nodeps: /home/navesta/OpenStack/sahara
venv runtests: commands[0] | sahara-api --version
2014.1.2
_____________________________________________________________________________________ summary ______________________________________________________________________________________
venv: commands succeeded
congratulations :)
Any help would be appreciated.
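For reference, this is the sort of configuration I am changing. Plugin names set here must also exist as stevedore entry points in the installed sahara package, which is what the "Plugins couldn't be loaded" check verifies; the spark entry-point path below is an assumption based on the plugin layout, so check your branch:

```
# etc/sahara/sahara.conf -- enable the plugin here rather than
# editing the default list in plugins/base.py
[DEFAULT]
plugins=vanilla,hdp,spark

# setup.cfg in the sahara source tree -- every name listed above needs
# a matching entry point (class path below is an assumption). After
# editing, re-run tox so the venv's egg-info is regenerated and
# stevedore can see the new entry point.
# [entry_points]
# sahara.cluster.plugins =
#     spark = sahara.plugins.spark.plugin:SparkProvider
```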
I recommend not performing 'pip install' manually. tox uses its own virtual envs (one can be found in .tox/venv) and automatically installs all the requirements there, i.e. steps #3 and #4 in your 'Edit 1' section can safely be omitted.
Strange; it kept complaining that it couldn't find the db module in oslo. (Time allowing, I will test again in a clean environment.)
Any recommendation on how to proceed with building a Spark-compatible image? I am following http://docs.openstack.org/developer/s... . Would "diskimage-create.sh -p spark" suffice?
I know that diskimage-create.sh does support the spark plugin, but I don't know more. Consider creating another question to bring more attention, or it may be easier and faster to ask via IRC in #openstack-sahara on freenode.
"diskimage-create.sh -p spark" will create an image for Spark.
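For completeness, a sketch of the full build. The repository URL and script location are assumptions based on the sahara-image-elements project; verify them against the image guide linked above for your release:

```shell
# Hedged sketch: build a Spark-enabled image with sahara-image-elements.
git clone https://github.com/openstack/sahara-image-elements.git
cd sahara-image-elements
./diskimage-create.sh -p spark   # -p selects the target plugin
```

The build uses diskimage-builder under the hood, so expect it to download a base cloud image and take a while on the first run.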