
Enable Spark Plugin

asked 2014-08-11 17:02:41 -0600 by Nastooh

updated 2014-08-12 10:46:07 -0600

I am trying to enable the Spark plugin, with not much luck. Updating the "plugins" variable in sahara.conf doesn't seem to have any effect. Running "tox -e venv -- sahara-api --config-file etc/sahara/sahara.conf --debug" after adding "spark" to the default array in plugins/base.py returns the following error:

    2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'vanilla' loaded (sahara.plugins.vanilla.plugin:VanillaProvider)
    2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'idh' loaded (sahara.plugins.intel.plugin:IDHProvider)
    2014-08-11 21:48:37.285 19329 INFO sahara.plugins.base [-] Plugin 'hdp' loaded (sahara.plugins.hdp.ambariplugin:AmbariPlugin)
    2014-08-11 21:48:37.285 19329 CRITICAL sahara [-] RuntimeError: Plugins couldn't be loaded: spark
    2014-08-11 21:48:37.285 19329 TRACE sahara Traceback (most recent call last):
    2014-08-11 21:48:37.285 19329 TRACE sahara   File ".tox/venv/bin/sahara-api", line 14, in <module>
    2014-08-11 21:48:37.285 19329 TRACE sahara     sys.exit(main())
    2014-08-11 21:48:37.285 19329 TRACE sahara   File "/home/navesta/OpenStack/sahara/sahara/cli/sahara_api.py", line 60, in main
    2014-08-11 21:48:37.285 19329 TRACE sahara     app = server.make_app()
    2014-08-11 21:48:37.285 19329 TRACE sahara   File "/home/navesta/OpenStack/sahara/sahara/main.py", line 107, in make_app
    2014-08-11 21:48:37.285 19329 TRACE sahara     plugins_base.setup_plugins()
    2014-08-11 21:48:37.285 19329 TRACE sahara   File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 139, in setup_plugins
    2014-08-11 21:48:37.285 19329 TRACE sahara     PLUGINS = PluginManager()
    2014-08-11 21:48:37.285 19329 TRACE sahara   File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 87, in __init__
    2014-08-11 21:48:37.285 19329 TRACE sahara     self._load_cluster_plugins()
    2014-08-11 21:48:37.285 19329 TRACE sahara   File "/home/navesta/OpenStack/sahara/sahara/plugins/base.py", line 112, in _load_cluster_plugins
    2014-08-11 21:48:37.285 19329 TRACE sahara     % ", ".join(requested_plugins - loaded_plugins))
    2014-08-11 21:48:37.285 19329 TRACE sahara RuntimeError: Plugins couldn't be loaded: spark
    2014-08-11 21:48:37.285 19329 TRACE sahara
ERROR: InvocationError: '/home/navesta/OpenStack/sahara/.tox/venv/bin/sahara-api --config-file etc/sahara/sahara.conf --debug'
_____________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR:   venv: commands failed
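The final RuntimeError in the traceback comes from a set difference between the plugins requested in the configuration and the plugins that were actually found as entry points. A minimal sketch of that guard (simplified and abbreviated from sahara/plugins/base.py; the helper name and literal plugin sets here are illustrative, not the real code):

```python
def check_plugins(requested_plugins, loaded_plugins):
    """Simplified mirror of the guard in sahara/plugins/base.py:
    any plugin named in the config but not actually loaded ends up
    in the set difference and aborts startup."""
    missing = requested_plugins - loaded_plugins
    if missing:
        raise RuntimeError(
            "Plugins couldn't be loaded: %s" % ", ".join(sorted(missing)))

# Per the log above, only these three plugins were loaded on Icehouse:
loaded = {"vanilla", "idh", "hdp"}

try:
    check_plugins({"vanilla", "spark"}, loaded)
except RuntimeError as exc:
    print(exc)  # Plugins couldn't be loaded: spark
```

So adding "spark" to the requested list can only ever reach this error on a branch where no spark plugin is registered, no matter what sahara.conf says.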

Running "tox -e venv -- sahara-all --config-file etc/sahara/sahara.conf --debug" returns the following error messages:

venv develop-inst-nodeps: /home/navesta/OpenStack/sahara
venv runtests: commands[0] | sahara-all --config-file etc/sahara/sahara.conf --debug
Traceback (most recent call last):
  File ".tox/venv/bin/sahara-all", line 6, in <module>
    from sahara.cli.sahara_all import main
  File "/home/navesta/OpenStack/sahara/sahara/cli/sahara_all.py", line 19, in <module>
AttributeError: 'module' object has no attribute 'patch_all'
ERROR: InvocationError: '/home/navesta/OpenStack/sahara/.tox/venv/bin/sahara-all --config-file etc/sahara/sahara.conf --debug'
_____________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR:   venv: commands failed

I am on the icehouse branch of git:

navesta@UbuntuMaas:~/OpenStack/sahara$ tox -e venv -- sahara-api --version
venv develop-inst-nodeps: /home/navesta/OpenStack/sahara
venv runtests: commands[0] | sahara-api --version
2014.1.2
_____________________________________________________________________________________ summary ______________________________________________________________________________________
  venv: commands succeeded
  congratulations :)
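For reference, the configuration edit being attempted is the plugins list in sahara.conf. On master (where the Spark plugin exists) it would look something like the following sketch; the exact plugin names available depend on the release:

```ini
[DEFAULT]
# Comma-separated list of cluster plugins Sahara should load at startup.
# Every name listed here must have a matching entry point, or startup
# fails with "Plugins couldn't be loaded".
plugins=vanilla,hdp,spark
```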

Any help would be ...


Comments

I recommend not performing 'pip install' manually. tox uses its own virtual envs (one can be found in .tox/venv) and automatically installs all the requirements there. That is, steps #3 and #4 in your 'Edit 1' section can safely be omitted.

dmitrymex ( 2014-08-12 11:06:13 -0600 )

Strange; it kept complaining that it couldn't find the db module in oslo. (Time allowing, I will test again on a clean env.)

Nastooh ( 2014-08-12 13:27:25 -0600 )

Any recommendation on how to proceed with building a Spark-compatible image? I am following http://docs.openstack.org/developer/s... . Would "diskimage-create.sh -p spark" suffice?

Nastooh ( 2014-08-12 13:34:43 -0600 )

I know that diskimage-create.sh does support the Spark plugin, but I don't know more. Consider creating another question to bring more attention, or it may be easier and faster to ask via IRC in #openstack-sahara on freenode.

dmitrymex ( 2014-08-13 05:09:37 -0600 )

"diskimage-create.sh -p spark" will create an image for Spark.

alazarev ( 2014-08-13 09:56:50 -0600 )

1 answer


answered 2014-08-12 05:43:59 -0600 by dmitrymex

You face two problems:

1. The Spark plugin is not available in Icehouse; it will first be released in Juno. Right now it can only be used from the master branch.

2. The sahara-all binary appeared only in Juno. In Icehouse the equivalent binary is called sahara-api, and that is what you should invoke. I think you can still invoke sahara-all only because you have .pyc files left over from your previous attempts with master.
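If that diagnosis is right, clearing the stale bytecode and switching back to the Icehouse entry point should be enough. A sketch, run from the root of the sahara checkout (the tox invocation is the same as in the question, just with sahara-api):

```shell
# Delete stale .pyc files left over from earlier runs against master;
# a leftover compiled sahara_all module is what keeps sahara-all importable.
find . -name '*.pyc' -delete

# Then use the Icehouse entry point (sahara-api, not sahara-all):
#   tox -e venv -- sahara-api --config-file etc/sahara/sahara.conf --debug
```

The find command is safe to rerun; tox will regenerate the bytecode on the next invocation.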


Comments

Is there any way to update sahara in Icehouse so Spark can be run on it?

Ignacio Mulas ( 2014-10-02 11:07:22 -0600 )

There is no official way, but you can try to backport the code manually. The provisioning part didn't change in Juno, but the EDP part changed a lot. So if you need just provisioning, it should not be hard to backport.

alazarev ( 2014-10-02 11:13:37 -0600 )

It's doable (and I wish now that I'd written down the procedure). After installing the Hadoop plugin, I followed https://wiki.openstack.org/wiki/Sahar... , http://docs.openstack.org/developer/s... , and the links within to create a Spark 1.0.2 plugin.

Nastooh ( 2014-10-02 11:25:47 -0600 )

