Savanna - Failed to create database


I have OpenStack Grizzly set up with a controller node and two compute nodes. All systems run Ubuntu 12.04 64-bit.

To set it up, I followed this guide:

Now I am trying to integrate Savanna 0.3 into my setup, and I used this installation guide:

I have set up Savanna in a virtual environment.

When I try to run Savanna with the following command, it fails to create the database:

$ savanna-venv/bin/python savanna-venv/bin/savanna-api --config-file savanna-venv/etc/savanna.conf

The following are excerpts from the log:

2014-02-24 17:32:35.697 18858 ERROR savanna.db.sqlalchemy.api [-] Database registration exception: (OperationalError) unable to open database file None None
....
2014-02-24 17:32:35.708 18858 CRITICAL savanna [-] Failed to create database!
....
2014-02-24 17:32:35.708 18858 TRACE savanna     raise RuntimeError('Failed to create database!')
2014-02-24 17:32:35.708 18858 TRACE savanna RuntimeError: Failed to create database!
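For what it's worth, the "unable to open database file" message is what SQLite itself reports when the parent directory of the database file does not exist (SQLite can create the file, but not missing directories). A minimal sketch reproducing the message, using a deliberately nonexistent directory:

```python
import os
import sqlite3
import tempfile

# Build a path whose parent directory does not exist; SQLite cannot
# create intermediate directories, so connect() fails with the same
# "unable to open database file" error seen in the Savanna log above.
missing_path = os.path.join(tempfile.mkdtemp(), "no_such_subdir", "db.sqlite")

try:
    sqlite3.connect(missing_path)
except sqlite3.OperationalError as e:
    print(e)  # unable to open database file
```

This suggests the path in the [database] connection string below may be the culprit.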

Here is my savanna.conf configuration file:

# Hostname or IP address that will be used to listen on (string value)

# Port that will be used to listen on (integer value)

# Address and credentials that will be used to check auth tokens
os_auth_host=
os_auth_port=35357
os_admin_username=admin
os_admin_password=password
os_admin_tenant_name=service

# If set to True, Savanna will use floating IPs to communicate with
# instances. To make sure that all instances have floating IPs assigned
# in Nova Network set "auto_assign_floating_ip=True" in nova.conf. If
# Neutron is used for networking, make sure that all Node Groups have
# "floating_ip_pool" parameter defined. (boolean value)

# Use Neutron or Nova Network (boolean value)

# Maximum length of job binary data in kilobytes that may be stored or
# retrieved in a single operation (integer value)

# Postfix for storing jobs in hdfs. Will be added to /user/hadoop/
# (string value)

# Enables Savanna to use Keystone API v3. If that flag is disabled,
# per-job clusters will not be terminated automatically. (boolean value)

# Enable periodic tasks (boolean value)

# Enables data locality for hadoop cluster. Also enables data locality
# for Swift used by hadoop. If enabled, 'compute_topology' and
# 'swift_topology' configuration parameters should point to OpenStack
# and Swift topology correspondingly. (boolean value)

# Print debugging output (set logging level to DEBUG instead of default
# WARNING level). (boolean value)

# Print more verbose output (set logging level to INFO instead of
# default WARNING level). (boolean value)

# Log output to standard error (boolean value)

# (Optional) Name of log file to output to. If no default is set,
# logging will go to stdout. (string value)

# (Optional) The base directory used for relative --log-file paths
# (string value)

# Use syslog for logging. (boolean value)

# syslog facility to receive log lines (string value)

# List of plugins to be loaded. Savanna preserves the order of the list
# when returning it. (list value)

[plugin:vanilla]
plugin_class=savanna.plugins.vanilla.plugin:VanillaProvider

[plugin:hdp]
plugin_class=savanna.plugins.hdp.ambariplugin:AmbariPlugin

[database]
connection=sqlite:////savanna/openstack/common/db/$sqlite_db
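Note on the connection URL: in SQLAlchemy, sqlite://// followed by a path means an absolute file path, so this line points at /savanna/openstack/common/db/, a directory that may not exist on my controller. For illustration only, a variant pointing at a directory that certainly exists and is writable (using /tmp and a hypothetical file name) would look like:

```
[database]
connection=sqlite:////tmp/savanna.sqlite
```

I am not sure whether this is the intended fix, which is part of what I am asking.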

Can someone please guide me on this? Thanks a lot.