Load Balancer: "unexpected keyword argument 'decider_depth'"

I am following the instructions for setting up an Octavia load balancer on Devstack. The neutron lbaas-loadbalancer-create command succeeds, but the load balancer stays in PENDING_CREATE indefinitely.

Looking at the log files, I find:

  • q-lbaasv2.log contains

    2016-02-09 08:19:18.556 ERROR neutron_lbaas.agent.agent_manager [-] Unable to retrieve ready devices

  • about 1.3 seconds later by time of day (note the dates in the two logs differ), o-cw.log has

    2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher [-] Exception during message handling: link() got an unexpected keyword argument 'decider_depth'

The error in q-lbaasv2.log has appeared since startup and repeats about once a second.

EDIT: neutron agent-list doesn't show a load balancer agent. I did see the v1 load balancer agent before I enabled v2. Is that OK? EDIT 2: There is no trace of the load balancer in Horizon either.

What could this be? What can I do to investigate further?
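Since the TypeError comes from Octavia passing decider_depth to a link() call, my working guess is a version mismatch between the Octavia checkout and the taskflow library actually installed. One check I'm considering (a sketch only; that amphora_flows.py ends up calling taskflow.patterns.graph_flow.Flow.link here is my assumption):

    # Sketch of a version-mismatch check (my guess at the cause, not a fix).
    # Assumption: taskflow.patterns.graph_flow.Flow.link is what Octavia's
    # amphora_flows.py calls with decider_depth='flow'.
    import inspect
    import pkg_resources
    from taskflow.patterns import graph_flow

    print('installed taskflow: %s'
          % pkg_resources.get_distribution('taskflow').version)

    # Python 2.7, matching the logs below; use inspect.signature on Python 3.
    args = inspect.getargspec(graph_flow.Flow.link).args
    print('link() accepts decider_depth: %s' % ('decider_depth' in args))

If the installed taskflow turns out not to know about decider_depth, I assume the next step would be upgrading it to whatever Octavia's requirements.txt pins.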

Log details below, first q-lbaasv2.log:

2016-02-09 08:18:18.396 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: c96ac82cc5af4324b79cee7274e7afc9 exchange 'neutron' topic 'n-lbaasv2-plugin' from (pid=26882) _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:448
2016-02-09 08:19:18.556 ERROR neutron_lbaas.agent.agent_manager [-] Unable to retrieve ready devices
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager Traceback (most recent call last):
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/opt/stack/neutron-lbaas/neutron_lbaas/agent/agent_manager.py", line 151, in sync_state
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     ready_instances = set(self.plugin_rpc.get_ready_devices())
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/opt/stack/neutron-lbaas/neutron_lbaas/agent/agent_api.py", line 34, in get_ready_devices
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     return cctxt.call(self.context, 'get_ready_devices', host=self.host)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 158, in call
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     retry=self.retry)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 90, in _send
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     timeout=timeout, retry=retry)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 464, in send
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     retry=retry)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 453, in _send
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     result = self._waiter.wait(msg_id, timeout)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 334, in wait
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     message = self.waiters.get(msg_id, timeout=timeout)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 237, in get
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager     'to message ID %s' % msg_id)
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager MessagingTimeout: Timed out waiting for a reply to message ID f5d8fd23993a4cb6baf51d6e91779a35
2016-02-09 08:19:18.556 TRACE neutron_lbaas.agent.agent_manager
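Side note on the trace above: as I read it, the MessagingTimeout just means the agent's get_ready_devices RPC call on the n-lbaasv2-plugin topic never got an answer, which is what I'd expect if neutron-server isn't actually running the LBaaS v2 service plugin. A rough check I could run (assuming the stock config path and the usual plugin alias/class names):

    # Quick check (Python 2.7, matching the logs): is the LBaaS v2 service
    # plugin configured at all? Assumptions: stock /etc/neutron/neutron.conf
    # path and the usual 'lbaasv2' alias / LoadBalancerPluginv2 class name.
    import ConfigParser

    cp = ConfigParser.RawConfigParser()
    cp.read('/etc/neutron/neutron.conf')
    plugins = cp.get('DEFAULT', 'service_plugins')
    print('service_plugins = %s' % plugins)
    print('lbaasv2 listed: %s'
          % ('lbaasv2' in plugins or 'LoadBalancerPluginv2' in plugins))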

then o-cw.log:

/usr/local/bin/octavia-worker --config-file /etc/octavia/octavia.conf
2016-02-08 23:04:13.749 29851 INFO octavia.common.config [-] Logging enabled!
2016-02-08 23:04:13.794 29851 WARNING oslo_reports.guru_meditation_report [-] Guru mediation now registers SIGUSR1 and SIGUSR2 by default for backward compatibility. SIGUSR1 will no longer be registered in a future release, so please use SIGUSR2 to generate reports.
2016-02-08 23:04:21.337 29851 INFO root [-] Generating grammar tables from /usr/lib/python2.7/lib2to3/Grammar.txt
2016-02-08 23:04:21.992 29851 INFO root [-] Generating grammar tables from /usr/lib/python2.7/lib2to3/PatternGrammar.txt
2016-02-08 23:04:33.608 29851 INFO octavia.controller.queue.consumer [-] Starting consumer...
2016-02-10 08:19:16.716 29851 INFO octavia.controller.queue.endpoint [-] Creating load balancer '0a09e92b-0dde-4a47-86e1-402b91673021'...
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher [-] Exception during message handling: link() got an unexpected keyword argument 'decider_depth'
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     executor_callback))
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     executor_callback)
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 129, in _do_dispatch
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     result = func(ctxt, **new_args)
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/opt/stack/octavia/octavia/controller/queue/endpoint.py", line 45, in create_load_balancer
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     self.worker.create_load_balancer(load_balancer_id)
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/opt/stack/octavia/octavia/controller/worker/controller_worker.py", line 264, in create_load_balancer
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     topology=CONF.controller_worker.loadbalancer_topology),
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/opt/stack/octavia/octavia/controller/worker/flows/load_balancer_flows.py", line 62, in get_create_load_balancer_flow
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     role=constants.ROLE_STANDALONE)
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher   File "/opt/stack/octavia/octavia/controller/worker/flows/amphora_flows.py", line 227, in get_amphora_for_lb_subflow
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher     decider_depth='flow')
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher TypeError: link() got an unexpected keyword argument 'decider_depth'
2016-02-10 08:19:19.893 29851 ERROR oslo_messaging.rpc.dispatcher
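For completeness, my reading of the TypeError itself: the Octavia code passes decider_depth='flow' to a link() whose installed implementation simply doesn't define that parameter, so the library on disk looks older than the code calling it. A stripped-down illustration of that failure mode, using a made-up old_link() rather than the real taskflow API:

    # Not Octavia/taskflow code: a made-up "old" signature called the "new"
    # way, to show where the TypeError comes from.
    def old_link(u, v, decider=None):   # older API: no decider_depth parameter
        return (u, v, decider)

    try:
        old_link('task-a', 'task-b', decider_depth='flow')
    except TypeError as exc:
        print(exc)  # old_link() got an unexpected keyword argument 'decider_depth'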