Icehouse + RedHat Deployment Bug?

Hello,

I'm wondering if I've hit a bug or have a configuration error in my 3-node deployment. All services start up and the message queue appears to be working, but when I try to launch an instance from an image I get the following in the nova-api log:

2014-04-29 16:16:49.723 2252 ERROR nova.api.openstack [req-a5503226-ad62-47fd-9ce2-9bd0b3c79d54 ca8cf7879f304f328420a023aa47d821 cc55e4c69e404fc4b89e2983a36efd80] Caught error: Timed out waiting for a reply to message ID 1c8a38a6a601410981cc630ddefa6346
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack Traceback (most recent call last):
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/api/openstack/__init__.py", line 125, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return req.get_response(self.application)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/request.py", line 1296, in send
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     application, catch_exc_info=False)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/request.py", line 1260, in call_application
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     app_iter = application(self.environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 144, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return resp(environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/keystoneclient/middleware/auth_token.py", line 615, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return self.app(env, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 144, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return resp(environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 144, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return resp(environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/Routes-1.12.3-py2.6.egg/routes/middleware.py", line 131, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     response = self.app(environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 144, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return resp(environ, start_response)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 130, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     resp = self.call_func(req, *args, **self.kwargs)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/webob/dec.py", line 195, in call_func
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return self.func(req, *args, **kwargs)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/api/openstack/wsgi.py", line 917, in __call__
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     content_type, body, accept)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/api/openstack/wsgi.py", line 983, in _process_stack
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     action_result = self.dispatch(meth, request, action_args)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/api/openstack/wsgi.py", line 1070, in dispatch
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return method(req=request, **action_args)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/api/openstack/compute/servers.py", line 956, in create
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     legacy_bdm=legacy_bdm)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/hooks.py", line 103, in inner
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     rv = f(*args, **kwargs)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/compute/api.py", line 1341, in create
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     legacy_bdm=legacy_bdm)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/compute/api.py", line 968, in _create_instance
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     max_count)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/compute/api.py", line 739, in _validate_and_build_base_options
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     requested_networks, max_count)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/compute/api.py", line 463, in _check_requested_networks
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     max_count)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/network/api.py", line 94, in wrapped
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return func(self, context, *args, **kwargs)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/network/api.py", line 419, in validate_networks
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     requested_networks)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/nova/network/rpcapi.py", line 225, in validate_networks
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return self.client.call(ctxt, 'validate_networks', networks=networks)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/rpc/client.py", line 361, in call
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return self.prepare().call(ctxt, method, **kwargs)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/rpc/client.py", line 150, in call
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     wait_for_reply=True, timeout=timeout)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/transport.py", line 90, in _send
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     timeout=timeout)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     return self._send(target, ctxt, message, wait_for_reply, timeout)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     result = self._waiter.wait(msg_id, timeout)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     reply, ending = self._poll_connection(msg_id, timeout)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack   File "/usr/lib/python2.6/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack     % msg_id)
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack MessagingTimeout: Timed out waiting for a reply to message ID 1c8a38a6a601410981cc630ddefa6346
2014-04-29 16:16:49.723 2252 TRACE nova.api.openstack

Can anyone help me figure out how to debug this issue? I can see that a reply never came back to the controller, but I can't tell which service the message was intended for, and I'm wondering why none of the other nodes flagged it as an error.
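
To try to narrow down which service should have answered, here is a minimal sketch (my own, untested against this deployment) that issues the same validate_networks RPC outside of nova-api, using the oslo.messaging API that appears in the traceback. The transport URL, control exchange, topic, and version here are assumptions based on the logs and the nova/network/rpcapi.py code path above; if nothing is consuming the 'network' topic, I would expect it to time out the same way:

from oslo.config import cfg
from oslo import messaging

# Assumption: nova uses the 'nova' control exchange and the Qpid broker
# shown in the connection messages below; adjust to match nova.conf.
messaging.set_transport_defaults(control_exchange='nova')
transport = messaging.get_transport(cfg.CONF, 'qpid://nmtg-ctrl001:5672/')

# The traceback shows nova-api calling validate_networks through the
# nova-network RPC API, whose consumers listen on the 'network' topic.
target = messaging.Target(topic='network', version='1.0')
client = messaging.RPCClient(transport, target, timeout=10)

try:
    client.call({}, 'validate_networks', networks=None)
    print('got a reply - something is consuming the network topic')
except messaging.MessagingTimeout:
    print('timed out - nothing replied on the network topic')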

My compute node log shows:

2014-04-29 15:38:24.102 2022 INFO oslo.messaging._drivers.impl_qpid [-] Connected to AMQP server on nmtg-ctrl001:5672
2014-04-29 15:39:21.081 2022 WARNING nova.openstack.common.loopingcall [-] task run outlasted interval by 162.852126 sec

My neutron server log shows:

2014-04-29 16:15:23.601 4614 INFO neutron.wsgi [-] (4614) wsgi starting up on http://0.0.0.0:9696/
2014-04-29 16:15:23.605 4614 INFO neutron.openstack.common.rpc.impl_qpid [-] Connected to AMQP server on nmtg-ctrl001:5672

As a side note, I can ping all interfaces on my nodes.
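
Since ping only proves ICMP reachability, here is a quick sketch I can run from each node to confirm that the broker's AMQP port itself is reachable (the hostname and port are taken from the connection messages above):

import socket

# Assumption: the Qpid broker runs on nmtg-ctrl001:5672 as shown in the
# compute and neutron logs above.
try:
    sock = socket.create_connection(('nmtg-ctrl001', 5672), timeout=5)
    print('AMQP port is reachable')
    sock.close()
except socket.error as exc:
    print('cannot reach the broker port: %s' % exc)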

Take care,
Jason Kary