Can anyone assist with an issue attaching a volume to an instance when a separate storage node is used? [closed]
Asking this question in a different way in the hope that someone can assist.
I have Juno installed on CentOS 7 via Packstack. The system comprises one controller/Neutron node and three compute nodes, one of which provides the storage for Cinder.
I can create instances and volumes (on the target/storage node) etc., but volumes fail to attach to instances.
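For reference, the volume backend on the storage node is the stock Packstack LVM/iSCSI setup. A minimal sketch of the relevant /etc/cinder/cinder.conf section (these are the usual Packstack defaults, not a verbatim copy of my file):

[DEFAULT]
# LVM-backed volumes exported as iSCSI targets via tgtd
volume_driver = cinder.volume.drivers.lvm.LVMISCSIDriver
volume_group = cinder-volumes
iscsi_helper = tgtadm
iscsi_ip_address = <storage node IP>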
cinder list
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
| ID | Status | Display Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
| e5c4c144-1315-476e-97c5-8f647be0868a | available | test | 1 | lvm | false | |
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
nova list
+--------------------------------------+------+--------+------------+-------------+------------------------+
| ID | Name | Status | Task State | Power State | Networks |
+--------------------------------------+------+--------+------------+-------------+------------------------+
| d81f77ee-2b7f-4032-bd19-e113e4833220 | test | ACTIVE | - | Running | admin-net=172.16.100.4 |
+--------------------------------------+------+--------+------------+-------------+------------------------+
nova volume-attach d81f77ee-2b7f-4032-bd19-e113e4833220 e5c4c144-1315-476e-97c5-8f647be0868a auto
+----------+--------------------------------------+
| Property | Value |
+----------+--------------------------------------+
| device | /dev/vdb |
| id | e5c4c144-1315-476e-97c5-8f647be0868a |
| serverId | d81f77ee-2b7f-4032-bd19-e113e4833220 |
| volumeId | e5c4c144-1315-476e-97c5-8f647be0868a |
+----------+--------------------------------------+
cinder list
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
| ID | Status | Display Name | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
| e5c4c144-1315-476e-97c5-8f647be0868a | available | test | 1 | lvm | false | |
+--------------------------------------+-----------+--------------+------+-------------+----------+-------------+
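So the attach call returns a device path, but the volume stays available. One way to double-check from the compute node's hypervisor side (a sketch; instance-00000001 is a placeholder domain name, use whatever virsh list reports):

virsh list --all
virsh domblklist instance-00000001   # block devices attached to the guest; vdb would appear here if the attach had worked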
Storage host volume.log:
2015-01-22 13:05:43.164 5125 INFO cinder.brick.iscsi.iscsi [req-b32c0a32-cb77-4ae2-8241-8c7d9bc8ad5e bd01e7fc46884992afd7660e60cbb3d6 26bdb8bbc54f4137998a3931d68fa45b - - -] Creating iscsi_target for volume: volume-02a490ef-12ca-43d4-8a7f-f2fa15ff0edb
2015-01-22 13:05:43.393 5125 ERROR cinder.brick.iscsi.iscsi [req-b32c0a32-cb77-4ae2-8241-8c7d9bc8ad5e bd01e7fc46884992afd7660e60cbb3d6 26bdb8bbc54f4137998a3931d68fa45b - - -] Failed to create iscsi target for volume id:volume-02a490ef-12ca-43d4-8a7f-f2fa15ff0edb.
2015-01-22 13:05:43.394 5125 ERROR oslo.messaging.rpc.dispatcher [req-b32c0a32-cb77-4ae2-8241-8c7d9bc8ad5e bd01e7fc46884992afd7660e60cbb3d6 26bdb8bbc54f4137998a3931d68fa45b - - -] Exception during message handling: Resource could not be found.
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 134, in _dispatch_and_reply
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher incoming.message))
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 177, in _dispatch
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher return self._do_dispatch(endpoint, method, ctxt, args)
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 123, in _do_dispatch
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher result = getattr(endpoint, method)(ctxt, **new_args)
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/osprofiler/profiler.py", line 105, in wrapper
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher return f(*args, **kwargs)
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/cinder/volume/manager.py", line 875, in initialize_connection
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher volume)
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/osprofiler/profiler.py", line 105, in wrapper
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher return f(*args, **kwargs)
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/cinder/volume/drivers/lvm.py", line 548, in create_export
2015-01-22 13:05:43.394 5125 TRACE oslo.messaging.rpc.dispatcher return self._create_export(context, volume)
2015-01-22 13:05:43.394 5125 TRACE oslo ...
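Since the failure is in iSCSI target creation, the tgt side on the storage host seems worth checking too. A sketch of the checks (the include line is the stock tgt/Cinder convention, shown here as an assumption rather than a quote from my file):

systemctl status tgtd                        # the iSCSI target daemon must be running
tgtadm --lld iscsi --mode target --op show   # lists the currently exported targets
grep include /etc/tgt/targets.conf           # should contain: include /etc/cinder/volumes/*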
What's wrong with this schema? http://textuploader.com/t3dp
Nothing, and I have tried it; please see above in the main body. After doing so, the volume still shows available (and obviously there is no /dev/vdb in the instance).
So inside the instance
pvcreate /dev/vdb
doesn't work? It fails with: Device /dev/vdb not found
Are there any ERRORS in the Cinder logs?
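e.g. something along these lines on the storage host (the path is the usual CentOS location; adjust if yours differs):

grep -iE 'error|trace' /var/log/cinder/volume.log   # pull out ERROR and TRACE lines only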