
Could not attach volume to instance [solved]

asked 2014-11-11 10:24:29 -0600 by keky, updated 2014-11-11 12:01:40 -0600

I can create and delete volumes without any problem, but I cannot attach them to an existing instance.

[root@controller-128 ~]# cinder create --display-name TEST_VOLUME_05 1
+---------------------+--------------------------------------+
|       Property      |                Value                 |
+---------------------+--------------------------------------+
|     attachments     |                  []                  |
|  availability_zone  |                 nova                 |
|       bootable      |                false                 |
|      created_at     |      2014-11-11T16:18:24.095813      |
| display_description |                 None                 |
|     display_name    |            TEST_VOLUME_05            |
|      encrypted      |                False                 |
|          id         | e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 |
|       metadata      |                  {}                  |
|         size        |                  1                   |
|     snapshot_id     |                 None                 |
|     source_volid    |                 None                 |
|        status       |               creating               |
|     volume_type     |                 None                 |
+---------------------+--------------------------------------+
[root@controller-128 ~]# cinder list
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
|                  ID                  |   Status  |   Display Name  | Size | Volume Type | Bootable | Attached to |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+
| e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 | available |  TEST_VOLUME_05 |  1   |     None    |  false   |             |
+--------------------------------------+-----------+-----------------+------+-------------+----------+-------------+

[root@controller-128 ~]# nova list
+--------------------------------------+------------+--------+------------+-------------+---------------------------------------------+
| ID                                   | Name       | Status | Task State | Power State | Networks                                    |
+--------------------------------------+------------+--------+------------+-------------+---------------------------------------------+
| 247d95f0-86cb-4295-9d0c-87871083751e | vlanTest01 | ACTIVE | -          | Running     | WAN_NET=42.xx.xx.180                        |
+--------------------------------------+------------+--------+------------+-------------+---------------------------------------------+

[root@controller-128 ~]# nova volume-attach  247d95f0-86cb-4295-9d0c-87871083751e e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 auto
+----------+--------------------------------------+
| Property | Value                                |
+----------+--------------------------------------+
| device   | /dev/vdb                             |
| id       | e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 |
| serverId | 247d95f0-86cb-4295-9d0c-87871083751e |
| volumeId | e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 |
+----------+--------------------------------------+

But the volume is not actually attached to the instance:
[root@controller-128 ~]# cinder show e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 
+--------------------------------+--------------------------------------+
|            Property            |                Value                 |
+--------------------------------+--------------------------------------+
|          attachments           |                  []                  |
|       availability_zone        |                 nova                 |
|            bootable            |                false                 |
|           created_at           |      2014-11-11T16:18:24.000000      |
|      display_description       |                 None                 |
|          display_name          |            TEST_VOLUME_05            |
|           encrypted            |                False                 |
|               id               | e0e5882e-4ddb-4b3e-929d-3d4dd6973f50 |
|            metadata            |                  {}                  |
|     os-vol-host-attr:host      |          KG-TEST-10_1_2_131          |
| os-vol-mig-status-attr:migstat |                 None                 |
| os-vol-mig-status-attr:name_id |                 None                 |
|  os-vol-tenant-attr:tenant_id  |   a174481a86f04181b0283a558c80b24e   |
|              size              |                  1                   |
|          snapshot_id           |                 None                 |
|          source_volid          |                 None                 |
|             status             |              available               |
|          volume_type           |                 None                 |
+--------------------------------+--------------------------------------+

The cinder-volume log shows:

2014-11-12 00:19:54.821 25365 WARNING cinder.context [-] Arguments dropped when creating context: {'user': u'e0ba83d0903f4a368bd9869898e41256', 'tenant': u'a174481a86f04181b02}
2014-11-12 00:19:55.215 25365 ERROR cinder.brick.iscsi.iscsi [req-e6dadbe4-b9b4-4d44-babc-0916a45bdf2f e0ba83d0903f4a368bd9869898e41256 a174481a86f04181b0283a558c80b24e - - -].
2014-11-12 00:19:55.216 25365 ERROR oslo.messaging.rpc.dispatcher [req-e6dadbe4-b9b4-4d44-babc-0916a45bdf2f e0ba83d0903f4a368bd9869898e41256 a174481a86f04181b0283a558c80b24e -.
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     incoming.message))
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     return self._do_dispatch(endpoint, method, ctxt, args)
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     result = getattr(endpoint, method)(ctxt, **new_args)
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/cinder/volume/manager.py", line 790, in initialize_connection
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     volume)
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/cinder/volume/drivers/lvm.py", line 525, in create_export
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     return self._create_export(context, volume)
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr/lib/python2.6/site-packages/cinder/volume/drivers/lvm.py", line 534, in _create_export
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher     data = self.target_helper.create_export(context, volume, volume_path)
2014-11-12 00:19:55.216 25365 TRACE oslo.messaging.rpc.dispatcher   File "/usr ...
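The traceback dies inside the LVM driver's _create_export call to the iSCSI target helper, so a reasonable first check is whether the tgt daemon is actually running and exporting targets on the cinder-volume host. A minimal diagnostic sketch, assuming the tgtadm helper and the default iSCSI port 3260:

# Is tgtd running and listening on the iSCSI port?
service tgtd status
netstat -tlnp | grep 3260

# List the targets tgtd currently exports; with default settings an
# exported volume appears as iqn.2010-10.org.openstack:volume-<id>
tgtadm --lld iscsi --op show --mode target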

2 answers


answered 2014-11-11 14:05:33 -0600 by N3tW0rkZ

My iscsi_helper = tgtadm line is already uncommented, and I still cannot attach a volume to an instance. My logs are below; can anyone shed some light on this for me?

[root@open-controller cinder(keystone_admin)]#

2014-11-11 14:39:48.742 6700 WARNING cinder.context [-] Arguments dropped when creating context: {'user': u'171cb641fb014222ae53ee6cbfec9273', 'tenant': u'fe83b11800624fbdbaf4d37401005dcc', 'user_identity': u'171cb641fb014222ae53ee6cbfec9273 fe83b11800624fbdbaf4d37401005dcc - - -'}
2014-11-11 14:39:48.771 6700 INFO cinder.brick.iscsi.iscsi [req-fb78e5de-acc5-4b0e-8270-b798ec2d8e1b 171cb641fb014222ae53ee6cbfec9273 fe83b11800624fbdbaf4d37401005dcc - - -] Creating iscsi_target for: volume-25f2f1cc-03af-4983-8abc-093841de7064
2014-11-11 14:39:53.466 6700 WARNING cinder.context [-] Arguments dropped when creating context: {'user': u'171cb641fb014222ae53ee6cbfec9273', 'tenant': u'fe83b11800624fbdbaf4d37401005dcc', 'user_identity': u'171cb641fb014222ae53ee6cbfec9273 fe83b11800624fbdbaf4d37401005dcc - - -'}

2014-11-12 00:19:54.821-style entries do not appear here; the log ends with:
2014-11-11 14:40:03.236 6700 INFO cinder.volume.manager [req-d2791f5e-7025-4aae-822c-6eec591f29cc 171cb641fb014222ae53ee6cbfec9273 fe83b11800624fbdbaf4d37401005dcc - - -] Updating volume status
2014-11-11 14:40:46.857 6700 INFO cinder.volume.manager [-] Updating volume status
[root@open-controller cinder(keystone_admin)]#
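The "Creating iscsi_target" line shows cinder-volume reaching the target helper, so a next step worth trying is confirming the target was really created and is reachable from the compute node. A rough sketch, assuming the tgt toolchain; CINDER_HOST_IP is a placeholder, not a value from these logs:

# On the cinder-volume host: was the target actually created?
tgt-admin --show

# On the compute node: can the target be discovered over the network?
iscsiadm -m discovery -t sendtargets -p CINDER_HOST_IP:3260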


answered 2014-11-11 11:50:17 -0600 by keky, updated 2014-11-11 11:51:07 -0600

root@hydrogen:~# vi /etc/cinder/cinder.conf
# uncomment the following line:
iscsi_helper = tgtadm


Do that on both the controller node and the Cinder node, then restart all the Cinder services, and it should work. Hope it helps. A minimal sketch of the restart step follows.
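On a RHEL/CentOS-style install of that era the restart would look roughly like this; the openstack-cinder-* service names are an assumption, so adjust them for your distribution:

# Restart the Cinder services after editing /etc/cinder/cinder.conf
# (service names assume RHEL/CentOS packaging; use your distro's names)
service openstack-cinder-api restart
service openstack-cinder-scheduler restart
service openstack-cinder-volume restart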

