Kilo (OSP7) deployment failing with NoValidHost: No valid host was found.

Hi,

We are using the RHEL 7.1 OSP7 distribution, which is based on Kilo, with the Ironic driver for power management. The deployer is the TripleO-based RDO Manager. We see the following error in the nova-conductor logs:

2015-06-19 17:17:48.272 4988 WARNING nova.scheduler.utils [req-9d33461b-63b1-4555-b4a9-eff98428c3dc e27ab3751f7f4a4e9f3b030d48b0f366 0db82411bf7a4d4fba49b39d7b5aef2d - - -] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 142, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 86, in select_destinations
    filter_properties)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/filter_scheduler.py", line 80, in select_destinations
    raise exception.NoValidHost(reason=reason)

NoValidHost: No valid host was found. There are not enough hosts available.

We have modified the Nova compute driver for Ironic in nova.conf as below:

#compute_driver=libvirt.LibvirtDriver
compute_driver=nova.virt.ironic.IronicDriver <<<<<<<<<<<

Since the hardware is Cisco UCS, we are using our own 'pxe_ucs' driver, backported from the Liberty release to Kilo, and have changed the enabled Ironic drivers accordingly:

#enabled_drivers=pxe_ipmitool
enabled_drivers=pxe_ucs,agent_ucs <<<<<<<<<<<

Below are some of the relevant command outputs:

[stack@ospd ~]$ ironic node-list
+--------------------------------------+------+---------------+-------------+-----------------+-------------+
| UUID                                 | Name | Instance UUID | Power State | Provision State | Maintenance |
+--------------------------------------+------+---------------+-------------+-----------------+-------------+
| 2208c498-5989-447c-8a77-b6761e3d3cda | None | None          | power off   | available       | False       |
| cb21453a-dfe4-423c-9a9f-91504a230923 | None | None          | power off   | available       | False       |
+--------------------------------------+------+---------------+-------------+-----------------+-------------+
[stack@ospd ~]$ nova list
+--------------------------------------+-------------------------------------------------------+--------+------------+-------------+----------+
| ID                                   | Name                                                  | Status | Task State | Power State | Networks |
+--------------------------------------+-------------------------------------------------------+--------+------------+-------------+----------+
| 2dbaf0d1-4874-4b07-9fe2-29d4987487ac | ov-gqf22wdaft-0-5zueofadpzlb-NovaCompute-hinuwoq3yq7q | ERROR  | -          | NOSTATE     |          |
| 1ba2abb0-f188-4f06-bbc5-14de8f111534 | ov-rimjhocbgyt-0-bwbyuxfori6h-Controller-aah6uu3r7v3e | ERROR  | -          | NOSTATE     |          |
+--------------------------------------+-------------------------------------------------------+--------+------------+-------------+----------+
[stack@ospd ~]$ nova flavor-list
+--------------------------------------+-----------+-----------+------+-----------+------+-------+-------------+-----------+
| ID                                   | Name      | Memory_MB | Disk | Ephemeral | Swap | VCPUs | RXTX_Factor | Is_Public |
+--------------------------------------+-----------+-----------+------+-----------+------+-------+-------------+-----------+
| fabc50be-3c0b-45b2-84a9-bf78607b3cc3 | baremetal | 4096      | 40   | 0         |      | 1     | 1.0         | True      |
+--------------------------------------+-----------+-----------+------+-----------+------+-------+-------------+-----------+

[stack@ospd ~]$ nova hypervisor-list
+----+--------------------------------------+-------+---------+
| ID | Hypervisor hostname                  | State | Status  |
+----+--------------------------------------+-------+---------+
| 11 | 2208c498-5989-447c-8a77-b6761e3d3cda | up    | enabled |
| 12 | cb21453a-dfe4-423c-9a9f-91504a230923 | up    | enabled |
+----+--------------------------------------+-------+---------+
[stack@ospd ~]$ nova hypervisor-stats
+----------------------+--------+
| Property             | Value  |
+----------------------+--------+
| count                | 2      |
| current_workload     | 0      |
| disk_available_least | 370    |
| free_disk_gb         | 370    |
| free_ram_mb          | 229376 |
| local_gb             | 370    |
| local_gb_used        | 0      |
| memory_mb            | 229376 |
| memory_mb_used       | 0      |
| running_vms          | 0      |
| vcpus                | 56     |
| vcpus_used           | 0      |
+----------------------+--------+

[stack@ospd ~]$ glance image-list
+--------------------------------------+------------------------+-------------+------------------+-----------+--------+
| ID                                   | Name                   | Disk Format | Container Format | Size      | Status |
+--------------------------------------+------------------------+-------------+------------------+-----------+--------+
| a85703cd-b6a6-4366-b3a6-4e5de420fb98 | bm-deploy-kernel       | aki         | aki              | 5027392   | active |
| 28cff74d-4cd6-421b-97a8-a044a114069b | bm-deploy-ramdisk      | ari         | ari              | 56310758  | active |
| 8f0604bd-a20a-4df2-83be-290a99138f13 | overcloud-full         | qcow2       | bare             | 893856256 | active |
| 5936b7a3-fdf1-4723-95eb-3ea261995cc5 | overcloud-full-initrd  | ari         | ari              | 36758352  | active |
| 3a999b85-fc53-48c8-b6ec-89e5fa3a0aa9 | overcloud-full-vmlinuz | aki         | aki              | 5027392   | active |
+--------------------------------------+------------------------+-------------+------------------+-----------+--------+

[stack@ospd ~]$ instack-ironic-deployment --show-profile
Preparing for deployment...
  Querying assigned profiles ...

    2208c498-5989-447c-8a77-b6761e3d3cda
      "boot_option=local"

    cb21453a-dfe4-423c-9a9f-91504a230923
      "boot_option=local"


  DONE.

Prepared.

[stack@ospd ~]$ heat stack-list
+--------------------------------------+------------+---------------+----------------------+
| id                                   | stack_name | stack_status  | creation_time        |
+--------------------------------------+------------+---------------+----------------------+
| a0cf5344-ae2f-44bc-b5dc-85d467f61a6e | overcloud  | CREATE_FAILED | 2015-06-19T11:56:26Z |  
+--------------------------------------+------------+---------------+----------------------+

[stack@ospd ~]$ tuskar plan-show-scale overcloud
+------------------+-------+
| Property         | Value |
+------------------+-------+
| Ceph-Storage-1   | 0     |
| Cinder-Storage-1 | 0     |
| Compute-1        | 1     |
| Controller-1     | 1     |
| Swift-Storage-1  | 0     |
+------------------+-------+

[stack@ospd ~]$ tuskar plan-show-flavors overcloud
+------------------+-----------+
| Property         | Value     |
+------------------+-----------+
| Ceph-Storage-1   | baremetal |
| Cinder-Storage-1 | baremetal |
| Compute-1        | baremetal |
| Controller-1     | baremetal |
| Swift-Storage-1  | baremetal |
+------------------+-----------+

[stack@ospd ~]$ openstack flavor show baremetal
+----------------------------+-----------------------------------------------------+
| Field                      | Value                                               |
+----------------------------+-----------------------------------------------------+
| OS-FLV-DISABLED:disabled   | False                                               |
| OS-FLV-EXT-DATA:ephemeral  | 0                                                   |
| disk                       | 40                                                  |
| id                         | fabc50be-3c0b-45b2-84a9-bf78607b3cc3                |
| name                       | baremetal                                           |
| os-flavor-access:is_public | True                                                |
| properties                 | capabilities:boot_option='local', cpu_arch='x86_64' |
| ram                        | 4096                                                |
| rxtx_factor                | 1.0                                                 |
| swap                       |                                                     |
| vcpus                      | 1                                                   |
+----------------------------+-----------------------------------------------------+

[stack@ospd ~]$  heat stack-list --show-nested -f "status=FAILED"
+--------------------------------------+--------------------------------------------------+---------------+----------------------+--------------------------------------+
| id                                   | stack_name                                       | stack_status  | creation_time        | parent                               |
+--------------------------------------+--------------------------------------------------+---------------+----------------------+--------------------------------------+
| a0cf5344-ae2f-44bc-b5dc-85d467f61a6e | overcloud                                        | CREATE_FAILED | 2015-06-19T11:56:26Z | None                                 |
| 1545787f-32b1-407d-9129-5931ccfe1869 | overcloud-Compute-cggqf22wdaft                   | CREATE_FAILED | 2015-06-19T11:56:29Z | a0cf5344-ae2f-44bc-b5dc-85d467f61a6e |
| 7373d273-1b71-424e-b7fb-a4d794c8b65a | overcloud-Compute-cggqf22wdaft-0-5zueofadpzlb    | CREATE_FAILED | 2015-06-19T11:56:30Z | 1545787f-32b1-407d-9129-5931ccfe1869 |
| cb7698fb-9023-464f-a9a0-548b5e03a971 | overcloud-Controller-drimjhocbgyt                | CREATE_FAILED | 2015-06-19T11:56:30Z | a0cf5344-ae2f-44bc-b5dc-85d467f61a6e |
| 941666fa-98ab-464e-83e3-4c893e6e5fad | overcloud-Controller-drimjhocbgyt-0-bwbyuxfori6h | CREATE_FAILED | 2015-06-19T11:56:31Z | cb7698fb-9023-464f-a9a0-548b5e03a971 |
+--------------------------------------+--------------------------------------------------+---------------+----------------------+--------------------------------------+


[stack@ospd ~]$ heat resource-list --nested-depth 5 overcloud | grep FAILED
| Compute                           | 1545787f-32b1-407d-9129-5931ccfe1869          | OS::Heat::ResourceGroup                           | CREATE_FAILED   | 2015-06-19T11:56:26Z |                 |
| Controller                        | cb7698fb-9023-464f-a9a0-548b5e03a971          | OS::Heat::ResourceGroup                           | CREATE_FAILED   | 2015-06-19T11:56:26Z |                 |
| 0                                 | 7373d273-1b71-424e-b7fb-a4d794c8b65a          | Tuskar::Compute-1                                 | CREATE_FAILED   | 2015-06-19T11:56:29Z | Compute         |
| 0                                 | 941666fa-98ab-464e-83e3-4c893e6e5fad          | Tuskar::Controller-1                              | CREATE_FAILED   | 2015-06-19T11:56:30Z | Controller      |
| NovaCompute                       | 2dbaf0d1-4874-4b07-9fe2-29d4987487ac          | OS::Nova::Server                                  | CREATE_FAILED   | 2015-06-19T11:56:30Z | 0               |
| Controller                        | 1ba2abb0-f188-4f06-bbc5-14de8f111534          | OS::Nova::Server                                  | CREATE_FAILED   | 2015-06-19T11:56:31Z | 0               |

[stack@ospd ~]$ heat resource-show 1545787f-32b1-407d-9129-5931ccfe1869 0 | grep resource_status_reason
| resource_status_reason | ResourceUnknownStatus: Resource failed - Unknown status FAILED due to "Resource CREATE failed: ResourceInError: Went to status ERROR due to "Message: No valid host was found. There are not enough hosts available., Code: 500"" |

Any help or pointers for debugging this issue would be appreciated.

Thanks, ~ Rajesh.

More findings:

First occurrence, in the ironic-conductor log:

Jun 19 17:05:09 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:09.029 2367 TRACE oslo_messaging.rpc.dispatcher super(UcsOperationError, self).__init__()   <<<<<<<<<<<<<<<<<<<<<<<<<<<<
Jun 19 17:05:09 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:09.029 2367 TRACE oslo_messaging.rpc.dispatcher TypeError: __init__() takes exactly 2 arguments (1 given)   <<<<<<<<<<<<<<<<<<<<<<<<<<<
Jun 19 17:05:09 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:09.029 2367 TRACE oslo_messaging.rpc.dispatcher
Jun 19 17:05:09 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:09.029 2367 ERROR oslo_messaging._drivers.common [-] Returning exception __init__() takes exactly 2 arguments (1 given) to caller
Jun 19 17:05:09 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:09.030 2367 ERROR oslo_messaging._drivers.common [-] ['Traceback (most recent call last):\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply\n executor_callback))\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch\n executor_callback)\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch\n result = func(ctxt, **new_args)\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 142, in inner\n return func(*args, **kwargs)\n', ' File "/usr/lib/python2.7/site-packages/ironic/conductor/manager.py", line 1557, in set_boot_device\n persistent=persistent)\n', ' File "/usr/lib/python2.7/site-packages/ironic/drivers/modules/ucs/helper.py", line 64, in wrapper\n return func(self, task, *args, **kwargs)\n', ' File "/usr/lib/python2.7/site-packages/ironic/drivers/modules/ucs/management.py", line 91, in set_boot_device\n mgmt_handle.set_boot_device(device, persistent)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/management.py", line 60, in set_boot_device\n self.helper.handle)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/helper.py", line 123, in config_managed_object\n raise exception.UcsOperationError(\'config_managed_object\', error=ex)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/exception.py", line 42, in __init__\n super(UcsOperationError, self).__init__()\n', 'TypeError: __init__() takes exactly 2 arguments (1 given)\n']   <<<<<<<<<<<<<<<<<<<

Second occurrence:

Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/UcsSdk/utils/management.py", line 60, in set_boot_device
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher self.helper.handle)
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/UcsSdk/utils/helper.py", line 123, in config_managed_object
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher raise exception.UcsOperationError('config_managed_object', error=ex)   <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher File "/usr/lib/python2.7/site-packages/UcsSdk/utils/exception.py", line 42, in __init__
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher super(UcsOperationError, self).__init__()
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher TypeError: __init__() takes exactly 2 arguments (1 given)   <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.112 2367 TRACE oslo_messaging.rpc.dispatcher
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.113 2367 ERROR oslo_messaging._drivers.common [-] Returning exception __init__() takes exactly 2 arguments (1 given) to caller   <<<<<<<<<<<<<<<<<<<<<<<<
Jun 19 17:05:26 ospd.cisco.com ironic-conductor[2367]: 2015-06-19 17:05:26.113 2367 ERROR oslo_messaging._drivers.common [-] ['Traceback (most recent call last):\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply\n executor_callback))\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch\n executor_callback)\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch\n result = func(ctxt, **new_args)\n', ' File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 142, in inner\n return func(*args, **kwargs)\n', ' File "/usr/lib/python2.7/site-packages/ironic/conductor/manager.py", line 1557, in set_boot_device\n persistent=persistent)\n', ' File "/usr/lib/python2.7/site-packages/ironic/drivers/modules/ucs/helper.py", line 64, in wrapper\n return func(self, task, *args, **kwargs)\n', ' File "/usr/lib/python2.7/site-packages/ironic/drivers/modules/ucs/management.py", line 91, in set_boot_device\n mgmt_handle.set_boot_device(device, persistent)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/management.py", line 60, in set_boot_device\n self.helper.handle)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/helper.py", line 123, in config_managed_object\n raise exception.UcsOperationError(\'config_managed_object\', error=ex)\n', ' File "/usr/lib/python2.7/site-packages/UcsSdk/utils/exception.py", line 42, in __init__\n super(UcsOperationError, self).__init__()\n', 'TypeError: __init__() takes exactly 2 arguments (1 given)\n']   <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<

Roughly 10 minutes after this, we get the first 'NoValidHost' message in nova-conductor.log, which suggests the scheduling failure follows from the set_boot_device errors above:

2015-06-19 17:17:40.578 4992 WARNING nova.scheduler.utils [req-bc7957ab-aac9-47d7-8454-515a723180c9 e27ab3751f7f4a4e9f3b030d48b0f366 0db82411bf7a4d4fba49b39d7b5aef2d - - -] Failed to compute_task_build_instances: No valid host was found. There are not enough hosts available.
Traceback (most recent call last):

  File "/usr/lib/python2.7/site-packages/oslo_messaging/rpc/server.py", line 142, in inner
    return func(*args, **kwargs)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/manager.py", line 86, in select_destinations
    filter_properties)

  File "/usr/lib/python2.7/site-packages/nova/scheduler/filter_scheduler.py", line 80, in select_destinations
    raise exception.NoValidHost(reason=reason)

NoValidHost: No valid host was found. There are not enough hosts available.

Helper.py (ironic/drivers/modules/ucs/helper.py):

58     @functools.wraps(func)
59     def wrapper(self, task, *args, **kwargs):
60         if kwargs.get('helper') is None:
61             kwargs['helper'] = CiscoUcsHelper(task)
62         try:
63             kwargs['helper'].connect_ucsm()
64             return func(self, task, *args, **kwargs)
65         finally:
66             kwargs['helper'].logout()
67     return wrapper

Management.py (UcsSdk/utils/management.py):

53     operation = "set_boot_device"   <<<<<<<<<<
54     try:
55         ucs_helper.config_managed_object(dn,
56                                          OrgOrg.ClassId(),
57                                          LsbootPolicy.ClassId(),
58                                          boot_policy_cfg,
59                                          boot_policy_dn,
60                                          self.helper.handle)
61     except UcsException as ex:
62         raise exception.UcsOperationError(operation=operation, error=ex)   <<<<<<<<<<<
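
Our reading of the traceback is that UcsOperationError.__init__ calls super().__init__() without the message argument its base class requires, so the real failure from config_managed_object is replaced by a TypeError. Below is a minimal, self-contained sketch that reproduces the same Python 2.7 error message; the class definitions here are hypothetical stand-ins for UcsSdk/utils/exception.py, not the actual UcsSdk code.

# Minimal reproduction sketch of the suspected bug (hypothetical class layout;
# only the super().__init__() call pattern is taken from the traceback above).

class UcsBaseError(Exception):
    # Stand-in base class whose __init__ requires a message, i.e. it
    # "takes exactly 2 arguments" (self + message) in Python 2 terms.
    def __init__(self, message):
        super(UcsBaseError, self).__init__(message)


class UcsOperationError(UcsBaseError):
    def __init__(self, operation, error=None):
        # Calling the parent __init__ with no message triggers the same
        # "TypeError: __init__() takes exactly 2 arguments (1 given)"
        # seen in the ironic-conductor log (message text is Python 2 style).
        super(UcsOperationError, self).__init__()


if __name__ == '__main__':
    try:
        raise UcsOperationError('config_managed_object', error='simulated UCS failure')
    except TypeError as exc:
        # The original UCS error never surfaces; only the TypeError does.
        print(exc)

If that is what is happening, the underlying config_managed_object error is masked by the exception class itself, and only the TypeError reaches the conductor, which matches the "Returning exception __init__() takes exactly 2 arguments (1 given) to caller" lines above.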