Instance stuck in spawning state

I have a new OpenStack install on bare metal, with one controller node and one compute node. Everything seems to work until I try to launch an instance; it just hangs in the Spawning state.

On the compute node I can find the instance's directory under /var/lib/nova/, but the disk file is only about 200 KB. I have read the FAQs and searched and read forum posts; most of them point to a lack of resources, but I have 900 GB of disk and plenty of CPU cores and memory free. I captured logs at the time of spawning and reproduce them below. Any help would be greatly appreciated.
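For reference, this is roughly how I checked the instance and its disk file on the compute node (a sketch, not my exact shell history; the UUID is the instance from the logs below, and the paths assume the default file-backed qcow2 image backend that the logs show):

openstack server show e501e9b6-e4ac-464f-b1d6-9a5439c5c269 -c status -c OS-EXT-STS:task_state
ls -lh /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/
qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk

(The disk file is a qcow2 overlay on top of the image in _base, so a small file on its own is not necessarily wrong, but the task_state never leaves spawning.)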

Steps taken so far:

- Checked resources on the compute node: nearly 1 TB of free storage, 628 GB of RAM free, and 76 free vCPUs (commands below).
- Checked permissions on the nova directories.
- Turned on debug logging for nova on both the controller and the compute node; I cannot see anything out of the ordinary, and the logs show no errors, only warnings.
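The resource and service checks were done with roughly these commands (a sketch, not the exact invocations):

openstack hypervisor stats show
openstack compute service list
df -h /var/lib/nova/instances
free -g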

I also followed the advice here, to no avail:

https://stackoverflow.com/questions/47504867/openstack-vm-instance-stuck-in-spawning-state

I captured debug logs on the controller and the compute node while reproducing the launch.
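Debug was enabled roughly like this on both nodes beforehand, and the logs were followed with a multi-file tail, which is where the "==> file <==" headers below come from (a sketch; the service names are the Ubuntu package names, matching the dist-packages paths in the logs):

# /etc/nova/nova.conf on both nodes
[DEFAULT]
debug = true

systemctl restart nova-api nova-scheduler nova-conductor   # controller
systemctl restart nova-compute                             # compute
tail -f /var/log/nova/*.log

Here are the logs: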

controller node:


==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:14.064 7070 INFO nova.osapi_compute.wsgi.server [req-33d9eb1d-1127-4a62-af82-043d3570d287 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "POST /v2.1/os-keypairs HTTP/1.1" status: 200 len: 2692 time: 0.1951220

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:14.091 91832 DEBUG oslo_service.periodic_task [req-fa7a35f5-8ed9-4730-bd80-620e5d6c9df4 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:14.118 91839 DEBUG oslo_service.periodic_task [req-6370673d-0ab8-4479-b9a8-4e04d0e87a52 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:15.272 91901 DEBUG oslo_service.periodic_task [req-e2bae43a-deb6-4496-a523-4eb01071855d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:15.405 91957 DEBUG oslo_service.periodic_task [req-4a2f6771-395f-4388-a246-eadd7e2b415f - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:16.297 91911 DEBUG oslo_service.periodic_task [req-2f1c24a2-a443-44d1-a4cb-dcec925bcdd2 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:17.254 91895 DEBUG oslo_service.periodic_task [req-d3c2cf57-5aba-4041-ad55-9fea5f4de0e3 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.082 91828 DEBUG oslo_service.periodic_task [req-4b94c387-eeaa-4f34-89c6-6c553650f30a - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.101 91835 DEBUG oslo_service.periodic_task [req-cade0ce9-fd4e-48fc-a4a6-6af450cbea14 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.170 91861 DEBUG oslo_service.periodic_task [req-ba1e0089-12f5-4c90-b393-9521ac012da6 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.170 91853 DEBUG oslo_service.periodic_task [req-e06452c4-e3c1-4a15-8614-c5b5cccafbd6 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.316 91919 DEBUG oslo_service.periodic_task [req-d7372d4a-9fd7-45ac-9487-9de2919ec403 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.441 91966 DEBUG oslo_service.periodic_task [req-9ea8a027-e3e3-4cd8-8481-e76468440fd5 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:20.443 91970 DEBUG oslo_service.periodic_task [req-a35cb813-0256-4646-adcf-019f3fd24e4e - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:21.590 92016 DEBUG oslo_service.periodic_task [req-6bd19360-5110-4745-8aea-4086dbdc2665 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.225 91885 DEBUG oslo_service.periodic_task [req-de575a74-880b-40d3-97d8-4a4e63e47c4e - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.278 91900 DEBUG oslo_service.periodic_task [req-3490c661-662c-4070-8011-78c06478d6ed - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.372 91942 DEBUG oslo_service.periodic_task [req-9796132a-db8d-4251-bbca-a8496563ffbb - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.167 91857 DEBUG oslo_service.periodic_task [req-69f0c3b8-0bac-42b8-b84a-80d5a693ec54 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.402 91952 DEBUG oslo_service.periodic_task [req-ab31a670-376a-48ac-aa10-da72a893c6d4 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:25.131 91844 DEBUG oslo_service.periodic_task [req-a66feeb0-468f-4de7-a5d8-a3576fb4df57 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.130 91846 DEBUG oslo_service.periodic_task [req-c2d433ff-ef66-4f57-b241-3043e2e619b8 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.210 91877 DEBUG oslo_service.periodic_task [req-9cace30e-9851-4163-82f2-5fce0b922d5a - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.555 92009 DEBUG oslo_service.periodic_task [req-380bc36c-be51-4f19-bb94-af8c7d587811 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:27.219 91881 DEBUG oslo_service.periodic_task [req-4fd4180a-e8a2-434a-9e79-daa43b2e9bab - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:27.417 7066 INFO nova.osapi_compute.wsgi.server [req-3915c5dd-2d4e-40a1-beef-42f4ed0a6fcc 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1 HTTP/1.1" status: 302 len: 312 time: 0.0048208
2019-04-24 11:33:27.425 7066 INFO nova.osapi_compute.wsgi.server [req-1f566f9f-0de0-44b4-ab88-0fda5a3192b0 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1/ HTTP/1.1" status: 200 len: 783 time: 0.0048091

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:27.594 92026 DEBUG oslo_service.periodic_task [req-d6799b5e-8213-4768-a165-9b36aaf8e58d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:29.758 7070 INFO nova.osapi_compute.wsgi.server [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "POST /v2.1/servers HTTP/1.1" status: 202 len: 890 time: 2.1083019

==> /var/log/nova/nova-conductor.log <==
2019-04-24 11:33:29.810 91732 DEBUG oslo_db.sqlalchemy.engines [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/engines.py:308

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:29.833 7070 INFO nova.osapi_compute.wsgi.server [req-76ab185a-0860-496b-9887-8e572dd7fb57 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1/servers/e501e9b6-e4ac-464f-b1d6-9a5439c5c269 HTTP/1.1" status: 200 len: 2130 time: 0.0689468

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:29.885 91827 DEBUG nova.scheduler.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Starting to schedule for instances: [u'e501e9b6-e4ac-464f-b1d6-9a5439c5c269'] select_destinations /usr/lib/python2.7/dist-packages/nova/scheduler/manager.py:117
2019-04-24 11:33:29.891 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:29.910 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.019s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.353 91934 DEBUG oslo_service.periodic_task [req-a780f069-49bf-4934-9d03-466002dcae49 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.556 92011 DEBUG oslo_service.periodic_task [req-019faeb0-f42a-4ab0-8670-8d5dad10f904 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.617 92034 DEBUG oslo_service.periodic_task [req-84511f42-5082-4c0e-8841-e22e9bec977d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.936 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.936 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.980 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.980 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.984 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" acquired by "nova.scheduler.host_manager._locked_update" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.984 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state from compute node: ComputeNode(cpu_allocation_ratio=16.0,cpu_info='{"vendor": "Intel", "model": "Skylake-Server-IBRS", "arch": "x86_64", "features": ["pku", "rtm", "tsc_adjust", "vme", "pge", "xsaveopt", "smep", "fpu", "monitor", "lm", "tsc", "adx", "fxsr", "tm", "pclmuldq", "xgetbv1", "tsc-deadline", "arat", "de", "aes", "pse", "sse", "f16c", "ds", "mpx", "avx512f", "avx2", "pbe", "cx16", "ds_cpl", "movbe", "ospke", "vmx", "sep", "avx512dq", "xsave", "erms", "hle", "est", "smx", "abm", "sse4.1", "sse4.2", "ssbd", "acpi", "popcnt", "mmx", "osxsave", "clwb", "dca", "pdcm", "invtsc", "tm2", "pcid", "rdrand", "avx512vl", "x2apic", "smap", "clflush", "dtes64", "xtpr", "avx512bw", "msr", "fma", "cx8", "mce", "avx512cd", "mca", "lahf_lm", "rdseed", "apic", "fsgsbase", "rdtscp", "ssse3", "pse36", "mtrr", "avx", "syscall", "invpcid", "cmov", "spec-ctrl", "clflushopt", "pat", "3dnowprefetch", "nx", "pae", "ht", "pdpe1gb", "xsavec", "pni", "sse2", "ss", "bmi1", "bmi2", "xsaves"], "topology": {"cores": 20, "cells": 2, "threads": 2, "sockets": 1}}',created_at=2019-04-09T19:10:16Z,current_workload=1,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=927,free_disk_gb=875,free_ram_mb=637735,host='hpcccloud2',host_ip=10.131.36.22,hypervisor_hostname='hpcccloud2',hypervisor_type='QEMU',hypervisor_version=2011001,id=2,local_gb=983,local_gb_used=108,mapped=0,memory_mb=644391,memory_mb_used=6656,metrics='[]',numa_topology='{"nova_object.version": "1.2", "nova_object.changes": ["cells"], "nova_object.name": "NUMATopology", "nova_object.data": {"cells": [{"nova_object.version": "1.3", "nova_object.changes": ["network_metadata", "cpu_usage", "memory_usage", "cpuset", "pinned_cpus", "siblings", "memory", "mempages", "id"], "nova_object.name": "NUMACell", "nova_object.data": {"network_metadata": {"nova_object.version": "1.0", "nova_object.changes": ["physnets", "tunneled"], "nova_object.name": "NetworkMetadata", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.namespace": "nova"}, "cpu_usage": 0, "memory_usage": 0, "cpuset": [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78], "pinned_cpus": [], "siblings": [[18, 58], [12, 52], [0, 40], [24, 64], [8, 48], [28, 68], [14, 54], [20, 60], [34, 74], [6, 46], [30, 70], [36, 76], [4, 44], [26, 66], [38, 78], [32, 72], [2, 42], [10, 50], [22, 62], [16, 56]], "memory": 321861, "mempages": [{"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 82396621, "reserved": 0, "size_kb": 4}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 2048}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 1048576}, "nova_object.namespace": "nova"}], "id": 0}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.3", 
"nova_object.changes": ["network_metadata", "cpu_usage", "memory_usage", "cpuset", "pinned_cpus", "siblings", "memory", "mempages", "id"], "nova_object.name": "NUMACell", "nova_object.data": {"network_metadata": {"nova_object.version": "1.0", "nova_object.changes": ["physnets", "tunneled"], "nova_object.name": "NetworkMetadata", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.namespace": "nova"}, "cpu_usage": 0, "memory_usage": 0, "cpuset": [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79], "pinned_cpus": [], "siblings": [[19, 59], [21, 61], [13, 53], [35, 75], [31, 71], [33, 73], [5, 45], [37, 77], [3, 43], [23, 63], [25, 65], [27, 67], [9, 49], [29, 69], [39, 79], [1, 41], [17, 57], [11, 51], [15, 55], [7, 47]], "memory": 322529, "mempages": [{"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 82567671, "reserved": 0, "size_kb": 4}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 2048}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 1048576}, "nova_object.namespace": "nova"}], "id": 1}, "nova_object.namespace": "nova"}]}, "nova_object.namespace": "nova"}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=1.5,running_vms=3,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='3',num_os_type_None='3',num_proj_2848a37532334b5cad85adc261e02e69='1',num_proj_bda9284c0c8e410092d72879ea2e5f05='2',num_task_None='2',num_task_deleting='1',num_vm_error='3'},supported_hv_specs=[HVSpec,HVSpec],updated_at=2019-04-24T16:32:19Z,uuid=9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba,vcpus=80,vcpus_used=4) _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:171
2019-04-24 11:33:30.988 91827 WARNING nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Host hpcccloud2 has more disk space than database expected (927 GB > 875 GB)
2019-04-24 11:33:30.990 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with aggregates: [] _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:174
2019-04-24 11:33:30.991 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with service dict: {'binary': u'nova-compute', 'uuid': u'337910f3-660b-4b77-b6a3-ecfc3f8b6a04', 'deleted': False, 'created_at': datetime.datetime(2019, 4, 2, 18, 6, 16, tzinfo=<iso8601.Utc>), 'updated_at': datetime.datetime(2019, 4, 24, 16, 33, 28, tzinfo=<iso8601.Utc>), 'report_count': 152336, 'topic': u'compute', 'host': u'hpcccloud2', 'version': 35, 'disabled': False, 'forced_down': False, 'last_seen_up': datetime.datetime(2019, 4, 24, 16, 33, 28, tzinfo=<iso8601.Utc>), 'deleted_at': None, 'disabled_reason': None, 'id': 8} _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:177
2019-04-24 11:33:30.991 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with instances: [u'6828030d-f0ef-4efd-b8fd-f33af0cb9249', u'60e9e3db-cede-4c98-86b1-11648022fecb', u'58fa1dc8-55a5-40c4-be38-96b7a49dcf50'] _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:181
2019-04-24 11:33:30.992 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" released by "nova.scheduler.host_manager._locked_update" :: held 0.008s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.992 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Starting with 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:70
2019-04-24 11:33:30.993 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter RetryFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.993 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter AvailabilityZoneFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.994 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ComputeFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.995 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ComputeCapabilitiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ImagePropertiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ServerGroupAntiAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ServerGroupAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.997 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filtered [(hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3] _get_sorted_hosts /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:435
2019-04-24 11:33:30.997 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Weighed [(hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3] _get_sorted_hosts /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:455
2019-04-24 11:33:30.998 91827 DEBUG nova.scheduler.utils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Attempting to claim resources in the placement API for instance e501e9b6-e4ac-464f-b1d6-9a5439c5c269 claim_resources /usr/lib/python2.7/dist-packages/nova/scheduler/utils.py:934
2019-04-24 11:33:30.999 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:31.001 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.003s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:31.151 91855 DEBUG oslo_service.periodic_task [req-51ad99e1-3253-43e1-91c2-b098a2a2b891 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:31.288 91905 DEBUG oslo_service.periodic_task [req-6dd9cb6b-9b02-4dd1-ace2-6aa701bc92e5 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:31.579 92020 DEBUG oslo_service.periodic_task [req-81a7f7dc-2f73-4196-bd8d-95cc8a003819 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:32.596 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Selected host: (hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3 _consume_selected_host /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:346
2019-04-24 11:33:32.597 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" acquired by "nova.scheduler.host_manager._locked" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:32.608 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" released by "nova.scheduler.host_manager._locked" :: held 0.011s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285



compute node:

2019-04-24 11:33:22.012 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.012 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.013 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7069
2019-04-24 11:33:22.105 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] [instance: 58fa1dc8-55a5-40c4-be38-96b7a49dcf50] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7119
2019-04-24 11:33:22.105 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7145
2019-04-24 11:33:23.104 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.158 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Auditing locally available compute resources for hpcccloud2 (node: hpcccloud2) update_available_resource /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:704
## inventory output omitted (too large to include)
2019-04-24 11:33:23.546 45826 DEBUG oslo_concurrency.lockutils [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker._update_available_resource" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:23.622 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Compute driver doesn't require allocation refresh and we're on a compute host in a deployment that only has compute hosts with Nova versions >=16 (Pike). Skipping auto-correction of allocations. _update_usage_from_instances /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1281
2019-04-24 11:33:23.729 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 6828030d-f0ef-4efd-b8fd-f33af0cb9249 actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 4096, u'VCPU': 2, u'DISK_GB': 40}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 60e9e3db-cede-4c98-86b1-11648022fecb actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 1024, u'VCPU': 1, u'DISK_GB': 34}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 58fa1dc8-55a5-40c4-be38-96b7a49dcf50 actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 1024, u'VCPU': 1, u'DISK_GB': 34}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Total usable vcpus: 80, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:860
2019-04-24 11:33:23.730 45826 INFO nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Final resource view: name=hpcccloud2 phys_ram=644391MB used_ram=6656MB phys_disk=983GB used_disk=108GB total_vcpus=80 used_vcpus=4 pci_stats=[]
2019-04-24 11:33:23.795 45826 DEBUG nova.scheduler.client.report [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Refreshing aggregate associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, aggregates: None _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:792
2019-04-24 11:33:23.817 45826 DEBUG nova.scheduler.client.report [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Refreshing trait associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, traits: HW_CPU_X86_AESNI,HW_CPU_X86_AVX512CD,HW_CPU_X86_FMA3,HW_CPU_X86_MPX,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX512DQ,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_AVX512F,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_VMX,HW_CPU_X86_BMI2,HW_CPU_X86_AVX512VL,HW_CPU_X86_AVX512BW,HW_CPU_X86_ABM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:806
2019-04-24 11:33:23.846 45826 DEBUG nova.virt.libvirt.driver [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] CPU mode 'host-model' model '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py:3841
2019-04-24 11:33:23.850 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Compute_service record updated for hpcccloud2:hpcccloud2 _update_available_resource /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:800
2019-04-24 11:33:23.850 45826 DEBUG oslo_concurrency.lockutils [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker._update_available_resource" :: held 0.304s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:23.851 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:24.758 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:24.758 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7742
2019-04-24 11:33:36.047 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "e501e9b6-e4ac-464f-b1d6-9a5439c5c269" acquired by "nova.compute.manager._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:36.101 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Starting instance... _do_build_and_run_instance /usr/lib/python2.7/dist-packages/nova/compute/manager.py:1913
2019-04-24 11:33:36.257 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.instance_claim" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Memory overhead for 1024 MB instance; 0 MB instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:205
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Disk overhead for 32 GB instance; 0 GB instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:208
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CPU overhead for 1 vCPU(s) instance; 0 vCPU(s) instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:211
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Attempting claim on node hpcccloud2: memory 1024 MB, disk 32 GB, vcpus 1 CPU
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Total memory: 644391 MB, used: 6656.00 MB
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] memory limit not specified, defaulting to unlimited
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Total disk: 983 GB, used: 108.00 GB
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] disk limit not specified, defaulting to unlimited
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Total vcpu: 80 VCPU, used: 4.00 VCPU
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] vcpu limit not specified, defaulting to unlimited
2019-04-24 11:33:36.300 45826 DEBUG nova.virt.hardware [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python2.7/dist-packages/nova/virt/hardware.py:1664
2019-04-24 11:33:36.300 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Claim successful on node hpcccloud2
2019-04-24 11:33:36.574 45826 DEBUG nova.scheduler.client.report [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Refreshing aggregate associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, aggregates: None _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:792
2019-04-24 11:33:36.600 45826 DEBUG nova.scheduler.client.report [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Refreshing trait associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, traits: HW_CPU_X86_AESNI,HW_CPU_X86_AVX512CD,HW_CPU_X86_FMA3,HW_CPU_X86_MPX,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX512DQ,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_AVX512F,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_VMX,HW_CPU_X86_BMI2,HW_CPU_X86_AVX512VL,HW_CPU_X86_AVX512BW,HW_CPU_X86_ABM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:806
2019-04-24 11:33:36.651 45826 DEBUG nova.virt.libvirt.driver [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CPU mode 'host-model' model '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py:3841
2019-04-24 11:33:36.655 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.instance_claim" :: held 0.397s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:36.655 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Start building networks asynchronously for instance. _build_resources /usr/lib/python2.7/dist-packages/nova/compute/manager.py:2280
2019-04-24 11:33:36.822 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Allocating IP information in the background. _allocate_network_async /usr/lib/python2.7/dist-packages/nova/compute/manager.py:1493
2019-04-24 11:33:36.822 45826 DEBUG nova.network.neutronv2.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] allocate_for_instance() allocate_for_instance /usr/lib/python2.7/dist-packages/nova/network/neutronv2/api.py:996
2019-04-24 11:33:36.869 45826 DEBUG nova.block_device [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] block_device_list [] volume_in_mapping /usr/lib/python2.7/dist-packages/nova/block_device.py:575
2019-04-24 11:33:36.911 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Start building block device mappings for instance. _build_resources /usr/lib/python2.7/dist-packages/nova/compute/manager.py:2314
2019-04-24 11:33:36.921 45826 DEBUG oslo_policy.policy [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] The policy file policy.json could not be found. load_rules /usr/lib/python2.7/dist-packages/oslo_policy/policy.py:559
2019-04-24 11:33:37.029 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:37.031 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.002s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:37.986 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python2.7/dist-packages/nova/compute/manager.py:2127
2019-04-24 11:33:37.987 45826 DEBUG nova.block_device [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] block_device_list [u'vdb'] volume_in_mapping /usr/lib/python2.7/dist-packages/nova/block_device.py:575
2019-04-24 11:33:37.987 45826 INFO nova.virt.libvirt.driver [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Creating image
2019-04-24 11:33:37.988 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" acquired by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:37.988 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" released by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:37.989 45826 INFO oslo.privsep.daemon [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpVpRflp/privsep.sock']
2019-04-24 11:33:38.421 45826 INFO oslo.privsep.daemon [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Spawned new privsep daemon via rootwrap
2019-04-24 11:33:38.360 45904 INFO oslo.privsep.daemon [-] privsep daemon starting
2019-04-24 11:33:38.363 45904 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2019-04-24 11:33:38.364 45904 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
2019-04-24 11:33:38.364 45904 INFO oslo.privsep.daemon [-] privsep daemon running as pid 45904
2019-04-24 11:33:38.457 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd(subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.497 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840ba563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.499 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "abc261a53dd1e62f06a0d7cd2fc8da800b68383a" acquired by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.499 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.539 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.540 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.564 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk" returned: 0 in 0.024s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.565 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk. size=34359738368 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.566 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.606 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.607 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img resize /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk 34359738368 execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.647 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img resize /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk 34359738368" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.648 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "abc261a53dd1e62f06a0d7cd2fc8da800b68383a" released by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: held 0.149s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:38.648 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.688 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.689 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk. size=34359738368 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.689 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.725 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share" returned: 0 in 0.036s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.726 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Cannot resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk to a smaller size. can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:184
2019-04-24 11:33:38.727 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" acquired by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.728 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" released by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:38.729 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.769 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.769 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "swap_2048" acquired by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.770 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.808 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share" returned: 0 in 0.038s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.809 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/swap_2048 /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.835 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/swap_2048 /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap" returned: 0 in 0.026s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.836 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap. size=2147483648 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.836 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.872 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap --force-share" returned: 0 in 0.036s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.873 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Cannot resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap to a smaller size. can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:184
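For reference, the last few lines above show the libvirt image backend building the instance's disk and swap disk as qcow2 copy-on-write overlays on top of cached base images in /var/lib/nova/instances/_base, and both qemu-img calls return 0. A rough manual equivalent (using the exact paths from the log) would be:

# rough manual equivalent of the imagebackend calls logged above
qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk
# inspect the resulting overlay
qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share

so the image creation step itself appears to complete normally.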

Image stuck in spawning state

I have a new openstack install on a compute node and controller node, bare metal. It all seems to work until I try to instantiate an instance. It just hangs in spawning.

On the compute node, I can find the node in the /var/lib/nova/ directory, but the disk file is like 200KB. I have read the faqs, searched and read forum posts, etc... most of them point to a lack of resources, but I have 900GB of disk and countless CPU cores and memory free. I have logged a bit at the time of spawning, and I will reproduce it here. Any help would be greatly appreciated.
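One quick way to sanity-check that disk file on the compute node (assuming the default qcow2 image backend and the instance UUID that appears in the logs below) is:

ls -lsh /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/
qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share   # compare virtual size vs. actual size

A file of roughly 200KB is what a freshly created qcow2 overlay normally looks like; it only grows as the guest writes, so the small size on its own is not necessarily a problem.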

Steps taken so far:

Checked resources on the compute node: it has nearly 1TB of free storage, 628GB of RAM free and 76 CPUs free. Checked permissions on the directories. Tried to debug by turning on debug logging for nova on both the controller and compute node. I cannot see anything totally out of the ordinary. The logs do not show any errors, only warnings.
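Debug logging here means the standard oslo.log switch in nova.conf; a sketch of what that amounts to (the systemctl unit names assume the stock Ubuntu packages):

# /etc/nova/nova.conf on both nodes
# [DEFAULT]
# debug = true
systemctl restart nova-api nova-scheduler nova-conductor    # controller
systemctl restart nova-compute                              # compute node

df -h /var/lib/nova, free -h and nproc are the quick ways to re-check the resource numbers above.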

Followed some advice from here to no avail also:

https://stackoverflow.com/questions/47504867/openstack-vm-instance-stuck-in-spawning-state

I removed the original debug logs, as I think they are a red herring. The network (I think) was not properly set up, so I went back and set it up as per these docs: https://computingforgeeks.com/creating-openstack-network-and-subnets/ Now at least when I spawn an instance the dashboard shows the correct IPs.
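With the openstack CLI, a guide like that one typically boils down to commands of this shape (the network name, physical network label and address range here are placeholders, not necessarily the ones used in this deployment):

openstack network create --share --external --provider-network-type flat --provider-physical-network provider provider-net
openstack subnet create --network provider-net --subnet-range 203.0.113.0/24 --gateway 203.0.113.1 --allocation-pool start=203.0.113.100,end=203.0.113.200 provider-subnet
openstack network list    # confirm the network and subnet exist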

Now there are no libvirt logs, as user eblock was asking for, but I do see this in the neutron-linuxbridge-agent.log. Debug logs from the controller and the compute follow:

controller node:


==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:14.064 7070 INFO nova.osapi_compute.wsgi.server [req-33d9eb1d-1127-4a62-af82-043d3570d287 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "POST /v2.1/os-keypairs HTTP/1.1" status: 200 len: 2692 time: 0.1951220

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:14.091 91832 DEBUG oslo_service.periodic_task [req-fa7a35f5-8ed9-4730-bd80-620e5d6c9df4 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:14.118 91839 DEBUG oslo_service.periodic_task [req-6370673d-0ab8-4479-b9a8-4e04d0e87a52 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:15.272 91901 DEBUG oslo_service.periodic_task [req-e2bae43a-deb6-4496-a523-4eb01071855d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:15.405 91957 DEBUG oslo_service.periodic_task [req-4a2f6771-395f-4388-a246-eadd7e2b415f - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:16.297 91911 DEBUG oslo_service.periodic_task [req-2f1c24a2-a443-44d1-a4cb-dcec925bcdd2 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:17.254 91895 DEBUG oslo_service.periodic_task [req-d3c2cf57-5aba-4041-ad55-9fea5f4de0e3 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.082 91828 DEBUG oslo_service.periodic_task [req-4b94c387-eeaa-4f34-89c6-6c553650f30a - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.101 91835 DEBUG oslo_service.periodic_task [req-cade0ce9-fd4e-48fc-a4a6-6af450cbea14 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:18.170 91861 DEBUG oslo_service.periodic_task [req-ba1e0089-12f5-4c90-b393-9521ac012da6 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.170 91853 DEBUG oslo_service.periodic_task [req-e06452c4-e3c1-4a15-8614-c5b5cccafbd6 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.316 91919 DEBUG oslo_service.periodic_task [req-d7372d4a-9fd7-45ac-9487-9de2919ec403 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:19.441 91966 DEBUG oslo_service.periodic_task [req-9ea8a027-e3e3-4cd8-8481-e76468440fd5 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:20.443 91970 DEBUG oslo_service.periodic_task [req-a35cb813-0256-4646-adcf-019f3fd24e4e - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:21.590 92016 DEBUG oslo_service.periodic_task [req-6bd19360-5110-4745-8aea-4086dbdc2665 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.225 91885 DEBUG oslo_service.periodic_task [req-de575a74-880b-40d3-97d8-4a4e63e47c4e - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.278 91900 DEBUG oslo_service.periodic_task [req-3490c661-662c-4070-8011-78c06478d6ed - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.372 91942 DEBUG oslo_service.periodic_task [req-9796132a-db8d-4251-bbca-a8496563ffbb - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.167 91857 DEBUG oslo_service.periodic_task [req-69f0c3b8-0bac-42b8-b84a-80d5a693ec54 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.402 91952 DEBUG oslo_service.periodic_task [req-ab31a670-376a-48ac-aa10-da72a893c6d4 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:25.131 91844 DEBUG oslo_service.periodic_task [req-a66feeb0-468f-4de7-a5d8-a3576fb4df57 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.130 91846 DEBUG oslo_service.periodic_task [req-c2d433ff-ef66-4f57-b241-3043e2e619b8 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.210 91877 DEBUG oslo_service.periodic_task [req-9cace30e-9851-4163-82f2-5fce0b922d5a - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:26.555 92009 DEBUG oslo_service.periodic_task [req-380bc36c-be51-4f19-bb94-af8c7d587811 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:27.219 91881 DEBUG oslo_service.periodic_task [req-4fd4180a-e8a2-434a-9e79-daa43b2e9bab - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:27.417 7066 INFO nova.osapi_compute.wsgi.server [req-3915c5dd-2d4e-40a1-beef-42f4ed0a6fcc 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1 HTTP/1.1" status: 302 len: 312 time: 0.0048208
2019-04-24 11:33:27.425 7066 INFO nova.osapi_compute.wsgi.server [req-1f566f9f-0de0-44b4-ab88-0fda5a3192b0 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1/ HTTP/1.1" status: 200 len: 783 time: 0.0048091

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:27.594 92026 DEBUG oslo_service.periodic_task [req-d6799b5e-8213-4768-a165-9b36aaf8e58d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:29.758 7070 INFO nova.osapi_compute.wsgi.server [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "POST /v2.1/servers HTTP/1.1" status: 202 len: 890 time: 2.1083019

==> /var/log/nova/nova-conductor.log <==
2019-04-24 11:33:29.810 91732 DEBUG oslo_db.sqlalchemy.engines [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION _check_effective_sql_mode /usr/lib/python2.7/dist-packages/oslo_db/sqlalchemy/engines.py:308

==> /var/log/nova/nova-api.log <==
2019-04-24 11:33:29.833 7070 INFO nova.osapi_compute.wsgi.server [req-76ab185a-0860-496b-9887-8e572dd7fb57 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] 10.131.36.21 "GET /v2.1/servers/e501e9b6-e4ac-464f-b1d6-9a5439c5c269 HTTP/1.1" status: 200 len: 2130 time: 0.0689468

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:29.885 91827 DEBUG nova.scheduler.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Starting to schedule for instances: [u'e501e9b6-e4ac-464f-b1d6-9a5439c5c269'] select_destinations /usr/lib/python2.7/dist-packages/nova/scheduler/manager.py:117
2019-04-24 11:33:29.891 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:29.910 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.019s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285

==> /var/log/neutron/neutron-linuxbridge-agent.log <==
2019-04-26 11:56:11.394 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent MessagingTimeout: Timed out waiting for a reply to message ID 900a069b1c5144ce810da1bae9066159
2019-04-26 11:56:11.394 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent 
2019-04-26 11:56:11.395 2417 WARNING oslo.service.loopingcall [-] Function 'neutron.plugins.ml2.drivers.agent._common_agent.CommonAgentLoop._report_state' run outlasted interval by 30.00 sec
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent [-] Failed reporting state!: MessagingTimeout: Timed out waiting for a reply to message ID 7bbd26ab7c624dbeb24f60a1f827f119
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent Traceback (most recent call last):
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/agent/_common_agent.py", line 129, in _report_state
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     True)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 97, in report_state
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     return method(context, 'report_state', **kwargs)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 179, in call
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     retry=self.retry)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 133, in _send
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     retry=retry)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 584, in send
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     call_monitor_timeout, retry=retry)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 573, in _send
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     call_monitor_timeout)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 459, in wait
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     message = self.waiters.get(msg_id, timeout=timeout)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent   File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 336, in get
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent     'to message ID %s' % msg_id)
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent MessagingTimeout: Timed out waiting for a reply to message ID 7bbd26ab7c624dbeb24f60a1f827f119
2019-04-26 11:57:11.396 2417 ERROR neutron.plugins.ml2.drivers.agent._common_agent 
2019-04-26 11:57:11.397 2417 WARNING oslo.service.loopingcall [-] Function 'neutron.plugins.ml2.drivers.agent._common_agent.CommonAgentLoop._report_state' run outlasted interval by

==> /var/log/nova/nova-scheduler.log <==
2019-04-24 11:33:30.353 91934 DEBUG oslo_service.periodic_task [req-a780f069-49bf-4934-9d03-466002dcae49 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.556 92011 DEBUG oslo_service.periodic_task [req-019faeb0-f42a-4ab0-8670-8d5dad10f904 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.617 92034 DEBUG oslo_service.periodic_task [req-84511f42-5082-4c0e-8841-e22e9bec977d - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:30.936 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.936 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.001s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.980 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" acquired by "nova.context.get_or_set_cached_cell_and_set_connections" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.980 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "be489632-58a4-4c10-9901-efef9dee6494" released by "nova.context.get_or_set_cached_cell_and_set_connections" :: held 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.984 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" acquired by "nova.scheduler.host_manager._locked_update" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:30.984 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state from compute node: ComputeNode(cpu_allocation_ratio=16.0,cpu_info='{"vendor": "Intel", "model": "Skylake-Server-IBRS", "arch": "x86_64", "features": ["pku", "rtm", "tsc_adjust", "vme", "pge", "xsaveopt", "smep", "fpu", "monitor", "lm", "tsc", "adx", "fxsr", "tm", "pclmuldq", "xgetbv1", "tsc-deadline", "arat", "de", "aes", "pse", "sse", "f16c", "ds", "mpx", "avx512f", "avx2", "pbe", "cx16", "ds_cpl", "movbe", "ospke", "vmx", "sep", "avx512dq", "xsave", "erms", "hle", "est", "smx", "abm", "sse4.1", "sse4.2", "ssbd", "acpi", "popcnt", "mmx", "osxsave", "clwb", "dca", "pdcm", "invtsc", "tm2", "pcid", "rdrand", "avx512vl", "x2apic", "smap", "clflush", "dtes64", "xtpr", "avx512bw", "msr", "fma", "cx8", "mce", "avx512cd", "mca", "lahf_lm", "rdseed", "apic", "fsgsbase", "rdtscp", "ssse3", "pse36", "mtrr", "avx", "syscall", "invpcid", "cmov", "spec-ctrl", "clflushopt", "pat", "3dnowprefetch", "nx", "pae", "ht", "pdpe1gb", "xsavec", "pni", "sse2", "ss", "bmi1", "bmi2", "xsaves"], "topology": {"cores": 20, "cells": 2, "threads": 2, "sockets": 1}}',created_at=2019-04-09T19:10:16Z,current_workload=1,deleted=False,deleted_at=None,disk_allocation_ratio=1.0,disk_available_least=927,free_disk_gb=875,free_ram_mb=637735,host='hpcccloud2',host_ip=10.131.36.22,hypervisor_hostname='hpcccloud2',hypervisor_type='QEMU',hypervisor_version=2011001,id=2,local_gb=983,local_gb_used=108,mapped=0,memory_mb=644391,memory_mb_used=6656,metrics='[]',numa_topology='{"nova_object.version": "1.2", "nova_object.changes": ["cells"], "nova_object.name": "NUMATopology", "nova_object.data": {"cells": [{"nova_object.version": "1.3", "nova_object.changes": ["network_metadata", "cpu_usage", "memory_usage", "cpuset", "pinned_cpus", "siblings", "memory", "mempages", "id"], "nova_object.name": "NUMACell", "nova_object.data": {"network_metadata": {"nova_object.version": "1.0", "nova_object.changes": ["physnets", "tunneled"], "nova_object.name": "NetworkMetadata", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.namespace": "nova"}, "cpu_usage": 0, "memory_usage": 0, "cpuset": [0, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24, 26, 28, 30, 32, 34, 36, 38, 40, 42, 44, 46, 48, 50, 52, 54, 56, 58, 60, 62, 64, 66, 68, 70, 72, 74, 76, 78], "pinned_cpus": [], "siblings": [[18, 58], [12, 52], [0, 40], [24, 64], [8, 48], [28, 68], [14, 54], [20, 60], [34, 74], [6, 46], [30, 70], [36, 76], [4, 44], [26, 66], [38, 78], [32, 72], [2, 42], [10, 50], [22, 62], [16, 56]], "memory": 321861, "mempages": [{"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 82396621, "reserved": 0, "size_kb": 4}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 2048}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 1048576}, "nova_object.namespace": "nova"}], "id": 0}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.3", 
"nova_object.changes": ["network_metadata", "cpu_usage", "memory_usage", "cpuset", "pinned_cpus", "siblings", "memory", "mempages", "id"], "nova_object.name": "NUMACell", "nova_object.data": {"network_metadata": {"nova_object.version": "1.0", "nova_object.changes": ["physnets", "tunneled"], "nova_object.name": "NetworkMetadata", "nova_object.data": {"physnets": [], "tunneled": false}, "nova_object.namespace": "nova"}, "cpu_usage": 0, "memory_usage": 0, "cpuset": [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79], "pinned_cpus": [], "siblings": [[19, 59], [21, 61], [13, 53], [35, 75], [31, 71], [33, 73], [5, 45], [37, 77], [3, 43], [23, 63], [25, 65], [27, 67], [9, 49], [29, 69], [39, 79], [1, 41], [17, 57], [11, 51], [15, 55], [7, 47]], "memory": 322529, "mempages": [{"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 82567671, "reserved": 0, "size_kb": 4}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 2048}, "nova_object.namespace": "nova"}, {"nova_object.version": "1.1", "nova_object.changes": ["total", "used", "reserved", "size_kb"], "nova_object.name": "NUMAPagesTopology", "nova_object.data": {"used": 0, "total": 0, "reserved": 0, "size_kb": 1048576}, "nova_object.namespace": "nova"}], "id": 1}, "nova_object.namespace": "nova"}]}, "nova_object.namespace": "nova"}',pci_device_pools=PciDevicePoolList,ram_allocation_ratio=1.5,running_vms=3,service_id=None,stats={failed_builds='0',io_workload='0',num_instances='3',num_os_type_None='3',num_proj_2848a37532334b5cad85adc261e02e69='1',num_proj_bda9284c0c8e410092d72879ea2e5f05='2',num_task_None='2',num_task_deleting='1',num_vm_error='3'},supported_hv_specs=[HVSpec,HVSpec],updated_at=2019-04-24T16:32:19Z,uuid=9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba,vcpus=80,vcpus_used=4) _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:171
2019-04-24 11:33:30.988 91827 WARNING nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Host hpcccloud2 has more disk space than database expected (927 GB > 875 GB)
2019-04-24 11:33:30.990 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with aggregates: [] _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:174
2019-04-24 11:33:30.991 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with service dict: {'binary': u'nova-compute', 'uuid': u'337910f3-660b-4b77-b6a3-ecfc3f8b6a04', 'deleted': False, 'created_at': datetime.datetime(2019, 4, 2, 18, 6, 16, tzinfo=<iso8601.Utc>), 'updated_at': datetime.datetime(2019, 4, 24, 16, 33, 28, tzinfo=<iso8601.Utc>), 'report_count': 152336, 'topic': u'compute', 'host': u'hpcccloud2', 'version': 35, 'disabled': False, 'forced_down': False, 'last_seen_up': datetime.datetime(2019, 4, 24, 16, 33, 28, tzinfo=<iso8601.Utc>), 'deleted_at': None, 'disabled_reason': None, 'id': 8} _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:177
2019-04-24 11:33:30.991 91827 DEBUG nova.scheduler.host_manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Update host state with instances: [u'6828030d-f0ef-4efd-b8fd-f33af0cb9249', u'60e9e3db-cede-4c98-86b1-11648022fecb', u'58fa1dc8-55a5-40c4-be38-96b7a49dcf50'] _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:181
2019-04-24 11:33:30.992 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" released by "nova.scheduler.host_manager._locked_update" :: held 0.008s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:30.992 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Starting with 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:70
2019-04-24 11:33:30.993 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter RetryFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.993 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter AvailabilityZoneFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.994 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ComputeFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.995 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ComputeCapabilitiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ImagePropertiesFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ServerGroupAntiAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.996 91827 DEBUG nova.filters [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filter ServerGroupAffinityFilter returned 1 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2019-04-24 11:33:30.997 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Filtered [(hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3] _get_sorted_hosts /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:435
2019-04-24 11:33:30.997 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Weighed [(hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3] _get_sorted_hosts /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:455
2019-04-24 11:33:30.998 91827 DEBUG nova.scheduler.utils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Attempting to claim resources in the placement API for instance e501e9b6-e4ac-464f-b1d6-9a5439c5c269 claim_resources /usr/lib/python2.7/dist-packages/nova/scheduler/utils.py:934
2019-04-24 11:33:30.999 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:31.001 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.003s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:31.151 91855 DEBUG oslo_service.periodic_task [req-51ad99e1-3253-43e1-91c2-b098a2a2b891 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:31.288 91905 DEBUG oslo_service.periodic_task [req-6dd9cb6b-9b02-4dd1-ace2-6aa701bc92e5 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:31.579 92020 DEBUG oslo_service.periodic_task [req-81a7f7dc-2f73-4196-bd8d-95cc8a003819 - - - - -] Running periodic task SchedulerManager._run_periodic_tasks run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:32.596 91827 DEBUG nova.scheduler.filter_scheduler [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Selected host: (hpcccloud2, hpcccloud2) ram: 637735MB disk: 896000MB io_ops: 0 instances: 3 _consume_selected_host /usr/lib/python2.7/dist-packages/nova/scheduler/filter_scheduler.py:346
2019-04-24 11:33:32.597 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" acquired by "nova.scheduler.host_manager._locked" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:32.608 91827 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "(u'hpcccloud2', u'hpcccloud2')" released by "nova.scheduler.host_manager._locked" :: held 0.011s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
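Side note on the neutron-linuxbridge-agent lines above (the 2019-04-26 MessagingTimeout on _report_state): that error means the agent's RPC reply never arrived, which usually points at AMQP/RabbitMQ connectivity or load between the agent and neutron-server rather than at nova itself. Some generic checks, nothing specific to this deployment:

openstack network agent list                                  # are the linuxbridge/DHCP agents alive?
rabbitmqctl status                                            # on the controller: is the broker healthy?
ss -tnp | grep 5672                                           # are the agents actually connected to AMQP?
grep transport_url /etc/neutron/neutron.conf /etc/nova/nova.conf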



compute node:

2019-04-24 11:33:22.012 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.012 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:22.013 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7069
2019-04-24 11:33:22.105 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] [instance: 58fa1dc8-55a5-40c4-be38-96b7a49dcf50] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7119
2019-04-24 11:33:22.105 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7145
2019-04-24 11:33:23.104 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python2.7/dist-package/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.105 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:23.158 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Auditing locally available compute resources for hpcccloud2 (node: hpcccloud2) update_available_resource /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:704
## inventory output deleted too large
2019-04-24 11:33:23.546 45826 DEBUG oslo_concurrency.lockutils [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker._update_available_resource" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:23.622 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Compute driver doesn't require allocation refresh and we're on a compute host in a deployment that only has compute hosts with Nova versions >=16 (Pike). Skipping auto-correction of allocations. _update_usage_from_instances /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1281
2019-04-24 11:33:23.729 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 6828030d-f0ef-4efd-b8fd-f33af0cb9249 actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 4096, u'VCPU': 2, u'DISK_GB': 40}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 60e9e3db-cede-4c98-86b1-11648022fecb actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 1024, u'VCPU': 1, u'DISK_GB': 34}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Instance 58fa1dc8-55a5-40c4-be38-96b7a49dcf50 actively managed on this compute host and has allocations in placement: {u'resources': {u'MEMORY_MB': 1024, u'VCPU': 1, u'DISK_GB': 34}}. _remove_deleted_instances_allocations /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:1306
2019-04-24 11:33:23.730 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Total usable vcpus: 80, total allocated vcpus: 4 _report_final_resource_view /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:860
2019-04-24 11:33:23.730 45826 INFO nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Final resource view: name=hpcccloud2 phys_ram=644391MB used_ram=6656MB phys_disk=983GB used_disk=108GB total_vcpus=80 used_vcpus=4 pci_stats=[]
2019-04-24 11:33:23.795 45826 DEBUG nova.scheduler.client.report [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Refreshing aggregate associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, aggregates: None _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:792
2019-04-24 11:33:23.817 45826 DEBUG nova.scheduler.client.report [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Refreshing trait associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, traits: HW_CPU_X86_AESNI,HW_CPU_X86_AVX512CD,HW_CPU_X86_FMA3,HW_CPU_X86_MPX,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX512DQ,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_AVX512F,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_VMX,HW_CPU_X86_BMI2,HW_CPU_X86_AVX512VL,HW_CPU_X86_AVX512BW,HW_CPU_X86_ABM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:806
2019-04-24 11:33:23.846 45826 DEBUG nova.virt.libvirt.driver [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] CPU mode 'host-model' model '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py:3841
2019-04-24 11:33:23.850 45826 DEBUG nova.compute.resource_tracker [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90- - - - -] Compute_service record updated for hpcccloud2:hpcccloud2 _update_available_resource /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:800
2019-04-24 11:33:23.850 45826 DEBUG oslo_concurrency.lockutils [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Lock "compute_resources" released by "nova.compute.resource_tracker._update_available_resource" :: held 0.304s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:23.851 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:24.758 45826 DEBUG oslo_service.periodic_task [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python2.7/dist-packages/oslo_service/periodic_task.py:215
2019-04-24 11:33:24.758 45826 DEBUG nova.compute.manager [req-e60be91c-cfd9-430a-8f7e-cad2b4f6b90d - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python2.7/dist-packages/nova/compute/manager.py:7742
2019-04-24 11:33:36.047 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "e501e9b6-e4ac-464f-b1d6-9a5439c5c269" acquired by "nova.compute.manager._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:36.101 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Starting instance... _do_build_and_run_instance /usr/lib/python2.7/dist-package/nova/compute/manager.py:1913
2019-04-24 11:33:36.257 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "compute_resources"acquired by "nova.compute.resource_tracker.instance_claim" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Memory overhead for 1024 MB instance; 0 MB instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:205
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f455e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Disk overhead for 32 GB instance; 0 GB instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:208
2019-04-24 11:33:36.259 45826 DEBUG nova.compute.resource_tracker [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CPU overhead for 1vCPUs instance; 0 vCPU(s) instance_claim /usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py:211
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Attempting claim on node hpcccloud2: memory 1024 MB, disk 32 GB, vcpus 1 CPU
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] Total memory: 644391 MB, used: 6656.00 MB
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840ba563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default][instance:e501e9b6e4ac-464f-b1d6-9a5439c5c269] memory limit not specified, defaulting to unlimited
2019-04-24 11:33:36.298 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840ba563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default][instance:e501e9b6e4ac-464f-b1d6-9a5439c5c269] Total disk: 983 GB, used: 108.00 GB
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] disk limit not specified, defaulting to unlimited
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] Total vcpu: 80 VCPU, used: 4.00 VCPU
2019-04-24 11:33:36.299 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] vcpu limit not specified, defaulting to unlimited
2019-04-24 11:33:36.300 45826 DEBUG nova.virt.hardware [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python2.7/dist-packages/nova/virt/hardware.py:1664
2019-04-24 11:33:36.300 45826 INFO nova.compute.claims [req-83957bf8-d8be-493f-840ba563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance:e501e9b6e4ac-464f-b1d6-9a5439c5c269] Claim successful on node hpcccloud2
2019-04-24 11:33:36.574 45826 DEBUG nova.scheduler.client.report [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Refreshing aggregate associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, aggregates: None _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:792
2019-04-24 11:33:36.600 45826 DEBUG nova.scheduler.client.report [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Refreshing trait associations for resource provider 9a5f90eb-0bd3-4e5d-a1a8-b83cdacda6ba, traits: HW_CPU_X86_AESNI,HW_CPU_X86_AVX512CD,HW_CPU_X86_FMA3,HW_CPU_X86_MPX,HW_CPU_X86_SSE2,HW_CPU_X86_MMX,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX,HW_CPU_X86_CLMUL,HW_CPU_X86_AVX512DQ,HW_CPU_X86_SSE,HW_CPU_X86_AVX2,HW_CPU_X86_AVX512F,HW_CPU_X86_F16C,HW_CPU_X86_SSE41,HW_CPU_X86_SSE42,HW_CPU_X86_VMX,HW_CPU_X86_BMI2,HW_CPU_X86_AVX512VL,HW_CPU_X86_AVX512BW,HW_CPU_X86_ABM,HW_CPU_X86_BMI _refresh_associations /usr/lib/python2.7/dist-packages/nova/scheduler/client/report.py:806
2019-04-24 11:33:36.651 45826 DEBUG nova.virt.libvirt.driver [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CPU mode 'host-model' model '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python2.7/dist-packages/nova/virt/libvi/driver.py:3841
2019-04-24 11:33:36.655 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "compute_resources" released by "nova.compute.resource_tracker.instance_claim" :: held 0.397s inner /usr/li/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:36.655 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] Start building networks asynchronously for instance. _build_resources /usr/li/python2.7/dist-packages/nova/compute/manager.py:2280
2019-04-24 11:33:36.822 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] Allocating IP information in the background. _allocate_network_async /usr/li/python2.7/dist-packages/nova/compute/manager.py:1493
2019-04-24 11:33:36.822 45826 DEBUG nova.network.neutronv2.api [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] allocate_for_instance() allocate_for_instance /usr/lib/python2.7/dist-package/nova/network/neutronv2/api.py:996
2019-04-24 11:33:36.869 45826 DEBUG nova.block_device [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] block_device_list [] volume_in_mapping /usr/lib/python2.7/dist-packages/nova/block_device.py:575
2019-04-24 11:33:36.911 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6e4ac-464f-b1d6-9a5439c5c269] Start building block device mappings for instance. _build_resources /usr/li/python2.7/dist-packages/nova/compute/manager.py:2314
2019-04-24 11:33:36.921 45826 DEBUG oslo_policy.policy [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] The policy file policy.json could not be found. load_rules /usr/lib/python2.7/dist-packages/oslo_policy/policy.py:559
2019-04-24 11:33:37.029 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" acquired by "nova.scheduler.client.report._create_client" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:37.031 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "placement_client" released by "nova.scheduler.client.report._create_client" :: held 0.002s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:37.986 45826 DEBUG nova.compute.manager [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python2.7/dist-packages/nova/compute/manager.py:2127
2019-04-24 11:33:37.987 45826 DEBUG nova.block_device [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] block_device_list [u'vdb'] volume_in_mapping /usr/lib/python2.7/dist-packages/nova/block_device.py:575
2019-04-24 11:33:37.987 45826 INFO nova.virt.libvirt.driver [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] [instance: e501e9b6-e4ac-464f-b1d6-9a5439c5c269] Creating image
2019-04-24 11:33:37.988 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" acquired by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:37.988 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" released by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: held 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:37.989 45826 INFO oslo.privsep.daemon [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpVpRflp/privsep.sock']
2019-04-24 11:33:38.421 45826 INFO oslo.privsep.daemon [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Spawned new privsep daemon via rootwrap
2019-04-24 11:33:38.360 45904 INFO oslo.privsep.daemon [-] privsep daemon starting
2019-04-24 11:33:38.363 45904 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
2019-04-24 11:33:38.364 45904 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
2019-04-24 11:33:38.364 45904 INFO oslo.privsep.daemon [-] privsep daemon running as pid 45904
2019-04-24 11:33:38.457 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.497 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.499 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "abc261a53dd1e62f06a0d7cd2fc8da800b68383a" acquired by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.499 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.539 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.540 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.564 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk" returned: 0 in 0.024s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.565 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk. size=34359738368 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.566 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.606 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.607 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img resize /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk 34359738368 execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.647 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img resize /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk 34359738368" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.648 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "abc261a53dd1e62f06a0d7cd2fc8da800b68383a" released by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: held 0.149s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:38.648 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.688 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.689 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk. size=34359738368 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.689 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.725 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk --force-share" returned: 0 in 0.036s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.726 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Cannot resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk to a smaller size. can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:184
2019-04-24 11:33:38.727 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" acquired by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.728 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "/var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.info" released by "nova.virt.libvirt.imagebackend.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2019-04-24 11:33:38.729 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.769 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share" returned: 0 in 0.040s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.769 45826 DEBUG oslo_concurrency.lockutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Lock "swap_2048" acquired by "nova.virt.libvirt.imagebackend.copy_qcow2_image" :: waited 0.000s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:273
2019-04-24 11:33:38.770 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.808 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/swap_2048 --force-share" returned: 0 in 0.038s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.809 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/swap_2048 /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.835 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/swap_2048 /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap" returned: 0 in 0.026s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.836 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Checking if we can resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap. size=2147483648 can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:178
2019-04-24 11:33:38.836 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Running cmd (subprocess): /usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap --force-share execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:372
2019-04-24 11:33:38.872 45826 DEBUG oslo_concurrency.processutils [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] CMD "/usr/bin/python2.7 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap --force-share" returned: 0 in 0.036s execute /usr/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:409
2019-04-24 11:33:38.873 45826 DEBUG nova.virt.disk.api [req-83957bf8-d8be-493f-840b-a563c762f45c 5e3920bdc30948e3a4896e57e0c13187 2848a37532334b5cad85adc261e02e69 - default default] Cannot resize image /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk.swap to a smaller size. can_resize_image /usr/lib/python2.7/dist-packages/nova/virt/disk/api.py:184
30.00 sec
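
As the log shows, every qemu-img call nova issued returned 0 in well under a second, so the root disk, swap disk and overlay creation all appear to complete; the log simply stops after the swap image is created. To double-check that the image handling itself is not the problem, I can also repeat the same qemu-img commands by hand on the compute node. This is only a rough sanity check built from the exact commands visible in the log above; /tmp/test-overlay.qcow2 is a throwaway path I made up for the test, not something nova creates:

# re-run the image inspection and overlay creation that nova performed
sudo -u nova qemu-img info --force-share /var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a
sudo -u nova qemu-img info --force-share /var/lib/nova/instances/e501e9b6-e4ac-464f-b1d6-9a5439c5c269/disk
sudo -u nova qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/abc261a53dd1e62f06a0d7cd2fc8da800b68383a /tmp/test-overlay.qcow2
sudo -u nova qemu-img resize /tmp/test-overlay.qcow2 34359738368

If any of these were to hang or fail it would point at the image backend or storage, but given the 0-return codes in the log the hang seems to happen in whatever step follows image creation.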