VM creation fails: No valid host was found
Hi. I'm deploying OpenStack Mitaka with 1 controller, 1 network node and 3 compute nodes. Everything seems to work fine and I'm able to launch instances. But when I try to launch an instance with a large flavor, like m1.xlarge, it fails with "No valid host was found". I found out that the scheduler filters are filtering out all of my compute nodes. I don't believe raising the overcommit ratios is the right fix, since a node might crash once its disk or RAM is actually fully used. So my questions:
- Do I have to keep a flavor's RAM and disk sizes within what a single compute node can provide? If so, why? I'm confused; isn't that against the concept of cloud?
- My hypervisor stats show all three compute nodes, but the filters are not counting compute-node2, as you can see in the log below. Which configuration options or files should I be looking at?
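For reference, these are the overcommit knobs I understand the RamFilter/DiskFilter multiply against. The values shown are the effective Mitaka defaults, not my actual config; I'm including them only to illustrate what I mean by not wanting to rely on overcommit:

```ini
# /etc/nova/nova.conf (scheduler/compute) — effective Mitaka defaults
[DEFAULT]
ram_allocation_ratio = 1.5    # RamFilter: schedulable RAM = physical RAM * 1.5
cpu_allocation_ratio = 16.0   # CoreFilter: schedulable vCPUs = physical cores * 16
disk_allocation_ratio = 1.0   # DiskFilter: no disk overcommit by default
```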
Output of `tail /var/log/nova/nova-scheduler.log` (DiskFilter):
2016-11-10 18:53:12.288 10845 DEBUG nova.scheduler.host_manager [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Update host state with aggregates: [] _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:171
2016-11-10 18:53:12.288 10845 DEBUG nova.scheduler.host_manager [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Update host state with service dict: {'binary': u'nova-compute', 'deleted': False, 'created_at': datetime.datetime(2016, 10, 15, 7, 31, 22, tzinfo=<iso8601.Utc>), 'updated_at': datetime.datetime(2016, 11, 10, 9, 53, 11, tzinfo=<iso8601.Utc>), 'report_count': 12760, 'topic': u'compute', 'host': u'compute-node3', 'version': 9, 'disabled': False, 'forced_down': False, 'last_seen_up': datetime.datetime(2016, 11, 10, 9, 53, 11, tzinfo=<iso8601.Utc>), 'deleted_at': None, 'disabled_reason': None, 'id': 8} _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:174
2016-11-10 18:53:12.289 10845 DEBUG nova.scheduler.host_manager [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Update host state with instances: {} _locked_update /usr/lib/python2.7/dist-packages/nova/scheduler/host_manager.py:177
2016-11-10 18:53:12.289 10845 DEBUG oslo_concurrency.lockutils [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Lock "(u'compute-node3', u'compute-node3')" released by "nova.scheduler.host_manager._locked_update" :: held 0.003s inner /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:285
2016-11-10 18:53:12.290 10845 WARNING nova.scheduler.host_manager [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] No compute service record found for host compute-node2
2016-11-10 18:53:12.290 10845 DEBUG nova.filters [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Starting with 2 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:70
2016-11-10 18:53:12.291 10845 DEBUG nova.filters [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Filter RetryFilter returned 2 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2016-11-10 18:53:12.291 10845 DEBUG nova.filters [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Filter AvailabilityZoneFilter returned 2 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2016-11-10 18:53:12.292 10845 DEBUG nova.filters [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 fda14487a68f436da7b55fe218798ea2 - - -] Filter RamFilter returned 2 host(s) get_filtered_objects /usr/lib/python2.7/dist-packages/nova/filters.py:104
2016-11-10 18:53:12.292 10845 DEBUG nova.scheduler.filters.disk_filter [req-10c398e2-6611-4884-8883-cdd4c2445848 6abfef77e9984fc58bb5decc566d12c2 ...
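In case it helps with the compute-node2 question: as I understand it, the scheduler reads the nova-compute *service* records, which are a different table from the hypervisor list, so the two can disagree. A sketch of the checks that should show the discrepancy (run on the controller; output omitted):

```shell
# List the nova-compute service records the scheduler consults.
# The "No compute service record found for host compute-node2" warning
# suggests compute-node2 is missing, disabled, or registered under a
# different hostname here.
nova service-list --binary nova-compute

# Cross-check against the hypervisor view, which is populated separately.
nova hypervisor-list
```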