You have tripped over an issue caused by a recent change in the oslo-incubator project, which enforces utf8
character encoding on database tables when using the MySQL database backend. The code in question looks like this:
if engine.name == 'mysql':
    onlyutf8_sql = ('SELECT TABLE_NAME,TABLE_COLLATION '
                    'from information_schema.TABLES '
                    'where TABLE_SCHEMA=%s and '
                    'TABLE_COLLATION NOT LIKE "%%utf8%%"')
    table_names = [res[0] for res in engine.execute(onlyutf8_sql,
                                                    engine.url.database)]
    if len(table_names) > 0:
        raise ValueError(_('Tables "%s" have non utf8 collation, '
                           'please make sure all tables are CHARSET=utf8'
                           ) % ','.join(table_names))
This code has been adopted by the Keystone, Glance, and Heat projects
as of this writing, and will probably work its way into other
projects. If you have existing tables that were created with latin1
encoding, this code will cause database upgrades to fail.
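If you want to know ahead of time whether you are affected, you can run
the same check by hand from the mysql command line. This is just the
query from the code above, assuming a database named glance (substitute
your own database name):

mysql -e '
  SELECT TABLE_NAME, TABLE_COLLATION
  FROM information_schema.TABLES
  WHERE TABLE_SCHEMA = "glance"
  AND TABLE_COLLATION NOT LIKE "%utf8%";'

If this returns any rows, the upgrade will fail until those tables have
been converted.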
To address this error, you need to convert the tables in question to
use utf8 character encoding. You can convert the tables from the
command line like this:
DB="glance"
(
echo 'ALTER DATABASE `'"$DB"'` CHARACTER SET utf8 COLLATE utf8_general_ci;'
mysql "$DB" -e "SHOW TABLES" --batch --skip-column-names \
| xargs -I{} echo 'ALTER TABLE `'{}'` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;'
) \
| mysql "$DB"
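If you would like to see what this will do before committing to it,
drop the final | mysql "$DB" and inspect the statements the pipeline
generates. For a database named glance the output will look something
like this (the table names here are just illustrative):

ALTER DATABASE `glance` CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE `images` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
ALTER TABLE `image_properties` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
...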
You would need to run this once per database.
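If you have several databases to convert, you can wrap the same
pipeline in a loop. A minimal sketch, assuming your databases are named
after the services (the names keystone, glance, and heat below are
assumptions; substitute whatever your deployment actually uses):

for DB in keystone glance heat; do
  (
    echo 'ALTER DATABASE `'"$DB"'` CHARACTER SET utf8 COLLATE utf8_general_ci;'
    mysql "$DB" -e "SHOW TABLES" --batch --skip-column-names \
      | xargs -I{} echo 'ALTER TABLE `'{}'` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;'
  ) | mysql "$DB"
done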
Note that there are potential issues with this conversion: as described
in this post, if you had clients set to utf8 inserting data containing
"high ASCII" characters into your latin1 tables, the conversion can
leave you with unexpected (garbled) results.
In this case, you can either fix the corrupted entries by hand, or
write a conversion tool as described in the linked post.
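For reference, the usual trick for repairing this sort of
double-encoded data is to run the mojibake back through latin1 to
recover the original bytes, then reinterpret those bytes as utf8. A
minimal sketch of that technique for a single column, using a
hypothetical images table and name column (substitute your own table
and column names):

# hypothetical table/column names; test on a copy of your data first
mysql "$DB" -e '
  UPDATE images
  SET name = CONVERT(CAST(CONVERT(name USING latin1) AS BINARY) USING utf8);'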