Error upgrading database schema: ERROR: Tables "..." have non utf8 collation, please make sure all tables are CHARSET=utf8

asked 2014-04-02 12:38:14 -0500

larsks

When I try to perform a db_sync operation to upgrade my database schema from Havana to Icehouse, I encounter an error similar to the following:

# heat-manage db_sync
ERROR: Tables "event,migrate_version,raw_template,resource,resource_data,stack,user_creds,watch_data,watch_rule" have non utf8 collation, please make sure all tables are CHARSET=utf8

This error is for Heat, but I see a similar error with Glance and Keystone.


1 answer


answered 2014-04-02 12:50:43 -0500

larsks

You have tripped over an issue caused by a recent change in the oslo-incubator project, which enforces utf8 character encoding on database tables when using the mysql database backend. The code in question looks like this:

if engine.name == 'mysql':
    onlyutf8_sql = ('SELECT TABLE_NAME,TABLE_COLLATION '
                    'from information_schema.TABLES '
                    'where TABLE_SCHEMA=%s and '
                    'TABLE_COLLATION NOT LIKE "%%utf8%%"')

    table_names = [res[0] for res in engine.execute(onlyutf8_sql,
                                                    engine.url.database)]
    if len(table_names) > 0:
        raise ValueError(_('Tables "%s" have non utf8 collation, '
                           'please make sure all tables are CHARSET=utf8'
                           ) % ','.join(table_names))

This code has been adopted by the Keystone, Glance, and Heat projects as of this writing, and will probably work its way into other projects. If you have existing tables that were created with latin1 encoding, this code will cause database upgrades to fail.
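If you want to see which tables would be flagged before attempting the upgrade, you can run essentially the same information_schema query by hand. This is a sketch of a manual check; "heat" here is an example schema name, so substitute your own database:

```sql
-- Manual version of the collation check performed by the oslo code above.
-- Replace 'heat' with the schema you are about to upgrade.
SELECT TABLE_NAME, TABLE_COLLATION
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'heat'
  AND TABLE_COLLATION NOT LIKE '%utf8%';
```

If this query returns no rows, the db_sync check should pass for that database.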

To address this error, you need to convert the tables in question to use utf8 character encoding. You can convert the tables from the command line like this:

DB="glance"
(
    echo 'ALTER DATABASE `'"$DB"'` CHARACTER SET utf8 COLLATE utf8_general_ci;'
    mysql "$DB" -e "SHOW TABLES" --batch --skip-column-names \
    | xargs -I{} echo 'ALTER TABLE `'{}'` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;'
) \
| mysql "$DB"

You would need to run this once per database.
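If you want to preview the generated statements before piping them into mysql, you can run just the generating half of the pipeline. Here is a minimal sketch using a hard-coded table list (names taken from the error message) so that no live SHOW TABLES query is needed:

```shell
#!/bin/sh
# Dry run: print the ALTER TABLE statements without executing them.
# The table names are hard-coded for illustration; in practice they come
# from "SHOW TABLES" as in the pipeline above.
printf '%s\n' event stack watch_rule \
  | xargs -I{} echo 'ALTER TABLE `'{}'` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;'
```

Once the output looks right, swap the `printf` back for the real `SHOW TABLES` query and pipe the result into `mysql "$DB"` as shown above.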

Note that there are potential issues with this conversion:

As described in this post, if clients with their connection set to utf8 were inserting data containing "high ASCII" characters into your latin1 tables, the conversion can produce unexpected results.

In this case, you can either fix the corrupted entries by hand, or write a conversion tool as described in the linked post.


Comments

Unfortunately this doesn't work because of foreign keys defined by the OpenStack services' DB migrations. For instance:

ERROR 1832 (HY000) at line 1: Cannot change column 'blob_id': used in a foreign key constraint 'artifact_blob_locations_ibfk_1'
Davide Guerri ( 2015-09-04 04:43:14 -0500 )
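One workaround often suggested for this kind of error (untested here, so treat it as a sketch and verify on a backup first) is to disable foreign key checks for the session while the conversion runs. The table name below is inferred from the constraint name in the error message and may differ in your schema:

```sql
-- Hypothetical workaround: disable FK checks for this session only,
-- convert the affected table, then re-enable them. Behaviour can vary
-- by MySQL version, so test against a copy of the database first.
SET foreign_key_checks = 0;
ALTER TABLE `artifact_blob_locations` CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;
SET foreign_key_checks = 1;
```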

