I am trying to migrate osTicket from a physical Windows machine to AWS RDS, and I get a bunch of warnings while doing so. I am quite a DB noob, so I am not sure what the issue is.
Source: Windows, MySQL 5.6.10, backup created with mysqldump -u root -p --default-character-set=utf8 osTicket > file.sql
mysql> show variables like 'character_set%';
+--------------------------+----------------------------------------------------------+
| Variable_name            | Value                                                    |
+--------------------------+----------------------------------------------------------+
| character_set_client     | cp850                                                    |
| character_set_connection | cp850                                                    |
| character_set_database   | utf8                                                     |
| character_set_filesystem | binary                                                   |
| character_set_results    | cp850                                                    |
| character_set_server     | utf8                                                     |
| character_set_system     | utf8                                                     |
| character_sets_dir       | C:\Program Files\MySQL\MySQL Server 5.6\share\charsets\  |
+--------------------------+----------------------------------------------------------+
Destination: AWS RDS, MySQL 5.6.27-log, imported with mysql> source file.sql. I've created a custom Parameter Group to change character_set_database to utf8, since the default on RDS is latin1.
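For reference, the same Parameter Group change can be scripted with the AWS CLI, roughly like this; the group name osticket-utf8 is just a placeholder, and the group still has to be attached to the RDS instance:

aws rds create-db-parameter-group \
    --db-parameter-group-name osticket-utf8 \
    --db-parameter-group-family mysql5.6 \
    --description "utf8 defaults for osTicket"
aws rds modify-db-parameter-group \
    --db-parameter-group-name osticket-utf8 \
    --parameters "ParameterName=character_set_database,ParameterValue=utf8,ApplyMethod=immediate"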
mysql> show variables like 'character_set%';
+--------------------------+-------------------------------------------+
| Variable_name            | Value                                     |
+--------------------------+-------------------------------------------+
| character_set_client     | cp850                                     |
| character_set_connection | cp850                                     |
| character_set_database   | utf8                                      |
| character_set_filesystem | binary                                    |
| character_set_results    | cp850                                     |
| character_set_server     | utf8                                      |
| character_set_system     | utf8                                      |
| character_sets_dir       | /rdsdbbin/mysql-5.6.27.R1/share/charsets/ |
+--------------------------+-------------------------------------------+
I get warnings like these:
Warning (Code 1300): Invalid utf8 character string: 'FFD8FF'
Warning (Code 1300): Invalid utf8 character string: 'FFD8FF'
Query OK, 6 rows affected, 6 warnings (0.06 sec)
Records: 6 Duplicates: 0 Warnings: 6
Warning (Code 1300): Invalid utf8 character string: 'FFD8FF'
Warning (Code 1300): Invalid utf8 character string: 'FFD8FF'
Warning (Code 1300): Invalid utf8 character string: 'FFD8FF'
Warning (Code 1300): Invalid utf8 character string: 'E2E3CF'
Warning (Code 1300): Invalid utf8 character string: 'D0CF11'
Warning (Code 1300): Invalid utf8 character string: 'E2E3CF'
Query OK, 20 rows affected, 20 warnings (0.06 sec)
Records: 20 Duplicates: 0 Warnings: 20
I tried setting character_set_client=cp850, and the same for character_set_connection and character_set_results, before the import, but it didn't help.
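Concretely, the session settings I tried before sourcing the file were along these lines:

mysql> SET character_set_client = cp850;
mysql> SET character_set_connection = cp850;
mysql> SET character_set_results = cp850;
mysql> source file.sql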
If anyone could be so kind as to point out what I am doing wrong here, that would be really great.
Thanks in advance!
1 Answer
Judging by your initial post, where you said you used the default options for mysqldump, you didn't use the --hex-blob option. I played around with osTicket some time ago, and you should definitely use --hex-blob when creating your dump. It will give you a much bigger .sql file (and of course the import to RDS will take more time), but that way you won't get the UTF8 warnings you're getting now, and your attachments will be available once you deploy osTicket to the new host.
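For example, adapting the dump command from your question, where the only change is the added flag:

mysqldump -u root -p --default-character-set=utf8 --hex-blob osTicket > file.sql

The reason it helps, as far as I understand it: without --hex-blob, mysqldump writes BLOB contents as raw bytes inside quoted string literals, and when the server parses the dump as utf8 it chokes on the attachments' leading bytes (FFD8FF is the JPEG file signature, and D0CF11 starts the legacy MS Office one -- exactly the sequences in your warnings). With --hex-blob, every binary value is written as a hex literal like 0xFFD8FFE0..., which is plain ASCII and therefore safe under any character set.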
One more piece of advice: upgrade to a newer version (1.9 or 1.10) and move those files out of the DB. You can use S3 or even EFS now, or just store them on your EBS volume.
– Thanks. Can you explain more about why BINARY, VARBINARY, the BLOB types, and BIT need the --hex-blob option? Why do I get "Invalid utf8 character string" without it? – Sybil, Jan 11, 2017 at 8:20
– Please post SHOW CREATE TABLE, all the parameters (except the password) you used on mysqldump, the character-related settings in my.cnf, and proof of what format the source data is in -- HEX if possible.
– In my.cnf I have [mysqld] default-character-set=utf8. If by SHOW CREATE TABLE you mean what charset the tables are created with in the dump, it's the same for all of them: DEFAULT CHARSET=utf8. I ran mysqldump with the default options, as mentioned in the first post. What I noticed is that all the warnings are in the ost_file_chunk table, which actually stores the file attachments for osTicket. That table is around 3.3 GB of my 3.5 GB dump, and it throws a warning for practically every row that defines a file chunk (the columns are file_id, file_chunk, which says how many chunks the file has, and data, which stores the blob).
– What is in file_chunk? Is some column a BLOB? Are you using the option to dump as hex? How big is the blob? How many rows?
– In the ost_file_chunk table, the third column is a blob. I was using the default dump options; the entire table is 3.5 GB (it stores all the attachments for osTicket) and has 10801 rows. Thanks for the help, Rick.
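Putting the comments together, the table in question presumably looks roughly like this; the column names are as described above, but the types and key are guesses rather than the real osTicket schema:

CREATE TABLE ost_file_chunk (
  file_id INT NOT NULL,               -- which attachment the chunk belongs to (type guessed)
  file_chunk INT NOT NULL,            -- chunk number within the file (type guessed)
  data MEDIUMBLOB,                    -- raw file bytes; this is what trips the utf8 parser
  PRIMARY KEY (file_id, file_chunk)   -- assumed key
) DEFAULT CHARSET=utf8;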