duanaoshu1989 2013-11-24 21:31

Binary data stored in MySQL is corrupted

I need to store 50 KiB - 500 KiB binary files (actually zlib-compressed text files) in a MySQL database. Assuming there is good reason to do this as opposed to just writing them to the filesystem, I am having trouble with the data being corrupted in the database. I have pulled it out with both PHP and Python; both show that the data is corrupted.

I have tried using PHP non-emulated prepared statements and emulated prepared statements to insert the data, as well as simply putting the data right into the query as if it were a string. All of these resulted in corrupt data.

With prepared statements:

// PDO::ATTR_EMULATE_PREPARES must be the constant, not a quoted string
$options = array(PDO::ATTR_EMULATE_PREPARES => false);
$dsn = 'mysql:host=localhost;dbname=qb_cache;charset=utf8';
$pdo = new PDO($dsn, 'root', 'hunter2', $options);

$sql = 'INSERT INTO cache (body) VALUES (:body)';
$stmt = $pdo->prepare($sql);
$stmt->bindValue(':body', $body);
$stmt->execute();

With emulated prepared statements:

$dsn = 'mysql:host=localhost;dbname=qb_cache;charset=utf8';
$pdo = new PDO($dsn, 'root', 'hunter2');

$sql = 'INSERT INTO cache (body) VALUES (:body)';
$stmt = $pdo->prepare($sql);
$stmt->bindValue(':body', $body);
$stmt->execute();

Straight into the database:

$dsn = 'mysql:host=localhost;dbname=qb_cache;charset=utf8';
$pdo = new PDO($dsn, 'root', 'hunter2');

$sql = "INSERT INTO cache (body) VALUES ('{$body}')";
$stmt = $pdo->prepare($sql);
$stmt->execute();

Note that there is no PDO::PARAM_* constant specifically for binary data (the closest is PDO::PARAM_LOB, sketched below)! By the way, I don't believe that I'm hitting the size limit for a MySQL field:

mysql> show variables like 'max_allowed_packet';
+--------------------+----------+
| Variable_name      | Value    |
+--------------------+----------+
| max_allowed_packet | 16777216 |
+--------------------+----------+
1 row in set (0.00 sec)
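
A minimal sketch of binding the value as a large object with PDO::PARAM_LOB, assuming the same $pdo connection and cache table as above:

$sql = 'INSERT INTO cache (body) VALUES (:body)';
$stmt = $pdo->prepare($sql);
// PDO::PARAM_LOB asks the driver to treat the value as a binary large
// object rather than a character string.
$stmt->bindValue(':body', $body, PDO::PARAM_LOB);
$stmt->execute();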

This is what I get when I try to decode:

$ zlib-flate -uncompress < output_from_database
...snip...
74 ARP4s0B64R9P6oZOpe6262E}C k3A AFD001 Si IL4A57sflate: inflate: data: invalid block type


$ cat output_from_database | openssl zlib -d > decoded
140438369359520:error:29065064:lib(41):BIO_ZLIB_READ:zlib inflate error:c_zlib.c:570:zlib error:data error

Using Python to pull the data out of MySQL, I see that the field is the right size, but I can't decompress it:

>>> pprint(zlib.decompress(row[0]))
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
zlib.error: Error -5 while decompressing data: incomplete or truncated stream
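
One quick way to confirm truncation is to compare byte counts on both sides of the round trip; a minimal sketch, assuming the same $pdo connection and a $body that was just inserted:

// MySQL's LENGTH() counts bytes. If the stored length caps at a
// suspiciously round number such as 65535, the column is silently
// truncating the value.
$stored = $pdo->query('SELECT LENGTH(body) FROM cache LIMIT 1')->fetchColumn();
printf("original: %d bytes, stored: %d bytes\n", strlen($body), $stored);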

1 answer

  • dongyaobo9081 2013-11-25 08:15

    The problem was that the column type was blob; for some reason I thought blob was the largest binary datatype that MySQL supports. Once I actually looked and saw that the size coming out of the database was 65535 bytes, I understood that the data was being truncated. Moving to longblob resolved the issue.
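
    A minimal sketch of the fix, run over the same $pdo connection (blob tops out at 2^16 - 1 = 65535 bytes; mediumblob at 2^24 - 1; longblob at 2^32 - 1):

    // Widen the column so values larger than 65535 bytes are no longer
    // silently truncated.
    $pdo->exec('ALTER TABLE cache MODIFY body LONGBLOB');

    Whether MySQL truncates an oversized value with only a warning or rejects the insert outright depends on the server's SQL mode; non-strict mode truncates silently, which is why the corruption showed up only on decompression.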

    For the next guy: MySQL Data Type Storage Requirements.

