As suggested in @UlrichEckhardt's comment, this was due to a read/write concurrency problem: I was trying to read a file while it was still being written. I solved it by simply waiting and then trying to read the file again.
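A minimal sketch of that wait-and-retry approach (my own illustration, not the poster's exact code; the function name and retry limits are assumptions):

```php
<?php
// Retry reading until the file decodes as valid JSON or we give up.
// A truncated read produces invalid JSON, so json_decode() doubles
// as the truncation check.
function readJsonWithRetry(string $path, int $maxAttempts = 5)
{
    for ($attempt = 0; $attempt < $maxAttempts; $attempt++) {
        $raw = @file_get_contents($path);
        if ($raw !== false) {
            $data = json_decode($raw, true);
            // A clean decode means we caught the file in a
            // fully written state.
            if (json_last_error() === JSON_ERROR_NONE) {
                return $data;
            }
        }
        usleep(100000); // wait 100 ms before the next attempt
    }
    return null; // gave up: file was still unreadable or truncated
}
```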
File truncated while being read
I am writing some JSON results to files in PHP on shared hosting (fwrite).
Then I read those files back to extract the JSON results (file_get_contents).
Occasionally (maybe one time in more than a thousand), the file appears truncated when I read it: I can only read a multiple of the first 32768 bytes of the file.
I added some code to copy the file I am reading whenever the JSON string is not valid, and I then end up with two different files: the original, which was correctly written (it contains a valid JSON string), and the copy, which contains only the beginning of the original and has a size of x*32768 bytes.
Would you have any idea what the problem could be and how to solve it? (I don't know how to investigate further.)
Thank you
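Not from the original question, but worth noting for anyone hitting the same symptom: writing to a temporary file and then rename()-ing it into place means a concurrent reader sees either the old complete file or the new complete one, never a partial write, since rename() is atomic when source and target are on the same filesystem. A sketch, with the temp-file naming scheme as an assumption:

```php
<?php
// Atomic-write alternative to retrying on the read side.
// The reader never observes a half-written file.
function writeJsonAtomically(string $path, $data): bool
{
    $json = json_encode($data);
    if ($json === false) {
        return false;
    }
    // Hypothetical temp-name scheme: unique per process to avoid clashes.
    $tmp = $path . '.tmp.' . getmypid();
    if (file_put_contents($tmp, $json) === false) {
        return false;
    }
    // Atomically swap the complete file into place.
    return rename($tmp, $path);
}
```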