dshm8998473
2018-02-05 11:16
Viewed 247 times

Node.js HTTP response data is truncated when the payload is large

I am using Node.js to send an HTTP POST request to a server. The server side runs PHP. The server returns the correct data, around 9 KB, but the data received by the Node.js client is truncated. It works fine when the data is smaller than 6 KB. Here is my code:

    var reqPost = https.request(optionspost, function (res) {
        res.on('data', function (d) {
            console.log('Cloud Resp:', d);
            var jsonObj = JSON.parse(d);
        });
    });

My console.log('Cloud Resp:', d) call prints at most 8 KB of data. Can someone help me understand whether this limit is imposed by Node.js or by something else, and how I can increase it?



2 Answers

  • dongzhao4036 2018-02-05 12:25
    Accepted answer

    I think your data is being chunked in transit from the PHP server to the Node server.

    • I assume you are using the native https module on the Node side (correct me if I am wrong).

    So in the data event you need to concatenate the chunks, and only parse the result in the end event. If you parse inside the data event, JSON.parse() will throw an error because the data is incomplete.

    Here is sample code; it worked with 500 KB of data when I tested it. Natively, Node does not impose a data size limit at the code level.

    var https = require("https");
    
    var options = {
      "method": "GET",
      "hostname": "c16db448-d912-4ce8-823a-db6c51e09878.mock.pstmn.io"
    };
    
    var req = https.request(options, function (res) {
      var chunks = '';
    
      res.on("data", function (chunk) {
        console.log(chunk.length);
        chunks += chunk; // accumulate each chunk as it arrives
      });
    
      res.on("end", function () {
        // only parse once the full body has arrived
        var object = JSON.parse(chunks);
        console.log(object.length);
        console.log(Buffer.byteLength(chunks, 'utf8') / 1024 + " kbytes");
      });
    });
    
    req.end(); // without this, the request is never actually sent
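    One subtlety in the snippet above: `chunks += chunk` implicitly calls `chunk.toString()` on each Buffer, which can corrupt a multi-byte UTF-8 character that happens to be split across two chunks. A safer variant (a sketch, not part of the original answer) collects the raw Buffers and joins them with Buffer.concat before decoding:

    ```javascript
    // Demonstrates why Buffer.concat is safer than string concatenation
    // when a chunk boundary falls inside a multi-byte UTF-8 character.
    var full = Buffer.from('{"city":"北京"}', "utf8");

    // Simulate the network splitting the payload mid-character:
    // the 3-byte character 北 starts at byte 9, so slicing at byte 10 splits it.
    var a = full.slice(0, 10);
    var b = full.slice(10);

    // Naive per-chunk decoding mangles the split character into U+FFFD:
    var naive = a.toString() + b.toString();

    // Collecting Buffers and concatenating preserves the bytes:
    var safe = Buffer.concat([a, b]).toString("utf8");

    console.log(naive === full.toString()); // false — corrupted
    console.log(safe === full.toString());  // true
    ```

    Applied to the request handler above, that means pushing each chunk into an array in the data event and calling `Buffer.concat(chunks).toString()` in the end event.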
    
  • dongmubi4375 2018-02-05 11:23

    Let me guess, you are also using body-parser, right? That is what is causing the issue.

    body-parser only allows request bodies up to a certain limit (100 kB by default).

    To increase the limit, open node_modules/body-parser/lib/types/json.js

    and look for the following line:

    var limit = typeof opts.limit !== 'number'
      ? bytes.parse(opts.limit || '100kb')
      : opts.limit
    

    Replace the '100kb' with the limit you want to impose in your application. This should fix it.

    Edit:

    If you don't want to modify node_modules, set the limit option when you configure body-parser (the '200kb' value here is just an example):

    var bodyParser = require('body-parser');
    app.use(bodyParser.json({ limit: '200kb' }));
    

    This covers JSON parsing only; the raw, text, and urlencoded parsers take a limit option in the same way (check the documentation link below).
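    Under the hood, body-parser's limit is just a byte counter over the incoming stream. Here is a minimal sketch of the same idea in plain Node (the function name and error handling are illustrative, not body-parser's actual code):

    ```javascript
    var http = require("http");

    // Count bytes as they arrive and abort once the budget is exceeded,
    // which is essentially what body-parser's `limit` option does.
    function readBodyWithLimit(req, limitBytes, callback) {
      var chunks = [];
      var received = 0;
      var done = false;

      req.on("data", function (chunk) {
        if (done) return;
        received += chunk.length;
        if (received > limitBytes) {
          done = true;
          callback(new Error("request entity too large"));
          req.destroy(); // stop reading the rest of the body
          return;
        }
        chunks.push(chunk);
      });

      req.on("end", function () {
        if (done) return;
        done = true;
        callback(null, Buffer.concat(chunks).toString("utf8"));
      });
    }
    ```

    A server handler would call readBodyWithLimit(req, limit, cb) and respond with 413 Payload Too Large when the callback receives an error.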

    See the body-parser documentation.

    Thank You @iofjuupasli for suggesting that there could be a better way.

