I'm generating some test data today and need to store a large file into a `longblob` column in MySQL. I'm using a 755 MB file, but it could be even larger.
I tried `pstmt.setBinaryStream(2, fis, f.length());`, but it keeps throwing an exception.
It looks like MySQL's JDBC driver always reads the whole stream into a `byte[]` before sending it to the database, so it runs out of heap every time. Does anyone have a solution?
My code:
ApplicationContext contxt = new ClassPathXmlApplicationContext("classpath:applicationContext.xml");
DataSource ds = contxt.getBean("dataSource", DataSource.class);
Connection con = ds.getConnection();
PreparedStatement pstmt = con.prepareStatement("insert into saved_file values(?, ?)");
File f = new File("D:\\test\\office_755M.iso"); // test file with a known size
for (int i = 1; i < 11; i++) {
    System.out.println("Adding: " + i + "...");
    pstmt.setInt(1, i);
    FileInputStream fis = new FileInputStream(f);
    pstmt.setBinaryStream(2, fis, f.length());
    int rt = pstmt.executeUpdate();
    fis.close();
    System.out.println("Insert result: " + rt);
}
pstmt.close();
con.close();
The exception thrown:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at com.mysql.jdbc.Buffer.ensureCapacity(Buffer.java:156)
    at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:2544)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2401)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2345)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2330)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
    at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
    at com.wondersgroup.cache.test.DBDataPrepare.main(DBDataPrepare.java:31)
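One direction worth trying, since the driver buffers the whole parameter before sending: don't hand it the 755 MB stream at all, but insert an empty row and then append the file in fixed-size chunks with MySQL's `CONCAT`, so only one chunk is ever buffered client-side. This is a sketch, not a verified fix for this driver version; the column name `data`, the table layout `saved_file(id, data)`, and the 4 MB chunk size are my assumptions (each chunk must stay below the server's `max_allowed_packet`):

```java
import java.io.FileInputStream;
import java.io.InputStream;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Arrays;

public class ChunkedBlobInsert {
    // Per-chunk size; must stay below the server's max_allowed_packet.
    static final int CHUNK = 4 * 1024 * 1024;

    // How many chunks a file of the given size splits into (ceiling division).
    static long chunkCount(long fileSize, int chunkSize) {
        return (fileSize + chunkSize - 1) / chunkSize;
    }

    // Assumed schema: saved_file(id INT, data LONGBLOB).
    static void storeFile(Connection con, int id, String path) throws Exception {
        // 1. Create the row with an empty blob.
        try (PreparedStatement ins =
                 con.prepareStatement("insert into saved_file values(?, '')")) {
            ins.setInt(1, id);
            ins.executeUpdate();
        }
        // 2. Append the file one chunk at a time; the driver only ever
        //    buffers CHUNK bytes, not the whole file.
        byte[] buf = new byte[CHUNK];
        try (InputStream in = new FileInputStream(path);
             PreparedStatement app = con.prepareStatement(
                 "update saved_file set data = concat(data, ?) where id = ?")) {
            int n;
            while ((n = in.read(buf)) > 0) {
                app.setBytes(1, n == buf.length ? buf : Arrays.copyOf(buf, n));
                app.setInt(2, id);
                app.executeUpdate();
            }
        }
    }
}
```

Two other things I've seen suggested, which I can't confirm for this setup: raising the server's `max_allowed_packet`, and setting `useServerPrepStmts=true` on the JDBC URL, which reportedly lets Connector/J stream `setBinaryStream` parameters through server-side prepared statements instead of building one giant packet.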