How do I substitute a Java variable into the LIMIT clause?

int pagesize = 0;
int sumnum = 30;
pagesize = sumnum / 2;
// int tmpPage = 0;
// tmpPage = sumnum % 2 == 0 ? 0 : 1;
// pagesize = pagesize + tmpPage;
for (int i = 1; i <= pagesize; i++) {
    String sql = "SELECT ID, name, url FROM websites limit i*2,2;";
    ResultSet rs = stmt.executeQuery(sql);
I need the value of the Java variable i to go after LIMIT. Please explain in more detail how to do this, and write some code following mine — I've only just started programming. Everything I found online didn't work. Thanks a lot!

6 answers

String sql = "SELECT ID,name, url From websites limit "+(i*2)+",2;";

String sql ="SELECT ID,name, url From websites limit ?*2,2";//用?占位符代替
Connection con = null;
PreparedStatement stmt = con.prepareStatement(sql);
for (int i = 0; i < 10; i++) {
stmt.setInt(1, i);//1代表第一个参数,也是第一问号
ResultSet rs = stmt.executeQuery();
}

String sql = "SELECT ID,name, url From websites limit "+i+"*2,2;";

wangpeng920310
wangpeng920310: String sql = "SELECT ID,name, url From websites limit "+i+"*2,2;";
about 2 years ago · reply
u013606570
u013606570: String sql = "SELECT ID,name, url From websites limit "+(i*2)+",2;";
about 2 years ago · reply
ggx1abc
gu123xin: That doesn't work, it throws: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '*2,2' at line 1
about 2 years ago · reply

nativeQuery = "true"

I'd suggest you find a JDBC book or some tutorials and work through them systematically — it takes less than a day to cover the basics. Asking one-off questions like this isn't a good approach, and you won't learn much from it.

ggx1abc
gu123xin: I'm working through it, thanks.
about 2 years ago · reply

String sql = "SELECT ID,name, url From websites limit "+(i*2)+",2;";

This might work, but it's not great — it's better to use placeholders:
String sql = "SELECT ID, name, url FROM websites limit ?,?";
and then set the corresponding parameters with a PreparedStatement.
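
Putting the placeholder advice together, here is a minimal end-to-end sketch. It assumes the `websites` table, the `DB_URL`/`USER`/`PASS` constants and the page size of 2 taken from the asker's own test code further down; the 15-page count comes from 30 rows divided by 2 rows per page.

```java
import java.sql.*;

public class LimitPagingDemo {
    static final String DB_URL = "jdbc:mysql://localhost:3306/gumysql"; // from the asker's test code
    static final String USER = "root";
    static final String PASS = "123456";

    public static void main(String[] args) throws Exception {
        int pageSize = 2;
        int pages = 15; // 30 rows / 2 rows per page
        try (Connection conn = DriverManager.getConnection(DB_URL, USER, PASS);
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT ID, name, url FROM websites LIMIT ?, ?")) {
            for (int i = 0; i < pages; i++) {
                ps.setInt(1, i * pageSize); // offset, computed in Java and bound as an integer
                ps.setInt(2, pageSize);     // rows per page
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getInt("ID") + ", "
                                + rs.getString("name") + ", " + rs.getString("url"));
                    }
                }
            }
        }
    }
}
```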

Other related questions
Looking for SQL pagination code that uses a Java variable

public static void main(String[] args) {
    Connection conn = null;
    Statement stmt = null;
    try {
        creatFile(fn, FILE_NAME);
        Class.forName("com.mysql.jdbc.Driver");
        System.out.println("Connecting to the database...");
        conn = DriverManager.getConnection(DB_URL, USER, PASS);
        System.out.println("Instantiating a Statement object...");
        stmt = conn.createStatement();
        int sunNum = 30;
        int pagesize = 0;
        pagesize = sunNum / 2;
        for (int i = 0; i <= pagesize; i++) {
            String sql = "SELECT ID,name, url From websites limit i*2,2;";
            ResultSet rs = stmt.executeQuery(sql);
            String fn = "F://NEWtest//again4.txt";
            wf(fn, "ID,name,url.");
            while (rs.next()) {
                int id = rs.getInt("id");
                String name = rs.getString("name");
                String url = rs.getString("url");
                //String country = rs.getString("country");
                //String fourth = rs.getString("fourth");
                System.out.print(", FR: " + id);
                System.out.print(", SE: " + name);
                System.out.print(", TH: " + url);
                //System.out.print(", FO: " + country);
                System.out.print("\n");
                wf(fn, id + name + url);
            }
            rs.close();
            stmt.close();
            conn.close();
        }
Is my code right? I want to paginate with a loop, but I don't know how to pass the Java variable into the SQL. Please help me out and write some code — I've been learning Java for less than a week, thanks.
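
A sketch of the corrected loop, under the same table, `fn` path and helper methods as the posted code. Two things need changing: the variable has to be concatenated into the SQL string (inside the quotes, `i` is just the literal text "i"), and `stmt`/`conn` must not be closed inside the for loop or the second iteration will fail — only the per-page ResultSet should be closed there.

```java
for (int i = 0; i < pagesize; i++) {                    // 15 pages of 2 rows covers 30 rows
    String sql = "SELECT ID, name, url FROM websites limit " + (i * 2) + ",2";
    ResultSet rs = stmt.executeQuery(sql);
    while (rs.next()) {
        wf(fn, rs.getInt("id") + "," + rs.getString("name") + "," + rs.getString("url"));
    }
    rs.close();   // close only the per-page ResultSet inside the loop
}
stmt.close();     // close the Statement and Connection once, after the loop
conn.close();
```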

Java ArrayList "GC overhead limit exceeded" problem

I've run into a problem in a project and would appreciate some pointers: while creating ArrayList entries in bulk I get "GC overhead limit exceeded". Simplified, the project needs to add the counter_temp string to an ArrayList 1,000,000 times, and counter_temp itself is built up by 100 concatenations. Late in the run the GC overhead limit exceeded error appears. How can I change this to fix it? The simplified code is as follows:

import java.util.ArrayList;

public class TestGC {
    public static void main(String[] args) {
        ArrayList list = new ArrayList();
        String counter_temp = null;
        for (int i = 0; i < 1000000; i++) {
            long s = System.currentTimeMillis();
            for (int j = 0; j < 100; j++) {
                if (counter_temp == null) {
                    counter_temp = "1";
                } else {
                    counter_temp = counter_temp + "$1";
                }
            }
            list.add(counter_temp);
            long end = System.currentTimeMillis();
            System.out.println(end - s);
        }
    }
}
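
A hedged sketch of one way to relieve the memory pressure, assuming each list entry is meant to be the 100-piece string rebuilt from scratch (in the posted code counter_temp is never reset, so every entry keeps growing): reuse a StringBuilder and reset it on each outer iteration, so each entry costs one allocation instead of 100 intermediate strings.

```java
import java.util.ArrayList;
import java.util.List;

public class TestGCFixed {
    public static void main(String[] args) {
        List<String> list = new ArrayList<>();
        StringBuilder sb = new StringBuilder(256);   // reused buffer, avoids 100 temporary strings per entry
        for (int i = 0; i < 1_000_000; i++) {
            sb.setLength(0);                         // reset, so entries do not grow without bound
            sb.append("1");
            for (int j = 1; j < 100; j++) {
                sb.append("$1");
            }
            list.add(sb.toString());                 // one allocation per entry
        }
        System.out.println("entries: " + list.size());
    }
}
```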

A question about variable assignment across multiple pool.query callbacks in Node.js

**Goal:** From the total number of rows found and the page size requested by the front end, I compute the total number of pages, so that I can build page-number navigation.

**Problem:** Code:

```
router.get('/list', (req, res) => {
    var obj = req.query;
    var $page = Math.round(obj.page);
    var $size = Math.round(obj.size);
    var $pageCount = null; // temporary store for the total page count
    var pageCount = 0;     // first approach
    console.log(obj);
    if (!$page) { $page = 1; }
    if (!$size) { $size = 10; }
    $page--;
    console.log('rows requested by the front end: ' + $size);
    console.log('initial value: ' + pageCount); // this.pageCount
    //console.log('initial value: ' + pageCount); // second approach
    // get the total number of pages
    var sqlstr2 = 'SELECT * FROM xz_laptop';
    pool.query(sqlstr2, (err, result) => {
        if (err) throw err;
        if (result.length > 0) {
            // total page count
            $pageCount = result.length;
            console.log('total rows: ' + $pageCount);
            console.log($pageCount / $size);
            pageCount = Math.ceil($pageCount / $size);
            console.log('total pages: ' + pageCount);
        }
    });
    // paged query
    var sqlstr = 'SELECT * FROM xz_laptop LIMIT ? , ?';
    pool.query(sqlstr, [($page * $size), $size], (err, result) => {
        //console.log()
        if (err) throw err;
        console.log(result);
        if (result.length > 0) {
            console.log('pages returned: ' + pageCount);
            res.send({data: result, pno: ($page + 1), pCount: pageCount});
            //res.send(result);
        } else {
            res.send({code: 301, msg: 'list error'});
        }
    });
```

With the first approach, pageCount only gets a value from the first request after the server starts; after that the returned pageCount is always null, and the returned data object looks like {data:…, pno:… (the returned page number), pCount:… (the total page count, the number I actually want), pageCount:… (where does this come from? why are there four properties?)}, even though the object I send is {data:result, pno:($page+1), pCount:pageCount}. Why is there an extra property? That's the first point. With the second approach I don't declare pageCount at all and use this.pageCount everywhere afterwards, and then I do get the data I want — but I never assigned an initial value, so why does the initial log print 1? And why does this work at all — is it implicitly promoted to a property of some object? If it were a `this` binding problem, saving `this` in a variable still doesn't help; the result is undefined. Why? Also, is it even acceptable to run multiple SQL statements this way? I've considered hoisting and `this` binding but still can't figure it out — any advice appreciated!

Java: use LIMIT to read MySQL in a loop until everything has been read, and show how many rows were read

The database has a lot of rows; I want to read about 5000 at a time and loop with pagination until everything has been read. Below is my test code. I now need the pagination code — it can be written on top of the test code I posted. As detailed as possible please, thanks.

package webtest;

import java.sql.*;
import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.io.File;

public class test5 {
    static final String JDBC_DRIVER = "com.mysql.jdbc.Driver";
    static final String DB_URL = "jdbc:mysql://localhost:3306/gumysql";
    static final String USER = "root";
    static final String PASS = "123456";
    public static final String FILE_NAME = "again1.txt"; // name of the file to create
    public static final String fn = "F:/NEWtest/";       // directory where the file is stored

    public static void creatFile(String fn, String fileName) {
        File folder = new File(fn);
        // the folder path does not exist
        if (!folder.exists() && !folder.isDirectory()) {
            System.out.println("Folder path does not exist, creating it: " + fn);
            folder.mkdirs();
        } else {
            System.out.println("Folder path exists: " + fn);
        }
        // create the file if it does not exist
        File file = new File(fn + fileName);
        if (!file.exists()) {
            System.out.println("File does not exist, creating it: " + fn + fileName);
            try {
                file.createNewFile();
            } catch (IOException e) {
                e.printStackTrace();
            }
        } else {
            System.out.println("File already exists: " + fn + fileName);
        }
    }

    public static void wf(String file, String conent) {
        BufferedWriter out = null;
        try {
            out = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file, true)));
            out.write(conent + "\r\n");
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                out.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    public static void main(String[] args) {
        Connection conn = null;
        Statement stmt = null;
        try {
            creatFile(fn, FILE_NAME);
            Class.forName("com.mysql.jdbc.Driver");
            System.out.println("Connecting to the database...");
            conn = DriverManager.getConnection(DB_URL, USER, PASS);
            System.out.println("Instantiating a Statement object...");
            stmt = conn.createStatement();
            String sql;
            sql = "SELECT id, name, url ,country FROM websites ";
            ResultSet rs = stmt.executeQuery(sql);
            String fn = "F://NEWtest//again1.txt";
            wf(fn, "ID, site name, site URL, country.");
            while (rs.next()) {
                int id = rs.getInt("id");
                String name = rs.getString("name");
                String url = rs.getString("url");
                String country = rs.getString("country");
                System.out.print("ID: " + id);
                System.out.print(", site name: " + name);
                System.out.print(", site URL: " + url);
                System.out.print(", country: " + country);
                System.out.print("\n");
                wf(fn, id + "," + name + "," + url + "," + country);
            }
            rs.close();
            stmt.close();
            conn.close();
        } catch (SQLException se) {
            se.printStackTrace();
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try {
                if (stmt != null) stmt.close();
            } catch (SQLException se2) {
            }
            try {
                if (conn != null) conn.close();
            } catch (SQLException se) {
                se.printStackTrace();
            }
        }
        System.out.println("Goodbye!");
    }
}
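
A hedged sketch of the paging loop the question asks for, written against the same table, connection (`conn`) and helpers (`wf`, `fn`) as the test code above: fetch 5000 rows at a time with LIMIT ?, ?, stop when a page comes back with fewer rows than the page size, and keep a running count of how many rows were read.

```java
int pageSize = 5000;
int offset = 0;
long totalRead = 0;
try (PreparedStatement ps = conn.prepareStatement(
        "SELECT id, name, url, country FROM websites LIMIT ?, ?")) {
    while (true) {
        ps.setInt(1, offset);
        ps.setInt(2, pageSize);
        int rowsInPage = 0;
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                rowsInPage++;
                wf(fn, rs.getInt("id") + "," + rs.getString("name") + ","
                        + rs.getString("url") + "," + rs.getString("country"));
            }
        }
        totalRead += rowsInPage;
        System.out.println("Rows read so far: " + totalRead);
        if (rowsInPage < pageSize) {   // last (possibly partial) page reached
            break;
        }
        offset += pageSize;
    }
}
```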

Java database "ORDER BY ? LIMIT ?,?" (sorting with pagination) problem

As in the title: the first placeholder, the ORDER BY one, has no effect (the result always comes back in the default order). Has anyone run into this in development? The same statement works on the MySQL command line. How do I fix it? It's urgent, waiting online.
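
This is expected behaviour with bound parameters: the value is sent as a literal, so ORDER BY ? ends up ordering by a constant, which changes nothing. A common workaround, sketched below assuming the usual java.sql/java.util imports and a hypothetical `allowedColumns` whitelist, is to validate the column name and concatenate it into the SQL, while keeping the LIMIT arguments as placeholders.

```java
// sketch: page through `websites` ordered by a whitelisted column
static void queryPage(Connection conn, String requestedColumn, int offset, int pageSize) throws SQLException {
    Set<String> allowedColumns = new HashSet<>(Arrays.asList("id", "name", "url"));
    String orderBy = allowedColumns.contains(requestedColumn) ? requestedColumn : "id"; // never raw user input
    String sql = "SELECT ID, name, url FROM websites ORDER BY " + orderBy + " LIMIT ?, ?";
    try (PreparedStatement ps = conn.prepareStatement(sql)) {
        ps.setInt(1, offset);   // LIMIT arguments may remain placeholders
        ps.setInt(2, pageSize);
        try (ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.println(rs.getInt("ID") + ", " + rs.getString("name"));
            }
        }
    }
}
```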

Project keeps reporting "GC overhead limit exceeded" — how do I fix it?

The project keeps reporting "GC overhead limit exceeded" — how can this be fixed? ![screenshot](https://img-ask.csdn.net/upload/201812/27/1545913147_197931.png)

Can MySQL's LIMIT pagination have conditions in front of it?

I want to use MySQL's LIMIT so that a fuzzy (LIKE) query shows only 3 rows per page on the front end — how should the SQL statement be written?
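
Yes — the WHERE condition simply goes before LIMIT. A small sketch, with hypothetical names (`websites.name`, `conn`, `keyword`, `page` are assumptions, not from the question):

```java
String sql = "SELECT ID, name, url FROM websites WHERE name LIKE ? ORDER BY ID LIMIT ?, ?";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setString(1, "%" + keyword + "%"); // fuzzy match
    ps.setInt(2, page * 3);               // offset
    ps.setInt(3, 3);                      // 3 rows per page
    ResultSet rs = ps.executeQuery();
}
```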

After grouping with GROUP BY, how do I paginate each group with LIMIT?

After grouping with GROUP BY, how can every group be paginated with LIMIT? For example, with 10 rows per page, page 1 should show the first 10 rows of every group and page 2 should show rows 10–20 of every group. The problem is that some groups have more than 10 rows and some have fewer, so applying LIMIT directly to the grouped query doesn't work.
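
Plain LIMIT cannot page per group; one common approach, sketched here on the assumption of MySQL 8+ and a hypothetical table `t` with a group column `g` and sort column `id`, is to number the rows inside each group with ROW_NUMBER() and filter on that number.

```java
// page N (1-based) returns rows ((N-1)*10 + 1) .. (N*10) of every group
String sql =
      "SELECT * FROM ("
    + "  SELECT t.*, ROW_NUMBER() OVER (PARTITION BY g ORDER BY id) AS rn FROM t"
    + ") x WHERE x.rn BETWEEN ? AND ?";
try (PreparedStatement ps = conn.prepareStatement(sql)) {
    ps.setInt(1, (page - 1) * 10 + 1);
    ps.setInt(2, page * 10);
    ResultSet rs = ps.executeQuery();
}
```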

MySQL LIMIT parameter-passing problem

LIMIT requires its arguments to be integers. I now need to pass two parameters to LIMIT, but the two values I have are strings. How can they be converted to int so that LIMIT will accept them? cast(? as unsigned int) doesn't work — any experts, please advise.
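
The CAST approach can't work because MySQL does not allow expressions (including CAST) as LIMIT arguments — only integer constants, or plain ? placeholders in a prepared statement. A sketch of the usual fix: convert the strings in Java and bind them as integers (`offsetStr` and `countStr` are assumed names for the string inputs, and `conn` an existing connection).

```java
int offset = Integer.parseInt(offsetStr.trim()); // convert the string parameters in Java
int count  = Integer.parseInt(countStr.trim());

try (PreparedStatement ps = conn.prepareStatement(
        "SELECT ID, name, url FROM websites LIMIT ?, ?")) {
    ps.setInt(1, offset); // bound as integers, which LIMIT accepts
    ps.setInt(2, count);
    ResultSet rs = ps.executeQuery();
}
```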

How is the Redis parameter client-output-buffer-limit pubsub tuned? What effect does setting client-output-buffer-limit pubsub 0 0 0 have on the server?

Recently the backend service has been throwing this error intermittently:

redis.clients.jedis.exceptions.JedisConnectionException: Unexpected end of stream.
at redis.clients.util.RedisInputStream.ensureFill(RedisInputStream.java:198)
at redis.clients.util.RedisInputStream.read(RedisInputStream.java:180)
at redis.clients.jedis.Protocol.processBulkReply(Protocol.java:158)
at redis.clients.jedis.Protocol.process(Protocol.java:132)
at redis.clients.jedis.Protocol.processMultiBulkReply(Protocol.java:183)
at redis.clients.jedis.Protocol.process(Protocol.java:134)
at redis.clients.jedis.Protocol.read(Protocol.java:192)
at redis.clients.jedis.Connection.readProtocolWithCheckingBroken(Connection.java:282)
at redis.clients.jedis.Connection.getRawObjectMultiBulkReply(Connection.java:227)
at redis.clients.jedis.JedisPubSub.process(JedisPubSub.java:108)
at redis.clients.jedis.JedisPubSub.proceedWithPatterns(JedisPubSub.java:95)
at redis.clients.jedis.Jedis.psubscribe(Jedis.java:2513)
at BenchRedisConsumer$BenchRunner.run(BenchRedisConsumer.java:208)
at java.lang.Thread.run(Thread.java:745)

For a pubsub client, once the client output buffer exceeds 32 MB, or stays above 8 MB for 60 seconds, the server immediately drops the client connection. If the Redis parameter is set to client-output-buffer-limit pubsub 0 0 0, what impact does that have on the server? Or should this parameter be adjusted elastically together with other parameters?

How do I convert MySQL LIMIT pagination into a SQL Server statement?

String sql = "select " + str + " from product,shop where product.shop_id=shop.shop_id " + " limit " + (page * size) + "," + size; // pagination is done with LIMIT

where str is defined as: private String str = "product.product_id,product.category_id,product.shop_id,product.sub_category_id,product.city_id,product.product_title," + "shop.shop_name,shop.shop_tel,shop.shop_address,shop.shop_area,shop.shop_open_time,shop.shop_lon,shop.shop_lat,shop.shop_traffic_info";

So the question is: how should the MySQL statement above be converted into a SQL Server statement that achieves the same paging effect?
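
A sketch of the SQL Server equivalent, assuming SQL Server 2012 or later, which pages with OFFSET … FETCH; unlike LIMIT it requires an ORDER BY clause (product.product_id is an assumed sort key here, not something specified in the question).

```java
String sql = "select " + str + " from product, shop"
           + " where product.shop_id = shop.shop_id"
           + " order by product.product_id"                  // OFFSET/FETCH requires an ORDER BY
           + " offset " + (page * size) + " rows"
           + " fetch next " + size + " rows only";
```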

A question about ByteBuffer's position and limit

@Test
public void test() throws Exception {
    FileInputStream fis = new FileInputStream("D:\\1.jpg");
    FileOutputStream fos = new FileOutputStream("D:\\2.jpg");
    // get the channels
    FileChannel inChannel = fis.getChannel();
    FileChannel outChannel = fos.getChannel();
    // allocate a buffer of the given size
    ByteBuffer buff = ByteBuffer.allocate(1024);
    System.out.println("-----ByteBuffer.allocate(1024)-----");
    System.out.println("position" + buff.position());
    System.out.println("limit" + buff.limit());
}

Running this test prints:
-----ByteBuffer.allocate(1024)-----
position0
limit1024

If I put a breakpoint just before System.out.println("position" + buff.position()); and debug, it prints:
-----ByteBuffer.allocate(1024)-----
position1024
limit1024

If I comment out
FileInputStream fis = new FileInputStream("D:\\1.jpg");
FileOutputStream fos = new FileOutputStream("D:\\2.jpg");
// get the channels
FileChannel inChannel = fis.getChannel();
FileChannel outChannel = fos.getChannel();
then run and debug produce the same output:
-----ByteBuffer.allocate(1024)-----
position1024
limit1024

What effect does obtaining a FileChannel have on the ByteBuffer? Please explain.
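
For reference, a small sketch of how position and limit normally move: allocate() always starts at position 0 with limit = capacity, a channel read advances position by the number of bytes read, and flip() sets limit = position and position = 0 before the data is written out. The file paths are the same hypothetical D:\ paths as in the question; merely creating the channels does not touch the buffer.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.nio.ByteBuffer;
import java.nio.channels.FileChannel;

public class BufferStateDemo {
    public static void main(String[] args) throws Exception {
        try (FileChannel in = new FileInputStream("D:\\1.jpg").getChannel();
             FileChannel out = new FileOutputStream("D:\\2.jpg").getChannel()) {
            ByteBuffer buff = ByteBuffer.allocate(1024);
            System.out.println("after allocate: position=" + buff.position() + ", limit=" + buff.limit()); // 0 / 1024
            int n = in.read(buff); // reading into the buffer advances position by n
            System.out.println("after read(" + n + "): position=" + buff.position() + ", limit=" + buff.limit());
            buff.flip();           // limit = old position, position = 0, ready to be written out
            System.out.println("after flip: position=" + buff.position() + ", limit=" + buff.limit());
            out.write(buff);
        }
    }
}
```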

java.lang.OutOfMemory exception

各位大神,小生在做excel导出的时候遇见了这么一个问题,作为实习生实在是莫名其妙,求各位大神帮个小忙。 java.lang.OutOfMemoryError: GC overhead limit exceeded 16:31:22.700 [nioEventLoopGroup-2-8] WARN io.netty.channel.nio.NioEventLoop - Unexpected exception in the selector loop. java.lang.OutOfMemoryError: GC overhead limit exceeded at java.util.ArrayList.iterator(ArrayList.java:814) ~[na:1.7.0_80] at sun.nio.ch.WindowsSelectorImpl.updateSelectedKeys(WindowsSelectorImpl.java:496) ~[na:1.7.0_80] at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:172) ~[na:1.7.0_80] at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87) ~[na:1.7.0_80] at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98) ~[na:1.7.0_80] at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622) ~[netty-transport-4.0.32.Final.jar:4.0.32.Final] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310) ~[netty-transport-4.0.32.Final.jar:4.0.32.Final] at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112) [netty-common-4.0.32.Final.jar:4.0.32.Final] at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) [netty-common-4.0.32.Final.jar:4.0.32.Final] at java.lang.Thread.run(Thread.java:745) [na:1.7.0_80] 16:31:23.334 [Thread-6] ERROR o.a.e.i.a.AcquireTimerJobsRunnable - exception during timer job acquisition: GC overhead limit exceeded java.lang.OutOfMemoryError: GC overhead limit exceeded at java.util.Arrays.copyOf(Arrays.java:2367) ~[na:1.7.0_80] at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130) ~[na:1.7.0_80] at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114) ~[na:1.7.0_80] at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415) ~[na:1.7.0_80] at java.lang.StringBuilder.append(StringBuilder.java:132) ~[na:1.7.0_80] at org.apache.ibatis.reflection.wrapper.BeanWrapper.getBeanProperty(BeanWrapper.java:171) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.reflection.wrapper.BeanWrapper.get(BeanWrapper.java:49) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.reflection.MetaObject.getValue(MetaObject.java:122) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.executor.BaseExecutor.createCacheKey(BaseExecutor.java:212) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.executor.CachingExecutor.createCacheKey(CachingExecutor.java:139) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.executor.CachingExecutor.query(CachingExecutor.java:81) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:120) ~[mybatis-3.3.0.jar:3.3.0] at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:113) ~[mybatis-3.3.0.jar:3.3.0] at org.activiti.engine.impl.db.DbSqlSession.selectListWithRawParameter(DbSqlSession.java:438) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:429) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:424) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:411) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.persistence.entity.JobEntityManager.findNextTimerJobsToExecute(JobEntityManager.java:157) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.cmd.AcquireTimerJobsCmd.execute(AcquireTimerJobsCmd.java:45) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at 
org.activiti.engine.impl.cmd.AcquireTimerJobsCmd.execute(AcquireTimerJobsCmd.java:29) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.interceptor.CommandInvoker.execute(CommandInvoker.java:24) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.interceptor.CommandContextInterceptor.execute(CommandContextInterceptor.java:57) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.spring.SpringTransactionInterceptor$1.doInTransaction(SpringTransactionInterceptor.java:47) ~[activiti-spring-5.19.0.2.jar:5.19.0.2] at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133) ~[spring-tx-4.2.2.RELEASE.jar:4.2.2.RELEASE] at org.activiti.spring.SpringTransactionInterceptor.execute(SpringTransactionInterceptor.java:45) ~[activiti-spring-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.interceptor.LogInterceptor.execute(LogInterceptor.java:31) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.cfg.CommandExecutorImpl.execute(CommandExecutorImpl.java:40) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.cfg.CommandExecutorImpl.execute(CommandExecutorImpl.java:35) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at org.activiti.engine.impl.asyncexecutor.AcquireTimerJobsRunnable.run(AcquireTimerJobsRunnable.java:52) ~[activiti-engine-5.19.0.2.jar:5.19.0.2] at java.lang.Thread.run(Thread.java:745) [na:1.7.0_80] /report/collectionPaymentClass-report-down java.lang.OutOfMemoryError: GC overhead limit exceeded

Java out-of-memory problem, not sure where it's coming from

项目运行的时候出现了内存溢出情况,复现方式无法确定,大概出现了五六次,每次复现的方式都不一样,下面是我用MAT工具分析的dump文件 ![图片说明](https://img-ask.csdn.net/upload/202001/20/1579508231_612899.png) ![图片说明](https://img-ask.csdn.net/upload/202001/20/1579508244_502327.png) 除了hibernate里的那两个对象,其他的1000多万个对象里全是integer类型的对象 这是当时出现的时候产生的异常 ``` 2020-01-17 13:32:55 ERROR [DruidDataSource.java:2469] - create connection SQLException, url: jdbc:mysql://localhost:55060/mcs?useUnicode=true&characterEncoding=utf-8, errorCode 0, state 08S01 com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure The last packet successfully received from the server was 1 milliseconds ago. The last packet sent successfully to the server was 0 milliseconds ago. at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:526) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:990) at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3517) at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3417) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3860) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:864) at com.mysql.jdbc.MysqlIO.proceedHandshakeWithPluggableAuthentication(MysqlIO.java:1707) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1217) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2189) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2220) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2015) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:768) at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) at sun.reflect.GeneratedConstructorAccessor118.newInstance(Unknown Source) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:526) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:385) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:323) at com.alibaba.druid.pool.DruidAbstractDataSource.createPhysicalConnection(DruidAbstractDataSource.java:1513) at com.alibaba.druid.pool.DruidAbstractDataSource.createPhysicalConnection(DruidAbstractDataSource.java:1578) at com.alibaba.druid.pool.DruidDataSource$CreateConnectionThread.run(DruidDataSource.java:2466) Caused by: java.io.EOFException: Can not read response from server. Expected to read 4 bytes, read 0 bytes before connection was unexpectedly lost. at com.mysql.jdbc.MysqlIO.readFully(MysqlIO.java:2969) at com.mysql.jdbc.MysqlIO.reuseAndReadPacket(MysqlIO.java:3427) ... 
19 more 2020-01-17 13:33:03 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_preset_rlat failed: java.lang.OutOfMemoryError: GC overhead limit exceeded at net.sf.ehcache.util.MemoryEfficientByteArrayOutputStream.getBytes(MemoryEfficientByteArrayOutputStream.java:65) at net.sf.ehcache.util.MemoryEfficientByteArrayOutputStream.serialize(MemoryEfficientByteArrayOutputStream.java:99) at net.sf.ehcache.store.disk.DiskStorageFactory.serializeElement(DiskStorageFactory.java:405) at net.sf.ehcache.store.disk.DiskStorageFactory.write(DiskStorageFactory.java:384) at net.sf.ehcache.store.disk.DiskStorageFactory$DiskWriteTask.call(DiskStorageFactory.java:485) at net.sf.ehcache.store.disk.DiskStorageFactory$PersistentDiskWriteTask.call(DiskStorageFactory.java:1088) at net.sf.ehcache.store.disk.DiskStorageFactory$PersistentDiskWriteTask.call(DiskStorageFactory.java:1072) at java.util.concurrent.FutureTask.run(FutureTask.java:262) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178) at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745) 2020-01-17 13:32:59 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:32:59 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:32:55 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:16 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:27 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:33:27 ERROR [SqlExceptionHelper.java:144] - wait millis 7401, active 16, maxActive 40, creating 1 2020-01-17 13:33:27 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-17 13:33:33 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:33 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:49 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:49 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:49 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:49 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:49 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:53 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:53 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:33:53 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:02 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:34:02 ERROR [SqlExceptionHelper.java:144] - wait millis 5503, active 18, maxActive 40, creating 1 2020-01-17 13:34:02 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is 
javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-17 13:34:04 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:08 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:12 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:12 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:14 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:14 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:14 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:17 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:34:17 ERROR [SqlExceptionHelper.java:144] - wait millis 5485, active 19, maxActive 40, creating 1 2020-01-17 13:34:17 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.orm.hibernate3.HibernateJdbcException: JDBC exception on Hibernate data access: SQLException for SQL [n/a]; SQL state [null]; error code [0]; Could not open connection; nested exception is org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-17 13:34:17 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:27 INFO [DeviceCache.java:249] - 设备状态及报警维护过程中出现异常 2020-01-17 13:34:35 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:35 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:42 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:42 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:44 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:44 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:34:52 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:34:52 ERROR [SqlExceptionHelper.java:144] - wait millis 7301, active 22, maxActive 40, creating 1 2020-01-17 13:34:52 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-17 13:34:53 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:35:05 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_roll_preset failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-17 13:35:18 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:35:18 ERROR [SqlExceptionHelper.java:144] - Error 2020-01-17 13:35:50 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-17 13:35:50 ERROR [SqlExceptionHelper.java:144] - Error 2020-01-17 13:36:13 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-17 13:36:13 ERROR [SqlExceptionHelper.java:144] - Lock wait timeout exceeded; try restarting transaction 2020-01-17 13:36:13 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-17 13:36:24 ERROR [SqlExceptionHelper.java:144] - Lock wait timeout exceeded; try restarting transaction 2020-01-17 13:36:13 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-17 13:37:01 ERROR 
[SqlExceptionHelper.java:144] - Lock wait timeout exceeded; try restarting transaction 2020-01-17 13:37:43 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-17 13:36:37 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-17 13:36:33 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-17 13:36:22 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 ``` 现在就是无法确定到底哪里导致的内存溢出? 这个是上次出现的异常和代码 ``` 2020-01-13 13:21:44 WARN [SqlExceptionHelper.java:143] - SQL Error: 1205, SQLState: 40001 2020-01-13 13:21:44 ERROR [SqlExceptionHelper.java:144] - Lock wait timeout exceeded; try restarting transaction 2020-01-13 13:23:55 INFO [MQTTProtocolHandler.java:283] - 接收到设备接入协议:{"topic":"info/deviceBaseInfo/BHIP118-S/00:00:01:A6:00:A2","reportTime":"256573751","decodeCapacity":{"totalLevel":2,"totalBlock":36,"totalPixel":16588800},"outputList":[{"status":"start","ratio":"UHD","pixel":"3840*2160","type":"UDP","url":"udp://231.0.100.80:7001"},{"status":"start","ratio":"HD","pixel":"960*540","type":"UDP","url":"udp://231.0.101.80:7001"},{"status":"start","ratio":"SD","pixel":"352*288","type":"UDP","url":"udp://231.0.102.80:7001"},{"status":"stop","ratio":"UHD","pixel":"3840*2160","type":"RTMP","url":"rtmp://192.168.15.80:1935/live/100"},{"status":"stop","ratio":"HD","pixel":"960*540","type":"RTMP","url":"rtmp://192.168.15.124:1935/live/101"},{"status":"stop","ratio":"SD","pixel":"352*288","type":"RTMP","url":"rtmp://192.168.15.80:1935/live/102"}],"baseInfo":{"deviceType":"BHIP118-S","code":"00:00:01:A6:00:A2","version":"v1.0.0.11","ip":"192.168.16.80"},"online":true} 2020-01-13 13:23:59 INFO [DeviceCache.java:127] - 接收到新增/更新设备信息:00:00:01:A6:00:A2 :BaseInfo [type=0,name=BHIP118-S,ip=192.168.16.80,port=0,code=00:00:01:A6:00:A2,groupCode=null,groupIndex=0,online=true,multiCastTime=0,ver=v1.0.0.11,channel=0,reserve=0,lockStatus=null,workMode=null,videoPixerls=null,kvmMode=null,serialNumber=null,sdipPortInfo is null,hdIpPortInfo is null,audioIpPortInfo is null,outputList[SourceOutput [ratio=UHD, url=udp://231.0.100.80:7001, pixel=3840*2160, channel=0, SourceOutput [ratio=HD, url=udp://231.0.101.80:7001, pixel=960*540, channel=0, SourceOutput [ratio=SD, url=udp://231.0.102.80:7001, pixel=352*288, channel=0, SourceOutput [ratio=UHD, url=rtmp://192.168.15.80:1935/live/100, pixel=3840*2160, channel=0, SourceOutput [ratio=HD, url=rtmp://192.168.15.124:1935/live/101, pixel=960*540, channel=0, SourceOutput [ratio=SD, url=rtmp://192.168.15.80:1935/live/102, pixel=352*288, channel=0],callStatus=null] 2020-01-13 13:24:06 INFO [MultiCastDeviceInfoHandler.java:229] - 更新视频合成器设备信息: Device [id=ff8080816f694555016f6a4bf61e158c, code=00:00:01:A6:00:A2, name=192.168.16.80, ip=192.168.16.80, port=0, status=0, type=BHIP118, deviceType=BHIP118-S, stamp=2020-01-03 15:26:00, abilityInfo=null, netCardInfo=null] 2020-01-13 13:24:53 INFO [MultiCastDeviceInfoHandler.java:364] - 更新解码器信息:Decoder [IP=192.168.16.80, Port=0, Channel=0, totalBlocks=36, totalLevel=2, uRefWidth=1920, uRefHeight=1080, audioPort=0, totalPixel=16588800, deviceTypeName=BHIP118-S, pixelsWidth=null, pixelsHeight=null] 2020-01-13 13:25:17 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:17 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:17 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:19 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:28 
WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:42 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-13 13:25:42 ERROR [SqlExceptionHelper.java:144] - wait millis 14463, active 11, maxActive 40, creating 1 2020-01-13 13:25:46 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_playback_task failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:25:48 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-13 13:25:50 ERROR [SqlExceptionHelper.java:144] - wait millis 18056, active 11, maxActive 40, creating 1 2020-01-13 13:25:50 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_source_volume failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:25:51 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:51 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:51 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.orm.hibernate3.HibernateJdbcException: JDBC exception on Hibernate data access: SQLException for SQL [n/a]; SQL state [null]; error code [0]; Could not open connection; nested exception is org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-13 13:25:51 INFO [GetCWRCapHandler.java:114] - 解析录像服务器上报心跳协议过程中出现异常:org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.GenericJDBCException: Could not open connection 2020-01-13 13:25:58 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_kvm failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:25:56 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:25:56 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:03 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_video_terminal failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:05 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_pic_layer failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:05 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_template failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:05 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_multicast_address failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:08 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_mass_source failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:08 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_pic_encoder_block failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:08 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_ipc_preset_group failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:08 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:13 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_video_terminal failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:17 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_record_layer_block failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:24 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_video_meeting failed: java.lang.OutOfMemoryError: GC 
overhead limit exceeded 2020-01-13 13:26:28 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_audio_preset_rlat failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:28 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_preset failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:28 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_layer_block failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:28 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:32 WARN [SqlExceptionHelper.java:143] - SQL Error: 0, SQLState: null 2020-01-13 13:26:32 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:32 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:32 ERROR [DiskStorageFactory.java:495] - Disk Write of mcs_encoder failed: java.lang.OutOfMemoryError: GC overhead limit exceeded 2020-01-13 13:26:52 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:27:07 ERROR [SqlExceptionHelper.java:144] - wait millis 20150, active 12, maxActive 40, creating 1 2020-01-13 13:26:50 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:46 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:46 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:41 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:41 ERROR [SqlExceptionHelper.java:144] - wait millis 23244, active 12, maxActive 40, creating 1 2020-01-13 13:26:41 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:26:41 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:30:29 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:30:25 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:30:07 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:30:07 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 2020-01-13 13:30:07 WARN [DruidDataSource.java:1258] - get connection timeout retry : 1 ``` 这个是组播上报的设备信息,因为组播是一秒钟上报了好几次,是不是这里一直占用着连接 ``` public void run() { try { ms = new MulticastSocket(multiCastPort);//建立组播套接字 ms.setReceiveBufferSize(65535); // ms.setNetworkInterface(NetworkInterface.getByInetAddress(localAddress)); ms.joinGroup(InetAddress.getByName(multiCastIP));//加入组播组 LogHome.getLog().info("正常启动监听" + localAddress.getHostAddress()); byte[] buffer = null; DatagramPacket dp = null; while (true) { try { if (IsStop) { break; } buffer = new byte[1400]; dp = new DatagramPacket(buffer, buffer.length); LogHome.getLog().debug("等待接受组播信息:"); ms.receive(dp); MultiCastInfoParser info = new MultiCastInfoParser(dp.getData()); DeviceCache.baseInfoOf1004Handler(info.parse()); } catch (Exception e) { if (IsStop) {// 如果换网卡可能会出问题 break; } else { if (ms.isClosed()) { ms = new MulticastSocket(multiCastPort); ms.setNetworkInterface(NetworkInterface.getByInetAddress(localAddress)); ms.joinGroup(InetAddress.getByName(multiCastIP)); } } LogHome.getLog().error("接受并设置组播信息失败",e); }finally{ Thread.sleep(1); } } } catch (Exception e) { e.printStackTrace(); try { if (listenerList.containsKey(localAddress.getHostAddress())) { listenerList.remove(localAddress.getHostAddress()); } } catch (Exception ex) { LogHome.getLog().error(ex); } } ```

MySQL query joining two tables, with LIMIT on one of them — how do I write it?

If I query everything at once I can write: select * from class left join student on student.id = class.id. But now I want the first 10 students of each of the first 30 classes — how do I get that information? Surely I don't have to run 31 queries: 1. query which classes there are, LIMIT 30; 2. query the first 10 students of each class separately, LIMIT 10.

Where is the mistake in this MySQL LIMIT?

MySQL 5.6.12
select * from table limit 1,-1 ;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '-1,1' at line 1
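
LIMIT arguments must be non-negative integers, so -1 is rejected outright. The workaround the MySQL manual suggests for "from this offset to the end of the result set" is to pass a very large row count as the second argument; a sketch (here `t` stands in for the asker's table):

```java
// "from row 2 to the end": MySQL has no -1, use a huge second argument instead
String sql = "SELECT * FROM t LIMIT 1, 18446744073709551615";
```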

java.lang.OutOfMemoryError: Java heap space

![screenshot](https://img-ask.csdn.net/upload/201709/07/1504758887_307108.png) Where exactly do I add the manual memory setting `set JAVA_OPTS= -Xms32m -Xmx512m`? The Tomcat server is on Linux.

Parsing an incoming SQL statement in Java

In a Java backend endpoint, how can I parse the SQL statement passed in and extract the fields that follow GROUP BY? Sample code would be appreciated, thanks!
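
For production a real SQL parser library is safer, but as a minimal sketch of the idea, a regular expression can pull out the text between GROUP BY and the next clause. The clause keywords covered and the sample SQL below are assumptions, not a general parser.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GroupByExtractor {
    // captures everything after GROUP BY up to HAVING / ORDER BY / LIMIT or end of string
    private static final Pattern GROUP_BY =
        Pattern.compile("group\\s+by\\s+(.+?)(?:\\s+having\\b|\\s+order\\s+by\\b|\\s+limit\\b|$)",
                        Pattern.CASE_INSENSITIVE | Pattern.DOTALL);

    public static String[] groupByFields(String sql) {
        Matcher m = GROUP_BY.matcher(sql);
        if (!m.find()) {
            return new String[0];
        }
        return m.group(1).trim().split("\\s*,\\s*"); // split the column list on commas
    }

    public static void main(String[] args) {
        String sql = "SELECT city, shop, COUNT(*) FROM orders GROUP BY city, shop ORDER BY city";
        for (String f : groupByFields(sql)) {
            System.out.println(f); // prints: city, then shop
        }
    }
}
```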

MySQL query: save the result rows as variables

I'm after the simplest (and, from a novice programmer's perspective, easiest to understand) way to do the following. I have a query on a database:

$newsquery = mysql_query('SELECT * FROM news ORDER BY id DESC LIMIT 3');

I want to select the top three rows in my table and get the column 'news'. Then I need to save each result as a separate variable — say $news1, $news2 and $news3 — so I can echo them as needed on my page. I hope this makes sense, many thanks.

