How do I query rows 10-20 in MySQL?

How do I query rows 1-5 in MySQL?

5 answers

 select * from table limit 0, 5
select * from table limit 10, 20
Cashey1991
开水 replying to qq_24592567: Try adding a ";" at the end of the query: select * from table limit 0, 5;
about 5 years ago · Reply
u012224727
DanielPop replying to qq_24592567: What error does it report? Paste it here.
about 5 years ago · Reply
qq_24592567
qq_24592567 It doesn't work; it throws an error.
about 5 years ago · Reply

LIMIT's second argument is a row count, so querying rows 10-20 should be select * from table limit 9, 10;

m0_37589586
Tg丶break That's wrong: rows 10 through 20 inclusive are 11 records, so it should be limit 9, 11
about 2 years ago · Reply

Use LIMIT at the end of the SELECT statement.

How do I query rows 11-20 in MySQL?

In MySQL it should be 9, 11.
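To pin down the offset arithmetic the thread keeps circling: `LIMIT offset, count` skips `offset` rows and returns the next `count` rows, so rows M through N (1-based, inclusive) are `LIMIT M-1, N-M+1`. A minimal sketch, using an illustrative table `t`:

```sql
-- LIMIT offset, count  =>  skip `offset` rows, return the next `count` rows.
-- Rows M..N (1-based, inclusive)  =>  LIMIT M-1, N-M+1.
SELECT * FROM t LIMIT 0, 5;          -- rows 1-5
SELECT * FROM t LIMIT 9, 11;         -- rows 10-20 (11 rows)
SELECT * FROM t LIMIT 10, 10;        -- rows 11-20 (10 rows)
SELECT * FROM t LIMIT 11 OFFSET 9;   -- same as LIMIT 9, 11
```

Note that without an ORDER BY, MySQL does not guarantee row order, so "row 10" is only well-defined once you sort on something.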

Other related recommendations
How do I create range groups, e.g. 10-20, 20-30, 30-40, when the number of groups is not fixed? It could be one group, or several.

Group the query results and compute percentages. ![图片说明](https://img-ask.csdn.net/upload/201901/10/1547110474_882638.png) The result should look roughly like the picture: the "share of participants" column is the percentage of everyone in department a who took the test paper whose score landed in the 10-20 range, and so on. The ranges are an array passed in from the page, so the number of ranges is not known in advance; if nobody lands in a range, show 0%.
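Not part of the original question, just a hedged sketch of one approach: the page-supplied ranges can be turned into rows of a derived table (one UNION ALL branch per range, built by the application), then LEFT JOINed so that empty ranges still appear as 0%. The `scores(dept, paper, score)` table and the half-open bucket convention (lo <= score < hi) are illustrative assumptions:

```sql
SELECT r.lo, r.hi,
       CONCAT(ROUND(100 * COUNT(s.score) /
                    (SELECT COUNT(*) FROM scores
                     WHERE dept = 'a' AND paper = 'test'), 1),
              '%') AS share_of_participants
FROM (SELECT 10 AS lo, 20 AS hi        -- one row per page-supplied range
      UNION ALL SELECT 20, 30
      UNION ALL SELECT 30, 40) AS r
LEFT JOIN scores s
       ON s.dept = 'a' AND s.paper = 'test'
      AND s.score >= r.lo AND s.score < r.hi
GROUP BY r.lo, r.hi;                    -- empty ranges count 0 rows => 0.0%
```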

Exception when querying a large amount of data in MySQL

The driver was unable to create a connection due to an inability to establish the client portion of a socket. This is usually caused by a limit on the number of sockets imposed by the operating system. This limit is usually configurable. For Unix-based platforms, see the manual page for the 'ulimit' command. Kernel or system reconfiguration may also be required.

The database itself is fine. After every query I run:

```
if (rs != null) {
    try {
        rs.close();
    } catch (Exception e) {
        // TODO: handle exception
    }
}
// close resources [first opened, last closed]
if (ps != null) {
    try {
        ps.close();
    } catch (SQLException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    ps = null; // let garbage collection reclaim it
}
if (ct != null) {
    try {
        ct.close();
    } catch (SQLException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    ct = null;
}
```

One method contains a loop that queries the database over and over; each iteration goes through a utility class, so the close code above runs every time. Yet at around the 4,000th query the exception above is thrown. Please take a look; many thanks!

Merging table row data in a MySQL query based on date plus another field. The table has over 2M rows

Here is a sample table of data:

```
ID MC BC TIME                 AB  AT
1  4  10  2016-12-05 09:02:00  5   8
2  4  20  2016-12-15 09:03:00  2   3
3  4  10  2016-12-15 09:02:00  1   4
4  4  20  2016-12-25 09:02:00  3   6
5  4  05  2016-12-05 09:02:00  4   2
6  4  05  2016-12-08 09:02:00  6   2
7  4  10  2016-12-11 09:02:00  7   6
8  4  10  2016-12-05 09:02:00  9   8
9  4  10  2016-12-15 09:02:00  9   8
10 4  10  2016-12-15 09:05:00 10  20
11 5  10  2016-12-15 09:05:00 10  20
12 5  10  2016-12-15 09:05:00 10  20
13 5  10  2016-12-15 09:05:00 10  20
```

If I query for WHERE 'MC' = 4 AND 'TIME' LIKE `2016-12%`, then I want the 'AB' and 'AT' columns merged based on the day in the 'TIME' column, with each 'BC' merged per day separately.

As in the sample output, day 15 has two 'BC' values (10, 20), so they get two different rows.

I want the output like this:

```
ID MC BC TIME        AB  AT
1  4  10  2016-12-05 14  16
2  4  10  2016-12-11  7   6
3  4  10  2016-12-15 20  32
4  4  20  2016-12-15  2   3
5  4  20  2016-12-25  3   6
6  4  05  2016-12-05  4   2
7  4  05  2016-12-08  6   2
```

The table has more than 2 million rows for the current month, shaped like the sample.
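Not from the thread, but the shape of the answer is a GROUP BY over (BC, day). A hedged sketch, assuming the table is named `t`; a closed-open date range replaces `LIKE '2016-12%'` so an index on TIME can still be used:

```sql
SELECT MC, BC, DATE(TIME) AS day,
       SUM(AB) AS AB, SUM(AT) AS AT
FROM t
WHERE MC = 4
  AND TIME >= '2016-12-01' AND TIME < '2017-01-01'
GROUP BY MC, BC, DATE(TIME)
ORDER BY BC, day;
```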

MySQL: per-day totals over a date range

As the title says. For example, I want the per-day totals between 2016-04-20 and 2016-08-30. The data:

```
time        num
2016-04-20  10
2016-04-20  20
2016-04-21   5
2016-04-21   6
...
2016-08-30  30
```

The query result should be:

```
2016-04-20  30
2016-04-21  11
...
2016-08-30  30
```
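A minimal sketch, assuming a table `t(time, num)` where `time` is a DATE (if it is a DATETIME, group on DATE(time) instead):

```sql
SELECT time, SUM(num) AS num
FROM t
WHERE time BETWEEN '2016-04-20' AND '2016-08-30'
GROUP BY time
ORDER BY time;
```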

Large MySQL result set post-processed in a Python loop and paginated with slices

1 Problem description:
(1) A base query in MySQL with only three or four filter conditions and one joined table, paginated with LIMIT;
(2) The result then goes through a series of processing steps that themselves issue many SQL queries on top of the rows from (1). Previously pagination came first: step (1) fetched only ten rows, so step (2) looped at most ten times. Under the new logic, after the query, step (2) removes the rows that fail the filter and returns whatever remains;
(3) Since every page removes some rows, I tried a global variable to count deletions, but it only tells me how many were removed from the current page, not the running total. Page sizes also vary (usually under 10 rows, because non-matching rows are removed), yet the requirement is to report the total number of rows satisfying the filter while still paginating normally;
(4) So I dropped LIMIT, fetched the full data set, counted the removed rows, and paginated manually with slices. That yields the correct total and exactly 10 rows per page, but the loop count explodes: with a slightly larger data set it takes about 49 seconds.

2 Relevant code:
(1) Base query:
```
SELECT op.order_id, opc.order_code, op.created_at AS create_time, opc.departure_date, opc.end_date, opc.company, opc.channel_id, opc.retail, opc.final_cost, opc.has_pay, opc.commission_price, opc.commission_type, opc.commission_value \
FROM order_product_customize AS opc \
LEFT JOIN order_product AS op ON opc.order_product_id = op.order_product_id \
WHERE { 0 } ORDER BY opc.created_at DESC { 1 }
```
(2) Manual pagination:
```
nextPage = limit_start + page_size
result['data_list'] = result['data_list'][limit_start:nextPage]
result['total_num'] = result['total_num'] - self.delNum
```

3 Error message:
No error; it just takes extremely long to run. On a fairly good machine it takes 27.72 seconds; locally it is close to 40 seconds. (Cannot upload an image.)

4 What I have already tried:
(1) Counting the deletions and subtracting them (each request queries only one page, so only that page's deletion count is known);
(2) Fetching the full set of rows matching the filter (too much data plus the loop makes it extremely slow);
(3) Fetching about 20 rows each time, keeping the first 10 survivors, and remembering the last row's id (dynamic paging: the first row of an arbitrary page cannot be located, and there is no guarantee that 20 rows leave 10 after filtering).
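Not from the thread: the usual way out is to express the post-filter in the SQL itself, so the LIMIT page and the total count both see the same already-filtered set and Python never loops over discarded rows. A hedged sketch; the `order_blacklist` table and the `has_pay = 1` stand-in for the real `{0}` filters are purely illustrative:

```sql
SELECT SQL_CALC_FOUND_ROWS
       op.order_id, opc.order_code, opc.retail
FROM order_product_customize AS opc
LEFT JOIN order_product AS op
       ON opc.order_product_id = op.order_product_id
WHERE opc.has_pay = 1                           -- stand-in for the {0} filters
  AND NOT EXISTS (SELECT 1                      -- the former Python-side check
                  FROM order_blacklist b
                  WHERE b.order_id = op.order_id)
ORDER BY opc.created_at DESC
LIMIT 0, 10;                                    -- page 1, 10 rows per page

SELECT FOUND_ROWS() AS total_num;               -- total for the pager
```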

MySQL: query the first 25% of a table's data, not the first 25% of the row count

MySQL: query the first 25% of a table's data, not the first 25% of the row count. If any expert can write this, please help; much appreciated.
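A hedged sketch, assuming a table `t` ordered by its primary key `id`: LIMIT does not accept an expression, so compute the row budget first and feed it in through a prepared statement. (On MySQL 8.0+ a window function such as PERCENT_RANK() is an alternative.)

```sql
SET @n := (SELECT COUNT(*) DIV 4 FROM t);   -- 25% of the rows, rounded down

PREPARE stmt FROM 'SELECT * FROM t ORDER BY id LIMIT ?';
EXECUTE stmt USING @n;
DEALLOCATE PREPARE stmt;
```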

Importing 100k rows of Excel data into MySQL

Every quarter I need to import Excel data into MySQL; the file has roughly 200k-300k rows. The import requirements are as follows: ![图片说明](https://img-ask.csdn.net/upload/201709/13/1505307353_807226.png) The Excel content has to go into two tables: user (id (auto-increment primary key), name, phone, address) and a work-order table (order number, model, category, user_id). But while loading the work-order table, every row needs a lookup of user_id, which takes far too long. Does anyone have a better way to do this?
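Not from the thread: a common pattern is to bulk-load everything into one staging table first (for example with LOAD DATA INFILE), then run two set-based INSERT ... SELECT statements so there is no per-row lookup. A hedged sketch; all table and column names are illustrative, and phone is assumed to identify a user uniquely:

```sql
CREATE TABLE staging (
  name     VARCHAR(64),
  phone    VARCHAR(20),
  address  VARCHAR(255),
  order_no VARCHAR(32),
  model    VARCHAR(64),
  category VARCHAR(64)
);
-- all Excel rows go into `staging` in one bulk load, then:

-- 1) insert the distinct users once
INSERT INTO user (name, phone, address)
SELECT DISTINCT name, phone, address
FROM staging;

-- 2) resolve user_id with one join instead of 200k single lookups
INSERT INTO work_order (order_no, model, category, user_id)
SELECT s.order_no, s.model, s.category, u.id
FROM staging s
JOIN user u ON u.phone = s.phone;
```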

mysql / php - determining which table data came from

I need to determine what table data is from for a news feed. The feed must say something like "Person has uploaded a video" or "Person has updated their bio". Therefore I need to determine where data came from, as different types of data are in different tables, obviously. I am hoping you can do this with SQL, but probably not, so PHP is the option. I have no idea how to do this so just need pointing in the right direction.

I'll briefly describe the database as I don't have time to make a diagram.

1. There is a table titled members with all basic info such as email, password and ID. The ID is the primary key.
2. All other tables have foreign keys for the ID linking to the ID in the members table.
3. Other tables include: tracks, status, pics, videos. All pretty self explanatory from there.

I need to determine somehow what table the updated data comes from so I can then tell the user what so and so has done. Preferably I would want only one SQL statement for the whole feed, so all the tables are joined and ordered by timestamp, making everything much simpler for me. Hopefully I can do both, but as I said, really not sure.

A basic outline of the statement (the real one will be longer; this is simplified):

```
SELECT N.article, N.ID, A.ID, A.name, a.url, N.timestamp
FROM news N
LEFT JOIN artists A ON N.ID = A.ID
WHERE N.ID = A.ID
ORDER BY N.timestamp DESC
LIMIT 10
```

Members table:

```
CREATE TABLE `members` (
  `ID` int(111) NOT NULL AUTO_INCREMENT,
  `email` varchar(100) COLLATE latin1_general_ci NOT NULL,
  `password` varchar(100) COLLATE latin1_general_ci NOT NULL,
  `FNAME` varchar(100) COLLATE latin1_general_ci NOT NULL,
  `SURNAME` varchar(100) COLLATE latin1_general_ci NOT NULL,
  `timestamp` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00' ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`ID`),
  UNIQUE KEY `email` (`email`)
) ENGINE=InnoDB AUTO_INCREMENT=5 DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci
```

Tracks table (all other tables are pretty much the same):

```
CREATE TABLE `tracks` (
  `ID` int(11) NOT NULL,
  `url` varchar(200) COLLATE latin1_general_ci NOT NULL,
  `name` varchar(100) COLLATE latin1_general_ci NOT NULL,
  `timestamp` timestamp NOT NULL DEFAULT '0000-00-00 00:00:00' ON UPDATE CURRENT_TIMESTAMP,
  `track_ID` int(11) NOT NULL AUTO_INCREMENT,
  PRIMARY KEY (`track_ID`),
  UNIQUE KEY `url` (`url`),
  UNIQUE KEY `track_ID` (`track_ID`),
  KEY `ID` (`ID`),
  CONSTRAINT `tracks_ibfk_1` FOREIGN KEY (`ID`) REFERENCES `members` (`ID`)
) ENGINE=InnoDB AUTO_INCREMENT=4 DEFAULT CHARSET=latin1 COLLATE=latin1_general_ci
```

Before, I tried using a MySQL query for each table, putting everything into an array and echoing it out. This seemed long and tiresome and I had no luck with it. I have now deleted all that code, as it was a week or so ago.

Please do not feel you have to go into depth with this, just point me in the right direction.

ADDITION:

Here is the SQL I have written for a trigger that was suggested. Not sure what is wrong, as I have never used triggers before. When inserting something into tracks, this error comes up:

```
#1054 - Unknown column 'test' in 'field list'
```

The values in the query are just for testing at the moment:

```
delimiter $$
CREATE TRIGGER tracks_event AFTER INSERT ON tracks
FOR EACH ROW
BEGIN
  INSERT into events(ID, action) VALUES (3, test);
END$$
delimiter ;
```

UPDATE!

I have now created a table called events, as suggested, and used triggers to update it AFTER an insert in one of several tables.

Here is the query I have tried, but it is wrong. The query needs to get the info referenced in the events table from all the other tables and order by timestamp:

```
SELECT T.url, E.ID, T.ID, E.action, T.name, T.timestamp
FROM tracks T
LEFT JOIN events E ON T.ID = E.ID
WHERE T.ID = E.ID
ORDER BY T.timestamp DESC
```

In that query I have only included the events and tracks tables for simplicity, but the problem is still there. There will be many more tables, so the problem will worsen.

It's hard to describe the problem, but basically because there is an ID in every table and one ID can do several actions, the action can be shown with the wrong outcome, in this case url.

I will explain what's in the events table and the tracks table and give the outcome to further explain.

In the events table:

```
4 has uploaded a track.
3 has some news.
4 has become an NBS artist.
```

In the tracks table:

```
2 uploads/abc.wav Cannonballs & Stones 2012-08-20 23:59:59 1
3 uploads/19c9aa51c821952c81be46ca9b2e9056.mp3 test 2012-08-31 23:59:59 2
4 uploads/2b412dd197d464fedcecb1e244e18faf.mp3 testing 2012-08-31 00:32:56 3
4 111 111111 0000-00-00 00:00:00 111111
```

Outcome of the query:

```
uploads/19c9aa51c821952c81be46ca9b2e9056.mp3 3 3 has some news. test 2012-08-31 23:59:59
uploads/2b412dd197d464fedcecb1e244e18faf.mp3 4 4 has uploaded a track. testing 2012-08-31 00:32:56
uploads/2b412dd197d464fedcecb1e244e18faf.mp3 4 4 has become an NBS artist. testing 2012-08-31 00:32:56
111 4 4 has become an NBS artist. 111111 0000-00-00 00:00:00
111 4 4 has uploaded a track. 111111 0000-00-00 00:00:00
```

As you can see, the query gives unwanted results. The action for each ID is attached to every url, so a url can be shown more than once and with the wrong action. Because only the tracks table is in that query, the only action I would want showing is 'has uploaded a track.'
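Not from the thread: the fan-out above happens because events joins tracks only on the member ID, so every event of a member pairs with every track of that member. The usual fix is to record on each event which specific row it refers to, then join each source table only to its own events. A hedged sketch, assuming events is extended with hypothetical item_id, type and created_at columns (and a videos table shaped like tracks):

```sql
-- hypothetical shape: events(ID, item_id, type, action, created_at)
SELECT e.ID, e.action, t.url, t.name, e.created_at
FROM events e
JOIN tracks t ON e.type = 'track' AND t.track_ID = e.item_id
UNION ALL
SELECT e.ID, e.action, v.url, v.name, e.created_at
FROM events e
JOIN videos v ON e.type = 'video' AND v.video_ID = e.item_id
ORDER BY created_at DESC   -- applies to the whole UNION
LIMIT 10;
```

(The #1054 error in the ADDITION, incidentally, is just the unquoted string: VALUES (3, 'test') would insert the literal.)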

MySql - avoiding duplicate records on update

I have the following update query, which runs when a user logs in to their account having placed items in their cart prior to logging in (a temporary account set up on the table):

```
UPDATE cart_items SET account_id=$account WHERE account_id=$cookieId;
```

This occasionally creates duplicate results looking something like this:

```
id | account_id | itemNumber | itemQuantity
------------------------------------------
20 | 10         | 6          | 2
25 | 10         | 6          | 1
```

What I would like to do is write a query which avoids creating these duplicate records and just leaves a single record like this:

```
id | account_id | itemNumber | itemQuantity
------------------------------------------
20 | 10         | 6          | 3
```

I think using `DUPLICATE KEY UPDATE` might be what I'm looking for, but I can't get my head around it. Can anyone help me out please?
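Not from the thread, just a hedged sketch of the ON DUPLICATE KEY UPDATE route: it needs a unique key over (account_id, itemNumber); the temporary rows are then merged in with INSERT ... SELECT and deleted afterwards. The literal 10 / 999 values stand in for $account / $cookieId:

```sql
ALTER TABLE cart_items
  ADD UNIQUE KEY uk_account_item (account_id, itemNumber);

INSERT INTO cart_items (account_id, itemNumber, itemQuantity)
SELECT 10, itemNumber, itemQuantity     -- $account = 10
FROM cart_items
WHERE account_id = 999                  -- $cookieId = 999
ON DUPLICATE KEY UPDATE
  itemQuantity = itemQuantity + VALUES(itemQuantity);

DELETE FROM cart_items WHERE account_id = 999;  -- drop the temporary rows
```

Adding the unique key will fail while existing duplicates remain, so those need collapsing once beforehand.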

MySQL 8.0.11 fails to install; cmd reports: the service did not respond to the control function

Basic settings in my.ini:

```
[mysql]
# default character set for the mysql client
default-character-set=utf8

[mysqld]
# port 3306
port=3306
# MySQL installation directory
basedir=C:\web\mysql-8.0.11
# data directory; MySQL 8+ does not need this setting, the server generates it
# itself, and configuring it may cause errors
# datadir=C:\web\sqldata
# maximum number of connections
max_connections=20
# server-side character set (the default is the 8-bit latin1)
character-set-server=utf8
# default storage engine for new tables
default-storage-engine=INNODB
```

Next, start the MySQL server. Open the cmd command-line tool as administrator and switch directories:

```
cd C:\web\mysql-8.0.11\bin
```

Initialize the database:

```
mysqld --initialize --console
```

When it finishes, it prints the initial default password for root, e.g.:

```
... 2018-04-20T02:35:05.464644Z 5 [Note] [MY-010454] [Server] A temporary password is generated for root@localhost: APWCY5ws&hjQ ...
```

APWCY5ws&hjQ is the initial password, needed for the first login; it can be changed after logging in. Then install the service:

```
mysqld install
```

And start it:

```
net start mysql
```

But `net start mysql` failed: ![图片说明](https://img-ask.csdn.net/upload/201810/29/1540817565_776189.png)

Spring framework beginner: Tomcat starts successfully, but a context-loading exception appears?

This is a small exercise building a login page; it looks roughly like this: ![图片说明](https://img-ask.csdn.net/upload/201907/20/1563604903_663111.png) **The error appears immediately after "Server startup"**, before I even get a chance to visit index.jsp! Any pointers appreciated, thanks. web.xml lives under the WEB-INF folder ![图片说明](https://img-ask.csdn.net/upload/201907/20/1563607667_370819.png)
## Error message:
```
七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Server version: Apache Tomcat/8.5.42 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Server built: Jun 4 2019 20:29:04 UTC 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Server number: 8.5.42.0 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: OS Name: Windows 10 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: OS Version: 10.0 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Architecture: amd64 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Java Home: C:\Program Files\Java\jdk1.8.0_201\jre 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: JVM Version: 1.8.0_201-b09 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: JVM Vendor: Oracle Corporation 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: CATALINA_BASE: D:\eclipse-workspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp0 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: CATALINA_HOME: D:\apache-tomcat-8.5.42 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Command line argument: -Dcatalina.base=D:\eclipse-workspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp0 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Command line argument: -Dcatalina.home=D:\apache-tomcat-8.5.42 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Command line argument: -Dwtp.deploy=D:\eclipse-workspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp0\wtpwebapps 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Command line argument: -Djava.endorsed.dirs=D:\apache-tomcat-8.5.42\endorsed 七月 20, 2019 3:20:35 下午 org.apache.catalina.startup.VersionLoggerListener log 信息: Command line argument: -Dfile.encoding=UTF-8 七月 20, 2019 3:20:35 下午 org.apache.catalina.core.AprLifecycleListener lifecycleEvent 信息: Loaded APR based Apache Tomcat Native library [1.2.21] using APR version [1.6.5]. 七月 20, 2019 3:20:35 下午 org.apache.catalina.core.AprLifecycleListener lifecycleEvent 信息: APR capabilities: IPv6 [true], sendfile [true], accept filters [false], random [true]. 
七月 20, 2019 3:20:35 下午 org.apache.catalina.core.AprLifecycleListener lifecycleEvent 信息: APR/OpenSSL configuration: useAprConnector [false], useOpenSSL [true] 七月 20, 2019 3:20:35 下午 org.apache.catalina.core.AprLifecycleListener initializeSSL 信息: OpenSSL successfully initialized [OpenSSL 1.1.1a 20 Nov 2018] 七月 20, 2019 3:20:35 下午 org.apache.coyote.AbstractProtocol init 信息: Initializing ProtocolHandler ["http-nio-8080"] 七月 20, 2019 3:20:36 下午 org.apache.tomcat.util.net.NioSelectorPool getSharedSelector 信息: Using a shared selector for servlet write/read 七月 20, 2019 3:20:36 下午 org.apache.coyote.AbstractProtocol init 信息: Initializing ProtocolHandler ["ajp-nio-8009"] 七月 20, 2019 3:20:36 下午 org.apache.tomcat.util.net.NioSelectorPool getSharedSelector 信息: Using a shared selector for servlet write/read 七月 20, 2019 3:20:36 下午 org.apache.catalina.startup.Catalina load 信息: Initialization processed in 1265 ms 七月 20, 2019 3:20:36 下午 org.apache.catalina.core.StandardService startInternal 信息: Starting service [Catalina] 七月 20, 2019 3:20:36 下午 org.apache.catalina.core.StandardEngine startInternal 信息: Starting Servlet Engine: Apache Tomcat/8.5.42 七月 20, 2019 3:20:39 下午 org.apache.catalina.core.ApplicationContext log 信息: No Spring WebApplicationInitializer types detected on classpath 七月 20, 2019 3:20:39 下午 org.apache.jasper.servlet.TldScanner scanJars 信息: At least one JAR was scanned for TLDs yet contained no TLDs. Enable debug logging for this logger for a complete list of JARs that were scanned but no TLDs were found in them. Skipping unneeded JARs during scanning can improve startup time and JSP compilation time. 七月 20, 2019 3:20:39 下午 org.apache.catalina.core.ApplicationContext log 信息: Initializing Spring root WebApplicationContext log4j:WARN No appenders could be found for logger (org.springframework.web.context.ContextLoader). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. 七月 20, 2019 3:20:42 下午 org.apache.catalina.core.StandardContext listenerStart 严重: Exception sending context initialized event to listener instance of class [org.springframework.web.context.ContextLoaderListener] org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'usersService' defined in class path resource [applicationContext.xml]: Error setting property values; nested exception is org.springframework.beans.NotWritablePropertyException: Invalid property 'usersMapper' of bean class [yan.ibbie.service.impl.UsersServiceImpl]: Bean property 'usersMapper' is not writable or has an invalid setter method. Does the parameter type of the setter match the return type of the getter? 
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1718) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1433) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:592) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:845) at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549) at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:400) at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:291) at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:103) at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4770) at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5236) at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1423) at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1413) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Caused by: org.springframework.beans.NotWritablePropertyException: Invalid property 'usersMapper' of bean class [yan.ibbie.service.impl.UsersServiceImpl]: Bean property 'usersMapper' is not writable or has an invalid setter method. Does the parameter type of the setter match the return type of the getter? at org.springframework.beans.BeanWrapperImpl.createNotWritablePropertyException(BeanWrapperImpl.java:243) at org.springframework.beans.AbstractNestablePropertyAccessor.processLocalProperty(AbstractNestablePropertyAccessor.java:426) at org.springframework.beans.AbstractNestablePropertyAccessor.setPropertyValue(AbstractNestablePropertyAccessor.java:278) at org.springframework.beans.AbstractNestablePropertyAccessor.setPropertyValue(AbstractNestablePropertyAccessor.java:266) at org.springframework.beans.AbstractPropertyAccessor.setPropertyValues(AbstractPropertyAccessor.java:97) at org.springframework.beans.AbstractPropertyAccessor.setPropertyValues(AbstractPropertyAccessor.java:77) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1714) ... 
22 more 七月 20, 2019 3:20:42 下午 org.apache.catalina.core.StandardContext startInternal 严重: One or more listeners failed to start. Full details will be found in the appropriate container log file 七月 20, 2019 3:20:42 下午 org.apache.catalina.core.StandardContext startInternal 严重: Context [/SpringLogin] startup failed due to previous errors 七月 20, 2019 3:20:42 下午 org.apache.catalina.core.ApplicationContext log 信息: Closing Spring root WebApplicationContext 七月 20, 2019 3:20:42 下午 org.apache.catalina.loader.WebappClassLoaderBase clearReferencesJdbc 警告: The web application [SpringLogin] registered the JDBC driver [com.mysql.cj.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered. 七月 20, 2019 3:20:42 下午 org.apache.catalina.loader.WebappClassLoaderBase clearReferencesThreads 警告: The web application [SpringLogin] appears to have started a thread named [mysql-cj-abandoned-connection-cleanup] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.lang.Object.wait(Native Method) java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:144) com.mysql.cj.jdbc.AbandonedConnectionCleanupThread.run(AbandonedConnectionCleanupThread.java:85) java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) java.lang.Thread.run(Thread.java:748) 七月 20, 2019 3:20:42 下午 org.apache.coyote.AbstractProtocol start 信息: Starting ProtocolHandler ["http-nio-8080"] 七月 20, 2019 3:20:42 下午 org.apache.coyote.AbstractProtocol start 信息: Starting ProtocolHandler ["ajp-nio-8009"] 七月 20, 2019 3:20:42 下午 org.apache.catalina.startup.Catalina start 信息: Server startup in 5949 ms 七月 20, 2019 3:20:45 下午 org.apache.catalina.loader.WebappClassLoaderBase checkStateForResourceLoading 信息: Illegal access: this web application instance has been stopped already. Could not load []. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access. java.lang.IllegalStateException: Illegal access: this web application instance has been stopped already. Could not load []. The following stack trace is thrown for debugging purposes as well as to attempt to terminate the thread which caused the illegal access. 
at org.apache.catalina.loader.WebappClassLoaderBase.checkStateForResourceLoading(WebappClassLoaderBase.java:1384) at org.apache.catalina.loader.WebappClassLoaderBase.getResource(WebappClassLoaderBase.java:1034) at com.mysql.cj.jdbc.AbandonedConnectionCleanupThread.checkThreadContextClassLoader(AbandonedConnectionCleanupThread.java:117) at com.mysql.cj.jdbc.AbandonedConnectionCleanupThread.run(AbandonedConnectionCleanupThread.java:84) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748)
```
## The web.xml configuration:
```
<?xml version="1.0" encoding="UTF-8"?>
<web-app version="3.1" xmlns="http://xmlns.jcp.org/xml/ns/javaee"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd">
    <!-- Set the path of the Spring configuration file -->
    <!-- When Tomcat loads web.xml, it stores the Spring configuration info in the application object -->
    <!-- WebApplicationContext is a sub-interface of ApplicationContext -->
    <context-param>
        <param-name>contextConfigLocation</param-name>
        <param-value>classpath:applicationContext.xml</param-value>
    </context-param>
    <!-- Load the Spring configuration file -->
    <listener>
        <listener-class>org.springframework.web.context.ContextLoaderListener</listener-class>
    </listener>
</web-app>
```
## The Spring configuration in applicationContext.xml:
```
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://www.springframework.org/schema/beans https://www.springframework.org/schema/beans/spring-beans.xsd">
    <!-- Data source wrapper class -->
    <bean id="datasource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
        <property name="driverClassName" value="com.mysql.cj.jdbc.Driver"/>
        <property name="url" value="jdbc:mysql://localhost:3306/ssm?serverTimezone=GMT%2B8"/>
        <property name="username" value="root"/>
        <property name="password" value="147852369"/>
    </bean>
    <!-- SqlSessionFactory -->
    <bean id="factory" class="org.mybatis.spring.SqlSessionFactoryBean">
        <property name="dataSource" ref="datasource"/>
    </bean>
    <!-- Mapper scanner -->
    <bean class="org.mybatis.spring.mapper.MapperScannerConfigurer">
        <property name="sqlSessionFactory" ref="factory"/>
        <property name="basePackage" value="yan.ibbie.mapper"/>
    </bean>
    <!-- Create the implementation of UsersService; usersMapper is a bean Spring creates automatically -->
    <bean id="usersService" class="yan.ibbie.service.impl.UsersServiceImpl">
        <property name="usersMapper" ref="usersMapper"/>
    </bean>
</beans>
```
## The index.jsp page:
```
<%@ page language="java" contentType="text/html; charset=UTF-8" pageEncoding="UTF-8"%>
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Insert title here</title>
<script type="text/javascript" src="/js/jquery-3.4.1.js"></script>
<script type="text/javascript">
    $(function(){
        $("a").click(function(){
            $("img").attr("src","validcode?date="+new Date());
            return false;
        })
    })
</script>
</head>
<body>
${error }
<form action="Login" method="post">
    用户名:<input type="text" name="username" /><br>
    密码:<input type="password" name="password" /><br>
    验证码:<input type="text" size="1" name="code"/><img src="validcode" width="80" height="40"/><a href="">看不清</a><br>
    <input type="submit" value="登陆" /><input type="reset" value="重置" />
</form>
</body>
</html>
```
## The failing line is the one printing all the "aaaaa"s in LoginServlet; LoginServlet is as follows:
```
package yan.ibbie.servlet;

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

import org.springframework.context.ApplicationContext;
import org.springframework.web.context.support.WebApplicationContextUtils;

import yan.ibbie.pojo.Users;
import yan.ibbie.service.UsersService;
import yan.ibbie.service.impl.UsersServiceImpl;

/**
 * Servlet implementation class LoginServlet
 */
@WebServlet("/Login")
public class LoginServlet extends HttpServlet {
    private static final long serialVersionUID = 1L;
    private UsersService usersService;

    @Override
    public void init() throws ServletException {
        System.out.println("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa" + getServletContext());
        ApplicationContext ac = WebApplicationContextUtils.getRequiredWebApplicationContext(getServletContext());
        usersService = ac.getBean("usersService", UsersServiceImpl.class);
    }

    @Override
    protected void service(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
        String code = req.getParameter("code");
        HttpSession session = req.getSession();
        String codeSession = session.getAttribute("validcode").toString();
        if (codeSession.equals(code)) {
            Users users = new Users();
            String username = req.getParameter("username");
            String password = req.getParameter("password");
            users.setUsername(username);
            users.setPasserword(password);
            Users user = usersService.login(users);
            if (user != null) {
                resp.sendRedirect("main.jsp");
            } else {
                req.setAttribute("error", "用户名和密码不正确");
                req.getRequestDispatcher("index.jsp").forward(req, resp);
            }
        } else {
            req.setAttribute("error", "验证码不正确");
            req.getRequestDispatcher("index.jsp").forward(req, resp);
        }
    }
}
```

SQL question: how do I take only the 2nd and 3rd rows of a query result?

```
SELECT date_format(createTime,'%Y-%m-%d') createTime
FROM t_zx_sqzx
WHERE communityId='8'
GROUP BY date_format(createTime,'%Y-%m-%d') DESC
```
![图片说明](https://img-ask.csdn.net/upload/201509/10/1441849698_218829.jpg) This is the data read from the database. How do I take the 2nd and 3rd rows, or any other specific rows?
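A minimal sketch: rows 2 and 3 are offset 1, count 2, so append LIMIT 1, 2 to the same query. (Note that `GROUP BY ... DESC` was removed in MySQL 8.0; there you would write `GROUP BY expr ORDER BY expr DESC` instead.)

```sql
SELECT date_format(createTime, '%Y-%m-%d') AS createTime
FROM t_zx_sqzx
WHERE communityId = '8'
GROUP BY date_format(createTime, '%Y-%m-%d') DESC
LIMIT 1, 2;   -- skip 1 row, return the next 2 (i.e. rows 2 and 3)
```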

MySQL query returns different results with LIMIT 999999 versus without LIMIT; how is LIMIT supposed to be used?

Requirement: query the battle-record leaderboard for users in rooms with type 2, ordered primarily by killNum descending and secondarily by scoreNum descending; each user may appear only once, keeping their best record. In the same environment, with identical data, the query returns different results; I have been stuck on this for a long time. The answer data now comes back correct, but I don't feel confident about it and the SQL probably still needs optimizing. I hope the CSDN experts can take a look when they have time.

With LIMIT, correct: ![图片说明](https://img-ask.csdn.net/upload/201811/27/1543319092_471669.png)
Without LIMIT, incorrect: ![图片说明](https://img-ask.csdn.net/upload/201811/27/1543319161_377684.png)

The table structures are as follows:

User table: CREATE TABLE `ss_account` ( `id` int(11) NOT NULL AUTO_INCREMENT, `name` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL DEFAULT '', PRIMARY KEY (`id`) USING BTREE ) ENGINE = InnoDB AUTO_INCREMENT = 125 CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;

Room id table: CREATE TABLE `ss_room_inc_id` ( `id` int(11) NOT NULL AUTO_INCREMENT , `type` tinyint(1) NOT NULL , `create_time` int(10) NOT NULL DEFAULT 0 , PRIMARY KEY (`id`) USING BTREE ) ENGINE = InnoDB AUTO_INCREMENT = 2065 CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic;

Room battle-record table: CREATE TABLE `ss_game_record` ( `id` int(11) NOT NULL AUTO_INCREMENT , `account_id` int(11) NOT NULL, `room_id` int(11) NOT NULL , `score_num` int(11) NOT NULL, `kill_num` int(11) NOT NULL, PRIMARY KEY (`id`) USING BTREE, ) ENGINE = InnoDB AUTO_INCREMENT = 156 CHARACTER SET = utf8 COLLATE = utf8_general_ci ROW_FORMAT = Dynamic; SET FOREIGN_KEY_CHECKS = 1;

Table data: INSERT INTO `ss_account` VALUES (120, 'Dean'); INSERT INTO `ss_account` VALUES (121, ''); INSERT INTO `ss_account` VALUES (122, '丑娃儿'); INSERT INTO `ss_account` VALUES (123, '一万年'); INSERT INTO `ss_account` VALUES (124, '单打独斗'); INSERT INTO `ss_game_record` VALUES (2, 120, 1905, 120, 2); INSERT INTO `ss_game_record` VALUES (3, 121, 1906, 80, 1); INSERT INTO `ss_game_record` VALUES (4, 122, 1907, 70, 5); INSERT INTO `ss_game_record` VALUES (5, 122, 1908, 80, 5); INSERT INTO `ss_game_record` VALUES (6, 120, 1909, 100, 0); INSERT INTO `ss_game_record` VALUES (7, 120, 1910, 70, 0); INSERT INTO `ss_game_record` VALUES (8, 120, 1911, 45, 1); INSERT INTO `ss_game_record` VALUES (9, 120, 1912, 195, 1); INSERT INTO `ss_game_record` VALUES (10, 122, 1913, 110, 4); INSERT INTO `ss_game_record` VALUES (11, 120, 1914, 75, 1); INSERT INTO `ss_game_record` VALUES (12, 120, 1915, 105, 0); INSERT INTO `ss_game_record` VALUES (13, 120, 1916, 140, 1); INSERT INTO `ss_game_record` VALUES (14, 120, 1917, 180, 1); INSERT INTO `ss_game_record` VALUES (15, 120, 1918, 495, 0); INSERT INTO `ss_game_record` VALUES (16, 120, 1919, 170, 1); INSERT INTO `ss_game_record` VALUES (17, 120, 1920, 205, 0); INSERT INTO `ss_game_record` VALUES (18, 120, 1921, 435, 0); INSERT INTO `ss_game_record` VALUES (19, 120, 1922, 95, 1); INSERT INTO `ss_game_record` VALUES (20, 120, 1923, 105, 1); INSERT INTO `ss_game_record` VALUES (21, 122, 1924, 230, 0); INSERT INTO `ss_game_record` VALUES (22, 122, 1925, 145, 0); INSERT INTO `ss_game_record` VALUES (23, 122, 1926, 55, 0); INSERT INTO `ss_game_record` VALUES (24, 122, 1927, 325, 0); INSERT INTO `ss_game_record` VALUES (25, 122, 1928, 235, 0); INSERT INTO `ss_game_record` VALUES (26, 122, 1930, 45, 0); INSERT INTO `ss_game_record` VALUES (27, 123, 1929, 70, 0); INSERT INTO `ss_game_record` VALUES (28, 122, 1931, 25, 0); INSERT INTO `ss_game_record` VALUES (29, 122, 1932, 25, 0); INSERT INTO `ss_game_record` VALUES (30, 122, 1933, 35, 0); INSERT INTO `ss_game_record` VALUES (31, 122, 1934, 25, 0); INSERT INTO `ss_game_record` VALUES (32, 122, 1935, 25, 0); INSERT INTO `ss_game_record` VALUES (33, 122, 1936, 55, 0); INSERT INTO `ss_game_record` VALUES (34, 122, 1937, 25, 0); INSERT INTO `ss_game_record` VALUES (35, 122, 1938,
115, 0); INSERT INTO `ss_game_record` VALUES (36, 122, 1939, 135, 0); INSERT INTO `ss_game_record` VALUES (37, 122, 1940, 105, 0); INSERT INTO `ss_game_record` VALUES (38, 122, 1941, 95, 0); INSERT INTO `ss_game_record` VALUES (39, 120, 1942, 380, 0); INSERT INTO `ss_game_record` VALUES (40, 122, 1943, 45, 0); INSERT INTO `ss_game_record` VALUES (41, 120, 1945, 2640, 3); INSERT INTO `ss_game_record` VALUES (42, 122, 1944, 35, 0); INSERT INTO `ss_game_record` VALUES (43, 122, 1946, 25, 0); INSERT INTO `ss_game_record` VALUES (44, 122, 1947, 35, 0); INSERT INTO `ss_game_record` VALUES (45, 122, 1948, 30, 0); INSERT INTO `ss_game_record` VALUES (46, 122, 1950, 45, 0); INSERT INTO `ss_game_record` VALUES (47, 122, 1951, 45, 0); INSERT INTO `ss_game_record` VALUES (48, 122, 1952, 35, 0); INSERT INTO `ss_game_record` VALUES (49, 122, 1954, 35, 0); INSERT INTO `ss_game_record` VALUES (50, 122, 1953, 100, 0); INSERT INTO `ss_game_record` VALUES (51, 122, 1956, 100, 0); INSERT INTO `ss_game_record` VALUES (52, 122, 1957, 25, 0); INSERT INTO `ss_game_record` VALUES (53, 122, 1958, 80, 0); INSERT INTO `ss_game_record` VALUES (54, 122, 1955, 30, 0); INSERT INTO `ss_game_record` VALUES (55, 122, 1959, 85, 0); INSERT INTO `ss_game_record` VALUES (56, 122, 1960, 45, 0); INSERT INTO `ss_game_record` VALUES (57, 122, 1961, 45, 0); INSERT INTO `ss_game_record` VALUES (58, 122, 1962, 30, 0); INSERT INTO `ss_game_record` VALUES (59, 122, 1963, 65, 0); INSERT INTO `ss_game_record` VALUES (60, 122, 1964, 75, 0); INSERT INTO `ss_game_record` VALUES (61, 122, 1967, 25, 0); INSERT INTO `ss_game_record` VALUES (62, 122, 1966, 110, 0); INSERT INTO `ss_game_record` VALUES (63, 122, 1968, 25, 0); INSERT INTO `ss_game_record` VALUES (64, 122, 1969, 55, 0); INSERT INTO `ss_game_record` VALUES (65, 122, 1970, 25, 0); INSERT INTO `ss_game_record` VALUES (66, 122, 1974, 45, 0); INSERT INTO `ss_game_record` VALUES (67, 122, 1973, 80, 0); INSERT INTO `ss_game_record` VALUES (68, 122, 1972, 90, 0); INSERT INTO `ss_game_record` VALUES (69, 122, 1971, 110, 0); INSERT INTO `ss_game_record` VALUES (70, 122, 1975, 25, 0); INSERT INTO `ss_game_record` VALUES (71, 122, 1977, 25, 0); INSERT INTO `ss_game_record` VALUES (72, 122, 1976, 100, 0); INSERT INTO `ss_game_record` VALUES (73, 122, 1978, 35, 0); INSERT INTO `ss_game_record` VALUES (74, 122, 1979, 45, 0); INSERT INTO `ss_game_record` VALUES (75, 122, 1980, 40, 0); INSERT INTO `ss_game_record` VALUES (76, 122, 1983, 25, 0); INSERT INTO `ss_game_record` VALUES (77, 122, 1984, 25, 0); INSERT INTO `ss_game_record` VALUES (78, 122, 1981, 180, 0); INSERT INTO `ss_game_record` VALUES (79, 122, 1985, 25, 0); INSERT INTO `ss_game_record` VALUES (80, 122, 1982, 50, 0); INSERT INTO `ss_game_record` VALUES (81, 122, 1989, 25, 0); INSERT INTO `ss_game_record` VALUES (82, 122, 1986, 85, 0); INSERT INTO `ss_game_record` VALUES (83, 122, 1990, 80, 0); INSERT INTO `ss_game_record` VALUES (84, 122, 1987, 165, 0); INSERT INTO `ss_game_record` VALUES (85, 122, 1988, 45, 0); INSERT INTO `ss_game_record` VALUES (86, 122, 1991, 65, 0); INSERT INTO `ss_game_record` VALUES (87, 122, 1992, 40, 0); INSERT INTO `ss_game_record` VALUES (88, 122, 1993, 80, 0); INSERT INTO `ss_game_record` VALUES (89, 122, 1994, 650, 0); INSERT INTO `ss_game_record` VALUES (90, 122, 1995, 140, 0); INSERT INTO `ss_game_record` VALUES (91, 122, 1997, 190, 0); INSERT INTO `ss_game_record` VALUES (92, 120, 2001, 0, 0); INSERT INTO `ss_game_record` VALUES (93, 122, 1999, 75, 0); INSERT INTO `ss_game_record` VALUES (94, 120, 2002, 
585, 0); INSERT INTO `ss_game_record` VALUES (95, 120, 2005, 115, 0); INSERT INTO `ss_game_record` VALUES (96, 122, 2003, 125, 0); INSERT INTO `ss_game_record` VALUES (97, 122, 2004, 55, 0); INSERT INTO `ss_game_record` VALUES (98, 122, 2000, 35, 0); INSERT INTO `ss_game_record` VALUES (99, 122, 2006, 90, 0); INSERT INTO `ss_game_record` VALUES (100, 120, 2008, 880, 2); INSERT INTO `ss_game_record` VALUES (101, 120, 2011, 380, 0); INSERT INTO `ss_game_record` VALUES (102, 122, 2009, 180, 0); INSERT INTO `ss_game_record` VALUES (103, 120, 2012, 730, 1); INSERT INTO `ss_game_record` VALUES (104, 120, 2013, 0, 0); INSERT INTO `ss_game_record` VALUES (105, 122, 2007, 310, 0); INSERT INTO `ss_game_record` VALUES (106, 120, 2014, 635, 0); INSERT INTO `ss_game_record` VALUES (107, 122, 2015, 55, 0); INSERT INTO `ss_game_record` VALUES (108, 122, 2016, 75, 0); INSERT INTO `ss_game_record` VALUES (109, 122, 2017, 240, 0); INSERT INTO `ss_game_record` VALUES (110, 122, 2018, 25, 0); INSERT INTO `ss_game_record` VALUES (111, 122, 2019, 140, 0); INSERT INTO `ss_game_record` VALUES (112, 122, 2020, 35, 0); INSERT INTO `ss_game_record` VALUES (113, 122, 2021, 25, 0); INSERT INTO `ss_game_record` VALUES (114, 122, 2022, 40, 0); INSERT INTO `ss_game_record` VALUES (115, 122, 2023, 25, 0); INSERT INTO `ss_game_record` VALUES (116, 122, 2024, 35, 0); INSERT INTO `ss_game_record` VALUES (117, 122, 2025, 25, 0); INSERT INTO `ss_game_record` VALUES (118, 122, 2026, 60, 0); INSERT INTO `ss_game_record` VALUES (119, 122, 2027, 55, 0); INSERT INTO `ss_game_record` VALUES (120, 122, 2028, 25, 0); INSERT INTO `ss_game_record` VALUES (121, 122, 2029, 25, 0); INSERT INTO `ss_game_record` VALUES (122, 122, 2030, 25, 0); INSERT INTO `ss_game_record` VALUES (123, 122, 2031, 105, 0); INSERT INTO `ss_game_record` VALUES (124, 122, 2032, 50, 0); INSERT INTO `ss_game_record` VALUES (125, 122, 2033, 25, 0); INSERT INTO `ss_game_record` VALUES (126, 122, 2034, 25, 0); INSERT INTO `ss_game_record` VALUES (127, 122, 2035, 25, 0); INSERT INTO `ss_game_record` VALUES (128, 122, 2036, 45, 0); INSERT INTO `ss_game_record` VALUES (129, 122, 2037, 25, 0); INSERT INTO `ss_game_record` VALUES (130, 122, 2038, 425, 0); INSERT INTO `ss_game_record` VALUES (131, 122, 2039, 50, 0); INSERT INTO `ss_game_record` VALUES (132, 122, 2040, 65, 0); INSERT INTO `ss_game_record` VALUES (133, 122, 2041, 195, 0); INSERT INTO `ss_game_record` VALUES (134, 122, 2042, 250, 0); INSERT INTO `ss_game_record` VALUES (135, 122, 2043, 25, 0); INSERT INTO `ss_game_record` VALUES (136, 122, 2044, 100, 0); INSERT INTO `ss_game_record` VALUES (137, 122, 2045, 25, 0); INSERT INTO `ss_game_record` VALUES (138, 122, 2046, 70, 0); INSERT INTO `ss_game_record` VALUES (139, 122, 2047, 30, 0); INSERT INTO `ss_game_record` VALUES (140, 122, 2048, 110, 0); INSERT INTO `ss_game_record` VALUES (141, 122, 2050, 25, 0); INSERT INTO `ss_game_record` VALUES (142, 122, 2049, 330, 0); INSERT INTO `ss_game_record` VALUES (143, 122, 2051, 45, 0); INSERT INTO `ss_game_record` VALUES (144, 122, 2052, 35, 0); INSERT INTO `ss_game_record` VALUES (145, 122, 2053, 25, 0); INSERT INTO `ss_game_record` VALUES (146, 122, 2054, 110, 0); INSERT INTO `ss_game_record` VALUES (147, 122, 2055, 230, 0); INSERT INTO `ss_game_record` VALUES (148, 122, 2056, 105, 0); INSERT INTO `ss_game_record` VALUES (149, 122, 2057, 175, 0); INSERT INTO `ss_game_record` VALUES (150, 122, 2058, 585, 0); INSERT INTO `ss_game_record` VALUES (151, 122, 2059, 60, 0); INSERT INTO `ss_game_record` VALUES (152, 122, 2060, 
160, 0); INSERT INTO `ss_game_record` VALUES (153, 122, 2061, 170, 0); INSERT INTO `ss_game_record` VALUES (154, 122, 2062, 255, 0); INSERT INTO `ss_game_record` VALUES (155, 122, 2063, 250, 0); INSERT INTO `ss_room_inc_id` VALUES (1905, 2, 1543304140); INSERT INTO `ss_room_inc_id` VALUES (1906, 2, 1543304154); INSERT INTO `ss_room_inc_id` VALUES (1907, 2, 1543304140); INSERT INTO `ss_room_inc_id` VALUES (1908, 2, 1143304770); INSERT INTO `ss_room_inc_id` VALUES (1909, 1, 1543304770); INSERT INTO `ss_room_inc_id` VALUES (1910, 1, 1543304770); INSERT INTO `ss_room_inc_id` VALUES (1911, 1, 1543304862); INSERT INTO `ss_room_inc_id` VALUES (1912, 2, 1543304862); INSERT INTO `ss_room_inc_id` VALUES (1913, 2, 1543304862); INSERT INTO `ss_room_inc_id` VALUES (1914, 1, 1543305379); INSERT INTO `ss_room_inc_id` VALUES (1915, 1, 1543305379); INSERT INTO `ss_room_inc_id` VALUES (1916, 1, 1543305491); INSERT INTO `ss_room_inc_id` VALUES (1917, 1, 1543305491); INSERT INTO `ss_room_inc_id` VALUES (1918, 1, 1543306739); INSERT INTO `ss_room_inc_id` VALUES (1919, 1, 1543306740); INSERT INTO `ss_room_inc_id` VALUES (1920, 2, 1543306872); INSERT INTO `ss_room_inc_id` VALUES (1921, 2, 1543306930); INSERT INTO `ss_room_inc_id` VALUES (1922, 1, 1543306934); INSERT INTO `ss_room_inc_id` VALUES (1923, 1, 1543306934); INSERT INTO `ss_room_inc_id` VALUES (1924, 1, 1543308579); INSERT INTO `ss_room_inc_id` VALUES (1925, 1, 1543308579); INSERT INTO `ss_room_inc_id` VALUES (1926, 1, 1543309072); INSERT INTO `ss_room_inc_id` VALUES (1927, 1, 1543309072); INSERT INTO `ss_room_inc_id` VALUES (1928, 1, 1543309199); INSERT INTO `ss_room_inc_id` VALUES (1929, 1, 1543309199); INSERT INTO `ss_room_inc_id` VALUES (1930, 1, 1543309277); INSERT INTO `ss_room_inc_id` VALUES (1931, 1, 1543309277); INSERT INTO `ss_room_inc_id` VALUES (1932, 1, 1543309626); INSERT INTO `ss_room_inc_id` VALUES (1933, 1, 1543309626); INSERT INTO `ss_room_inc_id` VALUES (1934, 1, 1543309635); INSERT INTO `ss_room_inc_id` VALUES (1935, 1, 1543309636); INSERT INTO `ss_room_inc_id` VALUES (1936, 1, 1543309685); INSERT INTO `ss_room_inc_id` VALUES (1937, 1, 1543309685); INSERT INTO `ss_room_inc_id` VALUES (1938, 1, 1543309722); INSERT INTO `ss_room_inc_id` VALUES (1939, 1, 1543309722); INSERT INTO `ss_room_inc_id` VALUES (1940, 1, 1543309770); INSERT INTO `ss_room_inc_id` VALUES (1941, 1, 1543309770); INSERT INTO `ss_room_inc_id` VALUES (1942, 2, 1543309797); INSERT INTO `ss_room_inc_id` VALUES (1943, 1, 1543309889); INSERT INTO `ss_room_inc_id` VALUES (1944, 1, 1243309889); INSERT INTO `ss_room_inc_id` VALUES (1945, 2, 1243310030); INSERT INTO `ss_room_inc_id` VALUES (1946, 1, 1543310151); INSERT INTO `ss_room_inc_id` VALUES (1947, 1, 1543310151); INSERT INTO `ss_room_inc_id` VALUES (1948, 1, 1543310160); INSERT INTO `ss_room_inc_id` VALUES (1949, 1, 1543310160); INSERT INTO `ss_room_inc_id` VALUES (1950, 1, 1543310233); INSERT INTO `ss_room_inc_id` VALUES (1951, 1, 1543310233); INSERT INTO `ss_room_inc_id` VALUES (1952, 1, 1543310244); INSERT INTO `ss_room_inc_id` VALUES (1953, 1, 1543310244); INSERT INTO `ss_room_inc_id` VALUES (1954, 1, 1543310255); INSERT INTO `ss_room_inc_id` VALUES (1955, 1, 1543310255); INSERT INTO `ss_room_inc_id` VALUES (1956, 1, 1543310300); INSERT INTO `ss_room_inc_id` VALUES (1957, 1, 1543310300); INSERT INTO `ss_room_inc_id` VALUES (1958, 1, 1543310462); INSERT INTO `ss_room_inc_id` VALUES (1959, 1, 1543310462); INSERT INTO `ss_room_inc_id` VALUES (1960, 1, 1543310487); INSERT INTO `ss_room_inc_id` VALUES (1961, 1, 
1543310487); INSERT INTO `ss_room_inc_id` VALUES (1962, 1, 1543310496); INSERT INTO `ss_room_inc_id` VALUES (1963, 1, 1543310496); INSERT INTO `ss_room_inc_id` VALUES (1964, 1, 1543310561); INSERT INTO `ss_room_inc_id` VALUES (1965, 1, 1543310561); INSERT INTO `ss_room_inc_id` VALUES (1966, 1, 1543310902); INSERT INTO `ss_room_inc_id` VALUES (1967, 1, 1543310902); INSERT INTO `ss_room_inc_id` VALUES (1968, 1, 1543310946); INSERT INTO `ss_room_inc_id` VALUES (1969, 1, 1543310946); INSERT INTO `ss_room_inc_id` VALUES (1970, 1, 1543310953); INSERT INTO `ss_room_inc_id` VALUES (1971, 1, 1543310953); INSERT INTO `ss_room_inc_id` VALUES (1972, 1, 1543310961); INSERT INTO `ss_room_inc_id` VALUES (1973, 1, 1543310961); INSERT INTO `ss_room_inc_id` VALUES (1974, 1, 1543310967); INSERT INTO `ss_room_inc_id` VALUES (1975, 1, 1543310991); INSERT INTO `ss_room_inc_id` VALUES (1976, 1, 1543310991); INSERT INTO `ss_room_inc_id` VALUES (1977, 1, 1543311000); INSERT INTO `ss_room_inc_id` VALUES (1978, 1, 1543311000); INSERT INTO `ss_room_inc_id` VALUES (1979, 1, 1543311011); INSERT INTO `ss_room_inc_id` VALUES (1980, 1, 1543311011); INSERT INTO `ss_room_inc_id` VALUES (1981, 1, 1543311012); INSERT INTO `ss_room_inc_id` VALUES (1982, 1, 1543311012); INSERT INTO `ss_room_inc_id` VALUES (1983, 1, 1543311020); INSERT INTO `ss_room_inc_id` VALUES (1984, 1, 1543311020); INSERT INTO `ss_room_inc_id` VALUES (1985, 1, 1543311031); INSERT INTO `ss_room_inc_id` VALUES (1986, 1, 1543311031); INSERT INTO `ss_room_inc_id` VALUES (1987, 1, 1543311034); INSERT INTO `ss_room_inc_id` VALUES (1988, 1, 1543311034); INSERT INTO `ss_room_inc_id` VALUES (1989, 1, 1543311038); INSERT INTO `ss_room_inc_id` VALUES (1990, 1, 1543311038); INSERT INTO `ss_room_inc_id` VALUES (1991, 1, 1543311067); INSERT INTO `ss_room_inc_id` VALUES (1992, 1, 1543311067); INSERT INTO `ss_room_inc_id` VALUES (1993, 1, 1543311091); INSERT INTO `ss_room_inc_id` VALUES (1994, 1, 1543311091); INSERT INTO `ss_room_inc_id` VALUES (1995, 1, 1543311227); INSERT INTO `ss_room_inc_id` VALUES (1996, 1, 1543311227); INSERT INTO `ss_room_inc_id` VALUES (1997, 1, 1543311283); INSERT INTO `ss_room_inc_id` VALUES (1998, 1, 1543311283); INSERT INTO `ss_room_inc_id` VALUES (1999, 1, 1543311339); INSERT INTO `ss_room_inc_id` VALUES (2000, 1, 1543311339); INSERT INTO `ss_room_inc_id` VALUES (2001, 2, 1543311342); INSERT INTO `ss_room_inc_id` VALUES (2002, 2, 1543311400); INSERT INTO `ss_room_inc_id` VALUES (2003, 1, 1543311400); INSERT INTO `ss_room_inc_id` VALUES (2004, 1, 1543311400); INSERT INTO `ss_room_inc_id` VALUES (2005, 2, 1543311416); INSERT INTO `ss_room_inc_id` VALUES (2006, 1, 1543311514); INSERT INTO `ss_room_inc_id` VALUES (2007, 1, 1543311514); INSERT INTO `ss_room_inc_id` VALUES (2008, 2, 1543311526); INSERT INTO `ss_room_inc_id` VALUES (2009, 1, 1543311651); INSERT INTO `ss_room_inc_id` VALUES (2010, 1, 1543311651); INSERT INTO `ss_room_inc_id` VALUES (2011, 2, 1543311661); INSERT INTO `ss_room_inc_id` VALUES (2012, 2, 1543311716); INSERT INTO `ss_room_inc_id` VALUES (2013, 2, 1543311718); INSERT INTO `ss_room_inc_id` VALUES (2014, 2, 1543311787); INSERT INTO `ss_room_inc_id` VALUES (2015, 1, 1543312046); INSERT INTO `ss_room_inc_id` VALUES (2016, 1, 1543312046); INSERT INTO `ss_room_inc_id` VALUES (2017, 1, 1543312119); INSERT INTO `ss_room_inc_id` VALUES (2018, 1, 1543312119); INSERT INTO `ss_room_inc_id` VALUES (2019, 1, 1543312257); INSERT INTO `ss_room_inc_id` VALUES (2020, 1, 1543312257); INSERT INTO `ss_room_inc_id` VALUES (2021, 1, 1543312319); 
INSERT INTO `ss_room_inc_id` VALUES (2022, 1, 1543312319); INSERT INTO `ss_room_inc_id` VALUES (2023, 1, 1543312498); INSERT INTO `ss_room_inc_id` VALUES (2024, 1, 1543312498); INSERT INTO `ss_room_inc_id` VALUES (2025, 1, 1543312561); INSERT INTO `ss_room_inc_id` VALUES (2026, 1, 1543312561); INSERT INTO `ss_room_inc_id` VALUES (2027, 1, 1543312622); INSERT INTO `ss_room_inc_id` VALUES (2028, 1, 1543312622); INSERT INTO `ss_room_inc_id` VALUES (2029, 1, 1543312761); INSERT INTO `ss_room_inc_id` VALUES (2030, 1, 1543312761); INSERT INTO `ss_room_inc_id` VALUES (2031, 1, 1543312920); INSERT INTO `ss_room_inc_id` VALUES (2032, 1, 1543312920); INSERT INTO `ss_room_inc_id` VALUES (2033, 1, 1543313082); INSERT INTO `ss_room_inc_id` VALUES (2034, 1, 1543313082); INSERT INTO `ss_room_inc_id` VALUES (2035, 1, 1543313189); INSERT INTO `ss_room_inc_id` VALUES (2036, 1, 1543313189); INSERT INTO `ss_room_inc_id` VALUES (2037, 1, 1543313720); INSERT INTO `ss_room_inc_id` VALUES (2038, 1, 1543313720); INSERT INTO `ss_room_inc_id` VALUES (2039, 1, 1543313906); INSERT INTO `ss_room_inc_id` VALUES (2040, 1, 1543313906); INSERT INTO `ss_room_inc_id` VALUES (2041, 1, 1543314357); INSERT INTO `ss_room_inc_id` VALUES (2042, 1, 1543314357); INSERT INTO `ss_room_inc_id` VALUES (2043, 1, 1543314600); INSERT INTO `ss_room_inc_id` VALUES (2044, 1, 1543314600); INSERT INTO `ss_room_inc_id` VALUES (2045, 1, 1543314977); INSERT INTO `ss_room_inc_id` VALUES (2046, 1, 1543314977); INSERT INTO `ss_room_inc_id` VALUES (2047, 1, 1543315498); INSERT INTO `ss_room_inc_id` VALUES (2048, 1, 1543315498); INSERT INTO `ss_room_inc_id` VALUES (2049, 1, 1543315563); INSERT INTO `ss_room_inc_id` VALUES (2050, 1, 1543315563); INSERT INTO `ss_room_inc_id` VALUES (2051, 1, 1543315604); INSERT INTO `ss_room_inc_id` VALUES (2052, 1, 1543315604); INSERT INTO `ss_room_inc_id` VALUES (2053, 1, 1543315678); INSERT INTO `ss_room_inc_id` VALUES (2054, 1, 1543315678); INSERT INTO `ss_room_inc_id` VALUES (2055, 1, 1543316230); INSERT INTO `ss_room_inc_id` VALUES (2056, 1, 1543316230); INSERT INTO `ss_room_inc_id` VALUES (2057, 1, 1543316396); INSERT INTO `ss_room_inc_id` VALUES (2058, 1, 1543316396); INSERT INTO `ss_room_inc_id` VALUES (2059, 1, 1543318165); INSERT INTO `ss_room_inc_id` VALUES (2060, 1, 1543318165); INSERT INTO `ss_room_inc_id` VALUES (2061, 1, 1543318189); INSERT INTO `ss_room_inc_id` VALUES (2062, 1, 1543318189); INSERT INTO `ss_room_inc_id` VALUES (2063, 1, 1543318237); INSERT INTO `ss_room_inc_id` VALUES (2064, 1, 1543318237);
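Not from the thread: the underlying issue is almost certainly a GROUP BY that selects non-aggregated columns, so MySQL picks indeterminate rows, and adding or removing LIMIT (which changes the execution plan) changes which rows come out. A hedged sketch of a deterministic "best record per user in type-2 rooms" query, written against the tables above, with the record id as a tie-breaker so exactly one row survives per user:

```sql
SELECT a.id, a.name, r.kill_num, r.score_num
FROM ss_account a
JOIN ss_game_record r  ON r.account_id = a.id
JOIN ss_room_inc_id ri ON ri.id = r.room_id AND ri.type = 2
WHERE NOT EXISTS (                      -- no strictly better record for this user
    SELECT 1
    FROM ss_game_record r2
    JOIN ss_room_inc_id ri2 ON ri2.id = r2.room_id AND ri2.type = 2
    WHERE r2.account_id = r.account_id
      AND (r2.kill_num > r.kill_num
           OR (r2.kill_num = r.kill_num AND r2.score_num > r.score_num)
           OR (r2.kill_num = r.kill_num AND r2.score_num = r.score_num
               AND r2.id < r.id))       -- tie-breaker
)
ORDER BY r.kill_num DESC, r.score_num DESC;
```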

MySQL's LIMIT pagination statement

Suppose a table has one hundred rows and I want to fetch page 5 with 10 rows per page. What does the SQL look like? Is it done with LIMIT? No Java involved, just the SQL statement.
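A minimal sketch: page p with n rows per page is `LIMIT (p-1)*n, n`, with the arithmetic done by the caller, since LIMIT only takes literal numbers. Assuming a table `t` with a key `id` to give the pages a stable order:

```sql
SELECT * FROM t
ORDER BY id
LIMIT 40, 10;   -- page 5: skip (5-1)*10 = 40 rows, return the next 10
```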

mysql data to appcelerator-titanium mobile

I'm trying to find a way to get data to my Appcelerator app (Android) from my MySQL database. I am using a PHP script and an HTTP request to parse a JSON, but I am doing something wrong. I know my database structure and the JSON works because I can print it on the web using PHP, so the problem seems to be in my JS code or the way I am creating a JSON file with PHP. What I am trying to do in my example down here is to have an alert in my app showing the "employee_name" from the first row ("Steve") in the JSON.

Can someone please help me to understand what I'm doing wrong?

I have been looking into this example https://archive.appcelerator.com/question/51201/how-to-create-mysql-query-from-titanium-mobile and others, but without success.

App.js

```
var xhr = Titanium.Network.createHTTPClient();
xhr.onload = function(){
    var json = JSON.parse(this.responseText);
    if (!json) {
        Titanium.API.info('Error - Null return!');
        return;
    }
    alert(json[1].employee_name);
};
xhr.open('GET', "http://ljudy.com/test.php");
xhr.send();
```

test.PHP

```
<?php
$connection = mysqli_connect("localhost","ljudycom_andreas","****","ljudycom_test")
    or die("Error " . mysqli_error($connection));
$sql = "select * from tbl_employee";
$result = mysqli_query($connection, $sql)
    or die("Error in Selecting " . mysqli_error($connection));

//create an array
$emparray = array();
while($row = mysqli_fetch_assoc($result)) {
    $emparray[] = $row;
}

$fp = fopen('empdata.json', 'w');
fwrite($fp, json_encode($emparray));
fclose($fp);

//close the db connection
mysqli_close($connection);
?>
```

JSON

```
[
  { "employee_id" : "1", "employee_name" : "Steve",  "designation" : "VP",        "hired_date" : "2013-08-01", "salary" : "60000" },
  { "employee_id" : "2", "employee_name" : "Robert", "designation" : "Executive", "hired_date" : "2014-10-09", "salary" : "20000" },
  { "employee_id" : "3", "employee_name" : "Luci",   "designation" : "Manager",   "hired_date" : "2013-08-20", "salary" : "40000" },
  { "employee_id" : "4", "employee_name" : "Joe",    "designation" : "Executive", "hired_date" : "2013-06-01", "salary" : "25000" },
  { "employee_id" : "5", "employee_name" : "Julia",  "designation" : "Trainee",   "hired_date" : "2014-10-01", "salary" : "10000" }
]
```

Urgent, high bounty! Write a MySQL script that returns the attendance details and attendance status of every employee in a given department; a complete, working SQL script is needed

There are four tables: department, employee, sign-in, and calendar (whether a day is a working day). The table structures are as follows:
department: ![图片说明](https://img-ask.csdn.net/upload/201901/25/1548384875_223346.png)
employee: ![图片说明](https://img-ask.csdn.net/upload/201901/25/1548385181_116337.png)
sign_record: ![图片说明](https://img-ask.csdn.net/upload/201901/25/1548385285_615335.png)
deal_calendar: ![图片说明](https://img-ask.csdn.net/upload/201901/25/1548385342_846878.png)
SQL script with the table structures and test data:
```
/* Navicat MySQL Data Transfer Source Server : guangda Source Server Version : 80013 Source Host : 127.0.0.1:3306 Source Database : guangda Target Server Type : MYSQL Target Server Version : 80013 File Encoding : 65001 Date: 2019-01-25 09:08:09 */ SET FOREIGN_KEY_CHECKS=0; -- ---------------------------- -- Table structure for deal_calendar -- ---------------------------- DROP TABLE IF EXISTS `deal_calendar`; CREATE TABLE `deal_calendar` ( `id` int(11) NOT NULL AUTO_INCREMENT, `date` date DEFAULT NULL COMMENT '交易日历表', `isDealDay` varchar(1) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT 'N' COMMENT '是否交易日,Y是,N不是', PRIMARY KEY (`id`) ) ENGINE=InnoDB AUTO_INCREMENT=31 DEFAULT CHARSET=utf8; -- ---------------------------- -- Records of deal_calendar -- ---------------------------- INSERT INTO `deal_calendar` VALUES ('1', '2018-09-01', 'N'); INSERT INTO `deal_calendar` VALUES ('2', '2018-09-02', 'N'); INSERT INTO `deal_calendar` VALUES ('3', '2018-09-03', 'Y'); INSERT INTO `deal_calendar` VALUES ('4', '2018-09-04', 'Y'); INSERT INTO `deal_calendar` VALUES ('5', '2018-09-05', 'Y'); INSERT INTO `deal_calendar` VALUES ('6', '2018-09-06', 'Y'); INSERT INTO `deal_calendar` VALUES ('7', '2018-09-07', 'Y'); INSERT INTO `deal_calendar` VALUES ('8', '2018-09-08', 'N'); INSERT INTO `deal_calendar` VALUES ('9', '2018-09-09', 'N'); INSERT INTO `deal_calendar` VALUES ('10', '2018-09-10', 'Y'); INSERT INTO `deal_calendar` VALUES ('11', '2018-09-11', 'Y'); INSERT INTO `deal_calendar` VALUES ('12', '2018-09-12', 'Y'); INSERT INTO `deal_calendar` VALUES ('13', '2018-09-13', 'Y'); INSERT INTO `deal_calendar` VALUES ('14', '2018-09-14', 'Y'); INSERT INTO `deal_calendar` VALUES ('15', '2018-09-15', 'N'); INSERT INTO `deal_calendar` VALUES ('16', '2018-09-16', 'N'); INSERT INTO `deal_calendar` VALUES ('17', '2018-09-17', 'Y'); INSERT INTO `deal_calendar` VALUES ('18', '2018-09-18', 'Y'); INSERT INTO `deal_calendar` VALUES ('19', '2018-09-19', 'Y'); INSERT INTO `deal_calendar` VALUES ('20', '2018-09-20', 'Y'); INSERT INTO `deal_calendar` VALUES ('21', '2018-09-21', 'Y'); INSERT INTO `deal_calendar` VALUES ('22', '2018-09-22', 'N'); INSERT INTO `deal_calendar` VALUES ('23', '2018-09-23', 'N'); INSERT INTO `deal_calendar` VALUES ('24', '2018-09-24', 'Y'); INSERT INTO `deal_calendar` VALUES ('25', '2018-09-25', 'Y'); INSERT INTO `deal_calendar` VALUES ('26', '2018-09-26', 'Y'); INSERT INTO `deal_calendar` VALUES ('27', '2018-09-27', 'Y'); INSERT INTO `deal_calendar` VALUES ('28', '2018-09-28', 'Y'); INSERT INTO `deal_calendar` VALUES ('29', '2018-09-29', 'N'); INSERT INTO `deal_calendar` VALUES ('30', '2018-09-30', 'N'); -- ---------------------------- -- Table structure for department -- ---------------------------- DROP TABLE IF EXISTS `department`; CREATE TABLE `department` ( `id` int(11) NOT NULL AUTO_INCREMENT COMMENT '序列号', `name` varchar(50) CHARACTER SET utf8 COLLATE utf8_general_ci NOT NULL COMMENT '部门名字', `status` int(11) DEFAULT NULL COMMENT '部门状态 0不可用,1可用', `no_permission_floors` varchar(255) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT '无权限进入的门', PRIMARY KEY (`id`), UNIQUE KEY `id`
(`id`)
) ENGINE=InnoDB AUTO_INCREMENT=3 DEFAULT CHARSET=utf8;

-- ----------------------------
-- Records of department
-- ----------------------------
INSERT INTO `department` VALUES ('1', '固定收益部', '1', null);
INSERT INTO `department` VALUES ('2', '资本市场部', '1', null);

-- ----------------------------
-- Table structure for employee
-- ----------------------------
DROP TABLE IF EXISTS `employee`;
CREATE TABLE `employee` (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT '序列号',
  `faceId` varchar(255) DEFAULT NULL COMMENT '注册人脸库人脸id',
  `name` varchar(50) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT '员工姓名',
  `empNO` varchar(64) NOT NULL COMMENT '员工编号',
  `cardNO` int(32) DEFAULT NULL,
  `postId` int(11) DEFAULT NULL COMMENT '关联岗位表id',
  `dept` int(11) DEFAULT NULL COMMENT '关联部门表id',
  `password` varchar(50) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT '员工密码',
  `entryTime` date DEFAULT NULL COMMENT '入职时间',
  `birthday` date DEFAULT NULL COMMENT '员工生日',
  `sex` char(1) DEFAULT NULL COMMENT '性别,M男,F女',
  `isblacklist` int(2) DEFAULT '0' COMMENT '是否黑名单 0不是,1是',
  `vip` char(1) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT 'N' COMMENT '是否vip,N不是,Y是',
  `tel` varchar(12) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT '电话号码',
  `img` varchar(500) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT NULL COMMENT '头像地址',
  `status` int(2) DEFAULT '1' COMMENT '0无效,1有效',
  `faceToken` varchar(255) DEFAULT NULL COMMENT '人脸token',
  `updateTime` datetime DEFAULT NULL COMMENT '更新时间',
  `remark` varchar(255) DEFAULT NULL COMMENT '备注,VIP客户企业',
  UNIQUE KEY `主索引` (`id`),
  UNIQUE KEY `uk_empNo` (`empNO`) USING BTREE
) ENGINE=InnoDB AUTO_INCREMENT=11 DEFAULT CHARSET=utf8;

-- ----------------------------
-- Records of employee
-- ----------------------------
INSERT INTO `employee` VALUES ('1', null, '张三', 'zhangsan', '1', null, '1', null, null, '1990-10-12', 'F', '0', 'N', null, null, '1', 'zhangsan', null, null);
INSERT INTO `employee` VALUES ('2', null, '李四', 'lisi', '3', null, '1', null, null, '1982-12-16', 'M', '0', 'N', null, null, '1', 'lisi', null, null);
INSERT INTO `employee` VALUES ('3', null, '王五', 'wangwu', '2', null, '2', null, null, '1990-02-01', 'M', '0', 'N', null, null, '1', 'wangwu', null, null);

-- ----------------------------
-- Table structure for sign_record
-- ----------------------------
DROP TABLE IF EXISTS `sign_record`;
CREATE TABLE `sign_record` (
  `id` int(11) NOT NULL AUTO_INCREMENT COMMENT '序列号',
  `empNO` varchar(32) DEFAULT NULL COMMENT '员工号',
  `confidence` float DEFAULT NULL COMMENT '比对相似度',
  `cardNo` varchar(32) DEFAULT NULL COMMENT '门禁卡号',
  `signTime` datetime DEFAULT NULL COMMENT '签到时间',
  `deviceNo` int(11) DEFAULT NULL COMMENT '设备号',
  `imagePath` varchar(255) DEFAULT NULL COMMENT '头像路径',
  `type` varchar(10) CHARACTER SET utf8 COLLATE utf8_general_ci DEFAULT '0' COMMENT '类型,0普通,1生日,2入职100天,3入职1000天,4,最早到,5本月全勤,6陌生人,7黑名单,8领导层',
  `IO` char(1) DEFAULT NULL COMMENT '进出标识',
  `source` int(1) DEFAULT NULL COMMENT '打卡数据来源,0人',
  `remark` varchar(255) DEFAULT NULL COMMENT '备注',
  PRIMARY KEY (`id`),
  UNIQUE KEY `主索引` (`id`),
  KEY `index_sign_time` (`signTime`)
) ENGINE=InnoDB AUTO_INCREMENT=63 DEFAULT CHARSET=utf8;

-- ----------------------------
-- Records of sign_record
-- ----------------------------
INSERT INTO `sign_record` VALUES ('8', 'zhangsan', null, '1', '2018-09-24 19:59:33', '512', null, '4', null, '0', '');
INSERT INTO `sign_record` VALUES ('9', 'wangwu', null, '2', '2018-09-24 20:00:26', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('24', 'lisi', null, '3', '2018-09-24 07:32:53', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('26', 'lisi', null, '3', '2018-09-24 18:53:42', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('59', 'lisi', null, '3', '2018-09-30 09:08:37', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('60', 'lisi', null, '3', '2018-09-30 18:09:16', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('61', 'kesc', null, '2', '2018-09-29 07:20:58', '512', null, '0', null, '0', null);
INSERT INTO `sign_record` VALUES ('62', 'zhangsan', null, '1', '2018-09-26 12:22:01', '512', null, '0', null, '0', null);
```
Full requirements:

1. Monthly summary: employee number, name, month, number of working days, full-attendance flag, and the number of normal days, late days, early-leave days, late-plus-early-leave days, and full-day-absence days.
2. Daily detail: employee number, name, date, attendance category (normal, late, early leave, late plus early leave, absent all day).

Additional rules: arriving after 8:00 counts as late, leaving before 17:30 counts as early leave, and overtime on non-working days is not checked for lateness or early leave.
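A minimal sketch for the daily-detail report (item 2), assuming the earliest punch of a day is the arrival and the latest punch the departure, and treating Monday-Friday as a stand-in for "working day". A real implementation would join against a working-day calendar table to handle holidays and to surface full-day absences (dates with no punch at all), which a GROUP BY over sign_record alone cannot produce; the monthly summary in item 1 can then be built by counting these daily categories per employee per month.

```sql
-- Daily attendance category per employee, from the schema above.
-- Caveat: a day with a single punch has MIN = MAX, so it can be
-- classified as both late and early leave.
SELECT s.empNO,
       e.name,
       DATE(s.signTime) AS sign_date,
       CASE
         WHEN TIME(MIN(s.signTime)) > '08:00:00'
              AND TIME(MAX(s.signTime)) < '17:30:00' THEN 'late + early leave'
         WHEN TIME(MIN(s.signTime)) > '08:00:00' THEN 'late'
         WHEN TIME(MAX(s.signTime)) < '17:30:00' THEN 'early leave'
         ELSE 'normal'
       END AS category
FROM sign_record s
JOIN employee e ON e.empNO = s.empNO
WHERE DAYOFWEEK(s.signTime) BETWEEN 2 AND 6   -- Monday..Friday only
GROUP BY s.empNO, e.name, DATE(s.signTime);
```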

How do I query the first 30% of a table's records in MySQL?

For example, the first 30% of this table: ![图片说明](https://img-ask.csdn.net/upload/201910/20/1571564167_738230.jpg)
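LIMIT in MySQL accepts only literal integers (no percentages or expressions), so the usual workaround is to count first and bind the result through a prepared statement. A sketch, assuming a hypothetical table `t` ordered by its `id` column:

```sql
-- Compute 30% of the row count, then bind it as the LIMIT of a prepared
-- statement (a plain `LIMIT @n` is a syntax error in MySQL).
SET @n = (SELECT CAST(FLOOR(COUNT(*) * 0.3) AS UNSIGNED) FROM t);
PREPARE stmt FROM 'SELECT * FROM t ORDER BY id LIMIT ?';
EXECUTE stmt USING @n;
DEALLOCATE PREPARE stmt;
```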

Slow join between two MySQL tables of ~300k rows each; asking for optimization

There are two tables, each with about 300,000 rows. Table 1's id is joined to table 2's res_id; table 2 is already tied to another table, so a foreign key can't be added. All the filter conditions are on table 1, but the sort key is a column of table 2. The plain query below takes over 5 seconds, which is too slow; I'm looking for optimization advice. To add what I left out earlier: the ON columns are primary keys, the WHERE columns all have ordinary secondary indexes, and so does the ORDER BY column. The SQL is:
```
SELECT
  a.id, a.title, a.type, a.digest, a.file_type, a.file_sufix, a.bpackage,
  a.author_name, a.source, a.source_name, a.org_name,
  b.download_count, b.preview_count, b.favorite_count,
  a.author, a.section_name, a.subject_name, a.version_name,
  a.material_name, a.chapter_name, b.evaluate_count
FROM res_resource a
LEFT JOIN res_statistics b ON a.id = b.res_id
WHERE a.dflag = 0 AND a.sflag = 1 AND a.publish_status = '1'
ORDER BY overall_score DESC
LIMIT 0, 10
```
Table 1:
![表1](https://img-ask.csdn.net/upload/201804/09/1523259185_588613.png)

Table 2:
![表2](https://img-ask.csdn.net/upload/201804/09/1523259221_845390.png)
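One common fix here is a deferred join. The original query doesn't qualify `overall_score`, so this sketch assumes it belongs to `res_statistics`; under that assumption, MySQL must materialize and filesort all ~300k joined rows before it can apply LIMIT. Reading the candidate ids in score order from a narrow index and only then joining back for the wide columns avoids the big sort. The index is hypothetical:

```sql
-- Hypothetical covering index so the subquery can read ids in score order:
--   ALTER TABLE res_statistics ADD KEY idx_score (overall_score, res_id);
SELECT a.id, a.title,
       b.download_count, b.preview_count, b.favorite_count, b.evaluate_count
FROM (
    SELECT res_id, overall_score
    FROM res_statistics
    ORDER BY overall_score DESC
    LIMIT 200                      -- over-fetch: some ids fail the WHERE below
) t
JOIN res_resource   a ON a.id     = t.res_id
JOIN res_statistics b ON b.res_id = t.res_id
WHERE a.dflag = 0 AND a.sflag = 1 AND a.publish_status = '1'
ORDER BY t.overall_score DESC
LIMIT 0, 10;
```

If fewer than 10 of the over-fetched rows survive the filter, the over-fetch window has to grow (or the query falls back to the original plan); a composite index on res_resource (dflag, sflag, publish_status) would also help the filter side.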

Querying 5 million rows: help optimizing one SQL statement

The table structure is shown below and holds about 5 million rows. I don't have permission to alter the table, so please help me optimize the query itself. [code="sql"]CREATE TABLE IF NOT EXISTS `jdp_tb_trade` (
  `tid` bigint(20) NOT NULL,
  `status` varchar(64) DEFAULT NULL,
  `type` varchar(64) DEFAULT NULL,
  `seller_nick` varchar(32) DEFAULT NULL,
  `buyer_nick` varchar(32) DEFAULT NULL,
  `created` datetime DEFAULT NULL,
  `modified` datetime DEFAULT NULL,
  `jdp_hashcode` varchar(128) DEFAULT NULL,
  `jdp_response` mediumtext,
  `jdp_created` datetime DEFAULT NULL,
  `jdp_modified` datetime DEFAULT NULL,
  PRIMARY KEY (`tid`),
  KEY `ind_jdp_tb_trade_seller_nick_jdp_modified` (`seller_nick`,`jdp_modified`),
  KEY `ind_jdp_tb_trade_jdp_modified` (`jdp_modified`),
  KEY `ind_jdp_tb_trade_seller_nick_modified` (`seller_nick`,`modified`),
  KEY `ind_jdp_tb_trade_modified` (`modified`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;[/code] About 30,000 rows match the following conditions: [code="sql"]SELECT COUNT(*) AS tp_count
FROM `jdp_tb_trade`
WHERE ( `seller_nick` IN ('李心','zhixian50','陈鹏','雪儿','稀饭','婷婷','七七') )
  AND ( (`jdp_modified` > '2007-11-30 09:52:39') AND (`jdp_modified` <= '2014-04-21 22:31:13') )
LIMIT 1[/code] I need to page through those 30,000 rows. Since MySQL pagination gets slower the deeper the offset, I used a join back on the primary key. Oddly, the query takes 3 to 4 minutes for the first two pages but returns in about 4 seconds per page after that, and I can't figure out why. Please help me optimize it; every page must return within 10 seconds. [code="sql"]SELECT t1.jdp_modified, t1.jdp_response
FROM jdp_tb_trade t1,
     ( SELECT `tid`
       FROM `jdp_tb_trade`
       WHERE ( `seller_nick` IN ('李心','zhixian50','陈鹏','雪儿','稀饭','婷婷','七七') )
         AND ( (`jdp_modified` > '2007-11-30 09:52:39') AND (`jdp_modified` <= '2014-04-21 22:31:13') )
       ORDER BY jdp_modified DESC
       LIMIT 0, 200 ) t2
WHERE t1.tid = t2.tid[/code]
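With `LIMIT offset, 200`, InnoDB still walks and discards every row before the offset, so the cost grows with page depth. A keyset (seek) alternative remembers the sort key of the last row on the previous page and restarts from there, which the existing (seller_nick, jdp_modified) index can serve directly. A sketch, where @last_modified and @last_tid are hypothetical placeholders holding the values from the previous page's last row (the first page simply drops the seek predicate):

```sql
-- Keyset pagination: seek past the last row seen instead of using OFFSET.
SELECT tid, jdp_modified, jdp_response
FROM jdp_tb_trade
WHERE seller_nick IN ('李心','zhixian50','陈鹏','雪儿','稀饭','婷婷','七七')
  AND jdp_modified > '2007-11-30 09:52:39'
  AND jdp_modified <= '2014-04-21 22:31:13'
  AND (jdp_modified < @last_modified
       OR (jdp_modified = @last_modified AND tid < @last_tid))
ORDER BY jdp_modified DESC, tid DESC
LIMIT 200;
```

The tid tie-break keeps the page boundaries stable when several rows share the same jdp_modified value.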
