Build fails with "invalid resource directory name". Urgent help needed! 5C

Android Studio 2.3: Error:android-apt-compiler: [xxxx] invalid resource directory name: C:\Users\Administrator\Desktop\myapp\app\src com

2 answers

Check your res directory and see whether something that does not belong has ended up in it: https://blog.csdn.net/leilu2008/article/details/6316127

happy_back commented: it's the src directory that the error points at, though
about 2 years ago

A similar error, for comparison (seen when decompiling an apk with apktool 1.4.1):

invalid resource directory name: C:\Users\Administrator\Desktop\Android build tools\apk2java\apktool1.4.1\NewFisho\res/drawable-xxhdpi

Exception in thread "main" brut.androlib.AndrolibException: brut.androlib.AndrolibException: brut.common.BrutException: could not exec command: [aapt, p, --min-

Just delete the superfluous directories under res, e.g. drawable-xxhdpi in the example above.
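To act on that advice quickly, here is a minimal sketch (mine, not from either answer) that lists entries under a res/ folder whose names do not start with a standard aapt resource type; the default path and the list of known types are assumptions. In the original error the offending entry appears to be a stray "com" folder directly under app\src rather than under res, so the same kind of manual check of app\src is worth doing too.

```
import os
import sys

# Resource type prefixes that aapt accepts as res/ sub-directory names
# (a valid child is "<type>" or "<type>-<qualifiers>", e.g. drawable-xxhdpi).
KNOWN_TYPES = {
    "anim", "animator", "color", "drawable", "font", "interpolator",
    "layout", "menu", "mipmap", "raw", "transition", "values", "xml",
}

def suspicious_entries(res_dir):
    """Yield children of res_dir that are not directories named after a resource type."""
    for name in sorted(os.listdir(res_dir)):
        base = name.split("-", 1)[0]
        if base not in KNOWN_TYPES or not os.path.isdir(os.path.join(res_dir, name)):
            yield name

if __name__ == "__main__":
    # Assumed default project layout; pass another path on the command line if needed.
    res_dir = sys.argv[1] if len(sys.argv) > 1 else os.path.join("app", "src", "main", "res")
    if not os.path.isdir(res_dir):
        sys.exit("not a directory: " + res_dir)
    for entry in suspicious_entries(res_dir):
        print("check:", entry)
```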

Other related questions
Python raises ValueError: Invalid file name (invalid file name) at runtime

The relevant code:

```
parser = argparse.ArgumentParser(description='evaluate.py')
parser.add_argument('INPUT', help='path to input image')
parser.add_argument('REF', default="", nargs="?", help='path to reference image, if omitted NR IQA is assumed')
parser.add_argument('--model', '-m', default='', help='path to the trained model')
parser.add_argument('--top', choices=('patchwise', 'weighted'), default='weighted', help='top layer and loss definition')
parser.add_argument('--gpu', '-g', default=0, type=int, help='GPU ID')
args = parser.parse_args()
```

After running

```
python evaluate.py D:\PyCharm 2019.3.3\test-code\deepIQA-master\img.jpg
```

it reports

```
Traceback (most recent call last):
  File "D:/PyCharm 2019.3.3/test-code/deepIQA-master/evaluate.py", line 75, in <module>
    serializers.load_hdf5(args.model, model)
  File "D:\PyCharm 2019.3.3\lib\site-packages\chainer\serializers\hdf5.py", line 195, in load_hdf5
    with h5py.File(filename, 'r') as f:
  File "D:\PyCharm 2019.3.3\lib\site-packages\h5py\_hl\files.py", line 408, in __init__
    swmr=swmr)
  File "D:\PyCharm 2019.3.3\lib\site-packages\h5py\_hl\files.py", line 173, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5f.pyx", line 88, in h5py.h5f.open
ValueError: Invalid file name (invalid file name)
```

Is my image path wrong?
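Judging from the traceback rather than the image path: the failure happens in serializers.load_hdf5(args.model, model), and --model defaults to '' in the argparse setup above, so h5py is asked to open an empty file name. Passing --model path\to\the_trained_model when invoking evaluate.py should avoid the error. A self-contained sketch of a guard one could adapt into evaluate.py (my wording, not part of the deepIQA project):

```
import argparse
import os
import sys

parser = argparse.ArgumentParser()
parser.add_argument("--model", "-m", default="", help="path to the trained model")
args = parser.parse_args()

# h5py raises "ValueError: Invalid file name" for an empty path, so fail early
# with a clearer message instead (hypothetical guard, not from the project).
if not args.model or not os.path.isfile(args.model):
    sys.exit("Pass a trained model file via --model/-m; got {!r}".format(args.model))
```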

JSP: invalid value for import. Please advise!!

<%@ page language="java" pageEncoding="gb2312"%> <%@ taglib uri="/WEB-INF/struts-bean.tld" prefix="bean" %> <%@ taglib uri="/WEB-INF/struts-html.tld" prefix="html" %> <%@ taglib uri="/WEB-INF/struts-logic.tld" prefix="logic" %> <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"> <html:html locale="true"> <head> <html:base /> <title>网页主窗口</title>![ <link rel="stylesheet" type="text/css" href="CSS/style.css"> </head> <%@ include file="top.jsp" %> <body><center>图片说明](https://img-ask.csdn.net/upload/201506/16/1434443416_365534.png) <table width="1003" border="0" cellpadding="0" cellspacing="0" height="590"> <tr> <td width="202" valign="bottom"> <iframe src="left.jsp" width="100%" height="100%" frameborder="0" scrolling="auto" name="leftiframe"> </iframe> </td> <td width="801" valign="top"> <iframe src="main.jsp" width="100%" height="100%" frameborder="0" scrolling="auto" name="mainFrame"> </iframe> </td> </tr> <tr> <td height="17" valign="bottom" background="Images/left_bg_bottom.jpg"></td> <td height="17" valign="bottom" background="Images/main_bottom.jpg"></td> </tr> </table> </center> </body> </html:html> 严重: Servlet.service() for servlet jsp threw exception org.apache.jasper.JasperException: /default.jsp (line: 11, column: 5) Page directive: invalid value for import at org.apache.jasper.compiler.DefaultErrorHandler.jspError(DefaultErrorHandler.java:41) at org.apache.jasper.compiler.ErrorDispatcher.dispatch(ErrorDispatcher.java:275) at org.apache.jasper.compiler.ErrorDispatcher.jspError(ErrorDispatcher.java:91) at org.apache.jasper.compiler.Parser.processIncludeDirective(Parser.java:325) at org.apache.jasper.compiler.Parser.parseIncludeDirective(Parser.java:358) at org.apache.jasper.compiler.Parser.parseDirective(Parser.java:461) at org.apache.jasper.compiler.Parser.parseFileDirectives(Parser.java:1782) at org.apache.jasper.compiler.Parser.parse(Parser.java:136) at org.apache.jasper.compiler.ParserController.doParse(ParserController.java:227) at org.apache.jasper.compiler.ParserController.parseDirectives(ParserController.java:117) at org.apache.jasper.compiler.Compiler.generateJava(Compiler.java:194) at org.apache.jasper.compiler.Compiler.compile(Compiler.java:356) at org.apache.jasper.compiler.Compiler.compile(Compiler.java:336) at org.apache.jasper.compiler.Compiler.compile(Compiler.java:323) at org.apache.jasper.JspCompilationContext.compile(JspCompilationContext.java:580) at org.apache.jasper.servlet.JspServletWrapper.service(JspServletWrapper.java:356) at org.apache.jasper.servlet.JspServlet.serviceJspFile(JspServlet.java:396) at org.apache.jasper.servlet.JspServlet.service(JspServlet.java:340) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:721) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:466) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:391) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:318) at 
org.apache.struts.action.RequestProcessor.doForward(RequestProcessor.java:1069) at org.apache.struts.action.RequestProcessor.processForwardConfig(RequestProcessor.java:455) at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:279) at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1482) at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:525) at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:721) at org.apache.catalina.core.ApplicationDispatcher.processRequest(ApplicationDispatcher.java:466) at org.apache.catalina.core.ApplicationDispatcher.doForward(ApplicationDispatcher.java:391) at org.apache.catalina.core.ApplicationDispatcher.forward(ApplicationDispatcher.java:318) at org.apache.struts.action.RequestProcessor.doForward(RequestProcessor.java:1069) at org.apache.struts.action.RequestProcessor.processForwardConfig(RequestProcessor.java:455) at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:279) at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1482) at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:525) at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at com.struts.filter.MyFilter.doFilter(MyFilter.java:15) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:142) at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:610) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518) at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091) at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:668) at 
org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:283) at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2463) at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2452) at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) at java.lang.Thread.run(Unknown Source)

WeChat sharing broke after changing the appsecret; it reports invalid signature. Please help!!

![screenshot](https://img-ask.csdn.net/upload/201903/31/1554044159_615549.jpg) There was no problem before the appsecret was changed.

Week-number display problem with My97, urgent!!!

Hi everyone, I've run into the following problem at work, see the picture: ![screenshot](https://img-ask.csdn.net/upload/201601/13/1452673829_699403.jpg). It concerns how My97 displays (or computes) week numbers across the year boundary. Shouldn't 2016-1-4 be the 2nd week of 2016 (taking Monday as the first day of the week)? I've searched My97's options for a long time and can't find anything that controls how the week is displayed. I checked other calendars, and the week that straddles the year boundary should be shared: as the picture shows, 2015-12-28 to 2015-12-31 should be the 53rd week of 2015, that is, 2015 has 52 full weeks plus four days, and 2016-1-1 to 2016-1-3 should be the 1st week of 2016. Can that week display be adjusted? My Java backend computes week 2 of 2016, which I believe is correct, so the front end and the back end don't agree.
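A possible explanation, offered as a note rather than something from the original thread: this is usually a difference in week-numbering conventions, not a bug. Under ISO-8601 the week from 2015-12-28 to 2016-01-03 belongs entirely to 2015 as week 53, and week 1 of 2016 only starts on 2016-01-04; a convention that starts week 1 on January 1st (the default behaviour of java.util.Calendar in many locales) makes 2016-01-04 fall in week 2, which matches what the Java backend reports. A quick Python check of the ISO numbering:

```
from datetime import date

# isocalendar() returns (ISO year, ISO week number, ISO weekday)
print(date(2015, 12, 28).isocalendar())  # (2015, 53, 1): Monday of week 53 of 2015
print(date(2016, 1, 3).isocalendar())    # (2015, 53, 7): still week 53 of 2015
print(date(2016, 1, 4).isocalendar())    # (2016, 1, 1): week 1 of 2016 starts here
```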

Custom SQL in XML: Invalid bound statement (not found) when running the jar.

Running it directly from IDEA works fine. After packaging it as a jar and running it with java -jar, it fails with org.apache.ibatis.binding.BindingException: Invalid bound statement (not found): ........ I've tried the answers found online:

1. The @Mapper annotation is present.
2. @MapperScan is present.
3. The mapper locations are configured: mybatis-plus: mapper-locations: classpath*:alone\zhao\auth\mapper\**\*.xml
4. The Maven resources section:

    <resources>
        <resource>
            <!-- directory holding the resources; the path is relative to the POM -->
            <directory>src/main/java</directory>
            <includes>
                <include>**/*.xml</include>
            </includes>
            <filtering>true</filtering>
        </resource>
        <resource>
            <directory>src/main/resources</directory>
            <filtering>false</filtering>
            <includes>
                <include>**/*.*</include>
            </includes>
        </resource>
    </resources>

5. I've confirmed that the packaged jar does contain the XML files.

mybatis-plus 3.3.1, Spring Boot 2.1.4.RELEASE. At a loss, please help!

Urgent! Eclipse Marketplace reports an error when opened

![screenshot](https://img-ask.csdn.net/upload/201812/28/1545982932_716689.jpg)

IDEA fails to run!!!

Error running 'NoteBookDemo': Cannot start process, the working directory 'F:\NoteBook\NoteBook' does not exist

Hive: an INSERT statement fails when running on YARN, but works fine after switching on local mode. The error is as follows:

``` hive> insert into test values('B',2); Query ID = root_20191114105642_8cc05952-0497-4eff-893e-af6de8f05c6e Total jobs = 3 Launching Job 1 out of 3 Number of reduce tasks is set to 0 since there's no reduce operator 19/11/14 10:56:43 INFO client.RMProxy: Connecting to ResourceManager at cloudera/37.64.0.71:8032 19/11/14 10:56:43 INFO client.RMProxy: Connecting to ResourceManager at cloudera/37.64.0.71:8032 java.io.IOException: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. Requested resource type=[memory-mb], Requested resource=<memory:15360, vCores:8>, maximum allowed allocation=<memory:6557, vCores:8>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:6557, vCores:8> at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.throwInvalidResourceException(SchedulerUtils.java:478) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.checkResourceRequestAgainstAvailableResource(SchedulerUtils.java:374) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:302) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:280) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.validateAndCreateResourceRequest(RMAppManager.java:522) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.createAndPopulateNewRMApp(RMAppManager.java:377) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:318) at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:633) at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:267) at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:531) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675) at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:345) at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:251) at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570) at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567) at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:576) at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:571) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:571) at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:562) at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:444) at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:151) at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199) at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2200) at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1843) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1563) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1339) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1328) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:836) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:772) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:699) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:313) at org.apache.hadoop.util.RunJar.main(RunJar.java:227) Caused by: org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. 
Requested resource type=[memory-mb], Requested resource=<memory:15360, vCores:8>, maximum allowed allocation=<memory:6557, vCores:8>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:6557, vCores:8> at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.throwInvalidResourceException(SchedulerUtils.java:478) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.checkResourceRequestAgainstAvailableResource(SchedulerUtils.java:374) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:302) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:280) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.validateAndCreateResourceRequest(RMAppManager.java:522) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.createAndPopulateNewRMApp(RMAppManager.java:377) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:318) at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:633) at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:267) at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:531) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53) at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateYarnException(RPCUtil.java:75) at org.apache.hadoop.yarn.ipc.RPCUtil.unwrapAndThrowException(RPCUtil.java:116) at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.submitApplication(ApplicationClientProtocolPBClientImpl.java:284) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422) at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165) at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157) at 
org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95) at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359) at com.sun.proxy.$Proxy43.submitApplication(Unknown Source) at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.submitApplication(YarnClientImpl.java:290) at org.apache.hadoop.mapred.ResourceMgrDelegate.submitApplication(ResourceMgrDelegate.java:297) at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:330) ... 35 more Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException): Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. Requested resource type=[memory-mb], Requested resource=<memory:15360, vCores:8>, maximum allowed allocation=<memory:6557, vCores:8>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:6557, vCores:8> at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.throwInvalidResourceException(SchedulerUtils.java:478) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.checkResourceRequestAgainstAvailableResource(SchedulerUtils.java:374) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:302) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:280) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.validateAndCreateResourceRequest(RMAppManager.java:522) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.createAndPopulateNewRMApp(RMAppManager.java:377) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:318) at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:633) at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:267) at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:531) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675) at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1499) at org.apache.hadoop.ipc.Client.call(Client.java:1445) at org.apache.hadoop.ipc.Client.call(Client.java:1355) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228) at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116) at com.sun.proxy.$Proxy42.submitApplication(Unknown Source) at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.submitApplication(ApplicationClientProtocolPBClientImpl.java:281) ... 
48 more Job Submission failed with exception 'java.io.IOException(org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. Requested resource type=[memory-mb], Requested resource=<memory:15360, vCores:8>, maximum allowed allocation=<memory:6557, vCores:8>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:6557, vCores:8> at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.throwInvalidResourceException(SchedulerUtils.java:478) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.checkResourceRequestAgainstAvailableResource(SchedulerUtils.java:374) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:302) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:280) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.validateAndCreateResourceRequest(RMAppManager.java:522) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.createAndPopulateNewRMApp(RMAppManager.java:377) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:318) at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:633) at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:267) at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:531) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675) )' FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request! Cannot allocate containers as requested resource is greater than maximum allowed allocation. 
Requested resource type=[memory-mb], Requested resource=<memory:15360, vCores:8>, maximum allowed allocation=<memory:6557, vCores:8>, please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, which might be less than configured maximum allocation=<memory:6557, vCores:8> at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.throwInvalidResourceException(SchedulerUtils.java:478) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.checkResourceRequestAgainstAvailableResource(SchedulerUtils.java:374) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:302) at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.normalizeAndValidateRequest(SchedulerUtils.java:280) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.validateAndCreateResourceRequest(RMAppManager.java:522) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.createAndPopulateNewRMApp(RMAppManager.java:377) at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:318) at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:633) at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:267) at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:531) at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:523) at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:991) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:869) at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:815) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875) at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2675) ``` # 内存最大只有6G,他非要申请15G,这个问题该如何处理, # 求助各位大佬!!!

40052, invalid action name... failed to get a WeChat QR code ticket

In the WeChat Official Account Platform's API debugging tool, the token and the parameters I fill in test correctly. I'm using the official example, but in my own project it returns {errcode: 40052, errmsg: "invalid action name"}. Error code 40052 doesn't even appear in the official documentation. From the message it looks like action_name is wrong, but I'm sure it is correct, because it comes straight from the official example.

    $.post("https://api.weixin.qq.com/cgi-bin/qrcode/create?access_token=TOKEN", {
        "expire_seconds": 1800,
        "action_name": "QR_SCENE",
        "action_info": { "scene": { "scene_id": 123 } }
    }, function(result) {
        //
    });
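One thing worth ruling out (a guess on my part, not something confirmed in this thread): $.post sends the object as form-encoded key/value pairs from the browser, while qrcode/create expects the POST body to be a JSON string, and calling the API from a page also exposes the access_token. A minimal server-side sketch using Python requests, with TOKEN as a placeholder:

```
import requests

# Placeholder: substitute a real access_token obtained server-side
TOKEN = "ACCESS_TOKEN"

payload = {
    "expire_seconds": 1800,
    "action_name": "QR_SCENE",
    "action_info": {"scene": {"scene_id": 123}},
}

# json= serializes the payload as a JSON request body
resp = requests.post(
    "https://api.weixin.qq.com/cgi-bin/qrcode/create",
    params={"access_token": TOKEN},
    json=payload,
    timeout=10,
)
print(resp.json())  # expect a dict containing 'ticket' on success
```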

Android Studio problem, urgently need help

After updating something (I don't quite remember what), Android Studio keeps erroring out. When testing on a phone, tapping a button in the app shows "Unfortunately, ... has stopped", and the Event Log then shows: "Instant Run performed a full build and install since the installation on the device does not match the local build on disk." How do I solve this? I'm a beginner and don't really understand it.

Configuring Maven reports "JAVA_HOME is set to an invalid directory", and it is not the trailing-semicolon issue

![screenshot](https://img-ask.csdn.net/upload/201611/30/1480517140_996119.png)

The request has an invalid header name

In a Yixin official account, the image for a menu item won't display when clicked, although the image opens fine via its URL. The server's HTTP error log shows: HTTP/1.1 GET /res/20140404/1290.jpg 400 - Header -. From the server's response packets you can see the error the Yixin server gets when requesting my server: ![screenshot](https://img-ask.csdn.net/upload/201503/10/1425965826_629068.png) ![screenshot](https://img-ask.csdn.net/upload/201503/10/1425965817_813569.png) The image does display in the iOS client, but not in the Android client. How can this be solved? Urgent!

A running web project logs SEVERE: Invalid path was requested. Urgently looking for a solution.

![screenshot](https://img-ask.csdn.net/upload/201705/22/1495457503_569509.png) ![screenshot](https://img-ask.csdn.net/upload/201705/22/1495457515_422067.png) ![screenshot](https://img-ask.csdn.net/upload/201705/22/1495457526_248492.png)

Error when running CUDA+MPI on multiple nodes: cudaEvent error

I'm running an MPI+CUDA program on a small cluster with CUDA event timing enabled. On one of the nodes the run fails at CUDA_CALL(cudaEventRecord(stop, 0)); CUDA_CALL(cudaEventSynchronize(stop)); with: CUDA Error: invalid resource handle (err_num=33). The other nodes run normally. What could cause this?

Error creating bean with name 'dataSource' defined in ServletContext resource

struts-config.xml配置信息 <?xml version="1.0" encoding="UTF-8"?> <!DOCTYPE struts-config PUBLIC "-//Apache Software Foundation//DTD Struts Configuration 1.2//EN" "http://struts.apache.org/dtds/struts-config_1_2.dtd"> <struts-config> <data-sources /> <form-beans /> <global-exceptions /> <global-forwards /> <action-mappings> <action path="/login" parameter="method" type="org.springframework.web.struts.DelegatingActionProxy"> <forward name="login" path="/WEB-INF/jsp/login.jsp"></forward> <forward name="success" path="/WEB-INF/jsp/common/main.jsp"></forward> <forward name="failure" path="/WEB-INF/jsp/login.jsp"></forward> </action> <action path="/commonJump" parameter="method" type="org.springframework.web.struts.DelegatingActionProxy"> <forward name="top" path="/WEB-INF/jsp/common/top.jsp" /> <forward name="left" path="/WEB-INF/jsp/common/left.jsp" /> <forward name="right" path="/WEB-INF/jsp/common/right.jsp" /> </action> </action-mappings> <message-resources parameter="com.af.family.ApplicationResources" /> <!-- 添加支持Spring的插件 --> <plug-in className="org.springframework.web.struts.ContextLoaderPlugIn"> <set-property property="contextConfigLocation" value="/WEB-INF/spring.xml" /> </plug-in> </struts-config> spring.xml配置信息 <?xml version="1.0" encoding="UTF-8"?> <beans xmlns="http://www.springframework.org/schema/beans" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-2.0.xsd"> <bean id="propertyConfigure" class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"> <property name="location" value="/WEB-INF/jdbc.properties" /> </bean> <bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource"> <property name="driverClassName" value="${database.driver}"></property> <property name="url" value="${database.url}"></property> <property name="username" value="${database.username}"></property> <property name="password" value="${database.password}"></property> <property name="initialSize" value="25" /> <property name="maxActive" value="100" /> </bean> <bean id="loginDao" class="com.af.family.dao.common.LoginDaoImpl"> <property name="dataSource" ref="dataSource"></property> </bean> <!-- 对于Action组件配置,name,class。name的值是struts-config.xml中相应action的path值 --> <bean name="/login" class="com.af.family.action.LoginAction"> <property name="dao" ref="loginDao"></property> </bean> <bean name="/commonJump" class="com.af.family.action.CommonJumpAction"> </bean> </beans> 部署后出现如下错误: [color=red]严重: Context initialization failed org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in ServletContext resource [/WEB-INF/spring.xml]: Initialization of bean failed; nested exception is org.springframework.beans.InvalidPropertyException: Invalid property 'initialSize' of bean class [org.springframework.jdbc.datasource.DriverManagerDataSource]: No property 'initialSize' found Caused by: org.springframework.beans.InvalidPropertyException: Invalid property 'initialSize' of bean class [org.springframework.jdbc.datasource.DriverManagerDataSource]: No property 'initialSize' found [/color] at org.springframework.beans.BeanWrapperImpl.convertForProperty(BeanWrapperImpl.java:376) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1107) at 
org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:857) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:423) at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:249) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:155) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:246) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:160) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:291) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:352) at org.springframework.web.struts.ContextLoaderPlugIn.createWebApplicationContext(ContextLoaderPlugIn.java:354) at org.springframework.web.struts.ContextLoaderPlugIn.initWebApplicationContext(ContextLoaderPlugIn.java:295) at org.springframework.web.struts.ContextLoaderPlugIn.init(ContextLoaderPlugIn.java:225) at org.apache.struts.action.ActionServlet.initModulePlugIns(ActionServlet.java:869) at org.apache.struts.action.ActionServlet.init(ActionServlet.java:336) at javax.servlet.GenericServlet.init(GenericServlet.java:212) at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1161) at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:981) at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:4045) at org.apache.catalina.core.StandardContext.start(StandardContext.java:4351) at org.apache.catalina.startup.HostConfig.checkResources(HostConfig.java:1105) at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1203) at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:293) at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117) at org.apache.catalina.core.ContainerBase.backgroundProcess(ContainerBase.java:1337) at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1601) at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.processChildren(ContainerBase.java:1610) at org.apache.catalina.core.ContainerBase$ContainerBackgroundProcessor.run(ContainerBase.java:1590) at java.lang.Thread.run(Unknown Source) 2011-1-28 13:53:11 org.apache.catalina.core.ApplicationContext log

InvalidArgumentError: Failed to create a directory: log/C:; Invalid argument. What is the cause of this?

tensorflow.python.framework.errors_impl.InvalidArgumentError: Failed to create a directory: log/C:; Invalid argument

I changed this code slightly and now it won't run; it says it cannot create a directory. Why? I created a TO folder inside the 123 folder and a log folder inside that, but I still get this error.

```
# train_model.py
import numpy as np
from alexnet import alexnet

WIDTH = 214
HEIGHT = 132
LR = 1e-3
EPOCHS = 10
MODEL_NAME = 'C:/Users/Administrator/Desktop/123/pygta5-car-fast-{}-{}-{}-epochs-300K-data.model'.format(LR, 'alexnetv2', EPOCHS)

model = alexnet(WIDTH, HEIGHT, LR)

hm_data = 22
for i in range(EPOCHS):
    for i in range(1, hm_data + 1):
        train_data = np.load('C:/Users/Administrator/Desktop/123/training_data-{}-balanced.npy'.format(i))

        train = train_data[:-100]
        test = train_data[-100:]

        X = np.array([i[0] for i in train]).reshape(-1, WIDTH, HEIGHT, 1)
        Y = [i[1] for i in train]

        test_x = np.array([i[0] for i in test]).reshape(-1, WIDTH, HEIGHT, 1)
        test_y = [i[1] for i in test]

        model.fit({'input': X}, {'targets': Y}, n_epoch=1,
                  validation_set=({'input': test_x}, {'targets': test_y}),
                  snapshot_step=500, show_metric=True, run_id=MODEL_NAME)

        model.save(MODEL_NAME)

# tensorboard --logdir=foo:C:/path/to/log
```
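The "log/C:" in the message hints at where the bad directory name comes from: run_id=MODEL_NAME hands the full Windows path to tflearn, which then tries to create a sub-directory with that name (drive letter, colon and all) under its tensorboard log directory. A small sketch of the workaround, my suggestion rather than something from the thread, assuming tflearn's usual log-directory behaviour:

```
import os

LR = 1e-3
EPOCHS = 10
MODEL_NAME = 'C:/Users/Administrator/Desktop/123/pygta5-car-fast-{}-{}-{}-epochs-300K-data.model'.format(LR, 'alexnetv2', EPOCHS)

# Use only the file name as the run id, so tflearn creates "log/<RUN_ID>"
# instead of trying to create "log/C:" on Windows.
RUN_ID = os.path.basename(MODEL_NAME)
print(RUN_ID)
```

Then call model.fit(..., run_id=RUN_ID) while keeping model.save(MODEL_NAME) unchanged, so the model file still goes to the full path.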

Perl reports invalid argument ... line 15; could someone take a look?

Complete beginner here. While running Perl I get "invalid argument". My code is as follows:

    use strict;
    use warnings;

    my $newDir = "files";
    unless (-d $newDir) {
        mkdir $newDir or die $!;
    }

    my @allFiles = glob("*");
    foreach my $subDir (@allFiles) {
        if (-d $subDir) {
            opendir(SUB, ".\\$subDir") or die $!;
            while (my $file = readdir(SUB)) {
                if ($file =~ /\.gz$/) {
                    `copy .\\$subDir\\$file .\\$newDir`;
                }
            }
            close(SUB);
        }
    }

The problem reported is: invalid argument at xxxxx.pl line 15. Which line is line 15? Do blank lines count as lines? Thanks for the help.

Invalid character found in method name. HTTP method names must be tokens, even though nobody is accessing the site

I built a small project with Spring Boot. Every time I deploy it to the server it dies within 2 days, always with this error. I've tried the approaches found online: 1. increasing maxHttpHeaderSize, which didn't help; 2. nobody is accessing the project, and accessing it over HTTPS does not reproduce the problem either. My initial guess is that it is something internal. Does anyone have a good solution? PS: the project currently uses elasticsearch, fastds and shiro; I don't know whether those matter. ![error screenshot](https://img-ask.csdn.net/upload/201902/27/1551257406_786415.png) ![properties screenshot](https://img-ask.csdn.net/upload/201902/27/1551257419_675422.png) ![properties screenshot](https://img-ask.csdn.net/upload/201902/27/1551257431_111573.png)

invalid null pointer!

![图片说明](https://img-ask.csdn.net/upload/201705/07/1494168216_985520.png) ``` // win_cDemo.cpp : 定义控制台应用程序的入口点。 // #include "stdafx.h" #include "tinyxml.h" #include "NLPIR.h" #include <stdio.h> #include <stdlib.h> #include<windows.h> #include <fstream> #include <iostream> #include <string> #include <hash_map> #include <WinSock.h> #include <mysql.h> using namespace std; #pragma comment(lib,"tinyxml.lib") #pragma comment(lib, "NLPIR.lib") #pragma comment(lib, "ws2_32.lib") #pragma comment(lib, "libmysql.lib") //存放字典数据的hashMap hash_map<string,float> hmDic; //计算运行时间的变量 DWORD start_time,end_time; //mysql文件 MYSQL mysqlData; string user,passwd; void processDictionary(string filename,float weight) { ifstream fin(filename.c_str()); if( !fin ) { cout << "Error opening " << filename << " for input" << endl; return; } string buf; while(fin>>buf) { hmDic.insert(make_pair(buf,weight)); } cout<<"Process Dictionary "<<filename<<" Success"<<endl; } void processConfigure() { //获取XML对象 TiXmlDocument doc; //装载文件 doc.LoadFile("configure.xml"); //获取dics TiXmlElement *dicsLevel = (doc.RootElement())->FirstChildElement(); //获取dic TiXmlElement *dicLevel=dicsLevel->FirstChildElement(); TiXmlElement *pathLevel,*weightLevel; start_time = GetTickCount64();//获取开始时间 cout<<"Process Begin, Please Wait Patient"<<endl; while (dicLevel != NULL)// { pathLevel = dicLevel->FirstChildElement();//获取path weightLevel = pathLevel->NextSiblingElement();//获取weight //cout<<pathLevel->GetText()<<":"<<weightLevel->GetText()<<endl; //处理文本 string filename =pathLevel->GetText(); float weight = atof(weightLevel->GetText()); //cout<<filename<<"|"<<weight+1<<endl; processDictionary(filename,weight); dicLevel=dicLevel->NextSiblingElement(); } end_time = GetTickCount64();//获取结束时间 cout<<"Process All Dictionary Use "<<end_time-start_time<<" ms"<<endl; //获取mysql TiXmlElement *mysqlLevel = dicsLevel->NextSiblingElement(); TiXmlElement *userLevel,*passwdLevel; userLevel = mysqlLevel->FirstChildElement();//获取user passwdLevel = userLevel->NextSiblingElement();//获取passwd user = userLevel->GetText(); passwd = passwdLevel->GetText(); } int sentenceAnalysis(string sSentence) { try { const result_t *pVecResult; int nCount; //对句子进行分词 pVecResult=NLPIR_ParagraphProcessA(sSentence.c_str(),&nCount); float totalNum = 0;//句子总的情感值 float tmpNum = 0;//每个字句的情感值 float tmpD = 1;//用来存放临时副词的变量 bool isTmpDSet = 0;//是否有副词被设置 for (int i=0;i<nCount;i++) { string ciXin = pVecResult[i].sPOS;//词性 //不存在词性则跳过 if(ciXin.empty()) { continue; } //词性中包含'w'(即标点符号),表示当前的一个子句已经处理完毕 if(ciXin[0]=='w') { totalNum += tmpNum; tmpNum = 0; continue; } //如果不是形容词a,副词d,动词v,名词n,数词m,则不处理 if(!(ciXin=="a"||ciXin=="d"||ciXin=="v"||ciXin=="n"||ciXin=="m")) continue; string tmp = sSentence.substr(pVecResult[i].start,pVecResult[i].length); auto itr=hmDic.find(tmp); if(itr!=hmDic.end()) { if(ciXin=="d")//如果副词后面接副词,进行多重复合 { tmpD *= itr->second; isTmpDSet = 1; } else if(isTmpDSet)//如果之前存在副词 { tmpNum += tmpD * itr->second; isTmpDSet = 0; } else { tmpNum += itr->second;//不存在副词的影响,直接加减 } } } return totalNum; } catch(...) { //对出错的情况进行默认处理 return 0; } } void initMysql() { try { mysql_init(&mysqlData); // localhost:服务器 root为账号,123456为密码 test为数据库名 3306为端口 if(!mysql_real_connect(&mysqlData, "localhost","root","","fooddata",3306,NULL,0)) { cout<<"database connect fail"<<endl; exit(1); } else cout<<"database connect success"<<endl; } catch (...) 
{ exit(1); } } int main() { //初始化分词器 if(!NLPIR_Init()) { printf("Init fails\n"); return -1; } //首先读取配置文件,建立字典数据 processConfigure(); //初始化mysql数据库 initMysql(); //存储每一行数据 string text; //查询所有结果 string sqlstr = "SELECT id,text,kind FROM comment"; MYSQL_RES *result = NULL; if (0 == mysql_query(&mysqlData, sqlstr.c_str())) { //一次性取得数据集 result = mysql_store_result(&mysqlData); //获取每一行 MYSQL_ROW row = NULL; row = mysql_fetch_row(result); start_time = GetTickCount64(); while (NULL != row) { text = row[1]; //对每个句子进行分词 int answer = sentenceAnalysis(text); if(answer>0) cout<<"正面评价"<<endl; else if(answer<0) cout<<"负面评价"<<endl; else cout<<"中性评价"<<endl; row = mysql_fetch_row(result); } end_time = GetTickCount64(); cout<<"Text all rows use "<<end_time-start_time<<" ms"<<endl; } //关闭数据库 // mysql_close(&mysqlData); //释放分词器资源 NLPIR_Exit(); return 0; } ``` 求大神看看这个怎么修改???
