How do I cast data of type Object back to a tuple type in Java?

Object[] parg = { input,                     // test data
                  "F:\\shuju\\model.txt",    // trained model to load
                  "F:\\shuju\\jieguo2.txt" };// prediction output

    I put the tuple-typed value input and the String values into the same Object array. Now I need to use input from that array: how can I cast the Object element back to the tuple type?

3 answers

In Java, any subclass instance can always be converted to its parent type. A subclass usually carries more than its parent, so upcasting only narrows the view: everything that belongs to the parent is visible through the parent reference, and the subclass-only parts are kept rather than lost. That is why subclass-to-parent conversion always succeeds.

Converting a parent-typed reference to a subclass, however, depends on how the object was created:

1. If the object was created directly as the parent class (with new), it contains none of the subclass's information, so it cannot be cast to the subclass (the subclass has extra data, and the parent instance has no way to fill it in).

2. If the parent-typed reference actually holds a subclass instance that was upcast earlier, it can be cast back to the subclass, because none of the subclass data was lost during the upcast.

// A is the subclass used below (implied in the original snippet): a plain class with one int field.
class A {
    int a;
}

public void run() {
    A a = new A();
    a.a = 1;

    Object o = a;             // upcast: the subclass instance stored as its parent type
    A b = (A) o;              // downcast back to the subclass succeeds
    System.out.println(b.a);  // prints 1

    Object o1 = new Object(); // this instance was never an A
    A c = (A) o1;             // throws ClassCastException at runtime: cannot be cast
    System.out.println(c.a);  // never reached
}

The above should make parent/child type conversion in Java clear.

If your Object instance did not originate from the Tuple class, it cannot be cast to Tuple; your only option is to construct one yourself with new Tuple() and set the relevant fields on it.
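Applied to the asker's array: parg[0] is the very tuple that was stored, so a plain downcast works, and an instanceof check guards against ClassCastException when it does not. Below is a minimal sketch; the Tuple class here is a hypothetical stand-in for whatever tuple class the asker actually uses (the question does not say which one):

public class CastDemo {
    // Hypothetical stand-in for the asker's tuple type; substitute the real class here.
    static class Tuple {
        Object first;
        Object second;
    }

    public static void main(String[] args) {
        Tuple input = new Tuple();                      // the test data
        Object[] parg = { input,
                          "F:\\shuju\\model.txt",       // trained model
                          "F:\\shuju\\jieguo2.txt" };   // prediction output

        // parg[0] still refers to the Tuple stored above, so the downcast is legal.
        if (parg[0] instanceof Tuple) {
            Tuple t = (Tuple) parg[0];                  // cast back to the tuple type
            System.out.println(t == input);             // true: same object, just a wider reference
        } else {
            // The element was never a Tuple: casting would throw ClassCastException,
            // so the only option is to construct one manually and copy the values in.
            Tuple t = new Tuple();
            t.first = parg[1];
            t.second = parg[2];
        }
    }
}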

Typed this by hand, hope you'll accept the answer.
Feel free to follow up if anything is still unclear.

An Object reference can be cast to another type, as long as the object it points to really is an instance of that type.
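To make that one-liner concrete, a tiny sketch using only JDK types: the cast always compiles, but it only succeeds at runtime when the object behind the Object reference actually has the target type:

Object s = "hello";
String ok = (String) s;      // fine: the object really is a String
Integer bad = (Integer) s;   // compiles, but throws ClassCastException at runtime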

The detailed answer above nails it, very thorough explanation.
