Problem with the Hibernate Configurations setup in Eclipse

[screenshot]
After clicking, this appears:
[screenshot]
Then, when reverse-engineering the entity classes, no tables show up:
[screenshot]

3 answers

Check which JDK version is configured under your Java Compiler settings and try switching it.

z893222309
z893222309 replying to 飞翔的小野鸭: 1511299877
nearly 3 years ago · reply
u012470804
飞翔的小野鸭 replying to z893222309: The JDK on my build path is 1.8, and the compiler is 1.8 as well. Could you leave me a QQ number so you can help me out?
nearly 3 years ago · reply
z893222309
z893222309 replying to 飞翔的小野鸭: Is the JDK used by the build path the same version as the one used by the compiler? Try switching to 1.7 or 1.6; the exception you're seeing is about the JDK version.
nearly 3 years ago · reply
u012470804
飞翔的小野鸭: I switched to 1.8 and get exactly the same problem.
nearly 3 years ago · reply

Doesn't the console report an exception when the corresponding tables aren't generated?

Compiler configuration
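For reference, a minimal hibernate.cfg.xml sketch of the kind a Hibernate Tools console configuration points at; the driver, URL, credentials, and dialect below are placeholders and must match your own MySQL setup:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE hibernate-configuration PUBLIC
        "-//Hibernate/Hibernate Configuration DTD 3.0//EN"
        "http://www.hibernate.org/dtd/hibernate-configuration-3.0.dtd">
<hibernate-configuration>
    <session-factory>
        <!-- Placeholder connection settings: adjust to your database -->
        <property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
        <property name="hibernate.connection.url">jdbc:mysql://localhost:3306/mydb</property>
        <property name="hibernate.connection.username">root</property>
        <property name="hibernate.connection.password">secret</property>
        <property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
    </session-factory>
</hibernate-configuration>
```

If the reverse-engineering wizard lists no tables even though this file parses, the console configuration usually can't reach the database: check that the JDBC driver jar is on the console configuration's classpath and that the connection details actually work, for example by testing them in Eclipse's Data Source Explorer.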

Other related questions
Eclipse can't find MyEclipse's "add hibernate capabilities"

![screenshot](https://img-ask.csdn.net/upload/201503/04/1425436055_743670.png) That requires installing a plugin; if I don't install the plugin, is there any other way to add Hibernate? Looking for an expert!!!

[Newbie question] classpath error when creating a hibernate configuration in Eclipse

Using Hibernate Tools in Eclipse for Hibernate reverse engineering, I get a classpath error when creating the hibernate configuration; I'd be grateful for an explanation ![screenshot](https://img-ask.csdn.net/upload/201906/03/1559547166_944532.png) ![screenshot](https://img-ask.csdn.net/upload/201906/03/1559547407_321202.png) If I ignore the error and create it anyway, I can't fetch any data from the database ![screenshot](https://img-ask.csdn.net/upload/201906/03/1559547811_553309.png)

Problem with the Hibernate Console Configuration in Eclipse

Here is my Eclipse Hibernate Tools version ![Hibernate Tools version](https://img-ask.csdn.net/upload/201510/29/1446103021_611979.png) project structure ![project structure](https://img-ask.csdn.net/upload/201510/29/1446103138_336003.png) pom.xml file ![pom.xml file](https://img-ask.csdn.net/upload/201510/29/1446103167_165583.png) The problem: ![screenshot](https://img-ask.csdn.net/upload/201510/29/1446103119_197295.png) ![screenshot](https://img-ask.csdn.net/upload/201510/29/1446103130_482783.png) Note the red exclamation mark on [classpath]!
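A red exclamation mark on [classpath] usually means the console configuration can't resolve jars it expects. As a sketch only (the coordinates are the standard ones, but the version numbers are illustrative and not taken from the screenshots), the relevant pom.xml dependencies would look something like:

```xml
<!-- Hypothetical pom.xml fragment: version numbers are examples only -->
<dependencies>
    <dependency>
        <groupId>org.hibernate</groupId>
        <artifactId>hibernate-core</artifactId>
        <version>4.3.11.Final</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>5.1.38</version>
    </dependency>
</dependencies>
```

After editing the pom, running Maven -> Update Project and then recreating the console configuration lets Hibernate Tools pick up the refreshed classpath.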

Hibernate reverse engineering: error when setting up the run configuration

With the latest versions of Eclipse and Hibernate, when reverse-engineering from a MySQL database, setting up Run Configurations and clicking Run produces the error: Direct launch not supported. Could someone help me work out the cause?

Building an Eclipse Hadoop Map/Reduce project: downloaded the hadoop2.x-eclipse-plugins plugin, but compiling it fails no matter what I try

hadoop-2.7.7 centos7 ant 1.9.14 ![screenshot](https://img-ask.csdn.net/upload/202005/03/1588493547_395256.png)![screenshot](https://img-ask.csdn.net/upload/202005/03/1588493565_427495.png) build-contrib.xml is as follows <?xml version="1.0"?> <!-- Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to You under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. --> <!-- Imported by contrib/*/build.xml files to share generic targets. 
--> <project name="hadoopbuildcontrib" xmlns:ivy="antlib:org.apache.ivy.ant"> <property name="name" value="${ant.project.name}"/> <property name="root" value="${basedir}"/> <property name="hadoop.root" location="${root}/../../../"/> <!-- Load all the default properties, and any the user wants --> <!-- to contribute (without having to type -D or edit this file --> <property file="${user.home}/${name}.build.properties" /> <property file="${root}/build.properties" /> <property file="${hadoop.root}/build.properties" /> <property name="src.dir" location="${root}/src/java"/> <property name="src.test" location="${root}/src/test"/> <property name="src.test.data" location="${root}/src/test/data"/> <!-- Property added for contrib system tests --> <property name="build-fi.dir" location="${hadoop.root}/build-fi"/> <property name="system-test-build-dir" location="${build-fi.dir}/system"/> <property name="src.test.system" location="${root}/src/test/system"/> <property name="src.examples" location="${root}/src/examples"/> <available file="${src.examples}" type="dir" property="examples.available"/> <available file="${src.test}" type="dir" property="test.available"/> <!-- Property added for contrib system tests --> <available file="${src.test.system}" type="dir" property="test.system.available"/> <property name="conf.dir" location="${hadoop.root}/conf"/> <property name="test.junit.output.format" value="plain"/> <property name="test.output" value="no"/> <property name="test.timeout" value="900000"/> <property name="build.contrib.dir" location="${hadoop.root}/build/contrib"/> <property name="build.dir" location="${hadoop.root}/build/contrib/${name}"/> <property name="build.classes" location="${build.dir}/classes"/> <property name="build.test" location="${build.dir}/test"/> <property name="build.examples" location="${build.dir}/examples"/> <property name="hadoop.log.dir" location="${build.dir}/test/logs"/> <!-- all jars together --> <property name="javac.deprecation" value="off"/> 
<property name="javac.debug" value="on"/> <property name="build.ivy.lib.dir" value="${hadoop.root}/build/ivy/lib"/> <property name="javadoc.link" value="http://java.sun.com/j2se/1.4/docs/api/"/> <property name="build.encoding" value="ISO-8859-1"/> <fileset id="lib.jars" dir="${root}" includes="lib/*.jar"/> <!-- Property added for contrib system tests --> <property name="build.test.system" location="${build.dir}/system"/> <property name="build.system.classes" location="${build.test.system}/classes"/> <!-- IVY properties set here --> <property name="ivy.dir" location="ivy" /> <!-- loglevel take values like default|download-only|quiet --> <property name="loglevel" value="quiet"/> <property name="ivysettings.xml" location="${hadoop.root}/ivy/ivysettings.xml"/> <loadproperties srcfile="${ivy.dir}/libraries.properties"/> <loadproperties srcfile="${hadoop.root}/ivy/libraries.properties"/> <property name="ivy.jar" location="${hadoop.root}/ivy/ivy-${ivy.version}.jar"/> <property name="ivy_repo_url" value="http://repo2.maven.org/maven2/org/apache/ivy/ivy/${ivy.version}/ivy-${ivy.version}.jar" /> <property name="build.dir" location="build" /> <property name="build.ivy.dir" location="${build.dir}/ivy" /> <property name="build.ivy.lib.dir" location="${build.ivy.dir}/lib" /> <property name="build.ivy.report.dir" location="${build.ivy.dir}/report" /> <property name="common.ivy.lib.dir" location="${build.ivy.lib.dir}/${ant.project.name}/common"/> <!--this is the naming policy for artifacts we want pulled down--> <property name="ivy.artifact.retrieve.pattern" value="${ant.project.name}/[conf]/[artifact]-[revision].[ext]"/> <!-- the normal classpath --> <path id="contrib-classpath"> <pathelement location="${build.classes}"/> <pathelement location="${hadoop.root}/build/tools"/> <fileset refid="lib.jars"/> <pathelement location="${hadoop.root}/build/classes"/> <fileset dir="${hadoop.root}/lib"> <include name="**/*.jar" /> </fileset> <path refid="${ant.project.name}.common-classpath"/> 
<pathelement path="${clover.jar}"/> </path> <!-- the unit test classpath --> <path id="test.classpath"> <pathelement location="${build.test}" /> <pathelement location="${hadoop.root}/build/test/classes"/> <pathelement location="${hadoop.root}/src/contrib/test"/> <pathelement location="${conf.dir}"/> <pathelement location="${hadoop.root}/build"/> <pathelement location="${build.examples}"/> <pathelement location="${hadoop.root}/build/examples"/> <path refid="contrib-classpath"/> </path> <!-- The system test classpath --> <path id="test.system.classpath"> <pathelement location="${hadoop.root}/src/contrib/${name}/src/test/system" /> <pathelement location="${build.test.system}" /> <pathelement location="${build.test.system}/classes"/> <pathelement location="${build.examples}"/> <pathelement location="${hadoop.root}/build-fi/system/classes" /> <pathelement location="${hadoop.root}/build-fi/system/test/classes" /> <pathelement location="${hadoop.root}/build-fi" /> <pathelement location="${hadoop.root}/build-fi/tools" /> <pathelement location="${hadoop.home}"/> <pathelement location="${hadoop.conf.dir}"/> <pathelement location="${hadoop.conf.dir.deployed}"/> <pathelement location="${hadoop.root}/build"/> <pathelement location="${hadoop.root}/build/examples"/> <pathelement location="${hadoop.root}/build-fi/test/classes" /> <path refid="contrib-classpath"/> <fileset dir="${hadoop.root}/src/test/lib"> <include name="**/*.jar" /> <exclude name="**/excluded/" /> </fileset> <fileset dir="${hadoop.root}/build-fi/system"> <include name="**/*.jar" /> <exclude name="**/excluded/" /> </fileset> <fileset dir="${hadoop.root}/build-fi/test/testjar"> <include name="**/*.jar" /> <exclude name="**/excluded/" /> </fileset> <fileset dir="${hadoop.root}/build/contrib/${name}"> <include name="**/*.jar" /> <exclude name="**/excluded/" /> </fileset> </path> <!-- to be overridden by sub-projects --> <target name="check-contrib"/> <target name="init-contrib"/> <!-- 
====================================================== --> <!-- Stuff needed by all targets --> <!-- ====================================================== --> <target name="init" depends="check-contrib" unless="skip.contrib"> <echo message="contrib: ${name}"/> <mkdir dir="${build.dir}"/> <mkdir dir="${build.classes}"/> <mkdir dir="${build.test}"/> <!-- The below two tags added for contrib system tests --> <mkdir dir="${build.test.system}"/> <mkdir dir="${build.system.classes}"/> <mkdir dir="${build.examples}"/> <mkdir dir="${hadoop.log.dir}"/> <antcall target="init-contrib"/> </target> <!-- ====================================================== --> <!-- Compile a Hadoop contrib's files --> <!-- ====================================================== --> <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib"> <echo message="contrib: ${name}"/> <javac encoding="${build.encoding}" srcdir="${src.dir}" includes="**/*.java" destdir="${build.classes}" debug="${javac.debug}" deprecation="${javac.deprecation}"> <classpath refid="contrib-classpath"/> </javac> </target> <!-- ======================================================= --> <!-- Compile a Hadoop contrib's example files (if available) --> <!-- ======================================================= --> <target name="compile-examples" depends="compile" if="examples.available"> <echo message="contrib: ${name}"/> <javac encoding="${build.encoding}" srcdir="${src.examples}" includes="**/*.java" destdir="${build.examples}" debug="${javac.debug}"> <classpath refid="contrib-classpath"/> </javac> </target> <!-- ================================================================== --> <!-- Compile test code --> <!-- ================================================================== --> <target name="compile-test" depends="compile-examples" if="test.available"> <echo message="contrib: ${name}"/> <javac encoding="${build.encoding}" srcdir="${src.test}" includes="**/*.java" excludes="system/**/*.java" 
destdir="${build.test}" debug="${javac.debug}"> <classpath refid="test.classpath"/> </javac> </target> <!-- ================================================================== --> <!-- Compile system test code --> <!-- ================================================================== --> <target name="compile-test-system" depends="compile-examples" if="test.system.available"> <echo message="contrib: ${name}"/> <javac encoding="${build.encoding}" srcdir="${src.test.system}" includes="**/*.java" destdir="${build.system.classes}" debug="${javac.debug}"> <classpath refid="test.system.classpath"/> </javac> </target> <!-- ====================================================== --> <!-- Make a Hadoop contrib's jar --> <!-- ====================================================== --> <target name="jar" depends="compile" unless="skip.contrib"> <echo message="contrib: ${name}"/> <jar jarfile="${build.dir}/hadoop-${name}-${version}.jar" basedir="${build.classes}" /> </target> <!-- ====================================================== --> <!-- Make a Hadoop contrib's examples jar --> <!-- ====================================================== --> <target name="jar-examples" depends="compile-examples" if="examples.available" unless="skip.contrib"> <echo message="contrib: ${name}"/> <jar jarfile="${build.dir}/hadoop-${name}-examples-${version}.jar"> <fileset dir="${build.classes}"> </fileset> <fileset dir="${build.examples}"> </fileset> </jar> </target> <!-- ====================================================== --> <!-- Package a Hadoop contrib --> <!-- ====================================================== --> <target name="package" depends="jar, jar-examples" unless="skip.contrib"> <mkdir dir="${dist.dir}/contrib/${name}"/> <copy todir="${dist.dir}/contrib/${name}" includeEmptyDirs="false" flatten="true"> <fileset dir="${build.dir}"> <include name="hadoop-${name}-${version}.jar" /> </fileset> </copy> </target> <!-- 
================================================================== --> <!-- Run unit tests --> <!-- ================================================================== --> <target name="test" depends="compile-test, compile" if="test.available"> <echo message="contrib: ${name}"/> <delete dir="${hadoop.log.dir}"/> <mkdir dir="${hadoop.log.dir}"/> <junit printsummary="yes" showoutput="${test.output}" haltonfailure="no" fork="yes" maxmemory="512m" errorProperty="tests.failed" failureProperty="tests.failed" timeout="${test.timeout}"> <sysproperty key="test.build.data" value="${build.test}/data"/> <sysproperty key="build.test" value="${build.test}"/> <sysproperty key="src.test.data" value="${src.test.data}"/> <sysproperty key="contrib.name" value="${name}"/> <!-- requires fork=yes for: relative File paths to use the specified user.dir classpath to use build/contrib/*.jar --> <sysproperty key="user.dir" value="${build.test}/data"/> <sysproperty key="fs.default.name" value="${fs.default.name}"/> <sysproperty key="hadoop.test.localoutputfile" value="${hadoop.test.localoutputfile}"/> <sysproperty key="hadoop.log.dir" value="${hadoop.log.dir}"/> <sysproperty key="taskcontroller-path" value="${taskcontroller-path}"/> <sysproperty key="taskcontroller-ugi" value="${taskcontroller-ugi}"/> <classpath refid="test.classpath"/> <formatter type="${test.junit.output.format}" /> <batchtest todir="${build.test}" unless="testcase"> <fileset dir="${src.test}" includes="**/Test*.java" excludes="**/${test.exclude}.java, system/**/*.java" /> </batchtest> <batchtest todir="${build.test}" if="testcase"> <fileset dir="${src.test}" includes="**/${testcase}.java" excludes="system/**/*.java" /> </batchtest> </junit> <antcall target="checkfailure"/> </target> <!-- ================================================================== --> <!-- Run system tests --> <!-- ================================================================== --> <target name="test-system" depends="compile, compile-test-system, 
jar" if="test.system.available"> <delete dir="${build.test.system}/extraconf"/> <mkdir dir="${build.test.system}/extraconf"/> <property name="test.src.dir" location="${hadoop.root}/src/test"/> <property name="test.junit.printsummary" value="yes" /> <property name="test.junit.haltonfailure" value="no" /> <property name="test.junit.maxmemory" value="512m" /> <property name="test.junit.fork.mode" value="perTest" /> <property name="test.all.tests.file" value="${test.src.dir}/all-tests" /> <property name="test.build.dir" value="${hadoop.root}/build/test"/> <property name="basedir" value="${hadoop.root}"/> <property name="test.timeout" value="900000"/> <property name="test.junit.output.format" value="plain"/> <property name="test.tools.input.dir" value="${basedir}/src/test/tools/data"/> <property name="c++.src" value="${basedir}/src/c++"/> <property name="test.include" value="Test*"/> <property name="c++.libhdfs.src" value="${c++.src}/libhdfs"/> <property name="test.build.data" value="${build.test.system}/data"/> <property name="test.cache.data" value="${build.test.system}/cache"/> <property name="test.debug.data" value="${build.test.system}/debug"/> <property name="test.log.dir" value="${build.test.system}/logs"/> <patternset id="empty.exclude.list.id" /> <exec executable="sed" inputstring="${os.name}" outputproperty="nonspace.os"> <arg value="s/ /_/g"/> </exec> <property name="build.platform" value="${nonspace.os}-${os.arch}-${sun.arch.data.model}"/> <property name="build.native" value="${hadoop.root}/build/native/${build.platform}"/> <property name="lib.dir" value="${hadoop.root}/lib"/> <property name="install.c++.examples" value="${hadoop.root}/build/c++-examples/${build.platform}"/> <condition property="tests.testcase"> <and> <isset property="testcase" /> </and> </condition> <property name="test.junit.jvmargs" value="-ea" /> <macro-system-test-runner test.file="${test.all.tests.file}" classpath="test.system.classpath" test.dir="${build.test.system}" 
fileset.dir="${hadoop.root}/src/contrib/${name}/src/test/system" hadoop.conf.dir.deployed="${hadoop.conf.dir.deployed}"> </macro-system-test-runner> </target> <macrodef name="macro-system-test-runner"> <attribute name="test.file" /> <attribute name="classpath" /> <attribute name="test.dir" /> <attribute name="fileset.dir" /> <attribute name="hadoop.conf.dir.deployed" default="" /> <sequential> <delete dir="@{test.dir}/data"/> <mkdir dir="@{test.dir}/data"/> <delete dir="@{test.dir}/logs"/> <mkdir dir="@{test.dir}/logs"/> <copy file="${test.src.dir}/hadoop-policy.xml" todir="@{test.dir}/extraconf" /> <copy file="${test.src.dir}/fi-site.xml" todir="@{test.dir}/extraconf" /> <junit showoutput="${test.output}" printsummary="${test.junit.printsummary}" haltonfailure="${test.junit.haltonfailure}" fork="yes" forkmode="${test.junit.fork.mode}" maxmemory="${test.junit.maxmemory}" dir="${basedir}" timeout="${test.timeout}" errorProperty="tests.failed" failureProperty="tests.failed"> <jvmarg value="${test.junit.jvmargs}" /> <sysproperty key="java.net.preferIPv4Stack" value="true"/> <sysproperty key="test.build.data" value="@{test.dir}/data"/> <sysproperty key="test.tools.input.dir" value = "${test.tools.input.dir}"/> <sysproperty key="test.cache.data" value="${test.cache.data}"/> <sysproperty key="test.debug.data" value="${test.debug.data}"/> <sysproperty key="hadoop.log.dir" value="@{test.dir}/logs"/> <sysproperty key="test.src.dir" value="@{fileset.dir}"/> <sysproperty key="taskcontroller-path" value="${taskcontroller-path}"/> <sysproperty key="taskcontroller-ugi" value="${taskcontroller-ugi}"/> <sysproperty key="test.build.extraconf" value="@{test.dir}/extraconf" /> <sysproperty key="hadoop.policy.file" value="hadoop-policy.xml"/> <sysproperty key="java.library.path" value="${build.native}/lib:${lib.dir}/native/${build.platform}"/> <sysproperty key="install.c++.examples" value="${install.c++.examples}"/> <syspropertyset dynamic="no"> <propertyref name="hadoop.tmp.dir"/> 
</syspropertyset> <!-- set compile.c++ in the child jvm only if it is set --> <syspropertyset dynamic="no"> <propertyref name="compile.c++"/> </syspropertyset> <!-- Pass probability specifications to the spawn JVM --> <syspropertyset id="FaultProbabilityProperties"> <propertyref regex="fi.*"/> </syspropertyset> <sysproperty key="test.system.hdrc.deployed.hadoopconfdir" value="@{hadoop.conf.dir.deployed}" /> <classpath refid="@{classpath}"/> <formatter type="${test.junit.output.format}" /> <batchtest todir="@{test.dir}" unless="testcase"> <fileset dir="@{fileset.dir}" excludes="**/${test.exclude}.java aop/** system/**"> <patternset> <includesfile name="@{test.file}"/> </patternset> </fileset> </batchtest> <batchtest todir="@{test.dir}" if="testcase"> <fileset dir="@{fileset.dir}" includes="**/${testcase}.java"/> </batchtest> </junit> <antcall target="checkfailure"/> </sequential> </macrodef> <target name="checkfailure" if="tests.failed"> <touch file="${build.contrib.dir}/testsfailed"/> <fail unless="continueOnFailure">Contrib Tests failed!</fail> </target> <!-- ================================================================== --> <!-- Clean. 
Delete the build files, and their directories --> <!-- ================================================================== --> <target name="clean"> <echo message="contrib: ${name}"/> <delete dir="${build.dir}"/> </target> <target name="ivy-probe-antlib" > <condition property="ivy.found"> <typefound uri="antlib:org.apache.ivy.ant" name="cleancache"/> </condition> </target> <target name="ivy-download" description="To download ivy " unless="offline"> <get src="${ivy_repo_url}" dest="${ivy.jar}" usetimestamp="true"/> </target> <!--target name="ivy-init-antlib" depends="ivy-download,ivy-probe-antlib" unless="ivy.found"--> <target name="ivy-init-antlib" depends="ivy-probe-antlib" unless="ivy.found"> <typedef uri="antlib:org.apache.ivy.ant" onerror="fail" loaderRef="ivyLoader"> <classpath> <pathelement location="${ivy.jar}"/> </classpath> </typedef> <fail > <condition > <not> <typefound uri="antlib:org.apache.ivy.ant" name="cleancache"/> </not> </condition> You need Apache Ivy 2.0 or later from http://ant.apache.org/ It could not be loaded from ${ivy_repo_url} </fail> </target> <target name="ivy-init" depends="ivy-init-antlib"> <ivy:configure settingsid="${ant.project.name}.ivy.settings" file="${ivysettings.xml}"/> </target> <target name="ivy-resolve-common" depends="ivy-init"> <ivy:resolve settingsRef="${ant.project.name}.ivy.settings" conf="common" log="${loglevel}"/> </target> <target name="ivy-retrieve-common" depends="ivy-resolve-common" description="Retrieve Ivy-managed artifacts for the compile/test configurations"> <ivy:retrieve settingsRef="${ant.project.name}.ivy.settings" pattern="${build.ivy.lib.dir}/${ivy.artifact.retrieve.pattern}" sync="true" log="${loglevel}"/> <ivy:cachepath pathid="${ant.project.name}.common-classpath" conf="common" /> </target> </project> build.xml is as follows <?xml version="1.0" encoding="UTF-8" standalone="no"?> <!-- Licensed to the Apache Software Foundation (ASF) under one or more contributor license agreements. 
See the NOTICE file distributed with this work for additional information regarding copyright ownership. The ASF licenses this file to You under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. --> <project default="jar" name="eclipse-plugin"> <import file="../build-contrib.xml"/> <path id="eclipse-sdk-jars"> <fileset dir="${eclipse.home}/plugins/"> <include name="org.eclipse.ui*.jar"/> <include name="org.eclipse.jdt*.jar"/> <include name="org.eclipse.core*.jar"/> <include name="org.eclipse.equinox*.jar"/> <include name="org.eclipse.debug*.jar"/> <include name="org.eclipse.osgi*.jar"/> <include name="org.eclipse.swt*.jar"/> <include name="org.eclipse.jface*.jar"/> <include name="org.eclipse.team.cvs.ssh2*.jar"/> <include name="com.jcraft.jsch*.jar"/> </fileset> </path> <path id="hadoop-sdk-jars"> <fileset dir="${hadoop.home}/share/hadoop/mapreduce"> <include name="hadoop*.jar"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/hdfs"> <include name="hadoop*.jar"/> </fileset> <fileset dir="${hadoop.home}/share/hadoop/common"> <include name="hadoop*.jar"/> </fileset> </path> <!-- Override classpath to include Eclipse SDK jars --> <path id="classpath"> <pathelement location="${build.classes}"/> <!--pathelement location="${hadoop.root}/build/classes"/--> <path refid="eclipse-sdk-jars"/> <path refid="hadoop-sdk-jars"/> </path> <!-- Skip building if eclipse.home is unset. 
--> <target name="check-contrib" unless="eclipse.home"> <property name="skip.contrib" value="yes"/> <echo message="eclipse.home unset: skipping eclipse plugin"/> </target> <target name="compile" depends="init, ivy-retrieve-common" unless="skip.contrib"> <echo message="contrib: ${name}"/> <javac encoding="${build.encoding}" srcdir="${src.dir}" includes="**/*.java" destdir="${build.classes}" debug="${javac.debug}" deprecation="${javac.deprecation}"> <classpath refid="classpath"/> </javac> </target> <!-- Override jar target to specify manifest --> <target name="jar" depends="compile" unless="skip.contrib"> <mkdir dir="${build.dir}/lib"/> <copy todir="${build.dir}/lib/" verbose="true"> <fileset dir="${hadoop.home}/share/hadoop/mapreduce"> <include name="hadoop*.jar"/> </fileset> </copy> <copy todir="${build.dir}/lib/" verbose="true"> <fileset dir="${hadoop.home}/share/hadoop/common"> <include name="hadoop*.jar"/> </fileset> </copy> <copy todir="${build.dir}/lib/" verbose="true"> <fileset dir="${hadoop.home}/share/hadoop/hdfs"> <include name="hadoop*.jar"/> </fileset> </copy> <copy todir="${build.dir}/lib/" verbose="true"> <fileset dir="${hadoop.home}/share/hadoop/yarn"> <include name="hadoop*.jar"/> </fileset> </copy> <copy todir="${build.dir}/classes" verbose="true"> <fileset dir="${root}/src/java"> <include name="*.xml"/> </fileset> </copy> <copy file="${hadoop.home}/share/hadoop/common/lib/protobuf-java-${protobuf.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/log4j-${log4j.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/commons-configuration-${commons-configuration.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/commons-lang-${commons-lang.version}.jar" 
todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/commons-collections-${commons-collections.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-core-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/jackson-mapper-asl-${jackson.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-log4j12-${slf4j-log4j12.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/slf4j-api-${slf4j-api.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/guava-${guava.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/hadoop-auth-${hadoop.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/commons-cli-${commons-cli.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/netty-${netty.version}.jar" todir="${build.dir}/lib" verbose="true"/> <copy file="${hadoop.home}/share/hadoop/common/lib/htrace-core-${htrace.version}.jar" todir="${build.dir}/lib" verbose="true"/> <jar jarfile="${build.dir}/hadoop-${name}-${hadoop.version}.jar" manifest="${root}/META-INF/MANIFEST.MF"> <manifest> <attribute name="Bundle-ClassPath" value="classes/, lib/hadoop-mapreduce-client-core-${hadoop.version}.jar, lib/hadoop-mapreduce-client-common-${hadoop.version}.jar, lib/hadoop-mapreduce-client-jobclient-${hadoop.version}.jar, lib/hadoop-auth-${hadoop.version}.jar, lib/hadoop-common-${hadoop.version}.jar, lib/hadoop-hdfs-${hadoop.version}.jar, lib/protobuf-java-${protobuf.version}.jar, lib/log4j-${log4j.version}.jar, lib/commons-cli-${commons-cli.version}.jar, 
lib/commons-configuration-${commons-configuration.version}.jar, lib/commons-httpclient-${commons-httpclient.version}.jar, lib/commons-lang-${commons-lang.version}.jar, lib/commons-collections-${commons-collections.version}.jar, lib/jackson-core-asl-${jackson.version}.jar, lib/jackson-mapper-asl-${jackson.version}.jar, lib/slf4j-log4j12-${slf4j-log4j12.version}.jar, lib/slf4j-api-${slf4j-api.version}.jar, lib/guava-${guava.version}.jar, lib/netty-${netty.version}.jar, lib/htrace-core-${htrace.version}.jar lib/servlet-api-${servlet-api.version}.jar, lib/commons-io-${commons-io.version}.jar, lib/htrace-core-${htrace.version}-incubating.jar"/> </manifest> <fileset dir="${build.dir}" includes="classes/ lib/"/> <!--fileset dir="${build.dir}" includes="*.xml"/--> <fileset dir="${root}" includes="resources/ plugin.xml"/> </jar> </target> </project> Could the experts here take a look at what's causing this? Line 493 of the code fails to parse. It's urgent.

How can garbled Eclipse console output be fixed for good while keeping a UTF-8 workspace?

My Eclipse workspace encoding is UTF-8, but when I enter Chinese characters from the console and print them back out, they come out garbled. I found this fix online:
1. Run -> Run Configurations...
2. Under Java Application, select your application.
3. Open the Common tab.
4. Under Console encoding, choose Other, then select GBK.
5. Click Apply.
This does work, but unfortunately only for one file or one project; it does nothing for newly created projects. Another option is to switch the workspace encoding back to GBK, but that isn't appropriate, because everyone else's workspace is also UTF-8 and I need to stay consistent with them. Yet their console input and output work fine, and since we're all beginners, they don't know why either. How can I get normal console input and output without having to switch every project to GBK at run time? Thank you!!

Projects in IDEA's Edit Configurations keep disappearing

The Tomcat projects I set up in IDEA's Edit Configurations disappear as soon as I close IDEA, and having to reconfigure them every time is a pain; each window only keeps one project. ![screenshot](https://img-ask.csdn.net/upload/201908/16/1565917407_730398.png) ![screenshot](https://img-ask.csdn.net/upload/201908/16/1565917438_579555.png) The second project is set up like this, basically identical to the first ![screenshot](https://img-ask.csdn.net/upload/201908/16/1565917390_500698.png) After saving, the second project is still there ![screenshot](https://img-ask.csdn.net/upload/201908/16/1565917419_99356.png) But as soon as I close IDEA and reopen it, I'm back down to one ![screenshot](https://img-ask.csdn.net/upload/201908/16/1565917569_701562.png) I used to keep four or five saved without any problem; I don't know what's changed recently. I have a lot of things to launch, and redoing the setup every time is a real hassle.

What's the difference between the Tomcat configured in Maven's pom file and the one configured in Edit Configurations? Put differently, which is better?

Surely I don't need both? ![screenshot](https://img-ask.csdn.net/upload/202001/18/1579281213_887143.jpg) ![screenshot](https://img-ask.csdn.net/upload/202001/18/1579281262_949938.jpg)

Eclipse problems on a MacBook Pro (Android)

My environment is a MacBook Pro, Eclipse Mars, and an Android setup, as shown below: Figure 1 ![screenshot](https://img-ask.csdn.net/upload/201602/23/1456211217_944829.png) Figure 2 ![screenshot](https://img-ask.csdn.net/upload/201602/23/1456211239_55643.png) I've hit a couple of puzzling problems with this environment. First problem: when debugging an Android program I can't select my phone. Specifically, right-clicking the Android project and choosing Debug As -> Debug Configurations brings up the dialog below: Figure 3 ![screenshot](https://img-ask.csdn.net/upload/201602/23/1456211488_142420.png) As you may notice, the middle option, "launch on all compatible devices/avd's", is greyed out and can't be selected. Also, as shown, if I pick the first option and continue debugging, this appears: Figure 4 ![screenshot](https://img-ask.csdn.net/upload/201602/23/1456211654_773730.png) There is no phone in the list, even though my phone is in fact connected. Stranger still, although Figure 4 shows no phone, if I click in the blank area of the device list the program will run on the phone, and this behaviour comes and goes. What I'd like is for the middle option in Figure 3 to become selectable, or for Figure 4 to show the device list. Second problem: this Android environment can't package an APK. See Figure 5: ![screenshot](https://img-ask.csdn.net/upload/201602/23/1456213330_408117.png) I definitely have a keystore; I choose Location to import it and type the password, but oddly the third field, Confirm, can't be clicked, never gets focus, and accepts no input. Has anyone run into either of these problems? Any pointers would be appreciated.

context.xml configured in META-INF has no effect on Tomcat 8

Summary of the problem: I'm developing a web application with Tomcat 8 and Java 8. The project is named myweb, so it can be reached at localhost:8080/myweb. I'd like to change the context path so that localhost:8080/123 also works. Adding a <Context> element with path="/123" inside the <Host> element of conf\server.xml meets the requirement. But since server.xml isn't reloaded dynamically, I don't want to add the <Context> there. I tried setting path="/123" in conf\context.xml, but that has no effect. Then, after more research, I wrote a myweb.xml containing a <Context> element and put it in conf\Catalina\localhost\; that has no effect either. I also used Eclipse to add a context.xml to the web project's META-INF folder, again with no effect. In addition, people online say that if you put a WAR file in the webapps folder and start Tomcat, it copies the context.xml under META-INF into conf\Catalina\localhost\, but my Tomcat 8 doesn't copy it. That's another problem. So my question is: how do I make this context.xml take effect? My context.xml file is as follows: <?xml version="1.0" encoding="UTF-8"?> <Context path="/123" docBase="myweb" reloadable="true"> <WatchedResource>WEB-INF/web.xml</WatchedResource> <WatchedResource>${catalina.base}/conf/web.xml</WatchedResource> </Context>
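One detail worth checking: for a per-application descriptor under conf\Catalina\localhost\, Tomcat derives the context path from the file name and ignores any path attribute (path may only be set on a Context defined statically in server.xml). So the descriptor for /123 has to be named 123.xml, not myweb.xml. A sketch, assuming myweb is deployed under webapps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Save as conf/Catalina/localhost/123.xml; the context path /123
     comes from the file name, so no path attribute is set here -->
<Context docBase="${catalina.base}/webapps/myweb" reloadable="true">
    <WatchedResource>WEB-INF/web.xml</WatchedResource>
</Context>
```

The same rule explains why path="/123" in META-INF/context.xml does nothing: there the context path comes from the WAR or directory name.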

Exception when creating a new Java project in Eclipse

![screenshot](https://img-ask.csdn.net/upload/201509/19/1442670944_717272.png) I haven't worked on a Java project in a long time; today when I tried, this is what happened. Does anyone know what's going on?

Problem running a Golang Hello World in Eclipse

Alright, so just a disclaimer: I suspect this question will be a duplicate of another question, however I'm not even sure what to search for.

I have never used Eclipse or Golang before and am attempting to get a basic hello world application to work.

I have installed the goclipse plugin, created a new go package and go command source file. From what I have read, to run a project in Eclipse you right-click the package, select Run As, then set the run configurations. The problem occurs when I attempt to select the go package, as none shows up, and if I leave it blank it throws a 'Go package not found' exception.

![enter image description here](https://i.stack.imgur.com/2MsYs.png)

Thank you for any help you can provide.

EDIT: Upon the answers' advice I have decided to go with the basic command line; however, a friend did also recommend LiteIDE. I will "assume" tmichels' answer is correct in regards to getting Go to work within Eclipse.

Why does Android Studio open Edit Configurations instead of running when I click Run?

Why does clicking Run in Android Studio do nothing except show Edit Configurations? This is my first time using Android Studio. Any help would be appreciated, thanks.

In IntelliJ IDEA, what is the difference between configuring Maven in Settings and in a Run Configuration?

![图片说明](https://img-ask.csdn.net/upload/201906/20/1560996066_121721.png) ![图片说明](https://img-ask.csdn.net/upload/201906/20/1560996121_177483.png)

As shown above, both Settings and the run configuration let you set the Maven settings file and repository location, but I don't understand the difference between them. Configuring it in the run configuration seems to have no effect; the project still uses the configuration from Settings. Could someone explain? Thanks!

How do I make a Tomcat run configuration in IDEA reference another project's resource directory?

I moved from Eclipse to IDEA and many of the settings are unfamiliar. My workspace has two projects: the main Java web project, and a second project, SourceProject, which is not accessed over the web; its root directory config holds a large number of configuration files. When Tomcat starts the main project, the main project can read the configuration files under the config directory.

In Eclipse, all I had to do was go to Run Configurations -> tomcat -> Arguments tab -> Working directory and select SourceProject. In code terms, that lets me read files in that directory directly:

```
File file = new File("config/day/yyyy.xml");
```

![图片说明](https://img-ask.csdn.net/upload/201810/30/1540867020_657460.jpg)

In IDEA, though, I'm stuck: I can't find where to add this in Edit Configurations. Can anyone familiar with this point me in the right direction?
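Whatever the IDE, the relative-path behavior above depends only on the JVM's working directory (the `user.dir` system property), so printing it at startup is a reliable way to check what a run configuration actually resolved. A small sketch using the asker's example path (whether your IDEA version exposes a working-directory field for a Tomcat run configuration is an assumption to verify; for plain Application configurations the field exists on the main tab):

```java
import java.io.File;

public class WorkingDirCheck {
    public static void main(String[] args) {
        // Relative paths like "config/day/yyyy.xml" are resolved against
        // the JVM's working directory, which is what the IDE's
        // "Working directory" field sets.
        File file = new File("config/day/yyyy.xml");
        System.out.println("user.dir = " + System.getProperty("user.dir"));
        // The absolute form of a relative path always starts with user.dir:
        System.out.println(file.getAbsolutePath()
                .startsWith(System.getProperty("user.dir")));
    }
}
```

Running this inside the server's launch configuration shows immediately whether `config/...` will be found; the second line prints `true` by construction.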

Running any main method in my Eclipse project shows this prompt

![图片说明](https://img-ask.csdn.net/upload/201802/08/1518061463_6747.png) It's just an ordinary main method; pick any class, and as long as I run it inside this project, this prompt appears. What is the cause, and how do I fix it?

How do I run an HTML project containing PHP files under Tomcat 7 in the Eclipse IDE?

I have searched everywhere but I can't find an answer to my question.

I have Eclipse Juno installed on Arch Linux, with Apache Tomcat 7 as a server and PHPEclipse for writing PHP scripts.

I have created a project with various HTML pages, but I need some PHP scripts to run inside this project. I installed PHPEclipse for this reason, and in run configurations I have set the PHP executable directory, but if I use the internal Eclipse browser the PHP doesn't get parsed at all; it appears as regular text. If I access the page in an external web browser (Firefox) I get a white screen.

I am going really mad over this problem; can somebody give me some help?

Eclipse PDT appends the project name to the virtual host path

Can someone help me out? I run into a problem when I run a file (e.g. index.php). Every time I run a file on the test server (XAMPP), Eclipse (PDT) adds the project name (e.g. testproject) after the server name (e.g. `http://testproject.dev`). Because I have set up a virtual host to automatically use a specific path on the server (`http://testproject.dev` is linked to `http://localhost/testproject`), this creates a problem: Eclipse adds the project name, and the URL becomes `http://testproject.dev/testproject/index.php`.

These are my configurations:

XAMPP httpd.conf (c:\xampp\conf\httpd.conf):

```
# Virtual hosts
Include "conf/extra/httpd-vhosts.conf"
```

httpd-vhosts.conf (c:\xampp\conf\extra\httpd-vhosts.conf):

```
NameVirtualHost 127.0.0.1
<VirtualHost 127.0.0.1>
    DocumentRoot "C:/xampp/htdocs/testproject"
    ServerName testproject.dev
</VirtualHost>
```

Windows 7 hosts file (C:\Windows\System32\drivers\etc\hosts):

```
127.0.0.1 testproject.dev
```

Eclipse -> Preferences -> PHP Server:

- Tab "Server": Name: "Development_Server_Testproject", URL: `http://testproject.dev`
- Tab "Path Mapping": Path on server: `http://testproject.dev`, Path in workspace: /testproject

Zend Debugger in Eclipse on OS X 10.7

I am trying to get Zend Debugger working in Eclipse.

- OS X 10.7
- MAMP 2.0.5 using PHP 5.3.6, port 8888; phpinfo says I have Zend_Debugger configured; allowed hosts is 127.0.0.1/32
- Eclipse Helios; Zend is the default debugger

When I try to test the debugger in Debug Configurations, I get a timeout error. When I actually try to debug a page, the page loads without breaking into the debugger (i.e., it does find the server and files).

Suggestions?
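For context, Zend Debugger is configured through php.ini directives; a sketch of the relevant block is below. The extension path is hypothetical and must match your actual MAMP install, and whether `expose_remotely` is needed for this setup is an assumption; the `allow_hosts` value mirrors what the asker reported:

```ini
[Zend]
; hypothetical path - point this at the ZendDebugger.so shipped for your PHP build
zend_extension=/Applications/MAMP/bin/php/php5.3.6/lib/php/extensions/ZendDebugger.so
; hosts allowed to start a debug session (the asker's current value)
zend_debugger.allow_hosts=127.0.0.1/32
zend_debugger.expose_remotely=always
```

A timeout when testing from Debug Configurations often means the IDE's client port (1013x range by default in Eclipse's debug settings) is blocked or mismatched, so that is worth checking alongside `allow_hosts`.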
