What is the difference between go test's two flags, -parallel and -test.parallel, and which one takes precedence?

-parallel n
            Allow parallel execution of test functions that call t.Parallel.
            The value of this flag is the maximum number of tests to run
            simultaneously; by default, it is set to the value of GOMAXPROCS.
            Note that -parallel only applies within a single test binary.
            The 'go test' command may run tests for different packages
            in parallel as well, according to the setting of the -p flag
            (see 'go help build').

The documentation above says that the number of tests run in parallel equals GOMAXPROCS if nothing is provided, but that is not the behavior I see. I am running the tests on a machine that has only 4 cores, yet 8 tests run in parallel, so the behavior looks more like the following:

-test.parallel int
        maximum test parallelism (default 8)

So what is the difference between the two, and when should I use which flag?

More Information

I am running all the tests in a single package that has 9 tests; all of them run in parallel, and all of them live in a single test function.

drzfz9995
I don't think the code is relevant at all; this is a simple question about a documented flag of the go test command, not something specific to the code being run.
nearly 3 years ago

dongyan1974
I think this is actually the correct behavior: I was running end-to-end tests against a cluster from a VM with 4 cores, but my host machine has 8 cores, so the feature seems fine. Sorry for the noise.
nearly 3 years ago

dongshijiao6890
You have to show a) the code and b) how you execute the tests (via go test, or by running the compiled test binary), and include the value of GOMAXPROCS (both in the environment and during execution).
nearly 3 years ago

1 Answer




The -test. flags are generated by the go test command. go test builds a pkg.test binary on the fly and runs it with modified arguments: every recognized argument passed to go test is converted. So, in your case, -parallel n becomes -test.parallel n.

So this command:

go test -parallel 6

creates:

pkg.test -test.parallel 6

dongqing904999
Thanks for the explanation!
nearly 3 years ago

flag.PrintDefaults includes `-test` flags

I am building a CLI application in Go.

```go
flag.IntVar(&connections, "c", 1, "Connections to keep open per endpoint")
flag.IntVar(&duration, "T", 10, "Exit after the specified amount of time in seconds")
flag.IntVar(&txsRate, "r", 1000, "Txs per second to send in a connection")
flag.BoolVar(&verbose, "v", false, "Verbose output")

flag.Usage = func() {
	fmt.Println(`....`)
	fmt.Println("Flags:")
	flag.PrintDefaults()
}

flag.Parse()

if flag.NArg() == 0 {
	flag.Usage()
	os.Exit(1)
}
```

([Full listing on Github](https://github.com/tendermint/tools/blob/develop/tm-bench/main.go))

For some strange reason, the above snippet produces:

```
Flags:
  -T int
    	Exit after the specified amount of time in seconds (default 10)
  -c int
    	Connections to keep open per endpoint (default 1)
  -r int
    	Txs per second to send in a connection (default 1000)
  -test.bench regexp
    	run only benchmarks matching regexp
  -test.benchmem
    	print memory allocations for benchmarks
  -test.benchtime d
    	run each benchmark for duration d (default 1s)
  -test.blockprofile file
    	write a goroutine blocking profile to file
  -test.blockprofilerate rate
    	set blocking profile rate (see runtime.SetBlockProfileRate) (default 1)
  -test.count n
    	run tests and benchmarks n times (default 1)
  -test.coverprofile file
    	write a coverage profile to file
  -test.cpu list
    	comma-separated list of cpu counts to run each test with
  -test.cpuprofile file
    	write a cpu profile to file
  -test.memprofile file
    	write a memory profile to file
  -test.memprofilerate rate
    	set memory profiling rate (see runtime.MemProfileRate)
  -test.mutexprofile string
    	write a mutex contention profile to the named file after execution
  -test.mutexprofilefraction int
    	if >= 0, calls runtime.SetMutexProfileFraction() (default 1)
  -test.outputdir dir
    	write profiles to dir
  -test.parallel n
    	run at most n tests in parallel (default 2)
  -test.run regexp
    	run only tests and examples matching regexp
  -test.short
    	run smaller test suite to save time
  -test.timeout d
    	fail test binary execution after duration d (0 means unlimited)
  -test.trace file
    	write an execution trace to file
  -test.v
    	verbose: print additional output
  -v
    	Verbose output
```

Any ideas why Go includes `-test` flags? Thanks!

Nested Kettle JOBs (a job calling other jobs) run fine on Windows but fail after migrating to a Linux server

Runs fine on Windows:
![screenshot](https://img-ask.csdn.net/upload/201907/29/1564389024_347344.png)
On Linux it fails with:
```
/usr/local/kettle/data-integration/spoon.sh: line 140: ldconfig: command not found ####################################################################### WARNING: no libwebkitgtk-1.0 detected, some features will be unavailable Consider installing the package with apt-get or yum. e.g. 'sudo apt-get install libwebkitgtk-1.0-0' ####################################################################### Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0 log4j:WARN Continuable parsing error 45 and column 76 log4j:WARN Element type "rollingPolicy" must be declared. log4j:WARN Continuable parsing error 52 and column 14 log4j:WARN The content of element type "appender" must match "(errorHandler?,param*,layout?,filter*,appender-ref*)". log4j:WARN Please set a rolling policy for the RollingFileAppender named 'pdi-execution-appender' 16:11:29,469 INFO [KarafBoot] Checking to see if org.pentaho.clean.karaf.cache is enabled 16:11:29,646 INFO [KarafInstance] ******************************************************************************* *** Karaf Instance Number: 1 at /usr/local/kettle/data-integration/./system *** *** /karaf/caches/kitchen/data-1 *** *** FastBin Provider Port:52901 *** *** Karaf Port:8802 *** *** OSGI Service Port:9051 *** ******************************************************************************* Jul 29, 2019 4:11:32 PM org.apache.karaf.main.Main$KarafLockCallback lockAquired INFO: Lock acquired. Setting startlevel to 100 2019/07/29 16:11:36 - Kitchen - Start of run. 
2019/07/29 16:11:36 - RepositoriesMeta - Reading repositories XML file: /usr/local/kettle/.kettle/repositories.xml 2019-07-29 16:11:53.034:INFO:oejs.Server:jetty-8.1.15.v20140411 2019-07-29 16:11:53.106:INFO:oejs.AbstractConnector:Started NIOSocketConnectorWrapper@0.0.0.0:9051 Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/core Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/configuration/beans Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/configuration/parameterized-types Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/configuration/security Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://schemas.xmlsoap.org/wsdl/ Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://www.w3.org/2005/08/addressing Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://schemas.xmlsoap.org/ws/2004/08/addressing Jul 29, 2019 4:11:58 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-management (195) [org.apache.cxf.management.InstrumentationManager] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-wsdl (198) [org.apache.cxf.wsdl.WSDLManager] Jul 29, 2019 4:11:59 PM 
org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-bindings-xml (200) [org.apache.cxf.binding.xml.XMLBindingFactory, org.apache.cxf.binding.xml.wsdl11.XMLWSDLExtensionLoader] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-bindings-soap (201) [org.apache.cxf.binding.soap.SoapBindingFactory, org.apache.cxf.binding.soap.SoapTransportFactory] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/bindings/soap Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-transports-http (202) [org.apache.cxf.transport.http.HTTPTransportFactory, org.apache.cxf.transport.http.HTTPWSDLExtensionLoader, org.apache.cxf.transport.http.policy.HTTPClientAssertionBuilder, org.apache.cxf.transport.http.policy.HTTPServerAssertionBuilder, org.apache.cxf.transport.http.policy.NoOpPolicyInterceptorProvider] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-ws-policy (220) [org.apache.cxf.ws.policy.PolicyEngine, org.apache.cxf.policy.PolicyDataEngine, org.apache.cxf.ws.policy.AssertionBuilderRegistry, org.apache.cxf.ws.policy.PolicyInterceptorProviderRegistry, org.apache.cxf.ws.policy.PolicyBuilder, org.apache.cxf.ws.policy.PolicyAnnotationListener, org.apache.cxf.ws.policy.attachment.ServiceModelPolicyProvider, org.apache.cxf.ws.policy.attachment.external.DomainExpressionBuilderRegistry, org.apache.cxf.ws.policy.attachment.external.EndpointReferenceDomainExpressionBuilder, org.apache.cxf.ws.policy.attachment.external.URIDomainExpressionBuilder, 
org.apache.cxf.ws.policy.attachment.wsdl11.Wsdl11AttachmentPolicyProvider, org.apache.cxf.ws.policy.mtom.MTOMAssertionBuilder, org.apache.cxf.ws.policy.mtom.MTOMPolicyInterceptorProvider] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/transports/http/configuration Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/simple Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-frontend-jaxws (204) [org.apache.cxf.jaxws.context.WebServiceContextResourceResolver] Jul 29, 2019 4:11:59 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/jaxws Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/jaxrs Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/jaxrs-client Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/binding/coloc Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-transports-local (216) [org.apache.cxf.transport.local.LocalTransportFactory] Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-bindings-object (217) [org.apache.cxf.binding.object.ObjectBindingFactory] Jul 29, 2019 4:12:00 PM 
org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/blueprint/binding/object Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/policy Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://www.w3.org/ns/ws-policy Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://www.w3.org/2006/07/ws-policy Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://schemas.xmlsoap.org/ws/2004/09/policy Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://www.w3.org/2000/09/xmldsig# Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://docs.oasis-open.org/ws-sx/ws-securitypolicy/200702 Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-ws-addr (237) [org.apache.cxf.ws.addressing.policy.AddressingAssertionBuilder, org.apache.cxf.ws.addressing.policy.UsingAddressingAssertionBuilder, 
org.apache.cxf.ws.addressing.policy.AddressingPolicyInterceptorProvider, org.apache.cxf.ws.addressing.impl.AddressingWSDLExtensionLoader, org.apache.cxf.ws.addressing.WSAddressingFeature$WSAddressingFeatureApplier, org.apache.cxf.ws.addressing.MAPAggregator$MAPAggregatorLoader] Jul 29, 2019 4:12:00 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/ws/addressing Jul 29, 2019 4:12:01 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-ws-security (239) [org.apache.cxf.ws.security.policy.WSSecurityPolicyLoader, org.apache.cxf.ws.security.cache.CacheCleanupListener] Jul 29, 2019 4:12:01 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-ws-rm (241) [org.apache.cxf.ws.rm.RMManager, org.apache.cxf.ws.rm.policy.RMPolicyInterceptorProvider, org.apache.cxf.ws.rm.policy.RM10AssertionBuilder, org.apache.cxf.ws.rm.policy.RM12AssertionBuilder, org.apache.cxf.ws.rm.policy.WSRMP12PolicyLoader, org.apache.cxf.ws.rm.policy.MC11PolicyLoader, org.apache.cxf.ws.rm.policy.RSPPolicyLoader] Jul 29, 2019 4:12:01 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://cxf.apache.org/ws/rm/manager Jul 29, 2019 4:12:01 PM org.apache.cxf.bus.blueprint.NamespaceHandlerRegisterer register INFO: Registered blueprint namespace handler for http://schemas.xmlsoap.org/ws/2005/02/rm/policy Jul 29, 2019 4:12:01 PM org.apache.cxf.bus.osgi.CXFExtensionBundleListener addExtensions INFO: Adding the extensions from bundle org.apache.cxf.cxf-rt-javascript (242) [org.apache.cxf.javascript.JavascriptServerListener] Jul 29, 2019 4:12:01 PM org.pentaho.caching.impl.PentahoCacheManagerFactory$RegistrationHandler$1 onSuccess INFO: New Caching Service registered SLF4J: Class path contains multiple SLF4J 
bindings. SLF4J: Found binding in [jar:file:/usr/local/kettle/data-integration/launcher/../lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/local/kettle/data-integration/plugins/pentaho-big-data-plugin/lib/slf4j-log4j12-1.7.7.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 2019/07/29 16:12:08 - test - Start of job execution log4j:ERROR No output stream or file set for the appender named [pdi-execution-appender]. 2019/07/29 16:12:08 - test - Starting entry [job1] 2019/07/29 16:12:08 - test - Launched job entry [job1] in parallel. 2019/07/29 16:12:08 - test - Starting entry [job2] 2019/07/29 16:12:08 - test - Launched job entry [job2] in parallel. 2019/07/29 16:12:08 - job2 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : Error running job entry 'job' : 2019/07/29 16:12:08 - job1 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : Error running job entry 'job' : 2019/07/29 16:12:08 - job2 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : org.pentaho.di.core.exception.KettleException: 2019/07/29 16:12:08 - job2 - Unexpected error during job metadata load 2019/07/29 16:12:08 - job2 - at java.lang.Thread.run (Thread.java:748) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job$1.run (Job.java:798) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job.access$000 (Job.java:121) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job.execute (Job.java:680) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.execute (JobEntryJob.java:667) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta (JobEntryJob.java:1343) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta 
(JobEntryJob.java:1381) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMetaFromRepository (JobEntryJob.java:1353) 2019/07/29 16:12:08 - job2 - 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1421) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1343) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:667) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job.execute(Job.java:680) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job.access$000(Job.java:121) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.Job$1.run(Job.java:798) 2019/07/29 16:12:08 - job2 - at java.lang.Thread.run(Thread.java:748) 2019/07/29 16:12:08 - job2 - Caused by: java.lang.NullPointerException 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMetaFromRepository(JobEntryJob.java:1353) 2019/07/29 16:12:08 - job2 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1381) 2019/07/29 16:12:08 - job2 - ... 
6 more 2019/07/29 16:12:08 - job1 - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : org.pentaho.di.core.exception.KettleException: 2019/07/29 16:12:08 - job1 - Unexpected error during job metadata load 2019/07/29 16:12:08 - job1 - at java.lang.Thread.run (Thread.java:748) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job$1.run (Job.java:798) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job.access$000 (Job.java:121) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job.execute (Job.java:680) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.execute (JobEntryJob.java:667) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta (JobEntryJob.java:1343) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta (JobEntryJob.java:1381) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMetaFromRepository (JobEntryJob.java:1353) 2019/07/29 16:12:08 - job1 - 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1421) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1343) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.execute(JobEntryJob.java:667) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job.execute(Job.java:680) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job.access$000(Job.java:121) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.Job$1.run(Job.java:798) 2019/07/29 16:12:08 - job1 - at java.lang.Thread.run(Thread.java:748) 2019/07/29 16:12:08 - job1 - Caused by: java.lang.NullPointerException 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMetaFromRepository(JobEntryJob.java:1353) 2019/07/29 16:12:08 - job1 - at org.pentaho.di.job.entries.job.JobEntryJob.getJobMeta(JobEntryJob.java:1381) 2019/07/29 16:12:08 - job1 - ... 
6 more 2019/07/29 16:12:08 - test - Job execution finished 2019/07/29 16:12:08 - Kitchen - Finished! 2019/07/29 16:12:08 - Kitchen - ERROR (version 8.2.0.0-342, build 8.2.0.0-342 from 2018-11-14 10.30.55 by buildguy) : Finished with errors 2019/07/29 16:12:08 - Kitchen - Start=2019/07/29 16:11:36.193, Stop=2019/07/29 16:12:08.071 2019/07/29 16:12:08 - Kitchen - Processing ended after 31 seconds.
```
On Linux, jobs that call transformations run without problems, but jobs that call other jobs fail with the errors above. Could someone point out where the problem is? The Kettle version is 8.2 and the JDK is java version "1.8.0_171". If any other details are needed, I can upload them.

Testing database interactions in Golang

I have an API that has a storage layer. It only does the database interactions and performs the CRUD operations. Now I want to test these functions. In my path API/storage/, I have different packages containing functions to interact with different tables in the same database. Tables A, B and C are in the same database.

My file hierarchy goes like:

```
--api
--storage
--A
--A.go
--A_test.go
--B
--C
--server
--A
--testData
--A.sql
--B.sql
```

In this way I want to test the whole storage layer using the command

```
go test ./...
```

The approach I was following is that I have a function **RefreshTables** which first truncates the table, then fills it with fixed test data that I keep in the testData folder. For truncating I do:

```go
db.Exec("SET FOREIGN_KEY_CHECKS = 0;")
db.Exec("truncate " + table)
db.Exec("SET FOREIGN_KEY_CHECKS = 1;")
```

As go test runs test functions of different packages in parallel by default, multiple sql connections get created, and **truncate** runs on one connection while **set foreign key** runs on another connection picked randomly from the connection pool.

My tests fail when run together, but they all pass when run alone or package by package.

If I do:

```
go test ./... -p 1
```

which makes test functions run one by one, all the tests pass.

I have also tried using a transaction for the truncate and locking the table before the truncate.

I checked this article (https://medium.com/kongkow-it-medan/parallel-database-integration-test-on-go-application-8706b150ee2e), which suggests creating a separate database in every test function and dropping it after the function ends. I think this will be very time-consuming.

It would be really helpful if someone could suggest the best method for testing database interactions in Golang.

After putting hive-site.xml into spark/conf/ I get a pile of warnings — how do I handle them, and does it matter if I don't?

When I originally set things up I never noticed that I had forgotten to put the hive-site.xml config file into spark/conf. Today I put the file in, and now a pile of errors appears as soon as I open pyspark, and a pile of warnings appears when I use Spark SQL too. The warnings are below. Because the log is so long, I'm putting my question up front: I'd like to know how to raise the logging threshold for these messages, or how to resolve the warnings. Could someone please take a look? Thanks!
```shell
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 2019-11-14 17:13:50,994 WARN conf.HiveConf: HiveConf of name hive.metastore.client.capability.check does not exist 2019-11-14 17:13:50,994 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggregate.stats.false.positive.probability does not exist 2019-11-14 17:13:50,994 WARN conf.HiveConf: HiveConf of name hive.druid.broker.address.default does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.io.orc.time.counters does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.tez.task.scale.memory.reserve-fraction.min does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.orc.splits.ms.footer.cache.ppd.enabled does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.metastore.event.message.factory does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.server2.metrics.enabled does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.tez.hs2.user.access does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.druid.storage.storageDirectory does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.am.liveness.connection.timeout.ms does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.tez.dynamic.semijoin.reduction.threshold does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.client.connect.retry.limit does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.xmx.headroom does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.tez.dynamic.semijoin.reduction does not exist 
2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.direct does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.auto.enforce.stats does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.client.consistent.splits does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.server2.tez.session.lifetime does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.timedout.txn.reaper.start does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.cache.ttl does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.management.acl does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.delegation.token.lifetime does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.server2.authentication.ldap.guidKey does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.ats.hook.queue.capacity does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.strict.checks.large.query does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.tez.bigtable.minsize.semijoin.reduction does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.alloc.min does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.client.user does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.alloc.size does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.wait.queue.comparator.class.name does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.output.service.port does not exist 2019-11-14 17:13:50,995 WARN conf.HiveConf: HiveConf of name hive.orc.cache.use.soft.references does not exist 
2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.enabled does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.tez.task.scale.memory.reserve.fraction.max does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.task.communicator.listener.thread-count does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.tez.container.max.java.heap.fraction does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.stats.column.autogather does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.am.liveness.heartbeat.interval.ms does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.io.decoding.metrics.percentiles.intervals does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.groupby.position.alias does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.metastore.txn.store.impl does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.spark.use.groupby.shuffle does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.object.cache.enabled does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.server2.parallel.ops.in.session does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.groupby.limit.extrastep does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.server2.webui.use.ssl does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.service.metrics.file.location does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.client.retry.delay.seconds does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.materializedview.fileformat does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name 
hive.llap.daemon.num.file.cleaner.threads does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.test.fail.compaction does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.blobstore.use.blobstore.as.scratchdir does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.service.metrics.class does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.mmap.path does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.download.permanent.fns does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.server2.webui.max.historic.queries does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.vectorized.execution.reducesink.new.enabled does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.compactor.max.num.delta does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.compactor.history.retention.attempted does not exist 2019-11-14 17:13:50,996 WARN conf.HiveConf: HiveConf of name hive.server2.webui.port does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.compactor.initiator.failed.compacts.threshold does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.service.metrics.reporter does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.output.service.max.pending.writes does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.llap.execution.mode does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.llap.enable.grace.join.in.llap does not exist 2019-11-14 17:13:50,999 WARN conf.HiveConf: HiveConf of name hive.optimize.limittranspose does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.io.memory.mode does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: 
HiveConf of name hive.llap.io.threadpool.size does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.druid.select.threshold does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.scratchdir.lock does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.server2.webui.use.spnego does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.service.metrics.file.frequency does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.hs2.coordinator.enabled does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.task.scheduler.timeout.seconds does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.optimize.filter.stats.reduction does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.exec.orc.base.delta.ratio does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.metastore.fastpath does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.server2.clear.dangling.scratchdir does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.test.fail.heartbeater does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.file.cleanup.delay.seconds does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.management.rpc.port does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.mapjoin.hybridgrace.bloomfilter does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.auto.enforce.tree does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.metastore.stats.ndv.tuner does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.direct.sql.max.query.length does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.compactor.history.retention.failed 
does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.server2.close.session.on.disconnect does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.optimize.ppd.windowing does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.metastore.initial.metadata.count.enabled does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.server2.webui.host does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.orc.splits.ms.footer.cache.enabled does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.optimize.point.lookup.min does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.file.metadata.threads does not exist 2019-11-14 17:13:51,000 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.service.refresh.interval.sec does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.auto.max.output.size does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.driver.parallel.compilation does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.remote.token.requires.signing does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.tez.bucket.pruning does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.cache.allow.synthetic.fileid does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.hash.table.inflation.factor does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggr.stats.hbase.ttl does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.auto.enforce.vectorized does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.writeset.reaper.interval does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name 
hive.vectorized.use.vector.serde.deserialize does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.order.columnalignment does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.output.service.send.buffer.size does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.exec.schema.evolution does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.direct.sql.max.elements.values.clause does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.server2.llap.concurrent.queries does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.auto.allow.uber does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.druid.indexer.partition.size.max does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.auto.auth does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.orc.splits.include.fileid does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.communicator.num.threads does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.orderby.position.alias does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.task.communicator.connection.sleep.between.retries.ms does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggregate.stats.max.partitions does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.service.metrics.hadoop2.component does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.yarn.shuffle.port does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.direct.sql.max.elements.in.clause does not exist 2019-11-14 17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.druid.passiveWaitTimeMs does not exist 2019-11-14 
17:13:51,001 WARN conf.HiveConf: HiveConf of name hive.load.dynamic.partitions.thread does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.druid.indexer.segments.granularity does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.http.response.header.size does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.conf.internal.variable.list does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.optimize.limittranspose.reductionpercentage does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.repl.cm.enabled does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.client.retry.limit does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.resultset.serialize.in.tasks does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.enable.spark.execution.engine does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.query.timeout.seconds does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.service.metrics.hadoop2.frequency does not exist 2019-11-14 17:13:51,002 WARN conf.HiveConf: HiveConf of name hive.orc.splits.directory.batch.ms does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.cache.max.reader.wait does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.llap.task.scheduler.node.reenable.max.timeout.ms does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.max.open.txns does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.auto.convert.sortmerge.join.reduce.side does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.server2.zookeeper.publish.configs does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name 
hive.auto.convert.join.hashtable.max.entries does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.server2.tez.sessions.init.threads does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.metastore.authorization.storage.check.externaltable.drop does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.execution.mode does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.cbo.cnf.maxnodes does not exist 2019-11-14 17:13:51,004 WARN conf.HiveConf: HiveConf of name hive.vectorized.adaptor.usage.mode does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.materializedview.rewriting does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.server2.authentication.ldap.groupMembershipKey does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.catalog.cache.size does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.cbo.show.warnings does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.fshandler.threads does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.tez.max.bloom.filter.entries does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.io.metadata.fraction does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.materializedview.serde does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.task.scheduler.wait.queue.size does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggr.stats.cache.entries does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.txn.operational.properties does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggr.stats.memory.ttl does not exist 2019-11-14 17:13:51,005 
WARN conf.HiveConf: HiveConf of name hive.llap.daemon.rpc.port does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.io.nonvector.wrapper.enabled does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggregate.stats.cache.size does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.vectorized.input.format does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.optimize.cte.materialize.threshold does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.cache.clean.until does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.optimize.semijoin.conversion does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.port does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.spark.dynamic.partition.pruning does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.metrics.enabled does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.repl.rootdir does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.limit.partition.request does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.async.log.enabled does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.logger does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.allow.udf.load.on.demand does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.cli.tez.session.async does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.tez.bloom.filter.factor does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.am-reporter.max.threads does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name 
hive.spark.use.file.size.for.mapjoin does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.strict.checks.bucketing does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.tez.bucket.pruning.compat does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.server2.webui.spnego.principal does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.task.preemption.metrics.intervals does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.shuffle.dir.watcher.enabled does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.arena.count does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.use.SSL does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.task.communicator.connection.timeout.ms does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.transpose.aggr.join does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.druid.maxTries does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.spark.dynamic.partition.pruning.max.data.size does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.druid.metadata.base does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggr.stats.invalidator.frequency does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.io.use.lrfu does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.mmap does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.druid.coordinator.address.default does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.resultset.max.fetch.size does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: 
HiveConf of name hive.conf.hidden.list does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.io.sarg.cache.max.weight.mb does not exist 2019-11-14 17:13:51,005 WARN conf.HiveConf: HiveConf of name hive.server2.clear.dangling.scratchdir.interval does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.druid.sleep.time does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.vectorized.use.row.serde.deserialize does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.server2.compile.lock.timeout does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.timedout.txn.reaper.interval does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.aggregate.stats.max.variance does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.llap.io.lrfu.lambda does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.druid.metadata.db.type does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.output.stream.timeout does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.transactional.events.mem does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.resultset.default.fetch.size does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.repl.cm.retain does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.merge.cardinality.check does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.server2.authentication.ldap.groupClassKey does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.optimize.point.lookup does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.llap.allow.permanent.fns does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name 
hive.llap.daemon.web.ssl does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.txn.manager.dump.lock.state.on.acquire.timeout does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.compactor.history.retention.succeeded does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.llap.io.use.fileid.path does not exist 2019-11-14 17:13:51,006 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.slice.row.count does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.mapjoin.optimized.hashtable.probe.percent does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.druid.select.distribute does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.am.use.fqdn does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.task.scheduler.node.reenable.min.timeout.ms does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.validate.acls does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.support.special.characters.tablename does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.mv.files.thread does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.skip.compile.udf.check does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.vector.serde.enabled does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.repl.cm.interval does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.server2.sleep.interval.between.start.attempts does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.yarn.container.mb does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.druid.http.read.timeout does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of 
name hive.blobstore.optimizations.enabled does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.orc.gap.cache does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.optimize.dynamic.partition.hashjoin does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.exec.copyfile.maxnumfiles does not exist 2019-11-14 17:13:51,007 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.formats does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.druid.http.numConnection does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.task.scheduler.enable.preemption does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.num.executors does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.cache.max.full does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.connection.class does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.server2.tez.sessions.custom.queue.allowed does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.slice.lrr does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.client.password does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.metastore.hbase.cache.max.writer.wait does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.server2.thrift.http.request.header.size does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.server2.webui.max.threads does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.optimize.limittranspose.reductiontuples does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.test.rollbacktxn does not exist 2019-11-14 17:13:51,008 WARN 
conf.HiveConf: HiveConf of name hive.llap.task.scheduler.num.schedulable.tasks.per.node does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.acl does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.io.memory.size does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.strict.checks.type.safety does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.server2.async.exec.async.compile does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.llap.auto.max.input.size does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.tez.enable.memory.manager does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.msck.repair.batch.size does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.blobstore.supported.schemes does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.orc.splits.allow.synthetic.fileid does not exist 2019-11-14 17:13:51,008 WARN conf.HiveConf: HiveConf of name hive.stats.filter.in.factor does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.spark.use.op.stats does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.exec.input.listing.max.threads does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.server2.tez.session.lifetime.jitter does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.web.port does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.strict.checks.cartesian.product does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.rpc.num.handlers does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.vcpus.per.instance does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name 
hive.count.open.txns.interval does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.tez.min.bloom.filter.entries does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.optimize.partition.columns.separate does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.orc.cache.stripe.details.mem.size does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.txn.heartbeat.threadpool.size does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.task.scheduler.locality.delay does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.repl.cmrootdir does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.task.scheduler.node.disable.backoff.factor does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.am.liveness.connection.sleep.between.retries.ms does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.spark.exec.inplace.progress does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.druid.working.directory does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.daemon.memory.per.instance.mb does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.msck.path.validation does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.tez.task.scale.memory.reserve.fraction does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.merge.nway.joins does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.compactor.history.reaper.interval does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.txn.strict.locking.mode does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.io.encode.vector.serde.async.enabled does not exist 2019-11-14 17:13:51,009 WARN 
conf.HiveConf: HiveConf of name hive.tez.input.generate.consistent.splits does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.server2.in.place.progress does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.druid.indexer.memory.rownum.max does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.server2.xsrf.filter.enabled does not exist 2019-11-14 17:13:51,009 WARN conf.HiveConf: HiveConf of name hive.llap.io.allocator.alloc.max does not exist
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/

Using Python version 3.7.4 (default, Sep 20 2019 17:49:03)
SparkSession available as 'spark'.
```

Error: Your requirements could not be resolved to an installable set of packages

<div class="post-text" itemprop="text"> <p>Hello, I have a problem.</p> <p>Composer JSON: <a href="https://pastebin.com/qfi10DAX" rel="nofollow noreferrer">https://pastebin.com/qfi10DAX</a></p> <p>I get this error:</p> <pre><code>Problem 1
    - Installation request for symfony/http-kernel dev-master -&gt; satisfiable by symfony/http-kernel[dev-master].
    - symfony/http-kernel dev-master requires symfony/error-catcher ^4.4|^5.0 -&gt; satisfiable by symfony/error-catcher[4.4.x-dev] but these conflict with your requirements or minimum-stability.
Problem 2
    - symfony/web-server-bundle dev-master requires symfony/http-kernel ^4.4|^5.0 -&gt; satisfiable by symfony/http-kernel[4.4.x-dev, 5.0.x-dev].
    - symfony/web-server-bundle dev-master requires symfony/http-kernel ^4.4|^5.0 -&gt; satisfiable by symfony/http-kernel[4.4.x-dev, 5.0.x-dev].
    - symfony/http-kernel 5.0.x-dev requires symfony/error-catcher ^4.4|^5.0 -&gt; satisfiable by symfony/error-catcher[4.4.x-dev] but these conflict with your requirements or minimum-stability.
    - symfony/http-kernel 4.4.x-dev requires symfony/error-catcher ^4.4|^5.0 -&gt; satisfiable by symfony/error-catcher[4.4.x-dev] but these conflict with your requirements or minimum-stability.
    - symfony/http-kernel 4.4.x-dev requires symfony/error-catcher ^4.4|^5.0 -&gt; satisfiable by symfony/error-catcher[4.4.x-dev] but these conflict with your requirements or minimum-stability.
    - Installation request for symfony/web-server-bundle dev-master -&gt; satisfiable by symfony/web-server-bundle[dev-master].
</code></pre> <p>What can I do to solve this?</p> <p>My <strong>composer.json</strong>:</p> <pre><code>{
    "require": {
        "psr/container": "^1.0@dev",
        "symfony/event-dispatcher": "dev-master",
        "symfony/lock": "^4.3",
        "psr/log": "^1.1",
        "psr/event-dispatcher": "^1.0",
        "symfony/http-kernel": "dev-master",
        "symfony/browser-kit": "^4.3",
        "symfony/config": "^4.4@dev",
        "symfony/console": "dev-master",
        "symfony/dependency-injection": "dev-master",
        "symfony/var-dumper": "^4.3",
        "symfony/css-selector": "^4.3",
        "symfony/process": "dev-master",
        "symfony/yaml": "^3.4",
        "symfony/finder": "^4.3",
        "symfony/expression-language": "^4.3",
        "symfony/proxy-manager-bridge": "^4.3",
        "symfony/asset": "^4.3",
        "symfony/cache": "^4.3",
        "symfony/class-loader": "^3.4",
        "symfony/workflow": "^4.3",
        "symfony/webpack-encore-pack": "^1.0",
        "symfony/webpack-encore-bundle": "dev-master",
        "symfony/var-exporter": "^4.3",
        "symfony/validator": "^4.3",
        "symfony/translation": "^4.3",
        "symfony/templating": "^4.3",
        "symfony/stopwatch": "^4.3",
        "symfony/serializer": "^4.3",
        "symfony/security": "^4.3",
        "symfony/property-info": "^4.3",
        "doctrine/annotations": "^1.6",
        "doctrine/cache": "^1.8",
        "symfony/contracts": "^1.1",
        "symfony/debug": "^4.3",
        "symfony/dom-crawler": "^4.3",
        "symfony/dotenv": "^4.3",
        "symfony/filesystem": "^4.3",
        "symfony/form": "^4.3",
        "symfony/security-guard": "^4.3",
        "symfony/http-client": "^4.3",
        "symfony/http-foundation": "dev-master",
        "symfony/inflector": "^4.3",
        "symfony/intl": "^4.3",
        "ext-ldap": "^7.3",
        "symfony/ldap": "^4.3",
        "symfony/mailer": "^4.3",
        "symfony/messenger": "^4.3",
        "enqueue/messenger-adapter": "^0.2.2",
        "enqueue/async-event-dispatcher": "^0.9.7",
        "enqueue/async-command": "^0.9.6",
        "symfony/web-link": "^4.3",
        "enqueue/sqs": "^0.9.11",
        "aws/aws-php-sns-message-validator": "^1.5",
        "enqueue/dbal": "^0.9.9",
        "enqueue/redis": "^0.9.7",
        "enqueue/fs": "^0.9.8",
        "enqueue/stomp": "^0.9.10",
        "php-http/stopwatch-plugin": "^1.2",
        "php-http/cache-plugin": "^1.6",
        "php-http/logger-plugin": "^1.1",
        "slim/slim": "^3.12",
        "zendframework/zend-diactoros": "^2.1",
        "ircmaxell/random-lib": "^1.2",
        "moontoast/math": "^1.1",
        "symfony/mime": "dev-master",
        "symfony/options-resolver": "^4.3",
        "symfony/phpunit-bridge": "^4.3",
        "symfony/polyfill-apcu": "^1.11",
        "symfony/polyfill-ctype": "^1.11",
        "symfony/polyfill-iconv": "^1.11",
        "symfony/polyfill-intl-grapheme": "^1.11",
        "symfony/polyfill-intl-icu": "^1.11",
        "symfony/polyfill-intl-idn": "^1.11",
        "symfony/polyfill-intl-messageformatter": "^1.11",
        "symfony/polyfill-intl-normalizer": "^1.11",
        "symfony/polyfill-mbstring": "^1.11",
        "symfony/polyfill-php54": "^1.11",
        "symfony/polyfill-php55": "^1.11",
        "symfony/polyfill-php56": "^1.11",
        "symfony/polyfill-php70": "^1.11",
        "symfony/polyfill-php71": "^1.11",
        "symfony/polyfill-php72": "^1.11",
        "symfony/polyfill-php73": "^1.11",
        "symfony/polyfill-util": "^1.11",
        "symfony/property-access": "^4.3",
        "symfony/routing": "^4.3",
        "sylius/resource-bundle": "^1.5@dev",
        "doctrine/orm": "^2.5",
        "symfony/web-profiler-bundle": "^4.3",
        "symfony/security-acl": "^3.0",
        "doctrine/data-fixtures": "^1.3",
        "mongodb/mongodb": "^1.4",
        "alcaeus/mongo-php-adapter": "^1.1",
        "sensio/framework-extra-bundle": "^5.3",
        "symfony/psr-http-message-bridge": "^1.2",
        "nyholm/psr7": "^1.1",
        "symfony/security-bundle": "^4.3",
        "doctrine/mongodb-odm-bundle": "^3.5",
        "jmikola/geojson": "^1.0",
        "doctrine/mongodb-odm": "^1.2",
        "sylius/locale": "^1.5",
        "solarium/solarium": "^5.0",
        "minimalcode/search": "^1.0",
        "propel/propel": "^2.0@dev",
        "monolog/monolog": "^2.0@dev",
        "elasticsearch/elasticsearch": "dev-master",
        "sentry/sentry": "^2.0@dev",
        "php-amqplib/php-amqplib": "^2.8@dev",
        "php-console/php-console": "dev-master",
        "graylog2/gelf-php": "^1.4@dev",
        "ruflin/elastica": "^6.0@dev",
        "ocramius/package-versions": "^1.4",
        "doctrine/coding-standard": "^6.0",
        "composer/composer": "^1.9@dev",
        "infection/infection": "dev-master",
        "phpunit/phpunit": "^8.2@dev",
        "phpunit/php-invoker": "^2.0@dev",
        "symfony/web-server-bundle": "dev-master",
        "doctrine/couchdb": "^1.0@dev",
        "aws/aws-sdk-php": "^3.0@dev",
        "swiftmailer/swiftmailer": "^6.2@dev",
        "true/punycode": "dev-master",
        "jakub-onderka/php-parallel-lint": "dev-master",
        "jakub-onderka/php-console-highlighter": "dev-master",
        "predis/predis": "^2.0@dev",
        "sylius/money-bundle": "^1.6@dev",
        "akeneo/phpspec-skip-example-extension": "^4.0@dev",
        "phpspec/nyan-formatters": "^1.0@dev",
        "doctrine/dbal": "^2.5@dev",
        "lakion/api-test-case": "^5.0@dev",
        "symfony/doctrine-bridge": "^4.3@dev",
        "lchrusciel/api-test-case": "^5.0@dev",
        "polishsymfonycommunity/symfony-mocker-container": "dev-master"
    },
    "require-dev": {
        "jean85/pretty-package-versions": "^1.0@dev"
    },
    "conflict": {
        "symfony/doctrine-bridge": "4.2.0",
        "symfony/framework-bundle": "4.3.0"
    }
}
</code></pre> </div>
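<p>Not part of the original post, but worth noting: every conflict in the output reduces to <code>symfony/error-catcher</code> being available only as <code>4.4.x-dev</code>, which the default stability setting (<code>stable</code>) rejects. One possible direction, assuming unreleased dev packages are acceptable in this project, is to lower the root package's minimum stability while still preferring tagged releases. A minimal sketch of the relevant <code>composer.json</code> fragment:</p>

```json
{
    "minimum-stability": "dev",
    "prefer-stable": true
}
```

<p>Alternatively, replacing the <code>dev-master</code> constraints (e.g. on <code>symfony/http-kernel</code> and <code>symfony/web-server-bundle</code>) with tagged releases such as <code>^4.3</code> should avoid pulling in the unreleased component at all.</p>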

Laravel | Deployment and automatic deployment with GitLab CI

<div class="post-text" itemprop="text"> <p>I want to set up automatic deployment to my production server.</p> <p>This is my log from GitLab CI:</p> <pre><code>Running with gitlab-runner 11.9.0-rc2 (227934c0) on docker-auto-scale fa6cab46
Using Docker executor with image lorisleiva/laravel-docker:latest ...
Pulling docker image lorisleiva/laravel-docker:latest ...
Using docker image sha256:4bd5ecacba7b0f46950944f376647090071a70a7b1ffa0eacb492719bd476c6b for lorisleiva/laravel-docker:latest ...
Running on runner-fa6cab46-project-11286864-concurrent-0 via runner-fa6cab46-srm-1552945620-f480ce3e...
Initialized empty Git repository in /builds/Woblex/web/.git/
Fetching changes...
Created fresh repository.
From https://gitlab.com/Woblex/web
 * [new branch] master -&gt; origin/master
Checking out ce51a64e as master...
Skipping Git submodules setup
Downloading artifacts for composer (179849020)...
Downloading artifacts from coordinator... ok id=179849020 responseStatus=200 OK token=zy8-CGce
Downloading artifacts for npm (179849021)...
Downloading artifacts from coordinator... ok id=179849021 responseStatus=200 OK token=NvUWyzkg
$ eval $(ssh-agent -s) # collapsed multi-line command
Agent pid 11
Identity added: (stdin) (git@gitlab.com)
$ find . -type f -not -path "./vendor/*" -exec chmod 664 {} \; # collapsed multi-line command
$ whoami
root
$ php artisan deploy woblex.cz -s upload
✈︎ Deploying HEAD on woblex.cz with upload strategy
➤ Executing task deploy:prepare
✔ Ok
➤ Executing task deploy:lock
✔ Ok
➤ Executing task deploy:release
✔ Ok
➤ Executing task deploy:update_code
➤ Executing task deploy:failed
✔ Ok
➤ Executing task deploy:unlock
✔ Ok
In Client.php line 99:
  The command "cd /var/www/dev.woblex.cz &amp;&amp; (/usr/bin/git clone --recursive git@gitlab.com:Woblex/web.git /var/www/dev.woblex.cz/releases/1 2&gt;&amp;1)" failed.
Exit Code: 128 (Invalid exit argument) Host Name: woblex.cz ================ Warning: Identity file /home/gitlab/.ssh/id_rsa not accessible: No such fil e or directory. deploy [-p|--parallel] [-l|--limit LIMIT] [--no-hooks] [--log LOG] [--roles ROLES] [--hosts HOSTS] [-o|--option OPTION] [--] [&lt;stage&gt;] In Process.php line 239: The command "vendor/bin/dep --file=vendor/lorisleiva/laravel-deployer/.buil d/deploy.php deploy 'woblex.cz'" failed. Exit Code: 128(Invalid exit argument) Working directory: /builds/Woblex/web Output: ================ ✈︎ Deploying HEAD on woblex.cz with upload strategy ➤ Executing task deploy:prepare ✔ Ok ➤ Executing task deploy:lock ✔ Ok ➤ Executing task deploy:release ✔ Ok ➤ Executing task deploy:update_code ➤ Executing task deploy:failed ✔ Ok ➤ Executing task deploy:unlock ✔ Ok Error Output: ================ In Client.php line 99: The command "cd /var/www/dev.woblex.cz &amp;&amp; (/usr/bin/git clone --recursi ve git@gitlab.com:Woblex/web.git /var/www/dev.woblex.cz/releases/1 2&gt;&amp;1)" fa iled. Exit Code: 128 (Invalid exit argument) Host Name: woblex.cz ================ Warning: Identity file /home/gitlab/.ssh/id_rsa not accessible: No such f il e or directory. deploy [-p|--parallel] [-l|--limit LIMIT] [--no-hooks] [--log LOG] [--roles ROLES] [--hosts HOSTS] [-o|--option OPTION] [--] [&lt;stage&gt;] ERROR: Job failed: exit code 1 </code></pre> <p>I there is my deploy.php configuration -&gt;</p> <pre><code>&lt;?php return [ /* |-------------------------------------------------------------------------- | Default deployment strategy |-------------------------------------------------------------------------- | | This option defines which deployment strategy to use by default on all | of your hosts. Laravel Deployer provides some strategies out-of-box | for you to choose from explained in detail in the documentation. | | Supported: 'basic', 'firstdeploy', 'local', 'pull'. 
| */ 'default' =&gt; 'basic', /* |-------------------------------------------------------------------------- | Custom deployment strategies |-------------------------------------------------------------------------- | | Here, you can easily set up new custom strategies as a list of tasks. | Any key of this array are supported in the `default` option above. | Any key matching Laravel Deployer's strategies overrides them. | */ 'strategies' =&gt; [ // ], /* |-------------------------------------------------------------------------- | Hooks |-------------------------------------------------------------------------- | | Hooks let you customize your deployments conveniently by pushing tasks | into strategic places of your deployment flow. Each of the official | strategies invoke hooks in different ways to implement their logic. | */ 'hooks' =&gt; [ // Right before we start deploying. 'start' =&gt; [ // ], // Code and composer vendors are ready but nothing is built. 'build' =&gt; [ // ], // Deployment is done but not live yet (before symlink) 'ready' =&gt; [ 'artisan:storage:link', 'artisan:view:clear', 'artisan:cache:clear', 'artisan:config:cache', 'artisan:migrate', 'artisan:horizon:terminate', ], // Deployment is done and live 'done' =&gt; [ // ], // Deployment succeeded. 'success' =&gt; [ // ], // Deployment failed. 'fail' =&gt; [ // ], ], /* |-------------------------------------------------------------------------- | Deployment options |-------------------------------------------------------------------------- | | Options follow a simple key/value structure and are used within tasks | to make them more configurable and reusable. You can use options to | configure existing tasks or to use within your own custom tasks. 
| */ 'options' =&gt; [ 'application' =&gt; env('APP_NAME', 'Laravel'), 'repository' =&gt; 'git@gitlab.com:Woblex/web.git', 'git_tty' =&gt; false, ], /* |-------------------------------------------------------------------------- | Hosts |-------------------------------------------------------------------------- | | Here, you can define any domain or subdomain you want to deploy to. | You can provide them with roles and stages to filter them during | deployment. Read more about how to configure them in the docs. | */ 'hosts' =&gt; [ 'woblex.cz' =&gt; [ 'deploy_path' =&gt; '/var/www/dev.woblex.cz', 'user' =&gt; 'gitlab', 'identityFile' =&gt; '/home/gitlab/.ssh/id_rsa', ], ], /* |-------------------------------------------------------------------------- | Localhost |-------------------------------------------------------------------------- | | This localhost option give you the ability to deploy directly on your | local machine, without needing any SSH connection. You can use the | same configurations used by hosts to configure your localhost. | */ 'localhost' =&gt; [ // ], /* |-------------------------------------------------------------------------- | Include additional Deployer recipes |-------------------------------------------------------------------------- | | Here, you can add any third party recipes to provide additional tasks, | options and strategies. Therefore, it also allows you to create and | include your own recipes to define more complex deployment flows. | */ 'include' =&gt; [ // ], /* |-------------------------------------------------------------------------- | Use a custom Deployer file |-------------------------------------------------------------------------- | | If you know what you are doing and want to take complete control over | Deployer's file, you can provide its path here. Note that, without | this configuration file, the root's deployer file will be used. 
| */ 'custom_deployer_file' =&gt; false, ]; </code></pre> <p>I think there is a problem with the user Laravel Deployer connects to my Ubuntu server as: I set the username to 'gitlab' in the config, but when the deployer connects to the Ubuntu server as this user it cannot access the file /home/gitlab/.ssh/id_rsa. Does anybody have something that can help me? Thanks a lot!</p> <p>Edit: Here is my gitlab-ci.yml</p> <pre><code>image: lorisleiva/laravel-docker:latest .init_ssh: &amp;init_ssh | eval $(ssh-agent -s) echo "$SSH_PRIVATE_KEY" | tr -d ' ' | ssh-add - &gt; /dev/null mkdir -p ~/.ssh chmod 700 ~/.ssh [[ -f /.dockerenv ]] &amp;&amp; echo -e "Host * \tStrictHostKeyChecking no " &gt; ~/.ssh/config # Replace the last line with the following lines if you'd rather # leave StrictHostKeyChecking enabled (replace yourdomain.com): # # ssh-keyscan yourdomain.com &gt;&gt; ~/.ssh/known_hosts # chmod 644 ~/.ssh/known_hosts .change_file_permissions: &amp;change_file_permissions | find . -type f -not -path "./vendor/*" -exec chmod 664 {} \; find .
-type d -not -path "./vendor/*" -exec chmod 775 {} \; composer: stage: build cache: key: ${CI_COMMIT_REF_SLUG}-composer paths: - vendor/ script: - composer install --prefer-dist --no-ansi --no-interaction --no-progress --no-scripts - cp .env.example .env - php artisan key:generate artifacts: expire_in: 1 month paths: - vendor/ - .env npm: stage: build cache: key: ${CI_COMMIT_REF_SLUG}-npm paths: - node_modules/ script: - npm install - npm run production artifacts: expire_in: 1 month paths: - node_modules/ - public/css/ - public/js/ #codestyle: # stage: test # dependencies: [] # script: # - phpcs --standard=PSR2 --extensions=php --ignore=app/Support/helpers.php app phpunit: stage: test dependencies: - composer script: - phpunit --coverage-text --colors=never staging: stage: deploy script: - *init_ssh - *change_file_permissions - whoami - php artisan deploy woblex.cz -s upload environment: name: staging url: http://www.dev.woblex.cz only: - master production: stage: deploy script: - *init_ssh - *change_file_permissions - php artisan deploy new.woblex.cz -s upload environment: name: production url: https://www.new.woblex.cz when: manual only: - master </code></pre> </div>

OpenJdk platform binary idea64 startup error

I installed Win7 in a Parallels Desktop VM and installed idea64 inside it. I just wrote a simple Hello World program, and running it throws the error below. The graphics driver inside the Win7 VM also cannot be upgraded. Error:Abnormal build process termination: "C:\Me\JetBrains\IntelliJ IDEA 2018.3.3\jre64\bin\java.exe" -Xmx700m -Djava.awt.headless=true -Djava.endorsed.dirs=\"\" -Djdt.compiler.useSingleThread=true -Dcompile.parallel=false -Drebuild.on.dependency.change=true -Djava.net.preferIPv4Stack=true -Dio.netty.initialSeedUniquifier=5762697851691075053 -Dfile.encoding=GBK -Duser.language=zh -Duser.country=CN -Didea.paths.selector=IntelliJIdea2018.3 "-Didea.home.path=C:\Me\JetBrains\IntelliJ IDEA 2018.3.3" -Didea.config.path=C:\Users\Brazen\.IntelliJIdea2018.3\config -Didea.plugins.path=C:\Users\Brazen\.IntelliJIdea2018.3\config\plugins -Djps.log.dir=C:/Users/Brazen/.IntelliJIdea2018.3/system/log/build-log "-Djps.fallback.jdk.home=C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/jre64" -Djps.fallback.jdk.version=1.8.0_152-release -Dio.netty.noUnsafe=true -Djava.io.tmpdir=C:/Users/Brazen/.IntelliJIdea2018.3/system/compile-server/test_f6da7a95/_temp_ -Djps.backward.ref.index.builder=true -Dkotlin.incremental.compilation=true -Dkotlin.daemon.enabled -Dkotlin.daemon.client.alive.path=\"C:\Users\Brazen\AppData\Local\Temp\kotlin-idea-3894503579951865671-is-running\" -classpath "C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jps-launcher.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/jre64/lib/tools.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/optimizedFileManager.jar" org.jetbrains.jps.cmdline.Launcher "C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/netty-common-4.1.30.Final.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/annotations.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jps-builders.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-spi-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-builder-support-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-aether-provider-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/oro-2.0.8.jar;C:/Me/JetBrains/IntelliJ IDEA
2018.3.3/lib/slf4j-api-1.7.25.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/httpcore-4.4.10.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/nanoxml-2.2.3.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/plexus-interpolation-1.21.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-impl-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/commons-lang3-3.4.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/commons-codec-1.10.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/forms_rt.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/trove4j.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-model-builder-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jps-model.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/idea_rt.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/util.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/plexus-component-annotations-1.6.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/protobuf-java-3.4.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/netty-buffer-4.1.30.Final.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/asm-all-7.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/log4j.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-dependency-resolver.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/guava-25.1-jre.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-connector-basic-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/netty-resolver-4.1.30.Final.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-transport-http-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/commons-logging-1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/resources_en.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-api-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-repository-metadata-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jna-platform.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/aether-util-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/netty-transport-4.1.30.Final.jar;C:/Me/JetBrains/IntelliJ IDEA 
2018.3.3/lib/aether-transport-file-1.1.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jna.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/netty-codec-4.1.30.Final.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/httpclient-4.5.6.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/lz4-1.3.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/platform-api.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-artifact-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jdom.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/maven-model-3.3.9.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/forms-1.1-preview.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/jps-builders-6.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/javac2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/plexus-utils-3.0.22.jar;;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/gson-2.8.5.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/jarutils.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/guava-25.1-jre.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/common-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/manifest-merger-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/sdk-common-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/builder-model-3.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/builder-test-api-3.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/ddmlib-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/repository-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/gradle/lib/gradle-api-4.10.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/gson-2.8.5.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/jarutils.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/lib/guava-25.1-jre.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/common-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 
2018.3.3/plugins/android/lib/manifest-merger-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/sdk-common-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/builder-model-3.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/builder-test-api-3.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/ddmlib-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/repository-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/gradle/lib/gradle-api-4.10.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/ant/lib/ant-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/uiDesigner/lib/jps/ui-designer-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/IntelliLang/lib/intellilang-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Groovy/lib/groovy-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Groovy/lib/groovy-rt-constants.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/eclipse/lib/eclipse-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/eclipse/lib/common-eclipse-util.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/maven/lib/maven-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/osmorc/lib/osmorc-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/osmorc/lib/biz.aQute.bndlib-4.0.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/osmorc/lib/biz.aQute.repository-4.0.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/osmorc/lib/biz.aQute.resolve-4.0.0.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/osmorc/lib/bundlor-all.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/aspectj/lib/aspectj-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/gradle/lib/gradle-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/devkit/lib/devkit-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/JavaEE/lib/javaee-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 
2018.3.3/plugins/JavaEE/lib/jps/jpa-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/webSphereIntegration/lib/jps/webSphere-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/weblogicIntegration/lib/jps/weblogic-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/jps/android-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/android-common.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/build-common.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/android-rt.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/sdklib.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/common-26.1.2.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/jarutils.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/android/lib/layoutlib-api.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/javaFX/lib/javaFX-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/javaFX/lib/common-javaFX-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/jps/kotlin-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/kotlin-stdlib.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/kotlin-reflect.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/kotlin-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/android-extensions-ide.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Kotlin/lib/android-extensions-compiler.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/flex/lib/flex-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/flex/lib/flex-shared.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/dmServer/lib/dmServer-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/GwtStudio/lib/gwt-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/GoogleAppEngine/lib/google-app-engine-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 
2018.3.3/plugins/GoogleAppEngine/lib/appEngine-runtime.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Grails/lib/grails-jps-plugin.jar;C:/Me/JetBrains/IntelliJ IDEA 2018.3.3/plugins/Grails/lib/grails-compiler-patch.jar" org.jetbrains.jps.cmdline.BuildMain 127.0.0.1 49669 1e1dfc85-eb4c-4dc1-b38c-ed6896c49276 C:/Users/Brazen/.IntelliJIdea2018.3/system/compile-server

Win7 64-bit Boost build fills up memory and freezes

Win7, 4 GB RAM, VS2013, building Boost 1_57_0 for 64-bit. Running the command bjam.exe stage --toolset=msvc-12.0 --without-graph --without-graph_parallel --without-math --without-mpi --without-serialization --without-wave --without-test --without-program_options --without-serialization --without-signals --stagedir=".\bin\vc12_x86"link=static runtime-link=static threading=multi debug release fills up all the memory and then the machine freezes. Is this caused by insufficient memory?

Serial execution of package tests

<div class="post-text" itemprop="text"> <p>I have implemented several packages for a web API, each with their own test cases. When each package is tested using <code>go test ./api/pkgname</code> the tests pass. If I want to run all tests at once with <code>go test ./api/...</code> test cases always fail.</p> <p>In each test case, I recreate the entire schema using <code>DROP SCHEMA public CASCADE</code> followed by <code>CREATE SCHEMA public</code> and apply all migrations. The test suite reports errors back at random, saying a relation/table does not exist, so I guess each test suite (per package) is run in parallel somehow, thus messing up the DB state.</p> <p>I tried to pass along some test flags like <code>go test -cpu 1 -parallel 0 ./src/api/...</code> with no success.</p> <p>Could the problem here be tests running in parallel, and if yes, how can I force serial execution?</p> <p><em><strong>Update:</strong></em></p> <p>Currently I use this workaround to run the tests, but I still wonder if there's a better solution</p> <pre><code>find &lt;dir&gt; -type d -exec go test {} \; </code></pre> </div>

Battery

Problem Description Recently hzz invented a new kind of solar battery. The battery is so amazing that the electric power generated by it can satisfy the entire village. People in the village are all very happy since they can get free and green energy from now on. But the manager of a power company is unhappy about this, so he plans to take some action to obstruct the battery. The battery can be regarded as a segment of L meters, and the manager plans to build n pillars on the battery. Like the picture above, the distance between pillar i and the battery's left end is Xi, and its height is Hi. The thickness of all pillars can be ignored. When the sunlight is slant, some part of the battery will be sheltered by the pillars. One meter of battery exposed to vertical sunlight for one hour will generate one unit of energy. If the sunlight is slant, the amount of energy generated should be multiplied by sin β (β is the angle between the sunlight and the horizontal line). The sun rises from the infinitely far left end of the horizon at 6 o'clock and goes down at the infinitely far right end of the horizon at 18 o'clock. The sun is always infinitely far away, so the sunlight is parallel, and β is π/2 at 12 o'clock. Please calculate the amount of energy generated by the battery between t1 o'clock and t2 o'clock (6 ≤ t1 < t2 ≤ 18). Input There are multiple test cases. For each test case: The first line contains two integers L (10 ≤ L ≤ 100,000) and N (4 ≤ N ≤ 1000), indicating the length of the battery and the number of pillars. The second line contains two integers, the above-mentioned t1 and t2 (6 ≤ t1 < t2 ≤ 18). Then N lines follow, each containing two integers Xi (0 ≤ Xi ≤ L) and Hi (1 ≤ Hi ≤ 1000), indicating the position and height of a pillar. It is guaranteed that no two pillars will be in the same position. It is also guaranteed that there is a pillar on both ends of the battery. The input ends with L=0, N=0. Output For each test case, you should output a line with the energy described above.
Output should be rounded to 5 digits after the decimal point. Sample Input 10 4 14 17 0 2 5 1 8 3 10 1 0 0 Sample Output 7.97188
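Assuming the sun's elevation grows linearly from sunrise to sunset — β(t) = π(t−6)/12, which matches β = π/2 at 12 o'clock — the energy collected by one fully exposed meter between t1 and t2 has a closed form: ∫ sin β(t) dt = (12/π)(cos β(t1) − cos β(t2)). A sketch of just that unshaded part; the full solution still has to subtract, for each point of the battery, the time intervals during which some pillar shades it:

```go
package main

import (
	"fmt"
	"math"
)

// exposedEnergy returns the energy one fully exposed meter of battery
// generates between hours t1 and t2, assuming the elevation angle grows
// linearly: beta(t) = pi*(t-6)/12, so beta is pi/2 at noon.
// Integral of sin(beta(t)) dt = (12/pi)*(cos(beta(t1)) - cos(beta(t2))).
func exposedEnergy(t1, t2 float64) float64 {
	beta := func(t float64) float64 { return math.Pi * (t - 6) / 12 }
	return 12 / math.Pi * (math.Cos(beta(t1)) - math.Cos(beta(t2)))
}

func main() {
	// A whole day of sun on one meter: (12/pi)*(cos 0 - cos pi) = 24/pi.
	fmt.Printf("%.5f\n", exposedEnergy(6, 18)) // 7.63944
}
```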

Can true unit tests always be run in parallel?

<div class="post-text" itemprop="text"> <h2>Background:</h2> <p>I'm writing a lot of go code, using the <code>go test</code> tool and the provided <code>"testing"</code> package for, well, testing. Most of the testing that I do is unit testing, within the TDD discipline. These "units" under test are never permitted to depend on stateful externalities like persistent storage, a network hop, etc., but receive fake, in-memory implementations of those externalities usually in "constructor/builder" functions (yes, I know they aren't constructors in the traditional sense).</p> <h2>The Problem:</h2> <p>It has long bothered me that the <code>go test</code> tool always runs test functions in the same deterministic order. This has, in a few cases, allowed race conditions to hide in the code. One way to find these bugs is by setting the <code>-race</code> flag. Another might be to always run unit tests in parallel...</p> <h2>The Question:</h2> <p>Is there <em>ever</em> a situation in which <em>isolated</em> unit tests could not or should not <em>always</em> be run in parallel (using the <code>-parallel</code> flag)?</p> </div>

A question about compiling a Java program with mvn

INFO:hackathon.launcher:execute script: mvn -T 2C clean package -Dmaven.test.skip=true -f ./draenor/pom.xml [INFO] Scanning for projects... [INFO] Building with 4 threads [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Building draenor 0.1.0-SNAPSHOT [INFO] ------------------------------------------------------------------------ [WARNING] ***************************************************************** [WARNING] * Your build is requesting parallel execution, but project * [WARNING] * contains the following plugin(s) that are not marked as * [WARNING] * @threadSafe to support parallel building. * [WARNING] * While this /may/ work fine, please look for plugin updates * [WARNING] * and/or request plugins be made thread-safe. * [WARNING] * If reporting an issue, report it against the plugin in * [WARNING] * question, not against maven-core * [WARNING] ***************************************************************** [WARNING] The following plugins are not marked @threadSafe in draenor: [WARNING] org.apache.maven.plugins:maven-jar-plugin:2.2 [WARNING] ***************************************************************** [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ draenor --- [INFO] Deleting /vagrant/./draenor/target [INFO] [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ draenor --- [debug] execute contextualize The warning above keeps appearing in the log file and the program never starts. What does it mean? My program runs on Ubuntu and is built with Maven.

Frame Polygonal Line

You are going to read a sequence of pairs of integer numbers. Each pair represents the Cartesian coordinates of a point in a 2-dimensional plane. The first number is the x coordinate, while the second is that of y. The sequence represents a polygonal line. Your task is to draw a rectangle with minimal length of sides that exactly surrounds the polygonal line. The sides of the rectangle are parallel to the x- and y-axis, respectively. Input Input consists of several test cases. For each case, a sequence of coordinates is given. Each pair of x and y occupies a line, with |x| and |y| less than 2^31. The sequence is terminated with a pair of 0's. Note that (0, 0) will never be considered as a point on any of the polygonal lines. An empty polygonal line signals the end of input. Output For each test case, print in one line two pairs of numbers, which are the south-west and north-east corners of the surrounding rectangle. The numbers must be separated by one space as is indicated in the samples. Sample Input 12 56 23 56 13 10 0 0 12 34 0 0 0 0 Sample Output 12 10 23 56 12 34 12 34
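The task reduces to tracking the minimum and maximum of each coordinate over the sequence. A sketch in Go, checked against the first sample polyline:

```go
package main

import "fmt"

// boundingBox returns the south-west and north-east corners of the smallest
// axis-parallel rectangle containing all the points, as the problem asks.
func boundingBox(xs, ys []int) (x1, y1, x2, y2 int) {
	x1, y1, x2, y2 = xs[0], ys[0], xs[0], ys[0]
	for i := 1; i < len(xs); i++ {
		if xs[i] < x1 {
			x1 = xs[i]
		}
		if xs[i] > x2 {
			x2 = xs[i]
		}
		if ys[i] < y1 {
			y1 = ys[i]
		}
		if ys[i] > y2 {
			y2 = ys[i]
		}
	}
	return
}

func main() {
	// First sample polyline: (12,56), (23,56), (13,10).
	x1, y1, x2, y2 := boundingBox([]int{12, 23, 13}, []int{56, 56, 10})
	fmt.Println(x1, y1, x2, y2) // 12 10 23 56
}
```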

Box Relations

Problem Description There are n boxes C1, C2, ..., Cn in 3D space. The edges of the boxes are parallel to the x, y or z-axis. We provide some relations of the boxes, and your task is to construct a set of boxes satisfying all these relations. There are four kinds of relations (1 <= i,j <= n, i is different from j): I i j: The intersection volume of Ci and Cj is positive. X i j: The intersection volume is zero, and any point inside Ci has smaller x-coordinate than any point inside Cj. Y i j: The intersection volume is zero, and any point inside Ci has smaller y-coordinate than any point inside Cj. Z i j: The intersection volume is zero, and any point inside Ci has smaller z-coordinate than any point inside Cj. . Input There will be at most 30 test cases. Each case begins with a line containing two integers n (1 <= n <= 1,000) and R (0 <= R <= 100,000), the number of boxes and the number of relations. Each of the following R lines describes a relation, written in the format above. The last test case is followed by n=R=0, which should not be processed. Output For each test case, print the case number and either the word POSSIBLE or IMPOSSIBLE. If it's possible to construct the set of boxes, the i-th line of the following n lines contains six integers x1, y1, z1, x2, y2, z2, that means the i-th box is the set of points (x,y,z) satisfying x1 <= x <= x2, y1 <= y <= y2, z1 <= z <= z2. The absolute values of x1, y1, z1, x2, y2, z2 should not exceed 1,000,000. Print a blank line after the output of each test case. Sample Input 3 2 I 1 2 X 2 3 3 3 Z 1 2 Z 2 3 Z 3 1 1 0 0 0 Sample Output Case 1: POSSIBLE 0 0 0 2 2 2 1 1 1 3 3 3 8 8 8 9 9 9 Case 2: IMPOSSIBLE Case 3: POSSIBLE 0 0 0 1 1 1
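One way to see why sample case 2 is IMPOSSIBLE: the X/Y/Z relations are strict orderings per axis, so each axis's relations must form a DAG, and Z 1 2, Z 2 3, Z 3 1 is a cycle. A sketch of that cycle check using Kahn's algorithm (a complete solution must additionally handle the I relations and assign actual coordinates, which this omits):

```go
package main

import "fmt"

// topoOrder runs Kahn's algorithm on nodes 1..n with the given directed
// edges and reports whether an ordering exists; false means a cycle, i.e.
// the relation set is IMPOSSIBLE on that axis.
func topoOrder(n int, edges [][2]int) ([]int, bool) {
	adj := make([][]int, n+1)
	indeg := make([]int, n+1)
	for _, e := range edges {
		adj[e[0]] = append(adj[e[0]], e[1])
		indeg[e[1]]++
	}
	queue := []int{}
	for v := 1; v <= n; v++ {
		if indeg[v] == 0 {
			queue = append(queue, v)
		}
	}
	order := []int{}
	for len(queue) > 0 {
		v := queue[0]
		queue = queue[1:]
		order = append(order, v)
		for _, w := range adj[v] {
			indeg[w]--
			if indeg[w] == 0 {
				queue = append(queue, w)
			}
		}
	}
	return order, len(order) == n
}

func main() {
	// Sample case 2: Z 1 2, Z 2 3, Z 3 1 forms a cycle on the z-axis.
	_, ok := topoOrder(3, [][2]int{{1, 2}, {2, 3}, {3, 1}})
	fmt.Println(ok) // false: IMPOSSIBLE
}
```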
