On 64-bit Windows 7, setting a hook from VB 6.0 fails with an "entry point not found" error.
```
Private hhook As Long
Private Declare Function SetWindowsHookEx Lib "user32" (ByVal idHook As Long, ByVal lpfn As Long, ByVal hmod As Long, ByVal dwThreadId As Long) As Long

Private Sub Form_Load()
    hhook = SetWindowsHookEx(2, AddressOf Module1.AddControlHookA, App.hInstance, App.ThreadID)
End Sub
```

This is the source code; running it reports the following error:
[screenshot of the "entry point not found" error message]

Does 64-bit Windows 7 simply not support this approach? If not, how should it be done on 64-bit Windows 7?

1 answer

64-bit Windows does support this. Try declaring the function with Alias "SetWindowsHookExW", as shown below.
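For reference, a minimal sketch of the suggested Declare, keeping the asker's hook id (2 = WH_KEYBOARD) and callback unchanged; treat it as illustrative rather than a guaranteed fix:

```
Private hhook As Long

' user32.dll exports SetWindowsHookExA/W rather than a plain SetWindowsHookEx,
' so bind explicitly to the Unicode export, as the answer suggests.
Private Declare Function SetWindowsHookEx Lib "user32" Alias "SetWindowsHookExW" _
    (ByVal idHook As Long, ByVal lpfn As Long, ByVal hmod As Long, ByVal dwThreadId As Long) As Long

Private Sub Form_Load()
    ' 2 = WH_KEYBOARD, installed only for this thread (App.ThreadID).
    hhook = SetWindowsHookEx(2, AddressOf Module1.AddControlHookA, App.hInstance, App.ThreadID)
End Sub
```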

Other related questions
Do keyboard and mouse hooks still work on Win7?
A keyboard/mouse hook that works under XP does not work under Win7.
A program written against MySQL 5.6 reports the error below when run against MySQL 8.0. The database itself can be connected to, and the JDBC driver jars were replaced with the MySQL 8.0 versions, but the following error still occurs:
```
[http-nio-8085-exec-3] INFO com.gdk.jdbc.connection.ConnectionPoolManager - ***** Add ConnectionPoolShutdownHook to JVM hook ***** [MLog-Init-Reporter] INFO com.mchange.v2.log.MLog - MLog clients using slf4j logging. [http-nio-8085-exec-3] INFO com.mchange.v2.c3p0.C3P0Registry - Initializing c3p0-0.9.5 [built 02-January-2015 13:25:04 -0500; debug? true; trace: 10] [http-nio-8085-exec-3] WARN com.mchange.v2.c3p0.cfg.C3P0Config - named-config with name 'mysql' does not exist. Using default-config. [http-nio-8085-exec-3] WARN com.mchange.v2.c3p0.cfg.C3P0Config - named-config with name 'mysql' does not exist. Using default-config extensions. [http-nio-8085-exec-3] INFO com.gdk.jdbc.connection.c3p0.C3P0ConnectionProvider - >>>>>> C3P0ConnectionProvider startup initing with configuration: {"file":"/E:/tomcat/webapps/huajiu/WEB-INF/classes/datasource.xml","id":"mysql.1509899288","name":"mysql","driverClass":"com.mysql.cj.jdbc.Driver","driverUrl":"jdbc:mysql://47.XXX.XXX.229:3306/huajiu?useSSL=true","user":"root","password":"******","connectionProvider":"com.gdk.jdbc.connection.c3p0.C3P0ConnectionProvider","dialect":"com.gdk.jdbc.dialect.MySQLDialect","maxConnectionSize":10,"minConnectionSize":1,"initConnectionSize":1,"availableConnectionSize":1,"acquireIncrementSize":1,"maxConnectionIdletime":3600,"maxConnectionLifetime":14400,"acquireRetryAttempts":30,"acquireRetryDelay":2000,"idleConnectionTestPeriod":3600,"testConnectionCheckout":false,"testConnectionCheckin":false,"connectionTimeout":5000,"showSQL":false,"loadOnStartup":true,"monitorEnable":false,"weight":1,"_xaDataSourceClass":"com.mysql.jdbc.jdbc2.optional.MysqlXADataSource","customize":{}} [http-nio-8085-exec-3] INFO com.mchange.v2.c3p0.impl.AbstractPoolBackedDataSource - Initializing c3p0 pool... com.mchange.v2.c3p0.ComboPooledDataSource [ acquireIncrement -> 1, acquireRetryAttempts -> 30, acquireRetryDelay -> 2000, autoCommitOnClose -> false, automaticTestTable -> null, breakAfterAcquireFailure -> false, checkoutTimeout -> 5000, connectionCustomizerClassName -> null, connectionTesterClassName -> com.mchange.v2.c3p0.impl.DefaultConnectionTester, contextClassLoaderSource -> caller, dataSourceName -> mysql.1509899288, debugUnreturnedConnectionStackTraces -> false, description -> null, driverClass -> com.mysql.cj.jdbc.Driver, extensions -> {}, factoryClassLocation -> null, forceIgnoreUnresolvedTransactions -> false, forceUseNamedDriverClass -> false, identityToken -> 1hgf2m9a61o8yo2f6gh111|79cc61a8, idleConnectionTestPeriod -> 3600, initialPoolSize -> 1, jdbcUrl -> jdbc:mysql://47.XXX.XXX.229:3306/huajiu?useSSL=true, maxAdministrativeTaskTime -> 0, maxConnectionAge -> 14400, maxIdleTime -> 3600, maxIdleTimeExcessConnections -> 0, maxPoolSize -> 10, maxStatements -> 0, maxStatementsPerConnection -> 0, minPoolSize -> 1, numHelperThreads -> 1, preferredTestQuery -> null, privilegeSpawnedThreads -> false, properties -> {user=******, password=******}, propertyCycle -> 0, statementCacheNumDeferredCloseThreads -> 0, testConnectionOnCheckin -> false, testConnectionOnCheckout -> false, unreturnedConnectionTimeout -> 0, userOverrides -> {}, usesTraditionalReflectiveProxies -> false ] com.gdk.jdbc.connection.ConnectFailedException: Can not get a Master Connection from datasource<mysql>! 
at com.gdk.jdbc.connection.ConnectionPoolManager.getMasterConnection(ConnectionPoolManager.java:188) at com.gdk.jdbc.JdbcHandlerImpl.getMasterConnection(JdbcHandlerImpl.java:1780) at com.gdk.jdbc.JdbcHandlerImpl.getConnection(JdbcHandlerImpl.java:1767) at com.gdk.jdbc.JdbcHandlerImpl.queryForList(JdbcHandlerImpl.java:1156) at com.gdk.jdbc.JdbcHandlerImpl.queryForList(JdbcHandlerImpl.java:1138) at com.qshl.aqb.sys.dao.PlatformDao.getUserByUname(PlatformDao.java:21) at com.qshl.aqb.sys.service.impl.PlatformServiceImpl.login(PlatformServiceImpl.java:26) at com.qshl.aqb.sys.controller.PlatformController.unameLogin(PlatformController.java:51) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:497) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:110) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:776) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:705) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:966) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:868) at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:842) at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:85) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240) at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212) at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:94) at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:496) at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141) at 
org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620) at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:502) at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1156) at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:684) at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1539) at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1495) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) at java.lang.Thread.run(Thread.java:745) Caused by: com.mchange.v2.resourcepool.TimeoutException: A client timed out while waiting to acquire a resource from com.mchange.v2.resourcepool.BasicResourcePool@1c2032db -- timeout at awaitAvailable() at com.mchange.v2.resourcepool.BasicResourcePool.awaitAvailable(BasicResourcePool.java:1461) at com.mchange.v2.resourcepool.BasicResourcePool.prelimCheckoutResource(BasicResourcePool.java:639) at com.mchange.v2.resourcepool.BasicResourcePool.checkoutResource(BasicResourcePool.java:549) at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool.checkoutAndMarkConnectionInUse(C3P0PooledConnectionPool.java:756) at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool.checkoutPooledConnection(C3P0PooledConnectionPool.java:683) at com.mchange.v2.c3p0.impl.AbstractPoolBackedDataSource.getConnection(AbstractPoolBackedDataSource.java:140) at com.gdk.jdbc.connection.c3p0.C3P0ConnectionProvider.getConnection(C3P0ConnectionProvider.java:230) at com.gdk.jdbc.connection.ConnectionPoolManager.getMasterConnection(ConnectionPoolManager.java:177) ... 49 more
```
Hooking WebBrowser to get the opened URL, request/response headers, etc.
The following code uses EasyHook. I want to start by capturing the URL being opened, but the breakpoint is never hit. Am I hooking the wrong function?
```
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.Runtime.InteropServices;
using EasyHook;

namespace TestHookAPI
{
    public partial class Form1 : Form
    {
        LocalHook hookt = null;

        [DllImport("WININET.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        public static extern IntPtr InternetOpenUrlW(IntPtr hInternet, string lpszUrl, string lpszHeaders, Int32 dwHeadersLength, Int32 dwFlags, System.UIntPtr dwContext);

        [UnmanagedFunctionPointer(CallingConvention.StdCall, CharSet = CharSet.Unicode, SetLastError = true)]
        public delegate IntPtr d_InternetOpenUrl(IntPtr hInternet, string lpszUrl, string lpszHeaders, Int32 dwHeadersLength, Int32 dwFlags, System.UIntPtr dwContext);

        public static IntPtr h_InternetOpenUrl(IntPtr hInternet, string lpszUrl, string lpszHeaders, Int32 dwHeadersLength, Int32 dwFlags, System.UIntPtr dwContext)
        {
            IntPtr pret = InternetOpenUrlW(hInternet, lpszUrl, lpszHeaders, dwHeadersLength, dwFlags, dwContext);
            return pret;
        }

        public Form1()
        {
            InitializeComponent();
        }

        private void Form1_Load(object sender, EventArgs e)
        {
            hookt = LocalHook.Create(LocalHook.GetProcAddress("WININET.dll", "InternetOpenUrlW"), new d_InternetOpenUrl(h_InternetOpenUrl), this);
            hookt.ThreadACL.SetExclusiveACL(new Int32[0]);
            webBrowser1.Navigate("http://www.baidu.com/");
        }

        private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
        }
    }
}
```
Hooking System.load in the Xposed framework makes apps crash
I wrote my own Xposed module that hooks System.load. Once it is activated, QQ Browser crashes, UC Browser complains about an incomplete installation, and the stock browser on the 4.4 system opens to a black screen. The logged error is "java.lang.NoClassDefFoundError", accompanied by messages like "W/linker(5153): libmsfbootV2.so has text relocations. This is wasting memory and is a security risk. Please fix." The phone is an HTC D816t running Android 4.4.2, the Xposed framework is version 2.6.1, and the module is built against the "XposedBridgeApi-54" jar. Is the Java-layer hook failing because the app has added protections? Hooking other functions does not crash the apps outright.
Writing The Gold Miners program
Problem Description
There are some funny mini games on the internet. Sailormoon girls usually play these games in their spare time. They always play a mini game named THE GOLD MINERS. Now they give you a problem about this game. Could you solve it? Now let's look at the following picture. We set the position of the miner as the origin of coordinates. There are many objects in this game, not only gold; there are also diamonds, stones, and other unknown things. All the objects can be treated as polygons, and no two polygons intersect. In this problem, we will give you some data about the angles in order. When the hook sinks at a certain angle, it will always catch the nearest object and earn the corresponding money; afterwards there is nothing left at that object's position. If there are no objects, the hook returns to the origin of coordinates. How much money does the miner have at last? Do you know?
Input
There are multiple test cases. Each case begins with a positive integer N (0 <= N <= 30), the number of objects. The following N * 2 lines describe the objects. Each object starts with a positive integer V (3 <= V <= 15), meaning the object can be treated as a polygon with V vertices, followed by a positive integer W, the value of the object. The next line contains V * 2 decimal numbers X1, Y1, X2, Y2, ..., Xv, Yv. You can assume all coordinate data are in the double range; vertex coordinates are given clockwise. Then follows a positive integer M (M <= 100), the number of times the miner casts the hook. The following M lines contain M positive integers; the Xth integer is the angle at which the miner casts the hook the Xth time (0 < angle < 180).
Output
For each test case, calculate and print the value the miner earns at last.
Sample Input
4 4 158 0.0 -1.0 2.0 -1.0 0.0 -3.0 -1.0 -2.0 4 37 4.0 -1.0 4.0 -2.0 3.0 -2.0 3.0 -1.0 6 360 2.0 -3.0 3.0 -3.0 2.0 -5.0 1.0 -5.0 1.0 -4.0 2.0 -4.0 5 223 -2.0 -4.0 -2.0 -5.0 -3.0 -6.0 -4.0 -5.0 -4.0 -4.0 3 45 16 127 5 4 10 -3.0 0.0 -2.0 -1.0 -3.0 -1.0 -4.0 0.0 4 208 -2.0 0.0 0.0 -1.0 3.0 -1.0 0.0 -3.0 6 660 3.0 -2.0 4.0 -2.0 5.0 -3.0 6.0 -3.0 6.0 -6.0 3.0 -6.0 4 25 -1.0 -4.0 2.0 -4.0 2.0 -6.0 -1.0 -6.0 3 18 0.0 -7.0 1.0 -8.0 -1.0 -8.0 3 34 64 90
Sample Output
418
251
After installing tensorflow on Win7 with conda install tensorflow-gpu (conda, Python 3.7), the following problem appears during testing
When testing import tensorflow as tf; print('hello'), the following problem appears. What causes this, and how do I fix it?
```
Traceback (most recent call last): File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module> from tensorflow.python.pywrap_tensorflow_internal import * File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import module = self._system_import(name, *args, **kwargs) File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module> _pywrap_tensorflow_internal = swig_import_helper() File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper _mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description) File "D:\ProgramData\Anaconda3\lib\imp.py", line 242, in load_module return load_dynamic(name, filename, file) File "D:\ProgramData\Anaconda3\lib\imp.py", line 342, in load_dynamic return _load(spec) ImportError: DLL load failed: 找不到指定的程序。 During handling of the above exception, another exception occurred: Traceback (most recent call last): File "D:\ProgramData\Anaconda3\lib\site-packages\IPython\core\interactiveshell.py", line 3296, in run_code exec(code_obj, self.user_global_ns, self.user_ns) File "<ipython-input-2-d1ce02c95f3b>", line 1, in <module> runfile('C:/Users/jianjiu17/Desktop/deep-learning-from-scratch-master/uittle.py', wdir='C:/Users/jianjiu17/Desktop/deep-learning-from-scratch-master') File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_umd.py", line 197, in runfile pydev_imports.execfile(filename, global_vars, local_vars) # execute the script File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile exec(compile(contents+"\n", file, 'exec'), glob, loc) File "C:/Users/jianjiu17/Desktop/deep-learning-from-scratch-master/uittle.py", line 1, in <module> import tensorflow as tf File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import module = self._system_import(name, *args, **kwargs) File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\__init__.py", line 24, in <module> from tensorflow.python import pywrap_tensorflow # pylint: disable=unused-import File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import module = self._system_import(name, *args, **kwargs) File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\__init__.py", line 49, in <module> from tensorflow.python import pywrap_tensorflow File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import module = self._system_import(name, *args, **kwargs) File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 74, in <module> raise ImportError(msg) ImportError: Traceback (most recent call last): File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module> from tensorflow.python.pywrap_tensorflow_internal import * File "D:\Program Files\JetBrains\PyCharm 2019.1.3\helpers\pydev\_pydev_bundle\pydev_import_hook.py", line 21, in do_import module = self._system_import(name, *args, **kwargs) File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module> _pywrap_tensorflow_internal
= swig_import_helper() File "D:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper _mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description) File "D:\ProgramData\Anaconda3\lib\imp.py", line 242, in load_module return load_dynamic(name, filename, file) File "D:\ProgramData\Anaconda3\lib\imp.py", line 342, in load_dynamic return _load(spec) ImportError: DLL load failed: 找不到指定的程序。 Failed to load the native TensorFlow runtime. See https://www.tensorflow.org/install/errors for some common reasons and solutions. Include the entire stack trace above this error message when asking for help. ```
How do I hook on 64-bit Linux so execution jumps to my own hook function?
I want to implement hooking on 64-bit Linux with the following behavior: 1. a system function jumps to a hook function I define myself. Note: this is a 64-bit system, not a 32-bit one.
Registering my own netfilter kernel module on CentOS 6.9 reboots the system; could someone take a look?
I wrote a NAT function that registers hooks at pre-routing and post-routing. But after insmod of the module, the system reboots immediately. When I comment out the bodies of the hook functions it no longer reboots on insmod, but then it reboots on rmmod. So the registration itself seems to be the problem, but I cannot find where.
```
{
    nfhk_serv_in.hook = nf_hook_proc_in;
    nfhk_serv_in.pf = PF_INET;
    nfhk_serv_in.hooknum = NF_INET_PRE_ROUTING;
    nfhk_serv_in.priority = NF_BR_PRI_FIRST;
    //ret = nf_register_hook(&nfhk_serv_in);
    if (ret != 0)
        return ret;

    nfhk_serv_out.hook = nf_hook_proc_out;
    nfhk_serv_out.pf = PF_INET;
    nfhk_serv_out.hooknum = NF_INET_POST_ROUTING;
    nfhk_serv_out.priority = NF_BR_PRI_FIRST;
    ret = nf_register_hook(&nfhk_serv_out);
    if (ret != 0)
        return ret;
}
```
pyinstaller reports a "pre-safe-import-module hook failed, needs fixing." error when packaging
An error occurs when packaging a file with pyinstaller:
```
3113 INFO: Processing pre-safe import module hook urllib3.packages.six.moves pre-safe-import-module hook failed, needs fixing.
```
No exe file is generated. Version: Python 3.7. I am a complete beginner at Python; could someone explain what is happening and how to fix it?
Help: packaging with pyinstaller 3.5 gives WARNING: Cannot read QLibraryInfo... and json.decoder.JSONDecodeError...
Help please: I am a Python newbie with a pyinstaller packaging problem...
---
I installed Anaconda directly; pyinstaller was downloaded as a package and installed with python setup.py install.
---
Version info: PyQt: PyQt5-5.12.1; Python: Python 3.5.2 :: Anaconda 4.2.0 (64-bit); pyinstaller: 3.5.dev0+14b6e6564
---
The error output is below:
```
C:\CodeNew\pythonEx\excel>pyinstaller -F UnitClassification.py 573 INFO: PyInstaller: 3.5.dev0+14b6e6564 573 INFO: Python: 3.5.2 575 INFO: Platform: Windows-10-10.0.17134-SP0 583 INFO: wrote C:\CodeNew\pythonEx\excel\UnitClassification.spec 588 INFO: UPX is not available. 590 INFO: Extending PYTHONPATH with paths ['C:\\CodeNew\\pythonEx\\excel', 'C:\\CodeNew\\pythonEx\\excel'] 591 INFO: checking Analysis 910 INFO: Building because C:\Users\wen\Anaconda3\lib\site-packages\PyQt5\__init__.py changed 911 INFO: Initializing module dependency graph... 918 INFO: Initializing module graph hooks... 928 INFO: Analyzing base_library.zip ... 8230 INFO: running Analysis Analysis-00.toc 9060 INFO: Caching module hooks... 9066 INFO: Analyzing C:\CodeNew\pythonEx\excel\UnitClassification.py 9247 INFO: Processing pre-find module path hook distutils 9823 INFO: Processing pre-safe import module hook six.moves 18032 INFO: Processing pre-safe import module hook setuptools.extern.six.moves 18932 INFO: Processing pre-find module path hook site 18938 INFO: site: retargeting to fake-dir 'c:\\users\\wen\\anaconda3\\lib\\site-packages\\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\\PyInstaller\\fake-modules' 18982 INFO: Processing pre-safe import module hook win32com 102468 INFO: Loading module hooks... 102469 INFO: Loading module hook "hook-PyQt5.py"... 102560 WARNING: Cannot read QLibraryInfo output: raised Expecting value: line 1 column 1 (char 0) when decoding: Traceback (most recent call last): File "<string>", line 11, in <module> ImportError: DLL load failed: 找不到指定的模块。 Traceback (most recent call last): File "C:\Users\wen\Anaconda3\Scripts\pyinstaller-script.py", line 11, in <module> load_entry_point('PyInstaller==3.5.dev0+14b6e6564', 'console_scripts', 'pyinstaller')() File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\__main__.py", line 111, in run run_build(pyi_config, spec_file, **vars(args)) File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\__main__.py", line 63, in run_build PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs) File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\build_main.py", line 844, in main build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build')) File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\build_main.py", line 791, in build exec(code, spec_namespace) File "C:\CodeNew\pythonEx\excel\UnitClassification.spec", line 17, in <module> noarchive=False) File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\build_main.py", line 243, in __init__ self.__postinit__() File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\datastruct.py", line 158, in __postinit__ self.assemble() File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\build_main.py", line 502, in assemble module_hook.post_graph() File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\imphook.py", line 410, in post_graph self._load_hook_module() File
"c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\building\imphook.py", line 377, in _load_hook_module self.hook_module_name, self.hook_filename) File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\compat.py", line 785, in importlib_load_source return mod_loader.load_module() File "<frozen importlib._bootstrap_external>", line 388, in _check_name_wrapper File "<frozen importlib._bootstrap_external>", line 809, in load_module File "<frozen importlib._bootstrap_external>", line 668, in load_module File "<frozen importlib._bootstrap>", line 268, in _load_module_shim File "<frozen importlib._bootstrap>", line 693, in _load File "<frozen importlib._bootstrap>", line 673, in _load_unlocked File "<frozen importlib._bootstrap_external>", line 665, in exec_module File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\hooks\hook-PyQt5.py", line 23, in <module> collect_system_data_files(pyqt5_library_info.location['PrefixPath'], File "c:\users\wen\anaconda3\lib\site-packages\pyinstaller-3.5.dev0+14b6e6564-py3.5.egg\PyInstaller\utils\hooks\qt.py", line 70, in __getattr__ qli = json.loads(json_str) File "C:\Users\wen\Anaconda3\lib\json\__init__.py", line 319, in loads return _default_decoder.decode(s) File "C:\Users\wen\Anaconda3\lib\json\decoder.py", line 339, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File "C:\Users\wen\Anaconda3\lib\json\decoder.py", line 357, in raw_decode raise JSONDecodeError("Expecting value", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ```
How to hook the OpenGL wglSwapBuffers function in the latest Nox emulator so that execution jumps to my own function
I want to hook the latest Nox emulator client, whose internal setting is OpenGL rendering. Using a dependency viewer I inspected the latest client's DLL dependencies and noticed that it does not depend directly on gdi32.dll or opengl32.dll; instead it depends on glut32.dll, which in turn depends on those two DLLs. I hooked the SwapBuffers function in gdi32.dll; the hook is installed but never executed, so it never jumps into my function. Hooking wglSwapBuffers in opengl32.dll gives the same result.
C# global keyboard Hook stops receiving events
I am using a Hook class found online, shown below:
```
using System;
using System.Collections.Generic;
using System.Text;
using System.Runtime.InteropServices;
using System.Windows.Forms;
using System.Reflection;

namespace HookTest
{
    /// <summary>
    /// Keyboard hook
    /// [The following code comes from a forum user and is not my own work]
    /// </summary>
    class KeyboardHook
    {
        public event KeyEventHandler KeyDownEvent;
        public event KeyPressEventHandler KeyPressEvent;
        public event KeyEventHandler KeyUpEvent;

        public delegate int HookProc(int nCode, Int32 wParam, IntPtr lParam);

        static int hKeyboardHook = 0; // initial value of the keyboard hook handle
        // Values can be looked up in Winuser.h in the Microsoft SDK
        // http://www.bianceng.cn/Programming/csharp/201410/45484.htm
        public const int WH_KEYBOARD_LL = 13; // 2 for a thread keyboard hook, 13 for a global keyboard hook
        HookProc KeyboardHookProcedure; // keeps KeyboardHookProcedure alive as a HookProc

        // Keyboard structure
        [StructLayout(LayoutKind.Sequential)]
        public class KeyboardHookStruct
        {
            public int vkCode;      // virtual-key code, in the range 1 to 254
            public int scanCode;    // hardware scan code of the key
            public int flags;       // key flags
            public int time;        // timestamp of this message
            public int dwExtraInfo; // extra information associated with the message
        }

        // Installs a hook
        [DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
        public static extern int SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hInstance, int threadId);

        // Uninstalls a hook
        [DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
        public static extern bool UnhookWindowsHookEx(int idHook);

        // Passes the message on to the next hook in the chain
        [DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
        public static extern int CallNextHookEx(int idHook, int nCode, Int32 wParam, IntPtr lParam);

        // Gets the current thread id (needed for thread hooks)
        [DllImport("kernel32.dll")]
        static extern int GetCurrentThreadId();

        // Use the Windows API instead of the managed way of getting the current instance, to keep the hook from failing
        [DllImport("kernel32.dll")]
        public static extern IntPtr GetModuleHandle(string name);

        public void Start()
        {
            // Install the keyboard hook
            if (hKeyboardHook == 0)
            {
                KeyboardHookProcedure = new HookProc(KeyboardHookProc);
                hKeyboardHook = SetWindowsHookEx(WH_KEYBOARD_LL, KeyboardHookProcedure, GetModuleHandle(System.Diagnostics.Process.GetCurrentProcess().MainModule.ModuleName), 0);
                //hKeyboardHook = SetWindowsHookEx(WH_KEYBOARD_LL, KeyboardHookProcedure, Marshal.GetHINSTANCE(Assembly.GetExecutingAssembly().GetModules()[0]), 0);
                //************************************
                // Thread keyboard hook:
                //SetWindowsHookEx(2, KeyboardHookProcedure, IntPtr.Zero, GetCurrentThreadId()); // pass the id of the thread to monitor
                // Global keyboard hook, needs using System.Reflection:
                //SetWindowsHookEx(13, MouseHookProcedure, Marshal.GetHINSTANCE(Assembly.GetExecutingAssembly().GetModules()[0]), 0);
                //
                // About SetWindowsHookEx(int idHook, HookProc lpfn, IntPtr hInstance, int threadId), which adds the hook to the hook chain; its four parameters:
                // idHook: the hook type, i.e. which messages to monitor. 2 above means a thread keyboard hook; a global keyboard hook is 13;
                // a thread mouse hook is 7 and a global mouse hook is 14.
                // lpfn: pointer to the hook procedure. If dwThreadId is 0 or identifies a thread created by another process, lpfn must point to a
                // hook procedure in a DLL; otherwise it can point to hook procedure code in the current process. It is called whenever the hook catches a message.
                // hInstance: handle of the application instance containing the hook procedure pointed to by lpfn. If threadId identifies a thread created by
                // the current process and the procedure lives in the current process, hInstance must be NULL; it can simply be the instance handle of this application.
                // threadId: identifier of the thread the hook procedure is associated with; if 0, the hook is associated with all threads, i.e. it is a global hook.
                //************************************
                // If SetWindowsHookEx failed
                if (hKeyboardHook == 0)
                {
                    Stop();
                    throw new Exception("Failed to install the keyboard hook");
                }
            }
        }

        public void Stop()
        {
            bool retKeyboard = true;
            if (hKeyboardHook != 0)
            {
                retKeyboard = UnhookWindowsHookEx(hKeyboardHook);
                hKeyboardHook = 0;
            }
            if (!(retKeyboard))
                throw new Exception("Failed to uninstall the hook!");
        }

        // ToAscii translates the given virtual-key code and keyboard state into the corresponding character(s)
        [DllImport("user32")]
        public static extern int ToAscii(
            int uVirtKey,       // [in] the virtual-key code to translate
            int uScanCode,      // [in] hardware scan code of the key; the high-order bit of this value is set if the key is up (not pressed)
            byte[] lpbKeyState, // [in] pointer to a 256-byte array with the current keyboard state, one byte per key. If the high-order bit of a byte is set, the key is down (pressed); if the low-order bit is set, the key is toggled. Here only the toggle bit of CAPS LOCK matters; the toggle state of NUM LOCK and SCROLL LOCK is ignored.
            byte[] lpwTransKey, // [out] buffer that receives the translated character(s)
            int fuState);       // [in] Specifies whether a menu is active. This parameter must be 1 if a menu is active, or 0 otherwise.

        // Gets the state of the keys
        [DllImport("user32")]
        public static extern int GetKeyboardState(byte[] pbKeyState);

        [DllImport("user32.dll", CharSet = CharSet.Auto, CallingConvention = CallingConvention.StdCall)]
        private static extern short GetKeyState(int vKey);

        private const int WM_KEYDOWN = 0x100;    // KEYDOWN
        private const int WM_KEYUP = 0x101;      // KEYUP
        private const int WM_SYSKEYDOWN = 0x104; // SYSKEYDOWN
        private const int WM_SYSKEYUP = 0x105;   // SYSKEYUP

        private int KeyboardHookProc(int nCode, Int32 wParam, IntPtr lParam)
        {
            // Listen for keyboard events
            if ((nCode >= 0) && (KeyDownEvent != null || KeyUpEvent != null || KeyPressEvent != null))
            {
                KeyboardHookStruct MyKeyboardHookStruct = (KeyboardHookStruct)Marshal.PtrToStructure(lParam, typeof(KeyboardHookStruct));
                // raise KeyDown
                if (KeyDownEvent != null && (wParam == WM_KEYDOWN || wParam == WM_SYSKEYDOWN))
                {
                    Keys keyData = (Keys)MyKeyboardHookStruct.vkCode;
                    KeyEventArgs e = new KeyEventArgs(keyData);
                    KeyDownEvent(this, e);
                }
                // key pressed
                if (KeyPressEvent != null && wParam == WM_KEYDOWN)
                {
                    byte[] keyState = new byte[256];
                    GetKeyboardState(keyState);
                    byte[] inBuffer = new byte[2];
                    if (ToAscii(MyKeyboardHookStruct.vkCode, MyKeyboardHookStruct.scanCode, keyState, inBuffer, MyKeyboardHookStruct.flags) == 1)
                    {
                        KeyPressEventArgs e = new KeyPressEventArgs((char)inBuffer[0]);
                        KeyPressEvent(this, e);
                    }
                }
                // key released
                if (KeyUpEvent != null && (wParam == WM_KEYUP || wParam == WM_SYSKEYUP))
                {
                    Keys keyData = (Keys)MyKeyboardHookStruct.vkCode;
                    KeyEventArgs e = new KeyEventArgs(keyData);
                    KeyUpEvent(this, e);
                }
            }
            // Returning 1 swallows the message: it stops here and is not passed on.
            // Returning 0 or calling CallNextHookEx passes the message on down the chain to its real recipient.
            return CallNextHookEx(hKeyboardHook, nCode, wParam, lParam);
        }

        ~KeyboardHook()
        {
            Stop();
        }
    }
}
```
I was testing global keyboard listening today. At first everything worked and events were captured, but when I happened to switch to LoL as the foreground window it stopped working and no keyboard events came through; I then noticed the same thing when the TGP window has focus. I am a beginner; any pointers would be appreciated.
A question about netfilter under the Linux 4.9.x kernel
For my graduation project I need to work with forwarded packets. I learned that netfilter can do this through custom hooks and studied several articles, but most of them target kernel 2.4.x, with 3.x being the newest. Has anyone here worked with this on 4.9.x, or where can I find out what changed across the kernel upgrades?
How do I get the original entry address of a hooked function after an API hook?
After an API hook is in place, how do I get the hooked function's original entry address? And how do I get write access to those memory blocks?
The exe built from a Python 3.6 script with pyinstaller cannot load the packages it needs; what should I do?
I have tried all sorts of things. When building the exe I used the -p option to add the folder paths of the packages I need from python\lib\site-packages, and also tried appending --path=, but the generated exe still shows this when it runs:
```
[15664] PyInstaller Bootloader 3.x
[15664] LOADER: executable is E:\python\exe\dist\Timer.exe
[15664] LOADER: homepath is E:\python\exe\dist
[15664] LOADER: _MEIPASS2 is NULL
[15664] LOADER: archivename is E:\python\exe\dist\Timer.exe
[15664] LOADER: Extracting binaries
[15664] LOADER: Executing self as child
[15664] LOADER: set _MEIPASS2 to C:\Users\pm494\AppData\Local\Temp\_MEI156642
[15664] LOADER: Setting up to run child
[15664] LOADER: Creating child process
[15664] LOADER: Waiting for child process to finish...
[17992] PyInstaller Bootloader 3.x
[17992] LOADER: executable is E:\python\exe\dist\Timer.exe
[17992] LOADER: homepath is E:\python\exe\dist
[17992] LOADER: _MEIPASS2 is C:\Users\pm494\AppData\Local\Temp\_MEI156642
[17992] LOADER: archivename is E:\python\exe\dist\Timer.exe
[17992] LOADER: SetDllDirectory(C:\Users\pm494\AppData\Local\Temp\_MEI156642)
[17992] LOADER: Already in the child - running user's code.
[17992] LOADER: manifestpath: C:\Users\pm494\AppData\Local\Temp\_MEI156642\Timer.exe.manifest
[17992] LOADER: Activation context created
[17992] LOADER: Activation context activated
[17992] LOADER: Python library: C:\Users\pm494\AppData\Local\Temp\_MEI156642\python36.dll
[17992] LOADER: Loaded functions from Python library.
[17992] LOADER: Manipulating environment (sys.path, sys.prefix)
[17992] LOADER: Pre-init sys.path is C:\Users\pm494\AppData\Local\Temp\_MEI156642\base_library.zip;C:\Users\pm494\AppData\Local\Temp\_MEI156642
[17992] LOADER: sys.prefix is C:\Users\pm494\AppData\Local\Temp\_MEI156642
[17992] LOADER: Setting runtime options
[17992] LOADER: Bootloader option: pyi-windows-manifest-filename Timer.exe.manifest
[17992] LOADER: Initializing python
[17992] LOADER: Overriding Python's sys.path
[17992] LOADER: Post-init sys.path is C:\Users\pm494\AppData\Local\Temp\_MEI156642\base_library.zip;C:\Users\pm494\AppData\Local\Temp\_MEI156642
[17992] LOADER: Setting sys.argv
[17992] LOADER: setting sys._MEIPASS
[17992] LOADER: importing modules from CArchive
[17992] LOADER: extracted struct
[17992] LOADER: callfunction returned...
[17992] LOADER: extracted pyimod01_os_path
[17992] LOADER: callfunction returned...
[17992] LOADER: extracted pyimod02_archive
[17992] LOADER: callfunction returned...
[17992] LOADER: extracted pyimod03_importers
[17992] LOADER: callfunction returned...
[17992] LOADER: Installing PYZ archive with Python modules.
[17992] LOADER: PYZ archive: out00-PYZ.pyz
[17992] LOADER: Running pyiboot01_bootstrap.py
[17992] LOADER: Running pyi_rth_win32comgenpy.py
[17992] LOADER: Running Timer.py
Traceback (most recent call last):
  File "Timer.py", line 24, in <module>
ModuleNotFoundError: No module named 'logilib'
[17992] Failed to execute script Timer
[17992] LOADER: OK.
[17992] LOADER: Cleaning up Python interpreter.
[15664] LOADER: Back to parent (RC: -1)
[15664] LOADER: Doing cleanup
[15664] LOADER: Freeing archive status for E:\python\exe\dist\Timer.exe
```
I tried pyinstaller on another machine with Python 3.5 and the required packages installed, built the exe the same way, and got the same result. Line 24 of the code is the BeautifulSoup import: from bs4 import BeautifulSoup. After a long time searching Google I still have not found a workable solution. Could someone help me figure out what is going on?
The pyinstaller run looks like this:
```
E:\python\exe>pyinstaller -F -d Timer.py --path=E:/python/exe/Lib;E:/python/exe/Lib/site-package;E:/python/exe/Lib/site-packages/xlwings;E:/python/exe/Lib/site-packages/bs4;E:/python/exe/Lib/site-packages/selenium
5 WARNING: Internal error: early pywin32 import was introduced
63 INFO: PyInstaller: 3.2.1
63 INFO: Python: 3.6.2
64 INFO: Platform: Windows-10-10.0.14393-SP0
66 INFO: wrote E:\python\exe\Timer.spec
68 INFO: UPX is not available.
71 INFO: Extending PYTHONPATH with paths ['E:\\python\\exe', 'E:\\python\\exe\\Lib', 'E:\\python\\exe\\Lib\\site-package', 'E:\\python\\exe\\Lib\\site-packages\\xlwings', 'E:\\python\\exe\\Lib\\site-packages\\bs4', 'E:\\python\\exe\\Lib\\site-packages\\selenium', 'E:\\python\\exe']
75 INFO: checking Analysis
76 INFO: Building Analysis because out00-Analysis.toc is non existent
77 INFO: Initializing module dependency graph...
80 INFO: Initializing module graph hooks...
82 INFO: Analyzing base_library.zip ...
2889 INFO: running Analysis out00-Analysis.toc
2892 INFO: Adding Microsoft.Windows.Common-Controls to dependent assemblies of final executable required by D:\workapp\Python\python.exe
3230 INFO: Caching module hooks...
3234 INFO: Analyzing E:\python\exe\Timer.py
4930 INFO: Processing pre-safe import module hook win32com
6402 INFO: Loading module hooks...
6402 INFO: Loading module hook "hook-encodings.py"...
6488 INFO: Loading module hook "hook-pydoc.py"...
6489 INFO: Loading module hook "hook-pythoncom.py"...
6706 INFO: Loading module hook "hook-pywintypes.py"...
6915 INFO: Loading module hook "hook-selenium.py"...
6920 INFO: Loading module hook "hook-win32com.py"...
6920 INFO: Loading module hook "hook-xml.dom.domreg.py"...
6921 INFO: Loading module hook "hook-xml.py"...
6940 INFO: Looking for ctypes DLLs
7045 WARNING: library coredll required via ctypes not found
7054 INFO: Analyzing run-time hooks ...
7058 INFO: Including run-time hook 'pyi_rth_win32comgenpy.py'
7067 INFO: Looking for dynamic libraries
7380 INFO: Looking for eggs
7381 INFO: Using Python library D:\workapp\Python\python36.dll
7382 INFO: Found binding redirects: []
7389 INFO: Warnings written to E:\python\exe\build\Timer\warnTimer.txt
7401 INFO: checking PYZ
7401 INFO: Building PYZ because out00-PYZ.toc is non existent
7403 INFO: Building PYZ (ZlibArchive) E:\python\exe\build\Timer\out00-PYZ.pyz
8492 INFO: Building PYZ (ZlibArchive) E:\python\exe\build\Timer\out00-PYZ.pyz completed successfully.
8503 INFO: checking PKG
8503 INFO: Building PKG because out00-PKG.toc is non existent
8504 INFO: Building PKG (CArchive) out00-PKG.pkg
11271 INFO: Building PKG (CArchive) out00-PKG.pkg completed successfully.
11275 INFO: Bootloader D:\workapp\Python\lib\site-packages\PyInstaller\bootloader\Windows-64bit\run_d.exe
11275 INFO: checking EXE
11276 INFO: Building EXE because out00-EXE.toc is non existent
11278 INFO: Building EXE from out00-EXE.toc
11279 INFO: Appending archive to EXE E:\python\exe\dist\Timer.exe
11288 INFO: Building EXE from out00-EXE.toc completed successfully.
```
------------------------------------------------------------------------
I just tried another package, and the problem seems to come down to bs4 in the script; when loading a different package there is no problem and the program runs. So why this bs4 package, and how do I track the problem down?
Spark cannot read the hive metastore and cannot get the databases
Here is the exception:
```
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data01/hadoop/yarn/local/filecache/355/spark2-hdp-yarn-archive.tar.gz/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.0-292/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for TERM 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for HUP 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for INT 19/08/13 19:53:17 INFO SecurityManager: Changing view acls to: yarn,hdfs 19/08/13 19:53:17 INFO SecurityManager: Changing modify acls to: yarn,hdfs 19/08/13 19:53:17 INFO SecurityManager: Changing view acls groups to: 19/08/13 19:53:17 INFO SecurityManager: Changing modify acls groups to: 19/08/13 19:53:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); groups with view permissions: Set(); users with modify permissions: Set(yarn, hdfs); groups with modify permissions: Set() 19/08/13 19:53:18 INFO ApplicationMaster: Preparing Local resources 19/08/13 19:53:19 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1565610088533_0087_000001 19/08/13 19:53:19 INFO ApplicationMaster: Starting the user application in a separate Thread 19/08/13 19:53:19 INFO ApplicationMaster: Waiting for spark context initialization... 19/08/13 19:53:19 INFO SparkContext: Running Spark version 2.3.0.2.6.5.0-292 19/08/13 19:53:19 INFO SparkContext: Submitted application: voice_stream 19/08/13 19:53:19 INFO SecurityManager: Changing view acls to: yarn,hdfs 19/08/13 19:53:19 INFO SecurityManager: Changing modify acls to: yarn,hdfs 19/08/13 19:53:19 INFO SecurityManager: Changing view acls groups to: 19/08/13 19:53:19 INFO SecurityManager: Changing modify acls groups to: 19/08/13 19:53:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); groups with view permissions: Set(); users with modify permissions: Set(yarn, hdfs); groups with modify permissions: Set() 19/08/13 19:53:19 INFO Utils: Successfully started service 'sparkDriver' on port 20410. 19/08/13 19:53:19 INFO SparkEnv: Registering MapOutputTracker 19/08/13 19:53:19 INFO SparkEnv: Registering BlockManagerMaster 19/08/13 19:53:19 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 19/08/13 19:53:19 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 19/08/13 19:53:19 INFO DiskBlockManager: Created local directory at /data01/hadoop/yarn/local/usercache/hdfs/appcache/application_1565610088533_0087/blockmgr-94d35b97-43b2-496e-a4cb-73ecd3ed186c 19/08/13 19:53:19 INFO MemoryStore: MemoryStore started with capacity 366.3 MB 19/08/13 19:53:19 INFO SparkEnv: Registering OutputCommitCoordinator 19/08/13 19:53:19 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter 19/08/13 19:53:19 INFO Utils: Successfully started service 'SparkUI' on port 28852.
19/08/13 19:53:19 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://datanode02:28852 19/08/13 19:53:19 INFO YarnClusterScheduler: Created YarnClusterScheduler 19/08/13 19:53:20 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1565610088533_0087 and attemptId Some(appattempt_1565610088533_0087_000001) 19/08/13 19:53:20 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 31984. 19/08/13 19:53:20 INFO NettyBlockTransferService: Server created on datanode02:31984 19/08/13 19:53:20 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 19/08/13 19:53:20 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManagerMasterEndpoint: Registering block manager datanode02:31984 with 366.3 MB RAM, BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO EventLoggingListener: Logging events to hdfs:/spark2-history/application_1565610088533_0087_1 19/08/13 19:53:20 INFO ApplicationMaster: =============================================================================== YARN executor launch context: env: CLASSPATH -> {{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/2.6.5.0-292/hadoop/conf<CPS>/usr/hdp/2.6.5.0-292/hadoop/*<CPS>/usr/hdp/2.6.5.0-292/hadoop/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>/usr/hdp/current/ext/hadoop/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.0-292/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.0-292.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__ SPARK_YARN_STAGING_DIR -> *********(redacted) SPARK_USER -> *********(redacted) command: LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ {{JAVA_HOME}}/bin/java \ -server \ -Xmx5120m \ -Djava.io.tmpdir={{PWD}}/tmp \ '-Dspark.history.ui.port=18081' \ '-Dspark.rpc.message.maxSize=100' \ -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ -XX:OnOutOfMemoryError='kill %p' \ org.apache.spark.executor.CoarseGrainedExecutorBackend \ --driver-url \ spark://CoarseGrainedScheduler@datanode02:20410 \ --executor-id \ <executorId> \ --hostname \ <hostname> \ --cores \ 2 \ --app-id \ application_1565610088533_0087 \ --user-class-path \ file:$PWD/__app__.jar \ --user-class-path \ file:$PWD/hadoop-common-2.7.3.jar \ --user-class-path \ file:$PWD/guava-12.0.1.jar \ --user-class-path \ file:$PWD/hbase-server-1.2.8.jar \ --user-class-path \ file:$PWD/hbase-protocol-1.2.8.jar \ --user-class-path \ file:$PWD/hbase-client-1.2.8.jar \ --user-class-path \ file:$PWD/hbase-common-1.2.8.jar \ 
--user-class-path \ file:$PWD/mysql-connector-java-5.1.44-bin.jar \ --user-class-path \ file:$PWD/spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar \ --user-class-path \ file:$PWD/spark-examples_2.11-1.6.0-typesafe-001.jar \ --user-class-path \ file:$PWD/fastjson-1.2.7.jar \ 1><LOG_DIR>/stdout \ 2><LOG_DIR>/stderr resources: spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar" } size: 12271027 timestamp: 1565697198603 type: FILE visibility: PRIVATE spark-examples_2.11-1.6.0-typesafe-001.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark-examples_2.11-1.6.0-typesafe-001.jar" } size: 1867746 timestamp: 1565697198751 type: FILE visibility: PRIVATE hbase-server-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-server-1.2.8.jar" } size: 4197896 timestamp: 1565697197770 type: FILE visibility: PRIVATE hbase-common-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-common-1.2.8.jar" } size: 570163 timestamp: 1565697198318 type: FILE visibility: PRIVATE __app__.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark_history_data2.jar" } size: 44924 timestamp: 1565697197260 type: FILE visibility: PRIVATE guava-12.0.1.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/guava-12.0.1.jar" } size: 1795932 timestamp: 1565697197614 type: FILE visibility: PRIVATE hbase-client-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-client-1.2.8.jar" } size: 1306401 timestamp: 1565697198180 type: FILE visibility: PRIVATE __spark_conf__ -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/__spark_conf__.zip" } size: 273513 timestamp: 1565697199131 type: ARCHIVE visibility: PRIVATE fastjson-1.2.7.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/fastjson-1.2.7.jar" } size: 417221 timestamp: 1565697198865 type: FILE visibility: PRIVATE hbase-protocol-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-protocol-1.2.8.jar" } size: 4366252 timestamp: 1565697198023 type: FILE visibility: PRIVATE __spark_libs__ -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/hdp/apps/2.6.5.0-292/spark2/spark2-hdp-yarn-archive.tar.gz" } size: 227600110 timestamp: 1549953820247 type: ARCHIVE visibility: PUBLIC mysql-connector-java-5.1.44-bin.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/mysql-connector-java-5.1.44-bin.jar" } size: 
999635 timestamp: 1565697198445 type: FILE visibility: PRIVATE hadoop-common-2.7.3.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hadoop-common-2.7.3.jar" } size: 3479293 timestamp: 1565697197476 type: FILE visibility: PRIVATE =============================================================================== 19/08/13 19:53:20 INFO RMProxy: Connecting to ResourceManager at namenode02/10.1.38.38:8030 19/08/13 19:53:20 INFO YarnRMClient: Registering the ApplicationMaster 19/08/13 19:53:20 INFO YarnAllocator: Will request 3 executor container(s), each with 2 core(s) and 5632 MB memory (including 512 MB of overhead) 19/08/13 19:53:20 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@datanode02:20410) 19/08/13 19:53:20 INFO YarnAllocator: Submitted 3 unlocalized container requests. 19/08/13 19:53:20 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals 19/08/13 19:53:20 INFO AMRMClientImpl: Received new token for : datanode03:45454 19/08/13 19:53:21 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000002 on host datanode03 for executor with ID 1 19/08/13 19:53:21 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: Opening proxy : datanode03:45454 19/08/13 19:53:21 INFO AMRMClientImpl: Received new token for : datanode01:45454 19/08/13 19:53:21 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000003 on host datanode01 for executor with ID 2 19/08/13 19:53:21 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: Opening proxy : datanode01:45454 19/08/13 19:53:22 INFO AMRMClientImpl: Received new token for : datanode02:45454 19/08/13 19:53:22 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000004 on host datanode02 for executor with ID 3 19/08/13 19:53:22 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 
19/08/13 19:53:22 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:22 INFO ContainerManagementProtocolProxy: Opening proxy : datanode02:45454 19/08/13 19:53:24 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.198.144:41122) with ID 1 19/08/13 19:53:25 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.229.163:24656) with ID 3 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode03:3328 with 2.5 GB RAM, BlockManagerId(1, datanode03, 3328, None) 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode02:28863 with 2.5 GB RAM, BlockManagerId(3, datanode02, 28863, None) 19/08/13 19:53:25 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.229.158:64276) with ID 2 19/08/13 19:53:25 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8 19/08/13 19:53:25 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode01:20487 with 2.5 GB RAM, BlockManagerId(2, datanode01, 20487, None) 19/08/13 19:53:25 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect. 19/08/13 19:53:25 INFO SparkContext: Starting job: start at VoiceApplication2.java:128 19/08/13 19:53:25 INFO DAGScheduler: Registering RDD 1 (start at VoiceApplication2.java:128) 19/08/13 19:53:25 INFO DAGScheduler: Got job 0 (start at VoiceApplication2.java:128) with 20 output partitions 19/08/13 19:53:25 INFO DAGScheduler: Final stage: ResultStage 1 (start at VoiceApplication2.java:128) 19/08/13 19:53:25 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0) 19/08/13 19:53:25 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0) 19/08/13 19:53:26 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[1] at start at VoiceApplication2.java:128), which has no missing parents 19/08/13 19:53:26 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KB, free 366.3 MB) 19/08/13 19:53:26 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2011.0 B, free 366.3 MB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode02:31984 (size: 2011.0 B, free: 366.3 MB) 19/08/13 19:53:26 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1039 19/08/13 19:53:26 INFO DAGScheduler: Submitting 50 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[1] at start at VoiceApplication2.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 19/08/13 19:53:26 INFO YarnClusterScheduler: Adding task set 0.0 with 50 tasks 19/08/13 19:53:26 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, datanode02, executor 3, partition 0, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, datanode03, executor 1, partition 1, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, datanode01, executor 2, partition 2, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 3.0 in stage 0.0 (TID 3, datanode02, executor 3, partition 3, PROCESS_LOCAL, 7831 
bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, datanode03, executor 1, partition 4, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, datanode01, executor 2, partition 5, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode02:28863 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode03:3328 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode01:20487 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, datanode02, executor 3, partition 6, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, datanode02, executor 3, partition 7, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 693 ms on datanode02 (executor 3) (1/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 712 ms on datanode02 (executor 3) (2/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, datanode02, executor 3, partition 8, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 21 ms on datanode02 (executor 3) (3/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, datanode02, executor 3, partition 9, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 26 ms on datanode02 (executor 3) (4/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 10.0 in stage 0.0 (TID 10, datanode02, executor 3, partition 10, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 8.0 in stage 0.0 (TID 8) in 23 ms on datanode02 (executor 3) (5/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 11.0 in stage 0.0 (TID 11, datanode02, executor 3, partition 11, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 25 ms on datanode02 (executor 3) (6/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 12.0 in stage 0.0 (TID 12, datanode02, executor 3, partition 12, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 10.0 in stage 0.0 (TID 10) in 18 ms on datanode02 (executor 3) (7/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 11.0 in stage 0.0 (TID 11) in 14 ms on datanode02 (executor 3) (8/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 13.0 in stage 0.0 (TID 13, datanode02, executor 3, partition 13, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 14.0 in stage 0.0 (TID 14, datanode02, executor 3, partition 14, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 12.0 in stage 0.0 (TID 12) in 16 ms on datanode02 (executor 3) (9/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 15.0 in stage 0.0 (TID 15, datanode02, executor 3, partition 15, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 13.0 in stage 0.0 (TID 13) in 22 ms on datanode02 (executor 3) (10/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 16.0 in stage 0.0 (TID 16, datanode02, executor 3, partition 16, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 14.0 in stage 0.0 (TID 14) 
in 16 ms on datanode02 (executor 3) (11/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 17.0 in stage 0.0 (TID 17, datanode02, executor 3, partition 17, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 15.0 in stage 0.0 (TID 15) in 13 ms on datanode02 (executor 3) (12/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 18.0 in stage 0.0 (TID 18, datanode01, executor 2, partition 18, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 19.0 in stage 0.0 (TID 19, datanode01, executor 2, partition 19, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 787 ms on datanode01 (executor 2) (13/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 789 ms on datanode01 (executor 2) (14/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 20.0 in stage 0.0 (TID 20, datanode03, executor 1, partition 20, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 21.0 in stage 0.0 (TID 21, datanode03, executor 1, partition 21, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 905 ms on datanode03 (executor 1) (15/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 907 ms on datanode03 (executor 1) (16/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 22.0 in stage 0.0 (TID 22, datanode02, executor 3, partition 22, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 23.0 in stage 0.0 (TID 23, datanode02, executor 3, partition 23, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 24.0 in stage 0.0 (TID 24, datanode01, executor 2, partition 24, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 18.0 in stage 0.0 (TID 18) in 124 ms on datanode01 (executor 2) (17/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 16.0 in stage 0.0 (TID 16) in 134 ms on datanode02 (executor 3) (18/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 25.0 in stage 0.0 (TID 25, datanode01, executor 2, partition 25, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 26.0 in stage 0.0 (TID 26, datanode03, executor 1, partition 26, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 17.0 in stage 0.0 (TID 17) in 134 ms on datanode02 (executor 3) (19/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 20.0 in stage 0.0 (TID 20) in 122 ms on datanode03 (executor 1) (20/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 27.0 in stage 0.0 (TID 27, datanode03, executor 1, partition 27, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 19.0 in stage 0.0 (TID 19) in 127 ms on datanode01 (executor 2) (21/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 21.0 in stage 0.0 (TID 21) in 123 ms on datanode03 (executor 1) (22/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 28.0 in stage 0.0 (TID 28, datanode02, executor 3, partition 28, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 29.0 in stage 0.0 (TID 29, datanode02, executor 3, partition 29, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 22.0 in stage 0.0 (TID 22) in 19 ms on datanode02 (executor 3) (23/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 23.0 in stage 0.0 (TID 23) in 18 ms on datanode02 (executor 3) (24/50) 19/08/13 
19:53:27 INFO TaskSetManager: Starting task 30.0 in stage 0.0 (TID 30, datanode01, executor 2, partition 30, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 31.0 in stage 0.0 (TID 31, datanode01, executor 2, partition 31, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 25.0 in stage 0.0 (TID 25) in 27 ms on datanode01 (executor 2) (25/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 24.0 in stage 0.0 (TID 24) in 29 ms on datanode01 (executor 2) (26/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 32.0 in stage 0.0 (TID 32, datanode02, executor 3, partition 32, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 29.0 in stage 0.0 (TID 29) in 16 ms on datanode02 (executor 3) (27/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 33.0 in stage 0.0 (TID 33, datanode03, executor 1, partition 33, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 26.0 in stage 0.0 (TID 26) in 30 ms on datanode03 (executor 1) (28/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 34.0 in stage 0.0 (TID 34, datanode02, executor 3, partition 34, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 28.0 in stage 0.0 (TID 28) in 21 ms on datanode02 (executor 3) (29/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 35.0 in stage 0.0 (TID 35, datanode03, executor 1, partition 35, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 27.0 in stage 0.0 (TID 27) in 32 ms on datanode03 (executor 1) (30/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 36.0 in stage 0.0 (TID 36, datanode02, executor 3, partition 36, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 32.0 in stage 0.0 (TID 32) in 11 ms on datanode02 (executor 3) (31/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 37.0 in stage 0.0 (TID 37, datanode01, executor 2, partition 37, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 30.0 in stage 0.0 (TID 30) in 18 ms on datanode01 (executor 2) (32/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 38.0 in stage 0.0 (TID 38, datanode01, executor 2, partition 38, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 31.0 in stage 0.0 (TID 31) in 20 ms on datanode01 (executor 2) (33/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 39.0 in stage 0.0 (TID 39, datanode03, executor 1, partition 39, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 33.0 in stage 0.0 (TID 33) in 17 ms on datanode03 (executor 1) (34/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 34.0 in stage 0.0 (TID 34) in 17 ms on datanode02 (executor 3) (35/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 40.0 in stage 0.0 (TID 40, datanode02, executor 3, partition 40, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 41.0 in stage 0.0 (TID 41, datanode03, executor 1, partition 41, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 35.0 in stage 0.0 (TID 35) in 17 ms on datanode03 (executor 1) (36/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 42.0 in stage 0.0 (TID 42, datanode02, executor 3, partition 42, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 36.0 in stage 0.0 (TID 36) in 16 ms on datanode02 (executor 3) (37/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 43.0 in stage 
0.0 (TID 43, datanode01, executor 2, partition 43, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 37.0 in stage 0.0 (TID 37) in 16 ms on datanode01 (executor 2) (38/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 44.0 in stage 0.0 (TID 44, datanode02, executor 3, partition 44, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 45.0 in stage 0.0 (TID 45, datanode02, executor 3, partition 45, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 40.0 in stage 0.0 (TID 40) in 14 ms on datanode02 (executor 3) (39/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 42.0 in stage 0.0 (TID 42) in 11 ms on datanode02 (executor 3) (40/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 46.0 in stage 0.0 (TID 46, datanode03, executor 1, partition 46, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 39.0 in stage 0.0 (TID 39) in 20 ms on datanode03 (executor 1) (41/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 47.0 in stage 0.0 (TID 47, datanode03, executor 1, partition 47, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 41.0 in stage 0.0 (TID 41) in 20 ms on datanode03 (executor 1) (42/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 48.0 in stage 0.0 (TID 48, datanode01, executor 2, partition 48, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 49.0 in stage 0.0 (TID 49, datanode01, executor 2, partition 49, PROCESS_LOCAL, 7888 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 43.0 in stage 0.0 (TID 43) in 18 ms on datanode01 (executor 2) (43/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 38.0 in stage 0.0 (TID 38) in 31 ms on datanode01 (executor 2) (44/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 45.0 in stage 0.0 (TID 45) in 11 ms on datanode02 (executor 3) (45/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 44.0 in stage 0.0 (TID 44) in 16 ms on datanode02 (executor 3) (46/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 46.0 in stage 0.0 (TID 46) in 18 ms on datanode03 (executor 1) (47/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 48.0 in stage 0.0 (TID 48) in 15 ms on datanode01 (executor 2) (48/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 47.0 in stage 0.0 (TID 47) in 15 ms on datanode03 (executor 1) (49/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 49.0 in stage 0.0 (TID 49) in 25 ms on datanode01 (executor 2) (50/50) 19/08/13 19:53:27 INFO YarnClusterScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool 19/08/13 19:53:27 INFO DAGScheduler: ShuffleMapStage 0 (start at VoiceApplication2.java:128) finished in 1.174 s 19/08/13 19:53:27 INFO DAGScheduler: looking for newly runnable stages 19/08/13 19:53:27 INFO DAGScheduler: running: Set() 19/08/13 19:53:27 INFO DAGScheduler: waiting: Set(ResultStage 1) 19/08/13 19:53:27 INFO DAGScheduler: failed: Set() 19/08/13 19:53:27 INFO DAGScheduler: Submitting ResultStage 1 (ShuffledRDD[2] at start at VoiceApplication2.java:128), which has no missing parents 19/08/13 19:53:27 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.2 KB, free 366.3 MB) 19/08/13 19:53:27 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1979.0 B, free 366.3 MB) 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode02:31984 (size: 1979.0 B, free: 366.3 MB) 19/08/13 
19:53:27 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1039 19/08/13 19:53:27 INFO DAGScheduler: Submitting 20 missing tasks from ResultStage 1 (ShuffledRDD[2] at start at VoiceApplication2.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 19/08/13 19:53:27 INFO YarnClusterScheduler: Adding task set 1.0 with 20 tasks 19/08/13 19:53:27 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 50, datanode03, executor 1, partition 0, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 51, datanode02, executor 3, partition 1, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 52, datanode01, executor 2, partition 3, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 53, datanode03, executor 1, partition 2, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 4.0 in stage 1.0 (TID 54, datanode02, executor 3, partition 4, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 5.0 in stage 1.0 (TID 55, datanode01, executor 2, partition 5, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode02:28863 (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode01:20487 (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode03:3328 (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.229.163:24656 19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.198.144:41122 19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.229.158:64276 19/08/13 19:53:27 INFO TaskSetManager: Starting task 7.0 in stage 1.0 (TID 56, datanode03, executor 1, partition 7, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 53) in 192 ms on datanode03 (executor 1) (1/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 8.0 in stage 1.0 (TID 57, datanode03, executor 1, partition 8, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 7.0 in stage 1.0 (TID 56) in 25 ms on datanode03 (executor 1) (2/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 6.0 in stage 1.0 (TID 58, datanode02, executor 3, partition 6, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 51) in 220 ms on datanode02 (executor 3) (3/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 14.0 in stage 1.0 (TID 59, datanode03, executor 1, partition 14, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 8.0 in stage 1.0 (TID 57) in 17 ms on datanode03 (executor 1) (4/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 16.0 in stage 1.0 (TID 60, datanode03, executor 1, partition 16, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 14.0 in stage 1.0 (TID 59) in 15 ms on datanode03 (executor 1) (5/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 16.0 in stage 1.0 (TID 60) in 21 ms on datanode03 (executor 1) (6/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 9.0 in stage 1.0 (TID 61, datanode02, executor 3, partition 
9, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 4.0 in stage 1.0 (TID 54) in 269 ms on datanode02 (executor 3) (7/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 50) in 339 ms on datanode03 (executor 1) (8/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 10.0 in stage 1.0 (TID 62, datanode02, executor 3, partition 10, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 6.0 in stage 1.0 (TID 58) in 56 ms on datanode02 (executor 3) (9/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 11.0 in stage 1.0 (TID 63, datanode01, executor 2, partition 11, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 5.0 in stage 1.0 (TID 55) in 284 ms on datanode01 (executor 2) (10/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 12.0 in stage 1.0 (TID 64, datanode01, executor 2, partition 12, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 52) in 287 ms on datanode01 (executor 2) (11/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 13.0 in stage 1.0 (TID 65, datanode02, executor 3, partition 13, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 15.0 in stage 1.0 (TID 66, datanode02, executor 3, partition 15, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 10.0 in stage 1.0 (TID 62) in 25 ms on datanode02 (executor 3) (12/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 9.0 in stage 1.0 (TID 61) in 29 ms on datanode02 (executor 3) (13/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 17.0 in stage 1.0 (TID 67, datanode02, executor 3, partition 17, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 15.0 in stage 1.0 (TID 66) in 13 ms on datanode02 (executor 3) (14/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 13.0 in stage 1.0 (TID 65) in 16 ms on datanode02 (executor 3) (15/20) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 18.0 in stage 1.0 (TID 68, datanode02, executor 3, partition 18, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 19.0 in stage 1.0 (TID 69, datanode01, executor 2, partition 19, NODE_LOCAL, 7638 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 11.0 in stage 1.0 (TID 63) in 30 ms on datanode01 (executor 2) (16/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 12.0 in stage 1.0 (TID 64) in 30 ms on datanode01 (executor 2) (17/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 17.0 in stage 1.0 (TID 67) in 17 ms on datanode02 (executor 3) (18/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 19.0 in stage 1.0 (TID 69) in 13 ms on datanode01 (executor 2) (19/20) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 18.0 in stage 1.0 (TID 68) in 20 ms on datanode02 (executor 3) (20/20) 19/08/13 19:53:27 INFO YarnClusterScheduler: Removed TaskSet 1.0, whose tasks have all completed, from pool 19/08/13 19:53:27 INFO DAGScheduler: ResultStage 1 (start at VoiceApplication2.java:128) finished in 0.406 s 19/08/13 19:53:27 INFO DAGScheduler: Job 0 finished: start at VoiceApplication2.java:128, took 1.850883 s 19/08/13 19:53:27 INFO ReceiverTracker: Starting 1 receivers 19/08/13 19:53:27 INFO ReceiverTracker: ReceiverTracker started 19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = 
null 19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81 19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@4044ec97 19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81 19/08/13 19:53:27 INFO MappedDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO MappedDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO MappedDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO MappedDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5dd4b960 19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@132d0c3c 19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81 19/08/13 19:53:27 INFO MappedDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO MappedDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO MappedDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO MappedDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5dd4b960 19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated 19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null 19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms 19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@525bed0c 19/08/13 19:53:27 INFO DAGScheduler: Got job 1 (start at VoiceApplication2.java:128) with 1 output partitions 19/08/13 19:53:27 INFO DAGScheduler: Final stage: ResultStage 2 (start at VoiceApplication2.java:128) 19/08/13 19:53:27 INFO DAGScheduler: Parents of final stage: List() 19/08/13 19:53:27 INFO DAGScheduler: Missing parents: List() 19/08/13 19:53:27 INFO DAGScheduler: Submitting ResultStage 2 (Receiver 0 ParallelCollectionRDD[3] at makeRDD at ReceiverTracker.scala:613), which has no missing parents 19/08/13 19:53:27 INFO 
ReceiverTracker: Receiver 0 started 19/08/13 19:53:27 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 133.5 KB, free 366.2 MB) 19/08/13 19:53:27 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 36.3 KB, free 366.1 MB) 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on datanode02:31984 (size: 36.3 KB, free: 366.3 MB) 19/08/13 19:53:27 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1039 19/08/13 19:53:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (Receiver 0 ParallelCollectionRDD[3] at makeRDD at ReceiverTracker.scala:613) (first 15 tasks are for partitions Vector(0)) 19/08/13 19:53:27 INFO YarnClusterScheduler: Adding task set 2.0 with 1 tasks 19/08/13 19:53:27 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 70, datanode01, executor 2, partition 0, PROCESS_LOCAL, 8757 bytes) 19/08/13 19:53:27 INFO RecurringTimer: Started timer for JobGenerator at time 1565697240000 19/08/13 19:53:27 INFO JobGenerator: Started JobGenerator at 1565697240000 ms 19/08/13 19:53:27 INFO JobScheduler: Started JobScheduler 19/08/13 19:53:27 INFO StreamingContext: StreamingContext started 19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on datanode01:20487 (size: 36.3 KB, free: 2.5 GB) 19/08/13 19:53:27 INFO ReceiverTracker: Registered receiver for stream 0 from 10.1.229.158:64276 19/08/13 19:54:00 INFO JobScheduler: Added jobs for time 1565697240000 ms 19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.0 from job set of time 1565697240000 ms 19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.1 from job set of time 1565697240000 ms 19/08/13 19:54:00 INFO JobScheduler: Finished job streaming job 1565697240000 ms.1 from job set of time 1565697240000 ms 19/08/13 19:54:00 INFO JobScheduler: Finished job streaming job 1565697240000 ms.0 from job set of time 1565697240000 ms 19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.2 from job set of time 1565697240000 ms 19/08/13 19:54:00 INFO SharedState: loading hive config file: file:/data01/hadoop/yarn/local/usercache/hdfs/filecache/85431/__spark_conf__.zip/__hadoop_conf__/hive-site.xml 19/08/13 19:54:00 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('hdfs://CID-042fb939-95b4-4b74-91b8-9f94b999bdf7/apps/hive/warehouse'). 19/08/13 19:54:00 INFO SharedState: Warehouse path is 'hdfs://CID-042fb939-95b4-4b74-91b8-9f94b999bdf7/apps/hive/warehouse'. 
19/08/13 19:54:00 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint 19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode02:31984 in memory (size: 1979.0 B, free: 366.3 MB) 19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode02:28863 in memory (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode01:20487 in memory (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode03:3328 in memory (size: 1979.0 B, free: 2.5 GB) 19/08/13 19:54:02 INFO CodeGenerator: Code generated in 175.416957 ms 19/08/13 19:54:02 INFO JobScheduler: Finished job streaming job 1565697240000 ms.2 from job set of time 1565697240000 ms 19/08/13 19:54:02 ERROR JobScheduler: Error running job streaming job 1565697240000 ms.2 org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found; at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40) at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331) at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 19/08/13 19:54:02 ERROR ApplicationMaster: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found; 
org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found; at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40) at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331) at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) 19/08/13 19:54:02 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found; at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40) at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331) at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398) at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122) at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280) at 
org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51) at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257) at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) ) 19/08/13 19:54:02 INFO StreamingContext: Invoking stop(stopGracefully=true) from shutdown hook 19/08/13 19:54:02 INFO ReceiverTracker: Sent stop signal to all 1 receivers 19/08/13 19:54:02 ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver 19/08/13 19:54:02 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 70) in 35055 ms on datanode01 (executor 2) (1/1) 19/08/13 19:54:02 INFO YarnClusterScheduler: Removed TaskSet 2.0, whose tasks have all completed, from pool 19/08/13 19:54:02 INFO DAGScheduler: ResultStage 2 (start at VoiceApplication2.java:128) finished in 35.086 s 19/08/13 19:54:02 INFO ReceiverTracker: Waiting for receiver job to terminate gracefully 19/08/13 19:54:02 INFO ReceiverTracker: Waited for receiver job to terminate gracefully 19/08/13 19:54:02 INFO ReceiverTracker: All of the receivers have deregistered successfully 19/08/13 19:54:02 INFO ReceiverTracker: ReceiverTracker stopped 19/08/13 19:54:02 INFO JobGenerator: Stopping JobGenerator gracefully 19/08/13 19:54:02 INFO JobGenerator: Waiting for all received blocks to be consumed for job generation 19/08/13 19:54:02 INFO JobGenerator: Waited for all received blocks to be consumed for job generation 19/08/13 19:54:12 WARN ShutdownHookManager: ShutdownHook '$anon$2' timeout, java.util.concurrent.TimeoutException java.util.concurrent.TimeoutException at java.util.concurrent.FutureTask.get(FutureTask.java:205) at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:67) 19/08/13 19:54:12 ERROR Utils: Uncaught exception in thread pool-1-thread-1 java.lang.InterruptedException at java.lang.Object.wait(Native Method) at java.lang.Thread.join(Thread.java:1252) at java.lang.Thread.join(Thread.java:1326) at org.apache.spark.streaming.util.RecurringTimer.stop(RecurringTimer.scala:86) at org.apache.spark.streaming.scheduler.JobGenerator.stop(JobGenerator.scala:137) at org.apache.spark.streaming.scheduler.JobScheduler.stop(JobScheduler.scala:123) at 
org.apache.spark.streaming.StreamingContext$$anonfun$stop$1.apply$mcV$sp(StreamingContext.scala:681) at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357) at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:680) at org.apache.spark.streaming.StreamingContext.org$apache$spark$streaming$StreamingContext$$stopOnShutdown(StreamingContext.scala:714) at org.apache.spark.streaming.StreamingContext$$anonfun$start$1.apply$mcV$sp(StreamingContext.scala:599) at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188) at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188) at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188) at scala.util.Try$.apply(Try.scala:192) at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188) at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178) at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) at java.util.concurrent.FutureTask.run(FutureTask.java:266) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) ```
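The decisive line in the log above is `org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found`, and the stack shows `saveAsTable` resolving tables through `InMemoryCatalog` even though a `hive-site.xml` was picked up. That combination usually means the SparkSession was built without Hive support, so Hive databases such as `meta_voice` are invisible to it. A minimal sketch of the likely fix (the class name and builder options are assumptions about code we cannot see; `enableHiveSupport()` is the relevant call):

```
import org.apache.spark.sql.SparkSession;

public class VoiceApplication2Fix {
    public static void main(String[] args) {
        // Build the session with Hive support so the catalog is backed by the
        // Hive metastore instead of the in-memory catalog seen in the stack trace.
        SparkSession spark = SparkSession.builder()
                .appName("VoiceApplication2")
                .enableHiveSupport()
                .getOrCreate();

        // If the database truly does not exist yet, create it before any saveAsTable():
        spark.sql("CREATE DATABASE IF NOT EXISTS meta_voice");
    }
}
```

If the database was expected to come from the Hive metastore, `enableHiveSupport()` alone should be enough; the log shows `hive-site.xml` already reaching the driver, so classpath distribution does not look like the problem here.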
Reading user32 via a VB API hook
I want to implement reading of user32 through a VB API hook. Concretely, what is the method for obtaining the user32 module, and how do I get the module that an API belongs to?
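A minimal VB6 sketch of the usual approach: `GetModuleHandle` returns the handle of a module already loaded in the process (which is its base address), and `GetProcAddress` returns the address of a named export inside that module. `MessageBoxA` below is only an example export, not something from the original question; any user32 API name works:

```
' Sketch: resolving the user32 module and the address of one of its exports.
Private Declare Function GetModuleHandle Lib "kernel32" Alias "GetModuleHandleA" _
    (ByVal lpModuleName As String) As Long
Private Declare Function GetProcAddress Lib "kernel32" _
    (ByVal hModule As Long, ByVal lpProcName As String) As Long

Private Sub Form_Load()
    Dim hUser32 As Long, pApi As Long
    hUser32 = GetModuleHandle("user32.dll")        ' base address of the loaded user32 module
    pApi = GetProcAddress(hUser32, "MessageBoxA")  ' address of an exported API in that module
    Debug.Print "user32 @ &H" & Hex$(hUser32) & ", MessageBoxA @ &H" & Hex$(pApi)
End Sub
```

Once you have the export's address, that is where a hook gets installed (for example by patching the first bytes or the import table entry); the handle from `GetModuleHandle` is what "the API's module" means at the Win32 level.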
Why can't nuxt.js install node-sass?
1. npm simply won't install it no matter what I try; the Taobao mirror doesn't help either.
2. Every install attempt gets stuck at "Downloading binary from https://github.com/sass/node-sass/releases/download/v4.12.0/win32-x64-64" and then errors out.
3. Total beginner here, please help. T.T
```
PS E:\Users\FuYuHao\Desktop\sb\a\aaa> npm i node-sass > node-sass@4.12.0 install E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass > node scripts/install.js Downloading binary from https://github.com/sass/node-sass/releases/download/v4.12.0/win32-x64-64_binding.node Cannot download "https://github.com/sass/node-sass/releases/download/v4.12.0/win32-x64-64_binding.node": connect ETIMEDOUT 52.216.135.19:443 Timed out whilst downloading the prebuilt binary Hint: If github.com is not accessible in your location try setting a proxy via HTTP_PROXY, e.g. export HTTP_PROXY=http://example.com:1234 or configure npm proxy via npm config set proxy http://example.com:8080 > node-sass@4.12.0 postinstall E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass > node scripts/build.js Building: C:\Program Files\nodejs\node.exe E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-gyp\bin\node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library= gyp info it worked if it ends with ok gyp verb cli [ 'C:\\Program Files\\nodejs\\node.exe', gyp verb cli 'E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-gyp\\bin\\node-gyp.js', gyp verb cli 'rebuild', gyp verb cli '--verbose', gyp verb cli '--libsass_ext=', gyp verb cli '--libsass_cflags=', gyp verb cli '--libsass_ldflags=', gyp verb cli '--libsass_library=' ] gyp info using node-gyp@3.8.0 gyp info using node@10.14.1 | win32 | x64 gyp verb command rebuild [] gyp verb command clean [] gyp verb clean removing "build" directory gyp verb command configure [] gyp verb check python checking for Python executable "python2" in the PATH gyp verb `which` failed Error: not found: python2 gyp verb `which` failed at getNotFoundError (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:13:12) gyp verb `which` failed at F (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:68:19) gyp verb `which` failed at E (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:80:29) gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:89:16 gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\isexe\index.js:42:5 gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\isexe\windows.js:36:5 gyp verb `which` failed at FSReqWrap.oncomplete (fs.js:154:21) gyp verb `which` failed python2 { Error: not found: python2 gyp verb `which` failed at getNotFoundError (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:13:12) gyp verb `which` failed at F (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:68:19) gyp verb `which` failed at E (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:80:29) gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\which\which.js:89:16 gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\isexe\index.js:42:5 gyp verb `which` failed at E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\isexe\windows.js:36:5 gyp verb `which` failed at FSReqWrap.oncomplete (fs.js:154:21) gyp verb `which` failed stack: gyp verb `which` failed 'Error: not found: python2\n at getNotFoundError (E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\which\\which.js:13:12)\n at F (E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\which\\which.js:68:19)\n at E 
(E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\which\\which.js:80:29)\n at E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\which\\which.js:89:16\n at E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\isexe\\index.js:42:5\n at E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\isexe\\windows.js:36:5\n at FSReqWrap.oncomplete (fs.js:154:21)', gyp verb `which` failed code: 'ENOENT' } gyp verb check python checking for Python executable "python" in the PATH gyp verb `which` succeeded python C:\Python27\python.EXE gyp verb check python version `C:\Python27\python.EXE -c "import sys; print "2.7.15 gyp verb check python version .%s.%s" % sys.version_info[:3];"` returned: %j gyp verb get node dir no --target version specified, falling back to host node version: 10.14.1 gyp verb command install [ '10.14.1' ] gyp verb install input version string "10.14.1" gyp verb install installing version: 10.14.1 gyp verb install --ensure was passed, so won't reinstall if already installed gyp verb install version is already installed, need to check "installVersion" gyp verb got "installVersion" 9 gyp verb needs "installVersion" 9 gyp verb install version is good gyp verb get node dir target node version installed: 10.14.1 gyp verb build dir attempting to create "build" dir: E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build gyp verb build dir "build" dir needed to be created? E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build gyp verb find vs2017 Found installation at: C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools gyp verb find vs2017 - Found Microsoft.VisualStudio.Component.Windows10SDK.17763 gyp verb find vs2017 - Found Microsoft.VisualStudio.Component.VC.Tools.x86.x64 gyp verb find vs2017 - Found Microsoft.VisualStudio.VC.MSBuild.Base gyp verb find vs2017 - Using this installation with Windows 10 SDK gyp verb find vs2017 using installation: C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools gyp verb build/config.gypi creating config file gyp verb build/config.gypi writing out config file: E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\config.gypi gyp verb config.gypi checking for gypi file: E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\config.gypi gyp verb common.gypi checking for gypi file: E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\common.gypi gyp verb gyp gyp format was not specified; forcing "msvs" gyp info spawn C:\Python27\python.EXE gyp info spawn args [ 'E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-gyp\\gyp\\gyp_main.py', gyp info spawn args 'binding.gyp', gyp info spawn args '-f', gyp info spawn args 'msvs', gyp info spawn args '-G', gyp info spawn args 'msvs_version=2015', gyp info spawn args '-I', gyp info spawn args 'E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-sass\\build\\config.gypi', gyp info spawn args '-I', gyp info spawn args 'E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-gyp\\addon.gypi', gyp info spawn args '-I', gyp info spawn args 'C:\\Users\\FuYuHao\\.node-gyp\\10.14.1\\include\\node\\common.gypi', gyp info spawn args '-Dlibrary=shared_library', gyp info spawn args '-Dvisibility=default', gyp info spawn args '-Dnode_root_dir=C:\\Users\\FuYuHao\\.node-gyp\\10.14.1', gyp info spawn args '-Dnode_gyp_dir=E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-gyp', gyp info spawn args '-Dnode_lib_file=C:\\Users\\FuYuHao\\.node-gyp\\10.14.1\\<(target_arch)\\node.lib', gyp info spawn args 
'-Dmodule_root_dir=E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-sass', gyp info spawn args '-Dnode_engine=v8', gyp info spawn args '--depth=.', gyp info spawn args '--no-parallel', gyp info spawn args '--generator-output', gyp info spawn args 'E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-sass\\build', gyp info spawn args '-Goutput_dir=.' ] gyp verb command build [] gyp verb build type Release gyp verb architecture x64 gyp verb node dev dir C:\Users\FuYuHao\.node-gyp\10.14.1 gyp verb found first Solution file build/binding.sln gyp verb using MSBuild: C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MSBuild\15.0\Bin\MSBuild.exe gyp info spawn C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MSBuild\15.0\Bin\MSBuild.exe gyp info spawn args [ 'build/binding.sln', gyp info spawn args '/nologo', gyp info spawn args '/p:Configuration=Release;Platform=x64' ] 在此解决方案中一次生成一个项目。若要启用并行生成,请添加“/m”开关。 生成启动时间为 2019/5/29 22:01:12。 节点 1 上的项目“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln”(默认目标)。 ValidateSolutionConfiguration: 正在生成解决方案配置“Release|x64”。 项目“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln”(1)正在节点 1 上生成“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj”(2) (默认目标)。 项目“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj”(2)正在节点 1 上生成“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj”(3) (默认目标)。 PrepareForBuild: 正在创建目录“Release\obj\libsass\”。 正在创建目录“E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\”。 正在创建目录“Release\obj\libsass\libsass.tlog\”。 InitializeBuildStatus: 正在创建“Release\obj\libsass\libsass.tlog\unsuccessfulbuild”,因为已指定“AlwaysCreate”。 ClCompile: C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\HostX64\x64\CL.exe /c /I"C:\Users\FuYuHao\.node-gyp\10.14.1\include\node" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\src" /I"C:\Use rs\FuYuHao\.node-gyp\10.14.1\deps\openssl\config" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\openssl\openssl\include" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\uv\include" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\zli b" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\v8\include" /I..\..\src\libsass\include /Z7 /nologo /W3 /WX- /diagnostics:classic /MP /Ox /Ob2 /Oi /Ot /Oy /GL /D NODE_GYP_MODULE_NAME=libsass /D USING_UV_SHARED=1 /D USING_ V8_SHARED=1 /D V8_DEPRECATION_WARNINGS=1 /D WIN32 /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _HAS_EXCEPTIONS=0 /D "LIBSASS_VERSION=\"3.5.4\"" /GF /Gm- /MT /GS /Gy /fp:precise /Zc:wchar_t /Zc:forScope /Z c:inline /GR- /Fo"Release\obj\libsass\\" /Fd"Release\obj\libsass\libsass.pdb" /Gd /TP /wd4351 /wd4355 /wd4800 /wd4251 /wd4275 /wd4244 /wd4267 /FC /errorReport:queue /GR /EHsc ..\..\src\libsass\src\ast.cpp ..\..\src\libs ass\src\ast_fwd_decl.cpp ..\..\src\libsass\src\backtrace.cpp ..\..\src\libsass\src\base64vlq.cpp ..\..\src\libsass\src\bind.cpp ..\..\src\libsass\src\check_nesting.cpp ..\..\src\libsass\src\color_maps.cpp ..\..\src\libs ass\src\constants.cpp ..\..\src\libsass\src\context.cpp ..\..\src\libsass\src\cssize.cpp ..\..\src\libsass\src\emitter.cpp ..\..\src\libsass\src\environment.cpp ..\..\src\libsass\src\error_handling.cpp ..\..\src\libsass \src\eval.cpp ..\..\src\libsass\src\expand.cpp ..\..\src\libsass\src\extend.cpp ..\..\src\libsass\src\file.cpp ..\..\src\libsass\src\functions.cpp ..\..\src\libsass\src\inspect.cpp 
..\..\src\libsass\src\json.cpp ..\..\s rc\libsass\src\lexer.cpp ..\..\src\libsass\src\listize.cpp ..\..\src\libsass\src\memory\SharedPtr.cpp ..\..\src\libsass\src\node.cpp ..\..\src\libsass\src\operators.cpp ..\..\src\libsass\src\output.cpp ..\..\src\libsass \src\parser.cpp ..\..\src\libsass\src\plugins.cpp ..\..\src\libsass\src\position.cpp ..\..\src\libsass\src\prelexer.cpp ..\..\src\libsass\src\remove_placeholders.cpp ..\..\src\libsass\src\sass.cpp ..\..\src\libsass\src\ sass2scss.cpp ..\..\src\libsass\src\sass_context.cpp ..\..\src\libsass\src\sass_functions.cpp ..\..\src\libsass\src\sass_util.cpp ..\..\src\libsass\src\sass_values.cpp ..\..\src\libsass\src\source_map.cpp ..\..\src\libs ass\src\subset_map.cpp ..\..\src\libsass\src\to_c.cpp ..\..\src\libsass\src\to_value.cpp ..\..\src\libsass\src\units.cpp ..\..\src\libsass\src\utf8_string.cpp ..\..\src\libsass\src\util.cpp ..\..\src\libsass\src\values. cpp cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] ast.cpp cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] ast_fwd_decl.cpp cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] backtrace.cpp cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] base64vlq.cpp bind.cpp check_nesting.cpp color_maps.cpp constants.cpp context.cpp cssize.cpp emitter.cpp environment.cpp error_handling.cpp eval.cpp expand.cpp extend.cpp file.cpp functions.cpp inspect.cpp json.cpp e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\json.cpp(26): warning C4005: “_CRT_NONSTDC_NO_DEPRECATE”: 宏重定义 [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxp roj] e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\json.cpp: note: 参见“_CRT_NONSTDC_NO_DEPRECATE”的前一个定义 lexer.cpp listize.cpp SharedPtr.cpp node.cpp operators.cpp output.cpp parser.cpp plugins.cpp position.cpp prelexer.cpp remove_placeholders.cpp sass.cpp sass2scss.cpp e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\sass2scss.cpp(9): warning C4005: “_CRT_NONSTDC_NO_DEPRECATE”: 宏重定义 [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass. 
vcxpr oj] e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\sass2scss.cpp: note: 参见“_CRT_NONSTDC_NO_DEPRECATE”的前一个定义 sass_context.cpp sass_functions.cpp sass_util.cpp sass_values.cpp source_map.cpp subset_map.cpp to_c.cpp to_value.cpp units.cpp utf8_string.cpp util.cpp values.cpp C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\HostX64\x64\CL.exe /c /I"C:\Users\FuYuHao\.node-gyp\10.14.1\include\node" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\src" /I"C:\Use rs\FuYuHao\.node-gyp\10.14.1\deps\openssl\config" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\openssl\openssl\include" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\uv\include" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\zli b" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\v8\include" /I..\..\src\libsass\include /Z7 /nologo /W3 /WX- /diagnostics:classic /MP /Ox /Ob2 /Oi /Ot /Oy /GL /D NODE_GYP_MODULE_NAME=libsass /D USING_UV_SHARED=1 /D USING_ V8_SHARED=1 /D V8_DEPRECATION_WARNINGS=1 /D WIN32 /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _HAS_EXCEPTIONS=0 /D "LIBSASS_VERSION=\"3.5.4\"" /GF /Gm- /MT /GS /Gy /fp:precise /Zc:wchar_t /Zc:forScope /Z c:inline /GR- /Fo"Release\obj\libsass\\" /Fd"Release\obj\libsass\libsass.pdb" /Gd /TC /wd4351 /wd4355 /wd4800 /wd4251 /wd4275 /wd4244 /wd4267 /FC /errorReport:queue /GR /EHsc ..\..\src\libsass\src\cencode.c cl : 命令行 warning D9025: 正在重写“/GR-”(用“/GR”) [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj] cencode.c Lib: C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\HostX64\x64\Lib.exe /OUT:"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\libsass.lib" /NOLOGO /MACHINE :X64 /LTCG:INCREMENTAL Release\obj\libsass\ast.obj Release\obj\libsass\ast_fwd_decl.obj Release\obj\libsass\backtrace.obj Release\obj\libsass\base64vlq.obj Release\obj\libsass\bind.obj Release\obj\libsass\cencode.obj Release\obj\libsass\check_nesting.obj Release\obj\libsass\color_maps.obj Release\obj\libsass\constants.obj Release\obj\libsass\context.obj Release\obj\libsass\cssize.obj Release\obj\libsass\emitter.obj Release\obj\libsass\environment.obj Release\obj\libsass\error_handling.obj Release\obj\libsass\eval.obj Release\obj\libsass\expand.obj Release\obj\libsass\extend.obj Release\obj\libsass\file.obj Release\obj\libsass\functions.obj Release\obj\libsass\inspect.obj Release\obj\libsass\json.obj Release\obj\libsass\lexer.obj Release\obj\libsass\listize.obj Release\obj\libsass\SharedPtr.obj Release\obj\libsass\node.obj Release\obj\libsass\operators.obj Release\obj\libsass\output.obj Release\obj\libsass\parser.obj Release\obj\libsass\plugins.obj Release\obj\libsass\position.obj Release\obj\libsass\prelexer.obj Release\obj\libsass\remove_placeholders.obj Release\obj\libsass\sass.obj Release\obj\libsass\sass2scss.obj Release\obj\libsass\sass_context.obj Release\obj\libsass\sass_functions.obj Release\obj\libsass\sass_util.obj Release\obj\libsass\sass_values.obj Release\obj\libsass\source_map.obj Release\obj\libsass\subset_map.obj Release\obj\libsass\to_c.obj Release\obj\libsass\to_value.obj Release\obj\libsass\units.obj Release\obj\libsass\utf8_string.obj Release\obj\libsass\util.obj Release\obj\libsass\values.obj libsass.vcxproj -> E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\\libsass.lib FinalizeBuildStatus: 正在删除文件“Release\obj\libsass\libsass.tlog\unsuccessfulbuild”。 正在对“Release\obj\libsass\libsass.tlog\libsass.lastbuildstate”执行 Touch 任务。 
Done Building Project "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj" (default targets).
Project "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj" (2) is building "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj" (4) on node 1 (default targets).
PrepareForBuild:
  Creating directory "Release\obj\binding\".
  Creating directory "Release\obj\binding\binding.tlog\".
InitializeBuildStatus:
  Creating "Release\obj\binding\binding.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
ClCompile:
  C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\HostX64\x64\CL.exe /c
    /I"C:\Users\FuYuHao\.node-gyp\10.14.1\include\node" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\src"
    /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\openssl\config" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\openssl\openssl\include"
    /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\uv\include" /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\zlib"
    /I"C:\Users\FuYuHao\.node-gyp\10.14.1\deps\v8\include" /I..\..\nan /I..\src\libsass\include
    /Z7 /nologo /W3 /WX- /diagnostics:classic /MP /Ox /Ob2 /Oi /Ot /Oy /GL
    /D NODE_GYP_MODULE_NAME=binding /D USING_UV_SHARED=1 /D USING_V8_SHARED=1 /D V8_DEPRECATION_WARNINGS=1
    /D WIN32 /D _CRT_SECURE_NO_DEPRECATE /D _CRT_NONSTDC_NO_DEPRECATE /D _HAS_EXCEPTIONS=0
    /D BUILDING_NODE_EXTENSION /D _WINDLL
    /GF /Gm- /MT /GS /Gy /fp:precise /Zc:wchar_t /Zc:forScope /Zc:inline /GR-
    /Fo"Release\obj\binding\\" /Fd"Release\obj\binding\vc141.pdb" /Gd /TP
    /wd4351 /wd4355 /wd4800 /wd4251 /wd4275 /wd4244 /wd4267 /FC /errorReport:queue /Zc:threadSafeInit-
    ..\src\binding.cpp ..\src\create_string.cpp ..\src\custom_function_bridge.cpp ..\src\custom_importer_bridge.cpp
    ..\src\sass_context_wrapper.cpp ..\src\sass_types\boolean.cpp ..\src\sass_types\color.cpp ..\src\sass_types\error.cpp
    ..\src\sass_types\factory.cpp ..\src\sass_types\list.cpp ..\src\sass_types\map.cpp ..\src\sass_types\null.cpp
    ..\src\sass_types\number.cpp ..\src\sass_types\string.cpp
    "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-gyp\src\win_delay_load_hook.cc"
  binding.cpp
  create_string.cpp
  custom_function_bridge.cpp
  custom_importer_bridge.cpp
  sass_context_wrapper.cpp
  boolean.cpp
  color.cpp
  error.cpp
  factory.cpp
  list.cpp
  map.cpp
  null.cpp
  number.cpp
  string.cpp
  win_delay_load_hook.cc
Link:
  C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\VC\Tools\MSVC\14.16.27023\bin\HostX64\x64\link.exe /ERRORREPORT:QUEUE
    /OUT:"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\binding.node" /INCREMENTAL:NO /NOLOGO
    kernel32.lib user32.lib gdi32.lib winspool.lib comdlg32.lib advapi32.lib shell32.lib ole32.lib oleaut32.lib uuid.lib odbc32.lib
    DelayImp.lib "C:\Users\FuYuHao\.node-gyp\10.14.1\x64\node.lib" Delayimp.lib
    /DELAYLOAD:iojs.exe /DELAYLOAD:node.exe /MANIFEST /MANIFESTUAC:"level='asInvoker' uiAccess='false'" /manifest:embed
    /DEBUG /PDB:"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\binding.pdb"
    /MAP /MAPINFO:EXPORTS /OPT:REF /OPT:ICF /TLBID:1 /RELEASE /DYNAMICBASE /NXCOMPAT /MACHINE:X64 /ignore:4199 /LTCG:INCREMENTAL /DLL
    Release\obj\binding\binding.obj Release\obj\binding\create_string.obj Release\obj\binding\custom_function_bridge.obj
    Release\obj\binding\custom_importer_bridge.obj Release\obj\binding\sass_context_wrapper.obj Release\obj\binding\boolean.obj
    Release\obj\binding\color.obj Release\obj\binding\error.obj Release\obj\binding\factory.obj Release\obj\binding\list.obj
    Release\obj\binding\map.obj Release\obj\binding\null.obj Release\obj\binding\number.obj Release\obj\binding\string.obj
    Release\obj\binding\win_delay_load_hook.obj "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\Release\libsass.lib"
C:\Users\FuYuHao\.node-gyp\10.14.1\x64\node.lib : fatal error LNK1107: invalid or corrupt file: cannot read at 0x39F993 [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj]
Done Building Project "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj" (default targets) -- FAILED.
Done Building Project "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj" (default targets) -- FAILED.
Done Building Project "E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln" (default targets) -- FAILED.

Build FAILED.

"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln" (default target) (1) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj" (default target) (2) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj" (default target) (3) ->
(ClCompile target) ->
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\json.cpp(26): warning C4005: '_CRT_NONSTDC_NO_DEPRECATE': macro redefinition [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]
  e:\users\fuyuhao\desktop\sb\a\aaa\node_modules\node-sass\src\libsass\src\sass2scss.cpp(9): warning C4005: '_CRT_NONSTDC_NO_DEPRECATE': macro redefinition [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]

"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln" (default target) (1) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj" (default target) (2) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj" (default target) (3) ->
  cl : Command line warning D9025: overriding '/GR-' with '/GR' [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\src\libsass.vcxproj]

"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.sln" (default target) (1) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj.metaproj" (default target) (2) ->
"E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj" (default target) (4) ->
(Link target) ->
  C:\Users\FuYuHao\.node-gyp\10.14.1\x64\node.lib : fatal error LNK1107: invalid or corrupt file: cannot read at 0x39F993 [E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass\build\binding.vcxproj]

    8 Warning(s)
    1 Error(s)

Time Elapsed 00:00:24.13
gyp ERR! build error
gyp ERR! stack Error: `C:\Program Files (x86)\Microsoft Visual Studio\2017\BuildTools\MSBuild\15.0\Bin\MSBuild.exe` failed with exit code: 1
gyp ERR! stack     at ChildProcess.onExit (E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-gyp\lib\build.js:262:23)
gyp ERR! stack     at ChildProcess.emit (events.js:182:13)
gyp ERR! stack     at Process.ChildProcess._handle.onexit (internal/child_process.js:240:12)
gyp ERR! System Windows_NT 10.0.17763
gyp ERR! command "C:\\Program Files\\nodejs\\node.exe" "E:\\Users\\FuYuHao\\Desktop\\sb\\a\\aaa\\node_modules\\node-gyp\\bin\\node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
gyp ERR! cwd E:\Users\FuYuHao\Desktop\sb\a\aaa\node_modules\node-sass
gyp ERR! node -v v10.14.1
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
Build failed with error code: 1
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: fsevents@2.0.7 (node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for fsevents@2.0.7: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: fsevents@1.2.9 (node_modules\watchpack\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for fsevents@1.2.9: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm WARN optional SKIPPING OPTIONAL DEPENDENCY: fsevents@1.2.9 (node_modules\nodemon\node_modules\fsevents):
npm WARN notsup SKIPPING OPTIONAL DEPENDENCY: Unsupported platform for fsevents@1.2.9: wanted {"os":"darwin","arch":"any"} (current: {"os":"win32","arch":"x64"})
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! node-sass@4.12.0 postinstall: `node scripts/build.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the node-sass@4.12.0 postinstall script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR!     C:\Users\FuYuHao\AppData\Roaming\npm-cache\_logs\2019-05-29T14_01_39_903Z-debug.log
PS E:\Users\FuYuHao\Desktop\sb\a\aaa>
```
The CommitUrlCacheEntryBinaryBlob function in wininet.dll
I am implementing cookie isolation for IE11 and have run into a problem. While using an API hook to watch IE11 write cookies during page load, I captured calls to this API function. I cannot find any detailed documentation for it online, and I don't know what the last two parameters are for. Does anyone know? An explanation would be much appreciated, thanks. Below is the only declaration of the function I could find on Google:

CommitUrlCacheEntryBinaryBlob from Wininet.dll
Parameters:
- PCWSTR pwszUrlName
- DWORD dwType
- FILETIME ftExpireTime
- FILETIME ftModifiedTime
- const BYTE* pbBlob
- DWORD cbBlob
Return type: [ERROR_CODE]
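Since CommitUrlCacheEntryBinaryBlob is undocumented, the sketch below is only a reconstruction from the declaration quoted above, not an official prototype: it assumes the WINAPI calling convention and a DWORD error-code return, and the typedef name PFN_CommitUrlCacheEntryBinaryBlob plus the parameter interpretations in the comments are assumptions. All it does is resolve the export at runtime, the way a hooking tool would before installing a detour.

```c
#include <windows.h>
#include <stdio.h>

/* Assumed prototype, reconstructed from the parameter list quoted above.
   The WINAPI calling convention and DWORD error-code return are guesses. */
typedef DWORD (WINAPI *PFN_CommitUrlCacheEntryBinaryBlob)(
    PCWSTR      pwszUrlName,    /* cache key (URL) the entry is stored under     */
    DWORD       dwType,         /* entry type flag; exact meaning unconfirmed    */
    FILETIME    ftExpireTime,   /* expiry timestamp for the entry                */
    FILETIME    ftModifiedTime, /* last-modified timestamp for the entry         */
    const BYTE *pbBlob,         /* presumably the raw blob payload (cookie data) */
    DWORD       cbBlob);        /* presumably the payload size in bytes          */

int main(void)
{
    /* wininet.dll exports the function by name, so it can be
       located at runtime with plain LoadLibrary/GetProcAddress. */
    HMODULE hWininet = LoadLibraryW(L"wininet.dll");
    if (hWininet == NULL)
        return 1;

    PFN_CommitUrlCacheEntryBinaryBlob pfn =
        (PFN_CommitUrlCacheEntryBinaryBlob)GetProcAddress(
            hWininet, "CommitUrlCacheEntryBinaryBlob");

    printf("CommitUrlCacheEntryBinaryBlob %s (address %p)\n",
           pfn != NULL ? "found" : "not found", (void *)pfn);

    FreeLibrary(hWininet);
    return 0;
}
```

If the export resolves, the reading of the last two parameters can be checked empirically: log cbBlob from the hook and compare it against the length of the cookie data observed being written, which would confirm (or refute) the blob-pointer/byte-count interpretation.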