Analysis Services in SQL Server 2008

My SQL Server 2008 Analysis Services installation only shows the Deployment Wizard, but I need Analysis Manager. Where do I open Analysis Manager? SQL Server Configuration Manager has no Analysis Services entry. Does that mean it isn't installed? But it does show up under Start → All Programs → SQL Server! What can I do — can I add it to the existing installation? I'd rather not uninstall and reinstall, because I'm worried the uninstall won't be clean.

sql

1 answer

See the BusinessObjects Analysis + SQL Server Analysis Services development documentation.
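Some background that may explain the confusion (my understanding, not part of the answer above): the standalone Analysis Manager console only shipped with SQL Server 2000. From SQL Server 2005 onward its functionality was folded into SQL Server Management Studio, so a SQL Server 2008 installation never has a separate Analysis Manager to open — there is nothing missing to reinstall. A toy lookup with that mapping hard-coded as an assumption:

```python
# Which tool administers Analysis Services in each SQL Server generation?
# Hard-coded summary; verify against Microsoft's documentation for your build.
ADMIN_TOOL = {
    2000: "Analysis Manager (standalone MMC console)",
    2005: "SQL Server Management Studio (SSMS)",
    2008: "SQL Server Management Studio (SSMS)",
    2012: "SQL Server Management Studio (SSMS)",
}

def ssas_admin_tool(version: int) -> str:
    """Return the SSAS management tool for a SQL Server version year."""
    # Anything newer than the table's last entry still uses SSMS.
    if version > max(ADMIN_TOOL):
        return ADMIN_TOOL[max(ADMIN_TOOL)]
    return ADMIN_TOOL.get(version, "unknown")

print(ssas_admin_tool(2008))  # SSMS, not Analysis Manager
```

So for the question above: connect SSMS's Object Explorer to the Analysis Services server type instead of looking for Analysis Manager.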

qq_34158010
qq_34158010: Haha, that's adorable
About 4 years ago · Reply
Other related questions
Why don't I see an Analysis Services service?

I've installed SSDT, but I can't see this service, and SQL also says it cannot connect to the server. I'm sure I installed it, yet the service isn't there. ![图片说明](https://img-ask.csdn.net/upload/201901/18/1547798345_640809.png)![图片说明](https://img-ask.csdn.net/upload/201901/18/1547798353_362309.png)
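A general note, not a diagnosis of this exact screenshot: SSDT is only the development tooling; the Analysis Services Windows service itself is installed by SQL Server setup, and its service name depends on the instance, so it is easy to look for the wrong name in services.msc. A small sketch of the conventional naming pattern:

```python
def ssas_service_name(instance=None):
    """Conventional Windows service name for an Analysis Services instance.

    The default instance registers as 'MSSQLServerOLAPService'; a named
    instance registers as 'MSOLAP$<InstanceName>'. Look for these names in
    services.msc (or `sc query`) before concluding SSAS is not installed.
    """
    if instance is None:
        return "MSSQLServerOLAPService"
    return f"MSOLAP${instance}"

print(ssas_service_name())           # MSSQLServerOLAPService
print(ssas_service_name("TABULAR"))  # MSOLAP$TABULAR
```

If neither name appears, Analysis Services was likely never selected in SQL Server setup and can be added by re-running setup and choosing the feature.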

Creating an Analysis Services Tabular project in SQL Server Data Tools

I just installed SQL Server 2012 and created an Analysis Services Tabular project, but I ran into the situation shown here: ![图片说明](https://img-ask.csdn.net/upload/201603/09/1457491348_988923.jpg) During SQL Server 2012 setup I think I accepted the default Analysis Services configuration (Multidimensional and Data Mining mode). Do I now have to reinstall SQL Server to switch to Tabular mode before I can create a Tabular Analysis Services project?
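For reference: an Analysis Services instance's server mode is recorded in its msmdsrv.ini as a DeploymentMode value (0 = Multidimensional, 1 = SharePoint, 2 = Tabular), and the supported way to get a Tabular server is to install a second SSAS instance in Tabular mode rather than reinstalling everything; editing DeploymentMode on an instance that already hosts databases is generally not supported. A minimal sketch of inspecting/flipping that setting in an ini fragment (the sample XML here is a placeholder, not a full msmdsrv.ini):

```python
import re

# Server modes recorded in msmdsrv.ini (the Analysis Services config file).
MODES = {0: "Multidimensional", 1: "SharePoint", 2: "Tabular"}

def set_deployment_mode(ini_text, mode):
    """Return ini_text with the <DeploymentMode> element set to `mode`.

    NOTE: changing the mode of an instance that already hosts databases is
    not supported; prefer installing a separate Tabular instance instead.
    """
    if mode not in MODES:
        raise ValueError(f"unknown mode {mode}")
    return re.sub(r"<DeploymentMode>\d+</DeploymentMode>",
                  f"<DeploymentMode>{mode}</DeploymentMode>", ini_text)

sample = "<ConfigurationSettings><DeploymentMode>0</DeploymentMode></ConfigurationSettings>"
print(set_deployment_mode(sample, 2))  # DeploymentMode becomes 2 (Tabular)
```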

Adventure Works DW 2008 SE

Hi everyone. I'm reading a Tsinghua University Press book on SQL Server 2008 Reporting Services (《SQL Server 2008报表服务从入门到精通》). One chapter covers using Analysis Services as a data source and requires the Adventure Works DW 2008 SE sample database, but I can't download it from the official site. If anyone has a copy, please send it to me — thanks. My email is dailuqun@126.com

SQL Server 2012 installation failure

Environment: ![图片说明](https://img-ask.csdn.net/upload/201606/06/1465185793_162101.jpg) File: cn_sql_server_2012_standard_edition_with_service_pack_3_x86_dvd_7290379.ISO Problem: setup reports --> This SQL Server setup media does not support the language of the OS, or does not have the SQL Server English-language version installation files. Use the matching language-specific SQL Server media; or install both the language specific MUI and change the format and system locales through the regional settings in the control panel.

Why is there no Analysis Services template in my VS2013?

As the title says: I installed SQL Server 2012, but Visual Studio 2013 has no Analysis Services project template, and the online template search doesn't find one either.

Excel cannot connect to Analysis Services

My Excel is the 2013 version and Analysis Services is the 2012 version. Excel can connect to SQL Server data but not to Analysis Services; it reports: cannot connect to the data source. Reason: the database server could not be found; please verify that the database server name you entered is correct. But I'm connecting to the local server, the name is definitely correct, and entering localhost doesn't work either. Please help!
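One common cause worth checking (general troubleshooting, not a diagnosis of this exact setup): if Analysis Services was installed as a named instance, the server name in Excel's connection dialog must take the form `host\instance`, and plain `localhost` will only reach a default instance; the msmdsrv service must also actually be running. A small helper sketching the server-name forms, with the instance name as a made-up example:

```python
def ssas_data_source(host="localhost", instance=None, port=None):
    """Build the server name / Data Source value for an SSAS connection.

    Default instance:  'localhost'
    Named instance:    'localhost\\SQL2012AS'  (instance name is an example)
    Fixed port:        'localhost:2383'        (2383 is the SSAS default)
    """
    if instance and port:
        raise ValueError("use either an instance name or a port, not both")
    if instance:
        return f"{host}\\{instance}"
    if port:
        return f"{host}:{port}"
    return host

print(ssas_data_source(instance="SQL2012AS"))  # localhost\SQL2012AS
```

Checking the exact instance name in SQL Server Configuration Manager (or services.msc) and using `localhost\<that name>` in Excel often resolves this error.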

SSMS reports "Could not load file or assembly" when opening an SSAS database

I built the AS project in SSDT on the server and deployed it to the SQL database on that server. Today, on a whim, I tried File → Open → Analysis Services Database, entered the IP address plus server name, and clicking Connect started throwing errors. Screenshot: ![图片说明](https://img-ask.csdn.net/upload/201907/09/1562667593_29833.png) Error message:
```
===================================
Could not load file or assembly 'Microsoft.BusinessIntelligence.Telemetry, Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. (Microsoft SQL Server Management Studio)
------------------------------
Program location:
   at Microsoft.AnalysisServices.Project.OnlineAnalysisServicesProjectManager.OnProjectLoaded(Object sender, EventArgs eevent)
   at Microsoft.DataWarehouse.VsIntegration.Shell.Project.FileProjectHierarchy.OnProjectLoaded()
   at Microsoft.DataWarehouse.VsIntegration.Shell.Project.FileProjectHierarchy.Load(String pszFilename, UInt32 grfMode, Int32 iReadOnly)
   at Microsoft.DataWarehouse.VsIntegration.Shell.Service.DataWarehouseProjectManagerService.CreateProject(String projectPath, Int32 mode, DocumentObject documentObject, IFileProjectManager projectManager, Guid projectGuid)
   at Microsoft.DataWarehouse.VsIntegration.Shell.Service.DataWarehouseProjectManagerService.CreateProject(String resourceUri, Guid projectGuid)
   at Microsoft.DataWarehouse.VsIntegration.Shell.PVsProjectFactory.CreateProject(String pszFilename, String pszLocation, String pszName, UInt32 grfCreateFlags, Guid& iidProject, IntPtr& ppProjectIntPtr, Int32& ifCanceled)
   at EnvDTE.SolutionClass.AddFromFile(String FileName, Boolean Exclusive)
   at Microsoft.AnalysisServices.Project.AnalysisServicesPackage.OnConnectToAS(Object sender, EventArgs e)
```

Why does my SSMS installation keep failing?

The first install failed because another program was being installed at the same time. Now every installation attempt ends up like this: ![图片说明](https://img-ask.csdn.net/upload/201906/30/1561858582_371795.png) Log: [2F6C:2804][2019-06-30T09:36:38]i001: Burn v3.8.1128.0, Windows v6.3 (Build 9600: Service Pack 0), path: C:\Users\鲁元博\Downloads\SSMS-Setup-CHS (1).exe, cmdline: '-burn.unelevated BurnPipe.{0A675724-EDF1-4111-B250-5037B43ED2B1} {A79C7EF2-219A-402E-866A-D7013734C967} 12992' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMSINSTALLROOT' to value '[ProgramFilesFolder]Microsoft SQL Server Management Studio 18' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMSInstallExists' to value '0' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMSInstalledLanguageMatch' to value 'false' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMS18PreReleaseDetected' to value 'false' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMS18InstalledVersion' to value '' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'ProductVersionVar' to value '15.0.18131.0' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'HeaderText' to value '版本 18.1' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SubHeaderText' to value 'Microsoft SQL Server Management Studio' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'LicenseTermsUrl' to value 'https://go.microsoft.com/fwlink/?LinkID=620835&clcid=0x[System.Convert]::ToInt32(2052)' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'PreviewStatementUrl' to value 'https://go.microsoft.com/fwlink/?LinkID=824140' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'TelemetryDocumentationUrl' to value 'https://go.microsoft.com/fwlink/?LinkID=869476' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing numeric variable 'InstallerLcid' to value '2052' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'CancelText' to value '取消' 
[2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'CloseText' to value '关闭(_C)' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'ContinuePastWarningText' to value '是否要继续?' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'InstallText' to value '安装(_I)' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'LicenseTermsText' to value '许可条款' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'LoadingPackagesText' to value '正在加载程序包。请稍候...' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'OverallProgressText' to value '总体进度' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'PackageProgressText' to value '程序包进度' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'PrivacyStatementText' to value '隐私声明' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'TelemetryDocumentationText' to value '文档' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'PrivacyDisclaimerText' to value '为了有助于改进产品,SQL Server Management Studio 会向 Microsoft 传输安装体验信息,以及其他使用情况和性能数据。若要详细了解数据处理和隐私控制,以及在安装后禁用收集此信息,请参阅 {documentation}。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'RepairText' to value '修复' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'RestartText' to value '重新启动' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupBlockedDescriptionText' to value '出了点问题,导致安装程序无法继续。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupBlockedText' to value '已阻止安装程序' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupCanceledDescriptionText' to value '已取消安装操作。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupCanceledText' to value '已取消安装程序' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupCompletedText' to value '已完成安装程序' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 
'SetupFailedDescriptionText' to value '安装过程中出错了。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupFailedText' to value '安装失败' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupHelpDescriptionText' to value '/install | /repair | /uninstall - 安装、修复或卸载。安装为默认操作。 /passive | /quiet - 在无提示的情况下显示最小 UI 或不显示 UI 和提示。默认显示 UI 和所有提示。 /norestart - 取消任何重启尝试。默认情况下,UI 会在重启前显示提示。 /log <日志文件前缀> - SSMS 安装程序日志的前缀。默认在 %TEMP%\SSMSSetup 下创建日志文件。 SSMSInstallRoot=<SSMS 位置的路径>。 默认情况下为 {0}。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupHelpText' to value '安装程序帮助' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupInstalledText' to value '已成功安装所有指定的组件。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupLayoutText' to value '已完成布局操作。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupLogText' to value '单击此处查看日志文件。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupProgressText' to value '安装进度' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupRepairedText' to value '已成功修复所有指定的组件。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupRestartDescriptionText' to value '需要先重启计算机,然后安装程序才能继续。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupRestartText' to value '需要重启才能完成安装程序。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupUninstalledText' to value '已成功卸载所有指定的组件。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupWarningDescriptionText' to value '以下问题可能会影响已安装的应用程序。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SetupWarningText' to value '安装警告' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'TermsAndConditionText' to value '单击“安装”按钮即表明本人接受 {License Terms} 和 {Privacy Statement}。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'UninstallText' to value '卸载' 
[2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'WelcomeInstallText' to value '欢迎使用。单击“安装”,立即开始体验吧。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'YesText' to value '是' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'RestartAlreadyPending' to value '挂起的重启正在阻止完成安装过程。请重启计算机并再次运行安装程序。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'AnotherInstallRunning' to value '由于当前正在运行另一个安装,安装程序被阻止。请先完成另一个安装,然后在必要时重启计算机。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'SSMSCurrentlyRunning' to value 'SSMS 当前正在运行。请关闭 SSMS 所有已打开的实例并重新运行此安装程序。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'MinimizeButtonAccessibleName' to value '最小化' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'Win10OrWS2016NotSupported' to value '不支持此版本的 Windows 10 或 Windows Server 2016。请升级到版本 1607 或更高版本。' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'IsUpgradeScenario' to value 'false' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'ChangeText' to value '更改(_H)' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'LocationText' to value '位置(_L):' [2F6C:2804][2019-06-30T09:36:38]i000: Initializing string variable 'InvalidLocationText' to value '位置无效。请输入有效的位置以继续。' [2F6C:2804][2019-06-30T09:36:38]i000: Setting string variable 'WixBundleLog' to value 'C:\Users\鲁元博\AppData\Local\Temp\SsmsSetup\SSMS-Setup-CHS_20190630093638.log' [2F6C:2804][2019-06-30T09:36:38]i000: Setting string variable 'WixBundleOriginalSource' to value 'C:\Users\鲁元博\Downloads\SSMS-Setup-CHS (1).exe' [2F6C:2804][2019-06-30T09:36:38]i000: Setting string variable 'WixBundleName' to value 'Microsoft SQL Server Management Studio - 18.1' [2F6C:2804][2019-06-30T09:36:38]i000: Loading managed bootstrapper application. [2F6C:2804][2019-06-30T09:36:38]i000: Creating BA thread to run asynchronously. 
[2F6C:2E80][2019-06-30T09:36:39]i000: ManagedBootstrapperApp.Run: Launching the managed bootstrapper application. [2F6C:2E80][2019-06-30T09:36:39]i000: BootstrapperMetadataModel.Initialize: Start loading the bootstrapper app data xml file [2F6C:1078][2019-06-30T09:36:39]i000: ManagedBootstrapperApp.LogUserEnvironmentInfoInBackground: OS Caption: Microsoft Windows 10 家庭中文版 [2F6C:1078][2019-06-30T09:36:39]i000: ManagedBootstrapperApp.LogUserEnvironmentInfoInBackground: OS Version: 10.0.16299 [2F6C:1078][2019-06-30T09:36:39]i000: ManagedBootstrapperApp.LogUserEnvironmentInfoInBackground: NetFx4 Version: 4.7.03062 [2F6C:1078][2019-06-30T09:36:39]i000: ManagedBootstrapperApp.LogUserEnvironmentInfoInBackground: OS UI Culture: Chinese (Simplified, China) (2052) [2F6C:2E80][2019-06-30T09:36:39]i000: BootstrapperMetadataModel.Initialize: Completed loading the bootstrapper app data xml file content: <BootstrapperApplicationData xmlns="http://schemas.microsoft.com/wix/2010/BootstrapperApplicationData"> <WixBalCondition Condition="SSMS18PreReleaseDetected = 0" Message="无法安装正式版(GA) SQL Server Management Studio (SSMS) v18.0,因为计算机上安装了预发布版 SSMS。请在控制面板的“添加/删除程序”中卸载预发布版 SSMS,并再次运行 SSMS 安装程序。" /> <WixBalCondition Condition="(SSMS18PreReleaseDetected = 1) OR (SSMSInstallExists = 0) OR SSMSInstalledLanguageMatch" Message="只能通过安装匹配语言包来升级 SSMS。请使用匹配版本的安装程序,或卸载当前版本的 SSMS 并再次运行 SSMS 安装程序。" /> <WixBalCondition Condition="NOT Msix64" Message="SSMS 只能安装在 64 位版本的 Windows 上。" /> <WixBalCondition Condition="RebootPending = 0" Message="安装程序检测到有一个挂起的计算机重启操作。请重启计算机,然后再次运行安装程序" /> <WixBalCondition Condition="(VersionNT = v6.1 AND ServicePackLevel = 1) OR VersionNT &gt; v6.1" Message="不支持当前操作系统。此应用程序至少需要 Windows 7 SP1 或 Windows Server 2008 R2 SP1 才能运行。" /> <WixBalCondition Condition="Installed OR (VersionNT &lt;&gt; v6.2) OR (InstallationType &lt;&gt; &quot;Client&quot;)" Message="不支持 Windows 8。请升级操作系统,然后再继续。" /> <WixBalCondition Condition="(VersionNT &lt;&gt; v6.3) OR (KB2919355_amd64_CurrentState = 
112 OR KB2919355_x86_CurrentState = 112)" Message="The update corresponding to KB2919355 needs to be installed before you can install this product on Windows 8.1 or Windows Server 2012 R2. Please refer to https://support.microsoft.com/en-us/kb/2919355/ to obtain and install this update." /> <WixBundleProperties DisplayName="Microsoft SQL Server Management Studio - 18.1" LogPathVariable="WixBundleLog" Compressed="no" Id="{88251298-f74d-4665-aec9-1d88d509fc5b}" UpgradeCode="{C55E865B-F94F-42FC-A95A-00F24602F1C2}" PerMachine="yes" /> <WixMbaPrereqInformation PackageId="NetFx45Web" LicenseUrl="http://go.microsoft.com/fwlink/?LinkID=260867" /> <WixPackageProperties Package="VCRedistD12x86" Vital="no" DisplayName="Microsoft Visual C++ 2013 Redistributable (x86) - 12.0.30501" Description="Microsoft Visual C++ 2013 Redistributable (x86) - 12.0.30501" DownloadSize="6503984" PackageSize="6503984" InstalledSize="6503984" PackageType="Exe" Permanent="yes" LogPathVariable="WixBundleLog_VCRedistD12x86" RollbackLogPathVariable="WixBundleRollbackLog_VCRedistD12x86" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="VCRedistD14x86" Vital="no" DisplayName="Microsoft Visual C++ 2017 Redistributable (x86) - 14.16.27029" Description="Microsoft Visual C++ 2017 Redistributable (x86) - 14.16.27029" DownloadSize="14673288" PackageSize="14673288" InstalledSize="14673288" PackageType="Exe" Permanent="yes" LogPathVariable="WixBundleLog_VCRedistD14x86" RollbackLogPathVariable="WixBundleRollbackLog_VCRedistD14x86" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="VCRedistD14x64" Vital="no" DisplayName="Microsoft Visual C++ 2017 Redistributable (x64) - 14.16.27029" Description="Microsoft Visual C++ 2017 Redistributable (x64) - 14.16.27029" DownloadSize="15354672" PackageSize="15354672" InstalledSize="15354672" PackageType="Exe" Permanent="yes" LogPathVariable="WixBundleLog_VCRedistD14x64" RollbackLogPathVariable="WixBundleRollbackLog_VCRedistD14x64" 
Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="HelpViewer2_3" Vital="yes" DisplayName="Microsoft Help Viewer 2.3" DownloadSize="3149592" PackageSize="3149592" InstalledSize="13682786" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_HelpViewer2_3" RollbackLogPathVariable="WixBundleRollbackLog_HelpViewer2_3" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="HelpViewer2_3_LP" Vital="yes" DisplayName="Microsoft Help Viewer 2.3 语言包 - 简体中文" DownloadSize="556625" PackageSize="556625" InstalledSize="812913" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_HelpViewer2_3_LP" RollbackLogPathVariable="WixBundleRollbackLog_HelpViewer2_3_LP" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="DotNet47" Vital="yes" DisplayName="Microsoft .NET Framework 4.7.2" Description="Microsoft .NET Framework 4.7.2 Setup" DownloadSize="83943272" PackageSize="83943272" InstalledSize="83943272" PackageType="Exe" Permanent="yes" LogPathVariable="WixBundleLog_DotNet47" RollbackLogPathVariable="WixBundleRollbackLog_DotNet47" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="NetFx45Web" Vital="yes" DisplayName="Microsoft .NET Framework 4.5" Description="Microsoft .NET Framework 4.5 Setup" DownloadSize="1005568" PackageSize="1005568" InstalledSize="1005568" PackageType="Exe" Permanent="yes" LogPathVariable="NetFx45FullWebLog" RollbackLogPathVariable="WixBundleRollbackLog_NetFx45Web" Compressed="no" DisplayInternalUI="no" /> <WixPackageProperties Package="sqlncli.msi" Vital="yes" DisplayName="Microsoft SQL Server 2012 Native Client " DownloadSize="5107712" PackageSize="5107712" InstalledSize="10813810" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_sqlncli.msi" RollbackLogPathVariable="WixBundleRollbackLog_sqlncli.msi" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="msodbcsql.msi" Vital="yes" DisplayName="Microsoft ODBC Driver 17 for 
SQL Server" DownloadSize="3657728" PackageSize="3657728" InstalledSize="10917884" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_msodbcsql.msi" RollbackLogPathVariable="WixBundleRollbackLog_msodbcsql.msi" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="msoledbsql.msi" Vital="yes" DisplayName="Microsoft OLE DB Driver for SQL Server" DownloadSize="5390336" PackageSize="5390336" InstalledSize="12212300" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_msoledbsql.msi" RollbackLogPathVariable="WixBundleRollbackLog_msoledbsql.msi" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="adalsql_x64" Vital="yes" DisplayName="适用于 SQL Server 的 Active Directory 验证库" DownloadSize="2871296" PackageSize="2871296" InstalledSize="3357210" PackageType="Msi" Permanent="yes" LogPathVariable="WixBundleLog_adalsql_x64" RollbackLogPathVariable="WixBundleRollbackLog_adalsql_x64" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="sql_as_oledb_x64" Vital="yes" DisplayName="Microsoft Analysis Services OLE DB 提供程序" DownloadSize="74444800" PackageSize="74444800" InstalledSize="478359608" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_sql_as_oledb_x64" RollbackLogPathVariable="WixBundleRollbackLog_sql_as_oledb_x64" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="sql_as_oledb_x86" Vital="yes" DisplayName="Microsoft Analysis Services OLE DB 提供程序" DownloadSize="36139008" PackageSize="36139008" InstalledSize="232289266" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_sql_as_oledb_x86" RollbackLogPathVariable="WixBundleRollbackLog_sql_as_oledb_x86" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="VS2017IsoShellForSSMS" Vital="yes" DisplayName="Visual Studio 2017 Isolated Shell for SSMS" DownloadSize="142082048" PackageSize="142082048" InstalledSize="418962877" PackageType="Msi" Permanent="no" 
LogPathVariable="WixBundleLog_VS2017IsoShellForSSMS" RollbackLogPathVariable="WixBundleRollbackLog_VS2017IsoShellForSSMS" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="VS2017IsoShellForSSMS_LP" Vital="yes" DisplayName="用于 SSMS LangPack 的 Visual Studio 2017 Shell (独立) - 简体中文" DownloadSize="6905856" PackageSize="6905856" InstalledSize="31787409" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_VS2017IsoShellForSSMS_LP" RollbackLogPathVariable="WixBundleRollbackLog_VS2017IsoShellForSSMS_LP" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="VSTA2017" Vital="yes" DisplayName="Microsoft Visual Studio Tools for Applications 2017" Description="Microsoft Visual Studio Tools for Applications 2017" DownloadSize="13647984" PackageSize="13647984" InstalledSize="13647984" PackageType="Exe" Permanent="yes" LogPathVariable="WixBundleLog_VSTA2017" RollbackLogPathVariable="WixBundleRollbackLog_VSTA2017" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="sql_ssms_x64" Vital="yes" DisplayName="SQL Server Management Studio" DownloadSize="47964160" PackageSize="47964160" InstalledSize="208078967" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_sql_ssms_x64" RollbackLogPathVariable="WixBundleRollbackLog_sql_ssms_x64" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="sql_ssms_loc_x64_Loc" Vital="yes" DisplayName="SQL Server Management Studio" DownloadSize="7376896" PackageSize="7376896" InstalledSize="32747110" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_sql_ssms_loc_x64_Loc" RollbackLogPathVariable="WixBundleRollbackLog_sql_ssms_loc_x64_Loc" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="ssms_rs_x64" Vital="yes" DisplayName="SQL Server Management Studio for Reporting Services" DownloadSize="8765440" PackageSize="8765440" InstalledSize="27019353" PackageType="Msi" Permanent="no" 
LogPathVariable="WixBundleLog_ssms_rs_x64" RollbackLogPathVariable="WixBundleRollbackLog_ssms_rs_x64" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="ssms_as_x64" Vital="yes" DisplayName="SQL Server Management Studio for Analysis Services" DownloadSize="73297920" PackageSize="73297920" InstalledSize="320197605" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_ssms_as_x64" RollbackLogPathVariable="WixBundleRollbackLog_ssms_as_x64" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="ssms_as_loc_x86" Vital="yes" DisplayName="SQL Server Management Studio for Analysis Services Localization" DownloadSize="3825664" PackageSize="3825664" InstalledSize="16651908" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_ssms_as_loc_x86" RollbackLogPathVariable="WixBundleRollbackLog_ssms_as_loc_x86" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="ssms_rs_loc_x86" Vital="yes" DisplayName="SQL Server Management Studio for Reporting Services Localization" DownloadSize="1548288" PackageSize="1548288" InstalledSize="4713000" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_ssms_rs_loc_x86" RollbackLogPathVariable="WixBundleRollbackLog_ssms_rs_loc_x86" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="ssms_is" Vital="yes" DisplayName="Integration Services" DownloadSize="32272384" PackageSize="32272384" InstalledSize="119146899" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_ssms_is" RollbackLogPathVariable="WixBundleRollbackLog_ssms_is" Compressed="yes" DisplayInternalUI="no" /> <WixPackageProperties Package="SsmsPostInstall_x64" Vital="yes" DisplayName="SSMS Post Install Tasks" DownloadSize="303104" PackageSize="303104" InstalledSize="0" PackageType="Msi" Permanent="no" LogPathVariable="WixBundleLog_SsmsPostInstall_x64" RollbackLogPathVariable="WixBundleRollbackLog_SsmsPostInstall_x64" Compressed="yes" DisplayInternalUI="no" /> 
<WixPayloadProperties Payload="VCRedistD12x86" Package="VCRedistD12x86" Container="WixAttachedContainer" Name="2013\vcredist_x86.exe" Size="6503984" DownloadUrl="https://aka.ms/vs/15/release/vc_redist.x86.exe" LayoutOnly="no" /> <WixPayloadProperties Payload="VCRedistD14x86" Package="VCRedistD14x86" Container="WixAttachedContainer" Name="2017\VC_redist.x86.exe" Size="14673288" DownloadUrl="https://aka.ms/vs/15/release/vc_redist.x86.exe" LayoutOnly="no" /> <WixPayloadProperties Payload="VCRedistD14x64" Package="VCRedistD14x64" Container="WixAttachedContainer" Name="2017\VC_redist.x64.exe" Size="15354672" DownloadUrl="https://aka.ms/vs/15/release/vc_redist.x64.exe" LayoutOnly="no" /> <WixPayloadProperties Payload="HelpViewer2_3" Package="HelpViewer2_3" Container="WixAttachedContainer" Name="redist\help3_vs_net.msi" Size="311296" LayoutOnly="no" /> <WixPayloadProperties Payload="cabE503715EC9048140D7380B4A179097EF" Package="HelpViewer2_3" Container="WixAttachedContainer" Name="redist\cab1.cab" Size="2838296" LayoutOnly="no" /> <WixPayloadProperties Payload="HelpViewer2_3_LP" Package="HelpViewer2_3_LP" Container="WixAttachedContainer" Name="redist\help3_LP_net.msi" Size="294912" LayoutOnly="no" /> <WixPayloadProperties Payload="cab96EEBCEEC415A2FC7E4C6265548D48E5" Package="HelpViewer2_3_LP" Container="WixAttachedContainer" Name="redist\cab1.cab" Size="261713" LayoutOnly="no" /> <WixPayloadProperties Payload="DotNet47" Package="DotNet47" Container="WixAttachedContainer" Name="redist\NDP472-KB4054530-x86-x64-AllOS-ENU.exe" Size="83943272" LayoutOnly="no" /> <WixPayloadProperties Payload="NetFx45Web" Package="NetFx45Web" Name="redist\dotNetFx45_Full_setup.exe" Size="1005568" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=225704" LayoutOnly="no" /> <WixPayloadProperties Payload="sqlncli.msi" Package="sqlncli.msi" Container="WixAttachedContainer" Name="x64\sqlncli.msi" Size="5107712" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=718112" LayoutOnly="no" /> 
<WixPayloadProperties Payload="msodbcsql.msi" Package="msodbcsql.msi" Container="WixAttachedContainer" Name="x64\msodbcsql.msi" Size="3657728" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=718071" LayoutOnly="no" /> <WixPayloadProperties Payload="msoledbsql.msi" Package="msoledbsql.msi" Container="WixAttachedContainer" Name="x64\msoledbsql.msi" Size="5390336" LayoutOnly="no" /> <WixPayloadProperties Payload="adalsql_x64" Package="adalsql_x64" Container="WixAttachedContainer" Name="x64\adalsql.msi" Size="2871296" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=718064" LayoutOnly="no" /> <WixPayloadProperties Payload="sql_as_oledb_x64" Package="sql_as_oledb_x64" Container="WixAttachedContainer" Name="x64\sql_as_oledb.msi" Size="74444800" LayoutOnly="no" /> <WixPayloadProperties Payload="sql_as_oledb_x86" Package="sql_as_oledb_x86" Container="WixAttachedContainer" Name="x86\sql_as_oledb.msi" Size="36139008" LayoutOnly="no" /> <WixPayloadProperties Payload="VS2017IsoShellForSSMS" Package="VS2017IsoShellForSSMS" Container="WixAttachedContainer" Name="redist\vs2017_isoshell_for_ssms.msi" Size="142082048" LayoutOnly="no" /> <WixPayloadProperties Payload="VS2017IsoShellForSSMS_LP" Package="VS2017IsoShellForSSMS_LP" Container="WixAttachedContainer" Name="redist\vs2017_isoshell_for_ssms_lp.msi" Size="6905856" LayoutOnly="no" /> <WixPayloadProperties Payload="VSTA2017" Package="VSTA2017" Container="WixAttachedContainer" Name="redist\vsta_setup.exe" Size="13647984" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=799679" LayoutOnly="no" /> <WixPayloadProperties Payload="sql_ssms_x64" Package="sql_ssms_x64" Container="WixAttachedContainer" Name="x64\sql_ssms.msi" Size="47964160" DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=718090" LayoutOnly="no" /> <WixPayloadProperties Payload="sql_ssms_loc_x64_Loc" Package="sql_ssms_loc_x64_Loc" Container="WixAttachedContainer" Name="x64\sql_ssms_loc.msi" Size="7376896" 
DownloadUrl="http://go.microsoft.com/fwlink/?LinkId=718096" LayoutOnly="no" /> <WixPayloadProperties Payload="ssms_rs_x64" Package="ssms_rs_x64" Container="WixAttachedContainer" Name="x64\ssms_rs.msi" Size="8765440" LayoutOnly="no" /> <WixPayloadProperties Payload="ssms_as_x64" Package="ssms_as_x64" Container="WixAttachedContainer" Name="x64\ssms_as.msi" Size="73297920" LayoutOnly="no" /> <WixPayloadProperties Payload="ssms_as_loc_x86" Package="ssms_as_loc_x86" Container="WixAttachedContainer" Name="x86\ssms_as_loc.msi" Size="3825664" LayoutOnly="no" /> <WixPayloadProperties Payload="ssms_rs_loc_x86" Package="ssms_rs_loc_x86" Container="WixAttachedContainer" Name="x86\ssms_rs_loc.msi" Size="1548288" LayoutOnly="no" /> <WixPayloadProperties Payload="ssms_is" Package="ssms_is" Container="WixAttachedContainer" Name="x86\ssms_is.msi" Size="32272384" LayoutOnly="no" /> <WixPayloadProperties Payload="SsmsPostInstall_x64" Package="SsmsPostInstall_x64" Container="WixAttachedContainer" Name="x64\SsmsPostInstall.msi" Size="303104" LayoutOnly="no" /> <WixStdbaOverridableVariable Name="SSMSINSTALLROOT" /> </BootstrapperApplicationData> [2F6C:2E80][2019-06-30T09:36:40]i000: MainViewModel.CheckFailedConditions: Check whether OS is Windows 10 / Windows Server 2016 pre-RS1 (build <= 10586). OSVersion = 10.0.16299 [2F6C:2E80][2019-06-30T09:36:40]i000: ManagedBootstrapperApp.SetCommandLineProperties: Begin [2F6C:2E80][2019-06-30T09:36:40]i000: ManagedBootstrapperApp.SetCommandLineProperties: End [2F6C:2804][2019-06-30T09:36:41]i100: Detect begin, 24 packages [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'NETFRAMEWORK45' to value '461814' [2F6C:2804][2019-06-30T09:36:41]i000: Registry value not found. Key = 'SOFTWARE\Microsoft\Microsoft SQL Server Management Studio\18', Value = 'Version' [2F6C:2804][2019-06-30T09:36:41]i000: Registry key not found. 
Key = 'SOFTWARE\Microsoft\SQL Server Management Studio\18' [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'SSMS18PreReleaseDetected' to value 0 [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'SSMSInstallExists' to value 1 [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'SSMSInstallExists' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'SSMSINSTALLROOT' to value 'C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\' [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'SSMSInstallExists' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i000: Registry value not found. Key = 'SOFTWARE\Microsoft\Microsoft SQL Server Management Studio\18', Value = 'Version' [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'SSMSInstallExists' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i000: Registry key not found. Key = 'SOFTWARE\Microsoft\Microsoft SQL Server Management Studio\18\Language' [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'SSMSInstalledLanguageMatch' to value 0 [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'VCRedist_D12x86_KeyExists' to value 1 [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x64_Bld' to value '27024' [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'VCRedist_D14x64_KeyExists' to value 1 [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x64_Major' to value '14' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x64_Minor' to value '16' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x86_Bld' to value '27024' [2F6C:2804][2019-06-30T09:36:41]i000: Setting numeric variable 'VCRedist_D14x86_KeyExists' to value 1 [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x86_Major' to value '14' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'VCRedist_D14x86_Minor' to value '16' 
[2F6C:2804][2019-06-30T09:36:41]i000: Registry key not found. Key = 'HKLM\Software\Microsoft\DevDiv\vsta\Servicing\15.0\hosting' [2F6C:2804][2019-06-30T09:36:41]i000: Registry key not found. Key = 'SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\Packages\Package_for_KB2919355~31bf3856ad364e35~amd64~~6.3.1.14' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'Netfx4FullReleaseX64' to value '461814' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'InstallationType' to value 'Client' [2F6C:2804][2019-06-30T09:36:41]i000: Setting string variable 'Netfx4ClientReleaseX64' to value '461814' [2F6C:2804][2019-06-30T09:36:41]i000: Registry key not found. Key = 'SOFTWARE\Microsoft\Windows\CurrentVersion\Component Based Servicing\Packages\Package_for_KB2919355~31bf3856ad364e35~x86~~6.3.1.14' [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'VCRedist_D12x86_KeyExists' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'VCRedist_D14x86_KeyExists AND (VCRedist_D14x86_Major >= 14) AND (VCRedist_D14x86_Minor >= 10) AND (VCRedist_D14x86_Bld >= 25008)' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'VCRedist_D14x64_KeyExists AND (VCRedist_D14x64_Major >= 14) AND (VCRedist_D14x64_Minor >= 10) AND (VCRedist_D14x64_Bld >= 25008)' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition '(Netfx4FullReleaseX64 >= 461808) OR (NetFx4ClientReleaseX64 >= 461808)' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'NETFRAMEWORK45 >= 378389' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'VSTA2017Installed' evaluates to false. 
[2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VCRedistD12x86, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VCRedistD14x86, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VCRedistD14x64, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: HelpViewer2_3, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: HelpViewer2_3_LP, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: DotNet47, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: NetFx45Web, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: sqlncli.msi, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: msodbcsql.msi, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: msoledbsql.msi, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: adalsql_x64, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: sql_as_oledb_x64, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: sql_as_oledb_x86, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VS2017IsoShellForSSMS, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VS2017IsoShellForSSMS_LP, state: Present, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: VSTA2017, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: sql_ssms_x64, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: sql_ssms_loc_x64_Loc, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: ssms_rs_x64, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: ssms_as_x64, state: Absent, 
cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: ssms_as_loc_x86, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: ssms_rs_loc_x86, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: ssms_is, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i101: Detected package: SsmsPostInstall_x64, state: Absent, cached: None [2F6C:2804][2019-06-30T09:36:41]i000: BootstrapperEngineDataModel.OnDetectComplete.: Entering... [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel: Trying to parse value '1' for property 'SSMSInstallExists'... [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel.CheckInstallPathIsValid: Successfully created and deleted installation folder - 'C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\' [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel.OnBootstrapperReady: IsUpgradeScenario=True [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel.OnBootstrapperReady: SSMSInstallRoot=C:\Program Files (x86)\Microsoft SQL Server Management Studio 18\ [2F6C:2804][2019-06-30T09:36:41]e000: MainViewModel: SSMSInstallVersion not found. [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel.OnBootstrapperReady: SSMSInstallVersion= [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'RebootPending = 1' evaluates to false. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'SSMS18PreReleaseDetected = 0' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition '(SSMS18PreReleaseDetected = 1) OR (SSMSInstallExists = 0) OR SSMSInstalledLanguageMatch' evaluates to false. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'NOT Msix64' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition 'RebootPending = 0' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition '(VersionNT = v6.1 AND ServicePackLevel = 1) OR VersionNT > v6.1' evaluates to true. 
[2F6C:2804][2019-06-30T09:36:41]i052: Condition 'Installed OR (VersionNT <> v6.2) OR (InstallationType <> "Client")' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i052: Condition '(VersionNT <> v6.3) OR (KB2919355_amd64_CurrentState = 112 OR KB2919355_x86_CurrentState = 112)' evaluates to true. [2F6C:2804][2019-06-30T09:36:41]i000: MainViewModel.AddFailedCondition: Error: 只能通过安装匹配语言包来升级 SSMS。请使用匹配版本的安装程序,或卸载当前版本的 SSMS 并再次运行 SSMS 安装程序。 [Translation: SSMS can only be upgraded by installing a matching language pack. Use an installer of the matching language version, or uninstall the current version of SSMS and run SSMS setup again.] [2F6C:2804][2019-06-30T09:36:41]i000: BootstrapperEngineDataModel.OnDetectComplete.: Exiting. [2F6C:2804][2019-06-30T09:36:41]i199: Detect complete, result: 0x0 [2F6C:2E80][2019-06-30T09:36:43]i000: MainViewModel.OpenUrl: Opening url: C:\Users\鲁元博\AppData\Local\Temp\SsmsSetup\SSMS-Setup-CHS_20190630093638.log

Spark cannot read the Hive metastore and fails to see the Hive databases

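A common cause of the symptom in the title is that the Spark job runs against Spark's built-in in-memory catalog instead of the Hive metastore, either because the session is not built with Hive support or because `hive-site.xml` is not shipped with the job; in that case `show databases` returns only `default`. A minimal, hedged sketch of a submit command (the `hive-site.xml` path, the package name `com.example`, and the master/deploy-mode flags are assumptions — adjust them to this cluster; the jar and class names are taken from the log below):

```shell
# Assumption: hive-site.xml lives in /etc/hive/conf on the submitting host.
# Ship it with the job and force the Hive catalog implementation, so the
# driver and executors talk to the real Hive metastore instead of the
# default in-memory catalog.
spark-submit \
  --master yarn --deploy-mode cluster \
  --files /etc/hive/conf/hive-site.xml \
  --conf spark.sql.catalogImplementation=hive \
  --class com.example.VoiceApplication2 \
  spark_history_data2.jar
```

The in-code equivalent is building the session with `SparkSession.builder().enableHiveSupport().getOrCreate()`; without either, Spark silently falls back to its local catalog rather than failing.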
Here is the exception log:

```
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties SLF4J: Class path contains multiple SLF4J bindings. SLF4J: Found binding in [jar:file:/data01/hadoop/yarn/local/filecache/355/spark2-hdp-yarn-archive.tar.gz/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: Found binding in [jar:file:/usr/hdp/2.6.5.0-292/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class] SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation. SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory] 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for TERM 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for HUP 19/08/13 19:53:17 INFO SignalUtils: Registered signal handler for INT 19/08/13 19:53:17 INFO SecurityManager: Changing view acls to: yarn,hdfs 19/08/13 19:53:17 INFO SecurityManager: Changing modify acls to: yarn,hdfs 19/08/13 19:53:17 INFO SecurityManager: Changing view acls groups to: 19/08/13 19:53:17 INFO SecurityManager: Changing modify acls groups to: 19/08/13 19:53:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); groups with view permissions: Set(); users with modify permissions: Set(yarn, hdfs); groups with modify permissions: Set() 19/08/13 19:53:18 INFO ApplicationMaster: Preparing Local resources 19/08/13 19:53:19 INFO ApplicationMaster: ApplicationAttemptId: appattempt_1565610088533_0087_000001 19/08/13 19:53:19 INFO ApplicationMaster: Starting the user application in a separate Thread 19/08/13 19:53:19 INFO ApplicationMaster: Waiting for spark context initialization...
19/08/13 19:53:19 INFO SparkContext: Running Spark version 2.3.0.2.6.5.0-292 19/08/13 19:53:19 INFO SparkContext: Submitted application: voice_stream 19/08/13 19:53:19 INFO SecurityManager: Changing view acls to: yarn,hdfs 19/08/13 19:53:19 INFO SecurityManager: Changing modify acls to: yarn,hdfs 19/08/13 19:53:19 INFO SecurityManager: Changing view acls groups to: 19/08/13 19:53:19 INFO SecurityManager: Changing modify acls groups to: 19/08/13 19:53:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(yarn, hdfs); groups with view permissions: Set(); users with modify permissions: Set(yarn, hdfs); groups with modify permissions: Set() 19/08/13 19:53:19 INFO Utils: Successfully started service 'sparkDriver' on port 20410. 19/08/13 19:53:19 INFO SparkEnv: Registering MapOutputTracker 19/08/13 19:53:19 INFO SparkEnv: Registering BlockManagerMaster 19/08/13 19:53:19 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 19/08/13 19:53:19 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 19/08/13 19:53:19 INFO DiskBlockManager: Created local directory at /data01/hadoop/yarn/local/usercache/hdfs/appcache/application_1565610088533_0087/blockmgr-94d35b97-43b2-496e-a4cb-73ecd3ed186c 19/08/13 19:53:19 INFO MemoryStore: MemoryStore started with capacity 366.3 MB 19/08/13 19:53:19 INFO SparkEnv: Registering OutputCommitCoordinator 19/08/13 19:53:19 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter 19/08/13 19:53:19 INFO Utils: Successfully started service 'SparkUI' on port 28852. 
19/08/13 19:53:19 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://datanode02:28852 19/08/13 19:53:19 INFO YarnClusterScheduler: Created YarnClusterScheduler 19/08/13 19:53:20 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1565610088533_0087 and attemptId Some(appattempt_1565610088533_0087_000001) 19/08/13 19:53:20 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 31984. 19/08/13 19:53:20 INFO NettyBlockTransferService: Server created on datanode02:31984 19/08/13 19:53:20 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy 19/08/13 19:53:20 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManagerMasterEndpoint: Registering block manager datanode02:31984 with 366.3 MB RAM, BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, datanode02, 31984, None) 19/08/13 19:53:20 INFO EventLoggingListener: Logging events to hdfs:/spark2-history/application_1565610088533_0087_1 19/08/13 19:53:20 INFO ApplicationMaster: =============================================================================== YARN executor launch context: env: CLASSPATH -> 
{{PWD}}<CPS>{{PWD}}/__spark_conf__<CPS>{{PWD}}/__spark_libs__/*<CPS>/usr/hdp/2.6.5.0-292/hadoop/conf<CPS>/usr/hdp/2.6.5.0-292/hadoop/*<CPS>/usr/hdp/2.6.5.0-292/hadoop/lib/*<CPS>/usr/hdp/current/hadoop-hdfs-client/*<CPS>/usr/hdp/current/hadoop-hdfs-client/lib/*<CPS>/usr/hdp/current/hadoop-yarn-client/*<CPS>/usr/hdp/current/hadoop-yarn-client/lib/*<CPS>/usr/hdp/current/ext/hadoop/*<CPS>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.6.5.0-292/hadoop/lib/hadoop-lzo-0.6.0.2.6.5.0-292.jar:/etc/hadoop/conf/secure:/usr/hdp/current/ext/hadoop/*<CPS>{{PWD}}/__spark_conf__/__hadoop_conf__ SPARK_YARN_STAGING_DIR -> *********(redacted) SPARK_USER -> *********(redacted) command: LD_LIBRARY_PATH="/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:$LD_LIBRARY_PATH" \ {{JAVA_HOME}}/bin/java \ -server \ -Xmx5120m \ -Djava.io.tmpdir={{PWD}}/tmp \ '-Dspark.history.ui.port=18081' \ '-Dspark.rpc.message.maxSize=100' \ -Dspark.yarn.app.container.log.dir=<LOG_DIR> \ -XX:OnOutOfMemoryError='kill %p' \ org.apache.spark.executor.CoarseGrainedExecutorBackend \ --driver-url \ spark://CoarseGrainedScheduler@datanode02:20410 \ --executor-id \ <executorId> \ --hostname \ <hostname> \ --cores \ 2 \ --app-id \ application_1565610088533_0087 \ --user-class-path \ file:$PWD/__app__.jar \ --user-class-path \ file:$PWD/hadoop-common-2.7.3.jar \ --user-class-path \ file:$PWD/guava-12.0.1.jar \ --user-class-path \ file:$PWD/hbase-server-1.2.8.jar \ --user-class-path \ file:$PWD/hbase-protocol-1.2.8.jar \ --user-class-path \ file:$PWD/hbase-client-1.2.8.jar \ 
--user-class-path \ file:$PWD/hbase-common-1.2.8.jar \ --user-class-path \ file:$PWD/mysql-connector-java-5.1.44-bin.jar \ --user-class-path \ file:$PWD/spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar \ --user-class-path \ file:$PWD/spark-examples_2.11-1.6.0-typesafe-001.jar \ --user-class-path \ file:$PWD/fastjson-1.2.7.jar \ 1><LOG_DIR>/stdout \ 2><LOG_DIR>/stderr resources: spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark-streaming-kafka-0-8-assembly_2.11-2.3.2.jar" } size: 12271027 timestamp: 1565697198603 type: FILE visibility: PRIVATE spark-examples_2.11-1.6.0-typesafe-001.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark-examples_2.11-1.6.0-typesafe-001.jar" } size: 1867746 timestamp: 1565697198751 type: FILE visibility: PRIVATE hbase-server-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-server-1.2.8.jar" } size: 4197896 timestamp: 1565697197770 type: FILE visibility: PRIVATE hbase-common-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-common-1.2.8.jar" } size: 570163 timestamp: 1565697198318 type: FILE visibility: PRIVATE __app__.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/spark_history_data2.jar" } size: 44924 timestamp: 1565697197260 type: FILE visibility: PRIVATE guava-12.0.1.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/guava-12.0.1.jar" } 
size: 1795932 timestamp: 1565697197614 type: FILE visibility: PRIVATE hbase-client-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-client-1.2.8.jar" } size: 1306401 timestamp: 1565697198180 type: FILE visibility: PRIVATE __spark_conf__ -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/__spark_conf__.zip" } size: 273513 timestamp: 1565697199131 type: ARCHIVE visibility: PRIVATE fastjson-1.2.7.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/fastjson-1.2.7.jar" } size: 417221 timestamp: 1565697198865 type: FILE visibility: PRIVATE hbase-protocol-1.2.8.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hbase-protocol-1.2.8.jar" } size: 4366252 timestamp: 1565697198023 type: FILE visibility: PRIVATE __spark_libs__ -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/hdp/apps/2.6.5.0-292/spark2/spark2-hdp-yarn-archive.tar.gz" } size: 227600110 timestamp: 1549953820247 type: ARCHIVE visibility: PUBLIC mysql-connector-java-5.1.44-bin.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/mysql-connector-java-5.1.44-bin.jar" } size: 999635 timestamp: 1565697198445 type: FILE visibility: PRIVATE hadoop-common-2.7.3.jar -> resource { scheme: "hdfs" host: "CID-042fb939-95b4-4b74-91b8-9f94b999bdf7" port: -1 file: "/user/hdfs/.sparkStaging/application_1565610088533_0087/hadoop-common-2.7.3.jar" } size: 3479293 timestamp: 1565697197476 type: FILE visibility: PRIVATE 
=============================================================================== 19/08/13 19:53:20 INFO RMProxy: Connecting to ResourceManager at namenode02/10.1.38.38:8030 19/08/13 19:53:20 INFO YarnRMClient: Registering the ApplicationMaster 19/08/13 19:53:20 INFO YarnAllocator: Will request 3 executor container(s), each with 2 core(s) and 5632 MB memory (including 512 MB of overhead) 19/08/13 19:53:20 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(spark://YarnAM@datanode02:20410) 19/08/13 19:53:20 INFO YarnAllocator: Submitted 3 unlocalized container requests. 19/08/13 19:53:20 INFO ApplicationMaster: Started progress reporter thread with (heartbeat : 3000, initial allocation : 200) intervals 19/08/13 19:53:20 INFO AMRMClientImpl: Received new token for : datanode03:45454 19/08/13 19:53:21 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000002 on host datanode03 for executor with ID 1 19/08/13 19:53:21 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: Opening proxy : datanode03:45454 19/08/13 19:53:21 INFO AMRMClientImpl: Received new token for : datanode01:45454 19/08/13 19:53:21 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000003 on host datanode01 for executor with ID 2 19/08/13 19:53:21 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 
19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:21 INFO ContainerManagementProtocolProxy: Opening proxy : datanode01:45454 19/08/13 19:53:22 INFO AMRMClientImpl: Received new token for : datanode02:45454 19/08/13 19:53:22 INFO YarnAllocator: Launching container container_e20_1565610088533_0087_01_000004 on host datanode02 for executor with ID 3 19/08/13 19:53:22 INFO YarnAllocator: Received 1 containers from YARN, launching executors on 1 of them. 19/08/13 19:53:22 INFO ContainerManagementProtocolProxy: yarn.client.max-cached-nodemanagers-proxies : 0 19/08/13 19:53:22 INFO ContainerManagementProtocolProxy: Opening proxy : datanode02:45454 19/08/13 19:53:24 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.198.144:41122) with ID 1 19/08/13 19:53:25 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.229.163:24656) with ID 3 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode03:3328 with 2.5 GB RAM, BlockManagerId(1, datanode03, 3328, None) 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode02:28863 with 2.5 GB RAM, BlockManagerId(3, datanode02, 28863, None) 19/08/13 19:53:25 INFO YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (10.1.229.158:64276) with ID 2 19/08/13 19:53:25 INFO YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.8 19/08/13 19:53:25 INFO YarnClusterScheduler: YarnClusterScheduler.postStartHook done 19/08/13 19:53:25 INFO BlockManagerMasterEndpoint: Registering block manager datanode01:20487 with 2.5 GB RAM, BlockManagerId(2, datanode01, 20487, None) 19/08/13 19:53:25 WARN SparkContext: Using an existing SparkContext; some configuration may not take 
effect. 19/08/13 19:53:25 INFO SparkContext: Starting job: start at VoiceApplication2.java:128 19/08/13 19:53:25 INFO DAGScheduler: Registering RDD 1 (start at VoiceApplication2.java:128) 19/08/13 19:53:25 INFO DAGScheduler: Got job 0 (start at VoiceApplication2.java:128) with 20 output partitions 19/08/13 19:53:25 INFO DAGScheduler: Final stage: ResultStage 1 (start at VoiceApplication2.java:128) 19/08/13 19:53:25 INFO DAGScheduler: Parents of final stage: List(ShuffleMapStage 0) 19/08/13 19:53:25 INFO DAGScheduler: Missing parents: List(ShuffleMapStage 0) 19/08/13 19:53:26 INFO DAGScheduler: Submitting ShuffleMapStage 0 (MapPartitionsRDD[1] at start at VoiceApplication2.java:128), which has no missing parents 19/08/13 19:53:26 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 3.1 KB, free 366.3 MB) 19/08/13 19:53:26 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2011.0 B, free 366.3 MB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode02:31984 (size: 2011.0 B, free: 366.3 MB) 19/08/13 19:53:26 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1039 19/08/13 19:53:26 INFO DAGScheduler: Submitting 50 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[1] at start at VoiceApplication2.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 19/08/13 19:53:26 INFO YarnClusterScheduler: Adding task set 0.0 with 50 tasks 19/08/13 19:53:26 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, datanode02, executor 3, partition 0, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, datanode03, executor 1, partition 1, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 2.0 in stage 0.0 (TID 2, datanode01, executor 2, partition 2, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 
3.0 in stage 0.0 (TID 3, datanode02, executor 3, partition 3, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 4.0 in stage 0.0 (TID 4, datanode03, executor 1, partition 4, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 5.0 in stage 0.0 (TID 5, datanode01, executor 2, partition 5, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode02:28863 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode03:3328 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on datanode01:20487 (size: 2011.0 B, free: 2.5 GB) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 6.0 in stage 0.0 (TID 6, datanode02, executor 3, partition 6, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 7.0 in stage 0.0 (TID 7, datanode02, executor 3, partition 7, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 3.0 in stage 0.0 (TID 3) in 693 ms on datanode02 (executor 3) (1/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 712 ms on datanode02 (executor 3) (2/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 8.0 in stage 0.0 (TID 8, datanode02, executor 3, partition 8, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 7.0 in stage 0.0 (TID 7) in 21 ms on datanode02 (executor 3) (3/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 9.0 in stage 0.0 (TID 9, datanode02, executor 3, partition 9, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 6.0 in stage 0.0 (TID 6) in 26 ms on datanode02 (executor 3) (4/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 10.0 in stage 0.0 (TID 10, datanode02, executor 3, partition 10, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished 
task 8.0 in stage 0.0 (TID 8) in 23 ms on datanode02 (executor 3) (5/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 11.0 in stage 0.0 (TID 11, datanode02, executor 3, partition 11, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 9.0 in stage 0.0 (TID 9) in 25 ms on datanode02 (executor 3) (6/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 12.0 in stage 0.0 (TID 12, datanode02, executor 3, partition 12, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 10.0 in stage 0.0 (TID 10) in 18 ms on datanode02 (executor 3) (7/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 11.0 in stage 0.0 (TID 11) in 14 ms on datanode02 (executor 3) (8/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 13.0 in stage 0.0 (TID 13, datanode02, executor 3, partition 13, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 14.0 in stage 0.0 (TID 14, datanode02, executor 3, partition 14, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 12.0 in stage 0.0 (TID 12) in 16 ms on datanode02 (executor 3) (9/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 15.0 in stage 0.0 (TID 15, datanode02, executor 3, partition 15, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 13.0 in stage 0.0 (TID 13) in 22 ms on datanode02 (executor 3) (10/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 16.0 in stage 0.0 (TID 16, datanode02, executor 3, partition 16, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 14.0 in stage 0.0 (TID 14) in 16 ms on datanode02 (executor 3) (11/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 17.0 in stage 0.0 (TID 17, datanode02, executor 3, partition 17, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 15.0 in stage 0.0 (TID 15) in 13 ms on datanode02 (executor 3) (12/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting 
task 18.0 in stage 0.0 (TID 18, datanode01, executor 2, partition 18, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 19.0 in stage 0.0 (TID 19, datanode01, executor 2, partition 19, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 5.0 in stage 0.0 (TID 5) in 787 ms on datanode01 (executor 2) (13/50) 19/08/13 19:53:26 INFO TaskSetManager: Finished task 2.0 in stage 0.0 (TID 2) in 789 ms on datanode01 (executor 2) (14/50) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 20.0 in stage 0.0 (TID 20, datanode03, executor 1, partition 20, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:26 INFO TaskSetManager: Starting task 21.0 in stage 0.0 (TID 21, datanode03, executor 1, partition 21, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 4.0 in stage 0.0 (TID 4) in 905 ms on datanode03 (executor 1) (15/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 907 ms on datanode03 (executor 1) (16/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 22.0 in stage 0.0 (TID 22, datanode02, executor 3, partition 22, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 23.0 in stage 0.0 (TID 23, datanode02, executor 3, partition 23, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 24.0 in stage 0.0 (TID 24, datanode01, executor 2, partition 24, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 18.0 in stage 0.0 (TID 18) in 124 ms on datanode01 (executor 2) (17/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 16.0 in stage 0.0 (TID 16) in 134 ms on datanode02 (executor 3) (18/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 25.0 in stage 0.0 (TID 25, datanode01, executor 2, partition 25, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 26.0 in stage 0.0 (TID 26, datanode03, executor 1, partition 26, PROCESS_LOCAL, 7831 bytes) 
19/08/13 19:53:27 INFO TaskSetManager: Finished task 17.0 in stage 0.0 (TID 17) in 134 ms on datanode02 (executor 3) (19/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 20.0 in stage 0.0 (TID 20) in 122 ms on datanode03 (executor 1) (20/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 27.0 in stage 0.0 (TID 27, datanode03, executor 1, partition 27, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 19.0 in stage 0.0 (TID 19) in 127 ms on datanode01 (executor 2) (21/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 21.0 in stage 0.0 (TID 21) in 123 ms on datanode03 (executor 1) (22/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 28.0 in stage 0.0 (TID 28, datanode02, executor 3, partition 28, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 29.0 in stage 0.0 (TID 29, datanode02, executor 3, partition 29, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 22.0 in stage 0.0 (TID 22) in 19 ms on datanode02 (executor 3) (23/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 23.0 in stage 0.0 (TID 23) in 18 ms on datanode02 (executor 3) (24/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 30.0 in stage 0.0 (TID 30, datanode01, executor 2, partition 30, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 31.0 in stage 0.0 (TID 31, datanode01, executor 2, partition 31, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 25.0 in stage 0.0 (TID 25) in 27 ms on datanode01 (executor 2) (25/50) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 24.0 in stage 0.0 (TID 24) in 29 ms on datanode01 (executor 2) (26/50) 19/08/13 19:53:27 INFO TaskSetManager: Starting task 32.0 in stage 0.0 (TID 32, datanode02, executor 3, partition 32, PROCESS_LOCAL, 7831 bytes) 19/08/13 19:53:27 INFO TaskSetManager: Finished task 29.0 in stage 0.0 (TID 29) in 16 ms on datanode02 (executor 3) (27/50) 19/08/13 
19:53:27 INFO TaskSetManager: Starting task 33.0 in stage 0.0 (TID 33, datanode03, executor 1, partition 33, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 26.0 in stage 0.0 (TID 26) in 30 ms on datanode03 (executor 1) (28/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 34.0 in stage 0.0 (TID 34, datanode02, executor 3, partition 34, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 28.0 in stage 0.0 (TID 28) in 21 ms on datanode02 (executor 3) (29/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 35.0 in stage 0.0 (TID 35, datanode03, executor 1, partition 35, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 27.0 in stage 0.0 (TID 27) in 32 ms on datanode03 (executor 1) (30/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 36.0 in stage 0.0 (TID 36, datanode02, executor 3, partition 36, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 32.0 in stage 0.0 (TID 32) in 11 ms on datanode02 (executor 3) (31/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 37.0 in stage 0.0 (TID 37, datanode01, executor 2, partition 37, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 30.0 in stage 0.0 (TID 30) in 18 ms on datanode01 (executor 2) (32/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 38.0 in stage 0.0 (TID 38, datanode01, executor 2, partition 38, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 31.0 in stage 0.0 (TID 31) in 20 ms on datanode01 (executor 2) (33/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 39.0 in stage 0.0 (TID 39, datanode03, executor 1, partition 39, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 33.0 in stage 0.0 (TID 33) in 17 ms on datanode03 (executor 1) (34/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 34.0 in stage 0.0 (TID 34) in 17 ms on datanode02 (executor 3) (35/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 40.0 in stage 0.0 (TID 40, datanode02, executor 3, partition 40, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 41.0 in stage 0.0 (TID 41, datanode03, executor 1, partition 41, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 35.0 in stage 0.0 (TID 35) in 17 ms on datanode03 (executor 1) (36/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 42.0 in stage 0.0 (TID 42, datanode02, executor 3, partition 42, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 36.0 in stage 0.0 (TID 36) in 16 ms on datanode02 (executor 3) (37/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 43.0 in stage 0.0 (TID 43, datanode01, executor 2, partition 43, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 37.0 in stage 0.0 (TID 37) in 16 ms on datanode01 (executor 2) (38/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 44.0 in stage 0.0 (TID 44, datanode02, executor 3, partition 44, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 45.0 in stage 0.0 (TID 45, datanode02, executor 3, partition 45, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 40.0 in stage 0.0 (TID 40) in 14 ms on datanode02 (executor 3) (39/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 42.0 in stage 0.0 (TID 42) in 11 ms on datanode02 (executor 3) (40/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 46.0 in stage 0.0 (TID 46, datanode03, executor 1, partition 46, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 39.0 in stage 0.0 (TID 39) in 20 ms on datanode03 (executor 1) (41/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 47.0 in stage 0.0 (TID 47, datanode03, executor 1, partition 47, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 41.0 in stage 0.0 (TID 41) in 20 ms on datanode03 (executor 1) (42/50)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 48.0 in stage 0.0 (TID 48, datanode01, executor 2, partition 48, PROCESS_LOCAL, 7831 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 49.0 in stage 0.0 (TID 49, datanode01, executor 2, partition 49, PROCESS_LOCAL, 7888 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 43.0 in stage 0.0 (TID 43) in 18 ms on datanode01 (executor 2) (43/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 38.0 in stage 0.0 (TID 38) in 31 ms on datanode01 (executor 2) (44/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 45.0 in stage 0.0 (TID 45) in 11 ms on datanode02 (executor 3) (45/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 44.0 in stage 0.0 (TID 44) in 16 ms on datanode02 (executor 3) (46/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 46.0 in stage 0.0 (TID 46) in 18 ms on datanode03 (executor 1) (47/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 48.0 in stage 0.0 (TID 48) in 15 ms on datanode01 (executor 2) (48/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 47.0 in stage 0.0 (TID 47) in 15 ms on datanode03 (executor 1) (49/50)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 49.0 in stage 0.0 (TID 49) in 25 ms on datanode01 (executor 2) (50/50)
19/08/13 19:53:27 INFO YarnClusterScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
19/08/13 19:53:27 INFO DAGScheduler: ShuffleMapStage 0 (start at VoiceApplication2.java:128) finished in 1.174 s
19/08/13 19:53:27 INFO DAGScheduler: looking for newly runnable stages
19/08/13 19:53:27 INFO DAGScheduler: running: Set()
19/08/13 19:53:27 INFO DAGScheduler: waiting: Set(ResultStage 1)
19/08/13 19:53:27 INFO DAGScheduler: failed: Set()
19/08/13 19:53:27 INFO DAGScheduler: Submitting ResultStage 1 (ShuffledRDD[2] at start at VoiceApplication2.java:128), which has no missing parents
19/08/13 19:53:27 INFO MemoryStore: Block broadcast_1 stored as values in memory (estimated size 3.2 KB, free 366.3 MB)
19/08/13 19:53:27 INFO MemoryStore: Block broadcast_1_piece0 stored as bytes in memory (estimated size 1979.0 B, free 366.3 MB)
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode02:31984 (size: 1979.0 B, free: 366.3 MB)
19/08/13 19:53:27 INFO SparkContext: Created broadcast 1 from broadcast at DAGScheduler.scala:1039
19/08/13 19:53:27 INFO DAGScheduler: Submitting 20 missing tasks from ResultStage 1 (ShuffledRDD[2] at start at VoiceApplication2.java:128) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
19/08/13 19:53:27 INFO YarnClusterScheduler: Adding task set 1.0 with 20 tasks
19/08/13 19:53:27 INFO TaskSetManager: Starting task 0.0 in stage 1.0 (TID 50, datanode03, executor 1, partition 0, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 1.0 in stage 1.0 (TID 51, datanode02, executor 3, partition 1, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 3.0 in stage 1.0 (TID 52, datanode01, executor 2, partition 3, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 2.0 in stage 1.0 (TID 53, datanode03, executor 1, partition 2, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 4.0 in stage 1.0 (TID 54, datanode02, executor 3, partition 4, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 5.0 in stage 1.0 (TID 55, datanode01, executor 2, partition 5, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode02:28863 (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode01:20487 (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on datanode03:3328 (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.229.163:24656
19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.198.144:41122
19/08/13 19:53:27 INFO MapOutputTrackerMasterEndpoint: Asked to send map output locations for shuffle 0 to 10.1.229.158:64276
19/08/13 19:53:27 INFO TaskSetManager: Starting task 7.0 in stage 1.0 (TID 56, datanode03, executor 1, partition 7, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 2.0 in stage 1.0 (TID 53) in 192 ms on datanode03 (executor 1) (1/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 8.0 in stage 1.0 (TID 57, datanode03, executor 1, partition 8, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 7.0 in stage 1.0 (TID 56) in 25 ms on datanode03 (executor 1) (2/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 6.0 in stage 1.0 (TID 58, datanode02, executor 3, partition 6, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 1.0 in stage 1.0 (TID 51) in 220 ms on datanode02 (executor 3) (3/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 14.0 in stage 1.0 (TID 59, datanode03, executor 1, partition 14, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 8.0 in stage 1.0 (TID 57) in 17 ms on datanode03 (executor 1) (4/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 16.0 in stage 1.0 (TID 60, datanode03, executor 1, partition 16, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 14.0 in stage 1.0 (TID 59) in 15 ms on datanode03 (executor 1) (5/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 16.0 in stage 1.0 (TID 60) in 21 ms on datanode03 (executor 1) (6/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 9.0 in stage 1.0 (TID 61, datanode02, executor 3, partition 9, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 4.0 in stage 1.0 (TID 54) in 269 ms on datanode02 (executor 3) (7/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 0.0 in stage 1.0 (TID 50) in 339 ms on datanode03 (executor 1) (8/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 10.0 in stage 1.0 (TID 62, datanode02, executor 3, partition 10, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 6.0 in stage 1.0 (TID 58) in 56 ms on datanode02 (executor 3) (9/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 11.0 in stage 1.0 (TID 63, datanode01, executor 2, partition 11, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 5.0 in stage 1.0 (TID 55) in 284 ms on datanode01 (executor 2) (10/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 12.0 in stage 1.0 (TID 64, datanode01, executor 2, partition 12, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 3.0 in stage 1.0 (TID 52) in 287 ms on datanode01 (executor 2) (11/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 13.0 in stage 1.0 (TID 65, datanode02, executor 3, partition 13, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 15.0 in stage 1.0 (TID 66, datanode02, executor 3, partition 15, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 10.0 in stage 1.0 (TID 62) in 25 ms on datanode02 (executor 3) (12/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 9.0 in stage 1.0 (TID 61) in 29 ms on datanode02 (executor 3) (13/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 17.0 in stage 1.0 (TID 67, datanode02, executor 3, partition 17, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 15.0 in stage 1.0 (TID 66) in 13 ms on datanode02 (executor 3) (14/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 13.0 in stage 1.0 (TID 65) in 16 ms on datanode02 (executor 3) (15/20)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 18.0 in stage 1.0 (TID 68, datanode02, executor 3, partition 18, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Starting task 19.0 in stage 1.0 (TID 69, datanode01, executor 2, partition 19, NODE_LOCAL, 7638 bytes)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 11.0 in stage 1.0 (TID 63) in 30 ms on datanode01 (executor 2) (16/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 12.0 in stage 1.0 (TID 64) in 30 ms on datanode01 (executor 2) (17/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 17.0 in stage 1.0 (TID 67) in 17 ms on datanode02 (executor 3) (18/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 19.0 in stage 1.0 (TID 69) in 13 ms on datanode01 (executor 2) (19/20)
19/08/13 19:53:27 INFO TaskSetManager: Finished task 18.0 in stage 1.0 (TID 68) in 20 ms on datanode02 (executor 3) (20/20)
19/08/13 19:53:27 INFO YarnClusterScheduler: Removed TaskSet 1.0, whose tasks have all completed, from pool
19/08/13 19:53:27 INFO DAGScheduler: ResultStage 1 (start at VoiceApplication2.java:128) finished in 0.406 s
19/08/13 19:53:27 INFO DAGScheduler: Job 0 finished: start at VoiceApplication2.java:128, took 1.850883 s
19/08/13 19:53:27 INFO ReceiverTracker: Starting 1 receivers
19/08/13 19:53:27 INFO ReceiverTracker: ReceiverTracker started
19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81
19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@4044ec97
19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81
19/08/13 19:53:27 INFO MappedDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO MappedDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO MappedDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO MappedDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5dd4b960
19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@132d0c3c
19/08/13 19:53:27 INFO KafkaInputDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO KafkaInputDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO KafkaInputDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO KafkaInputDStream: Initialized and validated org.apache.spark.streaming.kafka.KafkaInputDStream@5fd3dc81
19/08/13 19:53:27 INFO MappedDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO MappedDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO MappedDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO MappedDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO MappedDStream: Initialized and validated org.apache.spark.streaming.dstream.MappedDStream@5dd4b960
19/08/13 19:53:27 INFO ForEachDStream: Slide time = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Storage level = Serialized 1x Replicated
19/08/13 19:53:27 INFO ForEachDStream: Checkpoint interval = null
19/08/13 19:53:27 INFO ForEachDStream: Remember interval = 60000 ms
19/08/13 19:53:27 INFO ForEachDStream: Initialized and validated org.apache.spark.streaming.dstream.ForEachDStream@525bed0c
19/08/13 19:53:27 INFO DAGScheduler: Got job 1 (start at VoiceApplication2.java:128) with 1 output partitions
19/08/13 19:53:27 INFO DAGScheduler: Final stage: ResultStage 2 (start at VoiceApplication2.java:128)
19/08/13 19:53:27 INFO DAGScheduler: Parents of final stage: List()
19/08/13 19:53:27 INFO DAGScheduler: Missing parents: List()
19/08/13 19:53:27 INFO DAGScheduler: Submitting ResultStage 2 (Receiver 0 ParallelCollectionRDD[3] at makeRDD at ReceiverTracker.scala:613), which has no missing parents
19/08/13 19:53:27 INFO ReceiverTracker: Receiver 0 started
19/08/13 19:53:27 INFO MemoryStore: Block broadcast_2 stored as values in memory (estimated size 133.5 KB, free 366.2 MB)
19/08/13 19:53:27 INFO MemoryStore: Block broadcast_2_piece0 stored as bytes in memory (estimated size 36.3 KB, free 366.1 MB)
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on datanode02:31984 (size: 36.3 KB, free: 366.3 MB)
19/08/13 19:53:27 INFO SparkContext: Created broadcast 2 from broadcast at DAGScheduler.scala:1039
19/08/13 19:53:27 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 2 (Receiver 0 ParallelCollectionRDD[3] at makeRDD at ReceiverTracker.scala:613) (first 15 tasks are for partitions Vector(0))
19/08/13 19:53:27 INFO YarnClusterScheduler: Adding task set 2.0 with 1 tasks
19/08/13 19:53:27 INFO TaskSetManager: Starting task 0.0 in stage 2.0 (TID 70, datanode01, executor 2, partition 0, PROCESS_LOCAL, 8757 bytes)
19/08/13 19:53:27 INFO RecurringTimer: Started timer for JobGenerator at time 1565697240000
19/08/13 19:53:27 INFO JobGenerator: Started JobGenerator at 1565697240000 ms
19/08/13 19:53:27 INFO JobScheduler: Started JobScheduler
19/08/13 19:53:27 INFO StreamingContext: StreamingContext started
19/08/13 19:53:27 INFO BlockManagerInfo: Added broadcast_2_piece0 in memory on datanode01:20487 (size: 36.3 KB, free: 2.5 GB)
19/08/13 19:53:27 INFO ReceiverTracker: Registered receiver for stream 0 from 10.1.229.158:64276
19/08/13 19:54:00 INFO JobScheduler: Added jobs for time 1565697240000 ms
19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.0 from job set of time 1565697240000 ms
19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.1 from job set of time 1565697240000 ms
19/08/13 19:54:00 INFO JobScheduler: Finished job streaming job 1565697240000 ms.1 from job set of time 1565697240000 ms
19/08/13 19:54:00 INFO JobScheduler: Finished job streaming job 1565697240000 ms.0 from job set of time 1565697240000 ms
19/08/13 19:54:00 INFO JobScheduler: Starting job streaming job 1565697240000 ms.2 from job set of time 1565697240000 ms
19/08/13 19:54:00 INFO SharedState: loading hive config file: file:/data01/hadoop/yarn/local/usercache/hdfs/filecache/85431/__spark_conf__.zip/__hadoop_conf__/hive-site.xml
19/08/13 19:54:00 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('hdfs://CID-042fb939-95b4-4b74-91b8-9f94b999bdf7/apps/hive/warehouse').
19/08/13 19:54:00 INFO SharedState: Warehouse path is 'hdfs://CID-042fb939-95b4-4b74-91b8-9f94b999bdf7/apps/hive/warehouse'.
19/08/13 19:54:00 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode02:31984 in memory (size: 1979.0 B, free: 366.3 MB)
19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode02:28863 in memory (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode01:20487 in memory (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:54:00 INFO BlockManagerInfo: Removed broadcast_1_piece0 on datanode03:3328 in memory (size: 1979.0 B, free: 2.5 GB)
19/08/13 19:54:02 INFO CodeGenerator: Code generated in 175.416957 ms
19/08/13 19:54:02 INFO JobScheduler: Finished job streaming job 1565697240000 ms.2 from job set of time 1565697240000 ms
19/08/13 19:54:02 ERROR JobScheduler: Error running job streaming job 1565697240000 ms.2
org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found;
    at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40)
    at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
19/08/13 19:54:02 ERROR ApplicationMaster: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found;
org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found;
    at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40)
    at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
19/08/13 19:54:02 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'meta_voice' not found;
    at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:40)
    at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:331)
    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:388)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:398)
    at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:393)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:122)
    at com.stream.VoiceApplication2$2.call(VoiceApplication2.java:115)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.api.java.JavaDStreamLike$$anonfun$foreachRDD$2.apply(JavaDStreamLike.scala:280)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
    at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
    at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
)
19/08/13 19:54:02 INFO StreamingContext: Invoking stop(stopGracefully=true) from shutdown hook
19/08/13 19:54:02 INFO ReceiverTracker: Sent stop signal to all 1 receivers
19/08/13 19:54:02 ERROR ReceiverTracker: Deregistered receiver for stream 0: Stopped by driver
19/08/13 19:54:02 INFO TaskSetManager: Finished task 0.0 in stage 2.0 (TID 70) in 35055 ms on datanode01 (executor 2) (1/1)
19/08/13 19:54:02 INFO YarnClusterScheduler: Removed TaskSet 2.0, whose tasks have all completed, from pool
19/08/13 19:54:02 INFO DAGScheduler: ResultStage 2 (start at VoiceApplication2.java:128) finished in 35.086 s
19/08/13 19:54:02 INFO ReceiverTracker: Waiting for receiver job to terminate gracefully
19/08/13 19:54:02 INFO ReceiverTracker: Waited for receiver job to terminate gracefully
19/08/13 19:54:02 INFO ReceiverTracker: All of the receivers have deregistered successfully
19/08/13 19:54:02 INFO ReceiverTracker: ReceiverTracker stopped
19/08/13 19:54:02 INFO JobGenerator: Stopping JobGenerator gracefully
19/08/13 19:54:02 INFO JobGenerator: Waiting for all received blocks to be consumed for job generation
19/08/13 19:54:02 INFO JobGenerator: Waited for all received blocks to be consumed for job generation
19/08/13 19:54:12 WARN ShutdownHookManager: ShutdownHook '$anon$2' timeout, java.util.concurrent.TimeoutException
java.util.concurrent.TimeoutException
    at java.util.concurrent.FutureTask.get(FutureTask.java:205)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:67)
19/08/13 19:54:12 ERROR Utils: Uncaught exception in thread pool-1-thread-1
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.Thread.join(Thread.java:1252)
    at java.lang.Thread.join(Thread.java:1326)
    at org.apache.spark.streaming.util.RecurringTimer.stop(RecurringTimer.scala:86)
    at org.apache.spark.streaming.scheduler.JobGenerator.stop(JobGenerator.scala:137)
    at org.apache.spark.streaming.scheduler.JobScheduler.stop(JobScheduler.scala:123)
    at org.apache.spark.streaming.StreamingContext$$anonfun$stop$1.apply$mcV$sp(StreamingContext.scala:681)
    at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1357)
    at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:680)
    at org.apache.spark.streaming.StreamingContext.org$apache$spark$streaming$StreamingContext$$stopOnShutdown(StreamingContext.scala:714)
    at org.apache.spark.streaming.StreamingContext$$anonfun$start$1.apply$mcV$sp(StreamingContext.scala:599)
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1988)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
    at scala.util.Try$.apply(Try.scala:192)
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
```
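The failing call in the log above is `saveAsTable` at `VoiceApplication2.java:122`, and the stack trace goes through `InMemoryCatalog.tableExists`, i.e. the driver is resolving `meta_voice` against Spark's in-memory catalog rather than the Hive metastore. One common cause is building the `SparkSession` without Hive support. A minimal sketch of enabling it (the `buildSession` helper is hypothetical; only `enableHiveSupport()` is the point):

```java
import org.apache.spark.sql.SparkSession;

public final class SessionFactory {

    // Hypothetical helper; the real application builds its session
    // inside VoiceApplication2.
    static SparkSession buildSession() {
        return SparkSession.builder()
                .appName("VoiceApplication2")
                // Resolve databases/tables through the Hive metastore
                // (hive-site.xml must be on the classpath) instead of the
                // InMemoryCatalog seen in the stack trace.
                .enableHiveSupport()
                .getOrCreate();
    }
}
```

With Hive support enabled, `df.write().saveAsTable("meta_voice.some_table")` can see databases that exist in the metastore; without it, only databases created in the current in-memory catalog are visible, which matches the `NoSuchDatabaseException` here.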

Multiple databases in Symfony2

<div class="post-text" itemprop="text"> <p>I have an application with two database connections. First database is application's own database and second database is from another application (same server, different subdomain).</p> <p>For SQL queries on the second database I use services.</p> <p><kbd><strong>services.yml</strong></kbd></p> <pre><code>login.company:
    class: JP\CoreBundle\Service\Login\CompanyService
    arguments:
        em: "@doctrine.orm.login_entity_manager"
</code></pre> <p><kbd><strong>CompanyService.php</strong></kbd></p> <pre><code>class CompanyService
{
    private $_em;

    public function __construct(EntityManager $em){
        $this-&gt;_em = $em;
    }

    public function getSomethingBySomething($something){ // Intentionally obscured method and parameter name
        $conn = $this-&gt;_em-&gt;getConnection();
        $sql = ''; // Intentionally removed SQL query
        $stmt = $conn-&gt;prepare($sql);
        $stmt-&gt;bindParam('something', $something); // Intentionally obscured real name to something
        $stmt-&gt;execute();
        return ($stmt-&gt;rowCount() == 1) ? $stmt-&gt;fetch(Query::HYDRATE_ARRAY) : false;
    }
}
</code></pre> <p><kbd><strong>config.yml</strong></kbd></p> <pre><code>doctrine:
    dbal:
        default_connection: analysis
        connections:
            analysis:
                driver:   "%database_driver1%"
                host:     "%database_host1%"
                port:     "%database_port1%"
                dbname:   "%database_name1%"
                user:     "%database_user1%"
                password: "%database_password1%"
                charset:  UTF8
            login:
                driver:   "%database_driver2%"
                host:     "%database_host2%"
                port:     "%database_port2%"
                dbname:   "%database_name2%"
                user:     "%database_user2%"
                password: "%database_password2%"
                charset:  UTF8
    orm:
        default_entity_manager: analysis
        entity_managers:
            analysis:
                connection: analysis
                mappings:
                    JPCoreBundle: ~
            login:
                connection: login
        auto_generate_proxy_classes: "%kernel.debug%"
</code></pre> <p><strong>Question:</strong> Is this correct use of multiple databases in Symfony2 environment? Could it be improved somehow?</p> </div>
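For reference, the `login` entity manager that the question's `config.yml` registers can also be obtained straight from the Doctrine registry, without a dedicated service. A sketch (the controller and action names are hypothetical):

```php
<?php
// Hypothetical controller action in a Symfony2 controller class:
// fetch the non-default entity manager registered as "login".
public function someAction()
{
    $loginEm = $this->getDoctrine()->getManager('login');
    $conn = $loginEm->getConnection();
    // ... run prepared statements against the second database ...
}
```

Injecting `@doctrine.orm.login_entity_manager` into a service, as the question does, is the more testable variant of the same thing; both resolve to the same entity manager instance.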

Analyzing a WebLogic OutOfMemory (javacore) log

软件环境: aix 6.1+weblogic10.3+oracle11 系统每天同时在线人数100左右,数据库的数据量有多张频繁操作的表数据记录在千万以上,主要后台自动处理线程过多。 系统前段时间运行一直正常,只是从7月底出现OOM,最近差不多半个月出现一次。 对weblogic产生的phd文件和javacore文件分析的不太清楚。。。 以下是javacore的日志: ***WARNING*** Java heap is almost exhausted : 0% free Java heap Please enable verbosegc trace and use IBM Pattern Modeling and Analysis Tool(http://www.alphaworks.ibm.com/tech/pmat) to analyze garbage collection activities. If heapdumps are generated at the same time, please use IBM HeapAnalyzer(http://www.alphaworks.ibm.com/tech/heapanalyzer) to analyze Java heap. File name : C:\Users\Wcy\Desktop\20140813\ebills\heapdump\javacore.20140813.091552.10551312.0007.txt Cause of thread dump : Dump Event "systhrow" (00040000) Detail "java/lang/OutOfMemoryError" received Date: 2014/08/13 at 09:16:49 Process ID : Not available Operating System : AIX 6.1 Processor Architecture : ppc Number of Processors : 12 Java version : JRE 1.6.0 IBM J9 2.4 AIX ppc-32 build jvmap3260sr9-20110624_85526 Virtual machine version : VM build 20110624_085526 Just-In-Time(JIT) compiler switch, Ahead-Of-Time (AOT) compiler switch, Compiler version : JIT enabled, AOT enabled - r9_20101028_17488ifx17 Garbage collector version : GC - 20101027_AA Java Heap Information Maximum Java heap size : 1024m Initial Java heap size : 512m Java Home Directory : /usr/java6/jre Java DLL Directory : /usr/java6/jre/bin System Classpath : 
/usr/java6/jre/lib/vm.jar;/usr/java6/jre/lib/annotation.jar;/usr/java6/jre/lib/beans.jar;/usr/java6/jre/lib/java.util.jar;/usr/java6/jre/lib/jndi.jar;/usr/java6/jre/lib/logging.jar;/usr/java6/jre/lib/security.jar;/usr/java6/jre/lib/sql.jar;/usr/java6/jre/lib/ibmorb.jar;/usr/java6/jre/lib/ibmorbapi.jar;/usr/java6/jre/lib/ibmcfw.jar;/usr/java6/jre/lib/rt.jar;/usr/java6/jre/lib/charsets.jar;/usr/java6/jre/lib/resources.jar;/usr/java6/jre/lib/ibmpkcs.jar;/usr/java6/jre/lib/ibmcertpathfw.jar;/usr/java6/jre/lib/ibmjgssfw.jar;/usr/java6/jre/lib/ibmjssefw.jar;/usr/java6/jre/lib/ibmsaslfw.jar;/usr/java6/jre/lib/ibmjcefw.jar;/usr/java6/jre/lib/ibmjgssprovider.jar;/usr/java6/jre/lib/ibmjsseprovider2.jar;/usr/java6/jre/lib/ibmcertpathprovider.jar;/usr/java6/jre/lib/ibmxmlcrypto.jar;/usr/java6/jre/lib/management-agent.jar;/usr/java6/jre/lib/xml.jar;/usr/java6/jre/lib/jlm.jar;/usr/java6/jre/lib/javascript.jar;
User Arguments :
-Xjcl:jclscar_24
-Dcom.ibm.oti.vm.bootstrap.library.path=/usr/java6/jre/lib/ppc
-Dsun.boot.library.path=/usr/java6/jre/lib/ppc
-Djava.library.path=/usr/java6/jre/lib/ppc:/usr/java6/jre/lib/ppc:/usr/java6/jre/lib/ppc/j9vm:/usr/java6/jre/lib/ppc/j9vm:/usr/java6/jre/lib/ppc:/usr/java6/jre/../lib/ppc::/home/ebills/bea/wlserver_10.3/server/native/aix/ppc:/usr/lib:/usr/lib
-Djava.home=/usr/java6/jre
-Djava.ext.dirs=/usr/java6/jre/lib/ext
-Duser.dir=/weblogic/ebills/bea/user_projects/domains/nbdomain
_j2se_j9=71168 0xF0A89414
-Djava.runtime.version=pap3260sr9fp2-20110627_03 (SR9 FP2)
-Xdump
-Djava.class.path=:/home/ebills/bea/patch_wls1030/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/home/ebills/bea/patch_cie660/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/usr/java6/lib/tools.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic_sp.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic.jar:/home/ebills/bea/modules/features/weblogic.server.modules_10.3.0.0.jar:/home/ebills/bea/wlserver_10.3/server/lib/webservices.jar:/home/ebills/bea/modules/org.apache.ant_1.6.5/lib/ant-all.jar:/home/ebills/bea/modules/net.sf.antcontrib_1.0.0.0_1-0b2/lib/ant-contrib.jar::/home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbclient57.jar:/home/ebills/bea/wlserver_10.3/server/lib/xqrl.jar::
-Xms512m
-Xmx1024m
-da
-Dplatform.home=/home/ebills/bea/wlserver_10.3
-Dwls.home=/home/ebills/bea/wlserver_10.3/server
-Dweblogic.home=/home/ebills/bea/wlserver_10.3/server
-Dweblogic.management.discover=true
-Dwlw.iterativeDev=false
-Dwlw.testConsole=false
-Dwlw.logErrorsToConsole=
-Dclient.encoding.override=GBK
-Dfile.encoding=GBK
-Duser.language=zh
-Duser.region=CN
-Ddefault.client.encoding=GBK
-Dweblogic.threadpool.MinPoolSize=200
-Dweblogic.threadpool.MaxPoolSize=500
-Djava.awt.headless=true
-Dweblogic.ext.dirs=/home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath:/home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath
-Dweblogic.management.username=weblogic
-Dweblogic.management.password=weblogic
-Dweblogic.Name=AdminServer
-Djava.security.policy=/home/ebills/bea/wlserver_10.3/server/lib/weblogic.policy
-Dsun.java.command=weblogic.Server
-Dsun.java.launcher=SUN_STANDARD
_port_library 0xF0A89C18
_org.apache.harmony.vmi.portlib 0x3013DE18
User Limit Analysis
Type           Soft Limit            Hard Limit
RLIMIT_AS      unlimited             unlimited
RLIMIT_CORE    unlimited             unlimited
RLIMIT_CPU     unlimited             unlimited
RLIMIT_DATA    2,147,483,645 bytes   unlimited
RLIMIT_FSIZE   unlimited             unlimited
RLIMIT_NOFILE  1,024                 1,024
RLIMIT_RSS     unlimited             unlimited
RLIMIT_STACK   2,147,483,646 bytes   2,147,483,646 bytes
Environment Variables Analysis
Environment Variable    Value
_ /usr/java6/bin/java
POST_CLASSPATH :/home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbclient57.jar:/home/ebills/bea/wlserver_10.3/server/lib/xqrl.jar
CLUSTER_PROPERTIES -Dweblogic.management.discover=true
JAVA_VENDOR IBM
LANG Zh_CN
PRODUCTION_MODE true
DOMAIN_HOME /weblogic/ebills/bea/user_projects/domains/nbdomain
LOGIN ebills
CLASSPATHSEP :
DATABASE_CLASSPATH /home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbclient57.jar
CIE660_PATCH_LIBPATH /home/ebills/bea/patch_cie660/profiles/default/native
WLS1030_PATCH_EXT_DIR /home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath
POINTBASE_HOME /home/ebills/bea/wlserver_10.3/common/eval/pointbase
SSH_TTY /dev/pts/0
debugFlag false
MEM_MAX_PERM_SIZE -XX:MaxPermSize=1024m
SUN_JAVA_HOME
CLCMD_PASSTHRU 1
PATCH_CLASSPATH /home/ebills/bea/patch_wls1030/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/home/ebills/bea/patch_cie660/profiles/default/sys_manifest_classpath/weblogic_patch.jar
PATH /home/ebills/bea/wlserver_10.3/server/bin:/home/ebills/bea/modules/org.apache.ant_1.6.5/bin:/usr/java6/jre/bin:/usr/java6/bin:/usr/bin:/etc:/usr/sbin:/usr/ucb:/home/ebills/bin:/usr/bin/X11:/sbin:/usr/java6/bin:/usr/java6/jre/bin:.
FEATURES_DIR /home/ebills/bea/modules/features
CIE660_PATCH_PATH /home/ebills/bea/patch_cie660/profiles/default/native
verboseLoggingFlag false
ANT_CONTRIB /home/ebills/bea/modules/net.sf.antcontrib_1.0.0.0_1-0b2
PATCH_PATH /home/ebills/bea/patch_wls1030/profiles/default/native:/home/ebills/bea/patch_cie660/profiles/default/native
BEA_JAVA_HOME
JAVA_VM
ARDIR /home/ebills/bea/wlserver_10.3/server/lib
LC__FASTMSG true
POINTBASE_CLASSPATH :/home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbembedded57.jar:/home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbclient57.jar
SSH_AUTH_SOCK /tmp/ssh-IbX9437420/agent.9437420
CIE660_PATCH_CLASSPATH /home/ebills/bea/patch_cie660/profiles/default/sys_manifest_classpath/weblogic_patch.jar
JAVA_PROFILE
JAVA_DEBUG
CLASSPATH :/home/ebills/bea/patch_wls1030/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/home/ebills/bea/patch_cie660/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/usr/java6/lib/tools.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic_sp.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic.jar:/home/ebills/bea/modules/features/weblogic.server.modules_10.3.0.0.jar:/home/ebills/bea/wlserver_10.3/server/lib/webservices.jar:/home/ebills/bea/modules/org.apache.ant_1.6.5/lib/ant-all.jar:/home/ebills/bea/modules/net.sf.antcontrib_1.0.0.0_1-0b2/lib/ant-contrib.jar::/home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbclient57.jar:/home/ebills/bea/wlserver_10.3/server/lib/xqrl.jar::
LOGNAME ebills
SAMPLES_HOME /home/ebills/bea/wlserver_10.3/samples
MAIL /usr/spool/mail/ebills
enableHotswapFlag
POINTBASE_TOOLS /home/ebills/bea/wlserver_10.3/common/eval/pointbase/lib/pbtools57.jar
LOCPATH /usr/lib/nls/loc
MODULES_DIR /home/ebills/bea/modules
PATCH_LIBPATH /home/ebills/bea/patch_wls1030/profiles/default/native:/home/ebills/bea/patch_cie660/profiles/default/native
PATHSEP :
WLS1030_PATCH_PATH /home/ebills/bea/patch_wls1030/profiles/default/native
WLS1030_PATCH_CLASSPATH /home/ebills/bea/patch_wls1030/profiles/default/sys_manifest_classpath/weblogic_patch.jar
iterativeDevFlag false
doExitFlag false
USER ebills
SERVER_NAME AdminServer
AUTHSTATE compat
SERVER_CLASS weblogic.Server
PRE_CLASSPATH
JAVA_PROPERTIES -Dplatform.home=/home/ebills/bea/wlserver_10.3 -Dwls.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.management.discover=true
BEA_HOME /home/ebills/bea
MEM_DEV_ARGS
SHELL /usr/bin/ksh
ODMDIR /etc/objrepos
OMNIORB_CONFIG /home/gjyw/ebills/config/ecorba.cfg
JAVA_HOME /usr/java6
testConsoleFlag false
WLS_HOME /home/ebills/bea/wlserver_10.3/server
WLS1030_PATCH_LIBPATH /home/ebills/bea/patch_wls1030/profiles/default/native
CIE660_PATCH_EXT_DIR /home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath
JAVA_OPTIONS -da -Dplatform.home=/home/ebills/bea/wlserver_10.3 -Dwls.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.management.discover=true -Dwlw.iterativeDev=false -Dwlw.testConsole=false -Dwlw.logErrorsToConsole= -Dclient.encoding.override=GBK -Dfile.encoding=GBK -Duser.language=zh -Duser.region=CN -Ddefault.client.encoding=GBK -Dweblogic.threadpool.MinPoolSize=200 -Dweblogic.threadpool.MaxPoolSize=500 -Djava.awt.headless=true -Dweblogic.ext.dirs=/home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath:/home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath -Dweblogic.management.username=weblogic -Dweblogic.management.password=weblogic
HOME /home/ebills
MEM_ARGS -Xms512m -Xmx1024m
WEBLOGIC_EXTENSION_DIRS /home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath:/home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath
ANT_HOME /home/ebills/bea/modules/org.apache.ant_1.6.5
FILEDIR /home/gjyw/ebills/acct
SSH_CONNECTION 10.100.67.22 3456 10.100.133.2 22
SSH_CLIENT 10.100.67.22 3456 22
WEBLOGIC_CLASSPATH /home/ebills/bea/patch_wls1030/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/home/ebills/bea/patch_cie660/profiles/default/sys_manifest_classpath/weblogic_patch.jar:/usr/java6/lib/tools.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic_sp.jar:/home/ebills/bea/wlserver_10.3/server/lib/weblogic.jar:/home/ebills/bea/modules/features/weblogic.server.modules_10.3.0.0.jar:/home/ebills/bea/wlserver_10.3/server/lib/webservices.jar:/home/ebills/bea/modules/org.apache.ant_1.6.5/lib/ant-all.jar:/home/ebills/bea/modules/net.sf.antcontrib_1.0.0.0_1-0b2/lib/ant-contrib.jar
TERM vt100
MAILMSG [YOU HAVE NEW MAIL]
POINTBASE_FLAG false
LONG_DOMAIN_HOME /weblogic/ebills/bea/user_projects/domains/nbdomain
PWD /weblogic/ebills/bea/user_projects/domains/nbdomain
TZ Asia/Shanghai
MEM_PERM_SIZE -XX:PermSize=512m
BZJFILEDIR /weblogic/ebills/gjyw/ebills/ebillsData/DepositAcct
WL_HOME /home/ebills/bea/wlserver_10.3
DEBUG_PORT 8453
A__z ! LOGNAME
IBM_JVM_AIXTHREAD_SCOPE_NEW_VALUE S
AIXTHREAD_SCOPE S
IBM_JVM_CHANGED_ENVVARS_10551312 AIXTHREAD_SCOPE,NULLPTR,CORE_MMAP,LDR_CNTRL
IBM_JVM_NULLPTR_NEW_VALUE NOSEGV
NULLPTR NOSEGV
IBM_JVM_CORE_MMAP_NEW_VALUE yes
CORE_MMAP yes
IBM_JVM_LDR_CNTRL_NEW_VALUE MAXDATA=0XA0000000@DSA
LDR_CNTRL MAXDATA=0XA0000000@DSA
NLSPATH /usr/lib/nls/msg/%L/%N:/usr/lib/nls/msg/%L/%N.cat
LIBPATH /usr/java6/jre/lib/ppc:/usr/java6/jre/lib/ppc/j9vm:/usr/java6/jre/lib/ppc/j9vm:/usr/java6/jre/lib/ppc:/usr/java6/jre/../lib/ppc::/home/ebills/bea/wlserver_10.3/server/native/aix/ppc:/usr/lib:/usr/java6/jre/lib/ppc/headless
IBM_JAVA_COMMAND_LINE /usr/java6/bin/java -Xms512m -Xmx1024m -da -Dplatform.home=/home/ebills/bea/wlserver_10.3 -Dwls.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.management.discover=true -Dwlw.iterativeDev=false -Dwlw.testConsole=false -Dwlw.logErrorsToConsole= -Dclient.encoding.override=GBK -Dfile.encoding=GBK -Duser.language=zh -Duser.region=CN -Ddefault.client.encoding=GBK
-Dweblogic.threadpool.MinPoolSize=200 -Dweblogic.threadpool.MaxPoolSize=500 -Djava.awt.headless=true -Dweblogic.ext.dirs=/home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath:/home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath -Dweblogic.management.username=weblogic -Dweblogic.management.password=weblogic -Dweblogic.Name=AdminServer -Djava.security.policy=/home/ebills/bea/wlserver_10.3/server/lib/weblogic.policy weblogic.Server
Free Java heap size: 11.89 KB
Allocated Java heap size: 1 GB
Memory Segment Analysis
Memory Type       # of Segments   Used Memory(bytes)   Used Memory(%)   Free Memory(bytes)   Free Memory(%)   Total Memory(bytes)
Internal          431             0                    0                28,332,996           100              28,332,996
Object(reserved)  1               1,073,741,824        100              0                    0                1,073,741,824
Class             17,984          174,683,076          96.44            6,441,464            3.56             181,124,540
JIT Code Cache    7               58,720,256           100              0                    0                58,720,256
JIT Data Cache    4               26,811,512           79.9             6,742,920            20.1             33,554,432
Overall           18,427          1,333,956,668        96.98            41,517,380           3.02             1,375,474,048
Current Thread :
Thread Name : [ACTIVE] ExecuteThread: '119' for queue: 'weblogic.kernel.Default (self-tuning)'
State : Runnable
Java Stack :
at java/lang/String.<init>(String.java:350(Compiled Code))
at java/lang/Throwable.printStackTrace(Throwable.java:369(Compiled Code))
at java/lang/Throwable.printStackTrace(Throwable.java:212(Compiled Code))
at weblogic/utils/StackTraceUtilsClient.throwable2StackTrace(StackTraceUtilsClient.java:25(Compiled Code))
at weblogic/jdbc/common/internal/ConnectionEnv.setup(ConnectionEnv.java:308(Compiled Code))
at weblogic/common/resourcepool/ResourcePoolImpl.reserveResource(ResourcePoolImpl.java:303(Compiled Code))
at weblogic/jdbc/common/internal/ConnectionPool.reserve(ConnectionPool.java:427(Compiled Code))
at weblogic/jdbc/common/internal/ConnectionPool.reserve(ConnectionPool.java:316(Compiled Code))
at weblogic/jdbc/common/internal/ConnectionPoolManager.reserve(ConnectionPoolManager.java:85(Compiled Code))
at weblogic/jdbc/common/internal/ConnectionPoolManager.reserve(ConnectionPoolManager.java:61(Compiled Code))
at weblogic/jdbc/jta/DataSource.getXAConnectionFromPool(DataSource.java:1450(Compiled Code))
at weblogic/jdbc/jta/DataSource.refreshXAConnAndEnlist(DataSource.java:1272(Compiled Code))
at weblogic/jdbc/jta/DataSource.getConnection(DataSource.java:425(Compiled Code))
at weblogic/jdbc/jta/DataSource.connect(DataSource.java:382(Compiled Code))
at weblogic/jdbc/common/internal/RmiDataSource.getConnection(RmiDataSource.java:336(Compiled Code))
at com/amerisia/ebills/commons/util/ServiceLocator.getConnection(ServiceLocator.java:66(Compiled Code))
at com/amerisia/ebills/commons/util/BaseDAO.makeConnection(BaseDAO.java:72(Compiled Code))
at com/amerisia/ebills/commons/util/BaseDAO.loadAllRowBySql(BaseDAO.java:623(Compiled Code))
at com/amerisia/ebills/chat/ejbs/ChatDAO.getLeaveWord(ChatDAO.java:119(Compiled Code))
at com/amerisia/ebills/chat/ejbs/ChatManagerEJB.getLeaveWord(ChatManagerEJB.java:165(Compiled Code))
at com/amerisia/ebills/chat/ejbs/ChatManager_oxcdb1_ELOImpl.getLeaveWord(ChatManager_oxcdb1_ELOImpl.java:358(Compiled Code))
at com/amerisia/ebills/chat/facade/ChatNoteManagerEJB.getLeaveWord(ChatNoteManagerEJB.java:110(Compiled Code))
at com/amerisia/ebills/chat/facade/ChatNoteManager_hfunq5_EOImpl.getLeaveWord(ChatNoteManager_hfunq5_EOImpl.java:339(Compiled Code))
at com/amerisia/ebills/chat/facade/ChatNoteManager_hfunq5_EOImpl_WLSkel.invoke(Bytecode PC:240(Compiled Code))
at weblogic/rmi/internal/ServerRequest.sendReceive(ServerRequest.java:174(Compiled Code))
at weblogic/rmi/cluster/ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:345(Compiled Code))
at weblogic/rmi/cluster/ClusterableRemoteRef.invoke(ClusterableRemoteRef.java:259(Compiled Code))
at com/amerisia/ebills/chat/facade/ChatNoteManager_hfunq5_EOImpl_1030_WLStub.getLeaveWord(Bytecode PC:37(Compiled Code))
at com/amerisia/ebills/chat/action/ChatAction.getMsgsForLeft(ChatAction.java:466(Compiled Code))
at sun/reflect/GeneratedMethodAccessor234.invoke(Bytecode PC:64(Compiled Code))
at sun/reflect/DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37(Compiled Code))
at java/lang/reflect/Method.invoke(Method.java:589(Compiled Code))
at com/amerisia/ebills/commons/action/EbillsBaseDispatchAction.dispatchMethod(EbillsBaseDispatchAction.java:1072(Compiled Code))
at org/apache/struts/actions/DispatchAction.execute(DispatchAction.java:194(Compiled Code))
at org/apache/struts/action/RequestProcessor.processActionPerform(RequestProcessor.java(Compiled Code))
at org/apache/struts/action/RequestProcessor.process(RequestProcessor.java:203(Compiled Code))
at org/apache/struts/action/ActionServlet.process(ActionServlet.java:1196(Compiled Code))
at org/apache/struts/action/ActionServlet.doGet(ActionServlet.java:414(Compiled Code))
at javax/servlet/http/HttpServlet.service(HttpServlet.java:707(Compiled Code))
at javax/servlet/http/HttpServlet.service(HttpServlet.java:820(Compiled Code))
at weblogic/servlet/internal/StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227(Compiled Code))
at weblogic/servlet/internal/StubSecurityHelper.invokeServlet(StubSecurityHelper.java:121(Compiled Code))
at weblogic/servlet/internal/ServletStubImpl.execute(ServletStubImpl.java:292(Compiled Code))
at weblogic/servlet/internal/TailFilter.doFilter(TailFilter.java:26(Compiled Code))
at weblogic/servlet/internal/FilterChainImpl.doFilter(FilterChainImpl.java:42(Compiled Code))
at com/amerisia/ebills/commons/web/EncodingFilter.doFilter(EncodingFilter.java(Compiled Code))
at weblogic/servlet/internal/FilterChainImpl.doFilter(FilterChainImpl.java:42(Compiled Code))
at weblogic/servlet/internal/WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3486(Compiled Code))
at weblogic/security/acl/internal/AuthenticatedSubject.doAs(AuthenticatedSubject.java:321(Compiled Code))
at weblogic/security/service/SecurityManager.runAs(Bytecode PC:18(Compiled Code))
at weblogic/servlet/internal/WebAppServletContext.securedExecute(WebAppServletContext.java:2120(Compiled Code))
at weblogic/servlet/internal/WebAppServletContext.execute(WebAppServletContext.java:2086(Compiled Code))
at weblogic/servlet/internal/ServletRequestImpl.run(ServletRequestImpl.java:1406(Compiled Code))
at weblogic/work/ExecuteThread.execute(ExecuteThread.java:201(Compiled Code))
at weblogic/work/ExecuteThread.run(ExecuteThread.java:173(Compiled Code))
Native Stack :
(0xD2647A48 [libj9prt24.so+0x9a48]) (0xD2BA2C38 [libj9dmp24.so+0x10c38]) (0xD263FBAC [libj9prt24.so+0x1bac]) (0xD2BA0AB4 [libj9dmp24.so+0xeab4]) (0xD2B9F0CC [libj9dmp24.so+0xd0cc]) (0xD263FBAC [libj9prt24.so+0x1bac]) (0xD2B9ED10 [libj9dmp24.so+0xcd10]) (0xD2BA5280 [libj9dmp24.so+0x13280]) (0xD2B944DC [libj9dmp24.so+0x24dc]) (0xD2B983A0 [libj9dmp24.so+0x63a0]) (0xD263FBAC [libj9prt24.so+0x1bac]) (0xD2B98344 [libj9dmp24.so+0x6344]) (0xD2B98130 [libj9dmp24.so+0x6130]) (0xD2BB3A94 [libj9dmp24.so+0x21a94]) (0xD2BB3ED0 [libj9dmp24.so+0x21ed0]) (0xD0ED2204 [libj9hookable24.so+0x204]) (0xD4007228 [libj9vm24.so+0x9228]) (0xD4007C8C
[libj9vm24.so+0x9c8c]) (0xD5016A8C [libj9jit24.so+0x56aa8c]) (0xD4009910 [libj9vm24.so+0xb910]) (0xD263FBAC [libj9prt24.so+0x1bac]) (0xD4009830 [libj9vm24.so+0xb830]) (0xD2197CC0 [libj9thr24.so+0x1cc0]) _pthread_body+0xec (0xD04EFC50 [libpthreads.a+0x3c50])
Number of loaded classes in Java heap : 34,648
Number of classloaders in Java heap : 9,321
Recommended -Xmxcl setting (only for IBM Java 5.0, up to and including Service Refresh 4 (build date: February 1st, 2007)) : 12,117 or greater
Note: Only for Java 5.0 Service Refresh 4 (build date: February 1st, 2007) and older. When you use delegated class loaders, the JVM can create a large number of ClassLoader objects. On IBM Java 5.0 Service Refresh 4 and older, the number of class loaders that are permitted is limited to 8192 by default, and an OutOfMemoryError exception is thrown when this limit is exceeded. Use the -Xmxcl parameter to increase the number of class loaders allowed to avoid this problem, for example to 25000, by setting -Xmxcl25000, until the problem is resolved. Please examine the current thread stack trace to check whether a class loader is being loaded if there is an OutOfMemoryError. For example, the following stack trace indicates that a class loader is being loaded:
at com/ibm/oti/vm/VM.initializeClassLoader(Native Method)
at java/lang/ClassLoader.<init>(ClassLoader.java:120)
Command line : /usr/java6/bin/java -Xms512m -Xmx1024m -da -Dplatform.home=/home/ebills/bea/wlserver_10.3 -Dwls.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.home=/home/ebills/bea/wlserver_10.3/server -Dweblogic.management.discover=true -Dwlw.iterativeDev=false -Dwlw.testConsole=false -Dwlw.logErrorsToConsole= -Dclient.encoding.override=GBK -Dfile.encoding=GBK -Duser.language=zh -Duser.region=CN -Ddefault.client.encoding=GBK -Dweblogic.threadpool.MinPoolSize=200 -Dweblogic.threadpool.MaxPoolSize=500 -Djava.awt.headless=true -Dweblogic.ext.dirs=/home/ebills/bea/patch_wls1030/profiles/default/sysext_manifest_classpath:/home/ebills/bea/patch_cie660/profiles/default/sysext_manifest_classpath -Dweblogic.management.username=weblogic -Dweblogic.management.password=weblogic -Dweblogic.Name=AdminServer -Djava.security.policy=/home/ebills/bea/wlserver_10.3/server/lib/weblogic.policy weblogic.Server
Thread Status Analysis
Status                 Number of Threads : 287   Percentage
Deadlock               0     0 (%)
Runnable               6     2 (%)
Waiting on condition   263   92 (%)
Waiting on monitor     0     0 (%)
Suspended              0     0 (%)
Object.wait()          0     0 (%)
Blocked                16    6 (%)
Parked                 2     1 (%)
Thread Method Analysis
Method Name   Number of Threads : 287   Percentage
java/lang/Object.wait(Native Method)  167  58 (%)
java/lang/Thread.sleep(Native Method)  22  8 (%)
NO JAVA STACK  13  5 (%)
weblogic/socket/PosixSocketMuxer.processSockets(PosixSocketMuxer.java:93(Compiled Code))  10  3 (%)
weblogic/timers/internal/TimerManagerImpl.complete(TimerManagerImpl.java:654(Compiled Code))  6  2 (%)
java/lang/Throwable.fillInStackTrace(Native Method)  5  2 (%)
java/net/PlainSocketImpl.socketAccept(Native Method)  5  2 (%)
weblogic/servlet/internal/session/SessionContext$SessionInvalidator.timerExpired(SessionContext.java:788(Compiled Code))  2  1 (%)
sun/misc/Unsafe.park(Native Method)  2  1 (%)
java/util/AbstractList.iterator(AbstractList.java:583(Compiled Code))  2  1 (%)
java/io/ObjectInputStream$BlockDataInputStream.readUTFBody(ObjectInputStream.java:3036(Compiled Code))  2  1 (%)
java/lang/StringBuilder.toString(StringBuilder.java:803(Compiled Code))  2  1 (%)
sun/reflect/UTF8.encode(UTF8.java:33(Compiled Code))  2  1 (%)
java/nio/CharBuffer.wrap(CharBuffer.java:361(Compiled Code))  2  1 (%)
java/util/Hashtable.newEntry(Hashtable.java:91(Compiled Code))  2  1 (%)
java/util/regex/Pattern.newSingle(Pattern.java:2962(Compiled Code))  1  0 (%)
java/util/Vector.<init>(Vector.java:76(Compiled Code))  1  0 (%)
java/util/TreeMap.putImpl(TreeMap.java:4544(Compiled Code))  1  0 (%)
java/util/Hashtable.put(Hashtable.java:769(Compiled Code))  1  0 (%)
java/util/Hashtable.clone(Hashtable.java:327(Compiled Code))  1  0 (%)
java/util/AbstractList$FullListIterator.<init>(AbstractList.java:94(Compiled Code))  1  0 (%)
java/util/AbstractCollection.toArray(AbstractCollection.java:352(Compiled Code))  1  0 (%)
java/security/SecureRandom.nextBytes(SecureRandom.java:292(Compiled Code))  1  0 (%)
java/net/SocketInputStream.read(SocketInputStream.java:179(Compiled Code))  1  0 (%)
java/net/Inet6AddressImpl.lookupAllHostAddr(Native Method)  1  0 (%)
java/lang/reflect/Method.invoke(Method.java:611(Compiled Code))  1  0 (%)
java/lang/reflect/Array.newArrayImpl(Native Method)  1  0 (%)
java/lang/Throwable.printStackTrace(Throwable.java:363(Compiled Code))  1  0 (%)
java/lang/Throwable.printStackTrace(Throwable.java:338(Compiled Code))  1  0 (%)
java/lang/StringBuilder.ensureCapacityImpl(StringBuilder.java:339(Compiled Code))  1  0 (%)
java/lang/String.toUpperCase(String.java:1300(Compiled Code))  1  0 (%)
java/lang/String.toUpperCase(String.java:1129(Compiled Code))  1  0 (%)
java/lang/String.<init>(String.java:350(Compiled Code))  1  0 (%)
java/io/ObjectStreamClass.lookup(ObjectStreamClass.java:291(Compiled Code))  1  0 (%)
java/io/ObjectStreamClass.lookup(ObjectStreamClass.java:287(Compiled Code))  1  0 (%)
java/io/ObjectInputStream.readString(ObjectInputStream.java:1629(Compiled Code))  1  0 (%)
com/ibm/tools/attach/javaSE/IPC.waitSemaphore(Native Method)  1  0 (%)
com/ibm/misc/SignalDispatcher.waitForSignal(Native Method)  1  0 (%)
com/ibm/lang/management/OperatingSystemNotificationThread.processNotificationLoop(Native Method)  1  0 (%)
com/ibm/lang/management/MemoryNotificationThread.processNotificationLoop(Native Method)  1  0 (%)
com/bea/security/xacml/combinator/standard/StandardRuleCombinerLibrary$3$1.evaluate(StandardRuleCombinerLibrary.java:200(Compiled Code))  1  0 (%)
com/amerisia/ebills/schedule/data/FileSource.findAll(FileSource.java:213(Compiled Code))  1  0 (%)
com/amerisia/ebills/commons/util/BaseDAO.loadAllRowBySql(BaseDAO.java:644(Compiled Code))  1  0 (%)
weblogic/utils/io/ChunkedDataOutputStream.makeChunkedDataInputStream(ChunkedDataOutputStream.java:376(Compiled Code))  1  0 (%)
weblogic/utils/http/QueryParams.getCurrent(QueryParams.java:21(Compiled Code))  1  0 (%)
weblogic/utils/collections/SecondChanceCacheMap.put(SecondChanceCacheMap.java:72(Compiled Code))  1  0 (%)
weblogic/socket/PosixSocketMuxer.poll(Native Method)  1  0 (%)
weblogic/servlet/internal/MuxableSocketHTTP.<init>(MuxableSocketHTTP.java:112(Compiled Code))  1  0 (%)
weblogic/security/service/SecurityManager.runAs(Bytecode PC:18(Compiled Code))  1  0 (%)
weblogic/security/service/JNDIResource.initialize(JNDIResource.java:146(Compiled Code))  1  0 (%)
weblogic/jndi/internal/BasicNamingNode.getPrefix(BasicNamingNode.java:984(Compiled Code))  1  0 (%)
sun/util/calendar/ZoneInfo.getOffsets(ZoneInfo.java:237(Compiled Code))  1  0 (%)
sun/reflect/GeneratedMethodAccessor120.invoke(Bytecode PC:0(Compiled Code))  1  0 (%)
org/apache/struts/util/RequestUtils.populate(RequestUtils.java:399(Compiled Code))  1  0 (%)
oracle/net/ano/AnoComm.o(Bytecode PC:1(Compiled Code))  1  0 (%)
oracle/jdbc/driver/T4CPreparedStatement.executeForDescribe(T4CPreparedStatement.java:801(Compiled Code))  1  0 (%)
oracle/jdbc/driver/T4CMAREngine.unmarshalKEYVAL(T4CMAREngine.java:1859(Compiled Code))  1  0 (%)
oracle/jdbc/driver/DBConversion.stringToDriverCharBytes(DBConversion.java:443(Compiled Code))  1  0 (%)
Thread Aggregation Analysis
Thread Type   Number of Threads : 287   Percentage
Thread  276  96 (%)
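Before reaching for PMAT or HeapAnalyzer, the headline numbers of a report like the one above can be pulled out programmatically. Below is a minimal sketch in Python; the function name is my own, and the header strings it matches are taken from the formatted report above (a raw javacore uses tagged records instead, so treat this as illustrative parsing of the analyzed report, not of javacore format):

```python
import re

def parse_report(text):
    """Pull a few headline numbers out of a javacore analysis report
    formatted like the one above (key : value pairs, comma separators)."""
    patterns = {
        "free_heap_pct": r"Java heap is almost exhausted\s*:\s*(\d+)% free",
        "loaded_classes": r"Number of loaded classes in Java heap\s*:\s*([\d,]+)",
        "classloaders": r"Number of classloaders in Java heap\s*:\s*([\d,]+)",
    }
    stats = {}
    for key, pat in patterns.items():
        m = re.search(pat, text)
        if m:
            # Strip thousands separators before converting to int.
            stats[key] = int(m.group(1).replace(",", ""))
    return stats

sample = (
    "***WARNING*** Java heap is almost exhausted : 0% free Java heap\n"
    "Number of loaded classes in Java heap : 34,648\n"
    "Number of classloaders in Java heap : 9,321\n"
)
print(parse_report(sample))
# → {'free_heap_pct': 0, 'loaded_classes': 34648, 'classloaders': 9321}
```

In this dump the 9,321 classloaders and the Class segment at 96.44% used are worth investigating alongside the exhausted object heap; on IBM JDKs a classloader count in the thousands is often associated with repeated redeployments or reflection-heavy code rather than ordinary object garbage, though the verbose GC trace the report asks for is what would confirm it.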

Storing JSON data

I require a service that will run every 15 minutes to generate "up-to-date" JSON for our AJAX services to consume. We want to use this JSON for metric analysis, charts, all that jazz.

The reason for running every 15 minutes is that the JSON generation is long-running - the queries take at least 45-60 seconds to complete, and this is expected to increase as the needs of our application grow.

EDIT: We pull all our data from a SQL database. We already have a huge relational database set up. Getting the data to our web users quickly and efficiently is absolutely necessary, so querying the database for all the metric data on the fly is not satisfactory - it simply takes too long.

EDIT 2: The data needs to be completely regenerated every 15 minutes, and the JSON needs to be available almost instantaneously. We aren't concerned with how long the background service takes, as long as it regenerates our JSON every 15 minutes.

The problem is that I am unsure how to store this JSON. There will be at least 30-40 separate JSON objects generated and serialised to strings for transmission, and that number will only increase over time. We also have preset time periods for generating the JSON: 1 day, 1 week, 1 month, 3 months, 6 months, 1 year and 2 years.

I have considered a flat database table, though I loathe tables that have dozens of columns and only one row (it doesn't seem right to me). I have also wondered whether I could generate some kind of "data.json" file on the server for the services to pull data from when necessary, but does that approach have drawbacks? Can it be cached? (Caching would undermine the point of this.)

Is there some other method available? Can an expert give a solid opinion on this?
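One common answer to the storage question above is a small key/value cache table - one row per JSON object per time period - which avoids both the dozens-of-columns flat table and the HTTP-caching pitfalls of a static "data.json" file. A minimal sketch, using Python's stdlib sqlite3 purely as a stand-in for the real SQL database (the table name, column names, and helper functions are illustrative assumptions, not anyone's actual schema):

```python
import json
import sqlite3
import time

conn = sqlite3.connect(":memory:")  # stand-in for the existing SQL database
conn.execute("""CREATE TABLE IF NOT EXISTS json_cache (
    metric       TEXT,
    period       TEXT,   -- '1d', '1w', '1m', '3m', '6m', '1y', '2y'
    payload      TEXT,   -- the serialised JSON string
    generated_at REAL,
    PRIMARY KEY (metric, period))""")

def regenerate(metric, period, data):
    """Run by the 15-minute background job: overwrite the cached JSON."""
    conn.execute("INSERT OR REPLACE INTO json_cache VALUES (?, ?, ?, ?)",
                 (metric, period, json.dumps(data), time.time()))
    conn.commit()

def fetch(metric, period):
    """Called by the AJAX endpoint: a primary-key lookup, effectively instant."""
    row = conn.execute(
        "SELECT payload FROM json_cache WHERE metric = ? AND period = ?",
        (metric, period)).fetchone()
    return row[0] if row else None

regenerate("signups", "1w", {"total": 1234, "series": [10, 20, 30]})
print(fetch("signups", "1w"))
# → {"total": 1234, "series": [10, 20, 30]}
```

Each regeneration is an atomic single-row overwrite, so readers never see a half-written blob, and adding a 41st metric or a new time period is just another row rather than a schema change; the `generated_at` column also lets clients show how fresh the data is.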
