Why does an optimized BP neural network give a *larger* prediction error than a plain BP network, and is there a way to fix this?
I tried AOA-BP and PSO-BP; both are slightly less accurate than plain BP.
BP neural network prediction error
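For context, the PSO→BP hybrid used in the answer below can be condensed into a short sketch. This is a minimal Python/NumPy illustration fitting the same benchmark curve as the MATLAB code; the particle count, clamp ranges, learning rate, and hidden-layer size here are assumptions chosen for brevity, not the blog's exact settings:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-2.0, 2.0, 80)
Y = 1.1 * (1 - X + 2 * X**2) * np.exp(-X**2 / 2)  # same benchmark curve as the MATLAB code

H = 8                       # hidden units (assumed; the blog uses 12)
DIM = 3 * H + 1             # parameter vector layout: [W1(H), B1(H), W2(H), B2]

def forward(theta, x):
    W1, B1, W2, B2 = theta[:H], theta[H:2*H], theta[2*H:3*H], theta[-1]
    hidden = 1.0 / (1.0 + np.exp(-(np.outer(x, W1) + B1)))  # logsig
    return hidden @ W2 + B2

def mse(theta):
    return float(np.mean((forward(theta, X) - Y) ** 2))

# --- Stage 1: PSO searches for good initial weights ---
M, ITERS, w, c1, c2 = 30, 60, 0.9, 2.0, 2.0   # toy settings, not the blog's
pos = rng.uniform(-1, 1, (M, DIM))
vel = np.zeros((M, DIM))
pbest = pos.copy()
pbest_val = np.array([mse(p) for p in pos])
init_best = pbest_val.min()
gbest = pbest[pbest_val.argmin()].copy()
for t in range(ITERS):
    r1, r2 = rng.random((M, DIM)), rng.random((M, DIM))
    vel = np.clip(w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos),
                  -0.5, 0.5)                  # velocity clamp, like Vmin/Vmax
    pos = np.clip(pos + vel, -5.0, 5.0)       # position clamp, like Xmin/Xmax
    vals = np.array([mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()
    w = 0.9 - 0.5 * t / ITERS                 # linearly decaying inertia

# --- Stage 2: gradient descent fine-tunes the PSO result ---
# (numeric gradient for brevity; real BP uses analytic backprop)
theta = gbest.copy()
start = mse(theta)
lr, eps = 0.01, 1e-5
for _ in range(200):
    grad = np.array([(mse(theta + eps * e) - mse(theta - eps * e)) / (2 * eps)
                     for e in np.eye(DIM)])
    theta -= lr * grad
print(start, mse(theta))   # training MSE before and after BP fine-tuning
```

When a hybrid like this comes out *worse* than plain BP, the usual suspects are the PSO stage's hyperparameters (search range, velocity clamp, c1/c2 balance) and the evaluation itself: PSO minimizes the *training* error, so the gap typically only shows on held-out data.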
1 answer:

- [Related recommendation] The "PSO and BP neural networks" section of this blog post on the PSO particle swarm optimization algorithm may help with your problem; read the excerpt below or follow the link to the source blog:
(Two figures from the source blog failed to load due to hotlink protection; view them in the original post.)
```matlab
function main
clc; clear all; close all;
MaxRunningTime = 1;      % repeat training MaxRunningTime times (for network ensembling)
HiddenUnitNum = 12;
rand('state', sum(100*clock));
TrainSamIn = -4:0.07:2.5;
TrainSamOut = 1.1*(1 - TrainSamIn + 2*TrainSamIn.^2).*exp(-TrainSamIn.^2/2);
TestSamIn = 2:0.04:3;
TestSamOut = 1.1*(1 - TestSamIn + 2*TestSamIn.^2).*exp(-TestSamIn.^2/2);
[xxx, TrainSamNum] = size(TrainSamIn);
[xxx, TestSamNum] = size(TestSamIn);
% for HiddenUnitNum = 3:MaxHiddenLayerNode  % hidden-layer size may be swept over reasonable integers
fprintf('\n the hidden layer node'); HiddenUnitNum
TrainNNOut = [];
TestNNOut = [];
for t = 1:MaxRunningTime
    fprintf('the current running times is'); t
    [NewW1, NewB1, NewW2, NewB2] = PSOTrain(TrainSamIn, TrainSamOut, HiddenUnitNum);
    disp('PSO training finished; BP now continues training the network ...');
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    % BP parameter initialization -- keep it consistent with the PSO parameters above
    SamInNum = length(TrainSamIn);
    TestSamNum = length(TestSamIn);
    InDim = 1;
    OutDim = 1;
    % add noise to the training samples
    rand('state', sum(100*clock))
    NoiseVar = 0.01;
    Noise = NoiseVar*randn(1, SamInNum);
    SamIn = TrainSamIn;
    SamOutNoNoise = TrainSamOut;
    SamOut = SamOutNoNoise + Noise;
    MaxEpochs = 300;
    lr = 0.003;
    E0 = 0.0001;
    W1 = NewW1;
    B1 = NewB1;
    W2 = NewW2';
    B2 = NewB2;
    W1Ex = [W1 B1];
    W2Ex = [W2 B2];
    SamInEx = [SamIn' ones(SamInNum,1)]';
    ErrHistory = [];
    % network parameters initialized; set up the animation frame and background
    HiddenOut = logsig(W1Ex*SamInEx);
    HiddenOutEx = [HiddenOut' ones(SamInNum,1)]';
    NetworkOut = W2Ex*HiddenOutEx;
    Error = SamOut - NetworkOut;
    % preallocate the error matrix for the animated error display
    SSEINIT = zeros(1, MaxEpochs);   % one-dimensional output only
    SSE = sumsqr(Error);
    % draw the animation frame
    figure(1);
    rangecolour = linspace(0, 1, MaxEpochs);
    % use subplots to show both animations in one figure
    subplot(2,1,1);
    hold on
    axis([1 SamInNum min(SamOut) max(SamOut)]);
    hflash1 = line(1:SamInNum, SamOut, 'color', [rangecolour(1) 0 1-rangecolour(1)], ...
        'linestyle','-','linewidth',2,'marker','d', ...
        'markersize',2,'erasemode','none');
    hflash2 = line(1:SamInNum, NetworkOut, 'color', [rangecolour(1) 0 1-rangecolour(1)], ...
        'linestyle','-','linewidth',2.5,'marker','h', ...
        'markersize',2.3,'erasemode','xor');
    xlabel('Number of training samples');
    ylabel('Sample output / network output');
    title('Animated sample output vs. network output','fontsize',13);
    legend('sample output','network output');
    hold off
    % second subplot
    subplot(2,1,2);
    hold on
    axis([1 MaxEpochs -0.2*SSE SSE]);
    hflash3 = line(1:MaxEpochs, E0*ones(1,MaxEpochs), 'color', ...
        [rangecolour(1) 0 1-rangecolour(1)], ...
        'linestyle','--','linewidth',2,'marker','h', ...
        'markersize',2,'erasemode','none');
    hflash4 = line(1, SSE, 'color', ...
        [rangecolour(1) 0 1-rangecolour(1)], ...
        'linestyle','-','linewidth',2,'marker','*', ...
        'markersize',2,'erasemode','xor');
    xlabel('Training epochs');
    ylabel('Target error / network output error');
    title('Animated target error vs. network output error','fontsize',13);
    legend('target error','network output error');
    hold off
    for i = 2:MaxEpochs
        HiddenOut = logsig(W1Ex*SamInEx);
        HiddenOutEx = [HiddenOut' ones(SamInNum,1)]';
        NetworkOut = W2Ex*HiddenOutEx;
        Error = SamOut - NetworkOut;
        SSE = sumsqr(Error)   % show the error animation point by point
        SSEINIT(:,i) = SSE;
        % capture frames where training oscillates
        ErrHistory = [ErrHistory SSE];
        SSEINIT2 = SSEINIT(:,i);
        SSEINIT1 = SSEINIT(:,i-1);
        if SSE < E0, break, end
        Delta2 = Error;
        Delta1 = W2'*Delta2.*HiddenOut.*(1-HiddenOut);
        dW2Ex = Delta2*HiddenOutEx';
        dW1Ex = Delta1*SamInEx';
        W1Ex = W1Ex + lr*dW1Ex;
        W2Ex = W2Ex + lr*dW2Ex;
        W2 = W2Ex(:,1:HiddenUnitNum);
        if SSEINIT2 > SSEINIT1   % if learning oscillates, display every 10 steps
            if mod(i,10) == 0
                Counter(i) = SSEINIT(:,i);
                Len = size(Counter);
                figure(Len(1,2));
                subplot(2,1,1);
                hold on
                axis([1 SamInNum min(SamOut) max(SamOut)]);
                hflash5 = line(1:SamInNum, SamOut, 'color', [rangecolour(1) 0 1-rangecolour(1)], ...
                    'linestyle','-','linewidth',2,'marker','d', ...
                    'markersize',2,'erasemode','none');
                hflash6 = line(1:SamInNum, NetworkOut, 'color', [rangecolour(MaxEpochs) 0 1-rangecolour(MaxEpochs)], ...
                    'linestyle','-','linewidth',2.5,'marker','h', ...
                    'markersize',2.3,'erasemode','xor');
                xlabel('Number of training samples');
                ylabel('Sample output / network output');
                title('Fitted curve while learning oscillates','fontsize',13);
                legend('sample output','network output');
                hold off
                % second subplot
                subplot(2,1,2);
                hold on
                axis([1 MaxEpochs -2*SSEINIT(:,2) 2*SSEINIT(:,2)]);
                hflash7 = line(1:MaxEpochs, E0*ones(1,MaxEpochs), 'color', ...
                    [rangecolour(1) 0 1-rangecolour(1)], ...
                    'linestyle','--','linewidth',2,'marker','h', ...
                    'markersize',2,'erasemode','none');
                hflash8 = line(1:i, SSEINIT(:,1:i), 'color', ...
                    [rangecolour(1) 0 1-rangecolour(1)], ...
                    'linestyle','-','linewidth',2,'marker','*', ...
                    'markersize',2,'erasemode','xor');
                xlabel('Training epochs');
                ylabel('Target error / network output error');
                title('Error while learning oscillates','fontsize',13);
                legend('target error','network output error');
                hold off
            end
        end
        % play the animation
        set(hflash2,'XData',1:SamInNum,'YData',NetworkOut,'color', ...
            [rangecolour(MaxEpochs) 0 1-rangecolour(MaxEpochs)]);
        set(hflash4,'XData',1:i,'YData',SSEINIT(:,1:i), ...
            'color',[rangecolour(MaxEpochs) 0 1-rangecolour(MaxEpochs)]);
        drawnow;
    end
    W2 = W2Ex(:,1:HiddenUnitNum);
    W1 = W1Ex(:,1:InDim);
    B1 = W1Ex(:,InDim+1);
    B2 = W2Ex(:,1+HiddenUnitNum);
    TrainHiddenOut = logsig(W1*SamIn + repmat(B1,1,SamInNum));
    TrainNNOut = W2*TrainHiddenOut + repmat(B2,1,SamInNum);
    TestHiddenOut = logsig(W1*TestSamIn + repmat(B1,1,TestSamNum));
    TestNNOut = W2*TestHiddenOut + repmat(B2,1,TestSamNum);
    figure(MaxEpochs+1);
    hold on; grid;
    h1 = plot(SamIn, SamOut);
    set(h1,'color','r','linestyle','-', ...
        'linewidth',2.5,'marker','p','markersize',5);
    hold on
    h2 = plot(TestSamIn, TestSamOut);
    set(h2,'color','g','linestyle','--', ...
        'linewidth',2.5,'marker','^','markersize',7);
    h3 = plot(SamIn, TrainNNOut);
    set(h3,'color','c','linestyle','-.', ...
        'linewidth',2.5,'marker','o','markersize',5);
    h4 = plot(TestSamIn, TestNNOut);
    set(h4,'color','m','linestyle',':', ...
        'linewidth',2.5,'marker','s','markersize',5);
    xlabel('Input x','fontsize',13); ylabel('Output y','fontsize',13);
    box on; axis tight;
    % title('PSO-BP network error test');
    legend('training samples (actual)','test samples (actual)', ...
        'training samples (network output)','test samples (network output)');
    hold off;
end
% end
% append the final weights and biases to text files
fidW1 = fopen('d:\W1.txt','a+'); fidB1 = fopen('d:\B1.txt','a+');
fidW2 = fopen('d:\W2.txt','a+'); fidB2 = fopen('d:\B2.txt','a+');
for i = 1:length(W1)
    fprintf(fidW1,'\n %6.5f', W1(i));
end
for i = 1:length(B1)
    fprintf(fidB1,'\n %6.5f', B1(i));
end
for i = 1:length(W2)
    fprintf(fidW2,'\n %6.5f', W2(i));
end
for i = 1:length(B2)
    fprintf(fidB2,'\n %6.5f', B2(i));
end
fclose(fidW1); fclose(fidB1); fclose(fidW2); fclose(fidB2);
```
BP and PSO:

```matlab
function [NewW1, NewB1, NewW2, NewB2] = PSOTrain(SamIn, SamOut, HiddenUnitNum)
Maxgeneration = 700;
E0 = 0.0001;
Xmin = -10; Xmax = 10;   % position clamp
Vmin = -5;  Vmax = 5;    % velocity clamp
M = 100;                 % swarm size
c1 = 2.7; c2 = 1.3;
w = 0.9;                 % initial inertia weight
[R, SamNum] = size(SamIn);
[S2, SamNum] = size(SamOut);
generation = 1;
Done = 0;
Pb1 = zeros(HiddenUnitNum, R+S2+1, M);
Pb2 = zeros(S2, M);
Pg1 = zeros(HiddenUnitNum, R+S2+1);
Pg2 = zeros(S2, 1);
E = zeros(size(SamOut));
rand('state', sum(100*clock));
startP1 = rand(HiddenUnitNum, R+S2+1, M) - 0.5;
startP2 = rand(S2, M) - 0.5;
startV1 = rand(HiddenUnitNum, R+S2+1, M) - 0.5;
startV2 = rand(S2, M) - 0.5;
endP1 = zeros(HiddenUnitNum, R+S2+1, M);
endP2 = zeros(S2, M);
endV1 = zeros(HiddenUnitNum, R+S2+1, M);
endV2 = zeros(S2, M);
startE = zeros(1, M);
endE = zeros(1, M);
% evaluate the initial swarm
for i = 1:M
    W1 = startP1(1:HiddenUnitNum, 1:R, i);
    W2 = startP1(1:HiddenUnitNum, R+1:R+S2, i);
    B1 = startP1(1:HiddenUnitNum, R+S2+1, i);
    B2 = startP2(1:S2, i);
    for q = 1:SamNum
        TempOut = logsig(W1*SamIn(:,q) + B1);
        NetworkOut(1,q) = W2'*TempOut + B2;
    end
    E = NetworkOut - SamOut;
    startE(1,i) = sumsqr(E)/(SamNum*S2);
    Pb1(:,:,i) = startP1(:,:,i);
    Pb2(:,i) = startP2(:,i);
end
[val, position] = min(startE(1,:));
Pg1 = startP1(:,:,position);
Pg2 = startP2(:,position);
Pgvalue = val;
Pgvalue_last = Pgvalue;
while (~Done)
    for num = 1:M
        % velocity update
        endV1(:,:,num) = w*startV1(:,:,num) + c1*rand*(Pb1(:,:,num)-startP1(:,:,num)) + c2*rand*(Pg1-startP1(:,:,num));
        endV2(:,num) = w*startV2(:,num) + c1*rand*(Pb2(:,num)-startP2(:,num)) + c2*rand*(Pg2-startP2(:,num));
        % clamp velocities to [Vmin, Vmax]
        for i = 1:HiddenUnitNum
            for j = 1:(R+S2+1)
                if endV1(i,j,num) > Vmax
                    endV1(i,j,num) = Vmax;
                elseif endV1(i,j,num) < Vmin
                    endV1(i,j,num) = Vmin;
                end
            end
        end
        for s2 = 1:S2
            if endV2(s2,num) > Vmax
                endV2(s2,num) = Vmax;
            elseif endV2(s2,num) < Vmin
                endV2(s2,num) = Vmin;
            end
        end
        % position update
        endP1(:,:,num) = startP1(:,:,num) + endV1(:,:,num);
        endP2(:,num) = startP2(:,num) + endV2(:,num);
        % clamp positions to [Xmin, Xmax]
        for i = 1:HiddenUnitNum
            for j = 1:(R+S2+1)
                if endP1(i,j,num) > Xmax
                    endP1(i,j,num) = Xmax;
                elseif endP1(i,j,num) < Xmin
                    endP1(i,j,num) = Xmin;
                end
            end
        end
        for s2 = 1:S2
            if endP2(s2,num) > Xmax
                endP2(s2,num) = Xmax;
            elseif endP2(s2,num) < Xmin
                endP2(s2,num) = Xmin;
            end
        end
        % evaluate the moved particle
        W1 = endP1(1:HiddenUnitNum, 1:R, num);
        W2 = endP1(1:HiddenUnitNum, R+1:R+S2, num);
        B1 = endP1(1:HiddenUnitNum, R+S2+1, num);
        B2 = endP2(1:S2, num);
        for q = 1:SamNum
            TempOut = logsig(W1*SamIn(:,q) + B1);
            NetworkOut(1,q) = W2'*TempOut + B2;
        end
        E = NetworkOut - SamOut;
        SSE = sumsqr(E)   % echo the error to the command window
        endE(1,num) = sumsqr(E)/(SamNum*S2);
        if endE(1,num) < startE(1,num)   % update personal best
            Pb1(:,:,num) = endP1(:,:,num);
            Pb2(:,num) = endP2(:,num);
            startE(1,num) = endE(1,num);
        end
    end
    w = 0.9 - (0.5/Maxgeneration)*generation;   % linearly decaying inertia
    [value, position] = min(startE(1,:));
    if value < Pgvalue   % update global best
        Pg1 = Pb1(:,:,position);
        Pg2 = Pb2(:,position);
        Pgvalue = value;
    end
    if (generation >= Maxgeneration)
        Done = 1;
    end
    if Pgvalue < E0
        Done = 1;
    end
    startP1 = endP1; startP2 = endP2;
    startV1 = endV1; startV2 = endV2;
    startE = endE;
    generation = generation + 1;
end
W1 = Pg1(1:HiddenUnitNum, 1:R);
W2 = Pg1(1:HiddenUnitNum, R+1:R+S2);
B1 = Pg1(1:HiddenUnitNum, R+S2+1);
B2 = Pg2(:,1);
NewW1 = W1; NewW2 = W2; NewB1 = B1; NewB2 = B2;
```
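For reference, the update rule the `PSOTrain` loop implements is standard inertia-weight PSO:

```latex
v_{t+1} = w\,v_t + c_1 r_1 \left(p_{\text{best}} - x_t\right) + c_2 r_2 \left(g_{\text{best}} - x_t\right),
\qquad x_{t+1} = x_t + v_{t+1}
```

with $v$ clamped to $[V_{\min}, V_{\max}]$, $x$ clamped to $[X_{\min}, X_{\max}]$, and $w$ decayed linearly via $w = 0.9 - 0.5\,\text{generation}/\text{Maxgeneration}$. Note the code uses $c_1 = 2.7$, $c_2 = 1.3$, which weights the cognitive term well above the social term; the canonical choice is $c_1 = c_2 \approx 2$, so this asymmetry is one parameter worth re-examining if PSO-BP underperforms plain BP.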