Designing a Residual Neural Network in MATLAB

冬至子 · Source: matlab學習之家 · Author: matlab學習之家 · 2023-06-02 16:39

In a CNN, the input is the raw image matrix, the most basic level of feature, and the whole network acts as a feature-extraction pipeline: low-level features are progressively distilled into highly abstract ones. More layers mean a richer hierarchy of features at different levels of abstraction, and the deeper the layer, the more abstract and semantically meaningful its features become. But is a deeper network always better? The figure below compares plain (non-residual) networks of different depths, and it shows that deeper does not necessarily mean better.

[Figure: performance comparison of plain networks at different depths]

In a plain CNN, increasing the depth readily causes vanishing and exploding gradients. The usual remedies are careful weight initialization and intermediate normalization layers, but these expose another problem, degradation: as more layers are added, accuracy on the training set saturates and then declines. Residual neural networks were introduced to solve exactly this.

1. Algorithm Principle

A residual network becomes much easier to optimize by adding shortcut connections. A group of layers wrapped around one shortcut connection is called a residual block, as shown below.

[Figure: a residual block with a shortcut connection]

The key difference between a plain network and a deep residual network is that the latter has many bypass branches that feed the input directly to later layers, so those layers only need to learn the residual; these branches are the shortcuts. Conventional convolutional or fully connected layers inevitably lose or corrupt some information as it passes through. ResNet mitigates this by routing the input straight to the output, preserving its integrity: the network then only has to learn the difference between input and output, which simplifies both the learning target and the optimization.
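Formally, let H(x) be the mapping a block needs to realize. The stacked layers are asked to fit only the residual F(x) = H(x) - x, and the shortcut adds the input back, so the block outputs

    H(x) = F(x) + x

If the optimal mapping is close to the identity, pushing F(x) toward zero is far easier for the optimizer than approximating an identity map through a stack of nonlinear layers.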

2. Code in Practice

We build a 19-layer ResNet, with electricity load forecasting as the running example. Each training sample is a 4x24 "image": four input features over a 24-step lag window, with the load at the window's last step as the regression target.
%% Data preparation
clc
clear
close all

% Load the training table (electricity demand plus calendar features)
load Train.mat
% load Test.mat

% One-hot encode the categorical calendar variables
Train.weekend = dummyvar(Train.weekend);
Train.month = dummyvar(Train.month);
Train = movevars(Train,{'weekend','month'},'After','demandLag');
Train.ts = [];

Train(1,:) = [];
y = Train.demand;
x = Train{:,2:5};        % 4 input features per time step
[xnorm,xopt] = mapminmax(x',0,1);
[ynorm,yopt] = mapminmax(y',0,1);
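% mapminmax scales each ROW to [0,1], hence the transposes above; the
% settings structs xopt/yopt are reused below with mapminmax('apply'),
% and mapminmax('reverse',ynormPred,yopt) would undo the target scaling.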


% Keep the first 1000 samples for training
xnorm = xnorm(:,1:1000);
ynorm = ynorm(1:1000);

k = 24;           % lag window length (time steps per sample)


% Slide a length-k window over the series to build the 2-D "images";
% each sample is a 4-by-24 matrix and the target is the window's last step
for i = 1:length(ynorm)-k
    Train_xNorm{1,i} = xnorm(:,i:i+k-1);
    Train_yNorm(i) = ynorm(i+k-1);
    Train_y{i} = y(i+k-1);          % un-normalized target used for training
end
Train_x = Train_xNorm';


% Hold out samples 1001:1170 for testing; normalize with the
% training-set settings
ytest = Train.demand(1001:1170);
xtest = Train{1001:1170,2:5};
[xtestnorm] = mapminmax('apply',xtest',xopt);
[ytestnorm] = mapminmax('apply',ytest',yopt);
% xtestnorm = [xtestnorm; Train.weekend(1001:1170,:)'; Train.month(1001:1170,:)'];
xtest = xtest';
for i = 1:length(ytestnorm)-k
    Test_xNorm{1,i} = xtestnorm(:,i:i+k-1);
    Test_yNorm(i) = ytestnorm(i+k-1);
    Test_y(i) = ytest(i+k-1);
end
Test_x = Test_xNorm';

% Pack predictors (and responses) into tables for trainNetwork
x_train = table(Train_x,Train_y');
x_test = table(Test_x);
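% For table input, trainNetwork reads the first variable as the predictors
% (here, cell-wrapped 4-by-24 matrices) and the remaining variable as the
% regression response; x_test carries predictors only.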
%% Train/validation split (optional, kept commented out)
% TrainSampleLength = length(Train_yNorm);
% validatasize = floor(TrainSampleLength * 0.1);
% Validata_xNorm = Train_xNorm(:,end - validatasize:end,:);
% Validata_yNorm = Train_yNorm(:,TrainSampleLength-validatasize:end);
% Validata_y = Train_y(TrainSampleLength-validatasize:end);
% 
% Train_xNorm = Train_xNorm(:,1:end-validatasize,:);
% Train_yNorm = Train_yNorm(:,1:end-validatasize);
% Train_y = Train_y(1:end-validatasize);
%% Build the 19-layer residual network
lgraph = layerGraph();

% Input: 4 features over a 24-step window, then the first convolution
tempLayers = [
    imageInputLayer([4 24],"Name","imageinput")
    convolution2dLayer([3 3],32,"Name","conv","Padding","same")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    batchNormalizationLayer("Name","batchnorm")
    reluLayer("Name","relu")];
lgraph = addLayers(lgraph,tempLayers);

% Each addition layer merges a shortcut with the BN -> ReLU branch and
% feeds the next convolution
tempLayers = [
    additionLayer(2,"Name","addition")
    convolution2dLayer([3 3],32,"Name","conv_1","Padding","same")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    batchNormalizationLayer("Name","batchnorm_1")
    reluLayer("Name","relu_1")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","addition_1")
    convolution2dLayer([3 3],32,"Name","conv_2","Padding","same")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    batchNormalizationLayer("Name","batchnorm_2")
    reluLayer("Name","relu_2")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    additionLayer(2,"Name","addition_2")
    convolution2dLayer([3 3],32,"Name","conv_3","Padding","same")];
lgraph = addLayers(lgraph,tempLayers);

tempLayers = [
    batchNormalizationLayer("Name","batchnorm_3")
    reluLayer("Name","relu_3")];
lgraph = addLayers(lgraph,tempLayers);

% Final residual merge followed by the regression head
tempLayers = [
    additionLayer(2,"Name","addition_3")
    fullyConnectedLayer(1,"Name","fc")
    regressionLayer("Name","regressionoutput")];
lgraph = addLayers(lgraph,tempLayers);

% Clean up the helper variable
clear tempLayers;
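% The graph now holds all 19 layers: the image input, four
% conv -> batchnorm -> relu units each closed by an addition (shortcut
% merge) layer, and the fully connected + regression head. Each sub-stack
% added above is connected internally but not yet linked to the others;
% connectLayers does that next.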


% Wire each unit: the conv output follows two paths, through
% batchnorm -> relu into addition/in1, and directly (the shortcut)
% into addition/in2
lgraph = connectLayers(lgraph,"conv","batchnorm");
lgraph = connectLayers(lgraph,"conv","addition/in2");
lgraph = connectLayers(lgraph,"relu","addition/in1");
lgraph = connectLayers(lgraph,"conv_1","batchnorm_1");
lgraph = connectLayers(lgraph,"conv_1","addition_1/in2");
lgraph = connectLayers(lgraph,"relu_1","addition_1/in1");
lgraph = connectLayers(lgraph,"conv_2","batchnorm_2");
lgraph = connectLayers(lgraph,"conv_2","addition_2/in2");
lgraph = connectLayers(lgraph,"relu_2","addition_2/in1");
lgraph = connectLayers(lgraph,"conv_3","batchnorm_3");
lgraph = connectLayers(lgraph,"conv_3","addition_3/in2");
lgraph = connectLayers(lgraph,"relu_3","addition_3/in1");

% Visualize and sanity-check the assembled graph
plot(lgraph);
analyzeNetwork(lgraph);
%% Training options
maxEpochs = 60;
miniBatchSize = 20;
options = trainingOptions('adam', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'InitialLearnRate',0.01, ...
    'GradientThreshold',1, ...
    'Shuffle','never', ...
    'Plots','training-progress', ...
    'Verbose',0);
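% 'GradientThreshold',1 clips exploding gradients, and 'Shuffle','never'
% preserves the temporal order of the windows across epochs.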


net = trainNetwork(x_train,lgraph,options);

% Predict on the held-out windows; the targets were trained in original
% units (Train_y is un-normalized), so no mapminmax('reverse') is needed
Predict_yNorm = predict(net,x_test);
Predict_y = double(Predict_yNorm);
plot(Test_y)
hold on
plot(Predict_y)
legend('Actual','Predicted')
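To quantify the fit beyond a visual comparison, here is a minimal sketch of two common error metrics; it assumes only the Test_y and Predict_y variables produced by the script above:

% Error metrics on the held-out windows
err  = Predict_y(:) - Test_y(:);
rmse = sqrt(mean(err.^2));                   % root-mean-square error
mape = mean(abs(err ./ Test_y(:))) * 100;    % mean absolute percentage error
fprintf('RMSE = %.2f, MAPE = %.2f%%\n', rmse, mape);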

Network structure:

[Figure: layer graph from plot(lgraph)]

Network analysis:

[Figure: analyzeNetwork report for the 19-layer graph]

Network training:

[Figure: training-progress plot]

Prediction results:

[Figure: actual vs. predicted load on the test windows]
