1. Background Theory
A CNN is a deep learning model characterized by local connectivity, weight sharing, and spatial correlation, along with strong robustness and fault tolerance, which makes it well suited to extracting deep data features. A classic CNN architecture consists of an input layer, hidden layers, fully connected layers, and an output layer. The structure of a convolutional neural network is shown in the figure below.
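Weight sharing is the main reason convolution layers stay small: the same small kernel is reused at every spatial position. As a rough illustration (a Python sketch with hypothetical layer sizes, not taken from this article), compare the parameter count of a 3×3 convolution on a 28×28 grayscale image with a fully connected layer producing an output of the same size:

```python
# Parameter count: 3x3 conv (1 input channel, 8 filters) vs. a fully
# connected layer on a 28x28 grayscale image. Sizes are hypothetical,
# chosen only to illustrate the effect of weight sharing.

def conv_params(k, c_in, c_out):
    """Weights + biases of a k x k convolution layer."""
    return k * k * c_in * c_out + c_out

def fc_params(n_in, n_out):
    """Weights + biases of a fully connected layer."""
    return n_in * n_out + n_out

conv = conv_params(3, 1, 8)           # 3*3*1*8 + 8 = 80 parameters
fc = fc_params(28 * 28, 28 * 28 * 8)  # one weight per input-output pair

print(conv, fc)  # the conv layer has orders of magnitude fewer parameters
```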
2. Preparing the Dataset
Taking the handwritten digit dataset as an example, we build a convolutional neural network for classification. Dataset download address: . Some sample images from the dataset are shown below.
3. Reading and Splitting the Dataset
Save the downloaded dataset and set digitDatasetPath to its location. For each class, 750 images are randomly selected as training data, and the remaining images are used as validation data.
%% Read the dataset
digitDatasetPath = 'D:\MTALAB2019\手寫數(shù)據(jù)集\DigitDataset';
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true,'LabelSource','foldernames');
%% Split the dataset
numTrainFiles = 750;
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles,'randomized');
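splitEachLabel keeps 750 randomly chosen images of each digit for training and leaves the rest for validation. The same logic can be sketched in Python (a hypothetical illustration with a made-up file list, not MATLAB's actual implementation):

```python
import random

def split_each_label(files_by_label, n_train, seed=0):
    """Randomly pick n_train files per label; the rest become validation."""
    rng = random.Random(seed)
    train, val = {}, {}
    for label, files in files_by_label.items():
        shuffled = files[:]
        rng.shuffle(shuffled)
        train[label] = shuffled[:n_train]
        val[label] = shuffled[n_train:]
    return train, val

# Toy example: 10 files for digit "0", split 7 / 3.
demo = {"0": [f"img_{i}.png" for i in range(10)]}
train, val = split_each_label(demo, 7)
print(len(train["0"]), len(val["0"]))  # 7 3
```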
4. Building the Convolutional Neural Network
layers = [
    imageInputLayer([28 28 1])               % input layer
    % convolution block 1
    convolution2dLayer(3,8,'Padding','same')
    batchNormalizationLayer
    reluLayer
    % pooling layer
    maxPooling2dLayer(2,'Stride',2)
    % convolution block 2
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    % pooling layer
    maxPooling2dLayer(2,'Stride',2)
    % convolution block 3
    convolution2dLayer(3,32,'Padding','same')
    batchNormalizationLayer
    reluLayer
    % fully connected layer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];
With the network in place, its training parameters need to be configured. The relevant code is as follows:
options = trainingOptions('sgdm', ...
    'InitialLearnRate',0.01, ...
    'MaxEpochs',10, ...
    'Shuffle','every-epoch', ...
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',30, ...
    'Verbose',false, ...
    'Plots','training-progress');
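With 'ValidationFrequency',30, validation runs every 30 training iterations. Assuming the standard DigitDataset (1,000 images per digit, so 7,500 training images after the split above) and trainingOptions' default mini-batch size of 128, the training schedule works out as follows (a sketch; the dataset size and default batch size are assumptions, not stated in this article):

```python
# Hypothetical bookkeeping for the training schedule above.
images_per_class = 1000  # assumed size of MATLAB's DigitDataset
num_classes = 10
n_train_per_class = 750  # from splitEachLabel above

train_images = n_train_per_class * num_classes                      # 7500
val_images = (images_per_class - n_train_per_class) * num_classes   # 2500

mini_batch = 128                              # trainingOptions default (assumed)
iters_per_epoch = train_images // mini_batch  # full mini-batches per epoch
validations_per_epoch = iters_per_epoch // 30 # one validation per 30 iterations

print(train_images, val_images, iters_per_epoch, validations_per_epoch)
```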
5. Training the Convolutional Neural Network
net = trainNetwork(imdsTrain,layers,options);
The training results are shown below.
6. Testing and Results
YPred = classify(net,imdsValidation);
YValidation = imdsValidation.Labels;
accuracy = sum(YPred == YValidation)/numel(YValidation)
accuracy = 0.9868
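The reported accuracy is consistent with the validation-set size implied above: with 250 held-out images per digit (2,500 total, assuming 1,000 images per class in DigitDataset), 0.9868 corresponds to 2,467 correct predictions. A Python sketch of the same accuracy computation:

```python
def accuracy(pred, truth):
    """Fraction of predictions that match the ground-truth labels."""
    assert len(pred) == len(truth)
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

# Toy check on a tiny label vector: 3 of 4 correct.
print(accuracy([0, 1, 2, 2], [0, 1, 2, 3]))  # 0.75

# 2467 correct out of 2500 validation images reproduces the reported value.
print(2467 / 2500)  # 0.9868
```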