
Classification method for Atractylodes lancea (Thunb.) DC. and Atractylodes japonica Koidz. ex Kitam. based on SS-FusionNet

Abstract: Atractylodes lancea (Thunb.) DC. and Atractylodes japonica Koidz. ex Kitam. are highly similar in appearance, chemical composition, and other aspects, and traditional identification methods based on morphological or chemical indicators achieve low classification accuracy under small-sample or non-destructive testing conditions. A deep learning classification network, SS-FusionNet, which fuses spectral and image information, was proposed for high-precision classification of Atractylodes lancea (Thunb.) DC. and Atractylodes japonica Koidz. ex Kitam. slices from hyperspectral imaging under small-sample conditions. Slice data were collected with a hyperspectral imaging system; an autoencoder was pre-trained on the unlabeled hyperspectral data so that its encoder module could extract image features from the data; the spectral features were then deeply fused with the image features, and classification was performed with an upsampling convolution module. Experimental results showed that, under small-sample conditions, SS-FusionNet achieved a classification accuracy of 92.7%, which was 7.5 percentage points higher than the 85.2% accuracy of the support vector machine and 6.1 percentage points higher than the 86.6% accuracy of the convolutional neural network. The study provides a new approach for the in-depth identification of traditional Chinese medicine species.
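The pre-training step described in the abstract can be pictured as a plain reconstruction task on unlabeled hyperspectral cubes. The following is a minimal PyTorch sketch, not the authors' implementation: the ConvAutoencoder layout, the MSE reconstruction loss, and all hyperparameters (band count, learning rate, epochs) are assumptions made for illustration; only the idea of pre-training an autoencoder on unlabeled hyperspectral data and reusing its encoder comes from the abstract.

```python
import torch
import torch.nn as nn


class ConvAutoencoder(nn.Module):
    """Hypothetical convolutional autoencoder for unlabeled hyperspectral patches."""

    def __init__(self, in_bands: int = 128):
        super().__init__()
        # Encoder: two strided convolutions; this is the module that would be
        # kept after pre-training and reused as the image-feature extractor.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_bands, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: mirrors the encoder to reconstruct the input cube.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, in_bands, 4, stride=2, padding=1),
        )

    def forward(self, x):              # x: (B, bands, H, W), H and W divisible by 4
        return self.decoder(self.encoder(x))


def pretrain_encoder(model, loader, epochs=50, lr=1e-3):
    """Reconstruction-only pre-training; no labels are used."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for cube in loader:            # each batch: (B, bands, H, W)
            opt.zero_grad()
            loss_fn(model(cube), cube).backward()
            opt.step()
    return model.encoder               # reused downstream for image features


# Example: pre-train on a toy "dataset" of random cubes standing in for real scans.
toy_loader = [torch.randn(4, 128, 64, 64) for _ in range(2)]
encoder = pretrain_encoder(ConvAutoencoder(), toy_loader, epochs=1)
```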
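Likewise, the spectral/image feature fusion and the upsampling-convolution classification head can be sketched as below. The branch widths, the concatenation-based fusion, and the transposed-convolution block are hypothetical choices, since the abstract does not give the actual SS-FusionNet architecture; the ImageEncoder here stands in for the encoder obtained from the pre-training sketch above.

```python
import torch
import torch.nn as nn


class ImageEncoder(nn.Module):
    """Stand-in for the pre-trained autoencoder encoder; outputs an image-feature vector."""

    def __init__(self, in_bands: int = 128, feat_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(in_bands, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.proj = nn.Linear(128, feat_dim)

    def forward(self, cube):                        # cube: (B, bands, H, W)
        return self.proj(self.backbone(cube).flatten(1))


class SpectralEncoder(nn.Module):
    """1-D convolutional branch over a per-sample reflectance spectrum."""

    def __init__(self, in_bands: int = 128, feat_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv1d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, 5, padding=2), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, spectrum):                    # spectrum: (B, bands)
        return self.proj(self.backbone(spectrum.unsqueeze(1)).flatten(1))


class FusionHead(nn.Module):
    """Concatenates the two feature vectors, then applies a transposed-convolution
    (upsampling) block before the final two-class output."""

    def __init__(self, feat_dim: int = 128, num_classes: int = 2):
        super().__init__()
        self.up = nn.Sequential(
            nn.ConvTranspose2d(2 * feat_dim, 64, 4, stride=2, padding=1),  # 1x1 -> 2x2
            nn.ReLU(),
            nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, num_classes)

    def forward(self, img_feat, spec_feat):
        fused = torch.cat([img_feat, spec_feat], dim=1)     # (B, 2*feat_dim)
        fused = fused.view(fused.size(0), -1, 1, 1)         # treat as a 1x1 feature map
        return self.fc(self.up(fused).flatten(1))           # (B, num_classes)


# Toy forward pass: a random "hyperspectral patch" and its mean spectrum.
cube = torch.randn(4, 128, 64, 64)
spectrum = cube.mean(dim=(2, 3))
logits = FusionHead()(ImageEncoder()(cube), SpectralEncoder()(spectrum))
print(logits.shape)  # torch.Size([4, 2])
```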

     
