Cite this article: GUO Yong-cheng, TANG Jian-hao, LI Zhen-ni, LU Jun. Multi-layer convolutional transform learning algorithm based on proximal difference of convex method[J]. Control Theory & Applications, 2023, 40(11): 2019-2027.
|
Multi-layer convolutional transform learning algorithm based on proximal difference of convex method
Abstract views: 1136    Full-text views: 380    Received: 2021-10-25    Revised: 2023-10-09
DOI: 10.7641/CTA.2022.11020
2023, 40(11): 2019-2027
Keywords: sparse representation; convolutional transform learning; proximal difference of convex method; log regularizer; feature extraction; machine learning
Funding: National Natural Science Foundation of China (62073086); Natural Science Foundation of Guangdong Province (2022A1515011445)
|
Abstract
Convolutional transform learning (CTL) combines the advantages of unsupervised learning and convolutional neural networks: it trains convolutional filters in an unsupervised manner and is an emerging sparse representation method. The existing single-layer CTL model uses only one layer of sparse coding and therefore struggles to extract deep semantic information from the input signal. Moreover, although the ℓ0-norm-constrained CTL model enforces strong sparsity, solving it is an NP-hard problem, while the ℓ1-norm-constrained CTL model suffers from insufficient sparsity and over-penalization of large coefficients in the sparse vector. To address these problems, this paper proposes a multi-layer CTL model based on the log regularizer (CTL-log). To extract sparse features of the input signal that are more discriminative and semantically richer, the single-layer CTL model is extended to multiple layers, and the nonconvex log regularizer, which yields strong sparsity with little bias, is used as the sparsity constraint of the CTL model. The resulting nonconvex optimization problem is solved with the proximal difference-of-convex method, leading to a multi-layer convolutional transform learning algorithm based on the proximal difference of convex method. Experiments show that the log-regularized sparsity constraint of the proposed algorithm outperforms the existing CTL models, that the multi-layer CTL-log improves feature extraction over its single-layer counterpart, and that the classification accuracy of a support vector machine (SVM) classifier is improved by about 2 percentage points.
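The abstract names the log regularizer and the proximal difference-of-convex (DC) method without giving formulas. The following is a minimal illustrative sketch rather than the paper's exact model: the sparse code z, the generic smooth data-fidelity term f, and the hyperparameters λ, ε > 0 are assumed notation introduced here only to show why the log penalty admits a DC split and how the proximal DC step then reduces to soft-thresholding.

\[
\phi(z) \;=\; \lambda \sum_i \log\!\Bigl(1 + \frac{|z_i|}{\varepsilon}\Bigr)
\;=\;
\underbrace{\frac{\lambda}{\varepsilon}\,\|z\|_1}_{g(z)\ \text{(convex)}}
\;-\;
\underbrace{\frac{\lambda}{\varepsilon}\sum_i\Bigl(|z_i| - \varepsilon\log\bigl(1 + \tfrac{|z_i|}{\varepsilon}\bigr)\Bigr)}_{h(z)\ \text{(convex, smooth)}}
\]

For the composite problem \(\min_z f(z) + g(z) - h(z)\), one proximal DC iteration linearizes the smooth convex part \(h\) and solves the remaining convex subproblem in closed form via soft-thresholding:

\[
z^{k+1} \;=\; \operatorname{soft}_{\eta\lambda/\varepsilon}\!\bigl(z^{k} - \eta\,\nabla f(z^{k}) + \eta\,\nabla h(z^{k})\bigr),
\qquad
[\nabla h(z)]_i = \frac{\lambda}{\varepsilon}\,\frac{z_i}{\varepsilon + |z_i|},
\qquad
\operatorname{soft}_{\tau}(u)_i = \operatorname{sign}(u_i)\,\max(|u_i| - \tau,\, 0),
\]

where the step size \(\eta\) is taken no larger than the reciprocal of the Lipschitz constant of \(\nabla f\). Near zero the linearized term \(\nabla h\) vanishes, so small coefficients are thresholded as strongly as under the ℓ1 penalty, while for large coefficients \([\nabla h(z)]_i \approx (\lambda/\varepsilon)\operatorname{sign}(z_i)\) nearly cancels the shrinkage, which is consistent with the abstract's description of the log regularizer as strongly sparse with little bias.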
|
|
|
|
|