Research Achievements

Papers

A Self-supervised Pretraining Method Based on Mixup for Ordinal Classification

Posted: 2024-01-29

Published in: 2023 8th International Conference on Intelligent Computing and Signal Processing (ICSP)

Authors: Chao Zhang, Jianmei Cheng

Research area: Intelligent perception, state identification, and response for emergencies

Abstract: Ordinal classification is an important research topic that assigns each instance an ordinal category. In practical applications, it relies heavily on supervised deep models, which require massive amounts of labeled samples. In this paper, for small or moderate datasets, we use self-supervised learning (SSL) as a pretraining process to guide supervised learning. In self-supervised pretraining, rotating images embedded with a Mixup strategy enables the model to learn richer feature representations. The pretrained model learns from the rotation difference between two combined images, instead of the rotation of a single image. After pretraining, SSL assists supervised learning in a finetuning style. Finally, we evaluate the effectiveness of the pretraining paradigm and the Mixup rotation strategy on two datasets (Adience and Carstyle), achieving promising gains in classification performance.
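The abstract's pretext task can be illustrated with a short sketch: two images are rotated by different multiples of 90 degrees, blended with Mixup, and a model is trained to predict the rotation *difference* between them rather than a single image's rotation. The sketch below is a minimal, hypothetical PyTorch version; the backbone, hyperparameters, and helper names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the Mixup-rotation pretext task described in the
# abstract. Each image is mixed with a rotated partner, and the pretraining
# label is their rotation difference (4 classes: 0, 90, 180, 270 degrees).
import torch
import torch.nn as nn

def mixup_rotation_batch(images, alpha=1.0):
    """Build a self-supervised batch: mix each image with a rotated
    partner and label the pair by their rotation difference."""
    b = images.size(0)
    perm = torch.randperm(b)                      # random partner per image
    k1 = torch.randint(0, 4, (b,))                # rotation of the image
    k2 = torch.randint(0, 4, (b,))                # rotation of its partner
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    mixed = torch.stack([
        lam * torch.rot90(images[i], k1[i].item(), dims=(1, 2))
        + (1 - lam) * torch.rot90(images[perm[i]], k2[i].item(), dims=(1, 2))
        for i in range(b)
    ])
    labels = (k1 - k2) % 4                        # rotation-difference class
    return mixed, labels

# A tiny CNN stand-in for the backbone (the abstract does not specify one).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 4),                              # 4 rotation-difference classes
)

images = torch.randn(16, 3, 32, 32)               # dummy unlabeled batch
mixed, labels = mixup_rotation_batch(images)
loss = nn.CrossEntropyLoss()(model(mixed), labels)
loss.backward()                                   # one pretraining step
```

After this pretraining stage, the backbone's weights would be reused and finetuned on the labeled ordinal-classification data, as the abstract describes.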
