(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, they ignore the rich code syntax and hierarchy, which provide important structural information and semantic rules that could enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT have been shown to be of low quality and perform poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning framework for a Syntax Enhanced code pre-trained model, to handle various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we demonstrate the effectiveness of the proposed model.
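
The abstract describes four pre-training signals: masked language modeling, AST edge prediction, node-type prediction, and a contrastive loss over noise-augmented views of the same snippet. The sketch below is a minimal, hypothetical PyTorch illustration of how such objectives could be combined into one training loss; it is not the authors' implementation, and all module names, tensor shapes, and the use of the first token as the snippet embedding are assumptions made for illustration.

```python
# Hypothetical sketch (not the paper's code) combining MLM, AST edge prediction,
# node-type prediction, and an InfoNCE-style contrastive loss over two views.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CodePretrainHeads(nn.Module):
    def __init__(self, hidden: int, vocab_size: int, num_node_types: int):
        super().__init__()
        self.mlm_head = nn.Linear(hidden, vocab_size)       # predict masked code tokens
        self.type_head = nn.Linear(hidden, num_node_types)  # predict AST node types per token
        self.edge_head = nn.Bilinear(hidden, hidden, 1)     # score whether two tokens share an AST edge

    def forward(self, hidden_states):
        return self.mlm_head(hidden_states), self.type_head(hidden_states)


def info_nce(z1, z2, temperature: float = 0.07):
    """Contrastive loss: two views of the same snippet are pulled together,
    while other snippets in the batch serve as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature       # (batch, batch) similarity matrix
    targets = torch.arange(z1.size(0))       # positives lie on the diagonal
    return F.cross_entropy(logits, targets)


# Toy usage with random tensors standing in for encoder outputs and labels.
batch, seq_len, hidden, vocab, n_types = 4, 16, 32, 100, 10
heads = CodePretrainHeads(hidden, vocab, n_types)
h_view1 = torch.randn(batch, seq_len, hidden)  # encoder output for the original code
h_view2 = torch.randn(batch, seq_len, hidden)  # encoder output for a noised version of the same code

mlm_logits, type_logits = heads(h_view1)
mlm_loss = F.cross_entropy(mlm_logits.reshape(-1, vocab),
                           torch.randint(vocab, (batch * seq_len,)))
type_loss = F.cross_entropy(type_logits.reshape(-1, n_types),
                            torch.randint(n_types, (batch * seq_len,)))

# AST edge prediction: binary label for whether two sampled tokens are connected in the AST.
i, j = torch.randint(seq_len, (batch,)), torch.randint(seq_len, (batch,))
edge_scores = heads.edge_head(h_view1[torch.arange(batch), i],
                              h_view1[torch.arange(batch), j]).squeeze(-1)
edge_loss = F.binary_cross_entropy_with_logits(edge_scores,
                                               torch.randint(2, (batch,)).float())

# Use the first token's representation as the snippet embedding (an assumption here).
contrastive_loss = info_nce(h_view1[:, 0], h_view2[:, 0])
total_loss = mlm_loss + type_loss + edge_loss + contrastive_loss
```

In this sketch the four terms are simply summed; weighting the individual losses is a design choice the paper may handle differently.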