(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have demonstrated significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, they ignore the rich syntax and hierarchy of code, which provide important structural information and semantic rules that can enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT have been shown to be of low quality, so they perform poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning Framework for Syntax Enhanced Code Pre-Trained Model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives. One is to predict the edges between nodes in the abstract syntax tree, and the other is to predict the types of code tokens. Through extensive experiments on four code intelligence tasks, we demonstrate the effectiveness of our proposed model.
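To make the contrastive objective concrete, the sketch below shows a common InfoNCE-style formulation in which the pooled representation of a code snippet is pulled toward the representation of a noise-perturbed variant of the same snippet, with the other snippets in the batch acting as negatives. This is a minimal illustration under our own assumptions, not the authors' implementation; the function name, encoder interface, and temperature value are illustrative.

```python
# Minimal sketch (assumed, not CLSEBERT's actual code) of an InfoNCE-style
# contrastive loss over paired code representations: anchor[i] and
# positive[i] encode two noise-augmented views of the same snippet, and all
# other rows in the batch serve as negatives.
import torch
import torch.nn.functional as F


def contrastive_loss(anchor: torch.Tensor,
                     positive: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """anchor, positive: (batch, dim) pooled encoder outputs of two views."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    # Cosine similarity between every anchor and every positive in the batch.
    logits = anchor @ positive.t() / temperature          # (batch, batch)
    # The matching (diagonal) pair is the correct "class" for each anchor.
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)


# Usage (encoder and augmentation are placeholders):
#   h_orig = encoder(code_tokens)            # (batch, dim)
#   h_aug  = encoder(augmented_code_tokens)  # (batch, dim)
#   loss   = contrastive_loss(h_orig, h_aug)
```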