(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models either treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, these models ignore the rich syntax and hierarchy of code, which provide important structural information and semantic rules that can enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations derived natively from BERT have been shown to be of low quality, so they perform poorly on code matching and similarity tasks.

To address these problems, we propose CLSEBERT, a Contrastive Learning framework for a Syntax Enhanced Code Pre-Trained Model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we demonstrate the effectiveness of our proposed model.
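To illustrate the contrastive objective described in the abstract, the sketch below shows a standard InfoNCE-style loss over two views of the same code snippet (for example, the original input and a noise-perturbed variant), which encourages noise-invariant sequence representations. This is a minimal sketch under that assumption; the function names, the `temperature` value, and the choice of augmentation are illustrative and not taken from the paper.

```python
# Hypothetical InfoNCE-style contrastive loss over two views of each code
# snippet. The exact formulation in CLSEBERT may differ; this only sketches
# the general "pull matching views together, push others apart" idea.
import torch
import torch.nn.functional as F

def contrastive_loss(z1: torch.Tensor, z2: torch.Tensor,
                     temperature: float = 0.07) -> torch.Tensor:
    """z1, z2: [batch, dim] sequence representations of two views of the same code."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    # Similarity of every view-1 representation to every view-2 representation.
    logits = z1 @ z2.t() / temperature          # [batch, batch]
    # The matching (diagonal) pair is the positive; all other entries in the
    # row act as in-batch negatives.
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)
```

The AST edge prediction and token type prediction objectives mentioned above would, under the same reading, amount to classification heads over the contextual token representations, trained jointly with MLM and the contrastive loss.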


