(Preprint) CLSEBERT: Contrastive Learning for Syntax Enhanced Code Pre-Trained Model
Xin Wang ¹, Yasheng Wang ², Pingyi Zhou ², Meng Xiao ², Yadao Wang ², Li Li ³, Xiao Liu ⁴, Hao Wu ⁵, Jin Liu ¹, Xin Jiang ²
¹ School of Computer Science, Wuhan University
² Noah's Ark Lab, Huawei
³ Faculty of Information Technology, Monash University
⁴ School of Information Technology, Deakin University
⁵ School of Information Science and Engineering, Yunnan University
arXiv, 2021-08-10
Abstract

Pre-trained models for programming languages have proven their significant value in various code-related tasks, such as code search, code clone detection, and code translation. Currently, most pre-trained models treat a code snippet as a sequence of tokens or focus only on the data flow between code identifiers.

However, rich code syntax and hierarchy are ignored, even though they provide important structural information and semantic rules that can help enhance code representations. In addition, although BERT-based code pre-trained models achieve high performance on many downstream tasks, the sequence representations natively derived from BERT have been shown to be of low quality, and they perform poorly on code matching and similarity tasks.
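To illustrate the kind of syntactic structure that token-sequence models discard, the parent-child edges of an AST can be enumerated with Python's built-in `ast` module. This is an illustration only, not the paper's pipeline; the function name `ast_edges` is introduced here for the sketch.

```python
import ast

def ast_edges(source):
    """Collect parent->child edges of a Python AST as (node type, node type) pairs."""
    tree = ast.parse(source)
    edges = []
    for parent in ast.walk(tree):
        for child in ast.iter_child_nodes(parent):
            edges.append((type(parent).__name__, type(child).__name__))
    return edges

# A two-line function already yields hierarchy that a flat token
# sequence cannot express, e.g. that the BinOp belongs to the Return.
edges = ast_edges("def add(a, b):\n    return a + b")
```

Each such edge is exactly the kind of relation a structure-aware objective can ask the model to predict.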

To address these problems, we propose CLSEBERT, a Contrastive Learning Framework for Syntax Enhanced Code Pre-Trained Model, to deal with various code intelligence tasks. In the pre-training stage, we consider the code syntax and hierarchy contained in the Abstract Syntax Tree (AST) and leverage contrastive learning to learn noise-invariant code representations. Besides masked language modeling (MLM), we also introduce two novel pre-training objectives: one predicts the edges between nodes in the abstract syntax tree, and the other predicts the types of code tokens. Through extensive experiments on four code intelligence tasks, we show the effectiveness of the proposed model.
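A contrastive objective of the kind described above is commonly implemented as an InfoNCE/NT-Xent loss, which pulls each code representation toward its noise-augmented view and pushes it away from other snippets in the batch. The sketch below is a generic formulation, not the paper's exact loss; `info_nce_loss` and the temperature value are assumptions made for illustration.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss over a batch.

    anchors, positives: (batch, dim) arrays, where row i of `positives`
    is the noise-augmented view of row i of `anchors`; all other rows
    in the batch serve as negatives.
    """
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # similarity matrix: entry (i, j) compares anchor i with positive j
    logits = a @ p.T / temperature
    # each anchor's true positive sits on the diagonal
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

When a representation and its augmented view stay close while negatives stay far, the loss approaches zero, which is what makes the learned representations robust to the injected noise.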