
Performance Validation of Pre-Trained BERT in the Financial Domain [in Japanese]

Masahiro SUZUKI, Hiroki SAKAJI, Masanori HIRANO, Kiyoshi IZUMI

The 18th Text Analytics Symposium, IEICE Tech. Rep., vol.121, no.178, NLC2021-12, pp. 26-29, Sep. 16, 2021


Conference

The 18th Text Analytics Symposium, IEICE Tech. Rep.

Abstract

Recently, general-purpose language models pre-trained on large corpora, such as BERT, have been widely used. For Japanese, several pre-trained models based on Wikipedia have been released. However, general-purpose models may not be sufficiently effective in the financial domain, where specialized vocabulary is common. In this study, we construct a pre-trained model using a financial-domain corpus and evaluate it on a financial-domain task.
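As background for the approach described in the abstract, here is a minimal sketch of domain-adaptive pre-training with the masked language modeling (MLM) objective, assuming the Hugging Face transformers and datasets libraries. The base model name, the corpus file financial_corpus.txt, and the hyperparameters are illustrative assumptions, not the authors' actual setup.

# A minimal sketch: continue pre-training a public Japanese BERT on a
# financial corpus with the MLM objective. Model name, file path, and
# hyperparameters are assumptions for illustration only.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "cl-tohoku/bert-base-japanese"  # a Wikipedia-based Japanese BERT
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical corpus file: one financial document per line.
dataset = load_dataset("text", data_files={"train": "financial_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Randomly mask 15% of tokens, as in the original BERT pre-training.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-financial-ja", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

The resulting checkpoint could then be fine-tuned on a downstream financial task (e.g., sentence classification) in the usual way.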

Keywords

Natural language processing; Language resources; BERT; Financial text


Paper

Official page: https://www.ieice.org/ken/paper/20210916yCfG/


BibTeX

@inproceedings{Suzuki2021-text18,
  title={{Performance Validation of Pre-Trained BERT in the Financial Domain [in Japanese]}},
  author={Masahiro SUZUKI and Hiroki SAKAJI and Masanori HIRANO and Kiyoshi IZUMI},
  booktitle={The 18th Text Analytics Symposium, IEICE Tech. Rep.},
  issn={2432-6380},
  volume={121},
  number={178, NLC2021-12},
  pages={26--29},
  url={https://www.ieice.org/ken/paper/20210916yCfG/},
  year={2021}
}