Enhancing Financial Domain Adaptation of Language Models via Model Augmentation

Kota Tanabe, Masanori Hirano, Kazuki Matoya, Kentaro Imajo, Hiroki Sakaji, Itsuki Noda

2024 IEEE International Conference on Big Data, Dec. 16, 2024


Conference

2024 IEEE International Conference on Big Data (IEEE BigData 2024)

Abstract

The domain adaptation of language models, including large language models (LLMs), has become increasingly important as the use of such models continues to expand. This study demonstrates the effectiveness of Composition to Augment Language Models (CALM) in adapting to the financial domain. CALM is a method that extends the capabilities of an existing model by introducing cross-attention between two LLMs with different functions. In our experiments, we constructed a CALM model to enhance the financial performance of an LLM with strong response capabilities by leveraging a financial-specialized LLM. Notably, the CALM model was trained on a financial dataset different from the one used to train the financial-specialized LLM, confirming CALM's ability to adapt to various datasets. The models were evaluated through quantitative Japanese financial benchmarks and qualitative response comparisons, demonstrating that CALM produces superior responses with higher scores than the original models and baselines. Additionally, comparative experiments on connection points revealed that connecting the middle layers of the two models is most effective in facilitating adaptation to the financial domain. These findings confirm that CALM is a practical approach for adapting LLMs to the financial domain.
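The core idea in the abstract, cross-attention from one LLM's hidden states into another's, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: all names, dimensions, and the single-head residual fusion are illustrative assumptions; a real CALM setup operates on frozen transformer layers of both models and trains only the projection weights.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(anchor_h, aug_h, Wq, Wk, Wv):
    """Anchor-model hidden states (queries) attend to augmenting-model
    hidden states (keys/values); returns an update in the anchor's space."""
    q = anchor_h @ Wq
    k = aug_h @ Wk
    v = aug_h @ Wv
    scores = q @ k.T / np.sqrt(q.shape[-1])  # scaled dot-product attention
    return softmax(scores) @ v

rng = np.random.default_rng(0)
d_anchor, d_aug, d_attn = 8, 6, 4            # toy hidden sizes (illustrative)
anchor_h = rng.normal(size=(5, d_anchor))    # middle-layer states of the general LLM
aug_h = rng.normal(size=(5, d_aug))          # middle-layer states of the financial LLM
Wq = rng.normal(size=(d_anchor, d_attn))     # trainable projections (the only
Wk = rng.normal(size=(d_aug, d_attn))        # parameters updated when composing)
Wv = rng.normal(size=(d_aug, d_anchor))

delta = cross_attention(anchor_h, aug_h, Wq, Wk, Wv)
fused = anchor_h + delta                     # residual add back into the anchor stream
print(fused.shape)                           # (5, 8)
```

Connecting at the "middle layers", as the experiments favor, corresponds to taking `anchor_h` and `aug_h` from intermediate transformer blocks rather than from the embedding or final layers.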

Keywords

Financial Natural Language Processing; Large Language Model; Domain Adaptation;


Paper

arXiv:2411.09249 (doi.org/10.48550/arXiv.2411.09249)


bibtex

@inproceedings{Tanabe2024-bigdata,
  title={{Enhancing Financial Domain Adaptation of Language Models via Model Augmentation}},
  author={Kota Tanabe and Masanori Hirano and Kazuki Matoya and Kentaro Imajo and Hiroki Sakaji and Itsuki Noda},
  booktitle={2024 IEEE International Conference on Big Data},
  publisher={IEEE},
  archivePrefix={arXiv},
  arxivId={2411.09249},
  year={2024}
}