
The Construction of Instruction-tuned LLMs for Finance without Instruction Data Using Continual Pretraining and Model Merging

Masanori Hirano, Kentaro Imajo

[Preprint] Sep. 30, 2024


Abstract

This paper proposes a novel method for constructing instruction-tuned large language models (LLMs) for finance without instruction data. Traditionally, developing such domain-specific LLMs has been resource-intensive, requiring a large dataset and significant computational power for continual pretraining and instruction tuning. Instead, our study presents a simpler approach that combines domain-specific continual pretraining with model merging. Because general-purpose pretrained LLMs and their instruction-tuned counterparts are often publicly available, they can be leveraged to obtain the necessary instruction task vector. Merging this vector with a domain-specific pretrained vector effectively yields instruction-tuned LLMs for finance without any additional instruction data. Our process involves two steps: first, we perform continual pretraining on financial data; second, we merge the instruction-tuned vector with the domain-specific pretrained vector. Our experiments demonstrate the successful construction of instruction-tuned LLMs for finance. A major advantage of our method is that the instruction-tuned and domain-specific pretrained vectors are nearly independent; this independence is what makes our approach highly effective. The Japanese financial instruction-tuned LLMs we developed in this study are available at https://huggingface.co/pfnet/nekomata-14b-pfn-qfin-inst-merge.
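The merging step described above amounts to simple task arithmetic on model weights: add the instruction task vector (instruction-tuned minus base weights) to the domain-specific continually pretrained weights. The following is a minimal sketch of that idea, not the authors' exact recipe; it assumes PyTorch and Hugging Face Transformers, and the checkpoint identifiers, the scaling factor alpha, and the output path are illustrative assumptions (only the merged model linked above is named in the abstract).

# Minimal sketch of the merge step (illustrative, not the authors' code).
# Model IDs, alpha, and the output path are assumptions; loading three
# 14B-parameter checkpoints in bf16 requires substantial memory.
import torch
from transformers import AutoModelForCausalLM

def load(name):
    return AutoModelForCausalLM.from_pretrained(
        name, torch_dtype=torch.bfloat16, trust_remote_code=True
    )

base = load("rinna/nekomata-14b")              # general-purpose pretrained model
inst = load("rinna/nekomata-14b-instruction")  # its instruction-tuned variant
domain = load("pfnet/nekomata-14b-pfn-qfin")   # continually pretrained on financial data

alpha = 1.0  # weight on the instruction task vector (illustrative)
base_sd, inst_sd, domain_sd = base.state_dict(), inst.state_dict(), domain.state_dict()

# merged = domain + alpha * (instruction-tuned - base)
merged_sd = {k: domain_sd[k] + alpha * (inst_sd[k] - base_sd[k]) for k in base_sd}

domain.load_state_dict(merged_sd)
domain.save_pretrained("nekomata-14b-qfin-inst-merge")  # hypothetical output path

Because the instruction task vector and the domain-specific pretrained vector are nearly independent, this element-wise addition is sufficient; no additional instruction data or further tuning is involved.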

Keywords

finance; large language models; continual pretraining; model merging; instruction


Paper

arXiv:2409.19854 (doi.org/10.48550/arXiv.2409.19854), ssrn.com/abstract=4971271 (doi.org/10.2139/ssrn.4971271)


bibtex

@misc{Hirano2024-model-merge,
  title={{The Construction of Instruction-tuned LLMs for Finance without Instruction Data Using Continual Pretraining and Model Merging}},
  author={Masanori Hirano and Kentaro Imajo},
  doi={10.2139/ssrn.4971271},
  archivePrefix={arXiv},
  eprint={2409.19854},
  year={2024}
}