[1] Chang, E. et al. Time-aware ancient Chinese text translation and inference [J/OL]. arXiv preprint arXiv:2107.03179, 2021. https://arxiv.org/pdf/2107.03179 (2021-07-07) [2024-12-26].
[2] Church, K. W. et al. Emerging trends: A gentle introduction to fine-tuning [J]. Natural Language Engineering, 2021, 27(6): 763-778.
[3] Devlin, J. et al. BERT: Pre-training of deep bidirectional transformers for language understanding [A]. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies [C]. Minneapolis: Association for Computational Linguistics, 2019: 4171-4186.
[4] Gao, R. Y. et al. Machine translation of Chinese classical poetry: A comparison among ChatGPT, Google Translate, and DeepL Translator [J]. Humanities and Social Sciences Communications, 2024, 11(1): 1-10.
[5] Li, J. H. et al. Eliciting the translation ability of large language models via multilingual finetuning with translation instructions [J]. Transactions of the Association for Computational Linguistics, 2024, 12: 576-592.
[6] Liu, Dayiheng et al. Ancient-modern Chinese translation with a new large training dataset [J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2019, 19(1): 1-13.
[7] Ray, P. P. ChatGPT: A comprehensive review on background, applications, key challenges, bias, ethics, limitations and future scope [J]. Internet of Things and Cyber-Physical Systems, 2023, 3: 121-154.
[8] Reinauer, R. et al. Neural machine translation models can learn to be few-shot learners [J/OL]. arXiv preprint arXiv:2309.08590, 2023. https://arxiv.org/pdf/2309.08590 (2023-09-15) [2024-11-29].
[9] Richburg, A. and M. Carpuat. How multilingual are large language models fine-tuned for translation? [J/OL]. arXiv preprint arXiv:2405.20512, 2024. https://arxiv.org/pdf/2405.20512 (2024-05-30) [2024-12-29].
[10] Su, T. et al. Unlocking parameter-efficient fine-tuning for low-resource language translation [J/OL]. arXiv preprint arXiv:2404.04212, 2024. https://arxiv.org/pdf/2404.04212 (2024-04-05) [2024-12-28].
[11] Vaswani, A. et al. Attention is all you need [J/OL]. arXiv preprint arXiv:1706.03762, 2017. https://proceedings.neurips.cc/paper/2017/file/3f5ee243547dee91fbd053c1c4a845aa-Paper.pdf (2023-08-02) [2024-11-27].
[12] Yang, Z. C. et al. Generating classical Chinese poems from vernacular Chinese [A]. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing [C]. Hong Kong: Association for Computational Linguistics, 2019: 6155-6164.
[13] Yang, Z. N., K. J. Chen, and J. Q. Chen. Guwen-UNILM: Machine translation between ancient and modern Chinese based on pre-trained models [A]. In Lu Wang et al. (eds.). Natural Language Processing and Chinese Computing (Part I) [C]. Cham: Springer International Publishing, 2021: 116-128.
[14] 王世钰. 中华民族共同体视域下的海外楹联英译研究[J]. 上海翻译, 2023(1): 43-48.
[15] 吴梦成 等. 基于预训练语言模型的汉语古现翻译模型构建[J]. 信息资源管理学报, 2024(6): 143-155.
[16] 许乾坤 等. 基于UniLM模型的古文到现代文机器翻译词汇共享研究[J]. 情报资料工作, 2024(1): 89-100.
[17] 赵霞, 王钊. 结合对比学习和双流网络融合知识图谱摘要模型[J]. 计算机应用研究, 2025(3): 720-727.
[18] 朱献珑, 周丽. 社会翻译学视域下的全人译者能力研究再思考[J]. 上海翻译, 2025(4): 52-58.