Template-Type: ReDIF-Article 1.0
Author-Name: Xiang, Tanya
Title: Contextual Learning Support for Low-Resource Language Large Language Models: Efficient Training Strategies for Zero-Shot and Few-Shot Learning
Abstract: In natural language processing for low-resource languages, building high-quality large language models faces numerous challenges, including data scarcity, domain mismatch, and limited annotation resources. This paper proposes an efficient training strategy based on contextual learning, designed to address the difficulties that low-resource languages face in zero-shot and few-shot learning scenarios. The approach leverages cross-lingual knowledge transfer from high-resource languages and systematically enriches the contextual information in prompts and representations to maximize the model's generalization ability. We explore the combination of multi-task learning and self-supervised learning to exploit heterogeneous corpora, using existing multilingual and monolingual resources for pretraining. A lightweight fine-tuning stage with a small amount of labeled data is then employed for targeted adaptation to specific downstream tasks and languages. The proposed framework is designed to be computationally efficient, reducing training cost while maintaining or improving performance. Experimental results on a range of language understanding and generation benchmarks demonstrate significant improvements in task performance across various low-resource languages under both zero-shot and few-shot conditions. Ablation studies further highlight the contributions of the contextual learning components and the cross-lingual transfer mechanisms. These findings provide practical guidance for developing scalable large language models for underrepresented languages and suggest directions for future research on inclusive, resource-efficient language technologies.
Keywords: low-resource languages, large language models, contextual learning, transfer learning, zero-shot learning, few-shot learning
Journal: Pinnacle Academic Press Proceedings Series
Pages: 344-352
Volume: 10
Issue: 
Year: 2026
File-URL: https://pinnaclepubs.com/index.php/PAPPS/article/view/664/642
File-Format: Application/pdf
Handle: RePEc:dba:pappsa:v:10:y:2026:i::p:344-352
