Chang Xinyue
School of Computer Science and Engineering, School of Art and Information Engineering, Dalian University of Technology
Abstract:
With the rapid development of artificial intelligence and the explosive growth of online textual data, sentiment analysis has emerged as a critical technique in natural language processing, with extensive applications in e-commerce, social media, public opinion monitoring, market research, and brand management. Traditional approaches, including rule-based systems and statistical machine learning methods, face significant limitations when dealing with complex semantic structures, contextual nuances, and cross-domain adaptation. In contrast, deep learning techniques, particularly pre-trained language models such as BERT (Bidirectional Encoder Representations from Transformers), provide superior contextual understanding, richer feature representations, and strong performance across a wide range of natural language processing tasks. This paper systematically explores the application of BERT to multi-domain text sentiment analysis. It begins with a review of the fundamental concepts, classifications, and evolving trends in sentiment analysis, followed by a comparison between traditional methods and modern deep learning approaches. The core principles of the BERT model are then examined, highlighting its architectural innovations and its effectiveness in text classification. A multi-domain sentiment analysis framework is proposed, covering data preprocessing, feature extraction, model architecture design, and training strategies. Experiments on multiple publicly available datasets from diverse domains validate the proposed approach, demonstrating superior accuracy, stability, and generalization compared with conventional baselines and other pre-trained models. The BERT-based model consistently achieves stable high performance across textual domains, offering practical value for real-world deployment and contributing new theoretical insights to multi-domain sentiment analysis research.
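To make the classification stage of the proposed framework concrete, the sketch below shows, in simplified form, how a fine-tuned BERT model maps the pooled [CLS] representation of an input text to sentiment probabilities through a linear head followed by a softmax. This is a minimal illustration only: the 4-dimensional embedding and the weights are toy values standing in for BERT's 768-dimensional hidden states and learned parameters, and the function names are invented for this example.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_cls_embedding(cls_vec, weights, bias, labels):
    """Linear classification head over the [CLS] embedding,
    as used when fine-tuning BERT for text classification:
    logits = W * h_cls + b, then softmax -> label probabilities."""
    logits = [sum(w * h for w, h in zip(row, cls_vec)) + b
              for row, b in zip(weights, bias)]
    probs = softmax(logits)
    return dict(zip(labels, probs))

# Toy example: a 4-dim "embedding" instead of BERT's 768 dims.
h_cls = [0.2, -0.1, 0.4, 0.05]
W = [[0.3, -0.2, 0.5, 0.1],    # weights for the "negative" class
     [-0.4, 0.6, 0.2, -0.3]]   # weights for the "positive" class
b = [0.0, 0.1]
print(classify_cls_embedding(h_cls, W, b, ["negative", "positive"]))
```

In a real implementation, `h_cls` would come from the final-layer hidden state of a pre-trained BERT encoder, and `W` and `b` would be trained jointly with the encoder during fine-tuning on domain-labeled sentiment data.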
Keywords:
BERT; sentiment analysis; multi-domain; text classification; deep learning